  • What makes the “spicy autocomplete” perspective incomplete is also what makes LLMs work. The “Attention is All You Need” paper that introduced the transformer architecture describes the self-attention mechanism used to predict the next word. In the process of writing the next word of an essay, the model navigates a 22,000-dimensional semantic space, and the similarity to the way humans experience language is more than philosophical - the advances in LLMs have sparked a wave of new research in neuroscience.
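
  As a rough illustration of what that self-attention step computes, here is a minimal NumPy sketch of scaled dot-product attention, the core operation from the paper. The sizes and random inputs are arbitrary stand-ins, not any real model’s values:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating, for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token vectors X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # project tokens into query/key/value spaces
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # relevance of every token to every other token
    weights = softmax(scores, axis=-1)        # each row is an attention distribution over the sequence
    return weights @ V                        # each token becomes a weighted mix of the whole sequence

# Toy sizes: 5 tokens, 16-dimensional embeddings (illustrative, not a real model).
rng = np.random.default_rng(0)
d = 16
X = rng.normal(size=(5, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (5, 16): one context-aware vector per token
```

  The point of the mechanism: every token’s updated representation is a weighted mix of every other token in the context, so “predicting the next word” involves a global read of the whole passage rather than a local lookup.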

  • I understand this perspective, because the text, image, audio, and video generators all default to the most generic solution. I challenge you to explore past the surface, with the simple goal of examining something you enjoy from new angles. All of the interesting work in generative AI is being done at the edges of the models’ semantic spaces. Avoid getting stuck in workflows: try new ones regularly and compare how well they work. I’m constantly finding techniques I end up putting to practical use - sometimes immediately, sometimes six months later when the need arises.
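
  One reason generators default to the generic: a decoder that always takes the most probable next token collapses toward the center of the distribution. Here is a minimal sketch of temperature sampling, with a made-up four-token vocabulary and invented logits, showing how the sampler can be pushed off the most generic choice:

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample a token index from logits. Low temperature concentrates mass on
    the most likely (most generic) token; higher temperature spreads it out."""
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())  # softmax, shifted for stability
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Made-up logits for a toy 4-token vocabulary (purely illustrative).
logits = [4.0, 2.0, 1.0, 0.5]  # token 0 is the safe, generic favorite
rng = np.random.default_rng(0)
for t in (0.2, 1.0, 1.5):
    picks = [sample_next_token(logits, t, rng) for _ in range(1000)]
    print(f"temperature {t}: token 0 chosen {picks.count(0) / 1000:.0%} of the time")
```

  Raising the temperature (or loosening top-p) is the sampler-level version of that advice: you have to push the model away from its most probable output to reach the interesting edges of its space.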

  • Generative AI saves me 10-20 hours of work every week as a corpo video producer, and I reinvest that time in experimenting with it - which has allowed our small team to produce work that would be completely outside our resources otherwise. Without a single additional breakthrough, we’d be finding novel ways to be productive with the current form of generative AI for decades. I understand the desire to temper expectations, and I agree that companies and providers are not handling this well at all. But the tech is already solid. It’s just being misused more often than it’s being wielded well.