
684: Get More Language Context out of your LLM

Super Data Science
Open-source LLMs, FlashAttention and generative AI terminology: Host Jon Krohn gives us the lift we need to explore the next big steps in generative AI. Hear how FlashAttention, Stanford University's "exact attention" algorithm, could help open-source LLMs rival GPT-4's capabilities.

Additional materials: www.superdatascience.com/684

Interested in sponsoring a SuperDataScience Podcast episode? Visit JonKrohn.com/podcast for sponsorship information.