ChatGPT can now see, hear, and speak; Amazon to Invest $4B in Anthropic; Meta to develop a ‘sassy chatbot’ for younger users; RAG vs. Finetuning LLMs - What to use, when, and why; LongLoRA: Efficient fine-tuning of long-context LLMs;

AI Unraveled: Latest AI News & Trends, GPT, Gemini, Generative AI, LLMs, Prompt, AI Bedtime Stories by Etienne Noumen

Episode notes

Video: https://youtu.be/MoYJGe-pvwM

In today's episode, we'll cover ChatGPT's new voice and image capabilities, Amazon's $4 billion investment in Anthropic, Meta's plan to develop a range of chatbot personas, LongLoRA's efficient approach to extending the context size of pre-trained LLMs, the differences between RAG and finetuning LLMs and when to use each, the Coinbase CEO's opposition to AI regulation, other AI news including Meta's chatbots and the Google Pixel 8's AI camera, and a recommendation to expand your AI knowledge with the book 'AI Unraveled'.
Keywords
ChatGPT can now see, ChatGPT can now hear, ChatGPT can now speak, Amazon to invest $4 billion in Anthropic, sassy chatbot, LongLoRA, RAG vs finetuning LLMs