Google I/O 2024 AI Announcements Recap

With the goal of keeping our readers updated on the latest releases and launches in the AI world, we are bringing you the latest updates from top conferences and events. Google seems to be back in the game with a wave of announcements about new AI developments and the integration of its latest AI models across company products and the cloud. Here is a list of all the noteworthy announcements from Google I/O:

  • Gemini Nano, Google’s on-device SLM, is gaining multimodal capabilities, and a new model, Gemini 1.5 Flash, was launched, optimized for speed and efficiency (see the API sketch after this list). Additionally, Imagen 3, Google's highest-quality text-to-image generation model yet, was announced as coming soon.
  • Gemini 1.5 Pro now has a 2M-token context window, 10X larger than the next-largest model, Anthropic's Claude 3, at 200K tokens.
  • On open models, Google also announced Gemma 2, a new open model for responsible AI innovation, and PaliGemma, its first vision-language model (a loading sketch follows this list).
  • The Google Workspace suite is getting Gemini AI integration, which can help craft documents, emails, sheets, and slides, or summarize long email threads.
  • A new Gemini-powered AI tool called "AI Teammate" is embedded into apps like Docs and Gmail, acting as a virtual coworker to boost team productivity.
  • Gems is a new feature that lets users create customized versions of Gemini for automated, recurring routines. Gems was announced at I/O but is not available yet.
  • Google also announced Project Astra, a framework for universal AI agents that can be helpful in everyday life. The vision is a real-time, multimodal AI assistant that can see the world and answer questions to help the user do almost anything. Earlier, at Google Cloud Next ’24, Google announced Vertex AI Agent Builder, which enables developers to easily build and deploy enterprise-ready gen AI experiences.
  • Google also announced Trillium, the sixth generation of Google Cloud TPU. Trillium TPUs achieve a 4.7X increase in peak compute performance per chip compared to TPU v5e and scale up to 256 TPUs in a single pod. Trillium TPUs are a part of Google Cloud's AI Hypercomputer, a supercomputing architecture designed specifically for AI workloads.
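To make the model updates concrete, here is a minimal sketch of calling the new Gemini 1.5 Flash model through Google's google-generativeai Python SDK. The prompt and the placeholder API key are illustrative assumptions; swap in "gemini-1.5-pro" to try the long-context model instead.

```python
# pip install google-generativeai
import google.generativeai as genai

# Assumes you have a Gemini API key from Google AI Studio.
genai.configure(api_key="YOUR_API_KEY")

# "gemini-1.5-flash" is the speed/efficiency-optimized model announced at I/O;
# "gemini-1.5-pro" is the long-context flagship.
model = genai.GenerativeModel("gemini-1.5-flash")

response = model.generate_content(
    "Summarize the main AI announcements from Google I/O 2024 in three bullet points."
)
print(response.text)
```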
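For the open models, Gemma 2 weights are distributed through hubs like Hugging Face. The sketch below loads an instruction-tuned checkpoint with the transformers library; the exact checkpoint name ("google/gemma-2-9b-it") and generation settings are assumptions for illustration, and access requires accepting the license on the model page.

```python
# pip install -U transformers accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed instruction-tuned 9B checkpoint; authenticate with `huggingface-cli login`
# after accepting the Gemma license on the model page.
model_id = "google/gemma-2-9b-it"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit on a single GPU
    device_map="auto",
)

inputs = tokenizer("What is PaliGemma?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```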

Read more about the details of these Google I/O 2024 announcements here
