Google won’t ship tech from Project Astra, its wide-ranging effort to build AI apps and “agents” for real-time, multimodal understanding, until next year at the earliest.
Google CEO Sundar Pichai revealed the timeline in remarks during Google’s Q3 earnings call Tuesday. “[Google is] building out experiences where AI can see and reason about the world around you,” he said. “Project Astra is a glimpse of that future. We’re working to ship experiences like this as early as 2025.”
Project Astra, which Google demoed at its I/O developer conference in May 2024, encompasses a range of technologies, from smartphone apps that can recognize the world around them and answer related questions to AI assistants that can perform actions on a user’s behalf.
In a prerecorded demo during I/O, Google showed a Project Astra prototype answering questions about things within view of a smartphone’s camera, like which neighborhood a user might be in or the name of a part on a broken bicycle.
The Information reported this month that Google was planning to launch a consumer-focused agent experience as early as this December — one capable of purchasing a product, booking a flight, and other such chores. That now seems unlikely — unless the experience in question is divorced from Project Astra.
Anthropic recently became one of the first companies with a large generative AI model able to control apps and web browsers on a PC. But, illustrating how challenging building AI agents can be, Anthropic's model struggles with many basic tasks.