The future of AI is coming for our eyes: VR and AR's killer app, with killer unknowns
I was immersed in a world of virtual things: 3D maps of my home and of places I'd never been, videos wrapping around my head, and browser pages floating all around me. But I wasn't alone. A friendly voice was there, listening to my questions and understanding what I was seeing. This companion seemed to see what I saw and hear what I heard. Was Google's Gemini AI behind me, surrounding me or inside me? Where did my perceptions end and the AI's begin? I was demoing a future Samsung mixed-reality headset powered by Google's Gemini 2.0 AI. The headset won't be out until later in 2025, but it's as good a sign as any — not to mention a warning — of what's coming soon in personal tech.
AI has listened and responded to us for years. It has heard our voice prompts, read our text prompts and scanned our photos, all via our laptops, our phones and the cloud. Next up, AI has its sights set on our eyes. These ideas aren't new, but we're on the verge of seeing companies flip the switch, making surprising things happen with headsets and glasses — some already available, others still on the horizon. Google's Android XR chess move is just the first. Expect Meta, Apple, Microsoft and plenty of others to follow right along. Some already are.
From what I've already seen, this next wave will make what we think of as AI now seem like the opening act. Google's Android Ecosystem President Sameer Samat sees AI and XR (the industry's current abbreviation for "extended reality," a space that covers VR, AR and AI-assisted wearables) as a natural fit. "It can actually help you control the UI. It can work on a problem collaboratively with you and take actions in that virtual space with you," Samat told me. My demos in Android XR offered glimpses of this, showcasing an AI companion experience unlike anything I'd tried before. It felt more personal, as if the AI were almost living in my head, seeing what I saw. That future is already here.