'I think you see the future first on Android' – Google's Android leader Sameer Samat

Sameer Samat, President of Android Ecosystem at Google
(Image credit: Lance Ulanoff / Future)

My conversation with Sameer Samat, Android’s daddy…er…the President of Android Ecosystem at Google, started with him quizzing me.

Perhaps he noticed my MacBook Pro or the Apple Watch on my wrist. I did make sure to at least record the interview with the lovely new Samsung Galaxy Z Fold 7 I’m testing, but Samat wanted to understand how, and maybe why, I’m using a Mac.

I stammered, unprepared for the interrogation, but Samat wasn’t asking me why I don’t use Android. Instead, he was curious about the apps I use on my laptop and how I manage the world of my iPhone, MacBook, and Apple Watch.

“I asked because we’re going to be combining ChromeOS and Android into a single platform, and I am very interested in how people are using their laptops these days and what they’re getting done,” Samat explained.

I was relieved as we shifted to talking about Chrome, ChromeOS, mobile CPUs in laptops, and how we both used to upgrade our own system RAM before finally getting to the topics I’m most interested in: Android, Gemini, Samsung, Galaxy AI, and Android XR.

It’s already been a big year for Google and Android, which Samat, a 16-year Google veteran, runs. Android 16 (codename Baklava) was officially unveiled at Google I/O 2025 in May, bringing with it Material 3 Expressive, which Samat called the “biggest design change in Android in three or four years.”

Android 16 arrives with more support for tablet-sized canvases, smoother animations, and far more customization, including the ability to take a photo, set it as your wallpaper, and then bleed the image’s color palette throughout the system into places like Gmail and even third-party apps.

Samat told me the goal was to make something “modern and delightful” but also “approachable and familiar,” avoiding a “who moved my cheese” scenario.

Making it yours

Perhaps that’s why, to me at least, Android 16 feels incremental, but in a good way. I recognize everything, but I can also see how I can make it my own.

For Samat, that last part is the goal.

“Android has always been about enabling you to personalize and customize your device. With the initial launch of Material Design, several years ago, we took it to the next level.”

The Android 16 experience, he tells me, is one that lets the ethos of what’s on your home screen carry through to the rest of your phone.

You get Android 16, and you get Android 16

The other big Android 16 update this year is not about design or even utility, but access.

As I’m testing the Samsung Galaxy Z Fold 7, I am experiencing Android 16 for the first time. This is unusual: typically, a new Samsung flagship ships with the previous-generation Android platform and only gets the upgrade after the newest Google Pixels have arrived with the latest version, which can be months later.

Not this year.

“We modified our entire development process over a year ago in order to release Android more frequently and to make sure that device manufacturers have access to the latest in a way that’s more timed with their big phone releases.”

Part of that process is called “Trunk Stable”, and it has made it possible for Google to ship new Android versions more frequently. The new “Android Drops”, meanwhile, let Google slipstream improvements into the Android experience without the need for an operating system update.

“We do these quarterly, typically, and give you a nice pop-up on your phone, which says, 'Hey, your phone just got better.’”

Making the choice

Sameer Samat, President of Android Ecosystem at Google (Image credit: Lance Ulanoff / Future)

I didn’t want to turn this into an Android versus iOS discussion, but I was curious about how Samat sees the choice and what he might say if someone asked him how to choose.

For Samat, though, choosing equates to switching (he’s probably right; people have likely already chosen sides).

“First, I think it’s really important to recognize that it isn’t easy to switch – it isn’t as easy to switch as it should be. One of the things we believe is that consumers should be able to move to whatever phone and platform they want without barriers. Take your data. Have your apps move over. Like, this should be simple. It’s 2025.”

Apple and Google do provide instructions and even apps for making the migration, but I get Samat’s point. There are risks with a shift, and it’s not just about moving from the iPhone to, say, a Samsung Galaxy or Google Pixel. It’s the ecosystem, and losing some of that connective tissue between devices.

That’s the macro view. At a micro level, the platform differences often boil down to a pair of colors: green and blue.

“Messaging is one area where consumers, particularly in the US, have been really concerned about green bubbles, and what that stems from is a difference in capability between those two bubbles. With RCS, which replaces SMS, you have a modern standard."

Samat detailed all the changes RCS brings, like consistent image quality on photos and videos, the ability to see when someone is typing on the other end of a chat, and read receipts.

“But I think it needs to go beyond that. We believe these platforms should make it much easier, using industry standards to move data between the two, and that is a lot of what we’re advocating for and pushing for,” he added.

If you make the effort to switch from iOS to Android, Samat is convinced it will pay off. He told me he’s seen it happen with people who’ve made the move.

“I think you see the future first on Android,” he told me.

Samsung’s remarkable Galaxy Z Fold 7 certainly makes a case for that. It’s now the apex foldable and one Apple will surely be chasing if and when it releases the iPhone Fold. But it’s more than that, and Samat pointed to the Gemini integration across Android devices.

The Gemini of the thing

A man using Gemini Live on a phone.

(Image credit: Shutterstock/Rokas Tenys)

Calling it “that Gemini Experience,” Samat says those who make the switch are “seeing over and over again what they’ve been missing.”

Like the proud poppa he is, Samat proceeded to demonstrate Gemini’s deep Android integration, starting with his personal quest to find a family car. I watched as he found a 2020 Honda Odyssey minivan, and then gave Gemini screen access so it could see what he was looking at, summarize, and help him understand his choices.

It’s a powerful demonstration, in part because this is what Apple promised and has yet to deliver with Apple Intelligence and Siri. Gemini just seems miles ahead.

Samat also showed me how Gemini could help you not only summarize a long YouTube video but break down its claims (with time stamps) by pulling in sources from the Web. Here, it seems Gemini is really leaning into its Google search roots.

“What I’m getting at,” Samat continued, “is that Gemini integrated in a phone really provides a much more helpful experience. And when consumers do move over from an iPhone, they instantly realize that they’ve been missing a lot of this.”

And as Apple races to catch up with Apple Intelligence, Android and Gemini are racing ahead with partners like Samsung, but it’s not all perfection there.

Too many AIs

Samsung Galaxy Z Fold 7 hands-on

(Image credit: Lance Ulanoff / Future)

Pick up a Samsung phone like the Galaxy Z Fold 7 and you’ll be confronted by three kinds of onboard intelligence: Samsung Galaxy AI, Bixby, and Gemini. Granted, Bixby has a narrow, system-based focus, but its continued existence doesn’t exactly help with the confusion. However, it’s the relationship between Galaxy AI and Gemini that I was hoping Samat could help sort out.

“The way I explain this to consumers across all the different Android devices out there, all the different flagships, you have Circle to Search, and you have Gemini. Circle to Search is the best multimodal search from Google. You can use it across anything that’s on your screen…It’s an amazing search capability you can access by long pressing the same button on any Android phone."

"Gemini is the best assistant out there. Hold down the side key on any flagship Android phone, it’s gonna bring up Gemini, right to the context of where you’re at. Those two capabilities are built by Google,” said Samat.

He added, though, that Google doesn’t own the term “AI” and that other companies are developing their own AI capabilities, and will continue to do so.

“I think that’s great,” he said, “If the features are great, it’s more value for the consumers and more innovation. But I think for us, as Google, we want to make sure those two pieces [Circle to Search and Gemini] are very clearly accessible, very clearly identifiable across all the different devices that consumers are considering.”

In a way, Samsung appears to have achieved that balance, even if it doesn’t always clarify exactly where Galaxy AI ends and Gemini begins.

Seeing through Glass

Google I/O 2025

(Image credit: Future)

As the master of all things Android, Samat is also in charge of Android XR, Google’s still-young extended reality platform that’s currently being used to develop headsets like Samsung’s Project Moohan.

Samat, though, wanted to clarify why Google is even approaching this space.

“First, our mission with Android is to transform computing to empower everyone. That means making computing accessible to people around the world. There’s 3 billion-plus Android devices. Sometimes people’s first and only computer in their life is an Android device. That’s important, but there’s also the high-end of computing, where you push the boundaries, and you push the envelope of what’s possible…. you open up new possibilities for computing.”

Samat called the area exciting and reminded me (as if I needed it) that Google is not new to this space. Obviously, Google pushed the envelope too early with Google Glass, but that was a learning experience, and Samat said the company never stopped working on it.

“One of the things I love about working at Google is that it is a very self-reflective place.

“We do make mistakes, of course, but I think the main point is that we have a process internally called ‘Retrospectives’, and these are done for many things.

“If we make a mistake when we launch something, we do a blameless Retrospective and we all get around a table and talk about what went wrong, not for the purpose of finding fault, but for what did we learn and what can we do differently?”

Samat was not talking about Google Glass at a remove. He recounted how he wore them on a trip to Disneyland with his 5-year-old, knowing that he’d be “socially awkward.” However, when he went on a ride with his child, he held onto the kid with one hand and the railing for dear life with the other, and was still able to film the thrill ride with Google Glass.

“So, there were parts that were amazing, and there were parts that were clearly not right…I think the vision is correct, but…it just wasn’t ready,” said Samat.

In the case of Google Glass and the path forward, it’s clear that “step changes” in technology – in processors and materials, but perhaps most importantly in AI – make this the moment for Android XR and the glasses built around it.

“So, one of the learnings, for example, quite obvious in retrospect, is that glasses and watches are for many people jewelry. They’re functional, but they’re also something you want to have on your person.”

Which is why Google partnered with Warby Parker and Gentle Monster, two brands that understand the marriage of fashion and function.

Beyond fashion, though, it’s undeniably AI that will transform the wearable space.

“Add the AI part to that, and there is a natural form factor for the AI era, because [the glasses] can have a camera. The AI, with your permission, can see what you’re seeing,” Samat said, offering examples like the AI translating a sign you’re looking at or helping you learn by interpreting a diagram on a whiteboard.

“That technology was not with us back when we did the first iteration, and I think that will be a huge part of what people find really useful and helpful.”

Lance Ulanoff
Editor At Large

A 38-year industry veteran and award-winning journalist, Lance has covered technology since PCs were the size of suitcases and “on line” meant “waiting.” He’s a former Lifewire Editor-in-Chief, Mashable Editor-in-Chief, and, before that, Editor-in-Chief of PCMag.com and Senior Vice President of Content for Ziff Davis, Inc. He also wrote a popular, weekly tech column for Medium called The Upgrade.

Lance Ulanoff makes frequent appearances on national, international, and local news programs including Live with Kelly and Mark, the Today Show, Good Morning America, CNBC, CNN, and the BBC. 
