It’s the end of the road for mobile phones.
Now 18 years removed from Steve Jobs’ landmark iPhone launch, smartphone progress is at a standstill. Apart from incrementally better cameras and marginally faster processors, we’ve hit the physical limits of what’s possible with these slabs of glass and metal living in our pockets and dominating our social lives.
Silicon Valley foresaw this demise over a decade ago and began laying the groundwork for the next stage in personal computing: smart glasses. Initially, the category manifested as Frankensteinian science experiments like Google Glass in 2014 or short-lived fads like Snapchat’s Spectacles in 2016.
By 2021’s Ray-Ban Stories, it still looked very much like a niche product. The biggest headline was that a legacy brand like Ray-Ban was willing to take such a publicized step into the unknown, but the product itself was not particularly noteworthy.
The mainstream breakout moment was really with 2023’s Meta Ray-Ban Gen 1, which remains the best-selling pair of smart glasses today. With a reasonably capable camera and strong Bluetooth audio, you finally had a device that both looked good on your face and had real utility.
The success of the Meta Ray-Ban Gen 1 blew the doors wide open on Mark Zuckerberg’s biggest play: a futuristic ecosystem for smart glasses. Of course, that wouldn’t be possible with the Gen 1’s hardware. If you wanted something futuristic, you needed a visual interface, and it needed to be high-fidelity.
Enter the Display

When Zuckerberg introduced the Meta Ray-Ban Display glasses in September 2025, it felt like a tremendous leap forward. Here was a pair of fairly normal-looking glasses that somehow had all the basic utilities of your average smartphone. They were a little chunkier than I’d like, to be fair, but the amount of stuff they could do was downright astounding.
As the name implies, it was the display technology that was the real breakthrough. I’ve owned several pairs of smart glasses over the years, and until that moment, the most sophisticated consumer models could only render monochrome screens. Even Realities launched their G1 glasses in 2024, and this year, startups Halliday and Rokid both launched competing designs. All three products render their floating interfaces in pixelated green text, which is cool in its own retro kind of way.

But here was Zuck with a floating, semi-transparent, full-color display in his field of vision – entirely invisible to onlookers but crystal-clear to the wearer. He was not only browsing messages on Facebook Messenger, Instagram, and WhatsApp, but also getting on video calls, seeing the other person’s face while simultaneously sharing his POV with them. There was also Meta AI, of course, which theoretically gives you an always-available assistant for the use cases the limited app suite can’t cover. It would have been his iPhone moment, if not for some terrible Wi-Fi issues plaguing the Menlo Park demo.
A next-gen controller
Of course, a display needs a corresponding input device, and where other smart glasses rely on your voice and a few taps on their frames, Meta’s flagship model comes with a separate wristband that allegedly reads your mind.

Dubbed the “Neural Band,” it uses surface electromyography to sense the electrical impulses your brain sends to the muscles of your dominant hand, interpreting your discrete taps and swipes as commands. Critically, the band does not rely on computer vision to see what your hand is doing – your hand does not even need to be raised for the commands to register.
The band is an impressive controller on its own, but coupled with the glasses it allows for some truly fascinating scenarios. It’s accurate enough that you could have your hand hidden in your jacket pocket and be discreetly scrolling through your messages without anyone knowing. A future software update even promises a form of handwriting support that detects the letter shapes you trace out with your fingertip and autocorrects them into full messages. Imagine the Senate hearings!
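If you’re curious how this kind of gesture decoding might work under the hood, here’s a toy sketch. To be clear, this is purely illustrative – the channel count, sampling window, gesture set, and the synthetic “EMG” data are all my own assumptions, and Meta’s real decoder is surely a far more sophisticated machine-learned model. But the basic shape of the problem is the same: read multi-channel muscle signals, extract features over short windows, and map them to discrete commands.

```python
# Toy sketch of surface-EMG gesture decoding. NOT Meta's actual pipeline:
# synthetic signals, made-up channel layout, simplest-possible classifier.
import numpy as np

rng = np.random.default_rng(0)
CHANNELS, WINDOW = 8, 200          # assume 8 electrodes, ~100 ms windows
GESTURES = ["rest", "tap", "swipe"]

def synth_window(gesture: str) -> np.ndarray:
    """Fake EMG: baseline noise plus gesture-specific channel activation."""
    x = rng.normal(0.0, 1.0, (CHANNELS, WINDOW))
    if gesture == "tap":       # pretend thumb-side channels fire harder
        x[:3] *= 4.0
    elif gesture == "swipe":   # pretend pinky-side channels fire harder
        x[5:] *= 4.0
    return x

def features(x: np.ndarray) -> np.ndarray:
    """Per-channel root-mean-square amplitude, a classic EMG feature."""
    return np.sqrt((x ** 2).mean(axis=1))

# "Calibration": average the features over a few examples of each gesture.
centroids = {
    g: np.mean([features(synth_window(g)) for _ in range(20)], axis=0)
    for g in GESTURES
}

def classify(x: np.ndarray) -> str:
    """Nearest-centroid: pick the gesture whose profile is closest."""
    f = features(x)
    return min(GESTURES, key=lambda g: np.linalg.norm(f - centroids[g]))

# Sanity check on fresh synthetic windows.
for g in GESTURES:
    preds = [classify(synth_window(g)) for _ in range(50)]
    print(f"{g}: {sum(p == g for p in preds)}/50 classified correctly")
```

The nearest-centroid classifier here is just the simplest thing that could plausibly work; the point is that the band never needs to see your hand at all – only the electrical chatter on its way there.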
To be clear, the Display glasses are not standalone devices, so your phone still needs to be within Bluetooth range. But as software support matures, I can imagine a near future where my clunky old phone spends most of the day in my pocket or backpack. If the majority of your phone time is spent on messaging, this is technically already possible. There is one critical caveat though: for now, the supported apps are Meta-only, so there’s no Viber or Telegram. On the media consumption side, that means there’s also no YouTube or TikTok. Doomscrollers need not apply.
A view of the future

The Meta Ray-Ban Display glasses cost $800, are only available in the United States, and currently require an in-store visit to purchase. They come with Transitions lenses out of the box, but you can get them with prescription lenses for an additional fee. As they are both expensive and challenging to acquire – I had to get mine through the gray market – it is unlikely that we will see them rewriting the fabric of society anytime soon.
But as they become cheaper and more widely available, I think they’ll trigger the biggest fundamental shift in consumer technology since Steve Jobs’ seminal keynote all those years ago.
If you think about it, the form factor of modern mobile phones is unnatural and inconvenient. They’re slippery, inorganic rectangles that are hard to hold and don’t conform to our bodies at all. Through the years, our pockets, bags, and bedside tables have all adjusted to accommodate them and their associated accessories. Heck, even our rules of social etiquette have had to adjust for those sad phone addicts who spend dinnertime staring intently at the device in their lap as if it were an instruction manual on how to chew. We’ve learned to wrap our lives around the care and use of these phones because they entertain us and (ostensibly) allow us to be productive.
But a lightweight pair of smart glasses? They’re superior in so many ways. They protect your eyes from both harsh sunlight and the blue light of your monitors, they correct your vision if you need prescriptions, and they make you look good. And that’s before we even get to the vastly improved user experience, assuming the software support is up to par.
Imagine traversing a new city with walking directions displayed right in your POV, or talking to a foreigner and seeing real-time subtitles floating below their face. Or just being able to take videos with both hands free – incredibly useful when walking your dog, carrying groceries, or unboxing a sketchy Shopee package. (Don’t worry, young influencer, you can still take selfies by removing the glasses and pointing them at yourself!) All of these features already work on the Metas today, and I expect both the hardware and the software ecosystem to keep growing from here.
More than anything, I am looking forward to a world where people spend more of their time facing each other, instead of constantly looking down. It’s not just bad for your neck; it’s probably bad for your soul as well.

Granted, these glasses will likely enable a new social faux pas – one where you are never entirely sure whether the person you are with is actually listening to you or reading an ebook while facing in your general direction. There’s no outward sign that the glasses are displaying anything, so unless you’re watching for that subtle downward eye movement, it’s almost impossible to tell.
So, maybe it’s not exactly the social utopia that I was originally pitching. But hey, at least everyone’s posture will get better.
