Meta Ray-Ban Display
Posted 4 months ago · Active 3 months ago
meta.com · Tech · story · High profile
skeptical · mixed
Debate
80/100
Key topics
AR Glasses
Meta
Wearable Technology
Meta released its Ray-Ban Display glasses with AI capabilities and a wristband controller, sparking debate among HN users about its practicality, privacy concerns, and potential impact on society.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 5m
Peak period: 129 comments (0-12h)
Avg / period: 26.7
Comment distribution: 160 data points
Based on 160 loaded comments
Key moments
1. Story posted: Sep 17, 2025 at 8:30 PM EDT (4 months ago)
2. First comment: Sep 17, 2025 at 8:35 PM EDT (5m after posting)
3. Peak activity: 129 comments in 0-12h (hottest window of the conversation)
4. Latest activity: Sep 23, 2025 at 2:20 AM EDT (3 months ago)
ID: 45283306 · Type: story · Last synced: 11/22/2025, 11:47:55 PM
reference: https://www.youtube.com/watch?v=hhZdWvnF3do
That's just like, your opinion, man.
Too often HN threads devolve into the same tired comparisons about LaserDiscs and Palm Pilots. The only precedent we have for a product like this failing is Vision Pro, and this is nothing like that. Your comment was jumping to a conclusion that I think many would disagree with.
The glasses seem pointless to me for now. I’m surprised he didn’t add a booty zoom in view. We thought of that idea way back in middle school. Seems like something he’d vibe with.
Why do I need to pay $800 for this? I already paid a grand to have a phone disrupt my every waking moment!
They account for 30% of the global market. They own key brands, license key premium names, and control key distributors like Sunglass Hut and LensCrafters.
Their cost to manufacture vs sale price shows a clear ability to price like a monopoly. As does their ability to box out competitors.
The $10 look-alikes are not identical. They generally use cheaper materials, aren't polarized or coated, etc.
Again, you are getting confused by branding vs monopoly. They sell luxury goods and can mark them at wild premiums, same as Hermès and Ferrari. None of them are monopolies. Very far from it.
No, it doesn't. It shows there exists demand for their products at that price point.
>As does their ability to box out competitors.
They have none. Anyone can go to various websites and order cheaper sunglasses that work just as well, or go to Costco and buy them for $25.
That's funny, because the ones sold on my street are $10 and they definitely have the Ray-Ban logo.
So unless you have a rare medical condition AND you're buying plastic lens glasses, I think you're worrying for nothing.
I had the idea of wearables to solve this, as many years ago I had the Myo gesture control armband. They were very early with this product too, and from what I had read, most of that team got acquired/absorbed into Magic Leap
At one point I was tracking a company researching beaming images straight on your eye. I think they were MS related, but not sure. After a while they stopped updating, so I guess that went nowhere? It seemed really promising.
Skip to around 53:00
And if you really get into them, maybe it's worth spending the time to learn chords instead, like for a court stenographer?
IMHO the tell on why there's a delay is the original comment expressing wonderment at Zuckerberg demonstrating 30 WPM.
i.e. it sucks.
It's nice technology, engineering, glad they had the courage, sure its useful for its purpose.
However, in practice, humans being humans, the odds I regularly put on a glove, to get 30 WPM, on my glasses computer...very low.
(also, looking back at the original comment...neural interface? wtf? It's not neural...)
I doubt it has enough accuracy for a virtual keyboard (since keyboards require precise absolute input and it measures relative motion); besides, most people aren't experienced with single-hand typing.
A bespoke gesture based shorthand would be optimal, but then users would need to spend months learning this new shorthand.
But (almost) everyone already has experience with handwriting, which is a single hand relative input method. It's the easiest option for people to quickly pick up and enjoy.
Though it's far from perfect; you can see he is struggling to trick his muscle memory into writing without a pen, and he needs to do it on a solid surface (I'm not sure if that's a technology limitation or a muscle memory limitation).
It can therefore translate it to a handwritten stroke and then do classical handwriting to text conversion.
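To make that relative-to-absolute step concrete: one simple way to turn a stream of relative motion estimates into something a handwriting recognizer can consume is to integrate the per-sample deltas into a trajectory. A minimal illustrative sketch with a synthetic delta stream (this is not Meta's actual pipeline):

```python
import numpy as np

# Synthetic stream of relative (dx, dy) motion estimates, one per frame,
# standing in for whatever the wristband decodes from sEMG while "writing".
deltas = np.array([
    [0.0, 1.0], [0.0, 1.0],    # pen moves up
    [0.5, 0.0], [0.5, 0.0],    # then right
    [0.0, -1.0], [0.0, -1.0],  # then back down
])

# Integrate relative deltas into an absolute stroke trajectory.
stroke = np.cumsum(deltas, axis=0)

# Normalize position and scale so the recognizer sees a consistent stroke.
stroke -= stroke.min(axis=0)
stroke /= max(stroke.max(), 1e-9)

# `stroke` is now an ordered list of points that a classical online
# handwriting-recognition model could classify into characters.
print(stroke)
```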
You've got to type with your shoulders if you want to avoid RSI!
Typing can also work, but handwriting is simply faster and easier to decode.
sEMG signals correlate with *muscle* activation. When your fingers move, the actuators are the muscles in your forearm, and the tendons relay the force on the joint. Placing the band higher up on the forearm would actually give you better signals, but a wrist placement is much more socially acceptable.
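For intuition on what "correlates with muscle activation" looks like in signal terms: a standard first step with sEMG is to rectify the signal and compute a short moving RMS, giving an envelope that rises when the muscle fires harder. A toy sketch on synthetic data (not Meta's processing chain):

```python
import numpy as np

fs = 1000                      # Hz, sample rate of one hypothetical sEMG channel
t = np.arange(0, 2.0, 1 / fs)

# Synthetic sEMG: broadband noise whose amplitude grows during a "finger flex".
activation = np.where((t > 0.8) & (t < 1.2), 1.0, 0.1)
emg = activation * np.random.default_rng(0).standard_normal(t.size)

# Rectify and take a 100 ms moving RMS to estimate the activation envelope.
win = int(0.1 * fs)
envelope = np.sqrt(np.convolve(emg**2, np.ones(win) / win, mode="same"))

print(f"resting RMS ~{envelope[:500].mean():.2f}, "
      f"flex RMS ~{envelope[900:1100].mean():.2f}")
```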
NVIDIA, obviously, and Meta are definitely on this list.
But Meta's business is clearly getting more and more sweet data from its users. How anyone can fail to see this as a surveillance tool for a vast amount of data is unbelievable to me.
maybe this is not something that you understand, especially if you're in the US, where it's common to move farther than the distance from Madrid to Budapest. But for a lot of people I know, like me, who live more than 1000 km from their childhood friends and 3/4 of their family, any innovation that helps us meet more often and do more things together is welcome.
forget the glasses. it's a step in a direction. there will be many more steps. if you have not already, I urge you to watch Mark's interview at the Acquired event, he talks about his vision there.
do they need money to make all of this happen? of course. you can be part of this as well just by buying META stock.
in the EU I do not need to log in with my facebook account anyway.
If it was open source with zero tracking, I might consider it. As it is, HARD PASS.
I, and many other people, have found targeted ads to be good. ;)
Facebook effectively achieved that vision and then blew it up in the name of money. It was a service that showed you the posts of your connected friends. Those days are long gone. They burned people's trust in them. If you want to connect with far away friends and family then FB is no longer the appropriate tool. It should have been.
Although I'll say this as someone who has moved far away on more than one occasion: keeping in touch with the people you left behind via digital means is no replacement for being with them, and it's a distraction from making new friends. People do the same stuff for the most part, and one person's new baby looks pretty much the same as another's. You really don't need to keep constant touch; you'll cover 20 years of important things in a 30 minute conversation down the pub when you go home.
It really is a shame that such a useful tool for staying connected with the real people in your life warped into the monster it is now. If Google's social media had caught on (and not inevitably gone the same way in the name of maximizing attention) it probably would have been close to optimal. I get that they aren't running a charity, but it's sad to think about how many benefits social media brought compared to the addiction machine it is now.
This isn't going anywhere.
I mean... have you ever used a phone?
Yes, mobile phones use touchscreens, and billions of people have smartphones, that is correct. Yes the audience of HN is far removed, not gonna argue that. Because that's not what we're talking about.
Grandparent very correctly points out that mobile phones haven't replaced traditional keyboards; in fact there are probably more keyboards being sold now than at any point in history. That's because phone touchscreens haven't replaced keyboards, they're just a new interface for a new device. 15 years later other devices are still using other interfaces, and the actual places where keyboards have been replaced are not that many. Only point-of-sale machines and cars come to mind as having replaced keyboards with touchscreens (and I'm being very generous; honestly I wouldn't even call those keyboards), and some car brands are even starting to walk it back.
It has replaced all of your keyboards every time you ever input text on your phone.
> why have mechanical keyboards become so damn popular and not "keyboards on screens?"
Keyboards in general have become more popular, as more and more people get computing devices. I'm willing to bet the increase of keyboards on screens is much greater than the increase of mechanical keyboards; there are far more smartphones than mechanical keyboards.
Both are popular. Literally 100s of millions of people use keyboards on screens, so I'm not even really sure what you think you are trying to say. The desktop computer market isn't likely to move away from mechanical keyboards, but the phone/tablet market did a long time ago. Something with more accuracy, or that can be used for touch typing, without taking up device real estate, could definitely blow up in that market.
It's a very similar hobby to collecting post-marks.
> those laser beam projected keyboards blown up
Because no serious company ever made one, and what existed was nowhere near usable beyond "Check out what another useless thing I got!"
Two, just try doing this now, moving your hand around like it's writing with a pen and see how it feels (without holding a pen). It's super uncomfortable and feels really weird and also looks really weird.
People are really sensitive to looking weird and feeling weird and especially being singled out for being weird or looking weird. Also, there is a huge subset of society now who will not buy anything made by Meta. I think this product is doomed to failure, honestly. Happy to eat my words when I see the subway filled with people wearing these dystopian specs.
So that means this is just adding 2 more gadgets, both of which I now need to wear?
Nah. Not happening.
Neat gestures though.
Yeah, I see where this is going. (And here I am wanting less gadgets.)
https://www.theverge.com/2023/8/3/23818462/meta-ray-ban-stor...
OTOH, for me the Quest killer app is Ace. I can practice pistol shooting any time I want, which keeps me using the headset every day. For the glasses, the killer app might be translation. Now, I couldn't say if that will 'translate' into widespread user retention, or — like Ace — only really keep a smaller community engaged (I don't think most users need translation services on a regular basis).
But hey, at least it's not all faked
Live demonstrations are tough - I wish Apple would go back to them.
Pitches can be spun, data is cherry picked. But the proof is always in the pudding.
This is embarrassing for sure, but from the ashes of this failure we find the resolve to make the next version better.
For an internal team sure absolutely, but for public-facing work, prerecorded is the way to go
Not doing it live would've been an embarrassment. I don't think the thought ever crossed anyone's mind, of course we'd do it live. Sure the machines were super customized, bare bones Windows installs stripped back to the minimum amount of software needed for just one demo, but at the end of the day it sure as hell was real software running up there on stage.
Their actual result was pretty bad, but, ya know, work in progress I guess.
Zuckerberg handling it reasonably well was nice.
(Though the tone at the end of "we'll go check out what he made later" sounded dismissive. The blame-free post-mortem will include each of the personnel involved in the failure, in a series of one-on-one MMA sparring rounds. "I'm up there, launching a milestone in a trillion-dollar strategic push, and you left me @#$*&^ my @#*$&^@#( like a #@&#^@! I'll show you post-mortem!")
I bet the device hardware is small/cheap and susceptible to interference
Edit0: i.e., without internet access the AI is unable to produce an answer other than some prerecorded ones, I guess
In the live showcase the presenter even mentions that the wifi must have been bad for the AI to repeat the answer
In the glasses there is just a client to the AI. Like there is no AI in your phone when you talk to ChatGPT; you are querying it, and it will not keep talking to you if you cut off the wifi
The prerecorded responses I speculated about would have been things like "i'm having some connectivity problems, I'm unable to chat at this time, I'll let you know when I'm back." - the same kind of prerecorded things your earbuds tell you when they're low on power.
I own a pair of Meta glasses, and the response when they don't have connectivity is "this function is not available at this time".
Edit0: and what are you even doing? Where do you think this is going?
Unless you think they've added some inference logic on the device to slightly re-state the last answer they got from the cloud, it's clear that the glasses were connected and receiving the same useless answer from the cloud.
* side note, but it can also sound like "pear" to me this second time
Mad props to the presenter for holding it together though.
I fully expect the AI to suck initially and then over many months of updates evolve to mostly annoying and only occasionally mildly useful.
However, the live stage demo failing isn't necessarily supporting evidence. Live stage demos involving Wifi are just hard because in addition to the normal device functionality they're demoing, they need to simultaneously compress and transmit a screen share of the final output back over wifi so the audience can see it. And they have to do all that in a highly challenging RF environment that's basically impossible to simulate in advance. Frankly, I'd be okay with them using a special headset that has a hard-wired data link for the stage demo.
So, the failure was apparently with the glasses Zuckerberg's wearing on stage not establishing a two-way video call while simultaneously streaming its own interface for the live stream and big screen. He said it worked dozens of times in rehearsal, and one notable difference was that for the real demo hundreds of other wifi devices were present in the room.
I have quite a bit of experience producing live keynote demos at large tech events, so I don't think I'm confused about this. As an aside, when we're being shown "Zuckerberg's POV" through the glasses, I believe that's actually something custom put together for demos, because the normal glasses don't even have a mode which shows the wearer's POV. Creating that view requires sending both the internal output of the glasses, which is the corner inset overlay, AND the full-screen output of the glasses' live camera, which are then composited together backstage to create the combined image we see representing what Zuckerberg sees. Sending all of that while establishing a two-way video call is a lot for a resource-constrained mobile device.
Maybe the tech wasn't quite fool proof and they tried to fake it and then the fake version messed up.
(I didn't have control over temperature settings.)
That's...interesting. You'd think they'd dial the temperature to 0 for you before the demo at least. Regardless, if the tech is good, I'd hope all the answers are at least decent and you could roll with it. If not....then maybe it needs to stay in R&D.
https://news.ycombinator.com/item?id=19567011
And a quora link (sorry):
https://www.quora.com/If-floating-point-addition-isnt-associ...
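The short version of what those links cover, as runnable code: floating-point addition is not associative, so evaluating the same sum in a different order (as parallel GPU reductions routinely do) can give slightly different results, which is one reason even nominally deterministic, temperature-0 outputs can vary between runs:

```python
# Grouping changes the result: floating-point addition is not associative.
a, b, c = 1e16, -1e16, 1.0
print((a + b) + c)  # 1.0
print(a + (b + c))  # 0.0 -- the 1.0 is absorbed into -1e16 before cancelling

# Summation order matters for longer sums as well.
import random
xs = [random.random() for _ in range(100_000)]
print(sum(xs) - sum(reversed(xs)))  # usually a tiny, nonzero difference
```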
Regina Dugan's F8 keynote 8 years ago
where they announced they were working on a 'haptic vocabulary' for a skin interface as well as noninvasive brain scanning technology
Zuck really has cracked this one.
To Downvoters:
Give credit where credit is due.
I think you are going to realize in a few years why tens of billions was poured into Reality Labs and Oculus.
Version 2 or 3 of these glasses is going to set Meta ahead of the rest (except at least Apple).
802 more comments available on Hacker News