Glasses-Free 3D Using Webcam Head Tracking
Posted 3 months ago · Active 2 months ago · Source: assetstore.unity.com
Key topics: 3D Technology, Head Tracking, Webcam Applications
A Unity package uses webcam head tracking to create a glasses-free 3D effect, sparking discussion on its effectiveness and limitations.
Snapshot generated from the HN discussion
Discussion activity: very active. Peak period: 36 comments (Day 5). Average per period: 18. Based on 108 loaded comments.
Key moments
- Story posted: Oct 18, 2025 at 8:07 AM EDT (3 months ago)
- First comment: Oct 18, 2025 at 8:07 AM EDT (0s after posting)
- Peak activity: 36 comments in Day 5 (hottest window of the conversation)
- Latest activity: Oct 29, 2025 at 3:02 AM EDT (2 months ago)
ID: 45626685 · Type: story · Last synced: 11/20/2025, 5:27:03 PM
We built this because we wanted 3D experiences without needing a VR headset. The approach: use your webcam to track where you're looking, and adjust the 3D perspective to match.
Demo: https://portality.io/dragoncourtyard/ (Allow camera, move your head left/right)
It creates motion parallax - the same depth cue your brain uses when you look around a real object. Feels like looking through a window instead of at a flat screen.
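For reference, the standard way to turn a tracked head position into this window illusion is an off-axis (asymmetric-frustum) projection: re-derive the frustum every frame so the screen edges stay fixed in the virtual world. A minimal Python sketch of the math (the demo itself is Unity/WebGL, so all names and numbers here are illustrative):

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Frustum extents at the near plane for an eye at (ex, ey, ez),
    measured relative to the screen center; ez is the distance from
    the eye to the screen plane.  The four extents feed directly into
    a glFrustum-style projection matrix."""
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: project screen edges onto the near plane
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top
```

A centered eye gives the usual symmetric frustum; moving the head to the right skews the frustum to the left, which is exactly the parallax shift described above. The far plane and the rest of the projection matrix are unchanged from ordinary perspective rendering.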
Known limitations:
- Only works for one viewer at a time
- Needs decent lighting
- Currently WebGL only
We're still figuring out where this is genuinely useful vs just a novelty. Gaming seems promising, also exploring education and product visualization.
Happy to answer questions!
OK, took around 2 min to load for me, then it works.
First thing: there is no loading indicator and it takes too long to start, so I thought it was broken a few times before I realized I had to wait longer.
Second thing: although it was clearly tracking my head and moving the camera, it did not make me feel like I was looking through a window into a 3D scene behind the monitor.
These kinds of demos have been around before. I don't know why some work better than others.
some others:
https://discourse.threejs.org/t/parallax-effect-using-face-t... https://www.anxious-bored.com/blog/2018/2/25/theparallaxview...
And we will check out these links; appreciate you sharing them!
I'd say I'm not the only one who misses this technology in games, because a used New 3DS XL costs at least $200 on eBay right now, which is more than what I paid new.
I always thought 3D would combine really nicely with ray-traced graphics full of bright colors and reflections, similar to all those ray tracing demos with dozens of glossy marbles.
And then there was the time travel arcade game (also by Sega) that used a kind of Pepper's Ghost effect to give the appearance of 3D without glasses. That was in the early 90s.
I think the idea of 3D displays keeps resurfacing because there's always a chance that the tech has caught up to people's dreams, and VR displays sure have brought the latency down a lot but even the lightest headsets are still pretty uncomfortable after extended use. Maybe in another few generations... but it will still feel limiting until we have holodeck-style environments IMO.
Yes, I believe you are right that the tech is catching up with concepts that seemed futuristic in the past. For example, today's hardware supports much more than it could, say, 5-10 years ago.
Our hypothesis is that the current solutions out there still require the consumer to buy something, wear something, install something etc. - while we want to build something that becomes instantly accessible across billions of devices without any friction for the actual consumer.
Maybe VR doesn't need that many games because the small handful of good ones have so much depth and replay value. I guess I just talked myself into a $700 VR kit and possibly a $700 GPU upgrade, depending on whether or not my RTX 3060 is up to the task.
Fully agreed that if you want 100% full 6DOF immersion, go and pay hundreds or even thousands of dollars to wear a heavy, cumbersome headset. We're not disputing that or thinking of competing with that.
What we're saying is that there may be a much larger market consisting of people who are not ready to commit to pay so much money to wear something that will give them motion sickness after 10 minutes.
If you're developing a VR game, your market consists of the 50 million people around the world who own a VR headset. That's great. But since you already built the VR world in 3D, you could also open up the market to billions of people who want to play your game, but on their own devices.
Admittedly, it won't be the same experience, but it could be a "midpoint". Not everyone can afford, or is willing to pay for, a VR headset.
Presumably developers could have combined this with parallax head tracking for an even stronger effect when you move your head (or the console), but as far as I know no one did.
1. Each eye sees the object from a different angle.
2. Both eyes see the object from a different angle when the object is moved relative to your head.
The 3DS does only #1. TFA does only #2. Presumably if you did both, you could get a much stronger effect.
I think the New 3DS had the hardware to do both in theory, but it probably would have made development and backwards compatibility overly complicated!
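Combining the two cues is mostly bookkeeping: offset each eye from the tracked head position by half the interpupillary distance, then render each eye with its own off-axis frustum. A hedged sketch of the first step (assumes the head is upright and facing the screen, so both eyes sit on a screen-parallel line; values are illustrative):

```python
def eye_positions(head, ipd=0.063):
    """Split one tracked head position into two eye positions.
    ipd is the interpupillary distance in meters (~63 mm average).
    Assumes no head roll: the eye axis runs along screen-x."""
    hx, hy, hz = head
    half = ipd / 2.0
    return (hx - half, hy, hz), (hx + half, hy, hz)
```

Each returned position would then drive its own asymmetric projection, giving stereo disparity (cue #1) on top of head-tracked parallax (cue #2).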
What we're thinking is to enable this technology - as long as there is a camera and a screen - instantly accessible across billions of devices.
That means that the 3D effect would be applicable not only for games built for that specific console - but for any and all games that are already in a 3D environment.
Systems like trackir, which require dedicated hardware.
Also, TrackIR is just an IR webcam, IR LEDs, and a hat with reflectors. You can DIY the exact same setup easily with OpenTrack, but OpenTrack also has a neural-net webcam-only tracker which is, AFAIK, pretty much state of the art. At any rate, it works incredibly robustly.
Actually I have already used it to implement the same idea as the post, with the added feature of anaglyph (red/blue) glasses 3D. The way I did it, I put an entire lightfield into a texture and rendered it with a shader. Then I just piped the output of OpenTrack directly into the shader and Bob's your proverbial uncle. The latency isn't quite up to VR standard (the old term for this is "fishtank VR"), but it's still quite convincing if you don't move your head too fast.
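For the curious, the lightfield-in-a-texture trick reduces to mapping the tracked head angle to a cell in a grid of pre-rendered views packed into one atlas. A sketch of that lookup (the grid size and angle ranges are invented for illustration, not taken from the commenter's shader):

```python
def lightfield_cell(yaw, pitch, grid=(8, 8),
                    yaw_range=(-20.0, 20.0), pitch_range=(-10.0, 10.0)):
    """Map a head angle in degrees to a (column, row) cell in a grid
    of pre-rendered views stored in a single texture atlas."""
    def to_index(value, lo, hi, n):
        value = min(max(value, lo), hi)   # clamp to the captured range
        t = (value - lo) / (hi - lo)      # normalize to [0, 1]
        return min(int(t * n), n - 1)     # bucket into n cells
    return (to_index(yaw, yaw_range[0], yaw_range[1], grid[0]),
            to_index(pitch, pitch_range[0], pitch_range[1], grid[1]))
```

In a real shader the same mapping would select (or blend between) sub-rectangles of the atlas rather than return indices.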
8-year-old me, who instinctively tried to look behind the display's field of view during intense gaming sessions, would appreciate this feature very much. My belief is that if it shifted the POV to a lesser degree than in the demo, people generally wouldn't notice, but would still subconsciously register this as a more immersive experience.
I'm also glad that the web version doesn't try to cook my laptop - good work.
If you click "Menu" and then "Settings" you can play around with e.g. the sensitivity. Ideally we'd automatically optimize the calibration based on, for example, what device you are using, but that's something we would do a bit more long-term.
Appreciate it!
I can see this quite useful for educational demonstrations of physics situations and mechanical systems (complex gearing, etc.). Also maybe for product simulations/demonstrations in the design phase — take output from CAD files and make a nice little 3D demo.
Maybe have an "inertia(?)" setting that makes it keep turning when you move far enough off center, as if you were continuing to walk around it.
The single-viewer limitation seems obvious and fundamental, and maybe a bit problematic for the above use cases, such as when showing something to a small group of people. One key may be to take steps to ensure it robustly locks onto and follows only one face/set of eyes. It would be jarring to have it locking onto different faces as conditions or positions subtly change.
The inertia idea wouldn't be too difficult to implement, but its usefulness would probably depend on the application area.
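A minimal way to prototype that inertia setting (all constants here are illustrative, not from the demo): accelerate a spin velocity while the head is held past an off-center threshold, and damp it otherwise so the model settles:

```python
def inertia_step(angle, velocity, head_offset, dt,
                 threshold=0.3, accel=2.0, damping=0.9):
    """One integration step: keep the model turning while the head is
    held more than `threshold` off center, and let the spin decay
    toward zero otherwise (damping acts as friction)."""
    if abs(head_offset) > threshold:
        sign = 1.0 if head_offset > 0 else -1.0
        velocity += accel * (head_offset - sign * threshold) * dt
    velocity *= damping
    return angle + velocity * dt, velocity
```

Called once per frame with the normalized head offset, this gives the "keep walking around it" feel when you lean far to one side, and a natural slowdown when you recenter.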
Yep exactly. Usually it locks onto one person's face but it can also jump around, so there are still optimizations we can do there - but generally it's supposed to be for one person. If you compare to VR headsets, two people can't wear the same VR headset anyway!
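One robust pattern for that single-viewer lock (a sketch of a common approach, not the demo's actual logic): keep following the face nearest to last frame's lock, and only fall back to the largest detected face when the lock is clearly lost:

```python
def pick_face(faces, prev_center=None, max_jump=0.25):
    """faces: list of (cx, cy, area) in normalized image coordinates.
    Returns the face to track this frame, or None if none detected."""
    if not faces:
        return None
    if prev_center is not None:
        def dist(face):
            return ((face[0] - prev_center[0]) ** 2 +
                    (face[1] - prev_center[1]) ** 2) ** 0.5
        nearest = min(faces, key=dist)
        if dist(nearest) <= max_jump:   # still plausibly the same person
            return nearest
    return max(faces, key=lambda f: f[2])  # no lock: take the largest face
```

The `max_jump` threshold is what prevents the jarring behavior described above: a second face entering the frame can't steal the lock unless the tracked viewer actually disappears.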
Do you remember which other implementations you've seen that worked really well?
We had as many people come test as we could, and we found that 90% of them didn't get a sense of depth, likely because it lacked stereo-vision cues. It only worked for folks with some form of monocular vision, including myself, who were used to relying primarily on other cues like parallax.
That's cool, what were the benefits/drawbacks of that compared to other VR headsets would you say?
Good memories
I think the problem is in real life you have an enormous number of other visual cues that tell you that you're not really seeing something 3D - focus, stereoscopy (not for me though sadly), the fact that you know you're looking at the screen, inevitable lag from cameras, etc.
I can't view the videos because of their stupid cookie screen, but I wouldn't be too excited about this. The camera lag especially is probably impossible to solve.
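Lag can't be eliminated, but it can be partially masked by extrapolating the head pose forward by the measured pipeline delay. A minimal constant-velocity sketch (one axis, illustrative numbers); the trade-off is overshoot and amplified jitter, which is why trackers usually pair prediction with a smoothing filter such as the One Euro filter:

```python
def predict(prev_pos, pos, dt, latency):
    """Constant-velocity extrapolation: estimate where the head will
    be `latency` seconds from now, given two position samples taken
    `dt` seconds apart."""
    velocity = (pos - prev_pos) / dt
    return pos + velocity * latency
```

With a 30 fps webcam and roughly 50 ms of end-to-end delay, this shifts the rendered viewpoint ahead of the measured one by about a frame and a half of motion.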
Well worth taking a look at
I just tried this demo and it is cool, but nowhere as good.
https://www.youtube.com/watch?v=6trOg2IK2Zg
One of the big issues with that phone was that in order to do dynamic perspective, you're having to run a 3D render at 60fps constantly. That's a huge power hog, and prevents you from doing many of the power savings techniques you otherwise could on a normal phone -- shutting down the GPU, reduced refresh rate, heck, even RAM backed displays.
OK, that's a no then.
Would you be able to clarify?
But you're still behind the window.
I did see a demo of 3D without glasses on a full monitor that DID make it look like it was coming out of the screen at CES, it requires a $3000 monitor though: https://www.3dgamemarket.net/content/32-4k-glasses-free-3d-g...
Also the 3DS obviously.
However the result wasn’t that useful because we humans have 2 eyes.
The 3d effect is very compelling if you close/cover one eye.
But that becomes annoying quickly.
The best results I had were from smearing a little hair gel in the center of my left eyeglass lens. Then it felt like I was using two eyes, but really my right eye was seeing the pinball table clearly and fooling my brain.
https://www.optiqb.com/
but cannot seem to find the actual repo anywhere
https://youtu.be/4zZfsyHEcZA?si=BE2I991zEVxPEt9F&t=57
Seemed to have some different names depending on region (Looksley's Line Up, Tales in a Box: Hidden Shapes in Perspective). I recall it working very well at the time.
Update: just tried to open the site again now and it's gone; it leads to some kind of shop?
Update 2: oh, use the link in the comment for the demo: https://portality.io/dragoncourtyard/
[1] - https://www.youtube.com/watch?v=P07nIcczles (actually, this one was using a paper tracker because the face tracker had a big impact on fps)
You can do this with a Kinect for head tracking and people do that with homemade pinball machines using a TV as the table, and a Kinect to track your head, so it looks like the table is 3D down into the TV.
But I don't think it creates the full 3D effect with things looking like they are coming way out of the screen and like a tangible thing you can reach out and touch.
Related idea, but not the same, might be my iOS and Android app that uses your phone's AR for head tracking and then sends that data to your PC for smooth sim game head tracking. https://smoothtrack.app
source code at https://github.com/guyromm/Window
IMHO this should be useful for driving/flight sims, giving the player the ability to lean inside the vehicle, changing their viewpoint on the surroundings.