60 Years After Gemini, Newly Processed Images Reveal Details
Posted 4 months ago · Active 4 months ago
arstechnica.com · Research · story · High profile
excited · positive
Debate: 20/100
Key topics
NASA
Gemini Space Program
Space Photography
Newly processed images from the Gemini space program reveal incredible detail 60 years after the missions, sparking discussion about the technology used and the historical significance of the program.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 3d
Peak period: 47 comments in 72-84h
Avg / period: 15.6
Comment distribution: 78 data points
Based on 78 loaded comments
Key moments
1. Story posted: Sep 13, 2025 at 8:43 AM EDT (4 months ago)
2. First comment: Sep 16, 2025 at 4:40 AM EDT (3d after posting)
3. Peak activity: 47 comments in 72-84h (hottest window of the conversation)
4. Latest activity: Sep 18, 2025 at 10:41 AM EDT (4 months ago)
ID: 45231614 · Type: story · Last synced: 11/20/2025, 6:45:47 PM
Project Gemini was NASA's second human spaceflight program (1961-1966), preceding Apollo. It developed spaceflight techniques such as orbital rendezvous and docking, essential for the Moon landing.
Gemini is also a lightweight internet protocol and associated ecosystem (the Gemini Protocol), designed as a middle ground between Gopher and the modern web (HTTP/HTTPS), emphasizing simplicity and privacy.
It is also the name of Google's multimodal AI model, successor to Bard (announced 2023).
This looks gorgeous. I'm extremely tempted to splurge on this and the Apollo books...
https://airandspace.si.edu/collection-objects/camera-hasselb...
Even on shittier cameras, like a Holga 120 leaking everywhere with a plastic lens, the results with medium format film are always surprising and give you a lot of leeway.
While this is true now, it took a surprisingly long time to get there. The dynamic range of professional medium format negative films is still respectable. Perhaps not so much in low light, but it's very immune to overexposure.
Also, you can buy a cheap medium-format camera in good condition and experience that "huge sensor" effect, but unfortunately there are no inexpensive 6x6 digital cameras.
Technically, sensors larger than 6x6 film have existed since at least the '80s or '90s, but they're typically only used for government work… Some digital aerial systems use huge sensors.
Can you say a little more about this? Modern lenses boast about 7 elements or aspherics, but does that actually matter in prime lenses? You can get an achromat with two lenses and an apochromat with three. There have definitely been some advances in glass since the space program, like fluorite versus BK7, but I'm wholly in the dark on the nuances.
Sony's "run of the mill" F2/28 can take stunning pictures, for example. F1.8/55ZA is still from another world, but that thing is made to be sharp from the get go.
The same thing is happening with corrective glasses too. My eye numbers aren't changing, but the lenses I get are much higher resolution than the set they replace, every time, so much so that I forget I'm wearing corrective glasses.
Even back in their prime, haha, Cooke leaned into its glass manufacturing by calling it the Cooke Look. All of the things that gave it that look are things modern lens makers would consider issues to correct.
It all boils down to what you want to achieve and what emotion you're trying to create with your photography. Film emulation has come a long way, but emulating glass isn't possible the same way (since you don't have any information about what happened to your photons on their way to your sensor), and lenses are an important part of the equation, and always will be, I think.
We all prefer different toolsets due to our differing needs and preferences. Understanding that removes a lot of misunderstanding, anger, and confusion from the environment.
But getting there requires experience, maturity, and some insight. Tool suitability is real (you can't drive a screw with a pair of pliers), but the dialogue can be improved a ton with a sprinkle of empathy and understanding.
Edit: this is just for prosumer-style cameras. If you look at phone-sized optics, that's a whole other ballgame.
Current methods use optical flow and gyroscopes to align images, but I imagine future methods will use AI to understand movement that optical flow doesn't handle well (e.g., where a specular reflection 'moves' across a wine glass).
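As a rough illustration of the flow-based alignment this comment mentions (not the article's actual pipeline), here is a minimal OpenCV sketch that warps one frame onto a reference frame using Farneback dense optical flow; the file names are placeholders.

```python
import cv2
import numpy as np

# Load a reference frame and the frame to be aligned (placeholder file names).
ref = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
mov = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Dense flow from ref to mov: ref(y, x) roughly matches mov(y + dy, x + dx).
flow = cv2.calcOpticalFlowFarneback(ref, mov, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# Pull each reference pixel from its matching location in the moving frame.
h, w = ref.shape
gx, gy = np.meshgrid(np.arange(w), np.arange(h))
map_x = (gx + flow[..., 0]).astype(np.float32)
map_y = (gy + flow[..., 1]).astype(np.float32)
aligned = cv2.remap(mov, map_x, map_y, cv2.INTER_LINEAR)

cv2.imwrite("frame_001_aligned.png", aligned)
```

Gyroscope data, where available, would typically serve as a coarse pre-alignment before a pixel-level refinement step like this.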
And I feel that these old analogue photos contain even more magic in the base material, digital reconstruction notwithstanding
1965 was two weeks ago?
I can only assume that the image on the left is a low resolution scan produced for this web article, and that there must be a much better scan somewhere else.
So what's improved is probably our digitization tools, and with some post-processing you can reveal a lot of detail.
My attempt: https://i.imgur.com/QZDDEB5.png
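For context, a quick sketch of the sort of post-processing the comment above alludes to (not the commenter's actual steps): local contrast enhancement with CLAHE followed by an unsharp mask, with a placeholder file name for the scan.

```python
import cv2

# Placeholder file name for a scanned frame.
img = cv2.imread("gemini_scan.png", cv2.IMREAD_GRAYSCALE)

# Contrast-limited adaptive histogram equalization lifts shadow detail
# without blowing out the highlights.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
boosted = clahe.apply(img)

# Unsharp mask: subtract a blurred copy to exaggerate fine edges.
blurred = cv2.GaussianBlur(boosted, (0, 0), 3)
sharpened = cv2.addWeighted(boosted, 1.5, blurred, -0.5, 0)

cv2.imwrite("gemini_scan_post.png", sharpened)
```

As the reply below points out, this only reshuffles the pixels already in the published image; it can't recover detail that never made it out of the original scan.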
Imagine:
1. Film -> Method 1 -> Photo #1
2. Film -> Method 2 -> Photo #2
Instead you tried:
3. Photo #1 -> Method 3 -> Photo #2
Which instead gives you a badly edited Photo #1. You don't have the source code, so to speak.
https://imgur.com/a/YC2iBHX
Looking at it "correctly" the man's image is obvious.
I enjoy image-art for this 'eye of the observer' opportunity.
Gemini would likely have been similar, save for using Cyclizine during re-entry rather than scopolamine/dextroamphetamine.
Detailed info about the Apollo medkit: http://heroicrelics.org/info/csm/apollo-medical-kit.html
Less-detailed history of NASA medkits, mentioning Gemini: https://www.spacesafetymagazine.com/spaceflight/space-medici...
https://www.imdb.com/title/tt8760684/
ISTR an astronaut saying that you didn't 'get into' a Mercury capsule, instead you 'put it on'.
https://tothemoon.im-ldi.com/
Presumably the stacking of 100s of frames would involve combining several seconds of footage (8.33 seconds would be 200 frames of film at 24 FPS) to generate a higher-resolution image. Success would depend on how much camera and subject movement occurred over that time, though image stabilisation should help somewhat with the former.
I suspect a clearer explanation of the process was omitted from the article, perhaps due to poor editing.
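As a toy illustration of why stacking helps, here is a minimal sketch under the assumption that the frames have already been registered to one another (the frame count and file names are made up): averaging many noisy frames cancels random grain, while detail shared across frames reinforces.

```python
import cv2
import numpy as np

# Assume ~200 frames (about 8.3 s at 24 fps) already registered to frame 0.
frames = [cv2.imread(f"aligned_{i:03d}.png", cv2.IMREAD_GRAYSCALE)
          for i in range(200)]

# Upsample before averaging so sub-pixel offsets between frames can
# contribute detail finer than a single frame's pixel grid.
upsampled = [cv2.resize(f, None, fx=2, fy=2,
                        interpolation=cv2.INTER_LANCZOS4).astype(np.float64)
             for f in frames]

stacked = np.mean(upsampled, axis=0)
cv2.imwrite("stacked_mean.png", stacked.astype(np.uint8))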
This almost always implies digital image processing, and depending on the goal / intent, various filters or masks may be applied. De-noising typically relies on including only visual elements appearing in most (but not all) frames of the stack, eliminating glitches such as satellite flares, meteors, terrestrial-based light sources, aircraft, sensor noise, or even radiation spots.
"Take Better Night Sky Photos with Image Stacking"
<https://photographylife.com/night-sky-image-stacking>
"Focus Stacking" (Wikipedia)
<https://en.wikipedia.org/wiki/Focus_stacking>
Again, from TFA the source was motion-film footage from fixed-position cameras which could be used to generate images with much greater resolution than any individual frame. Note that 16mm film grain is typically fairly large, but with stacking and post-processing, smaller details can be inferred.
For the 16mm movie cameras in this case, they probably selected frames from rigidly mounted cameras with little subject motion to get a good result out of it. Glenn strapped in tight in his tiny cockpit probably provided them with a decent number of frames they could stack without introducing much motion blur. In fact, you can see a bit of blurring at the end of one of the white straps center frame in that shot: https://cdn.arstechnica.net/wp-content/uploads/2025/09/03b-G...
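And a sketch of the rejection idea described a few comments up: a per-pixel median over an aligned stack keeps what most frames agree on and discards transient glitches (flares, dust, radiation hits). Again, the frame count and file names are placeholders, not details from the article.

```python
import cv2
import numpy as np

# Aligned frames stacked into one array of shape (num_frames, height, width).
frames = np.stack([
    cv2.imread(f"aligned_{i:03d}.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
    for i in range(100)
])

median_stack = np.median(frames, axis=0)  # robust to one-off outliers
mean_stack = np.mean(frames, axis=0)      # lower noise, but keeps glitches

cv2.imwrite("stacked_median.png", median_stack.astype(np.uint8))
```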
It's a pretty common technique in astrophotography. Here's a page that goes into some more detail in that context: https://clarkvision.com/articles/image-stacking-methods/
I'm still here! I was ten years old during most of the Gemini era. I remember this stuff. I haven't forgotten.
I found the headline image confusing - thought it was a Sontaran (Dr. Who baddie) in there! Aldrin's face and the earth reflection are quite confusing to the eye.
https://tothemoon.im-ldi.com/
All of the photographs from these missions are public domain and always have been.