Roc Camera
Posted 2 months ago · Active 2 months ago
roc.camera · Tech · story · High profile
Sentiment: skeptical / mixed
Debate: 80/100
Key topics
Camera Technology
Authenticity Verification
Blockchain/zk Proofs
The Roc Camera is a DIY camera that uses ZK proofs to verify the authenticity of photos, but the community is divided on its effectiveness and practicality.
Snapshot generated from the HN discussion
Discussion Activity
Very active discussion
First comment: 1h
Peak period: 109 comments (0-6h)
Avg / period: 17.8
Comment distribution: 160 data points (based on 160 loaded comments)
Key moments
1. Story posted: Oct 23, 2025 at 10:54 PM EDT (2 months ago)
2. First comment: Oct 23, 2025 at 11:59 PM EDT (1h after posting)
3. Peak activity: 109 comments in 0-6h (hottest window of the conversation)
4. Latest activity: Oct 27, 2025 at 11:33 AM EDT (2 months ago)
ID: 45690251 · Type: story · Last synced: 11/22/2025, 11:00:32 PM
How do you stop someone from taking a picture of an AI picture? It will still come from the sensor.
But a fixture that combines a good enough screen with enough distance to make the photographed pixels imperceptible is likely only a medium hurdle for a motivated person.
You probably can't fully avoid it but adding more sensors (depth) will make such a fixture quite a bit more expensive.
Tariffs shouldn’t prevent buying stuff, you just have to, y’know, pay a tariff on import.
In this case, a Japanese made camera will incur a 15% tariff.
I just checked and you can still send goods between Japan and the US. There are still merchants selling the exact mentioned camera on eBay that will ship to the US.
Here is the exact camera mentioned, offered by a Japanese seller, that will ship to the United States: https://www.ebay.com/itm/317445808304
Can you source your claim that absolutely no courier is capable of shipping goods into the US? I can’t find anything using Google, or on any courier websites.
FedEx does have information about how to correctly fill out the forms for the purposes of tariffs, but does not mention that they will not accept shipments.
Many of the postal and courier systems that suspended service have since set up the systems they need, and are happily moving packages into the US, but it tends not to make the news.
https://en.wikipedia.org/wiki/Cottingley_Fairies
It was a real moment with objects that Bishop Berkeley could have kicked.
Interestingly, it wasn't the OCR that was the problem but the JBIG2 compression.
Scanning an image would be much easier to dupe, though - scanners are basically controlled perspective/lighting environments, so scanning an actual Polaroid vs. an AI-generated Polaroid printed on photo paper would be pretty indistinguishable, I think.
Adding something like LIDAR and somehow baking that data into the metadata could be fun.
On a side note, the best way to attack this particular camera is probably by attacking the software.
In old movies, going back to the 1930s and 40s, back-projection is usually seen when characters are driving in a car, and you can usually spot it. These days, not so much.
Seems like it.
> a photo scanner that makes the zero knowledge proofs
Presumably at some point the intention is to add other sensors to the camera e.g. for depth information.
[0]: https://authenticity.sony.net/camera/en-us/
[1]: https://petapixel.com/2023/10/26/leica-m11-p-review-as-authe...
https://spec.c2pa.org/specifications/specifications/2.2/inde...
* https://petapixel.com/2010/12/01/russian-software-firm-break...
* https://www.elcomsoft.com/presentations/Forging_Canon_Origin...
So I wouldn't automatically assume that a product like this would be better designed, but I would think there's a chance it might have been!
Other than that it's a 16MP Sony CMOS, I'd expect a pretty noisy picture...
It would be more interesting if the software were open source.
https://sfconservancy.org/blog/2021/mar/25/install-gplv2/
https://sfconservancy.org/blog/2021/jul/23/tivoization-and-t...
https://events19.linuxfoundation.org/wp-content/uploads/2017...
> What are the camera's specs?
> The camera has a 16MP resolution, 4656 x 3496 pixels. It uses a Sony IMX519 CMOS sensor.
Uses the standard MIPI/CSI interface, which is not authenticated or anything of the sort.
You can also buy HDMI-to-CSI adapters https://thepihut.com/products/hdmi-to-csi-adapter-for-raspbe... - should be easy enough to pipe your own video feed in as a substitute.
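A minimal sketch of why that matters, using the Picamera2 library: the capture stack just grabs frames from whatever enumerates on the CSI port, so it cannot distinguish a genuine sensor from a bridge replaying arbitrary video. (Whether a given HDMI bridge shows up to libcamera this way depends on the adapter and overlay; treat this as illustrative.)

```python
# Minimal capture sketch with the Picamera2 library: the software has no way to tell
# whether the "sensor" on the CSI port is a real image sensor or an HDMI-to-CSI bridge
# being fed arbitrary frames. (Assumes the bridge enumerates as a libcamera device.)
from picamera2 import Picamera2

picam2 = Picamera2()                                    # opens whatever camera libcamera found
picam2.configure(picam2.create_still_configuration())   # standard still-capture pipeline
picam2.start()
picam2.capture_file("frame.jpg")                        # captured frame, genuine sensor or not
picam2.stop()
```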
In this case you get the signature and it confirms the device and links to a tamper proof snapshot of the code used to build its firmware.
This attitude really rubs me the wrong way, especially on a site called Hacker News.
I think we absolutely should be supporting projects like this (if you think they're worth supporting), else all we're left with is giant corporation monoculture. Hardware startups are incredibly difficult, and by their nature new hardware products from small companies will always cost more than products produced by huge companies that have economies of scale and can afford billions of losses on new products.
So yes, I'm all for people taking risks with new hardware, and even if it doesn't have the most polished design, if it's doing something new and interesting I think it's kinda shitty to just dismiss it as looking like "a 3D printed toy with some cute software".
I don't mean to disregard the technical feat, but I question the intent.
I've been struggling to fight the urge to buy a "Kodak Charmera" for a month now; don't tempt me again!
If it was a hardware start-up, the camera would be $80 built with custom purpose made hardware.
Once you decide to launch a hardware product composed of completed consumer hardware products, you are already dead. All the margin is already accounted for.
I support it but I recognize it is a 3D printed toy with some cute software... toys can be interesting too. Not everything needs to be a startup.
It's just that even in the realm of hardware built by small teams on Pi boards, this is very overpriced, with poor construction and cheap components for what it is.
At the $400 it sells for, there are case options other than a cheap 3D print, and button choices other than the cheapest button on the market.
I wouldn't mind it being 3D printed if it weren't done with a layer height of about 0.28 mm, half transparent so it looks weird, and intended for outdoor use, where 3D prints are porous and water will seep through. The housing needs at the very least some spray paint and a clearcoat.
What I do mind is the cheapest off-the-shelf DIY button, lmao. They cost cents apiece; just add a fucking metal one that's a few cents more if you're selling a $400 camera, cheapass. I wouldn't be surprised if the software side, with its "proof", is as haphazard and brittle an implementation as the construction.
It’s possible that this could have value in journalism or law enforcement.
Just make it look the part. Make it black and put a decent lens on it.
I guess you could have a unique signing key per camera and blacklist known leaked keys.
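A rough sketch of what that could look like on the verification side, assuming Ed25519 signatures (via the Python cryptography package) and a published blacklist of leaked-key fingerprints; this is illustrative, not the vendor's actual scheme:

```python
# Hypothetical verifier: per-camera Ed25519 keys plus a published blacklist of
# fingerprints for keys known to have been extracted. None of this is the vendor's
# actual scheme; it just illustrates the "blacklist known leaked keys" idea.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

REVOKED_FINGERPRINTS: set[str] = {
    # SHA-256 fingerprints of compromised camera public keys (placeholder values)
    "9f2b6c...example...",
}

def verify_photo(photo: bytes, signature: bytes, camera_pubkey: bytes) -> bool:
    """True only if the signature verifies AND the camera key is not blacklisted."""
    if hashlib.sha256(camera_pubkey).hexdigest() in REVOKED_FINGERPRINTS:
        return False  # a leaked key can sign anything, so its signatures prove nothing
    try:
        ed25519.Ed25519PublicKey.from_public_bytes(camera_pubkey).verify(signature, photo)
        return True
    except InvalidSignature:
        return False
```

The obvious gap is the window between a key being extracted and someone noticing and adding it to the blacklist.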
They got cracked within a year or two. Not sure if they still offer the capability.
But I feel like the only way to accomplish fool-proof photos we can trust in a trustless way (i.e. without relying on e.g. the Press Association to vet) is to utterly PACK the hardware with sensors and tamper-proof attestation so the capture can’t be plausibly faked: multi-spectral (RGB + IR + UV) imaging, depth/LiDAR, stereo cameras, PRNU fingerprinting, IMU motion data, secure GPS with attested fix, a hardware clock and secure element for signing, ambient audio, lens telemetry, environmental sensors (temperature, barometer, humidity, light spectrum) — all wrapped in cryptographic proofs that bind these readings to the pixels.
In the meantime, however, I'd trust a 360° GoPro with some kind of signature of manufacture. Or just a LOT of people taking photos in a given vicinity. Hard to fake that.
Before long, it might be somewhat "easy" to prove anything.
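Picking up the "bind these readings to the pixels" idea a couple of comments up, a minimal sketch of how such a capture manifest might be assembled before signing (field names and structure are hypothetical):

```python
# Illustrative only: bind auxiliary sensor readings to the pixels by hashing everything
# into one manifest that the camera's secure element then signs. Field names are made up.
import hashlib
import json

def build_capture_manifest(image: bytes, readings: dict[str, bytes]) -> bytes:
    manifest = {
        "image_sha256": hashlib.sha256(image).hexdigest(),
        "sensors": {
            name: hashlib.sha256(blob).hexdigest()
            for name, blob in sorted(readings.items())   # depth map, IMU trace, GPS fix, ...
        },
    }
    # The secure element signs these bytes; swapping any reading afterwards breaks the signature.
    return json.dumps(manifest, sort_keys=True).encode()
```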
It's not feasible or desirable for our hardware devices to verify the information they record autonomously. A real solution to the problem of attribution in the age of AI must be based on reputation. People should be able to vouch for information in verifiable ways with consequences for being untrustworthy.
The problem is quality takes time, and therefore loses relevance.
We need a way to break people out of their own human nature and reward delayed gratification by teaching critical thinking skills and promoting thoughtfulness.
I sadly don't see an exciting technological solution here. If anything it's tweaks to the funding models that control the interests of businesses like Instagram, Reddit, etc.
Also, "truth" is clearly something that requires more resources. It is a lifelong endeavour of art/science/learning. You can certainly luck into it on occasion but most of us never will. And often something fictional can project truth better than evidence or analysis ever can. Almost everything turns into an abstraction.
One may "luck" into truth by being born in a poor neighborhood or by living in a warzone, and having eyes and a camera. Or by being rich and invited to a club and having a microphone.
Truth is everywhere, but capturing it is expensive. The tax on truth is the easy spread and generation of lies. The idea that the fictional can encapsulate truth is of course true, but it doesn't mean everything is better an abstraction. Losing a leg is more powerful as a reality than as an abstraction. Peddlers of falsehoods, then, only win when truth can be abstracted.
Moreover: People who read literature read it knowing it stands in for truth. People who watch TikTok believe it is true, and are disenchanted when shown otherwise. More power resides in a grain of truth than a mountain of falsehood; so any tool for proving veracity will always have an outsized value against tools for generating fakes.
The last redoubt of propagandists when faced with the threat of truth is to claim that no one cares anymore what's true. But that's false. In fact, that's when they begin to fool themselves. It's not that no one in China or Russia values the truth, for instance. It's just that they say what they're told to say, and don't believe a word of it.
We do not need "proof". We lived without it, and we'll live without it again.
I grew up before broadband - we survived without photographing every moment, too. It was actually kind of nice. Social media is the real fluke of our era, not image generation.
And hypothetically if these cryptographic "non-AI really super serious real" verification systems do become in vogue, what happens if quantum supremacy beats crypto? What then?
You don't even need to beat all of crypto. Just beat the signing algorithm. I'm sure it's going to happen all the time with such systems, then none of the data can be "trusted" anyway.
I'm stretching a bit here, but this feels like "NFTs for life's moments". Designed just to appease the haters.
You aren't going to need this stuff. Life will continue.
Crime scene photographs won't be evidence anymore. You photograph your flat (apartment) when you move in to prove that all the marks on the walls were already there and that won't be evidence anymore. The police mistreat you but your video of it won't be evidence either. etc
Why wouldn't they be?
Testimony is evidence.
If you’ve got a photo of a public figure, but it doesn’t match the records of where they were at that time, it’s now suspicious.
It's not enough that the photograph is signed and has metadata. Someone has to interpret that metadata to decide authentic versus not. One can have an "authentic" photo of a rear projection screen. It wouldn't be appropriate to have an "authentic" checkmark next to this photo if it claims to not be a photo of a rear projection screen. The context matters to authenticity.
Secondly, the existence of such "authentic" photos will be used to call all non-authenticated photos into doubt.
So it doesn't even really solve any problem, but creates new problems.
Attestation systems are not inherently in conflict with repurposability. If they let you install user firmware, then it simply won’t produce attestations linked to their signed builds, assuming you retain any of that functionality at all. If you want attestations to their key instead of yours, you just reinstall their signed OS, the HSM boot attests to whoever’s OS signature it finds using its unique hardware key, and everything works fine (even in a dual boot scenario).
What this does do is prevent you from altering their integrity-attested operating system to misrepresent that photos were taken by their operating system. You can, technically, mod it all you want — you just won’t have their signature on the attestation, because you had to sign it with some sort of key to boot it, and certainly that won’t be theirs.
They could even release their source code under BSD, GPL, or AGPL and it would make no difference to any of this; no open source license compels producing the crypto private keys you signed your build with, and any such argument for that applying to a license would be radioactive for it. Can you imagine trying to explain to your Legal team that you can’t extract a private key from an HSM to comply with the license? So it’s never going to happen: open source is about releasing code, not about letting you pass off your own work as someone else’s.
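To make the attestation mechanics above concrete, here is a rough sketch of the check a verifier might run; the structure and names are illustrative, not the vendor's actual format:

```python
# Illustrative boot-attestation check (structure and names are hypothetical, not the
# vendor's actual format). The hardware key signs (device_id, os_image_hash); a photo
# only counts as coming from stock firmware if that hash matches a vendor-signed release.
from dataclasses import dataclass
from typing import Callable

@dataclass
class BootAttestation:
    device_id: str
    os_image_hash: str   # hash of whatever OS actually booted, stock or modded
    signature: bytes     # produced by the device's unique hardware key

def attested_by_stock_firmware(
    att: BootAttestation,
    verify_hw_sig: Callable[[bytes, bytes], bool],   # checks the hardware-key signature
    vendor_release_hashes: set[str],                 # hashes of the vendor's signed builds
) -> bool:
    payload = f"{att.device_id}:{att.os_image_hash}".encode()
    if not verify_hw_sig(payload, att.signature):
        return False   # the attestation itself is forged or corrupted
    # A modded OS still boots and is still attested by the hardware key,
    # but its hash won't be in the vendor's release list, so it isn't "their" build.
    return att.os_image_hash in vendor_release_hashes
```

Either way the device boots and attests; a modded build just attests to a different hash, which is exactly the "you can mod it, you just can't pass it off as theirs" point.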
> must be based on reputation
But it is already. By example:
Is this vendor trusted in a court of law? Probably, I would imagine, it would stand up to the court’s inspection; given their motivations they no doubt have an excellent paper trail.
Are your personal attestations, those generated by your modded camera, trusted by a court of law? Well, that’s an interesting question: Did you create a fully reproducible build pipeline so that the court can inspect your customizations and decide whether to trust them? Did you keep record of your changes and the signatures of your build? Are you willing to provide your source code and build process to the court?
So, your desire for reputation is already satisfied, assuming that they allow OS modding. If they do not, that’s a voluntary-business decision, not a mandatory-technical one! There is nothing justifiable by cryptography or reputation in any theoretical plans that lock users out of repurposing their device.
Are there systems that do prevent photographing a display? Like accompanying the photo with an IR depth map?
This is actually one of the theoretical predictions from Eliezer Yudkowsky, who says that as information becomes less and less verifiable, we're going to need to re-enter a pre-information-era - where people will have to know and trust the sources of important information they encounter, in some cases needing to hear it first hand or in person.
Like, how is this any different than having each camera equipped with a vendor controlled key and then having it sign every photo?
If you can spoof the sensor enough to reuse the key, couldn't you spoof the sensor enough to fool a verifier into believing your false proof?
Nyquist–Shannon sampling theorem.
But if the Sony sensor also measures depth information, this attack vector will fall flat. Pun intended.
The only real solution I can think of is just to have multiple independent parties photograph the same event and use social trust. Luckily this solution is getting easier now that almost everyone is generally no further than 3 feet away from multiple cameras.
I was trying to take a picture of a gecko the other day, and it missed half of the event while the app was loading.
Both cameras still allow “staging” a scene and taking a shot of that. Both cameras will say that the scene was shot in the physical world, but that’s it.
I would argue that slide film is more “verifiable” in the ways that matter: easier to explain to laypeople how slide film works, and it’s them that you want to convince.
If I were a film or camera manufacturer, I would try to go for this angle in marketing.
I think the point of this movement toward cryptographically signing image sensors is so people can confidently prove images are real on the internet in a momentary click, without having to get hold of the physical original and hiring a forensic lab to analyze it.
That’s beside the broader point that OP made: it doesn’t matter since you can just point a verifiable camera at a staged scene (or reproduction of an AI image) and have an image of something that doesn’t represent reality. You can cryptographically sign, or have an original slide, of an image that is faked outside the camera.
It's an emerging field, and attack vectors like that are hurdles to be solved. You can make faking more difficult, for example, with a depth sensor.
Using cryptographically signed photos is not even new, most of the major camera manufacturers are offering it, or are working on offering it at this point. The reality is that even with things like sensor depth data proving that a scene is in 3 dimensions, you are still able to manipulate the actors in a scene, still able to selectively include or exclude elements, still able to pick the image that seems to show something that it doesn't, still able to editorialize in a text description of a scene etc.
The time-proven solution for this is to rely on institutional reputation. While every news source has had lapses, I am far more likely to trust the reality and neutrality of, say, the AP over Fox News regardless of the presence of a signature.
Disclosure: I used to freelance for the AP as a photojournalist.
Not trolling. Genuinely don’t understand.
https://www.amazon.com/Camera-Digital-Toddler-Christmas-Birt...
This is one attempt.
The light sensor must have a key built into the hardware at the factory, and that sensor must attest that it hasn't detected any tampering, that gets input into the final signature.
We must petition God to start signing photons, and the camera sensor must also incorporate the signature of every photon input to it, and verify each photon was signed by God's private key.
God isn't currently signing photons, but if he could be convinced to it would make this problem a lot easier so I'm sure he'll listen to reason soon.
The real issue that photographers grapple with, emotionally and financially, is that pictures have become so thoroughly commodified that nobody assigns them cultural value anymore. They are the thumbnail you see before the short video clip starts playing.
Nobody has ever walked past a photograph because they can't inspect its digital authenticity hash. This is especially funny to me because I used to struggle with the fact that people looking at your work don't know or care what kind of camera or process was involved. They don't know if I spent two hours zoomed in removing microscopic dust particles from the scanning process after a long hike to get a single shot at 5:30am, or if it was just the 32nd of 122 shots taken in a burst by someone holding up an iPad Pro Max at a U2 concert.
This all made me sad for a long time, but I ultimately came to terms with the fact that my own incentives were perverse; I was seeking the external gratification of getting likes just like everyone else. If you can get back to a place where you're taking photographs or making music or doing 5 minute daily synth drills for your own happiness with no expectation of external validity, you will be far happier taking that $399 and buying a Mamiya C330.
This video is about music, but it's also about everything worth doing for the right reasons. https://www.youtube.com/watch?v=NvQF4YIvxwE
But at the same time it's true that some vital public activities aren't rewarded by the system at the moment, e.g. quality journalism, family rearing, open source, etc. Often that's an issue of privatized costs and socialized rewards. Finding a way to correct for this is a really big deal.
But aren't you now feeding back to the system? Why would there need to be a financial reward and incentive for everything?
I do realize "contributing free value" is perceived by some as free value a third party can capture and financially profit from, which might be the reason for thinking about how to cycle some of that value back.
Tabloid press is fantastically profitable, but fake news over time will erode a great deal of social trust.
Closed source software might be individually advantageous but collectively holds back industrial progress. It's a similar reason to why patents were first introduced for physical goods.
And yes, people who voluntarily have no kids should have to pay significantly more in social contributions.
I don't know anyone who understands economics who would say this, unless you're talking about very specific meanings of 'value'. I'm not trying to be pedantic, I know what you mean, but these comments are not insightful or helpful.
The problem with the linked product is it’s basically DRM with a baked in encryption key. And we have seen time and time again that with enough effort, it’s always been possible to extract that key.
People "at large" absolutely don't care about AI slop, even if they point and say eww when it's discussed. Some people care, and some additional people pretend they care, but it just isn't a real issue that is driving behavior. Putting aside (for now) the idea of misinformation, slop is socially problematic when it puts artists out of work, but social media slop is just a new, sadder, form of entertainment that is generally not replacing the work of an artist. People have been warning about the downfall of society with each new mode of entertainment forever. Instagram or TikTok don't need to remove slop, and people won't care after they acclimate.
Misinformation and "trickery" is a real and horrific threat to society. It predates AI slop, but it's exponentially easier now. This camera, or something else with the same goal, could maybe provide some level of social or journalistic relief to that issue. The problem, of course, is that this assumes that we're OK with letting something be "real" only when someone can remember to bring a specialty camera. The ability of average citizens to film some injustice and share it globally with just their phone is a remarkably important social power we've unlocked, and would risk losing.
I fear your statement is impossible to deny when "Tung Tung Tung Sahur" trading cards and "Tralalero Tralala" T-shirts are a thing.
I think this is true. In general I think enough population of the market actually does not care about quality as long as it exceeds a certain limited threshold.
There's always been market for sub-par product. That's one of the features of the market I think. You can always find what is the cheapest, lowest quality offering you can sell at a profit.
I'd say we've already mostly lost that due to AI. We might gain it back if cryptographic camera signatures become commonplace (and aren't too easy to crack).
I fully agree, I just don't know how that could work.
I think GenAI will kill the internet as we know it. The smart thing is (and always has been) to be online less and build real connections to real people offline.
True.
> There is absolutely a market for social media that bans AI slop.
There’s a market for social media that bans slop, period. I don’t think it matters how it was made.
Also, that market may not be large. Yes, people prefer quality, but (how much) are they willing to pay for it?
That said in theory TPMs are proof against this: putting that to the test at scale, publicly, would be quite useful.
Everyone, me more than most, doesn’t want their picture taken, or to be in the background of other photos. When someone can take thousands of pictures an hour and upload them all to some social media site to be permanently stored… idk, it’s shifted from a way to capture a moment to feeling like you’re being surveilled.
A bit hyperbolic, but it’s the best way to describe what I’m feeling
I don't mind 4x5 so much because just taking the photo is so much effort that the associated ordeal of developing and scanning isn't out of proportion. But for 35mm and medium format, there's a hugely disproportionate investment of time and money for a small number of photos.
I don't deny that for a whole range of reasons, some people might take better or more meaningful photos using old cameras. Limitations can feed into the artistic process. I just think it's a bit silly to romanticize the cost and inconvenience of film, or to think that photos taken using film are somehow inherently more interesting or valuable.
In contrast, a 35mm camera is very convenient and you can expose an entire 30 frame roll of film in a few minutes. But getting high quality scans of all those frames requires either a lot of time or a lot of money. (Consumer flatbeds give poor results for 35mm, so your best bet is putting the negative on a light table and using a digital camera and macro lens. But that’s a physically fiddly process, the ‘scan’ needs manual spotting for dust, and if you’re shooting color negatives you also have to do some work to get the colors right.)
Back in the day, most users of 35mm cameras were satisfied with waiting a week to get a set of prints with absolutely no creative control over the printing process, but that’s not what most people want now.
That's it. That's the verification?
So what happens when I use a Raspberry Pi to attach a ZK proof to an AI- generated image?
318 more comments available on Hacker News