I wonder whether, with the advent of generative AI, we'll re-enter an era of privacy once images can no longer be trusted. Perhaps I'm being too hopeful.
There are some places that are extremely restrictive of public photography, like the UAE. It is comforting to know your picture can't be taken without permission. Of course, this doesn't stop their government from taking pictures, and that's a big problem when you have no way to provide your own video or photographic evidence without having violated the law.
That trust was indeed why photo hoaxes were such a staple of pop culture [0].
Even when photo manipulation was fairly easy to do, the basic assumption was that people wouldn't spend their time building a convincing lie.
We've already been in the "what's a photo?" era since the first Google Pixel, with extensive computational photography at the snap of a button and advanced editing tools even for those who won't bother opening Photoshop.
TBH I'm pretty relieved we've hit the stage where the first reaction is to wonder whether it's "real".
There necessarily exists a window of time between when the photo is taken and when it is certified, though. Short of tainting every photo (IRL watermarks somehow?) or every device (not that this _won't_ happen, but if the industry heavyweights behind DRM have only gotten it this far, then I don't have a lot of faith in some signing scheme being the silver bullet), every photo you would want to certify must at some point be untrusted.
Given that fact, the most you could prove is that some individual claims to be the original source, not that the image is worth a damn, right? If so, what exactly does a "crypto" solution add (we can already sign images and publish timestamped hashes)?
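To make the parenthetical concrete, here's a minimal sketch of what "sign an image and publish a timestamped hash" already looks like without any blockchain, using the widely used `cryptography` package (the image bytes and key handling are hypothetical placeholders, not anyone's actual protocol):

```python
import hashlib
import time

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical image bytes; a real workflow would read the file from disk.
image = b"...raw image bytes..."

# Publishable fingerprint: SHA-256 of the image plus a claimed timestamp.
digest = hashlib.sha256(image).hexdigest()
stamped = f"{digest} {int(time.time())}".encode()

# The claimed source signs the stamped digest with their private key.
key = Ed25519PrivateKey.generate()
signature = key.sign(stamped)

# Anyone holding the public key can check the signature; verify() raises
# InvalidSignature if the bytes or signature were tampered with.
key.public_key().verify(signature, stamped)
```

Note what this does and doesn't prove: verification succeeds only if the exact bytes and timestamp were signed by that key, i.e. "this person vouched for this file at this time" — nothing about whether the image depicts anything real.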
There are dedicated apps (for insurance companies etc.) that certify a photo by taking it straight from the camera feed and signing it.
That still allows shooting fake subjects: display the image on a high-resolution screen and shoot it through the app at low resolution, and you get a certified fake photo. Making the room too dark for a good ISO, or purposely shaking the phone, should be enough to degrade the details to the point where the app still accepts the shot.
But at least part of the pipeline is secured, in the narrow "this photo was not edited afterwards" sense.
Part of truth is socially mediated. No picture is the "truth", as you say; all cameras edit the photo. What matters is what the photo means, and that's always a social question. Crypto governance and consensus are one way we're exploring how to represent truth on the internet.
[0] <https://www.smithsonianmag.com/history/how-the-rise-of-the-c...>