This New Device Establishes Content Authenticity Using Any Digital Camera


A black device labeled "ATOM" with a red atom symbol and "H1" text on a blue background. The device has a textured surface and a glowing green button.

As a research physicist from MIT, Stuart Sevier learned a lot about reality, technology, and perhaps most importantly, the perception of reality. He veered off his hardcore academic track to pursue the concept of reality from a more engineering-based perspective, ultimately founding Atom Images and working with a talented team to build the Atom H1, a tool built for photographers to capture trusted, authentic images in a world where the line between real and fake is becoming blurrier by the day.

The jump from physics researcher to engineering hardware and software for photographers seems dramatic initially. However, after speaking to Sevier, the through-line between studying the nature of reality as a physicist and capturing it with a camera becomes clear.

After doing work on media independence and supporting educational content online, Sevier began to wonder, with the rise of generative AI, “if we can’t tell what’s real and who made it, [little else] matters.”

“In the pyramid of needs, the most basic thing is ‘Where did it come from? Who made it? Did a human make it?'” Sevier explains. “My physics engineering brain started to combine with my independent media brain and user experience brain, and so I thought about potential solutions.”

Sevier and Atom Images want to address this most basic and increasingly pressing issue: How do people know when something is real and whether a human created it?

A three-step graphic: "Take," with a camera capturing scenes; "Host," with images uploaded to ATOM RIP; and "Share," with a book cover showing the U.S. Capitol, illustrating the transition from capture to global sharing.

Why AI ‘Fake’ Detectors Don’t Work

One option that resonates with many is fake detection technology. Wouldn’t it be great if you could toss an AI-generated image into an app and have it say, “Fake”? These platforms exist, but they are exceptionally inconsistent, unreliable, and not improving at the same rate as generative AI technology.

“It was clear to me a year and a half ago that it was just going to be an arms race for detecting fake stuff, and that if we were really going to succeed, we needed to shift,” Sevier says. “Instead of trying to label and detect fake stuff, we needed to mark things as real from the beginning and then encode the original captures and who made them, then make a flexible format so it could be passed around and edited.”

If this sounds similar to how the Content Authenticity Initiative (CAI) discusses authentic versus fake content and the best ways to ensure authenticity across multimedia platforms, that’s not a coincidence. Atom Images is a member of the CAI.

“It’s a classic game of cat and mouse,” Sevier says of fake detection. “You know that at a basic level, all these generative AI platforms are denoising and rebuilding images. So they take images, add noise, and then build an image back out. So there are signatures associated with that, but they’re increasingly difficult to detect and people can learn to directly evade them.”

This is a “classic security problem,” the physicist and engineer explains. “As soon as someone learns you’re putting up a lock, they learn to break the lock.”

Even the best detection tools available today have poor detection rates, per Sevier. “They’re like 70, 80 percent for the best ones.”

“For the public to trust it, you need to be in the upper 90 percent,” he adds. And that figure covers only how reliably a system can label a fake image; it does not even account for false positives, which damage public trust.

“If you are sometimes labeling real images as AI images, it falls apart really, really fast, and the trust is lost,” Sevier adds.

In this case, Sevier took a photo and then used artificial intelligence to add a cake. If Atom Images’ technology relied exclusively on metadata, rather than a ground truth comparison, the image might not be flagged. Since Atom Images has access to the ground truth photo thanks to the H1, it is immediately apparent which pixels are different.
The blue frame shows where the crop was pulled from, while the red area shows which pixels are different.

Where the Atom Images H1 Fits Into Content Authenticity at Capture

While some camera makers are part of the CAI, too, and implementing C2PA standards and tools into select models, Sevier admits the adoption has been a bit slow. Further, “I don’t think you should have to go buy a new $5,000 camera to get some of these [content authenticity] solutions.”

That’s where the Atom Images H1 comes in. It is a $239 (eventually $299 after early bird pricing ends on December 2, 2024) device that connects to practically any digital camera — Sevier and Atom Images have tested 30 bodies across many manufacturers with no issues — and signs image files with a secure digital fingerprint.

Speaking of fingerprints, this concept underpins how the device works. Every single camera’s image sensor has a unique “fingerprint.”

Every image sensor, CMOS or CCD, has a unique noise pattern. This is not a matter of all Sony 61-megapixel sensors sharing one identifying pattern (that wouldn’t help the H1), but of every individual sensor in every camera being unique.

“What the [H1] is using are tiny little differences baked into the electronics during the manufacturing process,” Sevier explains. “So there’s a fixed pattern, high-frequency noise signature present in every sensor.”

A graphic listing four features: "Camera pairing," "Sensor fingerprinting," "Image authentication," and "Reality tether," each with a brief explanation and a corresponding icon.

During the H1’s initial calibration process, users snap a few images with a uniform light exposure, and the H1, using Atom Images’ patented technology, learns a specific camera’s unique noise pattern. Once this is done, it will never need to be repeated, and it will be possible to know if any image was captured using that specific camera.
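Atom Images’ exact calibration method is patented and not public, but the general idea it describes, estimating a sensor’s fixed-pattern noise from uniformly lit frames and later correlating a query image against it, can be sketched roughly as follows. Everything here (the box-blur high-pass filter, the frame counts, the correlation threshold) is an illustrative assumption, not Atom Images’ implementation:

```python
import numpy as np

def estimate_fingerprint(flat_frames):
    """Estimate a sensor's fixed-pattern noise from uniformly lit frames.

    Averaging many frames suppresses random shot noise, leaving the
    per-pixel pattern baked in at manufacture. A local mean is then
    subtracted so only the high-frequency residual remains.
    """
    frames = np.asarray(flat_frames, dtype=float)
    mean_frame = frames.mean(axis=0)
    # Simple 3x3 box blur as a low-pass filter (a stand-in for a real denoiser).
    padded = np.pad(mean_frame, 1, mode="edge")
    h, w = mean_frame.shape
    smooth = sum(
        padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)
    ) / 9.0
    return mean_frame - smooth

def correlation(a, b):
    """Normalized correlation between two residual patterns."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

A query photo from the same sensor would be run through the same residual extraction and correlated against the stored fingerprint; a high correlation suggests the same physical sensor captured it, while images from any other sensor correlate near zero.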

The Atom Images H1 can be put on top of a camera. However, as long as the connection remains secure, photographers can decide how to implement the device into their kit.

Hardware Meets Software

This hardware means little if there is no software to back it up. In Atom Images’ case, images are captured by the camera, authenticated, and signed by the H1, and are then available to be uploaded into a web platform.
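The article does not detail the H1’s signing scheme. As a minimal stdlib-only sketch of what “signing an image with a device key” involves, the following uses HMAC-SHA256 with a hypothetical shared secret; a real device would instead hold an asymmetric key in secure hardware and publish only the public half, as C2PA-style signing does:

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret for illustration only.
DEVICE_KEY = b"example-device-key"

def sign_capture(image_bytes: bytes, camera_id: str) -> dict:
    """Produce a verifiable record binding an image to a specific camera."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    payload = json.dumps(
        {"camera_id": camera_id, "sha256": digest}, sort_keys=True
    ).encode()
    signature = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"camera_id": camera_id, "sha256": digest, "signature": signature}

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """Check that the image still matches its signed capture record."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    payload = json.dumps(
        {"camera_id": record["camera_id"], "sha256": record["sha256"]},
        sort_keys=True,
    ).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"]) and digest == record["sha256"]
```

Any change to the image bytes after signing breaks verification, which is the property the web platform relies on when deciding whether a file is untouched.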

From there, users can compare an image against the database and learn a lot quickly. If the image hasn’t been edited or had its C2PA manifest stripped, it is straightforward to determine whether it is authentic. That’s a cakewalk.

However, once the image has been edited, the situation becomes more complicated. If a C2PA chain remains intact, it is not too challenging. In situations when that information isn’t available, the H1’s ground truth image becomes vital. When a photo no longer matches the ground truth image captured through the H1 itself, it is flagged. It becomes a matter of comparing an image to a real, verified photo.

If a photo has a green flag, it is either not edited enough to trip the system’s checks because it looks like the ground truth, or the edits are determined to be minimal.
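The comparison logic described above can be sketched as a per-pixel diff against the ground truth, with the fraction of changed pixels mapped to a flag. The tolerance and thresholds below are invented for illustration; Atom Images’ actual values and comparison method are not public:

```python
import numpy as np

def flag_edit(ground_truth, edited, pixel_tol=8, minor=0.01, major=0.10):
    """Flag an edited image by how far it departs from its ground truth.

    pixel_tol: per-pixel difference ignored as export/compression noise.
    minor/major: changed-pixel fractions separating green/amber/red
    (illustrative thresholds, not Atom Images' real ones).
    """
    diff = np.abs(ground_truth.astype(int) - edited.astype(int))
    changed = float(np.mean(diff > pixel_tol))
    if changed < minor:
        return "green"
    if changed < major:
        return "amber"
    return "red"
```

In this sketch, a lightly retouched image stays green, a look change touching a few percent of pixels goes amber, and a removed person or large pixel-level change goes red, mirroring the three-tier system Sevier describes.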

Three side-by-side images labeled Green, Yellow, and Red show increasing levels of editing: Green is slightly edited, Yellow has moderate changes, and Red has significant manipulation. Atom Images’ platform uses a flagging system that, instead of trying to label fake images, shows how images compare against a ground truth photo in the system.

For example, cropping an image doesn’t immediately flag it in the system, but if the user crops a person out of the frame, that matters. There’s a difference between cropping 10% off the side and losing background clutter and cropping 25% of the frame and taking people or subjects out of the shot.
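Evaluating a crop requires first locating it within the ground truth frame. A hypothetical sketch, using brute-force window matching on an unedited crop (a real system would use cross-correlation for speed and tolerate re-encoding noise):

```python
import numpy as np

def locate_crop(ground_truth, crop):
    """Find where an unedited crop sits inside the ground truth frame.

    Returns the (row, col) of the crop's top-left corner, or None if
    no exact match exists. Brute-force search, for illustration only.
    """
    gh, gw = ground_truth.shape
    ch, cw = crop.shape
    for r in range(gh - ch + 1):
        for c in range(gw - cw + 1):
            if np.array_equal(ground_truth[r:r + ch, c:c + cw], crop):
                return (r, c)
    return None

def fraction_cropped(ground_truth, crop):
    """Fraction of the original frame the crop discarded."""
    return 1.0 - crop.size / ground_truth.size
```

Once the crop is localized, the discarded region of the ground truth can be shown to the viewer, which is what lets them judge whether background clutter or an actual subject was removed.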

Granted, as Sevier notes, the H1 can only authenticate the ground truth captured by the photographer — the photographer inherently makes determinations about what to include and, just as importantly, omit from a photo at the time of capture. Sometimes what the photographer doesn’t include in the original photo matters a lot to the story, and no software will ever be able to address that concern.

This image has been cropped, which is flagged, and users can see what portion of the frame has been removed and decide if it matters to them.

However, the H1 and the accompanying software can compare image edits and variations against a signed verified ground truth photograph. This means that it will be evident when someone changes the look of an image (amber flag), removes a person, or changes something at the pixel level (red flag).

Sevier showed PetaPixel a demo of the development site, showing that the system labels areas where pixels have changed versus the ground truth image. The flagging system is in place to show people when something differs from the ground truth image in the system and explain how it has changed. It is ultimately up to individuals to determine how different is too different.

“If you have a ground truth version of a photograph, you can tell if something has been changed,” Sevier says. “But if you’ve never seen the photo before, you’re at a loss trying to make heads or tails of it.”

The Atom Images H1 creates the necessary ground truth image at the moment of capture with the camera itself, and the software backend gives users the tools to check pictures.

A collage of sample images: vintage trucks, a water tower, a fire station, wooden sculptures, a white van, a person smiling, a solar eclipse, a group of people indoors, and the U.S. Capitol building.

A Hardware Company With Software Company Aspirations

Naturally, the utility of such a platform relies heavily on hardware adoption. Sevier knows that until all cameras have C2PA technology built-in, it’s an uphill battle. As for the H1, it is much cheaper than a new camera.

However, since it is external hardware, it comes with necessary workflow effects.

“Since the beginning I’ve been obsessed with ensuring [the H1] can’t be a big burden on people. Adoption is already the biggest hurdle, and the more we can make it simple, the better,” he explains.

There is the initial calibration process, which is a minor friction point. Once that’s done, the H1 itself must be physically connected to the user’s camera so that images can be captured, processed, authenticated, signed, and saved to the SD card inserted into the H1.

The required physical connection is a non-starter for some. But the H1 can be put on top of the camera, slipped in a pocket, attached to a rig or cage, etc. However, it must be connected to work. A wireless version would be great, but the technology isn’t quite there yet.

Sevier is quick to note that the H1 will never slow down the camera. As long as the photographer is writing images to an internal memory card, the H1 won’t slow the camera down or prevent it from shooting. The H1 may take time to catch up, especially for people who shoot in big bursts, but the device isn’t going to interfere with someone capturing full-resolution RAW images at 120 frames per second on a Sony a9 III.

“We would hate for our technology to be the reason why you missed out on a once-in-a-lifetime photograph,” Sevier says, noting that the H1 does not get between the photographer and capturing images.

Although Sevier says he’ll be pleased if many photographers who care about content authenticity buy the H1 and use it, the ultimate goal is to demonstrate to camera companies that authentication matters, people demand it, and they should utilize the technology themselves.

Some companies like Leica, Sony, and Nikon are already doing it, but others, including some major smartphone makers, have hesitated. The H1 works with smartphones, by the way.

“If we’re a successful hardware company, great. I’m happy to do that. But I’m also very happy if what ends up happening is that we convince camera companies that they should put this into their camera lines and we can step back to become an authentication as a service company and a software platform for authentication and provenance,” Sevier says, noting that this latter scenario is his “hope.”

“We’ll see where it goes,” Sevier adds.

A woman is taking a photo outdoors with a DSLR camera. She is holding the camera up to her eye, and her face is partially visible. She appears focused on capturing the shot. The background is blurred, showing some greenery.

A New Weapon in the Fight for Authenticity

Many hurdles stand in the way of widespread content authenticity: hardware adoption, software adoption, content platforms integrating solutions, and much more. However, while these problems are daunting, the need for content authenticity is only increasing. Generative AI shows no signs of slowing down.

Sevier thinks part of why people are returning to film photography is not just for the experience but for its authenticity.

“I think there’s a huge number of people excited about film because they distrust digital photography, so I think the H1 can help get that energy back into serious digital photography. I think people will find it exciting because we’re bringing that ground truth back.”

A graphic showing four squares labeled A. Authenticate, B. Bind, C. Compare, and T. Trust. Icons include a heart, lock, photo with a heart, and a web form, illustrating a process from image authentication to trust.

Much like a film negative captures the exact light that hit the film at a precise moment, the H1 creates a digital negative — a permanent, immutable ground truth that this person used that camera to capture an exact image.

Getting that ground truth requires hardware, and making the authentic image useful involves software. Atom Images is trying to tackle both these problems, and while there is work to be done, Sevier is “excited” to see how the H1 may change how people use their cameras.

“I think it could be a renaissance for photography, because I think generative AI has kind of pushed authentication and security and provenance to the forefront of need. The H1 will address issues of authenticity, classic Photoshopping, and image theft. I’m excited to see where the H1 takes photography.”

“I think it’s a new paradigm in trust,” Sevier concludes.

Pricing and Availability

The Atom Images H1 is undergoing extensive beta and field testing now and is available to preorder for $239 in anodized black or brushed aluminum. After December 2nd, the price will increase to $299. Atom Images expects the first H1 units to arrive to customers in early 2025.


Image credits: Atom Images
