Back in October, I had the opportunity to speak at QCon SF on combating AI-generated fake images, as part of a fantastic track that included Ryan Dahl (creator of Node.js) and Miško Hevery (creator of Angular). The talk explored my work with Starling Lab on a system to register images and share attestations.
The talk was recorded, and the slides and recording are available on qconsf.com to registered attendees. QCon will publish the video on April 24, 2024, but until then, below is a teaser and part of an interview with me. QCon also wrote an article summarizing the talk.
Because images and videos can now be produced with artificial intelligence, it’s increasingly difficult to determine which are real and which are fake. One approach has been to look for “tells” that reveal an image is AI-generated. However, as AI continues to improve, this approach creates an arms race between AI photo generation and AI detection, a race that AI detection is unlikely to win.
In this presentation, you’ll learn:
- How the traditional archival process can help
- How cryptographic hashes can prevent tampering
- Why a timestamping service regularly printed data in the New York Times classified section (!)
- How digital signatures can help investigators “go to the source” even if the data is sent by an untrusted third party
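The newspaper-timestamping idea in the list above is simpler than it sounds, and a minimal Python sketch can show the core trick (the function names and sample data here are my own, not from the talk): publish only a document's SHA-256 digest in a widely witnessed place, such as a classified ad, and you have committed to the document's contents as of the print date without revealing them. Anyone holding the document can later recompute the hash and check it against the printed copy.

```python
import hashlib


def digest_for_publication(document: bytes) -> str:
    """Return the SHA-256 digest that would be printed publicly.

    Publishing the digest commits to the document's exact contents
    without revealing them; even a one-byte change produces a
    completely different digest.
    """
    return hashlib.sha256(document).hexdigest()


def verify_timestamp_claim(document: bytes, published_digest: str) -> bool:
    """Recompute the hash and compare it to the printed digest."""
    return hashlib.sha256(document).hexdigest() == published_digest


# Hypothetical field notes a journalist wants to prove existed today.
report = b"Field notes, 2023-10-02: convoy observed at the border crossing."
printed = digest_for_publication(report)

assert verify_timestamp_claim(report, printed)            # original document checks out
assert not verify_timestamp_claim(report + b"!", printed) # any edit breaks the claim
```

Note that the digest proves the document existed *by* the print date, not who wrote it; authorship is where digital signatures come in.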
What’s the focus of your work these days?
Now that the major hype around blockchains seems to be over, and all the trendy people have moved on to AI, I think there’s an opportunity to take some parts of blockchain technology – reliable, cheap, and useful things like cryptographic hashes, digital signatures, and timestamping services – and apply them to everyday life. In many cases, these cryptographic tools are already used under the hood, without the general public being aware of it. For instance, some credit cards use digital signatures to prove that the credit card chip was present at the terminal. And in software, digital signatures and hashes guarantee that code hasn’t been tampered with during download. These tools are very low-cost and can be used today. But many problems in everyday life would greatly benefit from cryptography and aren’t yet using it. Bridging the gap between cryptography and everyday business problems has been the focus of my most recent work.
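The download-integrity check mentioned above fits in a few lines of Python. This is a hedged sketch, not any particular project's tooling: the artifact bytes and checksum below are hypothetical stand-ins for a release file and the SHA-256 value a project might publish on its downloads page.

```python
import hashlib
import hmac


def verify_download(data: bytes, published_sha256: str) -> bool:
    """Check a downloaded artifact against its published checksum.

    hmac.compare_digest does a constant-time comparison; for a public
    checksum this hardly matters, but it is a good habit when
    comparing any cryptographic values.
    """
    actual = hashlib.sha256(data).hexdigest()
    return hmac.compare_digest(actual, published_sha256)


# Hypothetical release artifact and the checksum published alongside it.
artifact = b"pretend these are the bytes of tool-1.2.3.tar.gz"
checksum = hashlib.sha256(artifact).hexdigest()

assert verify_download(artifact, checksum)                 # intact download
assert not verify_download(artifact + b"\x00", checksum)   # corrupted in transit
```

A signed release goes one step further: the publisher signs the checksum, so you can also verify *who* produced the artifact, not just that it arrived intact.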
What’s the motivation for your talk at QCon San Francisco 2023?
One of my most recent clients, Starling Lab, which was co-founded by the Stanford Department of Electrical Engineering and the USC Shoah Foundation, has been working on the problem of image verification: how do we know which images are real, and which are fake?
Image verification has become an increasingly important societal problem due to the rise of AI and citizen journalism. Starling Lab uses an archival approach: rather than looking for the “tells” of AI in an image, the Lab encourages journalists, news organizations, and open-source investigators to record information about images that are believed to be valid. For instance, a photographer in the field might use their cell phone to digitally sign and timestamp a photograph that they just took. The motivation for my talk was to share both the fascinating problem of image verification and the cutting-edge technology that we can use to combat fake images.
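To make the sign-and-timestamp workflow concrete, here is a minimal sketch of what an attestation record for a photo might contain. This is my own illustration, not Starling Lab's actual format. One big caveat: Python's standard library has no asymmetric signatures, so HMAC-SHA256 stands in for a real digital signature here; a real system would use an asymmetric scheme (such as Ed25519) so that verifiers need only the photographer's public key, never the secret.

```python
import hashlib
import hmac
import json
import time


def attest_photo(photo: bytes, signing_key: bytes) -> dict:
    """Build an attestation: the image's hash plus a capture timestamp.

    HMAC stands in for a true digital signature in this sketch; with an
    asymmetric scheme, anyone could verify using a public key.
    """
    record = {
        "sha256": hashlib.sha256(photo).hexdigest(),
        "captured_at": int(time.time()),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return record


def verify_attestation(photo: bytes, record: dict, signing_key: bytes) -> bool:
    """Re-derive the signature and check that the image hash matches."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, record["signature"])
            and record["sha256"] == hashlib.sha256(photo).hexdigest())


key = b"demo-secret-key"                    # illustrative only
photo = b"\x89PNG...pretend image bytes"    # illustrative only
rec = attest_photo(photo, key)

assert verify_attestation(photo, rec, key)               # genuine photo verifies
assert not verify_attestation(b"swapped image", rec, key) # substituted image fails
```

Because the record is created at capture time, a later investigator can check that the image they received matches the one the photographer originally attested to, even if the file passed through untrusted hands along the way.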