The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for deepfakes
Biden's AI advisor, Ben Buchanan, said a way to verify White House releases is "in the works."
This year, Biden was subject to an AI deepfake used to misinform voters.
"We recognize the potential for harm," Buchanan told BI.
The White House is increasingly aware that, in the new age of easy-to-use generative AI, the American public needs a way to confirm that statements from President Joe Biden and other official information are real.
People in the White House have been looking into AI and generative AI since Biden took office in 2021, but over the past year the use of generative AI has exploded following the release of OpenAI's ChatGPT. Big Tech players such as Meta, Google, and Microsoft, along with a range of startups, have raced to release consumer-friendly AI tools, leading to a new wave of deepfakes. Last month, an AI-generated robocall imitating Biden's voice attempted to discourage people from voting ahead of the 2024 presidential election.
On Thursday, the Federal Communications Commission declared such AI-generated robocalls illegal. Yet there is no end in sight to the spread of ever more sophisticated generative-AI tools that let people with little to no technical know-how create fake images, videos, and calls that seem authentic.
That's a problem for any government looking to be a trusted source of information. Ben Buchanan, Biden's Special Advisor for Artificial Intelligence, told Business Insider that the White House is working on a way to verify all of its official communications due to the rise in fake generative-AI content.
Buchanan said the aim is to "essentially cryptographically verify" everything from the White House, whether a statement or a video.
While last year's executive order on AI created an AI Safety Institute at the Department of Commerce tasked with creating standards for watermarking content to show provenance, the effort to verify White House communications is separate. And Buchanan said it's "a longer process," though it is "in the works."
Ultimately, the goal is to ensure that anyone who sees a video of Biden released by the White House can immediately tell it is authentic and unaltered by a third party.
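Buchanan didn't spell out a specific scheme, but the standard way to make a file verifiable is a digital signature: the publisher signs each release with a private key, and anyone holding the matching public key can confirm the file hasn't been altered. Below is a minimal sketch of that idea in Python, assuming the widely used `cryptography` package and a hypothetical file name; it illustrates the general technique, not the White House's actual implementation.

```python
# Illustrative sketch only: sign a released video file with Ed25519 and
# verify it with the corresponding public key. File names and key handling
# are hypothetical; the White House has not described its actual scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# The publisher generates a key pair once and distributes the public key
# through a trusted channel (for example, an official website).
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Sign the exact bytes of the release; any later edit changes the bytes
# and invalidates the signature.
with open("biden_address.mp4", "rb") as f:
    video_bytes = f.read()
signature = private_key.sign(video_bytes)

# A viewer (or their media player) re-checks the downloaded file.
try:
    public_key.verify(signature, video_bytes)
    print("Authentic: file matches the publisher's signature.")
except InvalidSignature:
    print("Warning: file was altered or was not signed by this key.")
```

In practice, the harder problems are distributing the public key through a channel viewers trust and building verification into the apps where people actually watch videos, which may be part of why Buchanan describes the effort as "a longer process."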
"This is a case where we recognize the potential for harm," Buchanan said. "We're trying to get ahead of it."