US regulators move to rein in political deep fakes before the 2024 election.
The Growing Concern Over Deep Fake Political Ads and the Role of Blockchain Technology
The rise of deep fake technology has raised concerns about its potential impact on political advertising, with the United States Federal Election Commission (FEC) voting unanimously on August 10 to advance a petition addressing the issue. Deep fakes are realistic fake videos or images, created with artificial intelligence (AI), that often depict individuals saying or doing things they never actually did.
The petition before the FEC aims to regulate deep fakes in political ads ahead of the 2024 elections, specifically targeting the use of AI to manipulate the public’s perception of political opponents. Robert Weissman, president of Public Citizen, the advocacy organization behind the petition, stressed the gravity of the situation, calling deep fakes a “significant threat to democracy.”
Instances have already been observed of candidates incorporating fake, AI-generated images into their campaigns to sway public opinion. For example, during the Republican nomination race, Florida Governor Ron DeSantis circulated three images showing former United States President Donald Trump embracing Dr. Anthony Fauci. These manipulated images highlight the need for regulations to prevent the spread of deceptive information.
In the FEC meeting, Public Citizen sought clarification on existing laws that aim to prevent fraudulent misrepresentation in political campaigns and whether AI-generated deep fakes are covered under these regulations. Lisa Gilbert, the executive vice president of Public Citizen, emphasized the urgency of addressing deep fakes and other deceptive uses of AI in election ads, stating that each passing day makes the regulation more crucial.
The FEC’s decision to push the petition forward is an encouraging sign that regulators are taking the threat posed by AI-generated deep fakes to democracy seriously. The next step in the process is a 60-day public comment period where stakeholders can express their concerns and insights. In the words of Craig Holman, a lobbyist for government affairs with Public Citizen, this period provides a critical forum for policy advocates, experts, and voters to discuss the potential deluge of deep fake ads in upcoming election cycles.
It is important to note that this is not the first time Public Citizen has raised concerns about deep fakes. Their initial petition, filed in July, emphasized that deep fakes have the potential to swing election results. This petition garnered support from members of both chambers of the U.S. Congress, bolstering the call for action.
The prevalence and potential impact of deep fake political ads underscore the need for innovative solutions to combat the spread of misinformation and preserve the integrity of democratic processes. One technology that has shown promise in this regard is blockchain.
Blockchain, the underlying technology behind cryptocurrencies like Bitcoin, is a decentralized and immutable ledger that records transactions in a transparent and tamper-evident manner. While its most notable application has been in the financial sector, blockchain has the potential to address the challenges posed by deep fakes in political advertising.
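The tamper-evidence described above comes from chaining cryptographic hashes: each record is hashed together with the hash of the record before it, so altering any earlier entry changes every later hash. A minimal sketch of that idea (an illustrative toy, not any particular blockchain's format):

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

# Build a tiny two-entry chain.
first = {"ad_id": "ad-001", "sponsor": "Campaign A"}
h1 = block_hash(first, "0" * 64)

second = {"ad_id": "ad-002", "sponsor": "Campaign B"}
h2 = block_hash(second, h1)

# Tampering with the first record produces a different hash,
# which would invalidate every block chained after it.
tampered = {"ad_id": "ad-001", "sponsor": "Campaign X"}
assert block_hash(tampered, "0" * 64) != h1
```

Because `h2` depends on `h1`, a change anywhere upstream is detectable by anyone holding the latest hash.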
By leveraging blockchain technology, political ads can be securely recorded, providing an immutable record of their origin and content. This would allow voters and regulators to verify the authenticity of political advertisements, mitigating the risks of deep fake manipulation. Candidates and political parties could register their ads on the blockchain, ensuring transparency and accountability in the political landscape.
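In practice, registration would likely mean storing a cryptographic fingerprint of the ad on-chain rather than the video itself. A hedged sketch of that workflow, using an in-memory dictionary as a hypothetical stand-in for the on-chain registry:

```python
import hashlib

# Hypothetical in-memory registry standing in for an on-chain record.
registry: dict[str, str] = {}

def register_ad(ad_id: str, content: bytes) -> str:
    """Record the SHA-256 fingerprint of an ad's content."""
    digest = hashlib.sha256(content).hexdigest()
    registry[ad_id] = digest
    return digest

def verify_ad(ad_id: str, content: bytes) -> bool:
    """Check a copy of the ad against its registered fingerprint."""
    return registry.get(ad_id) == hashlib.sha256(content).hexdigest()

original = b"official campaign video bytes"
register_ad("ad-001", original)

assert verify_ad("ad-001", original)          # authentic copy checks out
assert not verify_ad("ad-001", b"altered")    # modified content is flagged
```

A voter or regulator holding a copy of an ad could recompute its hash and compare it with the registered value; any deep-faked variant would fail the check.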
Moreover, smart contracts on blockchain platforms could encode predefined rules and criteria for compliance with advertising regulations. Heavy content analysis, such as deep fake detection, would likely run off-chain, but a contract could require sponsor disclosures, check registration metadata, and flag ads that fail the predefined criteria. This would reduce the burden on regulators and make the process of identifying deceptive ads more efficient.
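The rule-checking step might look like the following minimal sketch, where the required metadata fields are illustrative assumptions rather than any actual regulatory standard:

```python
# Hypothetical compliance rules a smart contract might encode:
# every ad must disclose its sponsor, its funder, and whether
# it contains AI-generated content.
REQUIRED_FIELDS = {"sponsor", "paid_for_by", "contains_ai_content"}

def check_compliance(ad_metadata: dict) -> list[str]:
    """Return the missing required fields; an empty list means the ad passes."""
    return sorted(REQUIRED_FIELDS - ad_metadata.keys())

compliant = {
    "sponsor": "Campaign A",
    "paid_for_by": "Campaign A PAC",
    "contains_ai_content": False,
}
noncompliant = {"sponsor": "Campaign B"}

assert check_compliance(compliant) == []
assert check_compliance(noncompliant) == ["contains_ai_content", "paid_for_by"]
```

Encoding such checks as executable rules means every submitted ad is screened the same way, with violations flagged automatically at registration time.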
However, it is important to note that while blockchain technology offers potential solutions, it is not a panacea for all the challenges associated with deep fakes. Implementing blockchain-based solutions would require collaboration among regulators, tech companies, and political organizations. Additionally, there may be trade-offs between privacy and transparency that need to be carefully considered when designing such systems.
In conclusion, the FEC’s move to address deep fakes in political ads is a crucial step in safeguarding the integrity of democratic processes. The potential use of blockchain technology to combat deep fakes is an exciting development, offering transparency, accountability, and verifiability in political advertising. As technology continues to evolve, it is essential for regulators, policymakers, and stakeholders to collaborate and explore innovative solutions to protect the democratic foundations of our societies.