
North Carolina Moves to Crack Down on AI Deepfake Videos: What You Need to Know

North Carolina lawmakers are considering new regulations to combat the spread of AI-generated deepfake videos. Learn what these proposed laws mean, why they matter, and how they could impact you.

Artificial intelligence has brought incredible advancements, but it’s also opened the door to new challenges—one of the most pressing being the rise of deepfake videos. In North Carolina, lawmakers are taking action to address this issue head-on, with new proposals aimed at regulating AI-generated content and protecting citizens from digital deception.

The Deepfake Dilemma: Why Regulation Matters

Imagine receiving a video of a loved one asking for help, only to discover it was a convincing fake. Or picture a political ad that appears authentic but is entirely fabricated. These scenarios are no longer science fiction. Deepfakes—AI-generated audio and video that mimic real people—are becoming increasingly sophisticated and harder to detect.

North Carolina’s House Bill 934 is a direct response to these concerns. The bill would make it a crime to distribute audio or video of someone “acting in a manner that the person did not actually speak or act.” In other words, if you share a deepfake that falsely represents someone, you could face legal consequences.

What’s in the Proposed Law?

House Bill 934 outlines clear penalties for those who create or distribute deepfakes. Violators could be charged with a Class 1 misdemeanor, which may result in jail time. Additionally, victims would have the right to sue for $1,000 in damages each time the deepfake is redistributed. This approach not only punishes offenders but also empowers victims to seek justice.

The bill is bipartisan, reflecting widespread concern about the impact of deepfakes on public trust, especially as elections approach. Lawmakers are particularly worried about the technology’s potential to deceive voters and influence political outcomes.

More Than One Bill: A Broader Push for AI Accountability

House Bill 934 isn’t the only proposal on the table. Another measure, House Bill 375, seeks to impose harsher penalties on those who use AI to generate explicit content, and would require political candidates to disclose when their ads are AI-generated within 90 days of an election. While this bill hasn’t yet come up for a vote, it signals a broader legislative effort to keep pace with rapidly evolving AI technology.

Real-World Impact: Scams and Misinformation

The urgency behind these bills is clear. North Carolina residents have already been scammed by deepfakes, including a recent incident in which a Pittsboro resident lost thousands of dollars to a scammer posing as her grandson. As deepfakes become more convincing, the risk of fraud and misinformation grows.

How You Can Stay Safe

While lawmakers work on new regulations, individuals can take steps to protect themselves:

  • Be skeptical of unexpected requests for money or sensitive information, even if they appear to come from someone you know.
  • Verify the source of videos and audio clips before sharing or acting on them.
  • Stay informed about the latest scams and digital safety tips.

Looking Ahead

Both House Bill 934 and House Bill 375 still need approval from the House, Senate, and Governor to become law. Previous attempts to regulate deepfakes in North Carolina have stalled, but growing public awareness and bipartisan support may help push these measures forward.
Key Takeaways:

  1. North Carolina is considering new laws to regulate AI-generated deepfake videos.
  2. House Bill 934 would criminalize the distribution of deceptive deepfakes and allow victims to sue for damages.
  3. Additional legislation targets AI-generated explicit content and political ads.
  4. Deepfakes pose real risks, including scams and election interference.
  5. Staying vigilant and verifying information can help protect you from digital deception.