South Korea’s main opposition leader, Lee Jae-myung, initially mistook President Yoon Suk Yeol’s martial law announcement for an artificial intelligence-generated ‘deepfake’: a manipulated video that can be extremely hard to distinguish from reality. Deepfakes use machine learning algorithms to replace one person’s likeness with another’s in video and audio, often making it appear that someone said or did something they did not. The incident highlights growing concern about the potential misuse of AI technologies and the need for detection tools and legal regulation to address their consequences. Although the announcement in this case turned out to be real, it underscores the importance of understanding both the implications of deepfakes and the broader reality of misinformation in the digital age. Stay informed and verify sources to avoid falling for false information.