Marco Rubio impersonator contacted officials using AI voice deepfakes – computer security experts explain what they are and how to avoid getting fooled

by Matthew Wright, Rochester Institute of Technology and Christopher Schwartz, Rochester Institute of Technology

Someone used artificial intelligence technology to produce a voice clone, or deepfake, of Secretary of State Marco Rubio and used it to leave voice messages for several officials, according to a report in The Washington Post on July 8, 2025. The report cited a State Department cable stating that the impostor “contacted at least five non-Department individuals, including three foreign ministers, a U.S. governor, and a U.S. member of Congress” using the messaging app Signal.

The impostor left voice messages for at least two of the officials and also used AI technology to mimic Rubio’s writing style in text messages, according to the report. The impersonator’s communications began in mid-June.

The FBI warned in a May 15 alert about an “ongoing malicious text and voice messaging campaign” in which “malicious actors have impersonated senior US officials.” The alert noted that the campaign includes “vishing” attacks. Vishing is a portmanteau of the words voice and phishing, and refers to using voice deepfakes to trick victims into giving information or money, or compromising their computer systems.

The ability to clone a person’s voice is increasingly within reach of anyone with a computer.

As cybersecurity researchers, we see that ongoing advances in deep-learning algorithms, audio editing and engineering, and synthetic voice generation have made it increasingly possible to convincingly simulate a person’s voice.

Even worse, chatbots like ChatGPT are capable of generating realistic scripts with adaptive real-time responses. By combining these technologies with voice generation, a deepfake goes from being a static recording to a live, lifelike avatar that can convincingly have a phone conversation.

Cloning a voice

Crafting a compelling, high-quality audio copy of someone’s voice used to require artistic and technical skills, powerful hardware and a substantial sample of the target voice. Unfortunately, that is no longer the case.

Today, a rapidly growing industry offers accessible services capable of generating moderate- to high-quality voice clones for a modest fee. Some of the most advanced tools now require just a minute or even a few seconds of voice data to produce synthetic audio that’s convincing enough to fool listeners, sometimes even the target speaker’s closest family and friends.

Researchers have been able to clone voices with as little as five seconds of recording.
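To give a concrete sense of how low the barrier is, here is a minimal sketch of few-shot voice cloning using the open-source Coqui TTS Python package and its XTTS v2 model. The file names and spoken text are placeholders for illustration; the point is simply that a short reference clip and a few lines of code are all that current open tools require.

```python
# Minimal sketch of few-shot voice cloning with the open-source Coqui TTS package
# (pip install TTS). File names and text are placeholders for illustration only.
from TTS.api import TTS

# Load a multilingual voice-cloning model (XTTS v2).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate speech in the voice captured by a short reference recording.
tts.tts_to_file(
    text="Hello, this is a demonstration of a cloned voice.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target speaker
    language="en",
    file_path="cloned_output.wav",
)
```

Tools like this are built for legitimate uses such as accessibility and dubbing, which is precisely why the same capability is so easy for scammers to repurpose.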

Protecting against scams and disinformation

British multinational design and engineering company Arup reported to police in Hong Kong in January 2024 that it had been scammed out of US$25 million by fraudsters who used “fake voices.” According to a survey in 2024 of more than 3,000 adults in the U.K., more than a quarter said they had been targeted by a voice deepfake scam in the previous 12 months.

We at the DeFake Project of the Rochester Institute of Technology, the University of Mississippi and Michigan State University, and other researchers are working hard to detect video and audio deepfakes and limit the harm they cause. There are also straightforward and everyday actions that you can take to protect yourself.
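Detection research generally looks for statistical artifacts that synthesis leaves behind in audio. The sketch below is not the DeFake Project’s method; it is a simplified illustration of the general approach, assuming you already have labeled examples of real and synthetic speech, and it uses the librosa and scikit-learn Python libraries.

```python
# Simplified illustration of audio-deepfake detection: extract spectral features from
# labeled real and synthetic clips, then train a classifier. This is NOT the DeFake
# Project's method; the file lists are placeholders for a dataset you would supply.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def extract_features(path: str) -> np.ndarray:
    """Summarize a clip as the mean and standard deviation of its MFCCs."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

real_files = ["real_01.wav", "real_02.wav"]   # placeholder paths
fake_files = ["fake_01.wav", "fake_02.wav"]   # placeholder paths

X = np.array([extract_features(p) for p in real_files + fake_files])
y = np.array([0] * len(real_files) + [1] * len(fake_files))  # 0 = real, 1 = synthetic

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

Real detectors rely on far larger datasets and deep neural networks, and even then attackers adapt, which is why the everyday precautions below still matter.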

Be mindful of unexpected calls, even from people you know well. This is not to say you need to schedule every call, but it helps to at least email or text message ahead. Also, do not rely on caller ID, since that can be faked, too. For example, if you receive a call from someone claiming to represent your bank, hang up and call the bank directly to confirm the call’s legitimacy. Be sure to use the number you have written down, saved in your contacts list or found on Google.

Additionally, be careful with your personal identifying information, like your Social Security number, home address, birth date, phone number, middle name and even the names of your children and pets. Scammers can use this information to impersonate you to banks, realtors and others, enriching themselves while bankrupting you or destroying your credit.

Here is another piece of advice: know yourself. Specifically, know your intellectual and emotional biases and vulnerabilities. This is good life advice in general, but it is key to protecting yourself from being manipulated. Scammers typically seek to suss out and then prey on your financial anxieties, your political attachments or other inclinations, whatever those may be.

This alertness is also a decent defense against disinformation using voice deepfakes. Deepfakes can be used to take advantage of your confirmation bias, or what you are inclined to believe about someone.

If you hear an important person, whether from your community or the government, saying something that either seems very uncharacteristic for them or confirms your worst suspicions of them, you would be wise to be wary.

This article was updated on July 8, 2025, to include news that someone used voice cloning technology to impersonate Rubio.

Matthew Wright, Professor and Chair of Cybersecurity, Rochester Institute of Technology and Christopher Schwartz, Research Scientist in Cybersecurity, Rochester Institute of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.