The Next Generation of Identity Theft is Identity Hijacking

In 2024, AI will run for President, the Senate, the House, and Governor in several states. Not as a named candidate, but by impersonating real candidates. What started as fake pictures of politicians has evolved into fake recordings of candidates' voices. Fake videos are not far behind.

In 2023 we saw fake pictures of former President Trump fighting police in front of a NYC courthouse. Those images were widely shared on social media, often stripped of the original note explaining that they were meant to show what AI-generated images can do. Now we have AI recreating President Biden's voice in a robocall campaign designed to trick people into not voting in the primaries.

We've gone from worrying about politicians lying to us to scammers lying about what politicians said... and backing up their lies with AI-generated fake "proof". Make no mistake about it, this is a scam. Not one meant to steal money, but to steal votes. The same technology has been used to recreate children's voices in fake abduction calls so fraudsters could extort ransoms for kidnappings that never happened. This is "identity hijacking", the next generation of identity theft, in which your digital likeness is recreated and taken where you never agreed to go.

Fake videos of politicians giving speeches that never happened, or falsely confessing to crimes, are not far behind. Don't think so? Ask yourself why the Screen Actors Guild went on strike. One key demand was that AI not be used to recreate actors, because the fakes are realistic enough that the actors would never need to be hired again. Social media will be the delivery system for many political scams, but as the latest robocalls showed, there are other ways to reach out and trick voters.

There are several efforts underway to combat this for those who want to check whether a call or video is genuine, but most have significant shortcomings. One promising area, and the one CTM is focused on, is continuously proving that the words you hear were actually spoken by the person they are attributed to. These word fingerprints break disinformation scams and identity hijacking.
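To make the idea concrete, here is a minimal sketch of one way continuous verification could work. It is an illustration under assumed details, not a description of CTM's actual method: the speaker's device signs short chunks of audio with a private key, and anyone holding the matching public key can check that every chunk really came from that speaker. The chunk size and the choice of Ed25519 signatures are assumptions made for the example.

```python
# Hypothetical sketch, not CTM's actual method: sign audio in short rolling
# chunks at capture time so a listener can verify that each chunk came from
# the claimed speaker's key. Replay/reordering protection is omitted here.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

CHUNK_BYTES = 16_000  # illustrative: ~0.5 s of 16 kHz, 16-bit mono audio


def sign_stream(audio: bytes, private_key: Ed25519PrivateKey):
    """Yield (chunk, signature) pairs for a raw audio byte stream."""
    for start in range(0, len(audio), CHUNK_BYTES):
        chunk = audio[start:start + CHUNK_BYTES]
        yield chunk, private_key.sign(chunk)


def verify_stream(signed_chunks, public_key) -> bool:
    """Return True only if every chunk verifies against the speaker's key."""
    for chunk, signature in signed_chunks:
        try:
            public_key.verify(signature, chunk)
        except InvalidSignature:
            return False
    return True


if __name__ == "__main__":
    speaker_key = Ed25519PrivateKey.generate()
    audio = b"\x00\x01" * 50_000  # stand-in for captured audio samples
    signed = list(sign_stream(audio, speaker_key))
    print(verify_stream(signed, speaker_key.public_key()))  # True

    # A cloned voice has no access to the speaker's private key, so a
    # substituted AI-generated chunk fails verification.
    forged = [(b"fake audio", sig) for _, sig in signed[:1]]
    print(verify_stream(forged, speaker_key.public_key()))  # False
```

The point of the sketch is the design choice, not the specific library: a voice clone can imitate how someone sounds, but without the speaker's private key it cannot produce valid signatures, so altered or synthetic audio fails the check.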

If we build the right defenses, AI may be able to run for office, but we can keep it from winning.

Portions of this blog were first used in an interview with TheStreet.com.