
"Driving the news: OpenAI rolled out its new Sora iOS app last week, powered by the company's updated, second-generation video-creation model. The app is unique in that it allows users to upload photos of themselves and others to create AI-generated videos using their likenesses - but app users need the consent of anyone who will be shown in a video."
"The flip side: The number of reported impersonation scams has skyrocketed in the U.S. in recent years - and that's before AI tools have came into the picture. In 2024, Americans lost $2.95 billion to imposter scams where fraudsters pretended to be a known person or organization, according to the Federal Trade Commission. Between the lines: AI voice scams - which have a lower barrier to entry given how advanced the technology already is - have already taken off."
"Earlier this year, scammers impersonated the voices of Secretary of State Marco Rubio, White House chief of staff Susie Wiles, and other senior officials in calls to government workers. Last week, a mother in Buffalo, New York, said she received a scam call in which someone pretended to be holding her son hostage and used a likeness of his voice to prove he was there."
OpenAI launched Sora, an iOS app that uses a second-generation video-creation model to generate videos from uploaded photos of people. The app requires consent from anyone whose likeness appears and offers controls to specify permitted scenarios for generated characters. Generated videos are easy to share and can have watermarks removed, increasing potential misuse. Impersonation scams have surged, with Americans losing $2.95 billion in 2024 to imposter fraud. AI voice scams have emerged, including impersonations of senior officials and distressing hostage-style calls that use voice likenesses to convince targets.