Fake Obama created using AI video tool - BBC News

Researchers at the University of Washington have produced a photorealistic fake video of former US President Barack Obama.
Artificial intelligence was used to precisely model how Mr Obama moves his mouth when he speaks.
Their technique allows them to put any words into their synthetic Barack Obama’s mouth.
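The clip describes the approach only at a high level: audio drives a model of how Mr Obama's mouth moves, and the predicted mouth shapes are composited into real footage. As a rough illustration of that idea only (this is not the researchers' actual system; the model size, feature counts, and synthetic data below are all assumptions), a sequence model mapping per-frame audio features to mouth landmark positions might look like this:

```python
# Illustrative sketch only: map per-frame audio features (e.g. MFCCs) to
# 2-D mouth landmark coordinates with a small LSTM. A real system would be
# trained on hours of aligned audio/video and would need a separate
# texture-synthesis and compositing step to produce video.
import torch
import torch.nn as nn

class AudioToMouth(nn.Module):
    def __init__(self, n_audio_feats: int = 13, hidden: int = 128, n_landmarks: int = 18):
        super().__init__()
        self.lstm = nn.LSTM(n_audio_feats, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_landmarks * 2)  # (x, y) per mouth landmark

    def forward(self, audio_feats: torch.Tensor) -> torch.Tensor:
        # audio_feats: (batch, frames, n_audio_feats)
        h, _ = self.lstm(audio_feats)
        return self.head(h)  # (batch, frames, n_landmarks * 2)

model = AudioToMouth()
fake_audio = torch.randn(1, 60, 13)   # synthetic stand-in for 60 frames of audio features
mouth_shapes = model(fake_audio)
print(mouth_shapes.shape)             # torch.Size([1, 60, 36])
```

A model of this kind only predicts mouth geometry; turning that into the photorealistic footage shown in the clip additionally requires rendering the mouth texture and blending it into target video, which is outside this sketch.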



💬 Comments
As a film student, I think it's amazing. As a citizen, I am worried.

Author — Louis Moreno

Must have been equally easy to create Osama bin Laden videos.

Author — itsjustmyopinion

Just don't be evil. I see how this can get dangerous.

Author — Simisani Moyo

"Like all technology, it can be used for a negative purpose." What positive purpose could this technology possibly have?

Author — Ryan MacFarlane

He who figures out how to use this for porn shall become the first trillionaire :P

Author — The Awesome Man

It was only a matter of time before this happened, for both video and voice. It really will discredit footage, whether video or audio, as evidence once it can be faked easily enough. Worse yet, with all the fake news that's going around now, this could make things much worse, and the public have little trust in the system as it is.

Author — Paul Aiello

The development of tech has gone too far. This is scary stuff. Most importantly, this is a violation of privacy.

Author — Jan

The most worrying thing now is that anyone can simply create a video of you saying something against the government or an individual that will destroy your identity and lock you up in prison...

Author — Bitcoin Earner

Videoshop: putting words in my mouth since 2019

Author — S L

Or as the spies would say: in the future there will be software to fool people into thinking a real video is fake and a fake video is real.

Author — internet mail

It used to be the saying: "If I didn't see it with my own two eyes, I wouldn't have believed it." This will no longer apply. We are entering a time where the system (democracy) HAS to fall apart in order to be rebuilt into something more primitive again. Once you can control public opinion in an election, it is all over... period!

Author — Pils Nrimgaard

Crazy. I already knew this was happening, but to see it up front like this is scary.

Author — What did u just say?

Oh boy, nothing terrifying about this.

Author — Simon Carlile

"People will use it for negative reasons, so we should work to make sure that never happens." Really, has that ever worked? I honestly think it never has.

Author — Chris Moltisanti

How many years has the US intelligence community had this technology?

Author — John Miranda

Seems quite reckless to create the tool, suggest a solution for identifying videos made using the tool, but then not actually create the solution...

Author — Jamil Jami

You better reverse engineer that shit lol. Why is this even a thing? What's the point of creating it?

Author — Piyapol Tientanyakit

This should be illegal. I have no idea why people can do that!

Author — Nerdy Snailie

Identifying edited videos is simply not viable. A fake video can always be downsampled to mask manipulation artifacts as ordinary compression artifacts. Plus, the technology is already producing realistic reconstructions, and it didn't even exist just a few years ago. In five years, when the technology is unimaginably more advanced, we will just have to cope with the fact that all such information is potentially faked.

Author — Andy Barrette
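The downsampling point in the comment above can be made concrete: re-encoding a clip at a lower resolution and a harsher compression level buries fine pixel-level traces of editing under ordinary codec noise, which is one reason artifact-based detection is hard. A minimal sketch, assuming ffmpeg is installed on the system and using placeholder file names:

```python
# Minimal sketch of the re-encoding described in the comment above: rescale and
# recompress a clip so that subtle pixel-level edits blend into normal
# lossy-compression noise. Assumes ffmpeg is on PATH; file names are placeholders.
import subprocess

def downsample(src: str, dst: str, width: int = 640, crf: int = 32) -> None:
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", src,
            "-vf", f"scale={width}:-2",           # downscale, keep aspect ratio
            "-c:v", "libx264", "-crf", str(crf),  # aggressive lossy re-encode
            "-c:a", "copy",
            dst,
        ],
        check=True,
    )

downsample("edited_clip.mp4", "edited_clip_lowres.mp4")
```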

Technology is going beyond anything we could have imagined a few years back.

Author — Ashik Rahman