Could you spot a fake? Some worry video could impact 2020 election as new form of ‘fake news’

[Video: Fake videos: The latest trend in ‘fake news’]

Raleigh, N.C. — In an era of "fake news," you will soon see videos that are fakes. Recent developments in artificial intelligence are making it possible for average people to alter video.

"We are seeing videos that are faked to a degree that would rival Hollywood," said North Carolina State University professor of computer science James Lester.

The videos are made with computer algorithms that study hours of online video, noting the unique movements of a person’s mouth.

"You’re seeing the ability of a system to create something that didn’t happen," said Lester. "That means you as an individual, when you watch the news, when you get a Facebook feed, when somebody sends you a video, it’s very difficult for you to know whether this is depicting an event that actually happened or this is depicting an event that somebody wants you to see."

[Video: What can be done to help consumers identify fake videos?]

One of the first and most notable videos of this kind was made by researchers at the University of Washington. They created a program that studied hours of online speeches by former President Barack Obama. This knowledge allowed the computer to create a model that could mimic Obama and create realistic videos of the former president speaking.

Lester, who has been studying artificial intelligence since the 1980s, said he was surprised at just how real the fabricated videos look.

"These generative algorithms which are really quite new, the kinds that are driving these developments are astonishing, so maybe three years these have existed. The likelihood that this will have very significant societal implications is very high."

[Video: Synthesizing Obama: Lip sync learned from audio]

You don’t have to be a computer scientist to make these kinds of videos anymore. An anonymous developer built a program called FakeApp, which became very popular when the latest version was posted on Reddit in January.

The app allows people with slightly above average computer knowledge and equipment to create "faceswap" videos where one face is laid on top of another. These are much more advanced than what you may be used to on social media apps like Snapchat. Much like the Obama video, the app uses artificial intelligence to study movements of the face and mouth to create accurate swaps.

The videos created with FakeApp are called "deepfakes" and often involve adult videos. Hobbyists have filled the internet with "deepfakes" where the faces of famous actors are inserted into pornographic films.

Reddit took note of the trend and banned a big chunk of the "deepfake" community.

[Video: Would you recognize this WRAL-TV weather forecast as a fake video?]

The 2020 election

Right now, it’s pretty easy to spot a fake video, but the technology is moving at such a rapid pace that many in the field are growing concerned about how these videos could affect the presidential election in two years.

"You would readily admit, I think, that these are not first-rate video. There are issues with lighting. There is flicker. You can tell that it’s synthetic. On the other hand, you can also tell that, just over the horizon, it’s about to not have these issues," Lester said.

The programs and apps could allow people to manipulate video of the candidates, getting them to say or do things they didn’t say or do.

"Politically, that’s a very dangerous area to be living in because it means that we can’t trust what we see," Lester said.

Jeff Ward, who considers himself a technologist, said the videos will require voters to pay closer attention during the campaign season.

"Developing a culture of skepticism is needed for however many years we are going to be dealing with this," he said.

[Video: Bad attempt: Check out this comical fake video]

What can be done?

Experts in the field are calling on tech companies like Facebook to take the lead in stopping fake videos from spreading.

"There are tools that we can use to try to determine whether a video is fake or not, whether it is the way the data is compressed before it is stored or the pixelization," Ward said.

He also said there are not many tools now available to those who may want to take legal action against someone using their image in a fake video.

"We may need to go back to the drawing board to figure out how these fit into the conventional legal categories," he said. "Which ones can be used to best deter this from happening, because the real problem is once it’s out there (and) a lot of the damage is done."
