
Fake Klitschko: what speaks against a deepfake



As of: June 28, 2022, 10:08 a.m.

Berlin's Mayor Giffey believes she was taken in by an AI-generated image of Kyiv Mayor Vitali Klitschko. An analysis by the ARD magazine Kontraste raises doubts about this suspicion.

Only one thing is clear: Franziska Giffey did not speak with Kyiv Mayor Vitali Klitschko on Friday, as initially assumed. Berlin's mayor had already become suspicious during the video call, which was then broken off. But how the still-unknown caller managed to engage Giffey in a half-hour conversation remains a mystery.

The Senate Chancellery itself made the unusual incident public that same day. On Twitter, it offered a dystopian-sounding explanation: it speculated that the fake Klitschko was a so-called deepfake, that is, an image of the politician generated by artificial intelligence. An analysis by the ARD political magazine Kontraste now raises doubts that such technology was actually used.

The photos come from an old interview

According to the Senate Chancellery, there are no video recordings of the conversation. Kontraste, however, was able to examine five photos taken by the Senate Chancellery during the video call. According to their metadata, they were created between 4:59 p.m. and 5:15 p.m., provided the camera's clock was set correctly.
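A check like this reads the timestamp from the photo's EXIF metadata, where the DateTimeOriginal tag stores it as a "YYYY:MM:DD HH:MM:SS" string. A minimal sketch, assuming the camera clock was correct (the example timestamp below is hypothetical, not taken from the actual photos):

```python
from datetime import datetime

# EXIF's DateTimeOriginal tag stores timestamps as "YYYY:MM:DD HH:MM:SS".
def parse_exif_timestamp(value: str) -> datetime:
    return datetime.strptime(value, "%Y:%m:%d %H:%M:%S")

def within_call_window(value: str, start: datetime, end: datetime) -> bool:
    """Check whether a photo's EXIF timestamp falls inside the call window."""
    t = parse_exif_timestamp(value)
    return start <= t <= end

# The five screenshots reportedly fall between 4:59 p.m. and 5:15 p.m.
start = datetime(2022, 6, 24, 16, 59)
end = datetime(2022, 6, 24, 17, 15)
print(within_call_window("2022:06:24 17:05:12", start, end))  # True
```

As the article notes, the check only proves consistency, not authenticity: it depends entirely on the camera's clock having been set correctly.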

The shots aren't perfect: sometimes the screen is small, and the fake Klitschko is out of focus. The analysis nevertheless suggests that the footage was not generated by a computer but was instead copied from an interview Klitschko gave to Ukrainian journalist Dmitry Gordon in early April. Gordon released it on YouTube at the time.

No deepfake without a template

In fact, a deepfake also requires source material: at least a photo, better a video, ideally a large amount of video footage. Results generated in real time have so far not been very convincing; sooner or later, telltale image errors appear when the computer miscalculates.

From existing facial images, a computer learns how a person frowns, blinks and moves their lips. The AI then tries to simulate these learned facial expressions and generate new images of its own, matching, for example, new words put in the speaker's mouth. The result is thus not a copy of the source material but a new creation.

On the left, Vitali Klitschko as Franziska Giffey saw him; on the right, as he appeared while speaking with a Ukrainian journalist.

Image: Berlin Senate Chancellery, Dmitrij Gordon/YouTube

Not a simulation, but a copy

The problem with the Senate Chancellery's photos: they do not appear to show computer-simulated images at all. In fact, Kontraste found matches in the source footage for all five photos from Giffey's video call, all within roughly the first five minutes of the existing video. Not only is Klitschko's facial expression identical in each case, but so is the background, or the part of the background that Klitschko's head happens to cover. One image showing a silent Klitschko even appears twice; the forgers apparently reused the material.
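Kontraste has not described its matching method; comparing a screenshot against every frame of a source video is, however, commonly done with perceptual hashes, which stay stable under the compression and blur of a video call. A minimal pure-Python sketch using toy 4x4 grayscale thumbnails (all pixel values below are hypothetical illustrations, not data from the actual images):

```python
def average_hash(pixels):
    """Toy perceptual hash: 1 where a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Number of differing hash bits; a small distance suggests a match."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical 4x4 grayscale thumbnails (values 0-255): a screenshot from
# the call and two candidate frames from the interview video.
screenshot  = [[210,  40,  40, 210],
               [ 40, 210, 210,  40],
               [ 40, 210, 210,  40],
               [210,  40,  40, 210]]
same_frame  = [[205,  45,  38, 215],   # nearly identical frame
               [ 44, 208, 212,  36],
               [ 41, 206, 214,  42],
               [208,  39,  44, 206]]
other_frame = [[ 40, 210, 210,  40],   # clearly different frame
               [210,  40,  40, 210],
               [210,  40,  40, 210],
               [ 40, 210, 210,  40]]

h = average_hash(screenshot)
print(hamming(h, average_hash(same_frame)))    # 0: a match
print(hamming(h, average_hash(other_frame)))   # 16: no match
```

In practice one would hash every frame of the source video and keep, for each screenshot, the frame with the smallest distance; identical expressions and identical backgrounds, as Kontraste found, would then show up as near-zero distances.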

While the findings are not definitive proof that the fake Klitschko was no deepfake, they do suggest that pre-existing material was merely rearranged. It seems conceivable that the minor stutters and uneven frame rates common in video calls concealed small inconsistencies. The fake Klitschko would still be a fake, but a technologically less sophisticated "shallow fake."

The fake Klitschko spoke Russian

In any case, the result was convincing enough that Giffey spoke with the stranger for about 30 minutes, and, according to the Senate Chancellery, four other people present in Berlin did not initially notice anything either. It was the content alone that aroused skepticism: the supposed Klitschko demanded, for example, that German security authorities help transport young Ukrainian men back to Ukraine.

The real Klitschko, incidentally, also speaks German. At the fake Klitschko's request, however, the conversation was conducted in Russian, according to the Senate Chancellery, with an unseen person on the other end repeatedly translating afterwards. A language barrier apparently does not explain the initially successful deception: Giffey and another person in the room are said to understand Russian. The audio track must therefore also be an elaborate forgery.

Mayors of Vienna and Madrid also affected

Berlin's governing mayor is not the only one to have met a fake Klitschko in a video call in recent days. Judging by a photo, Vienna's mayor, Michael Ludwig, encountered the same forgers. Madrid's mayor, José Luis Martínez-Almeida, also had such an appointment.

It is still unclear who is behind all this. The Russian comedians Lexus and Vovan have drawn attention with similar stunts in the past. "Harry Potter" author J.K. Rowling was recently tricked in a video call by someone posing as Ukrainian President Volodymyr Zelenskyy. The duo used the opportunity for pro-Putin propaganda and posted an edited video online. Over the years, the two have also played pranks on various Western politicians.

Ebenezer Robbins