Deepfakes 2.0 - How Neural Networks are Changing our World
AI/Machine Learning • October 2020
Language
English
Level
Beginner
Length
37 minutes
Type
online conference
About

Imagine looking into a mirror, but not to see your own face. Instead, you are looking into the eyes of Barack Obama or Angela Merkel. Your facial expressions are seamlessly transferred to that other person's face in real time.

The TNG Hardware Hacking Team has built a Deepfake-based prototype that transfers faces from one person to another in real time. Neural networks detect faces within a video input, translate them, and integrate them back into the video output. This technique makes it possible to project deceptively real imitations onto other people.
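The per-frame pipeline described above can be sketched roughly as follows. This is a hypothetical, heavily simplified illustration, not TNG's actual implementation: the function bodies are stand-ins (a real system would run a trained face detector and an encoder/decoder network), and frames are modeled as plain 2D lists of pixel values.

```python
# Hypothetical sketch of the per-frame Deepfake pipeline:
# detect faces -> translate (swap) them -> integrate them back into the output.

def detect_faces(frame):
    """Stand-in for a neural face detector returning bounding boxes (x, y, w, h)."""
    # A real system would run a trained detector here; this stub simply
    # claims that a fixed 2x2 region of the frame contains a face.
    return [(1, 1, 2, 2)]

def translate_face(crop):
    """Stand-in for the network that maps the source face to the target face."""
    # A real system would encode the crop with a shared encoder and decode it
    # with the target person's decoder; here we just invert the pixel values.
    return [[255 - px for px in row] for row in crop]

def swap_frame(frame):
    """Run detect -> translate -> integrate on a single video frame."""
    out = [row[:] for row in frame]  # copy so the input frame stays untouched
    for (x, y, w, h) in detect_faces(frame):
        crop = [row[x:x + w] for row in frame[y:y + h]]  # cut out the face
        swapped = translate_face(crop)                   # swap the identity
        for dy, row in enumerate(swapped):               # paste back into output
            out[y + dy][x:x + w] = row
    return out

frame = [[0] * 4 for _ in range(4)]
result = swap_frame(frame)
```

In a live setup this function would run on every frame of a webcam stream, so the detection and translation steps must be fast enough to keep up with the video frame rate.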

About speakers
Martin Förtsch
Principal Consultant, TNG Technology Consulting GmbH
Martin Förtsch studied computer science and works as a software consultant for the Munich-based IT consulting company TNG Technology Consulting GmbH. Professionally, his focus areas are agile development with Java, search-engine technologies, and databases. As an Intel Software Innovator, he is strongly involved in developing software for gesture control with 3D cameras such as Intel RealSense, and he has built an Augmented Reality wearable device based on this technology.
Thomas Endres
Associate Partner, TNG Technology Consulting GmbH
In his role as an Associate Partner for TNG Technology Consulting in Munich, Thomas Endres works as an IT consultant. Alongside his regular work for the company and its customers, he creates various prototypes, such as a telepresence robotics system that lets you see reality through the eyes of a robot, or an Augmented Reality AI that shows the world from the perspective of an artist. He works on applications in the fields of AR/VR, AI, and gesture control, putting them to use in, for example, autonomous or gesture-controlled drones. He is also involved in other open-source projects.