
Science & Technology

June 19, 2018

What do you know about the “deep fake” technique? Discuss the potential threats posed by it. (200 words)

Refer – Live mint

Enrich the answer from other sources, if the question demands.


IAS Parliament 6 years

KEY POINTS

Deep fakes

·         The term is a portmanteau of “deep learning” and “fake”.

·         They are videos made with the help of artificial intelligence that appear genuine but depict events or speech that never happened.

·         An aspiring video-faker can simply feed images of a given person to a machine-learning algorithm, which in turn learns to overlay their features onto the body of someone in another video, thereby creating a simulacrum of the original person doing things they never did.

·         Without precautions, they could prove highly disruptive—to people and politics alike.
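The overlay mechanism described above can be sketched, purely for illustration, as a toy linear autoencoder in numpy: a shared encoder learns a compact code (pose, expression) while each person gets a separate decoder (identity), so encoding a frame of person A and decoding it with person B's decoder produces the "swap". The data, dimensions, and training loop below are made-up stand-ins, not any real deep-fake implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "face" data: flattened 8x8 patches for two people (random stand-ins).
faces_a = rng.normal(size=(100, 64))
faces_b = rng.normal(size=(100, 64))

# Shared encoder, one decoder per person -- the core face-swap trick.
enc = rng.normal(scale=0.1, size=(64, 16))
dec_a = rng.normal(scale=0.1, size=(16, 64))
dec_b = rng.normal(scale=0.1, size=(16, 64))

def train(X, enc, dec, lr=1e-3, steps=200):
    """Gradient descent on reconstruction error ||X @ enc @ dec - X||^2."""
    for _ in range(steps):
        z = X @ enc                # encode frames to shared code
        err = z @ dec - X          # reconstruction error
        dec -= lr * (z.T @ err) / len(X)
        enc -= lr * (X.T @ (err @ dec.T)) / len(X)
    return enc, dec

# Each decoder trains only on its own person; the encoder sees both.
enc, dec_a = train(faces_a, enc, dec_a)
enc, dec_b = train(faces_b, enc, dec_b)

# "Face swap": encode a frame of person A, decode with person B's decoder.
fake = faces_a[:1] @ enc @ dec_b
```

Real deep-fake systems use deep convolutional networks and far more data, but the shared-encoder/per-identity-decoder structure is the same idea.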

Threats

·         These are fairly easy to produce.

·         Individuals – Deep fakes can be put to devious uses such as embarrassing an ex, extorting strangers or simply wreaking havoc.

·         Inserting individuals (more often women) into fake pornography undermines their agency and reduces them to sexual objects.

·         They inflict reputational harm that can devastate careers, even if the videos are ultimately exposed as fakes.

·         Blackmailers might use fake videos to extract money or confidential information from individuals.

·         Democracy – Hoax films of officials making divisive statements, acting corruptly or otherwise behaving badly could alter the results of democratic elections in India.

·         National Security – People tend to lend more credence to videos, so deep fakes can easily mislead them, instigate violence and disturb law and order, thereby affecting national security.

·         Evidence – Video evidence has become a pillar of the criminal-justice system. Deep fakes could feasibly render such evidence suspect.

Way Ahead

·         Technologists will need to put in more effort to detect and flag manipulated videos automatically.

·         Platform companies, which bear ultimate responsibility for the content they serve up, must come to grips with the legal and commercial implications of this threat, while working to educate the public.

·         In the short term, society needs to adapt to an unsettling new reality in which seeing is not necessarily believing, and the illusions get more perfect by the day.
