Firms may find it harder to screen applicants now that deepfakes are involved. The FBI has warned that employers have interviewed candidates who used face-swapping technology to impersonate someone else, and who presented stolen personal information as their own.
People using deepfakes, technology that uses artificial intelligence to make it appear as if a person is saying or doing things they never said or did, have been interviewing for remote and work-from-home positions in information technology, programming, databases, and other software roles, according to an FBI public service announcement. Employers noticed telltale signs of the digital trickery when lip movements and facial expressions did not line up with the audio, especially when the interviewee coughed or sneezed.
Deepfaking applicants also submitted personally identifiable information stolen from other people in an attempt to pass background checks.
This is the latest application of deepfakes, which entered mainstream awareness in 2019 amid fears of political manipulation. Since then, hobbyists have used deepfakes for milder stunts, such as swapping one actor's face for another's in film clips, while bad actors have appropriated other people's faces and voices to place victims in embarrassing situations, such as fake pornography.
But the threat of deepfakes used for political ends remains, as when Facebook removed a deepfake video of Ukrainian President Volodymyr Zelenskyy in March. The EU has just strengthened its rules on disinformation to address deepfakes, but their use in mundane situations like job interviews shows how easy this scam technology has become to obtain and misuse.