Fake IT workers deployed by North Korea are using AI technology, including voice-changing tools, to trick western companies into hiring them, Microsoft has said.
The US tech firm said a signature Pyongyang money-raising ruse is being enhanced by AI, which is helping create fake names and alter stolen IDs to increase the credibility of false applicants for IT and software development jobs.
The scam typically involves state-backed fraudsters applying for remote IT work in the west, using fake identities and the help of “facilitators” based in the same country as the targeted company. Once hired, they send their wages back to Kim Jong-un’s state and have even been known to threaten to release sensitive company data after being fired.
According to a blogpost from Microsoft’s threat intelligence unit, Pyongyang is using AI to bolster the effectiveness of its ploy.
Microsoft listed a number of AI-related scams in use by North Korean groups it calls Jasper Sleet and Coral Sleet, following the cybersecurity convention of assigning monikers to unattributed clusters of attackers.
The tech company said the scammers had used voice-changing software during remote interviews to mask their accents, allowing them to pass as western candidates. They also use the AI app Face Swap to insert the faces of North Korean IT workers into stolen identity documents and to generate “polished” headshots for CVs.
“Jasper Sleet leverages AI across the attack lifecycle to get hired, stay hired, and misuse access at scale,” Microsoft said.
Last year, Microsoft said it had disrupted 3,000 Microsoft Outlook or Hotmail accounts used by fake North Korean IT workers.
Microsoft said the fake workers had used AI platforms to generate “culturally appropriate” name lists and matching email address formats to construct false identities for job applications. The company said an example prompt might be “create a list of 100 Greek names” or “create a list of email address formats using the name Jane Doe”.
They also use AI to scour job postings for software and IT-related roles on jobs platforms such as Upwork, then use the skill requirements listed on those ads to craft more effective applications. Upwork has said it takes “aggressive action to … remove bad actors from our platform”.
Once hired, the fake workers use AI to write emails, translate documents and generate code as they attempt to avoid being unmasked as frauds or sacked for poor performance, Microsoft said.
Microsoft also urged companies to carry out job interviews for IT workers on video or in person to head off the threat. It added that interviewers can spot a deepfake video or image via a series of “tells”, such as pixellation at the edges of faces, eyes, ears and glasses, and inconsistencies in how light interacts with an AI-generated face.