Pornographic images found on children’s games in Google Play store

Some 63 apps affected by malicious code called ‘AdultSwine’, cybersecurity firm finds

Pornographic images were found in scores of children's game apps in the Google Play store, served up by malicious code called "AdultSwine".

The pictures, which masqueraded as advertisements, were discovered by cybersecurity firm Check Point. The finding comes as Google faces pressure to do more to protect children, with parents and advertisers accusing its video site YouTube of failing to keep inappropriate and violent content from being seen.

Researchers at Check Point found that 63 apps were affected.

A Google spokesperson said: “We’ve removed the apps from Play, disabled the developers’ accounts, and will continue to show strong warnings to anyone that has installed them. We appreciate Check Point’s work to help keep users safe.”


Check Point said AdultSwine caused pornographic images to pop up during children's games, including a car-racing game named after the Disney cartoon character Lightning McQueen which, according to Google data, had been downloaded at least 500,000 times.

In addition to displaying pornographic content, AdultSwine can show pop-ups falsely warning that a virus has infected the device and prompting the user to download another app that claims to remove it. That fake security app itself contains malicious software, or malware.

AdultSwine can also cost victims money by tricking them into allowing it to send premium-rate text messages that are charged to the phone's account.

Protection mechanism

A security system called Google Play Protect is supposed to defend users of Google's Android operating system from malicious code by scanning apps for malware. Google says it also vets every app developer on Google Play.

However, with the number of apps on Google Play estimated by Statista at more than 3m, lurking malicious code is sometimes spotted and reported only by users or cybersecurity firms.

Check Point researchers said at least one review left by a parent on Google Play in November had complained about pornographic pictures that appeared while his four-year-old was playing a game.

Google has previously responded to fears that overreliance on users reporting inappropriate content was leaving children vulnerable. In December, YouTube said that Google was planning to increase the number of staff reviewing videos to more than 10,000.

Last year, an outcry over disturbing videos aimed at children forced YouTube to terminate ToyFreaks, a channel with 8m followers on which a father posted clips of his young daughters screaming in terror.

Copyright The Financial Times Limited 2018