Twitter responsible for half of child abuse material UK inv…

The IWF only released figures for the number of incidents that had been verified as child abuse by human analysts, rather than the total number of reports flagged by its scanning software or from the public, meaning the actual numbers are likely to be higher.

Microsoft’s Bing search engine had the second highest number of incidents with 604 recorded between 2016 and 2018, followed by Amazon with 375, and Google with 348.

The IWF found 72 incidents of abuse being openly hosted on Facebook, 18 on its sister site Instagram and 22 on YouTube.

Twitter said it had concerns over how the IWF had compiled its figures.

A spokesperson said: “We have serious concerns about the accuracy of these figures and the metrics used to produce them.

“The IWF has used one data standard across all services, social platforms, file hosting platforms, and search engines, which isn’t a reliable metric, nor does it reflect the scale of proactive work we do in this area.

“We will continue to work with the IWF to address their concerns and improve the accuracy of their data, so that it reflects the full picture of our proactive work to remove egregious child sexual exploitation content from our service.

“This work is complex and the offenders are often sophisticated bad actors, which is why it is essential to ensure any data released is robust, accurate, and reflective of the critical work being done in this space.”

Microsoft also queried the data and said that it had made considerable efforts to remove such material from Bing since 2018.

A spokesman for Microsoft added: “The data, taken from 2018 and without consideration of improvements made as a result of reports and routine diligence over the course of the current year, are from unverified or raw end-user reports, and are therefore not an accurate measure of the actual prevalence of child sexual exploitation and abuse images on the platform.”

Susie Hargreaves OBE, the CEO of the IWF, said: “Our data is accurate and recorded fairly and consistently regardless of where we find child sexual abuse material. 

“We’re also very happy to make it available to an independent hotline inspection team, comprising a law enforcement auditor and High Court judge, for scrutiny.

“Every time we find an image or a video of a child being sexually abused we perform an assessment against UK law using highly trained, human, analysts. We then trace the content to the host country, and company, before working with partners around the world to get it removed. Our data is trusted by police, governments and internet companies internationally.”

The vast majority of the abuse material the IWF finds each year is on the open web, in the form of websites set up purposely to host child abuse.

In 2018, the organisation found and took down 105,047 URLs hosting child abuse images. 

The IWF, which was founded in 1996, is largely funded by annual membership subscriptions from tech companies, with Facebook, Amazon, Microsoft and Google among its highest contributors, paying more than £80,000 to the organisation.

Twitter is also a member of the IWF but pays a lower membership fee of between £27,000 and £54,000, according to the IWF’s website.

The figures are the first time the IWF has disclosed how many images and links its investigators have found on mainstream tech sites.

Most social media and search sites automatically scan images to check they are not known abuse images before they are uploaded, meaning the vast majority of images are caught before being published.

The NSPCC said the fact that thousands of abuse images were still getting through and being openly hosted on popular sites highlighted the urgent need for a Government regulator to scrutinise companies’ child protection measures. 

The Government has said it plans to introduce a statutory duty of care on tech giants to better protect their users, a measure campaigned for by the Telegraph.
