Online platforms that fail to tackle child sexual abuse material could lose bank accounts
British banks could lead the way in using their influence over customers to tackle child sexual abuse material (CSAM) found online by withdrawing financial services from offending platforms.
In a move which mirrors the existing policy of Mastercard towards the adult content industry, it has been proposed that banks around the world should refuse to offer services to any website which facilitates the publication and sharing of CSAM.
The idea was first presented to the United Nations Office on Drugs and Crime earlier this summer. To implement this in the UK, the scope of an existing regime in place to prevent the financing of terrorism and organised crime, operated by the Joint Money Laundering Intelligence Taskforce (JMLIT) of banks, security services and regulators, could easily be extended.
Mastercard’s move was directed at adult websites which were not rigorously checking the age of performers in either pre-recorded or live content, or ensuring that all those depicted had given consent. The policy, known as “AN 5196”, was a reaction to criticism of the payment network for facilitating both underage pornography and so-called “revenge porn”, where material is published without the consent of the person depicted. It was immediately evident that a business-critical global network of this sort could drive far more rapid change than almost any individual government.
If UK banks, or indeed those in any major economy, were to follow suit and apply the same form of pressure by withdrawing services from any platform which deliberately or inadvertently allowed illegal content, the effect would likely be even more dramatic, as there would be no option to switch to an alternative payment provider.
If adopted by UK financial services, this would create an urgent challenge for all sites which allow user-generated content: reviewing it before publication to ensure that no one underage is ever featured in their output. This has typically been an expensive and labour-intensive process, with human moderators struggling to keep pace with the volume of uploads even when AI-based tools are used in parallel. Reviewing only content reported by users has led to massive backlogs, with illegal material flagged by automated systems remaining online pending human review.
VerifyMyContent was developed when Mastercard first introduced its new rules and is designed to balance automated and manual review, making the most of limited human resources. This means a small team can review a significant volume of material without ignoring flagged content or resorting to random sampling, which can let illegal content slip through the net.
As a safety tech provider, VerifyMy’s mission is to provide frictionless, trustworthy solutions for online platforms to maintain their integrity, protect their reputation and safeguard their customers.
Find out how VerifyMy can help your business with its online safety and compliance challenges.