Here’s how blockchain could help crack down on abusive imagery

  • Blockchain could be an effective and efficient solution for helping to rid the internet of abusive imagery.
  • Tackling abusive imagery can help victims heal.

Last month, India was shaken to the core by an alleged gang rape of a fifteen-year-old girl, which the perpetrators reportedly filmed and shared online. Sadly, this type of crime and its documentation occur in every country. For survivors of sexual violence, knowing that images of their ordeal exist and circulate online can cause even more emotional damage. The images may also be used to blackmail and silence the victim.

Cloud storage platforms and social media networks, where photos and videos are stored every time they are posted or shared, typically do not tackle this issue, citing user privacy. Developments in technology, however, could enable cloud platforms to remove illegal imagery while limiting user privacy concerns.

The majority of people who share and view abusive images do so on completely legal, popular social media platforms and through ordinary messaging services. Research from the National Center for Missing and Exploited Children measured the growth of child sexual abuse imagery on the internet. The results are horrifying: reports of such images rose from 3,000 in 1998 to 1.0 million in 2014, and to 18.4 million in 2018. Social media networks offer both an audience and ample storage space, meaning that even large files, such as videos, can be easily stored and shared. Hence the key to cutting the supply of such content is to remove it from the cloud.

A technological solution
Technology exists, however, to help cloud storage platforms remove all illegal and abusive images from their databases and prevent the addition of new ones. This solution combines blockchain technology with a technology called PhotoDNA, developed by Microsoft and Dartmouth College. PhotoDNA creates a unique digital fingerprint of an image or video. This fingerprint remains largely the same even if the image is cropped, resized, changed with filters, or manipulated in some other way. Importantly, while it is easy to generate a fingerprint from an image, it is impossible to reverse-engineer the image from its fingerprint. There is therefore no risk of fingerprints being misused to covertly disseminate images. In the case of sexual exploitation images, this is of particular importance.
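The fingerprinting idea can be illustrated with a toy perceptual hash. To be clear, this is not PhotoDNA, whose algorithm is not public; it is a minimal average-hash sketch showing the two properties the argument relies on: the fingerprint stays stable under small edits, and the image cannot be recovered from it.

```python
# Illustrative perceptual "fingerprint" - NOT Microsoft's PhotoDNA,
# whose algorithm is proprietary. A toy average hash is enough to
# show robustness to small edits and one-way-ness.

def average_hash(pixels, size=8):
    """Downscale a grayscale image (list of rows of 0-255 ints) to
    size x size by block averaging, then emit 1 bit per cell:
    1 if the cell is brighter than the image mean, else 0."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return ''.join('1' if v > mean else '0' for v in cells)

def hamming(a, b):
    """Number of differing bits between two equal-length fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 16x16 gradient "image" and a slightly brightened copy.
img = [[(x + y) * 8 for x in range(16)] for y in range(16)]
edited = [[min(255, p + 10) for p in row] for row in img]

fp1, fp2 = average_hash(img), average_hash(edited)
# The edit barely moves any bit relative to the image mean, so the
# two 64-bit fingerprints stay close in Hamming distance, while the
# original pixels cannot be reconstructed from either fingerprint.
```

A real deployment would compare fingerprints with a small Hamming-distance threshold rather than exact equality, so that cropped or filtered copies of a known image still match.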

Blockchain is a decentralized and distributed database that allows multiple parties, who do not necessarily trust each other, to create a trusted source of truth they can share, update and work with. Unlike a centralized database, with blockchain no single party is in charge of or owns the database. Everybody owns it equally.

Some law enforcement agencies already have databases of child sexual exploitation imagery and their digital fingerprints, but they usually do not share them with anyone. What they could use, however, is a way to coordinate their activities, locally or globally, in order to remove these images regardless of jurisdiction. It is here that blockchain technology is a game changer. It can allow for a globally coordinated database, accessible by all yet owned by none, to scan for and remove images regardless of where in the world they are posted. Using both technologies, the agencies could share the fingerprints they have already collected as part of a joint global effort. Each agency could then access this information, even if it has no fingerprints of its own to share.
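A minimal sketch of such a shared fingerprint registry follows. It is an assumption-laden toy: a simple hash chain stands in for real blockchain infrastructure and consensus, and the agency names are hypothetical. What it demonstrates is the tamper-evidence property: every entry records who added which fingerprint, and altering any past entry is detectable.

```python
import hashlib
import json

# Toy append-only fingerprint registry. A real deployment would run
# on actual blockchain infrastructure with consensus among agencies;
# here a hash chain stands in for tamper-evidence. Names are
# hypothetical.

class FingerprintLedger:
    def __init__(self):
        self.blocks = []  # each block links to its predecessor's hash

    def add(self, agency, fingerprint, jurisdiction):
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        record = {"agency": agency, "fingerprint": fingerprint,
                  "jurisdiction": jurisdiction, "prev": prev}
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.blocks.append(record)

    def verify(self):
        """True only if no block has been altered or reordered."""
        prev = "0" * 64
        for b in self.blocks:
            body = {k: v for k, v in b.items() if k != "hash"}
            if b["prev"] != prev or b["hash"] != hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest():
                return False
            prev = b["hash"]
        return True

ledger = FingerprintLedger()
ledger.add("agency-A", "fingerprint-of-image-1", "EU")
ledger.add("agency-B", "fingerprint-of-image-2", "US")
# Every participant can see who added which fingerprint for which
# jurisdiction, and tampering with an earlier entry breaks the chain.
```

Because each block commits to the hash of the one before it, an agency cannot quietly rewrite history; it can only append, which is exactly the behaviour the shared database needs.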

Being blockchain-based, the database is highly secure by nature. Specifically in our use case, there are no incentives for hackers to attack it, as the database will not hold any images, only fingerprints of images. As I explained above, the digital fingerprint by itself is meaningless since it cannot be used to recreate the image, nor does it link back to the original image. In this context, its only value is to offer a way of labelling or categorizing images without actually looking at or sharing them.

Cloud storage platforms would get read-only access to this blockchain database of digital fingerprints of illegal images. Generating an image's digital fingerprint is fast and easy, as is looking up a fingerprint in the database. Cloud platforms could therefore generate a digital fingerprint of any image as it is uploaded and check whether it exists in the database. The same process would allow them to scan already stored data for illegal content that was uploaded in the past. Once an image has been identified as illegal, local law enforcement agencies would provide instructions on the next steps. The following chart sums up the screening procedure:

Image: Detecting illegal images from a user's point of view. (Source: Orbs)
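The screening step a platform would run on each upload can be sketched as below. The fingerprint function here is plain SHA-256 standing in for a perceptual hash, and the lookup is exact set membership rather than the near-match search a real PhotoDNA-style system would use; all names are illustrative.

```python
import hashlib

# Sketch of upload screening against a read-only snapshot of the
# shared fingerprint database. SHA-256 is a stand-in: a real system
# would use a perceptual hash (e.g. PhotoDNA) with near-match lookup.

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes, flagged: set) -> str:
    """Return 'reject' if the upload's fingerprint appears in the
    flagged set, else 'accept'. On a rejection, the platform would
    then follow the instructions of the relevant local agency."""
    return "reject" if fingerprint(data) in flagged else "accept"

# The shared database would supply the flagged set; here we flag one
# known file and screen two hypothetical uploads.
flagged = {fingerprint(b"known-illegal-file")}
assert screen_upload(b"known-illegal-file", flagged) == "reject"
assert screen_upload(b"holiday-photo", flagged) == "accept"
```

The same `screen_upload` check, run over a platform's existing stores, is what would surface illegal content that was uploaded before the database existed.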

Importantly, the entire process will be completely transparent, since it will be recorded on the blockchain, enabling visibility of each new piece of information being added to the chain. Enforcement agencies will be able to see who added which fingerprints. Cloud platforms will know exactly which fingerprints they need to screen in which jurisdictions. Enforcement agencies will easily be able to check whether their restrictions are properly enforced by cloud platforms. This alone would be a major reason for using blockchain.

To be sure, no solution is perfect and this solution alone cannot eradicate exploitative imagery. Users would most likely be averse to having their content screened – or even tagged or removed. And once illegal content is discovered by a cloud platform, the onus could be on that platform to take the next step and coordinate actions with law enforcement, opening up a larger debate on freedom of speech and censorship.

Additionally, technology cannot stop highly determined perpetrators. These individuals will continue posting abusive images of children on the illegal corners of the internet, also known as the darknet. There will never be one single solution that will eliminate sexual images of children from the internet.

Still, while we cannot eliminate the problem, we can make a significant dent in it. Perpetrators who are very technologically savvy will have to be tracked down and stopped in other ways. But for the majority of online users, who lack the technological expertise to access the darknet, we can and should remove these images easily and quickly. Importantly, since the blockchain is immutable, every transaction on it is documented forever. This means that if someone tried to upload illegal images, was rejected, and then deleted those images from their own systems, the blockchain would still hold a record of the attempted upload. This, I imagine, could be a very useful tool for law enforcement.
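That audit point can be made concrete with a small sketch. A plain Python list stands in for the chain's append-only log, and the uploader IDs, field names, and fingerprints are all hypothetical; the point is only that a rejected attempt, once recorded, remains queryable even after the uploader deletes the file locally.

```python
# Hypothetical audit trail of screening outcomes. On a real
# blockchain this log would be immutable; a list stands in here.

attempt_log = []

def record_attempt(uploader, fingerprint, outcome):
    """Append one screening event; on-chain, this entry could never
    be deleted or rewritten afterwards."""
    attempt_log.append({"uploader": uploader,
                        "fingerprint": fingerprint,
                        "outcome": outcome})

def rejected_attempts(uploader):
    """What law enforcement could later retrieve in an investigation."""
    return [e for e in attempt_log
            if e["uploader"] == uploader and e["outcome"] == "rejected"]

record_attempt("user-123", "a3f1c0de", "rejected")
record_attempt("user-456", "77e9beef", "accepted")

# Even if user-123 deletes the file from their own systems, the
# record of the rejected attempt persists on the chain:
evidence = rejected_attempts("user-123")
```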

As a Co-Founder of Orbs, the largest blockchain infrastructure company in Israel, it was important for me to use our technology for good and promote blockchain for social impact. Consequently, I founded the Hexa Foundation, a nonprofit that promotes the use of blockchain technology for social impact. At the Foundation we focus on educating governments on the potential added value and efficiencies of blockchain, and on executing blockchain projects that create social impact. We are currently in talks with governments and enforcement agencies to explore the opportunity of using blockchain as a solution that will help cloud platforms remove these images.

Change will not be simple. As with many solutions that require coordination, the hardest part here is aligning the interests and politics of enforcement agencies globally. There is reason for optimism: since no one claims ownership of this database, and it is shared amongst all players, the incentive to participate will hopefully be greater than in other consortium efforts, where politics can sometimes hamper advancement. Still, to kickstart progress, at least one major agency would need to lead the way. Some of the agencies we are in touch with are showing great interest and will hopefully take the lead. Wide adoption, however, will take time. It will also take a mindset shift and new training to ensure that a wider set of policing agencies understand the role new technologies like blockchain could play in crime prevention.

It is blockchain that allows for coordination between these bodies in a way that was not previously possible, thus creating an effective and efficient solution – and one that I hope can help the healing process of victims, globally.

Abusive images of children are in themselves a form of abuse. Every person who shares or views such images becomes complicit in the original abuse and further damages the victim. There have been many reports of survivors haunted by the continued circulation of images recorded years ago. This can make it extremely difficult, if not impossible, for them to perceive their suffering and trauma as being in the past; instead, their ordeal simply continues. By removing the part of the problem that we can remove, we can make an impact for the better today.