1 step forward, 2 steps back? Why automated content filtering (still) doesn’t work for startups

May 14, 2020

Just because startups provide leading innovation in the fight against COVID-19 does not mean they can magically make automated content filters work.

Having learnt the hard lesson that content filters don’t work, we ask policy makers not to take two steps back by expecting startup entrepreneurs to figure it out anyway. Instead, we encourage a big step forward: making the no general monitoring obligation the bedrock of the Digital Services Act.

“Cometh the hour, cometh the startup”

COVID-19 has created an extraordinary situation requiring extraordinary solutions. It makes us proud to see startups rise to the challenge. Velmio, an Estonian pregnancy health app, did a 180° turn and developed a corona-tracker app to monitor the outbreak. Doctolib, a French consultation-management software for doctors, is waiving subscription fees during this moment of crisis.

There will be positive and negative takeaways from the COVID-19 epidemic, but that should not mean we throw overboard the hard-learned lessons of pre-COVID-19 times.

COVID-19 hasn’t made content filters work!

Although automated content filtering is being reassessed during the COVID-19 epidemic, startup communities have shown why such filters are an undesirable measure for tackling problematic third-party content posted online:

  • From an innovation standpoint, startup ecosystems thrive in a colourful and diverse platform economy. Automated content filtering risks preemptively closing the door on many outside-the-box and crazy ideas that could lead to the next big startup. 
  • From a technical perspective, content filters still do not work anywhere near as well as they should. In some cases, like 3D printing or VR/AR, automated content filtering solutions simply do not exist. 
  • From a financial perspective, employing thousands of content moderators is something that bigger players may be able to afford. For startups, this is a potential deal-breaker.

We’ve been here before in the debate on the Copyright Directive. Heini Zachariassen, founder of Vivino, told us: “Content filters for user uploads is a bad idea. We have 600 million uploads. We need the current liability regime of the E-Commerce Directive (ECD).” The current review of the ECD is a prime opportunity to take a clear stand and re-confirm the no general monitoring obligation as a bedrock for startups.

Back to basics: understanding the problem

At the heart of the debate on third-party content lies an old conundrum: how do you compel platforms to deal with third-party content that is illegal, harmful, or objectionable for other legitimate reasons?

The E-Commerce Directive, a tried-and-tested piece of legislation that is critical for the digital economy, laid down the rules of the game. While the law may be ageing, the drafters of its successor, the Digital Services Act, should consider why it has worked so well over the past two decades.

The framework gives legal certainty to entrepreneurs. They have a set of obligations to fulfil (like implementing a notice-and-action mechanism) and in return receive a limited liability exemption. Automated content filtering would do the opposite: it would leave entrepreneurs trying to implement a technology that is inherently insufficient, exposing them to disproportionate legal risk.

Technology is not an end in itself

There are avenues to explore that do not lead down the slippery slope of automated content filtering. Updating the intermediary liability framework so that it incentivises proactive behaviour from platforms is one key lever to be pulled. Regulatory sandboxes could be another.

We want to use COVID-19 as a catalyst to consider the possibilities of digital solutions. Rather than writing laws that mandate a specific technical solution, lawmakers should let technology inform the law. Good laws are not technically prescriptive but rather set the right incentives.

Automated content filtering should not be mandated as the solution for the more than 12,000 platform startups in Europe alone. When it comes to automated content filtering, let us hope we do not forget the hard-learned lessons of the past mandate and take two steps back as a consequence of the COVID-19 outbreak. Put the no general monitoring obligation at the centre of the Digital Services Act and take a big leap into the future with startups.