Philip F. Howard: Social media companies need more regulation | DW Global Media Forum | DW | 12.05.2021



To prevent growing political polarization on social media, "minimal and clear regulations" are desperately needed. This is what Oxford Professor of Internet Studies Philip F. Howard told DW in an interview.

Do digital tools like Facebook, Twitter or TikTok have a positive impact on democracy - for example in places like Russia?

New social media tools have different kinds of impact in different political moments. If they arrive at a time when democracy advocates are looking for ways of working around censorship and surveillance, they often get used in ways that an authoritarian regime can't anticipate and doesn't know how to react to.

Once those regimes do build up the capacity to use the features of tools like Facebook, Twitter or TikTok, those tools are put into service to keep political elites in power.


There are still no binding agreements globally to hold social media platforms responsible for spreading false information

Autocrats often use social media campaigns to stabilize their power and manipulate their citizens. How important is the internet for journalists in this context? 

The internet is important in several ways for journalists. Often, digital tools are the means by which journalists outside a crisis country tell the world what is going on. Digital interviews and photos from activists and victims inside the country get out and help journalists tell the story of the crisis.

For the journalists working within an authoritarian regime, the internet provides a way of publishing their news stories and commentary essays.


Social media can also have a significant influence on socio-political developments, as the #MeToo movement - among many others - has recently shown. The platforms' algorithms can, however, also ignite and accelerate hate and intolerance. So, do legislators need to come up with new media laws or regulations to prevent this - or would the introduction of such regulations endanger freedom of expression?

It is challenging to navigate between preventing political polarization on social media and encouraging people to express themselves. But it is possible and certainly worthwhile. 

In most countries, lies and fake news do not get legal speech protections. Social media firms are essentially publishers responsible for the content on their platforms, since they accept revenue for publishing it. So the right combination of light-touch regulation and diligence from the firms will prevent polarization and still allow consumers the fun and social aspects of media platforms. We are certainly past the point of industry self-regulation, so the goal now is minimal and clear regulations that show the technology firms their responsibilities and provide enforcement.


Philip Howard is a professor and writer. He teaches at the University of Oxford and is the former Director of the Oxford Internet Institute. He writes about information politics and international affairs and is the author of ten books, including "Lie Machines." Foreign Policy magazine named the multi-award-winning author a "Global Thinker," and the National Democratic Institute awarded him its "Democracy Prize" for pioneering the social science of fake news.

On June 15, Howard will speak at the DW Global Media Forum 2021. Register now for your free digital pass to participate in the conference and watch the panel discussion on "Social media inventory: Connecting people, dividing societies?".
