
by Larry Magid
This post first appeared in the Mercury News

Because they’ve been written about extensively, there’s no need for me to repeat how the internet – Facebook, Twitter, and lesser-known sites like Gab and the now-suspended Parler – has contributed to the division, vitriol and violence that have shaken our country. What I want to know is how we got here and how we can get back to a conversation that leads to solutions rather than strife.

Mea Culpa

When I started my career as a technology journalist back in 1982, I mostly reviewed products, but I also wrote numerous articles about online services like CompuServe, The Source, GEnie, and — a few years later — newcomers like Prodigy and AOL. I even wrote two books about all the great things you could do when you equipped your PC with a modem – the first, The Electronic Link: Using the IBM PC to Communicate, was published in 1984. A decade later Random House published Cruising Online: Larry Magid’s Guide to the New Digital Highways.

These books and scores of articles I wrote were extremely upbeat because I was awestruck at the enormous power and possibilities that these services offered. I was especially excited about forums and other online tools that let people from different parts of the world get together. I remember celebrating how great it was that people with common interests could find and support each other.

Back then I put very little thought into what could go wrong. I wrote an occasional piece about privacy and security, but it wasn’t until 1994 that I wrote my first publication about the potential dangers to children, published by the National Center for Missing and Exploited Children, whose board I later joined and served on for 20 years.

Eventually, those online services gave way to the internet, which emerged as an option for consumers just in time to be a last-minute addition to my 1994 book. That was an incredibly exciting time for me because it opened up the possibility of a free flow of information, where anyone with a connection – no longer constrained by the likes of AOL, CompuServe and Prodigy – could interact with any other connected person anywhere in the world. Although I disagreed with parts of it, I was nevertheless supportive of the late John Perry Barlow’s “A Declaration of the Independence of Cyberspace,” which began, “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.”

Social Contract

Barlow went on to write “You claim there are problems among us that you need to solve. You use this claim as an excuse to invade our precincts. Many of these problems don’t exist. Where there are real conflicts, where there are wrongs, we will identify them and address them by our means. We are forming our own Social Contract. This governance will arise according to the conditions of our world, not yours. Our world is different.”

It was true that at least some of the problems articulated by government authorities didn’t exist or were overblown. That’s still somewhat true, but what Barlow got wrong was the notion that “we” internet users would be able to address and solve problems via our own “social contract.”

The closest thing to a “social contract” is the set of so-called community standards or terms of service that companies impose as rules of the road for their users. Even though users don’t generally get to vote on these terms, they are a contract of sorts: by clicking, users agree to abide by the rules in exchange for being able to use the service. Such an agreement is generally referred to as a “contract of adhesion,” which Cornell Law School defines as “a contract drafted by one party (usually a business with stronger bargaining power) and signed by another party (usually one with weaker bargaining power, usually a consumer in need of goods or services).”

These standards vary by company but generally prohibit things like hate speech, violent and graphic content, cruel and insensitive postings, abuse and harassment, and promoting violence or terrorism, among other things. But these are not really social contracts — they’re rules that most people probably never actually read and — even if they overlap with common decency — sometimes fail to heed in highly charged discussions where passion and adherence to dogma can overrule logic.

Guidelines don’t punish opinions

One thing I’ve never seen in a set of community guidelines is a requirement that users adhere to a specific ideology. One can be far to the left or right of the political spectrum and still treat others with respect and refrain from harassment, violence and other banned activities. And, in fact, most people – regardless of political ideology – are not only able to follow these rules but are eager to treat others the way they want to be treated. My Facebook and Twitter feeds include political conversations where people express a wide range of opinions without mistreating each other.

But what could be a productive conversation designed to help us learn from each other has devolved into a war of words – and worse, when it turns to actual violence – between camps or tribes that, at best, speak past each other rather than to each other.

Even worse than uncivil sparring is that many people – literally – aren’t on the same page with anyone outside their ideological tribe. Part of the problem is our natural tendency to hang out with like-minded people, but the blame also lies with services that use algorithms to feed us content and recommend people and groups to follow that they think we’ll like. The Netflix documentary The Social Dilemma dramatizes how these algorithms work to make sure we see things we’re inclined to like – mostly as a way of getting us to buy products based on our online behavior. But the same computer code that shows us shoes we’re inclined to buy shows us political content that’s likely to influence us. It’s not always obvious, but it puts us on an insidious slippery slope that can shape our worldview.

What can be done?

I don’t have the space or time before my deadline – nor maybe even enough insight – to outline everything we can do to fix the problem, but one good start is a re-examination of the way services like Facebook, Twitter and YouTube recommend or feed us content. I understand why they do it, but in some ways I long for the good old days when people got their news from newspapers and broadcasters who didn’t have the ability to fine-tune their content based on our personal inclinations. If you and I read the same hometown printed newspaper or watch the same local TV news show, we’re going to learn the same facts, regardless of our politics, age, race, gender and other factors. In many ways, that’s a good thing, and I wish that our online experiences gave us more real facts and less filtered news and fewer one-sided opinions.

I also wish that more of us used the critical thinking skills we should have learned as children: don’t believe everything you see, always check your sources, and look for bias and hidden agendas.

Disclosure: Larry Magid is CEO of a non-profit internet safety organization that receives financial support from Facebook, Twitter, Google and other tech companies.
