By Larry Magid

This post first appeared on Mercury News.

I never thought I would have to say this, but political extremism is expected to have an impact on the upcoming election and, perhaps even scarier, on the potential aftermath of the election, should some people decide to take matters into their own hands if it doesn’t come out as they hope.

The president’s response to moderator Chris Wallace’s question about white supremacy at Tuesday night’s debate didn’t help, when he advised the extremist group Proud Boys to “stand back and stand by.”

Social media is likely partly to blame, but there are, of course, other reasons for any increase in white supremacy and extremism, which FBI Director Christopher Wray called the biggest domestic terrorism threat in recent testimony before the House Homeland Security Committee.

The Netflix movie, The Social Dilemma, helps explain why this is true. Algorithms designed to present us with content we find compelling, along with advertising that’s likely to interest us, fill our newsfeeds with posts that appeal to our political leanings, world views and susceptibility to various theories and points of view, including conspiracy theories. And, as the movie points out, it’s not that Mark Zuckerberg and other tech leaders set out to push us to political extremes. It’s an unintended consequence of brilliantly written code designed to give us what the algorithms think we want to see. The movie includes interviews with former employees of Facebook, Twitter and other tech companies, including some who are now very critical of their former employers. But, as others in the movie point out, these same services also do a lot of good, including helping disaster victims, raising funds for worthy causes and helping people organize social justice campaigns.
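
To see how that feedback loop can arise in principle, here is a minimal, hypothetical sketch. It is not any platform’s actual code, and every name in it (Post, predicted_engagement, the sample topics) is made up for illustration. It simply shows that if a feed is ranked by predicted engagement, posts matching what a user has already engaged with naturally rise to the top.

```python
# Hypothetical illustration of engagement-based feed ranking.
# This is NOT any platform's real algorithm -- just a sketch of the idea
# that optimizing for predicted engagement favors content a user already
# leans toward, which then generates more of the same signals.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    topics: set          # topics the post touches on
    base_appeal: float   # how engaging the post is on its own (0..1)


def predicted_engagement(post: Post, user_interests: dict) -> float:
    """Score a post for one user: baseline appeal plus a boost for every
    topic that overlaps with what the user has already engaged with."""
    overlap_boost = sum(user_interests.get(topic, 0.0) for topic in post.topics)
    return post.base_appeal + overlap_boost


def rank_feed(posts, user_interests):
    """Order the feed by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: predicted_engagement(p, user_interests),
                  reverse=True)


# A user whose past clicks skew heavily toward one political slant...
user_interests = {"partisan_politics": 0.9, "gardening": 0.2}

posts = [
    Post("Calm local news report", {"local_news"}, 0.5),
    Post("Outrage-bait partisan post", {"partisan_politics"}, 0.6),
    Post("Gardening tips", {"gardening"}, 0.4),
]

for post in rank_feed(posts, user_interests):
    print(post.text)
# ...sees the partisan post first, which invites more clicks on that topic
# and reinforces the same interests -- the loop the movie describes.
```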

The tendency for social media to nudge people into dark places is not entirely the fault of the technology. There is also the bubble you put yourself in based on who you friend and interact with and the content you choose to view.

I’ve seen almost nothing on my social feeds that even comes close to white supremacy or other hate speech. I go out of my way to interact with people with differing points of view, but almost all are civil and polite, and I avoid people who are obvious bigots, so that, too, helps shape what I see. I also don’t belong to very many groups, which can influence what you see even when the content seemingly has nothing to do with the groups you’re in.

You could, for example, join a group around a totally non-political interest and still end up connected to political content, because a significant number of people in that group are also in political groups that have nothing to do with the interest you share. You could also like something or click on links just because you’re curious, and all of that is recorded and analyzed by the code that helps determine what you will see.
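
To make that concrete, here is a small, hypothetical sketch of how incidental signals like group memberships could be turned into inferred interests. It is my own illustration, not how Facebook or Twitter actually works; the users, group names and threshold are invented.

```python
# Hypothetical sketch: inferring interests from group overlap.
# Not a real platform's code -- just an illustration of how joining a
# non-political group can still feed political inferences about you.

from collections import Counter

# Which groups each (made-up) user belongs to.
memberships = {
    "alice": {"birdwatching"},
    "bob":   {"birdwatching", "political_group_x"},
    "carol": {"birdwatching", "political_group_x"},
    "dave":  {"birdwatching", "political_group_x"},
    "erin":  {"cooking"},
}


def inferred_interests(user: str, min_overlap: float = 0.5) -> set:
    """Guess extra interests for a user: any group that most of their
    fellow group members also belong to gets attributed to them too."""
    own_groups = memberships[user]
    peers = [u for u, groups in memberships.items()
             if u != user and groups & own_groups]
    if not peers:
        return set()
    counts = Counter(g for u in peers for g in memberships[u]
                     if g not in own_groups)
    return {g for g, c in counts.items() if c / len(peers) >= min_overlap}


# Alice only joined a birdwatching group, but because most of her fellow
# members are also in "political_group_x", she gets tagged with it anyway.
print(inferred_interests("alice"))   # {'political_group_x'}
```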

Facebook has a tool that gives you some sense of what it thinks you’re interested in. It’s not complete, but it does show you some of the things it knows about you that it uses to display targeted ads. You can access it at tinyurl.com/fbinterests, or if you go to Settings and then Ads, you’ll see a link for “Your Interests,” which is broken down into categories including business and industry, news and entertainment, travel, and education. Click on the More link and one option is “Lifestyle and Culture.” When I clicked on that, I found out that I was interested in both the Democratic and Republican parties, happiness, homelessness, Black Friday shopping, democracy, and instant messaging. If you hover over each interest, it will vaguely tell you why it thinks it’s relevant, typically because you may have liked a related page or post or clicked on an ad connected to it.

Twitter has a page called Interests from Twitter (tinyurl.com/twitterinterests), which gives you some, but not much, insight into what it thinks it knows about you. I’m apparently interested in both Alexandria Ocasio-Cortez and Donald Trump Jr., along with education, drinks, gaming, British football, animals and auto racing, even though some of those topics barely interest me. But the fact that Twitter thinks I’m interested in both AOC and Donald Trump Jr., along with his father and Barack Obama, helps explain why I see such diverse political content.

Facebook and Twitter have been working on this problem and have made progress in the last few months. I spoke with one expert on background who told me that it’s now harder to stumble into extremist content, but that it’s still there if you know where to look, and that it’s not impossible to find extremist groups based on your connections, friends, likes and other activity.

I’m not exactly sure what Twitter and Facebook can do to rein in their algorithms, but I suspect there is a way they can help people avoid being pulled in directions they might not otherwise go. What they can definitely do is take down extremist content as soon as they’re aware of it. They’re doing more of that, which is a good thing, but a lot of content still shows up that helps feed political extremism. And when they do take down extremist content, they get pushback not only from the groups themselves but also from public officials who feel that the companies are censoring political speech.

In the meantime, there are things we can all do, including not sharing content that may not be truthful, not joining groups whose views you detest, and putting thought into what you like or otherwise react to.

Clearly, we should not believe everything we see on social media, and we should use independent means to verify anything that seems fishy. You’ll find more advice in the Guide to Media Literacy and Fake News I co-wrote at ConnectSafely.org/Fakenews.

