by Amy Butcher '17
It’s 2013 and social media is everywhere. Words like “blogosphere,” “twittersphere,” and “[insert social medium here]sphere,” which once sounded awkward coming from the mouths of news broadcasters, are now daily occurrences. As for science’s relationship with social media, at first glance the pervasiveness of high-speed communication seems fantastic for scientists and science lovers alike. Surely social media must be good for scientific literacy: more ways to share and access information mean a more informed, perhaps even more enthusiastic, public, right? Anyone who was online when Curiosity landed on Mars could see firsthand social media’s role in stirring up scientific awe and inspiration: people were truly wrapped up in the story, across nearly every platform. Hooray! There is, however, a darker, more insidious trend in social media that counteracts genuine scientific communication online…
Like any medium, online communication can spread misinformation. With social media, however, this downside seems to be programmed in. There is the obvious bias that each user selects which links to click and which people to “add.” But there is also something called “news feed optimization,” or “algorithmic editing,” which ensures that on sites like Facebook the first posts we see are the ones we are likely to agree with, regardless of their accuracy (1). The algorithm is designed to show you primarily things you are interested in or agree with: if you click on a lot of conservative links, for example, Facebook is more likely to show you first the posts and opinions of your conservative friends (the ones who also clicked on lots of conservative links).
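To make the idea concrete, here is a deliberately simplified sketch of that kind of “algorithmic editing.” This is not Facebook’s actual algorithm (which is proprietary and far more complex); it is a toy illustration in which posts whose topic matches the user’s click history are ranked first, so agreeable content crowds out everything else.

```python
# Toy sketch of engagement-based feed ranking (NOT Facebook's real
# algorithm): the more you've clicked a topic, the earlier posts on
# that topic appear in your feed.
from collections import Counter

def rank_feed(posts, click_history):
    """Order posts by how often the user has clicked each post's topic.

    posts: list of (author, topic) tuples
    click_history: list of topics the user has previously clicked
    """
    affinity = Counter(click_history)  # topic -> number of past clicks
    # Higher affinity sorts first; Python's sort is stable, so ties
    # keep their original order.
    return sorted(posts, key=lambda post: -affinity[post[1]])

feed = [
    ("Alice", "conservative"),
    ("Bob", "science"),
    ("Carol", "liberal"),
    ("Dave", "conservative"),
]
clicks = ["conservative", "conservative", "science"]

# After two "conservative" clicks and one "science" click, the liberal
# post sinks to the bottom of the feed.
print(rank_feed(feed, clicks))
```

Even in this tiny model, a handful of early clicks is enough to push dissenting voices out of sight, which is exactly the feedback loop the article describes.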
The same goes for scientific, liberal, or any other kind of links and friends. If you click on a lot of liberal links, Facebook will start filtering out your conservative friends, robbing you of the chance to hear the opposing side of various issues (2), issues which certainly include those pertaining to science: from climate change to vaccination conspiracies, plenty of scientific questions are politicized. People thus become more and more entrenched in their particular viewpoint, regardless of the evidence or arguments for or against it. That is, Facebook facilitates confirmation bias. Considering both the number of people who rely on social media for news and the seriousness of some scientific issues today, that could be downright dangerous.