These days, trolls don’t necessarily lurk beneath bridges in order to ensnare unsuspecting travelers. Instead, they hide out in the comment sections of social media posts, ready to incite wrath and stir up controversy with their incendiary remarks. Because Facebook knows how quickly reasonable discourse can devolve thanks in part to these online trolls, it has made a move to establish intelligent discussions through its new “Forecast” app.
The premise of Forecast is fairly straightforward. Facebook has invited an assortment of so-called experts (whether from medicine, academia, or some other field) to cast their votes on predictions about the future. Not only will they share their votes, they’ll also pitch in their own two cents about these predictions, sparking what is expected to be an insightful and reasonable conversation about the topics.
However, while the premise is exciting (smart people! not basement dwellers! talking about serious stuff!), there’s more than a small amount of risk associated with Forecast. For starters, what exactly is Facebook planning on doing with all of this information that is being volunteered on their app? And secondly, are they going to take precautions to help prevent the spread of misinformation when these results are eventually published?
The fact is, Facebook is notorious for spreading misinformation. Now, I’m not blaming Facebook itself for this issue. Rather, the sheer volume of its user base inevitably leads to flame wars and dishonesty. You can’t spell “Fake News” without at least a couple of the same letters used in Facebook. Or something like that. The problem arises when people see the results of these polls, recognize that the information is being presented by these hand-picked experts, and then immediately take them at face value.
It’s not so much that most people are simple-minded or unable to think for themselves; rather, they’re primed to believe that the admittedly educated guesses from these experts are somehow better, smarter, than what would be presented to them by the average layperson. The bias is inherent in the selection process of who is and isn’t allowed to vote. By excluding everyday folks like you and me (I certainly wasn’t given an invite!), the app may lend undue prestige to these projections.
At the moment, many of these projections are silly bits of fluff. One question asks, “Will Tiger King on Netflix get a spinoff season?” Another one wonders, “Will Mulan debut on Disney+ at the same time as or instead of a theatrical release?” But other questions? Well, they’re a little more serious than that. And speculating on serious issues (such as COVID-19, or the presidential election) can lead to the spread of serious — and potentially dangerous — misinformation.
Facebook has implemented very strict guidelines about what types of questions are allowed and which ones are forbidden. That, at least, is a step in the right direction. It’s no secret that expectation can actually lead to the predicted outcomes, directly influencing actions and behaviors. While it’s too early to tell if Forecast will ever gain that much power, it undoubtedly puts us in a position of wondering if and when intervention may be necessary.
But I’ll be honest with you: I don’t exactly trust Facebook’s ability to put this cultivated information to good use. Sometimes a troll doesn’t have to be overtly provocative in order to be effective, and it wouldn’t be too much of a stretch to see someone in a position of power exploit the results of these polls to influence the public. It’ll be interesting to see if Forecast is still around in the next few years, but alas, there’s no option for me to submit my vote on that to find out.