With less than a month before the next U.S. Presidential election, here are new survey results from the development team of Blind, an anonymous messaging/virtual community app for company staffers that's especially popular at large tech firms.
This one asked a question that's been the subject of constant press coverage since the 2016 election: Is Facebook doing enough to prevent the spread of misinformation and hate speech related to the 2020 election on its platform?
With nearly 1,600 responses, most from team members at major technology companies, here are the results:
By overwhelming margins, ranging from the mid-60s to the mid-80s in percentage terms, staff at Microsoft, Apple, Cisco, and other tech firms believe Facebook is not doing enough. Notably, staff at several of these companies -- LinkedIn, Google (through YouTube), and Amazon (through its customer reviews section) -- deal with preventing misinformation and hate speech on their own platforms, and presumably have better-than-average knowledge with which to evaluate Facebook's efforts thus far.
Contrast that with the response of Facebook staff themselves:
71% of respondents answered that the social network is doing enough to combat election-period hate speech/misinformation. (Still, it is notable that nearly 1 in 3 Facebook staffers surveyed believe the company could do more.)
Launched last week, the Blind survey ran until yesterday -- when, as it happened, Facebook removed a post by Trump falsely claiming that the flu is more dangerous than COVID-19.
You can read more details on Blind's company blog -- including results on a related question: whether (as has been reported by Bloomberg and other outlets) Facebook's policies and algorithms are designed to support Trump and other right-wing political figures. The answer:
31% of professionals believe Facebook’s policies and algorithms are designed to benefit Trump’s campaign. By contrast, only 5% of Facebook professionals believe that.
Yes, just 5%.
Previously: In Biden Vs. Trump Survey, Staff At Top Tech Companies Less Liberal Than Often Assumed.
So why does it matter at all? Are facebookers too stupid to make up their own minds? I think that is what we are saying here. Facebook has to figure out what is propaganda and what is fact, and then suppress (censor) that propaganda on the platform, because their users cannot tell what is true and what is false.
It is a pretty sad statement on Facebook users. I would rather have Facebook be a platform and let the users figure out what to believe.
Posted by: TD Gunner | Thursday, October 08, 2020 at 05:28 AM