20200602_1619 - misinformation, infowar, and content policing on social media
* Note created: [time=Tuesday, 02 June 2020 16:19:01 +0700]
* Note updated: [time=Saturday, 06 June 2020 14:01:01 +0700]
* ###### tags `sTREaming`
Finally got around to reading this yesterday, after adding it to my Toby backlog on May 29th.
> What responsibility does Facebook have to the truth?
> Is Facebook responsible for fake news? And if so, is Facebook responsible for outcomes that can be tied to the proliferation of fake news?
> Basically, Facebook needs to be called out on its ==selective hyper-vigilance==.
> On Facebook, only the original creator of a post on Facebook can edit it. The crowd has no recourse but to complain to Facebook, and Facebook, of course, has already said that the content isn’t their problem. So no matter how interested the crowd might be in preserving the truth, there is no mechanism for them to do that. That’s an ==architectural decision== that seems obviously preferential when designing a “social network.”
> The second question is one that is probably not possible to answer without returning to the first. It is one of __epistemology__ — ==how we know what we think we know==. In a large, distributed human society such as ours, the majority of our knowledge is, philosophically speaking, epistemologically weak. Most of what we know, we know because someone we trust told us so. Which means that most of our knowledge is second-hand, at best. But how could it be any other way?
Added note on the clipping ([using worldbrain.io](https://getmemex.com/)):
> so my current thought is:
> If FB doesn't want to censor fake news, they shouldn't censor anything else.
> if Twitter fact checks Trump, it should fact check everyone else (`// todo: read up on what actually happened there. got reports and issued the fact checking?`)
> Ning [did it manually](https://medium.com/@diego./the-sin-eaters-of-the-tech-industry-ab11abb85331), wikipedia crowdsources it.
> everything is by design. but we can't get everything right from the start.
> so policies should change. and changes to the system follow. everything is a living breathing changing being
And then today (June 2nd) stumbled upon this: <https://www.theverge.com/interface/2020/6/1/21276969/facebook-walkout-mark-zuckerberg-audio-trump-disgust-twitter> (`// got it from twitter?`)
Internally FB employees are speaking up and doing virtual walkouts because of this. `// been seeing tweets about this the past 12 hours and wondering what it was about.`
idk.... this is complex stuff. content censorship and policing
Fake news, hate speech, dangerous content, illegal content... subjective and complex.
Can't be done perfectly by algorithms; can't be done perfectly by humans.
Technical aspects aside, hosting and content moderating is a wicked problem. Not that simple. Can't keep everyone happy. Judgement calls.
Just a quick recap on recent events for context when I read this in the future:
- Twitter fact-checked and flagged Trump's "shooting and looting" tweet (`// why now and why stop there`)
- YouTube got into it too, with people freaking out and calling it a violation of free speech.
- FB got into this multiple times.
I personally don't care much about the `delete facebook`, `quit FB` movements throughout the years.
I haven't been active there since early 2010; it just gradually lost its appeal, around the same time as for many people my age, I think.
Nothing too political. I still keep the account as there's no harm in controlling that page for my own digital footprint.
`// cue: [[everything is cyclical. lens of abundance and scarcity]]`
* Update: [time=Friday, 05 June 2020 09:19:01 +0700]
Came across this article: <https://washingtonmonthly.com/magazine/january-february-march-2018/how-to-fix-facebook-before-it-fixes-us/> #misinformation #disinformation
The author shared some hypotheses on how the social media platforms were used, and were complicit, in what happened in the US election, plus some suggestions on how to _fix the problems_.
* Update: [time=Saturday, 06 June 2020 14:01:01 +0700]
[Scott Galloway tweeted](https://twitter.com/profgalloway/status/1268208690609389571?s=09) out [this OpEd](https://www.nytimes.com/2020/05/30/opinion/sunday/trump-twitter-jack-dorsey.html)
> The Wall Street Journal had a [chilling report](https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499) a few days ago that Facebook’s own research in 2018 revealed that “our algorithms exploit the human brain’s attraction to divisiveness. If left unchecked,” Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”
> Mark Zuckerberg shelved the research.
OK, didn't realise. Wasn't sure how "guilty" Facebook is, but I am wrapping my head around it more.
> These platforms have very dangerous profit motives. When you monetize rage at such an exponential rate, it’s bad for the world. These guys don’t look left or right; they just look down. They’re willing to promote white nationalism if there’s money in it.
Not surprising but that puts things into perspective.
Articulated well [here](https://twitter.com/DanMunro/status/1268212155792232448):
> Solid pitch, but begging, pleading, cajoling or shaming people into taking less profit never works - which we've also seen in healthcare.
> "How many businesses do you know that want to cut their revenue in half? That's why healthcare won't reform healthcare."
> - Rick Scott
Also, I need to steal the phrase "monetising rage" some time. Catchy.
> “He’d be the equivalent of a slippery floor at a store that sells equipment for hip replacements,” says Galloway, who also posits that, in our hyper-politicized world, this will turn Twitter into a Democratic site and Facebook into a Republican one.
Some interesting snippets from the comment section:
> I like to be called grandpa, dad, and I do my best to be a loyal friend to people I trust and who have a sense of humor. Dorsey? Zuckerberg? I don't like or trust them. They are pretentious cowards. I want nothing to do with their social media platforms.
> Live your life with passion. Get off Facebook and Twitter. They'll suck you dry.
> I turn 71 today, if I'm considered out of touch, then you have no idea what being in touch really means. Too bad for you.
> Fun to think about, but never gonna happen. The internet changed all that, and cable news had changed our tribalism before that. I can now go find my “own facts” about politics, racism, disease, education, whatever on the “world wide web”. Meanwhile, I need to be constantly stimulated and that’s not Facebook’s fault or Twitter’s. I can’t read any more because I need to have a screen up. I can’t focus any more because my brain has been hyper-stimulated.
> I’d like Jack to “pull the plug” and Zuck to “see the light”. But, why is this on them? Media figures need attention, like oxygen. If we start ignoring them (like maybe the Times not printing Trump’s tweets), well. Maybe they’ll go away.
Perhaps we don't need to tap into the ==hot mess of the "global brain" right now== to do good sensemaking.
* Update: [time=Monday, 08 June 2020 12:24:01 +0700]
Another great piece
* Update: [time=Tuesday, 09 June 2020 14:49:01 +0700]
YouTube is caught in the same trap as FB. Too scared to craft and enforce a good, nuanced content policy, but also highly reactive to public outcry. Root problems go unaddressed in favor of scrambling to address today's crisis in a ham-fisted way. People learn ==outrage works==. The cycle repeats.
Ironically, "outrage works" is closer to the ==root problem of engagement-oriented ranking and distribution==. If we really want to fix divisive and harmful content, trying to circumscribe 'the bad' is unlikely to be effective, and it will raise legitimate questions about editorial judgment.
The root problem is that treating a view or a like as relevance will inevitably lead down the path where people make shitty and divisive content. Trying to build policy and models to clean up the fringes of that system ==isn't addressing the core incentives==.