This is a fruitful disagreement with contributions from:
Cited Participants: Linda Dong, Karla Cole, J.T. Trollman, Rasmus Andersson, amélie lamont, Chikezie Ejiasi, Ankit Shah, Renée DiResta, Robert Mueller III, Tristan Harris, Daniel Burka, Julie Ann Horvath, Kevin Kwok, Bobby Goodlatte, Joel Califa, Mike Monteiro, John Hanawalt, Elizabeth Warren, Josh Puckett, Kimberley D.
Curators: Buster Benson, Vicki Tan (DM one of us on Twitter for an invite if interested in helping curate)
💫 WHAT IS THIS DOCUMENT’S GOAL?
To help more of us collectively learn, orient, and ultimately take positive steps forward on the problems highlighted by this disagreement. We are but human and will no doubt make some missteps along the way, and we know we're not the final authority on fruitful disagreement, so please send us feedback, corrections, and additions that might help us improve over time.
STEP 1: LEARN. Gather facts and articulate values from multiple perspectives.
STEP 2: ORIENT. Identify shared values that point to evidence of improvement. Propose actions to get there.
STEP 3: ACT. Commit to acting on proposals and checking in again once new facts turn up. Repeat if necessary.
🐲 WHAT IS THE DISAGREEMENT ABOUT?
🧠 FACTS: What do we know about Facebook's impact on the world?
❤️ VALUES: What do we think about Facebook's impact? Who should feel bad about it? How bad should they feel?
✋ PROPOSALS: What should Facebook, Facebook employees, and others do to make a positive impact?
🧠 Question about facts: Is there evidence that disinformation campaigns influenced the outcome of the 2016 Presidential election?
“The hard truth is that the problem of disinformation campaigns will never be fixed; it’s a constantly evolving arms race. But it can — and must — be managed. This will require that social media platforms, independent researchers and the government work together as partners in the fight. We cannot rely on — nor should we place the full burden on — the social media platforms themselves.” - Renée DiResta
There is evidence that disinformation campaigns went to great lengths to influence the election, but it’s tough to know conclusively how much those efforts succeeded.
🧠 Question about facts: Is there evidence that Facebook’s content and/or ad policy has led to a negative impact on society?
SUPPORTING EVIDENCE
Strongest evidence
“The Myanmar military’s Facebook operation began several years ago, said the people familiar with how it worked. The military threw major resources at the task, the people said, with as many as 700 people on it.” - NYT
“Members of the Myanmar military were the prime operatives behind a systematic campaign on Facebook that stretched back half a decade and that targeted the country’s mostly Muslim Rohingya minority group, the people said. The military exploited Facebook’s wide reach in Myanmar, where it is so broadly used that many of the country’s 18 million internet users confuse the Silicon Valley social media platform with the internet. Human rights groups blame the anti-Rohingya propaganda for inciting murders, rapes and the largest forced human migration in recent history.” - NYT
CHALLENGING EVIDENCE
The misinformation campaign by Myanmar’s military was run via Facebook pages and Messenger, not via ads.
“I wouldn’t say Facebook is directly involved in the ethnic cleansing, but there is a responsibility they had to take proper actions to avoid becoming an instigator of genocide.” - Thet Swe Win, founder of Synergy, a group that focuses on fostering social harmony in Myanmar