An Open Letter to Facebook on What ‘The Social Dilemma’ Gets Wrong

An open letter to Facebook and everyone involved in its response.


Our News Feed product teams are not incentivized to build features that increase time-spent on our products.

Note the wording: “News Feed product teams.”

Instead, we want to make sure we offer value to people, not just drive usage.

“Offer value” is the same wording used when employing moderators. Moderators you ignored and treated worse than slaves.

And to this I say bravo for the careful wording around the News Feed product teams, because they aren’t the ones responsible for building features.

That task falls to the Product Engineering team.

There is a Software Engineer for Ads, an Engineering Director, a Front-End Engineer, a Data Engineer, and an Enterprise Engineer.

The “Product teams” are not responsible for building new features. Now there is the Production Engineering team, but that’s not what was written. Was this simply a typo or a dance with words?

What is the value of Facebook?

According to GOBankingRates, Facebook’s net worth is valued at approximately $138.3 billion (£105.2 billion). Its market cap range is thought to be roughly $346.4 billion to $468 billion (£263.6 billion to £356.2 billion). (30 July 2018)

What are Facebook’s values?

Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems.

You are correct. The documentary skipped key information about how algorithms work, mainly that much of what gets called “AI” is actually other people. I concede this point. However, you skip over the simple fact that

FACEBOOK is incentivized to build features that increase time-spent on Facebook’s products.

For example, in 2018, we changed our ranking for News Feed to prioritize meaningful social interactions and deprioritize things like viral videos.

Adam Mosseri published a Newsroom post, “Bringing People Closer Together.” The wording used was: “Using ‘engagement-bait’ to goad people into commenting on posts is not a meaningful interaction, and we will continue to demote these posts in News Feed.”

“In a statement from Mark Zuckerberg, Facebook will be demoting content that comes close, or “borderline,” to the policy line of prohibited content.”

This was in March 2018, when it was discovered that Facebook had known about massive data theft and done nothing.

Facebook’s response was an apology tour and policy changes.


Yes, Facebook did rework the News Feed. So what?

We collaborate with leading mental health experts, organizations and…

Oh, do you now?

You also produced headlines like these:


At Facebook’s worst-performing content moderation site in North America, one contractor has died, and others say they fear for their lives


The Trauma Floor


“Hurting People at Scale”: Facebook’s Employees Reckon With the Social Network They’ve Built


By that one action alone, you lose any credibility regarding the “mental health experts” you claim to have on payroll.

$52 million is FINALLY being paid out to content moderators as compensation for mental health issues developed on the job, after you systematically fired people who didn’t meet your INSANE NPS charts.


“If you are not paying for the product, you are the product.”

But even when businesses purchase ads on Facebook, they don’t know who you are.

This is true. As a business, you do not know who the individual user seeing the advertisement is. It also has nothing to do with the privacy issues surrounding Facebook. People aren’t unhappy because GrannyMcsith at the family diner knows they like buying croissants. People are unhappy that FACEBOOK knows they want to buy croissants, and then those adverts follow them around the web. But like Mark said, “We run ads.”

Facebook’s algorithm is not ‘mad.’

I believe the film used the term “mad” because it’s easy to understand.

Source: Andrew Kirby

Facebook has one of the most advanced AI centers in the world.

Explaining the algorithms is, one, pointless, as they are proprietary, and two, boring as heck for the average Sally or Joe. However, showing a digital avatar jerked around like a puppet makes the same point in a way that’s easier to understand.

Over the last year, we have made significant changes as part of our agreement with the Federal Trade Commission.

Would that perchance be the $5 billion FTC settlement?


Despite what the film suggests, we have policies that prohibit businesses from sending us sensitive data about people, including users’ health information or social security numbers, through business tools like the Facebook Pixel and SDK.

Period trackers are OK, though. Got it.


We’ve acknowledged that we made mistakes in 2016

Multiple times. Again and again. Facebook is “very truly sorry.”


The film does indeed leave out most political information, most likely to prevent polarization.

We don’t want hate speech on our platform and work to remove it, despite what the film says.

While this is true, Facebook takes FAR TOO LONG debating its lists of rules about what justifies removing a post.

Did you know Facebook has a classification system for murder? The amount of blood shown determines whether a post is removed, and a wrong classification counts against the moderator removing the post.

It’s insanity. Just remove the dang post already.

Stop debating what level of death constitutes gore and remove it.

When a group of your users says, “Hey, we need this,” actually consider their thoughts.

We know our systems aren’t perfect, and there are things that we miss. But we are not idly standing by and allowing misinformation or hate speech to spread on Facebook.

But you are stealing data.

And you are treating your workers like pure garbage, causing fights over the details of removing a post based on an ever-changing set of metrics that, in the long run, don’t matter in the slightest.

Let’s talk about just ONE of those complex problems.


A Genocide Incited on Facebook, With Posts From Myanmar’s Military 2018


Let’s talk about those difficult and complex societal problems.

Zawgyi vs. Unicode: What’s the Difference?


When Facebook moderators begged for Myanmar translations, Facebook did not listen. Facebook’s moderators’ concerns were swept aside, and it is primarily the inconsistencies in translation, caused by Facebook’s own inaction, that allowed the genocide to come to fruition.

Facebook ignored its own moderators’ requests and did not listen to the reports submitted day after day. It was by suppressing the voices of its moderators that this terrible event was allowed to happen.

Facebook moved fast and broke things. Things, in this case, being human lives.
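The root of the translation mess is technical as well as organizational. Here is a minimal sketch (my own illustration in Python; the sample code-point sequences are hypothetical, not real Burmese words) of why Zawgyi and Unicode text are so hard to tell apart:

```python
# Illustrative sketch, not Facebook's actual pipeline: Zawgyi and standard
# Unicode Burmese both draw their characters from the same Myanmar block
# (U+1000-U+109F), so a naive code-point check cannot tell them apart.

MYANMAR_BLOCK = range(0x1000, 0x10A0)

def contains_myanmar_text(text: str) -> bool:
    """Return True if any character falls in the Myanmar Unicode block."""
    return any(ord(ch) in MYANMAR_BLOCK for ch in text)

# Two hypothetical strings: the code points differ, but both sit inside
# the same block, so this check flags both and distinguishes neither.
sample_a = "\u1019\u1004\u103a\u1002\u101c\u102c"  # Unicode-style sequence
sample_b = "\u1019\u1004\u1039\u1002\u101c\u102c"  # Zawgyi-style sequence

print(contains_myanmar_text(sample_a))  # True
print(contains_myanmar_text(sample_b))  # True
```

Telling Zawgyi from Unicode requires statistical detection, which is exactly what the open-source myanmar-tools project that grew out of this conversion work provides; a simple range check, as above, cannot do it.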

Facebook Racks Up 10m Myanmar Users 2012


Facebook in Myanmar: Amplifying hate speech? 2014


Facebook becomes outlet for religious anger in Myanmar 2014


Digitalizing Myanmar: Connectivity Developments in Political Transitions, Andrea Calderaro 2014


Wow! Myanmar is Going Straight to Smartphones 2015


The Problem with Facebook, by Mizzima Weekly, 12 March 2015


Facebook nods to Zawgyi and Unicode 2015


Facebook rules localized for Myanmar — with extra adorable cartoons 2015


The Facebook-Loving Farmers of Myanmar 2016


Translation in Myanmar: A Struggle in Today’s Market 2016


3G internet just arrived for Myanmar farmers, but they’ve had Facebook for ages 2016


Facebook removes Burmese translation feature after the Reuters report 2018


Facebook Slow to React to Violence, Hate Speech in Myanmar 2018


Integrating autoconversion: Facebook’s path from Zawgyi to Unicode 2019


To quote Facebook’s own engineering post:

“These tools will make a big difference for the millions of people in Myanmar who are using our apps to communicate with friends and family.”

Facebook says technical error caused vulgar translation of the Chinese leader’s name 2020


This was one terrible incident hidden among several others, highlighting that Facebook puts profit above humanity.

The film’s creators do not include insights from those currently working at the companies or any experts that take a different view of the narrative put forward by the film.

To quote Sean Parker:

‘It’s a social-validation feedback loop. It’s exactly the kind of thing that a hacker like myself would come up with because you’re exploiting a vulnerability in human psychology.’

— Sean Parker (Source)



Facebook addiction and personality

Thipparapu Rajesh and Dr. B. Rangaiah


The uses and abuses of Facebook: A review of Facebook addiction



Facebook Addiction: Onset Predictors

Roberta Biolcati, Giacomo Mancini, Virginia Pupi, and Valeria Mugheddu


Research finds heavy Facebook users make impaired decisions like drug addicts 2019


How about those acquisitions?

Hello, an app for teens?


How about tbh?


Furor Erupts Over Facebook’s Experiment on Users


A Facebook employee says he was fired for speaking out about his colleague’s suicide.


‘Do Not Discuss the Incident,’ Facebook Told Employee Fired After Speaking About Worker Suicide

A deleted Vice article



Here’s Mark Zuckerberg explaining the internet