This is a contribution from Gilbert Dawson to a Mensa Connect forum. He is reporting on a documentary he had seen two days earlier.
I found it very illuminating regarding at least one cause of the recent and ongoing fragmentation of our country into groups that disagree with one another.
Denis
We streamed The Social Dilemma (2020, Netflix) Wednesday night. I downloaded the transcript Thursday morning. It's fascinating. It's also a bit frightening.
Several high-level creators* of major social media platforms (Google, Facebook, Snapchat, Twitter, Instagram, YouTube, etc.) share their views about what they have created: a collection of global systems that can learn and goal-seek by themselves. Once these systems are set in motion, no one, not a programmer and not an executive, really understands how they work. There is no one in charge, and they cannot be stopped.
These systems have three principal goals:
- Engage: Keep users participating.
- Grow: Attract more users.
- Make money: Show users ads.
Each of these systems has algorithms to maximize these goals and adjudicate tradeoffs. (For example, showing you too many ads, or the wrong ads, might reduce your engagement.) These algorithms are created to some degree by the programmers, but, more importantly, by the machines themselves, through a process called "machine learning", where constructs such as "neural networks" quickly evolve to achieve goals without anyone -- any human, anyway -- quite understanding how.
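The engagement-maximizing feedback loop described above can be sketched in miniature. The following is a toy illustration only, not any platform's actual code: an epsilon-greedy learner (a standard reinforcement-learning baseline, far simpler than a neural network) that is never told which content "works" and discovers it purely from simulated click feedback. The content categories and click rates are invented for the example.

```python
import random

# Toy illustration -- not any platform's real code. An epsilon-greedy
# learner maximizes a simulated "engagement" signal: it is never told
# which content works, and discovers that from click feedback alone.

random.seed(42)

# Hypothetical content categories with hidden per-impression click rates.
TRUE_CLICK_RATE = {"news": 0.30, "outrage": 0.70, "cats": 0.55}

def run(trials=5000, epsilon=0.1):
    shown = {c: 0 for c in TRUE_CLICK_RATE}
    clicks = {c: 0 for c in TRUE_CLICK_RATE}
    # Current best guess of each category's click rate (optimistic start).
    estimate = lambda c: clicks[c] / shown[c] if shown[c] else 1.0
    for _ in range(trials):
        if random.random() < epsilon:                   # explore at random
            choice = random.choice(list(TRUE_CLICK_RATE))
        else:                                           # exploit best guess
            choice = max(TRUE_CLICK_RATE, key=estimate)
        shown[choice] += 1
        if random.random() < TRUE_CLICK_RATE[choice]:   # simulated user click
            clicks[choice] += 1
    return shown, {c: estimate(c) for c in TRUE_CLICK_RATE}

counts, estimates = run()
print({c: round(r, 2) for c, r in estimates.items()})
```

The learned estimates converge toward the hidden click rates, and the system drifts toward whatever content engages most, with no human having specified that outcome. This is the sense in which the goal, not a programmer, steers the behavior.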
There are unintended side effects of these systems; collateral damage, if you will. For example, these social media systems bear a great deal of responsibility for fragmenting our society, dividing us into groups whose members don't agree with the members of other groups on many issues.
Here's an experiment: Into Google's search field, type "climate change is". Don't hit return. Notice that Google proposes some phrases to complete your query. The completion phrases that you see will be tailored for you. Your phrases will differ from, say, mine, or from many of your friends'. This is one of the ways we -- very subtly -- grow into self-affirming groups.
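One rough sketch of how per-user completions could diverge (an invented illustration, not Google's actual mechanism, which is far more elaborate): suppose completions were simply ranked by each user's own search history. The users and queries below are hypothetical.

```python
from collections import Counter

# Invented illustration -- not Google's actual algorithm. It shows how
# the same prefix can complete differently for different users when
# completions are ranked by each user's own history.

def suggest(prefix, history, k=2):
    """Return this user's k most frequent past queries starting with prefix."""
    hits = Counter(q for q in history if q.startswith(prefix))
    return [q for q, _ in hits.most_common(k)]

# Two hypothetical users with different search habits.
alice = ["climate change is real", "climate change is real",
         "climate change is accelerating"]
bob = ["climate change is a hoax", "climate change is a hoax",
       "climate change is exaggerated"]

print(suggest("climate change is", alice))
print(suggest("climate change is", bob))
```

Each user sees completions that echo their own past behavior, so the self-affirming loop closes without anyone designing it as such.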
Here's another: Start a Facebook post. Into the text field, type a single character, an "@". Don't hit the return. Facebook will propose a list of names. Again, this list will be unique to you. These are, to one degree or another, members of your Facebook "group". These are the people from whom you are likely to hear, if they post.
We used to call these "echo chambers", but now that term is insufficient to describe the myriad machine-defined groups to which we each belong.
There are other unintended consequences. The innocuous-seeming "Like" button, for example, was intended to spread positivity and love in the world. However, some people get depressed when they don't have enough likes. They may even seek medical help.
The "Like" button could be leading us subtly toward political polarization. We really do not quite know all the ramifications, nor, it seems, can we. But, in general, these social media systems are starting to erode the fabric of how society works.
The movie suggests some rather paltry defenses against this change in our social fabric: delete your apps, limit your children's exposure, etc. But it's pretty clear that the phenomenon is too entrenched to be stopped, even by governments. The machines have taken over this aspect of our social lives.
So, where will this juggernaut end up?
There are plenty of doomsaying movies to forecast dystopian futures (Idiocracy being my current favorite), but I'd like to see a story depicting an attractive future. Perhaps a good story can suggest how we might make adjustments so that these awesome tools could help us achieve some universally desirable goals. Can you think of such a story?
"Universally desirable goals"? What might these be? Can we think of even one goal that would be, truly, universally desirable? I thought that it would be easy. Perhaps it is not.
I look forward to your thoughts.
* - Partial list of interviewees:
Aza Raskin, who helped start Mozilla Labs.
Alex Roetter, former senior vice president of engineering at Twitter.
Tim Kendall, former president of Pinterest and former director of monetization at Facebook.
Jeff Seibert, former head of consumer product at Twitter.
Justin Rosenstein, co-inventor of Google Drive, Gmail Chat, Facebook Pages, and the Facebook "Like" button.
Tristan Harris, former design ethicist at Google.
The above is from Gil Dawson, Lake Hughes, CA
In the Mensa Connect post that followed Dawson's, Steven Bloom, in Chula Vista, made this astonishing assertion on another topic:
"If you google "child drinking milk directly from a cow", copy one of those images, post it on Facebook, you will be instantly suspended from Facebook for 3 days."
I would not be alarmed that an internet platform could make such a decision were it not that such platforms are allowed under Section 230 to deny any responsibility for content provided by a user of the service. In other words, they are asserting the prerogatives of a publisher while taking advantage of a law that strips them of the responsibility and liabilities of a publisher of content.
Just thought you might find this interesting.
Denis