The Facebook Papers: We’re Not Asking The Right Questions

It's difficult for me to get excited about the latest deluge of damning revelations about the myriad deleterious side effects of using Facebook. I realize that quite a bit of the current debate centers around the impact of the company's products on minors, so it's not quite as simple as waving away the controversy by reference to users' willful disregard for their own psychological well-being. Still, as I sifted through various media coverage detailing findings from a trove of internal documents…



23 thoughts on “The Facebook Papers: We’re Not Asking The Right Questions”

  1. Not willing to say the genie is out of the bottle with respect to FB, but there are a couple of things the company could do to significantly reduce the damage it is doing: 1) chuck the algo and populate the news feed with status updates from ALL friends, in real time, as they are posted to the site, no exceptions; 2) implement a rigorous screen to reduce/eliminate bots on the site. Both would be easy fixes for FB engineers, but because they’d likely result in a halving of the company’s ad revenue (bad for shareholders, a plus for society), Zuck and the board will never consider them. If FB execs and the board continue to drag their feet on mitigating the damage FB is causing, the Justice Dept. should reach into its antitrust toolkit and force a breakup of the company.

  2. Maybe we should take a page out of Xi’s book and play hardball with social media. I do think it would be too political and would probably drop GDP significantly as a result of reduced Add revenues (pun intended).

  3. Capitalism (profits before people) and the centralization of data are a lethal combination. A few tweaks to programming aren’t going to fix the underlying problem, and Facebook execs are going to fight like hell to keep their compensation packages on an upward trend. And Zuck always has and always will think he’s the smartest IT guy in the world.

  4. Foundation or The Matrix?
    No such thing as artificial intelligence: a machine does what we tell it.
    To the machine, we are god. A. C. Clarke got pretty close to a universal law.

    1. Actually, many current forms of AI these days use a Darwinian approach to program themselves, to optimize efficiency, and protect themselves from human intervention once a goal has been set. Top level AI is too complicated for us to “tell it” anything.

  5. H. One of your very best, sir.

    You said, “The more important line of criticism should focus on the extent to which Facebook doesn’t fully understand the gravity of the experiment it’s running, let alone the ramifications of allowing it to continue without reprogramming the AI so that it adheres to a new set of instructions.” The truth is that FB’s AI is amoral and doesn’t care about its ramifications. Neither does the boss (a guy who definitely doesn’t take direction very well).

    As I was reading your post I could suddenly hear Captain Kirk carefully explaining Star Trek’s “Prime Directive,” which forbade the crew from influencing the behavior of the civilizations it discovered in its explorations. Facebook has no such prime directive, although it could have one, by putting constraints on the algorithm. However, the whole purpose of advertising, the lifeblood of social media, is to influence individual behavior: the diametric opposite of the prime directive. For that reason Facebook will not change fundamentally, no matter what the evil genius says to Congress or the public.

  6. The proper analogy might be biological: cancer. Every second of your life, something goes wrong with some of the cells in your body–a mutation, a viral infection, etc. The proper response of a cell in that circumstance–which happens almost all of the time–is apoptosis. The cell enters a program of suicide. While it may seem counterintuitive that a cell would kill itself, it makes sense in terms of organismal biology. Because if the damaged cell fails to undergo apoptosis, it initiates cancer. I propose that Facebook is a cancer on our civilization. If it had been programmed correctly (with proper regulation of the internet ca. 1999 as a utility that needs to serve the public good, and with antitrust enforcement that would have prevented the acquisition of Instagram and WhatsApp), then Facebook would have undergone something like apoptosis by now. Instead, it’s a cancer. It’s long past time to initiate chemotherapy.

  7. Seems to me there’s a desire in some humans to be able to create life beyond human procreation – Frankensteins, cloning and now AI and a billion billion on/off settings manipulated by a desire to be a god. Imagine the juice Zuck might feel knowing he can steer the behavior of billions. I don’t think he’s going to willingly dismantle his god machine.

  8. The challenge with “changing the rules of the AI” is that, at its core, Facebook’s feed-serving algorithm is nothing more than an optimizer. The AI portion just predicts engagement with a given piece of content based on the user’s interactions with previous content.

    As anyone who has dealt with optimization in the past can attest, the “ethics” and “morals” of an optimizer must be embedded in the cost function or its constraints. Facebook’s cost function is the inverse of predicted ad-related engagement; that’s it. How else could you pick the cost function to be minimized? There’s little else the company can measure beyond engagement (different kinds of clicks and time spent on a given page). Could it measure the civility of the user, or happiness? It attempts to control its optimizer by removing some content.

    The “bad” thing about “the algorithm” is that we can’t stop ourselves from engaging with extreme content. The individual definition of extreme necessarily shifts to further extremes the more we interact with the platform and become desensitized to previous extremes.

    If anything, Facebook is lazy and doesn’t curate content like TikTok does or limit the type of content published like Snap does. The only solution to extremist content is removing anonymity and requiring proof of identity.
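     The optimizer point above can be sketched in a few lines. This is a toy illustration, not Facebook’s actual system: the posts, the `toxicity` signal, and the penalty weight are all hypothetical assumptions, chosen only to show that an optimizer’s “values” live entirely in its objective, not in the engagement model itself.

     ```python
     # Toy feed ranker: objective = predicted engagement - penalty * toxicity.
     # With penalty = 0, the objective is pure engagement, so the most
     # inflammatory post wins; raising the penalty changes the "ethics" of
     # the optimizer without touching the engagement predictions at all.
     from dataclasses import dataclass

     @dataclass
     class Post:
         title: str
         predicted_engagement: float  # model's click/dwell estimate, 0..1
         toxicity: float              # hypothetical content-quality signal, 0..1

     POSTS = [
         Post("Outrage bait", predicted_engagement=0.9, toxicity=0.8),
         Post("Dry tax-policy report", predicted_engagement=0.2, toxicity=0.0),
         Post("Friend's vacation photos", predicted_engagement=0.5, toxicity=0.1),
     ]

     def rank(posts, toxicity_penalty=0.0):
         """Sort posts by the objective, highest score first."""
         score = lambda p: p.predicted_engagement - toxicity_penalty * p.toxicity
         return sorted(posts, key=score, reverse=True)

     print([p.title for p in rank(POSTS)])                        # engagement only
     print([p.title for p in rank(POSTS, toxicity_penalty=1.0)])  # values added
     ```

     With the penalty at zero, “Outrage bait” (score 0.9) tops the feed; at a penalty of 1.0 its score drops to 0.1 and the vacation photos (0.4) rise to the top. The constraint rewrote the ranking, which is the commenter’s point: the morals have to go into the cost function.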

  9. This is exactly what you need to know about FB: “The algorithm is using what it learns about billions of people to help third parties manipulate human emotions and affect decision making.” The better they are at keeping you on the platform and doing the above, the more money they make.

  10. Truly, it worries me when so many of my fellow commentators and our host are so down on FB (as a proxy for social media, hopefully. No one speaks of YouTube or TikTok, presumably b/c videos are tough to analyse with a text crawler of some sort) – without the beginning of a proof.

    https://about.fb.com/wp-content/uploads/2021/09/Instagram-Teen-Annotated-Research-Deck-1.pdf

    Please check page 13. Everyone is now crying crocodile tears about “the children” (“who will protect the children? whawhawha.” Children are always the perfect excuse to take away people’s freedoms), but the research now used to condemn FB shows that Insta makes things better in 13/13 categories for boys and 12/13 for girls. I appreciate that the one category where it makes things worse for girls is body image, a fraught but genuinely important topic. Still… 12 out of 13.

    And it’s the same for every argument. Not to mention that journalists, notably the NYT, hate FB and are happy to lie in order to hurt the company that destroyed their industry. But, until someone gives some kind of data-backed proof otherwise, I will continue to contend that FB is, by and large, a reflection of society, a mirror rather than an active agent of change.

    FB did not create liberals’ contempt for rural conservatives nor create rural conservatives’ hatred of big city liberals. That was Murdoch’s doing, by and large. Abetted by the Republican party. How many elected R officials have pushed back on the Big Lie that the 2020 election was stolen? Zero. None. And Fox News keeps repeating and repeating that lie. Is it really any surprise that 60% of Republican voters believe it? I mean, I’m impressed 40% are mentally strong enough to reject it!

    1. The best way that “the government” can help the sheeple is to teach critical thinking and proper analysis, skepticism and due diligence regarding the “world wide web” and all of its cohorts in the public school system.

      1. By the way, this is a lesson that parents should be teaching their children. And, at least in my case, the adult children should be teaching their parents (ages 88 and 88).

    2. The Facebook algorithm promotes misinformation, because misinformation drives engagement.

      What are you most likely to click? Something stupid that Ron DeSantis said? Or some dry report about tax policy?

      Facebook is the National Enquirer of the internet. Whatever is vulgar, obscene, grotesque, outrageous, that’s what drives clicks. The algorithm will surely steer you to all that misinformation.

  11. Sorry H, but I think you are giving Zuckerberg way too much of a pass here. The Algorithm is in fact doing what he wants it to be doing: driving engagement. Facebook long ago learned that the best content for driving up user engagement is the kind that evokes negative emotional responses. In fact, Facebook even hired psychologists and sociologists to help it understand human behavior, especially what drives addiction. It then adapted its algorithm and platform to drive this type of engagement. Scores of former Facebook engineers have come out and said they built this thing to do what it is currently doing. Zuckerberg, per the Facebook papers, deprioritized work that would make the platform safer and contain less hate speech and misinformation in favor of work that “drives engagement.” I am myself a software engineer. AI is not some magical Matrix-type movie software that learns and adapts to its environment to become something entirely new. AI does what engineers tell it to do. It is given parameters and logic, and it evaluates inputs and makes decisions based on what it was designed to do. There is an Atlantic article that talks about exactly all of the different ways Facebook could alter its algorithm to reduce hate speech and misinformation. If Facebook actually did that, its engagement would plummet, users would leave the platform because it’s not cool anymore, and profits would nosedive.

    1. “The Algorithm is in fact doing what he wants it to be doing, driving engagement. Facebook long ago learned that the best content for driving up user engagement is the kind that evokes negative emotional responses”.

      We all repeat this as if it’s gospel truth. Now, I am indeed willing to believe that, in terms of sheer engagement, content that evokes negative emotional responses is best. But what about click-through rates and purchase decisions?

      Presumably, one doesn’t go from “darn, I can’t believe what those evil Satanic pedophile Democrats are up to” to “hey, cool sunglasses, let me buy those” in a heartbeat…

      1. Yeah I don’t really see the connection between keeping people on your platform and returning click through rates and completed purchases either. But obviously it must be working otherwise they wouldn’t be wholly dedicated to this strategy. My guess is the people addicted to doom scrolling all day spend so much of their life on the platform that they ultimately also end up buying those tactical sunglasses and doomsday MRATS too.

  12. I largely agree with this analysis. The criticisms of the SM companies/platforms that read like dystopian sci-fi or watery neo-Marxism have, I’ve thought, missed the mark. This, I’m persuaded, lies closer to the reality we face.
