- The Tango
- Clegg's Algorithmic Fig Leaf
  - Two reasons for the lie
- Clegg's proposed "control" isn't
- Miscellaneous issues
In his March 31, 2021 Medium article "You and the Algorithm: It Takes Two to Tango," Nick Clegg (Facebook's VP of Global Affairs and former Deputy Prime Minister of the United Kingdom) responds to criticism of how Facebook (and other tech giants) use algorithms.
He offers a summation of that criticism:
It is alleged that social media fuels polarization, exploits human weaknesses and insecurities, and creates echo chambers where everyone gets their own slice of reality, eroding the public sphere and the understanding of common facts. And, worse still, this is all done intentionally in a relentless pursuit of profit.
And then presents the following solution as his thesis (emphasis mine):
...The internet needs new rules — designed and agreed by democratically elected institutions — and technology companies need to make sure their products and practices are designed in a responsible way that takes into account their potential impact on society. That starts — but by no means ends — with putting people, not machines, more firmly in charge.
There are a number of leaps, flaws, and misleading claims in Clegg's article, and I'll address more of them than are probably helpful simply because I can't help myself. However, there are three critical issues with Clegg's analysis that need to be considered:
Issue 1. Clegg centers his article around an inaccurate metaphor, which he uses to color all the topics he addresses.
This dancing-and-partnership metaphor establishes the false 'us people against the machines' framing that Clegg relies on throughout.
Issue 2. Clegg presents an image of algorithms as individual actors, and uncouples algorithms from the people and organizations that design, deploy, and optimize them for specific business goals.
This attempt to use algorithms as a fig leaf to guard Facebook's soft bits from the inquisitive gaze of lawmakers and society is expanded by Clegg's surprising use of language that presents an image of Facebook skillfully wrangling and taming alien algorithms for the benefit of society.
Issue 3. Clegg's discussion of "control" is flatly a lie, and boils down to "better ad targeting" which is of course Facebook's critical goal as an advertising company.
With no added oversight, accountability, or transparency to Facebook's algorithms or business process, Clegg's proposed "controls" provide no additional control to consumers. Instead his proposals improve Facebook's content and ad targeting.
Issues 4-n: Bonus Debunking!
As a bonus for you, because the prolonged mental contact with Clegg's essay has left my brain itching from all of the lies, I'm also throwing in a bunch (not exhaustive, though!) of ad-hoc callouts of specific lies. If you care to read them, there is a link to 'Part 2' at the end of this article. These will be brief (well, brief-ish, I promise).
Clegg anchors his essay with a metaphor: dance. This is more than just a snappy way to present an idea; metaphors change how we accept ideas.
Alastair Roberts and Andrew Wilson's book Echoes of Exodus begins with a description of the "influence of the controlling metaphor." They specifically compare military metaphors ("conquer," "enemy," "fight") and fabric metaphors ("weave," "tapestry," "tangle") to demonstrate that "metaphors have great power to fashion the way we conceptualize things, even when we don't notice they are doing it."
Clegg's metaphor is fun, cooperative, consensual, and occurs between partners. He expands this metaphor with an analogy of a couple where you pick up groceries you like and "your partner" makes a meal that they want to eat out of it. Clegg then says that "The relationship between internet users and the algorithms that present them with personalized content is surprisingly similar" (more on this later!).
Facebook already has partners. You, FB user, are not one of them. Nick Clegg is trying to sell you the idea of a dance between you and engagement algorithms. This is simply false. A more accurate analogy would have Facebook as the shoe salesman. An even more accurate analogy would have Facebook auctioning off dances with you to various bidders and bots.
It may seem trivial to call this single metaphor out as one of Clegg's big lies or (let's be kind) fallacies, especially when that metaphor merely lives in the background. However, this metaphor is foundational for Clegg's argument, providing the backdrop and drumbeat of his key claims.
If the 'tango' is the foundational lie in Clegg's article, this is the central lie. In his language choice Clegg consistently portrays algorithms as discrete entities, rather than an accretion of intentional business decisions made by individuals and enforced by software and policy.
By claiming that machines rather than people are "in charge", Clegg (and Facebook) seeks to deflect responsibility/accountability for Facebook's harmful impacts away from individuals at Facebook, while simultaneously portraying Facebook as a critical champion of society that can tame and harness Algorithms for social good and 'take charge back' from them.
With the very title of his essay ("You and the Algorithm") Clegg begins his portrayal of 'the algorithm' as an entity.
Remember the quote from earlier that "The relationship between internet users and the algorithms that present them with personalized content is surprisingly similar"? This is wrong. Here's why:
Clegg refers to a "relationship" between a user and an algorithm, and compares it to the "relationship" between two partners. First, let's look at definitions!
"Relationship" can mean different things: fundamentally it is just referencing a connection or association. For example, a desk and a chair have a relationship; it's just spatial and not personal. But in the most common sense "relationship" means an emotional or personal connection.
With his "relationship" comparison right after his dance and grocery/dinner metaphors, Clegg uses highly connotative language to personify algorithms.
"The truth is machines have not taken over, but they are here to stay. We need to make our peace with them."
"...that starts — but by no means ends — with putting people, not machines, more firmly in charge."
"You should be able to talk back to the algorithm"
Throughout his essay, Clegg monsterizes "machines" or "the algorithm" as an alien adversary that needs to be negotiated with, or an opponent that needs to be knocked back. This is wrong.
People, not machines, already are in charge. Facebook executives make business decisions and direct engineers to develop algorithms to execute those business decisions. Nick Clegg's statement that "we need to put people, not machines, in charge" is a lie intended to shield Facebook from accountability for its calculated business decisions.
First, because it is self-serving.
Algorithms are designed to achieve specific goals; they are not discovered or spontaneously generated. Facebook's algorithms are intentionally designed and painstakingly refined to support Facebook's business goals and increase revenue.
To demystify 'algorithms,' let's talk definitions again. Merriam-Webster defines an algorithm as a: "step-by-step procedure for solving a problem or accomplishing some end." In other words, an algorithm is a recipe. In computing, an algorithm is just a math-y recipe.
To continue this analogy, Facebook executives determine the end goal they are seeking to achieve (a cake) and how they want to pursue it (genoise or joconde sponge cake?). Individual employees then develop tools and policies (whisk and bake) to support that action (produce the required cake). The result is then reviewed and tested (reviewed and tasted), and modifications are made until the product meets set business goals.
Throughout this whole process, individual people make decisions and execute tasks to produce the outcome. "Machines" are not in charge, people are.
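To make the "recipe" point concrete, here is a deliberately simplified, hypothetical sketch in Python (the signals, weights, and function names are invented for illustration, not Facebook's actual system). The point is that every number below is a choice a person typed in to serve a chosen goal; nothing about it is decided by "the machine":

```python
# A toy feed-ranking "recipe." The goal (maximize engagement) and
# every weight below are human business decisions, not machine choices.
# These values are illustrative assumptions, not Facebook's real ones.

ENGAGEMENT_WEIGHTS = {
    "comments": 3.0,  # someone decided comments matter most
    "shares": 2.0,
    "likes": 1.0,
}

def score_post(post: dict) -> float:
    """Score a post as a weighted sum of its engagement signals."""
    return sum(weight * post.get(signal, 0)
               for signal, weight in ENGAGEMENT_WEIGHTS.items())

def rank_feed(posts: list[dict]) -> list[dict]:
    """Order posts by score, highest first -- the 'bake' step."""
    return sorted(posts, key=score_post, reverse=True)

posts = [
    {"id": "a", "likes": 100, "comments": 2, "shares": 1},
    {"id": "b", "likes": 10, "comments": 40, "shares": 5},
]
print([p["id"] for p in rank_feed(posts)])  # post "b" outranks "a"
```

If the "taste test" shows the output isn't meeting the business goal, an employee edits the weights and re-runs it; that tuning loop is the human decision-making the essay describes.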
When companies are found to engage in processes that cause harm, that company is (or at least should be) held accountable. In 2019 Facebook was fined $5 billion for violating a 2012 FTC order by "deceiving users about their ability to control the privacy of their personal information."
Facebook would rather avoid another similar fine, and by suggesting to consumers, regulators, and lawmakers that algorithms (and not Facebook) are making these decisions, Facebook may be able to reduce the negative business impact of future deception.
More than that, if Facebook can succeed in portraying itself (as Clegg tries to do here) as society's champion, vanquishing and harnessing the menacing power of algorithms in order to bring about social good and connect the world, it can also potentially shield itself from competition.
Second, Clegg pushes this lie because it exploits the views of the key decision-makers Clegg is writing to.
Clegg is a former British MP and Deputy PM, and is Facebook's VP of Global Affairs. That means that you are likely not his target audience. State-level decision-makers are closer to the mark. And the U.S. Congress does not understand Facebook.
2006's "the internet is a series of tubes" has been replaced as the internet's favored "Congress is out of touch" meme by 2018's "Senator, we run ads" in response to Sen. Hatch. The median age in Congress is 64.8 years old. For the 'median congressmember,' Facebook was created when they were almost 50.
As a result, Facebook's scope and sway is somewhere between 'foreign' and 'utterly foreign' to most American legislators; and the depiction of "The Algorithm" as some alien monster that needs to be tamed by a skilled champion is likely to hit pretty close to home for most.
Making things more difficult, discussions about algorithms are awash in new jargon common only to a few insular groups.
Additionally, people sometimes prefer algorithmic judgements out of a belief that they are more fair. By casting decisions as ultimately made by "algorithms" instead of Facebook executives, Clegg may be attempting to tap into that trust.
Before we close out this section, let's revisit Clegg's proposed solution, as well as that 2019 FTC fine for Facebook's violation of the FTC's 2012 order.
Clegg says (all emphasis mine):
You should be able to better understand how the ranking algorithms work and why they make particular decisions, and you should have more control over the content that is shown to you.
[Facebook's] recent actions are part of a significant shift in the company’s thinking about how it gives people greater understanding of, and control over, how its algorithms rank content...
and finally concludes his essay with:
And tech companies need to know the parameters within which society is comfortable for them to operate, so that they have permission to continue to innovate. That starts with openness and transparency, and with giving you more control.
Let's now contrast that with what the FTC chairman said in 2019 (emphasis mine):
Despite repeated promises to its billions of users worldwide that they could control how their personal information is shared, Facebook undermined consumers’ choices
Facebook has persistently lied to users, even reusing the same empty phrase of 'you can control ____.' This--falsely assuring users that "control" is right around the corner while intentionally breaking those assurances to maximize profits--is the naked kernel of Facebook's business algorithm; it is a "step-by-step procedure for solving a problem or accomplishing some end;" a measured, refined, and repeated approach designed by Facebook leadership to maximize profits.
Clegg tells you he is talking about giving users more control, but he actually means helping Facebook sell a better product to advertisers.
While he repeats the soothing refrain of "you'll get more control," there is no discussion of consumers getting a better understanding of how Facebook's algorithms work, of permitting any third-party audit of Facebook's algorithms or bias, or of any other accountability- or transparency-supporting measures. When Clegg chirps out "control," he means that Facebook would like to understand users' interests better. Facebook's business model is, of course, selling services to advertisers based on Facebook's reputed ability to target those ads to interested consumers, so it should come as no surprise that Clegg's reassurances only support Facebook's finances.
While Clegg notes that Facebook has added in some one-off transparency widgets such as the "Why Am I Seeing This?" tool, those tools do not provide true control. Such tools don't permit transparency into how Facebook's algorithms work at scale/systemically, and merely provide individual data points of minimal value to consumers.
Without the introduction of any oversight, accountability, or transparency, the "control" Clegg is referencing is just better ad targeting.
As a side note, Clegg's other arguments actually contradict his putative claim that providing 'control' will solve the issue of "problematic content," since he places the blame on humans who want this content anyway:
"Perhaps it is time to acknowledge it is not simply the fault of faceless machines? Consider, for example, the presence of bad and polarizing content on private messaging apps... It’s just humans talking to humans without any machine getting in the way. In many respects, it would be easier to blame everything on algorithms, but there are deeper and more complex societal forces at play. We need to look at ourselves in the mirror, and not wrap ourselves in the false comfort that we have simply been manipulated by machines all along."
If you are really entranced, I've included a more rapid-fire listing of issues with Clegg's article here.