WYSK: 04/15/22

This Week: 1. Decisive Moment; 2. Un-mutable Mics; 3. Data Brokers; 4. ShotSpotter Alterations

What you should know from the week of 04/15/22:

  • Decisive Moment: America must respond now to Russia's invasion of Ukraine;
  • Un-mutable Mics: On a teleconference and think your mic is muted? Think again! Popular platforms keep capturing audio even when you're muted;
  • Data Brokers: Data brokers collect unbelievable amounts of data on Americans, but what happens when congressmembers get caught up too?
  • ShotSpotter Alterations: For-profit activities and policing mix dangerously.

Decisive Moment:

This Is the War’s Decisive Moment
The United States and its allies can tip the balance between a costly success and a calamity.

This week Eliot Cohen wrote the best article I have yet read on America's responsibilities in Ukraine. His article is fairly short, highly readable, and exceptionally clear. If you read only one thing this week, this should be it!

His points are so clear and simple that they are nearly cutting. While that is mostly incisiveness, I think it also reflects the deep disappointment Cohen and others feel toward our government.

He starts off by clearly noting the significance of the war:

For those of us born after World War II, this is the most consequential war of our lifetime. Upon its outcome rests the future of European stability and prosperity... Russia’s sheer brutality and utterly unwarranted aggression, compounded by lies at once sinister and ludicrous, have endangered what remains of the global order and the norms of interstate conduct.

And he addresses the growing apathy toward the intensity and timeliness of the conflict:

Those who talk of a stalemate on the battlefield, perhaps lasting years, are likely making as big of an error as when they dismissed the possibility of effective Ukrainian resistance two months ago. Decisive action is urgently required to tip the balance between a costly success and a calamity.

Cohen notes that the conflict is moving into its fourth phase, which will likely become even uglier than the war has been so far:

The Russian military—revealed as inept at tactics, unimaginative in operational design, obtuse in strategy, and incompetent at basic logistics and maintenance—can do only two things well: vomit out massive amounts of firepower and brutalize civilians.

Finally, in four brutal paragraphs he sharply notes that America has taken a leisurely approach to this conflict:

In Washington, the metronome of war ticks too slowly...It does not seem to have senior leaders inclined to bulldoze bureaucratic obstacles and cut red tape. It feels like business as usual in the Pentagon.

...If British Prime Minister Boris Johnson can visit Kyiv...so can Secretary of State Antony Blinken or Vice President Kamala Harris. If other countries can reopen embassies in Ukraine, so can the United States, which never should have closed its own...

Cohen concludes by noting that the exact course of the war is uncertain, but that its significance is not:

All that is clear right now is that a failure to adequately support Ukraine will have terrible consequences, and not just for that heroic and suffering nation.

As an average American, it may not feel like there is much you can do, but at a minimum you can look up your Representative and your Senators and call them.


Un-mutable Mics:

You’re muted — or are you? Videoconferencing apps may listen even when mic is off
Not only did researchers find that the apps gather audio data while ‘mute’ is activated; they could identify activities picked up when microphones weren’t believed to be on, such as eating, playing music, typing and cleaning.

University of Wisconsin-Madison researchers announced this week that most videoconferencing applications continue to collect some amount of audio even when participants are listed as muted:

They tried out many different videoconferencing applications on major operating systems, including iOS, Android, Windows and Mac, checking to see if the apps still accessed the microphone when it was muted.
...
They found that all of the apps they tested occasionally gather raw audio data while mute is activated, with one popular app gathering information and delivering data to its server at the same rate regardless of whether the microphone is muted or not.

The professor leading the research noted:

“It turns out, in the vast majority of cases, when you mute yourself, these apps do not give up access to the microphone,” says Fawaz. “And that’s a problem. When you’re muted, people don’t expect these apps to collect data.”
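To make the finding concrete, here is a minimal, purely illustrative sketch (not any vendor's actual code) of what a "software mute" can look like: the app keeps the microphone stream open and keeps receiving audio frames, and the mute button only controls what is done with them. It assumes the third-party sounddevice package and a hypothetical send_to_server function.

```python
# Illustrative sketch of a "software mute": the microphone stream stays
# open and the process keeps receiving raw audio; the mute flag only
# decides whether the frames go anywhere.
import sounddevice as sd

muted = True  # what the user toggles in the UI


def send_to_server(chunk):
    ...  # placeholder: a real app would ship this over the network


def on_audio(indata, frames, time, status):
    # The OS delivers raw audio here regardless of the mute flag,
    # because the stream was never closed or released.
    if not muted:
        send_to_server(indata.copy())
    # else: frames are silently discarded, but they were still captured


with sd.InputStream(channels=1, callback=on_audio):
    sd.sleep(10_000)  # keep capturing for 10 seconds
```

A hardware mute switch or an OS-level microphone toggle removes that access entirely, which is what most users assume the in-app button already does.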

One of the most common reactions to stories like this is something like: "who really cares if Zoom knows I'm cooking during a conference call?"

That reaction is itself rather sad, since it demonstrates how accustomed we have become to a loss of personal agency. There are all sorts of books and research that answer the question (see Weapons of Math Destruction or Future Crimes), and those who ask it in good faith will find ample data (if you want anything specific, email me and I'd be happy to discuss further).

The question also completely glosses over the fact that people have an innate right to privacy. While that right is not broadly protected by law in the US, people still reasonably expect it to be respected.

The issue isn't whether a company can or will do something harmful with the data it collects; the issue is that so many companies are so comfortable wildly ignoring the expressed interests and expectations of their customers.


Data Brokers:

Speaking of inappropriate and inescapable data collection... John Oliver addressed Data Brokers on Last Week Tonight. You can also read a transcript here (the data broker section starts about halfway down).

If you're not familiar with John Oliver, he does some truly excellent overviews of major issues. They are often deeply researched and provide action items for viewers, but he unfortunately peppers them with raunchy and/or irreverent comedy bits that can make his overviews unpalatable.

This week's segment was on data brokers, and the technical content was excellent.

...We’ve all found ourselves being targeted by ads for something oddly specific, and thought, “how on earth did they know to show me that?” And tonight, we’re going to talk about who makes that possible: data brokers.
It’s a multibillion-dollar industry, encompassing everyone from credit-reporting companies, to these weird people-finding websites that pop up whenever you google the name of your friend’s sketchy new boyfriend, to these names you may never have heard of. But what all these companies have in common is, they collect your personal information and then resell or share it with others...

Oliver gives a functional overview of the data broker ecosystem, in which data collected by websites and apps trickles into the databases of companies like Acxiom and Epsilon. He then dives into some of the harms in practice:

Remember Epsilon, the company that collects clouds of information about you? In 2014, their then CEO even went on “60 Minutes,” to reassure people that his business, in particular, operated in a completely above-board manner.

[Epsilon CEO Bryan Kennedy:] "If there are abuses out there, we don’t believe those happen within our company. And we would be the first to raise our hand."

John: Oh, really? You’d be the first to raise your hand, would you? That’s interesting. Especially because last year, Epsilon settled with the DOJ for $150 million for facilitating elder fraud schemes, after admitting that it sold more than 30 million consumers’ data to clients who employees knew were carrying out scams. And they were doing it for nearly a decade. So I guess that guy really should’ve been doing that entire interview with his [!@#$%] hand in the air.

Oliver addresses blasé viewers:

And you might still think: “I don’t care about this. My life’s an open book, I have no secrets, so data brokers can just have at it.” Even if that’s true for you, consider there are others out there who might have very good reasons to not want to be found...
[Some people have reason to fear being pursued by stalkers]. It’s happened. In New Hampshire, a stalker killed a former classmate after finding her with information he’d bought from a data broker for $45.

He also notes that the government can purchase data from data brokers to skirt Fourth Amendment protections and warrant requirements.

But finally...

The juicy bit of the story!

One of the most fascinating bits of the story occurs about 21 minutes into the video:

It’s very frustrating that the people who could do something about data brokers are so actively incentivized not to. But here is where we may be able to help. Because interestingly, the one time that Congress has acted quickly to safeguard people’s privacy was in the 1980s, when Robert Bork was nominated to the supreme court, and a reporter walked into a local video store and asked the manager whether he could have a peek at Bork’s video rental history, and he got it.
As soon as Congress realized there was nothing stopping anyone from retrieving their video rental records too, they [freaked out], and lo and behold, the video privacy protection act was passed with quite deliberate speed. So it seems when Congress’ own privacy is at risk, they somehow find a way to act. And it also seems like they’re not entirely aware just how easy it is for anyone — and I do mean anyone — to get their personal information, which brings me to me.

Oliver and the show set up a:

demographic group consisting of men age 45 and up, in a five-mile radius of the U.S. Capitol, who had previously visited sites regarding, or searched for terms including, divorce, massage, hair loss and midlife crisis.

They then targeted that group with ads titled "marriage shouldn't be a prison," "can YOU vote twice?" and "do you want to read Ted Cruz erotic fan fiction?".

They ran the ads for a week and analyzed the results:

Let me start with the very first hit we got. It came at 3:35 pm on Tuesday afternoon from around the embassy row area, when a man fitting our description clicked on the Ted Cruz ad. Meaning we now have his IP address and device id, and also know that he did it on an Android phone. So we could now take steps to identify him. Just like we could with all these others who clicked on one of our ads in the Capitol Hill area this week, including at least three who may’ve been inside the Capitol building itself. One of whom clicked on the “can you vote twice?” ad, one of whom clicked on the divorce one, and another who clicked on the Ted Cruz erotic fiction, which was distressingly popular. And if you’re thinking, “how on earth is any of this legal?” I totally agree with you. It shouldn’t be.
And if you happen to be a legislator who’s feeling a little nervous right now about whether your information is in this envelope and you’re terrified about what I might do with it, you might want to channel that worry into making sure that I cannot actually do anything. Anyway, sleep well.
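For anyone wondering how a single click hands over that much, here is a minimal, hypothetical sketch (not the show's actual setup) of a click-tracking endpoint on the ad buyer's side: the HTTP request itself carries an IP address and a user agent, and the ad network can append a campaign and device identifier to the URL. The Flask framework, route, and parameter names here are all assumptions for illustration.

```python
# Hypothetical click-tracking endpoint: logs what an ad buyer can see
# the moment someone taps one of their ads.
from flask import Flask, redirect, request

app = Flask(__name__)


@app.route("/click")
def click():
    record = {
        "campaign": request.args.get("c"),            # e.g. "ted-cruz-fanfic"
        "ad_device_id": request.args.get("adid"),      # passed through by the ad network
        "ip": request.remote_addr,                     # coarse location via IP
        "user_agent": request.headers.get("User-Agent"),  # reveals e.g. Android
    }
    print(record)  # a real campaign would store this for later matching
    return redirect("https://example.com/landing-page")


if __name__ == "__main__":
    app.run(port=8000)
```

The campaign name alone ("divorce," "can you vote twice?") already tells the buyer what the clicker searched for or browsed; the IP address and device ID are what make it possible to go further and identify the person.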

Oliver never shows the raw or de-anonymized data, which suggests just about equally well either that the show didn't identify anyone of interest, or that it did and didn't want to draw too much congressional attention.

Regardless, it is a fascinating experiment. The security and governance implications alone—consider the ease with which a foreign or domestic group could collect compromising data on legislators and use that to influence policy—are pretty wild. But as the top commenter on YouTube noted: "John Oliver essentially blackmailing Congress with legally-obtained info sourced from data brokers is a new high point in late night comedy."


ShotSpotter Alterations:

Police Are Telling ShotSpotter to Alter Evidence From Gunshot-Detecting AI
Prosecutors in Chicago are being forced to withdraw evidence generated by the technology, which led to the police killing of 13-year-old Adam Toledo earlier this year.

While we're discussing the dangers of the intersection of rampant data collection and governance, let's shift over to policing.

This article by Todd Feathers is actually from last July, but is startling enough that I wanted to include it.

The story opens with a scenario from May of 2020 where police arrested a man (Michael Williams) for murder, heavily leveraging the fact that his car was recorded as having stopped at the time and location of the murder. Williams' car was seen on surveillance footage, but the time and location of the murder were identified through a different technology:

How did they know that’s where the shooting happened? Police said ShotSpotter, a surveillance system that uses hidden microphone sensors to detect the sound and location of gunshots, generated an alert for that time and place.
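For context on where that "time and place" comes from: ShotSpotter's actual pipeline is proprietary (and, as noted later, has never been independently audited), but acoustic localization generally works by having each sensor timestamp the bang and then solving for the point that best explains the differences in arrival times. The sketch below is purely illustrative; the sensor positions, arrival times, and solver choice are all invented.

```python
# Illustrative time-difference-of-arrival (TDOA) localization: find the
# source position (and emission time) that best explains when each
# sensor heard the sound. Not ShotSpotter's actual method.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 C

# Hypothetical sensor positions (meters) and arrival times (seconds).
sensors = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 900.0], [750.0, 850.0]])
arrival_times = np.array([1.20, 0.95, 1.45, 1.30])


def residuals(params):
    # params = (x, y, t0): candidate source location and emission time.
    x, y, t0 = params
    predicted = t0 + np.linalg.norm(sensors - [x, y], axis=1) / SPEED_OF_SOUND
    return predicted - arrival_times


fit = least_squares(residuals, x0=[400.0, 400.0, 0.0])
x, y, t0 = fit.x
print(f"Estimated source: ({x:.0f} m, {y:.0f} m), fired at t = {t0:.2f} s")
```

The takeaway is that the output coordinates follow directly from the recorded arrival times, which is part of what makes a later manual "post-processing" move of roughly a mile so significant.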

However, that determination by ShotSpotter was amended:

But after the [initial] alert came in, a ShotSpotter analyst manually overrode the algorithms and “reclassified” the sound as a gunshot [rather than a firework]. Then, months later and after “post-processing,” another ShotSpotter analyst changed the alert’s coordinates to a location [a mile away] on South Stony Island Drive near where Williams’ car was seen on camera.

This questionable reclassification led the public defender to file a Frye motion, which asks a judge "to examine and rule on whether a particular forensic method is scientifically valid enough to be entered as evidence."

Rather than defend ShotSpotter’s technology and its employees' actions in a Frye hearing, the prosecutors withdrew all ShotSpotter evidence against Williams.

The rest of the article showcases situations in which key ShotSpotter evidence either appeared or disappeared at suspiciously convenient moments.

I don't believe Feathers makes a compelling case that police are directly asking ShotSpotter to fraudulently alter evidence in specific ways to support prosecutions. But that is an extremely high bar, and the harm doesn't have to reach it to be serious.

Because ShotSpotter has never been independently examined or tested, the accuracy of its results is entirely dependent on the claims of the company selling the product: an entity that clearly has a financial incentive to push sales rather than to promote the wellbeing of the public.

“Rather than defend the evidence, [prosecutors] just ran away from it,” said Jonathan Manes, an attorney at the MacArthur Justice Center. “Right now, nobody outside of ShotSpotter has ever been able to look under the hood and audit this technology. We wouldn’t let forensic crime labs use a DNA test that hadn’t been vetted and audited.”


💡
Like this content and want more? Subscribe today!

Interest piqued? Disagree? Reach out to me at TwelveTablesBlog [at] protonmail.com with your thoughts.

Photo by Tamara Gak on Unsplash