Ding-Dong

Why we need to keep talking about Ring and "plug-in surveillance."

Hi! This is a Sonder Scheme newsletter, written by me, Helen Edwards. Artificiality is about artificial intelligence in the wild; how AI is being used to make and break our world. If you haven’t signed up yet, you can do that here. If you like this newsletter, please share, especially on LinkedIn and Twitter.


The Electronic Frontier Foundation recently published the results of a study into Ring’s surveillance of its customers. If you are a Ring customer and use an Android phone, your Ring app is packed with third-party trackers that send out a whole range of personally identifiable information to a whole bunch of analytics and data-brokering companies. The data may include your name, mobile carrier, private IP address and sensor data (such as the settings on your phone’s accelerometer).

For example, Facebook receives alerts when the app is opened and includes time zone, device model, language preferences, screen resolution and a unique identifier that persists even if the user resets certain settings. Facebook receives this even if you do not have a Facebook account.

Branch receives a number of unique identifiers, such as fingerprint identification information and hardware identification data, as well as your device’s local IP address, model and screen resolution. Branch describes itself as a “deep linking” platform. Deep linking, in the context of a mobile app, uses an identifier that links to a specific location within an app rather than simply launching it. Deferred deep linking allows users to deep link to content even if the app is not already installed. For advertisers and data brokers it’s important because it acts like a backdoor to specific content (say, a particular product) when someone doesn’t have the relevant app already installed. Ring sells device information so that Branch can perform this function behind the scenes.
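To make the mechanics concrete, here is a minimal sketch of how deferred deep linking works. All names here (`record_click`, `resolve_on_first_open`, the link store) are hypothetical illustrations of the general flow, not Branch’s actual API: a click on the web is recorded against a device fingerprint, and when the app is later installed and first opened, the same fingerprint is matched to route the user straight to the stored content.

```python
# Hypothetical sketch of deferred deep linking (not Branch's real API).
# An ad click records (device fingerprint -> deep link); the app's first
# open after install matches the fingerprint and jumps to that content.

LINK_STORE: dict[str, str] = {}  # fingerprint -> deep link, saved at click time

def record_click(fingerprint: str, deep_link: str) -> None:
    """Called when a user without the app clicks an ad or shared link."""
    LINK_STORE[fingerprint] = deep_link

def resolve_on_first_open(fingerprint: str) -> str:
    """Called on the app's first launch after install: if the device's
    fingerprint matches a stored click, route straight to that content;
    otherwise fall back to the home screen."""
    return LINK_STORE.get(fingerprint, "app://home")

# A user clicks an ad for a product, installs the app, then opens it:
record_click("device-abc", "app://product/1234")
print(resolve_on_first_open("device-abc"))    # the stored product page
print(resolve_on_first_open("other-device"))  # no click recorded: home
```

The reason device data matters to this scheme is visible in the sketch: the fingerprint is the only thing connecting the anonymous web click to the freshly installed app, which is why identifiers that persist across contexts are so valuable to these platforms.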

AppsFlyer also receives data which it uses as part of its offer to marketers. The company specializes in marketing attribution. AppsFlyer can come preinstalled on a low-end Android device - something called “bloatware” - where it is used to offset the cost of the phone by selling consumer data. This practice disproportionately affects low-income consumers because they tend to buy the cheapest phones.

The most data goes to Mixpanel, a business analytics service company. It tracks user interactions with web and mobile applications and provides tools for targeted communication with them.

So what’s the “so what?” We know this kind of tracking happens and we’ve certainly come to expect it with Android phones. What’s new here is that Ring is surveilling the surveillers. In the most extreme case, Ring shares your name, email address, your device and carrier, unique identifiers that allow these companies to track you across apps, real-time interaction data with the app, and information about your home network. This doesn’t seem to match the level of trust that Ring customers would expect. It feels like a fundamental fracture of the mental model a customer should have about Ring.

Perhaps a bigger concern is the growth and extent of “plug-in surveillance.” City-wide plug-in surveillance is experiencing huge growth. Think of it like a public/private mash-up of video surveillance, advanced video analytics and automation. The US has tens of millions of connected cameras and is projected to rival China’s per-person camera penetration rate within a few years.

By pooling city-owned cameras with privately owned cameras, policing experts say an agency in a typical large city may amass hundreds of thousands of video feeds in just a few years. - Michael Kwet

Of course, this scale raises the question: what do you do with all this footage once you have it? The answer is AI - sophisticated video analytics that can overlay footage of events happening at different times as if they were happening simultaneously. Once this summarization is done, more AI can be applied, in particular behavioral recognition techniques such as fight detection, emotion recognition, fall detection, and detection of loitering, dog walking, jaywalking, toll fare evasion, and even lying. These systems can track individuals across a network of connected systems and single people out in highly automated ways.
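To see how mechanical some of these behavioral techniques are, here is a toy illustration of one of the detections named above: loitering, reduced to a dwell-time threshold on a tracked position. The track format, radius and time threshold are my assumptions for illustration, not any vendor’s actual algorithm, but the shape of the logic is representative.

```python
# Toy loitering detector (illustrative assumptions, not a real product's
# algorithm): flag a tracked subject who stays within a small radius of
# their starting point for longer than a time threshold.
from math import hypot

def is_loitering(track, radius_m=5.0, min_seconds=120):
    """track: list of (timestamp_seconds, x_meters, y_meters) samples
    for one tracked subject, in time order."""
    if not track:
        return False
    t0, x0, y0 = track[0]
    for t, x, y in track:
        if hypot(x - x0, y - y0) > radius_m:
            return False  # subject moved away from the start point
    return track[-1][0] - t0 >= min_seconds  # stayed put long enough?

# Someone lingering near one spot for three minutes vs. a passerby:
stationary = [(i * 10, 1.0, 1.0) for i in range(19)]   # 0..180 seconds
passerby = [(0, 0.0, 0.0), (5, 10.0, 0.0), (10, 20.0, 0.0)]
print(is_loitering(stationary))  # True
print(is_loitering(passerby))    # False
```

The unsettling part is not the snippet itself but the pipeline around it: once person tracks are extracted from pooled camera feeds, a dozen such heuristics can run over every person, everywhere, all the time.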

People who say they aren’t worried about AI surveillance because they aren’t doing anything wrong often fail to understand what “doing something wrong” might mean in the modern world of plug-in surveillance. It’s not only that bias is a known problem, it’s not only that the science of behavioral analytics cannot always be justified, it’s not only that a lack of accountability for decisions and actions in AI systems is a real cause of harm, it’s also that these private surveillance systems have a strong incentive to share data with third-party data networks in an opaque and privacy-invasive way. I’m speculating here but, in theory, surveillance can be extended right to the edge of the network - someone’s phone, where AI can find patterns outside of human perception and consciousness.

It’s beyond most people’s capability to understand how all these systems fit together. The opaque, obscure, non-intuitive and inscrutable nature of systems used in public spaces could get worse. The design of third-party data networks and the capability to plug systems together mean that “people are viscerally kept from their data” (h/t John Havens).

It starts to feel like our societies are biased against humans.


Also this week:

  • The everyday existential risks of AI - a Sonder Scheme article.

  • ICYMI, anti-virus software that collects and sells your every click: “Every search. Every click. Every buy. On every site.” After this Vice story, the company announced it will be winding the service down. Privacy is alive, journalism functions, public accountability works. The story is worthy of your time.

  • Sundance movie Coded Bias. “Corporations & governments are deploying #AI in many ways, but they have failed to ensure technologies work for All. AJL United’s work to expose the resulting harms is featured in Shalini Kantayya’s film Coded Bias at Sundance.”

  • Facebook has settled a privacy lawsuit over facial recognition. Thanks, Illinois, which, along with California, is among the most progressive states in regulating the unique risks and harms of AI.

  • Fairness, Accountability and Transparency (FAT 2020) conference proceedings. The exponential increase in interest in this conference demonstrates how much people are invested in something that used to be a fringe topic.

  • Back in the day, when I was working on nodal pricing in electricity markets, we used to joke about using similar technology for peak pricing in city parking. While cities have experimented for a while, the AI version is now here. The Wall Street Journal (paywall) reports on a company, SpotHero, that will adjust parking prices based on predicted demand. Real-time, demand-driven pricing for other consumer products could be on its way.
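For readers curious what demand-based pricing looks like mechanically, here is a minimal sketch. The formula, target occupancy and clamping bounds are my assumptions for illustration - this is the general shape of such schemes, not SpotHero’s actual algorithm.

```python
# Illustrative demand-based pricing (assumed formula, not SpotHero's):
# scale a base price by how far predicted occupancy sits from a target
# utilization, clamped so prices never swing too wildly.

def demand_price(base_price, predicted_occupancy,
                 target=0.85, sensitivity=2.0,
                 floor=0.5, ceiling=3.0):
    """Return a price adjusted for predicted demand.
    predicted_occupancy: forecast share of spaces filled (0.0-1.0).
    floor/ceiling: min/max multiples of the base price allowed."""
    multiplier = 1.0 + sensitivity * (predicted_occupancy - target)
    multiplier = max(floor, min(ceiling, multiplier))
    return round(base_price * multiplier, 2)

# A quiet morning vs. a game-night surge, on a $4.00 base rate:
print(demand_price(4.00, 0.35))  # low demand: clamped to the floor, $2.00
print(demand_price(4.00, 0.95))  # high demand: price rises above base
```

The electricity-market analogy holds: like nodal prices, the output is a forecast-driven signal meant to shift demand away from the peak, with the clamps playing the role of price caps.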