Any website you access today will ask you to choose between different cookie settings before you enter. If you wish to share the least amount of data possible, the website offers only its basic functionality; if you choose to share a little more data, you help improve the website; and if you agree to share all data, you get a fully personalised experience. This tailoring to your “needs” might be useful when you are shopping for, say, books of a particular period or style, but it also severely limits variety: most of what you are exposed to looks the same. Ultimately, the algorithms responsible for it are designed first and foremost to make a profit. Something similar happens when you install a new application on your phone or set up a new device: you are asked whether you agree to share analytics and usage data with the company that developed it. To use the application, device, or website, you need to make a choice, but do you know what you are choosing?
Privacy experts, software engineers, and large tech companies say that the data they collect is for diagnostics, i.e., to help developers identify problems, debug, and improve their software or services. 1 Sounds fair enough, right? Well, according to Adam Tanner, this seemingly innocent request can open the door to transmitting a significant amount of anonymised [personal] information with potential privacy risks. 2
From an engineer’s perspective, it is easy to understand the need for information. Bug reports – the data collected when an application or piece of software crashes – are significantly more useful the more detailed they are. But this can mean that the report is practically a full memory dump, including sensitive user data. 3 Though most of this kind of data is anonymised, some experts do worry about the sharing of usage and crash data. Daniel Kahn Gillmor, a privacy advocate at the ACLU, worries for instance that crash reports may include e-mail contents. If your e-mail application crashes while you are using it, the bug report may include the contents of the e-mail you were working on, which may contain sensitive or even confidential information.
Or take voice recognition technology. Data sharing for bug fixing can here mean that your voice is recorded and actual humans listen to the conversations you are having. Back in 2019, when this came to light with Amazon’s Alexa, it caused quite a controversy. 4 Amazon responded by saying that the recordings “don’t include information that could identify users and customers can opt-out of a human review of their voice recordings”. Yet, if the person recorded states their name, it seems quite straightforward to identify them.
Most tech companies echo this approach of letting users opt out and anonymising personal information. In the next opinion piece, we explore how good a solution this is to data-sharing and privacy concerns.