When I started my doctoral research I wasn’t particularly into apps and wearables. What interested me more were friendships and other social connections, and how they relate to a person’s autonomy. At the UvA I had taken a fascinating class on privacy, in which the influence of social media was one of the topics covered. The classes made me question how we can maintain relationships while our privacy is eroding.
All of these subjects come together in my dissertation, in which I study the impact of self-tracking technologies on our autonomy and privacy. Examples of self-tracking technology are wristbands that count your steps, or health apps on your phone that give you daily updates on the number of calories you burn.
The starting point for my research is a relational definition of autonomy, according to which social relations are fundamental to someone’s ability to lead an autonomous life. This relational conception of autonomy was first introduced by a group of feminist philosophers who wanted to get rid of the hyper-individualistic view of autonomy that was common in philosophy at the time. Today there’s nothing controversial about these relational theories anymore; contemporary differences of opinion mostly concern the extent to which social relations contribute to autonomy.

A key condition for autonomy is privacy. If you have privacy, you are more or less free to decide what information you want to share – and with whom. Imagine your mother entering your room while you are in the middle of a conversation with your best friend about her break-up. Most likely the conversation will halt. At that moment you aren’t able to shape your friendship as you please. Privacy means that you can share different information with your friends than with your mother, or that you can discuss health issues with your doctor without this information ever reaching your employer or a commercial party. You can’t choose a particular role in your interaction with others – that of friend, daughter, patient, employee, or consumer, say – if you can’t control who has access to which information. The rules, boundaries, and expectations we draw on to make these decisions are called privacy norms.
“Who gets to interfere with your decisions depends on the social context.”
What makes new technologies so problematic is that they stretch these norms further and further. Based on privacy norms that we more or less take for granted, we don’t expect a health or fitness app to share our data with a third party. Nevertheless, that is exactly what the most commonly used apps do. Take the period tracker Maya: it recently turned out that the app shares its users’ data with Facebook. For the developers of these apps, selling data is in fact their number one way to make money. In the United States, health insurers and employers are giving away free Fitbits to their clients and employees. But in doing so, these companies also gain access to personal data about the recipients’ lifestyle and health choices – do they get enough exercise? Do they go to bed on time? And do they eat healthily?
Not only do such new technologies monitor our personal information; most of them are also designed to influence our behaviour based on the collected data. Take the advertising tagline for the Apple Watch: ‘There is a better you in you’. The device promises to help you be more active, healthier, and more ‘connected’.
These control mechanisms shed light on another dimension of our privacy: decisional privacy. Who gets to influence a decision depends on the social context you find yourself in. Generally we don’t mind our partner weighing in on choices about raising the kids, but we would find it inappropriate if our employer did so. We also find it normal for our general practitioner to give us lifestyle advice – but what do we think of Apple or Google doing the same? Although consumers seem to be used to it by now, it isn’t at all obvious that companies should penetrate domains such as child rearing and lifestyle choices.
And what is worse, this behavioural steering by apps often takes place without the users’ knowledge. Sometimes users are even exploited by third parties. That can no longer be called influencing; that is plain manipulation. A good example is the company Cambridge Analytica, which used Facebook data to influence the voting behaviour of the American electorate on a massive scale.
New technologies render us vulnerable to unwanted interference and behavioural steering. While promising to ‘empower’ us, they are encroaching upon our privacy. But privacy is in fact a condition for this empowerment – in other words, for autonomy. Therefore consumers, citizens, and governments should ensure that large corporations interfere less with our personal choices. We have to take back control of our own lives!