Meet the World’s Leading Human Data Ethicist

Richie Etwaru is equal parts zen monk, business artisan, technology geek, and socio-economic historian. Recently I had the chance to sit down with him for a one-to-one interview in which we discussed human rights, data privacy, business ethics, Hu-manity.co, and what it feels like for a human data ethicist to take on the healthcare industry.

CHOU: First, what is an ethicist?

RE: It’s not an unusual question — I get it all the time. If you look it up in the dictionary, an “ethicist” is one who is a specialist in ethics, or more precisely one whose judgment on ethics and ethical codes has come to be trusted by a specific community.

I am by no means a traditional ethicist, meaning I don't subscribe to the broad traditional hierarchy of concepts that filters what is right or wrong. I focus on data ethics, and specifically human data ethics. This is an emerging area where concepts such as data privacy, ownership, and exploitation are intertwined within a historic socio-economic construct.

For example, I study models for the fair trade of data that are analogous to the fair trade of coffee. I've spent a fair amount of time developing constructs that can bring communities, corporations, and countries into a single agreement around crude data, how this data can be refined for commercial use, and how the clues to the answers around data ownership might lie at the intersection of intellectual property rights and labor laws.

Many times, an ethicist is seen as someone religious, spiritual, or wonky. While I have tremendous respect for those approaches, that's not what I do. In my work, I focus my efforts and insights on identifying and healing the flaws and abuses in our social order around human data privacy, human data ownership, and human data ethics.

CHOU: What needs to change?

RE: The immediate opportunity is to address the two central issues of data privacy and data ownership. But these are foundational issues, not deep solutions. Making progress on data privacy and data ownership in the next five to ten years will be equivalent to piling up sandbags as the floodwaters rise. You may avoid drowning this time, but the water will keep coming, again and again. In the future, the serious flood will be the mass manipulation of the human species, made possible by an unprecedented concentration of intelligence and manipulative power in the hands of those who own human data.

One of my favorite people, Dr. Yuval Harari, describes this threat as "the domestication of the human species." With unregulated siphoning of our data, corporations and their algorithms will know us better than we know ourselves, and this will inevitably lead to the unethical manipulation of people on a global scale.

For example, based on your Snapchat history, an algorithm might know your sexual preference before you know it, allowing marketers to target you with content designed around where they think you are on the gender spectrum. Regardless of whether we believe this would be a good thing or a bad one, such sophisticated targeting will erode our free will and bend the natural course of human self-awareness.

There is a ton at stake here beyond, for example, the manipulation of voters in the 2016 election by Cambridge Analytica. This goes beyond issues of free will and politics: we have the biggest humanitarian crisis ahead of us, a war on our humanity and on what it means to be human.

CHOU: How long have you been practicing your trade?

RE: In my early twenties, I co-founded a company that purchased driver history data from various state departments of motor vehicles across the United States, and we sold those driver histories to transportation companies for compliance use cases. Given that I was young, I didn't have the most developed moral compass, but around the age of twenty-three I arrived at the insight that the people whose driver histories were being monetized by folks like myself were unaware of it, and to me this raised questions ranging from ethics to economic fairness. Over the last twenty years I've traveled a long path from that original insight through financial services and healthcare, immersing myself in the practices around credit reporting data, travel data, healthcare data, and a myriad of other data sets. Every year, I've watched as more and more data has been siphoned from consumers without their consent and authorization, and in many cases used only to generate accretive economic value with reckless disregard for ethical values.

One of my crucial insights came when I looked at the product design and terms of service of the Ring doorbell. The absence of a social contract, or rather the presence of a forced social contract in which anyone approaching my home is recorded ringing my doorbell, was a surprise to me. The fact that no one seemed to care was even more alarming.

CHOU: Have you seen any progress in human data ethics?

RE: The overall problem is enormous, and you can only solve it one issue at a time. As you know, I've co-founded Hu-manity.co and the #My31 Movement around the 31st Human Right, which calls on the United Nations to add a 31st article to the Universal Declaration of Human Rights declaring that everyone has the right to own their inherent human data as their personal property. Within one year, we were invited to present at the United Nations celebration of the 70th anniversary of the Universal Declaration of Human Rights; we have had audiences with the World Economic Forum and the Aspen Institute; we have seen governments begin to adopt policies around data privacy, data ownership, and data ethics; and most recently, multiple state legislatures in the United States have introduced legislation that would allow citizens to claim property rights in their personal healthcare data.