Eureka! with Aurélie Pols (data scientist and privacy engineer)

She calls herself a recovering dataholic, and she spends most of her time fighting for privacy rights in data and tech. Aurélie Pols is a respected voice in the privacy debate, and renowned for her strong stance on the issue. We talked to her about the state of privacy and the future of data.

While Aurélie’s roots lie in academia, her history as a co-founder of OX2 – one of the first European data analytics companies – also made her a data pioneer in the corporate world. When OX2 was sold to Digitas LBI, she went on to establish digital consultancy firm Mind Your Privacy in Spain.


These days, Aurélie works independently, advising companies and policy-makers alike on Privacy-by-Design, risk mitigation measures for the GDPR, best practices and global alignment. She also teaches privacy and ethics at Maastricht University in the Netherlands and is part of the EU’s Platform Observatory.

Datafication is driven by market demands, but there’s no real reflection on how to make tech less risky for society.

Aurélie, you seem to walk a fine line between advising companies on how to leverage data and raising awareness about privacy issues. How do you sleep at night?

“I find myself in this awkward but interesting balance. I still work in digital and data for, among others, the customer data platform (CDP) mParticle. Throughout my career, I have been increasingly confronted with the dangers of data abuse, which led me to focus more on data protection and privacy risk, approached from a fundamental rights perspective. Today, I advise start-ups and policy-makers on the finer points of the GDPR and the alignment of global privacy regulations.”


“At the same time, I have to admit I still love data. I’m a recovering dataholic. Perhaps it is precisely this dual stance that allows me to build bridges between these two worlds. I’d like to think I work for humanity; I guess that’s how I sleep at night.”

"When I arrived in the US, I was flagged by the PRISM surveillance program because my email address contained the word 'iran'. I suddenly felt like my fundamental rights had been obliterated."

Is there anything specific that worries you about how we’re handling data today?

“Oh, lots of things scare me. I’m mostly worried about the future of my kids. Specifically, facial recognition and A.I. are high on the list of things to keep an eye on: think of tools used by educational institutions to monitor kids from an early age. Have you seen the movie Brazil? It often comes to mind these days.”

Are you implying dystopia has arrived?

“Well, I guess we wouldn’t know, since the descent into dystopia would be gradual. We would never be aware of it. I haven’t watched Black Mirror; it would give me nightmares. I’m a techno-optimist!”

So you can’t sleep after all. When did you become aware of the dangers of data?

“I’m Jewish, so the Holocaust has had a huge impact on my family. In the Netherlands, before WW2, the authorities kept a register of people’s religions. The idea was noble: they wanted to make sure everyone had a proper burial. But when the Nazis invaded, they used the register to go after the Jews. The extermination of Jews in the Netherlands was extremely efficient because of the existence of those data lists.”

The Netherlands kept lists of people’s religions to ensure proper burials. When the Nazis invaded, the extermination of Jews in the Netherlands was extremely efficient because those data lists existed.

The same can be said of technology these days. The fact that it is regulated or well-intended now doesn’t mean it won’t ever be misused in the future.

“Precisely. That’s why ‘purpose’ is an important principle in legislation. Institutions can’t simply repurpose certain data, say surveillance camera footage, to start tracking people. They wouldn’t be compliant. But especially in digital, the notion of purpose is defined in a shoddy way. Take digital advertising, for instance. We are asked for consent to a whole list of vaguely described purposes. This data could easily be misused in the future because the purpose is not clearly defined.”

In digital, the notion of purpose is defined shoddily. This could lead to misuse of data.

Can you give a concrete example of the dangers of data from your own experience?

“Back when I was working at OX2, I went to the US. When I arrived, I was taken aside at the airport for extensive questioning about why I was there. Apparently, I had been flagged because my email address at the time contained the word ‘madiran’, referring to a French wine. The word ‘Iran’ in my address had triggered some protocol of the PRISM surveillance program. It was the first time I actually felt like my fundamental human rights had been obliterated. I’ve never returned to the US since; I don’t feel comfortable there.”

Is there a cultural difference between Europe and the US in that regard?

“Absolutely. In terms of data points, there are some interesting use cases in the US. You can actually buy voter preferences. In Europe, political preference cannot be shared unless voters give their consent. Another example: the DMV, a public institution, buys and sells data on registered cars and their license plates. That’s unheard of in Europe. We don’t consider data a commodity.


“GDPR defines three types of businesses that handle data: you’re either a data controller, data processor or joint controller. These roles determine your responsibility. In California, you’re defined as either a business or a service provider. And lo and behold, there’s a third category: data broker. In the US, it’s not about responsibility, it’s about money and profit. There’s not much you can do about that. As long as the incentive is there, data will be sold. There’s always some senator who needs to get elected and he or she can buy data to make that happen.”

In the US, data is not about responsibility, but about money. As long as the incentive is there, data will always be bought and sold.

So, thank the stars for GDPR?

“Well actually, the GDPR principles are based on the Fair Information Practice Principles (1974) established in the US. We didn’t invent them, but we collaborated to make them better. I compare the GDPR to the Sagrada Familia: it’s never finished. It’s an evolution and there’s a lot more coming. The GDPR is really a blueprint for a global conversation about the obligations of all actors involved. Every day, more and more parties join the conversation: Brazil, South Korea… I increasingly think the US is an outlier with respect to these fundamental rights.”

The GDPR is never finished. It’s more of a blueprint to have a global conversation about privacy.

Have you been watching the European Championship? I’ve never seen so many ads for TikTok and other Chinese companies. Talk about a country with a different idea of what privacy means.

“The challenges for a country like China are vastly different from those in Europe or the US. I’m not one of those people who like to point the finger and say ‘China’s the bad guy’. It’s not up to us to decide what’s good or bad when looking for a global solution. They undoubtedly have some leadership issues, but when it comes to privacy policies, we can learn things from each other.”

Such as?

“Well, the whole idea of consent the US is so fond of was questioned early on in China. First of all, there isn’t always a screen on which to give your consent, for instance with facial recognition tools. Secondly, the idea of consent puts the burden on the user instead of placing responsibility with whoever is collecting or processing the data.


“The idea of consent is in essence very American. China brings a less individualistic approach to privacy to the conversation. Their ideas are based on societal needs. This is what’s missing in the GDPR and Western privacy policies. Take popular DNA tests such as 23andMe, for instance. DNA is about your family, not just you as an individual. So how can you even talk about personal consent? Group privacy needs to be addressed, and I expect Chinese and other Asian scholars to bring something to the table in that respect.”

"The whole notion of consent puts the burden on the user, relieving corporations of responsibility."

How do you see the use of data evolving?

“We will keep sharing more and more data. With evolutions like smart cars and smart cities, it will be increasingly complicated to identify who’s responsible for which data flow. There’s still a gap between the principles of privacy law and what engineers are doing.”


“Datafication is driven by market demands, but there’s no real reflection on how to make tech less risky for society. In Europe, we wouldn’t develop something if it were unlawful. In the US, they would claim it’s not their responsibility if the market asks for it. I work in data, and I don’t even understand how some of these data streams flow. I bought a new car, and my son can seamlessly connect his iPhone to the car. I don’t understand how the integration is engineered. That’s worrying.”

With evolutions like smart cars and smart cities, it will be increasingly complicated to identify who’s responsible for which data flow.

Your stance on the ‘legislation versus corporations’ dichotomy seems to be quite clear.

“All corporations are sociopaths. Not because the people working there are, but because they have an obligation to make profit. As a result, profit comes before ethics. Whenever I talk to a growth hacker, they always aim to maximize downloads or subscribers or whatever. But their KPIs are completely opposed to privacy obligations. Of course, the compliance of these companies is going to be low. The trick is to find the right balance with respect to fundamental rights, by building mitigation measures into their daily practices. Most of them get it, as long as you are clear.”

What’s the solution?

“There’s a kind of pyramid of objectives for people in a corporation, and they are all based on maximizing profit. We always move towards the incentives; it’s human nature. We need to implement social and environmental KPIs. This is a bigger discussion than privacy or data. It’s about how we can change capitalism.”


“But I don’t necessarily agree that privacy and personalization are opposing forces. I think it’s a delicate balance that we need to maintain. You can totally have personalization, but the ethical factor is in the responsibility. People will gladly share data, as long as corporations take responsibility for that data.”

Two movie tips from Aurélie Pols:

"I can recommend anyone to watch two movies that have really opened my eyes and will form a good basis to truly understand privacy issues in our society."

All corporations are sociopaths. Not because the people working there are, but because they have an obligation to make profit.

“The biggest problem I see is the gap between private companies and policy-makers. When Apple introduced their transparency framework, they came up with their own idea of what privacy is, and it didn’t align with legislation. There’s about 20% missing from their vision, including purpose classification. We need to aim for a world where the private and public domains work together on data and privacy, much like we have shared rules for financial accounting. My goal is to help bridge that gap.”