
12.06.2019

"Tech leaders ought to study the humanities"

Maëlle Gavet, Sciences Po alumna, was awarded the 2019 alumni award from the Sciences Po American Foundation. Gavet graduated from Sciences Po in 2002, and today is the Chief Operating Officer of Compass, a real estate technology company building an end-to-end platform for agents and their clients.

In her acceptance speech, Gavet surveyed the challenges facing the tech industry, a field in which she has 15 years of professional experience. She argued that while technology has unquestionably improved almost every aspect of the way we live and work, it has also had a host of lethal side effects and unintended consequences affecting the industry's own employees, communities, other businesses and, last but not least, democracy. Gavet's suggested solution is to reimagine the way tech leaders are educated. In her opinion, while the humanities will not magically fix everything that's wrong with tech, they can certainly help introduce much-needed empathy and understanding of the world. "We need engineers who can both code and read the Economist. We need engineers obsessed with transforming society (not moving fast and breaking it)," she said. We followed up with Gavet after her speech for a quick interview about the themes she addressed and her tips for recent Sciences Po graduates.

You painted a bleak picture of where the tech industry is headed if it doesn’t hire people with a background in the social sciences and the humanities. How likely is it that tech executives will get it right and steer us away from the path we’re on now?

This is a very difficult question; whether or not we figure it out will have an impact on what kind of society we live in. If I knew the answer, I’d probably have a very different job from the one I currently have. What I can say, though, is that I think there is an increased awareness. While my speech was very direct, this is not the first time I’m talking about it, and this is not the first time I’m hearing people discuss it. This subject has definitely been gaining more and more visibility compared to previous years.
 
The second thing I’m starting to see is that, because of these conversations, more and more tech leaders are trying to take action. The problem is that they’re still a minority. It’s not a small group, but it’s still a minority. Another challenge is that there’s no playbook with clear guidelines on what to do. I’m generally pretty optimistic, and I do believe that when human beings focus on solving a problem, they generally do get there. But this is a very, very big and completely new problem, which we have never faced before.

You’ve written before about regulating AI. What would that regulation look like ideally?

I think right now we’re trying to figure out how to regulate the humans who are working on AI, the same way we tried, more or less successfully, to regulate genetic experiments. And I say more or less successfully because, in the Western world, we have, to an extent, limited what can and cannot be done and we have designed certain ethical standards around it. Based on what I have seen and read, I am not convinced the same ethical standards have been applied to genetic research in China, for example. The challenge with AI is: how do we regulate something that is going to end up being more intelligent than we are?
 
AI today is the equivalent of a one-year-old who doesn’t know how to speak and barely knows how to walk. And we have researchers admitting they are not entirely clear on how their AI baby came up with the result it did. Now imagine what happens when the baby grows up. It will be a completely different situation. That’s why we need to make sure that people working on AI fully assume the social responsibilities this discipline carries and embrace intellectual, geographic and social diversity while establishing industry standards. AI, in a very, very simplified way, is nothing more than a set of equations and hypotheses formulated by humans. The more biased the human, the more bias is written into their code. So we need to advocate for creating forcing mechanisms for a diverse and multidisciplinary approach in AI, and in tech in general.

How has your education at Sciences Po prepared you for your career?

I think that Sciences Po is great at training students to analyze problems, to find facts and historical data to support their point of view, and then to communicate it effectively. This is a universal skill that is crucially important on top of everything else I mentioned during my speech (a firm foundation in and a deeper understanding of society, historical processes, and macro- and microeconomics).

What kind of advice do you have for young graduates and current students starting their careers?

You have to remember that you work with people and for people. When you make decisions, when you work on a project, when you run a company, when you’re an entrepreneur, when you work within a company, you will be more successful if you collaborate without ego and think about the human impact, the human stakeholders affected by the things that you’re trying to do.
 
The second piece of advice: dream big. Our dreams can limit us. If you dream small, you’re going to do small. If you dream big, there’s a chance you’re going to do big. We have a tendency, especially coming from a school like Sciences Po (known for combining approaches and confronting different worldviews), to carry the weight of history on our shoulders, to think about all the reasons that something can or cannot happen. But at some point you risk not reaching your full potential. You can’t really impact the world if you don’t try to dream bigger than you ever thought was possible.
 
You should always combine these two things. Remember that you are, after all, a social being, and that you should try to work with other human beings and be part of this society. I guess another way to say it is that culture is very important. And by culture I mean a company culture that promotes inclusion and diversity and is empathetic. Don’t think that success is only related to IQ; it is at least 60 percent, if not more, related to EQ.
