Maëlle Gavet is a tech industry leader, having held positions as the COO of Compass, Executive VP of the Priceline Group, CEO of OZON.ru, and Principal of the Boston Consulting Group. Maëlle has been named a Young Global Leader by the World Economic Forum, one of Fortune’s 40 Under 40, one of the Most Creative People in Business by Fast Company, and was fifth among Time Magazine’s List of the Top 25 ‘Female Techpreneurs’. Maëlle sits down with Chris Snyder to discuss her first book, Trampled by Unicorns, and the tech industry’s growing empathy problem.
Today’s show is sponsored by Banks.com – the world's most comprehensive and trusted branding and discovery platform for banks and banking-related products and services. Banks.com aligns consumer core values with trusted financial institutions, bringing attention and awareness to leading financial brands.
"Basically, corporate empathy is what I would define as the ability for a company to understand and integrate into its decision making the impact it has on people and people in general." - Maëlle Gavet
"Engineers have this strong belief that innovation will make the world a better place. And as a result of that, all the bumps in the road, all the destruction that innovation creates, are actually for the greater good." - Maëlle Gavet
"The best investment you'll ever make is the investment you make in your people." - Maëlle Gavet
[00:00:02] Hello, everyone, Chris Snyder here, host of the Snyder Showdown, president at Juhll Agency, and founder of the financial services platform Banks.com. On this show, we take a no-B.S. approach to business success and failure, told through the stories of the top entrepreneurs and executives who have lived them. Join us today as we get the unfiltered backstories behind successful brands. My guest today is Maëlle Gavet. She is a tech industry leader, having held positions as the COO of Compass, Executive VP of the Priceline Group, CEO of OZON.ru, and Principal of the Boston Consulting Group. Maëlle has been
named a Young Global Leader by the World Economic Forum, one of Fortune’s 40 Under 40, one of the Most Creative People in Business by Fast Company, and was fifth among Time Magazine’s List of the Top 25 ‘Female Techpreneurs’. Maëlle sits down with Chris Snyder to discuss her first book, Trampled by Unicorns, and the tech industry’s growing empathy problem. By the way, there could not be a more convenient time to have this conversation.
[00:01:18] Super excited to have you here today. Welcome to the show.
[00:01:23] Hi, Chris. Very happy to be here.
[00:01:25] Excellent. Well, I think, you know, if anyone's heard the show, the way we kick this off is I think everybody likes to know the origin story. So tell us a little bit about your upbringing, where you grew up and how you got to where you are today.
[00:01:38] Sure. So I'm sure you can already hear it: I'm originally from France. I've been living in the U.S. now for a little over three years, but I've been coming to the U.S. for a really long time, probably almost 20 years now. I grew up in what you would call your typical middle-class family. My mother was a teacher, and my father was an engineer at a time when being an engineer was anything but glamorous — he spent most of his time in a back room with no windows, working with punch cards. So that is really the environment I grew up in. And because of that, I started my first company actually pretty early on, at 16. We had enough money to buy books — there was an unlimited budget for books — but that was about it. And I liked pretty dresses and I liked going out. So I built my very first business, which had nothing to do with tech, when I was 16, and from there I went on to build a second one, a third one. I joined BCG, as you mentioned, and then I got hooked on technology. I realized how technology was truly changing the world and how fascinating it was for me to be part of that journey. So I've been an entrepreneur, I've been a consultant, and then I've been an executive. And now I'm trying to be an author — I don't know about that, I will say.
[00:03:17] I think your timing is great, and obviously your experience speaks for itself. But going back to that experience when you were 16 years old — you know, I have a daughter that's going to be 11 this year, and she has an unlimited book budget as well. I think she reads four- or five-hundred-page books like they're going out of style. But what was it, at a very early age, that created this sense that you wanted to be in charge of your own stuff, build your own stuff, be an entrepreneur? Was there something in your family, your relatives, or society as a whole that said: look, I'm going to cut my own path, I'm not going to sit in the dark room with a punch card, I'm going to do it this way?
[00:04:05] I think it's a mix.
[00:04:07] I think I was very fortunate to be raised by a family that deeply believed that girls have the same potential and the same opportunities as boys, and that the sky is the limit as long as you work really hard. My grandfather — I was very, very close to him — had one of the most unbelievable work ethics I've ever seen, and at the same time he was an incredibly kind, caring man. He fought during World War Two; he was in the Resistance. He had three daughters — no boys, just daughters. And what my grandfather taught me, what my father taught me, and obviously my mom too, was that I was in charge of my own destiny. I could do whatever I wanted as long as I was working hard enough to get there. My grandfather used to say — in French, obviously — that nothing worth doing is ever easy. Yeah. So that was it.
[00:05:21] Well, that's great, you know. And I'm noticing — I mean, we're similar in age — that you actually went to school for Russian language and literature, right? And I find this a lot with tech entrepreneurs and folks that are in tech.
[00:05:41] They usually start in, you know, literature or, you know, some design categories or, you know, political science or something in the humanities.
[00:05:51] And then you started your career at Boston Consulting Group. When you went to BCG, did you already have an idea about tech? Or did that role, going from client to client —
[00:06:03] Did that introduce you to the whole tech ecosystem?
[00:06:07] It really introduced me to it.
[00:06:09] It's quite interesting, actually, because of my father's background, I got my first computer when I was, I don't know, 10, which at the time was no small feat — it was not common to have a computer at home. That's how I learned to program. But then, for whatever reason — probably boys — from the age of whatever, 16, until the age when I joined BCG — how old was I? Twenty-five or something — I somehow completely ignored computers and the Internet and all of that. I don't know what happened. But when I joined BCG, tech wasn't really part of my thinking. It wasn't really part of my environment. I wasn't particularly excited about tech companies; I had actually never really thought about it. And I just happened to be working in two practices — specialties, as we call them at BCG: one was the consumer goods and retail practice, and the other was TMT — technology, media and telecom. And the reason I ended up in the TMT practice was because there were a lot of questions around how traditional consumer brands were going to embrace the digital revolution. So I started from the consumer brand side, and this is where I discovered e-commerce. I was like, oh, that's interesting. And from there I got deeper and deeper into tech. And this is where I was like: oh, that is where I want to work. This is where the world is being made. This is where I want to be.
[00:08:03] Yeah, well, that's interesting, because studying the humanities gives you that interesting perspective on how people think, right? But I think it's also interesting that you've basically gone to e-commerce — and not only e-commerce, but marketplaces, right? So Ozon is an e-commerce marketplace, a Russian e-commerce marketplace. And then Priceline — we all know that's a marketplace. Compass is a real estate marketplace. I've got to tell you, you're either lucky or super good. You've made the exact right decisions over the last 20 years: marketplaces and e-commerce. Was that intentional?
[00:08:52] No, not at all. And it makes me smile that you say that, because I can tell you that every time I joined one of these companies, people around me were like, are you crazy? I remember the face of my mother when I told her: hey, Mom, I'm about to become a partner at BCG, probably a year and a half from now. But you know what? I've decided that I'm not going to become a partner at BCG. I've decided to join this tiny company called Ozon. Oh, and by the way, I'm moving to Russia. And yeah, they're an e-commerce company — think of it like the Amazon of Russia. My mom had no idea what Amazon was, by the way. And I believe this company one day is going to be the largest e-commerce platform in Russia, and I want to be part of that. So imagine having this conversation with your French mom, who spent 30 years of her life working as a high school teacher in Paris, making sure her daughter got, you know, the best education.
One would argue the best education on the planet — to then go walk this very traditional path, which I would argue would have been very safe, right?
[00:10:20] Yes. And instead, you're like, I'm going to Russia. She was freaking out, right?
[00:10:29] Totally. I think it was the combination of, like: okay, you're going to Russia — and you're going to do what?
[00:10:35] Oh, and by the way, as I look at it, this was the capital markets implosion, right? This was 2008. So I'd like to suggest that the Boston Consulting Group was probably the safest place to hang out at the time — and we weren't even out of the woods on that. Yet you went to Ozon, and you started there in a sales role, no less. So it's not like you went there and took an executive job; you had to do the work, you had to work your way up there.
[00:11:14] Actually, I joined as sales and marketing director, so I was already part of the executive team. But when I joined, they basically told me that the CEO — who became a friend, and who has been such an amazing mentor for me — knew that he was going to leave about a year and a half later. And so they were looking for his replacement. The full story is that I was at BCG at the time, and I did a small project for that company — not because of the company, but because one of their shareholders was a good client of BCG. The company was really small; we would never have taken that kind of project otherwise. And the even funnier story is that I actually bent over backwards to not do that project, because I was still thinking, OK, maybe I'm going to become partner after all. And when you are on the verge of becoming a partner at any consulting company, BCG among them, you are supposed to build your portfolio of clients. So I was looking at this tiny little Ozon and thinking: they are never going to buy another project, so why am I going to spend several precious weeks of my life working on a client that will never buy another project? I did everything I could to not get on that project. And then they forced me onto it. And it was like a revelation. I loved the culture, I loved the speed, I loved the complexity. It was just like: oh my God, this is perfect. This is like being an entrepreneur without having to do the zero-to-one. And so I joined them, because it was just this aha moment: this is where I'm meant to be. And there was definitely a part of risk, because when they told me, you know, the CEO is going to leave and maybe in a year and a half you'll replace him — there were three other candidates, and I was by far the least attractive on paper. I was this young French woman with a humanities background
who had just spent six years as a consultant, flying all over the world — actually, I hadn't been a consultant in Russia for more than a few months. But still. Again, I remember my mom's face — and I'm taking my mom as an example, but I can tell you I had this conversation with so many people who were looking at me like I was completely insane. It was just like: I know, I know it's crazy, but I feel it in my gut — this is the right company to go to.
[00:14:00] Yeah. Well, I'm so glad we spent some time on this, because, I guess, you know, eBay was hot at this time.
There were a lot of other marketplace businesses that were hot at this time, and you basically got the book of secrets for supply and demand — all the bidding and budgeting algorithms, you know, the platform business: the businesses loading up their products and producing these feeds, and consumers coming in. You got to see a lot. You saw it there, you definitely saw it at Priceline, and you saw it again at Compass. So there's this kind of book of secrets that not only were you introduced to, probably 10 years ago or even more, but then you continued to build and grow on that experience. And I think that's an important baseline to talk about the book, because at the end of the day, you've seen how the algorithms are created — these are successful media algorithms. And maybe you were exposed to some consumer reviews or customer reviews; you were probably exposed to some back-end algorithms that adjusted prices and gave premier placement to some products or companies rather than others.
[00:15:21] But you probably started to see the amount of deep tech, algorithmic tech, and big tech — and when I say big tech, I mean this is stuff that scales, right? So when did you realize, after all these experiences, that you wanted to write a book? Because you left Compass in 2019 — obviously, that wasn't that long ago. And then you're sitting around in New York City, you know, the pandemic is coming, and you know marketplaces and network effects are the businesses to be in. So you're like: I'd better write a book about this. Another good guess, I must say, because it takes about a year to write a book — takes some people two years, too. So you guessed right again. You're smart. You guessed right again, and now we're here.
[00:16:13] What was it that drove you to this — Trampled by Unicorns and the tech industry's growing empathy problem — after all of this experience?
[00:16:22] So before I answer that question, I just wanted to make a comment on the first half of your question, which is around my understanding of what you'd generally call the algorithmic ecosystem. Actually, when I started at Ozon, retargeting did not exist in Russia. I worked very closely with what at the time was one of the biggest retargeting companies — it's a French company, Criteo, which became a public company. Are you familiar with Criteo? Yes. OK. And so I discovered retargeting as we were helping them grow in Russia, and I remember at the beginning being super excited — it's like, oh, that's amazing.
[00:17:12] We're going to be able to show targeted advertising — like, these are people who just came onto a website, we know what they viewed, and when they go somewhere else, we're going to be able to show it to them. And then I remember, a couple of months later, when these ads started actually following me. And I'm like, oh my God, that's creepy. That was the first moment where I was like: oh, wait a minute. That means that because I looked at whatever book, I'm going to see this same book, or a similar book, for another six months — and they know about it. And then I remembered the other piece, which was that we built from scratch an emailing strategy — so, depending on what you had seen on our website, again.
[00:18:05] And I had the same experience later at OpenTable, which belongs to Priceline, and then at Compass. But at Ozon we were building an email chain for the very first time: depending on your activities on the website, we would send you different emails. If you're on Amazon — I mean, I'm sure we've all been customers who receive these emails all the time, unless you opt out.
[00:18:29] And it was the exact same process. It was me being super excited about it, and then, about a month in, I was like: wait a minute. That actually means that my company registers every single thing I do on the website, every single thing that I buy, every single thing that I don't buy, and then decides what to send me to try to influence me. That's right. And it was just really scary. And by the way, one of the reasons why I never worked for these big tech companies that I talk about in the book was precisely that — I was realizing that I was fundamentally uncomfortable with the core of the business model, the surveillance economy, as Shoshana Zuboff talks about it. And I was just like: I cannot do that. So sorry, it's a long answer — it's not quite the question you were asking — but I lived this because I had to build it. I built recommendation algorithms. I hired some of the very first A.I. engineers at Ozon, and I did some of this work at Priceline with OpenTable. And then again at Compass — we hired the first A.I. team for Compass. I was at the core, building it from scratch. And every time — I remember, like, when I realized what you could do on Facebook to target people, what it looked like for agencies — I was like, oh my God, this is going to be so bad.
[00:20:14] I'm glad you brought this up, because you knew maybe 10 years ago that this sort of surveillance, as you call it, was happening.
You knew it internally — it just made you feel icky. And you're kind of like: I don't really like this, you know? God forbid someone has a health problem — maybe they have some health problem and they go to a health website, and then everywhere they go, on Yahoo's home page and wherever else, they have to be reminded every single day that they have a health problem: you should take this medication. That's terrible, right? It's like: why do I need this? Why are these people following me? But you know what's interesting? Either nobody knows or nobody cares, because 20 years ago everything started to go to free. And when things are free, you have to realize you are the product, right? Your data, your life — you are the product. So if you're not paying to go to a news site, and you're looking at ads on a news site, they're targeting you, and you are the product. So it sounds to me like you knew a very long time ago that you didn't really want to get into this. And it's interesting: you were early, but as you know, these things are cyclical. Your timing was early, but it was right — it just wasn't right until right now. And now the book is right.
[00:21:48] So, you know, I wrote my very first article about data regulation for Wired six years ago. It was an op-ed, so it was a one-pager.
And I was like: we've got a problem. We have these big tech companies which are mining our data like it's oil — which is something that has been said many, many times by many, many other people, so I don't claim to own that framing. But what was fascinating to me was the reaction of my peers.
I have a lot of friends in the big tech companies in Silicon Valley, and they were reading that. And to me, it seemed so obvious — I didn't feel like I had written something that was worth talking about.
And I remember the reaction of my friends telling me: you're crazy, why are you writing that? Why do we need regulation? Government is bad. And like, what's the problem with the data? The future is in the data. But that's the problem, right? There's no oversight, and these companies are collecting more and more data. I think it goes beyond just "the product is free, so you're the product." It comes down to the fact that most people, to this day, still do not fully understand — and trust me, big tech companies are doing everything they can to make sure you don't fully understand — how much data is being collected on them and what is being done with it. So I think it goes beyond "it's free." People do not fully understand the tradeoffs they are making. And on top of that, there is no alternative. Let's assume you want to use Facebook because your friends and your family are on it, but you don't want them to collect your data, and you'd be ready to pay — there is no such option. I would pay whatever price, but they won't offer it.
[00:23:48] There are billions of users. And we can get into solutions too, but let's just say, since you brought it up: I would pay 12 bucks a year for Facebook with no ads. I would pay 10 bucks a month for Facebook with no ads. And by the way, some of the co-founders and some of the early employees of these tech companies are very outspoken about what is going on and how dangerous it has become. So you're in front of this thing. One thing before we get into the book: you're still, I believe, an investor, and you sit on a board.
You're also a limited partner at Operator Collective, right? So it seems like you're still deeply involved in a lot of the day-to-day, but also as an investor, and also doing a lot of other strategic things, right?
[00:24:42] Yeah, I'm in tech. Tech is my world, and I'm an optimist. I really do think that technology companies, when managed properly and with the right values at their core, are actually a source of progress and innovation.
[00:24:56] And I think humanity can benefit from it. So yes, I'm still very, very involved — some of these things officially, some of them less officially, but very, very involved.
[00:25:08] So let's talk about the article you wrote in Fast Company — and I know you've written a lot of them — on September 1st of this year: How Social Media Is Pushing Us Toward 1984, which is some amazing George Orwellian stuff. And I actually read the article. And, I mean, obviously a lot of the folks listening are in our demographic. We're talking about, you know, dystopian Big Brother machines that can't be switched off, that record every conversation and movement.
You know, our media is becoming reductive and simplistic — think Twitter with 140 characters; I know that's changed. And the two minutes of hate. I started reading this thing and I was like: oh my God, how did you do this? Because that's an old book. We're not really old people — I mean, we were maybe 10 years old when we first read that book, or maybe a little younger. But at the end of the day: how did you get a hold of that book? And then you read it, because you're already on this path, and you tied them together, and you wrote this article for Fast Company. How did that happen? And second, to follow up on that: how do you feel about those things I mentioned — the dystopia, the character limits, the two minutes of hate? Can you go into that a little bit?
[00:26:37] Of course. So, you remember when I was telling you there was an unlimited book budget at my parents'? That's exactly what happened. During the pandemic, I went back to France for a month to take care of my family — they really needed some help. And so I found myself, after quarantining and testing negative, in my parents' house one afternoon, thinking: OK, I have a free afternoon, let me just read something. So I go through the multiple, multiple bookshelves in our home, and suddenly I see that book, and I'm like: oh, I haven't read it in a really long time. Time to read it again, to see if it's as good as I vaguely remember it was. And I started reading it expecting, you know, the usual Big Brother description, thinking — we all know the analogy between big tech and Big Brother — maybe it will be interesting, it will give me a good conversation for my next dinner, virtual, obviously. And then I read that thing and I was like: oh my God. This goes so much deeper than just the concept of Big Brother and continuous monitoring. It goes so much further than just comparing our iPhones to what they have.
[00:28:10] And so they have — what is it — the telescreen. Here we go. Oh yeah: the Amazon Alexa, the Google Home. Exactly.
[00:28:19] So I was expecting all of that — like, I knew all of that. I was like: okay, great, that's a great book. And then, as I was reading it, I was like: oh, but there's so much more to it. Orwell had this unbelievable ability to describe what is currently happening. You briefly mentioned the concept of the two minutes hate, which is this daily ritual of outrage orchestrated by the Party. In the novel, people interrupt whatever it is they're doing, every day, and stand in front of their telescreens to flame the enemies and celebrate Big Brother. And what is really scary, when you read the description, is that Orwell explains how, even if you didn't really want to take part in it, after 30 seconds you would find it impossible not to join with your heart and soul. You would go there because you were forced, and then you would be drawn into it, and you would feel this ecstasy of outrage and fear, this desire to kill and to torture. And as I was reading that description, I was like: oh my God, this is exactly what happens so often on social networks. I talk in the article about what happened during the GamerGate scandal, where a woman and her family became the target of a massive harassment campaign — people leaked her personal photos and threatened her with rape and death. And then I thought about all these right-wing and conspiracy-spewing influencers who are propagating racist and misogynistic memes — and how all of us, at one point or another, myself included, get drawn into it. We read an article with a particularly catchy, emotional headline, and we post it, we retweet it, sometimes without even reading it, and we contribute to that.
And so, to me, these two minutes hate actually happen in our daily life, when we log into a social network and, for whatever reason, get really engaged in these hyper-emotional, hyper-aggressive debates — and I have a hard time even calling them debates. It's a back-and-forth between people who don't get it, because nobody's listening to each other.
[00:31:01] You know, I refuse to talk about politics. There's a lot of things I refuse to talk about.
And I, too, pretty much spend my day in tech, because some of these other issues are difficult to resolve, and when you try to speak publicly about them it can be dangerous, candidly. So I try to stay out of the fray. Plus, it's emotionally just draining — Jesus. I mean, I can't sit there all day and look at this stuff, because we have work to do, right? We're actually trying to build stuff that people like, not, you know, create tension and build stuff that people hate. But I do remember a day — and this may have happened in your family as well — when people were allowed to share a different perspective. The spoken word was something that a lot of folks practiced, and the written word, although complex at times, was some signal of your interest in a topic or your ability to communicate effectively. And we don't do that anymore. So I opt out of a lot of those discussions. Which actually brings me to another question for you about Newspeak, which was also in that article. Can you talk a little bit about Newspeak and how you tie it back to Orwell's 1984?
[00:32:25] Yeah, absolutely. So in that book, Orwell talks about a new language called Newspeak, which is the language used by the Party to strip meaning out of language.
It reduces the number of words over time, and Orwell describes extremely well how words are being eliminated, and how the goal is to have as few words as possible — so that you can limit the way people feel, the way people process ideas. If you don't have the words, you can't process ideas. And if you don't process ideas, then you don't process the world. And if you don't process the world, then you don't comprehend it. And so the language completely loses its meaning. There's this sentence which is repeated in the book many times: "War is peace. Freedom is slavery. Ignorance is strength." And this comes directly from Newspeak — from the manipulation of the language, where words are completely emptied of meaning. And the Party's ultimate goal — why is the Party doing all of that? — is to become completely in control of what is considered reality. On top of that, throughout the book, Orwell describes the process by which the Party physically rewrites history, constantly — literally rewriting history by reprinting magazines and things like that. And when you think about what is happening on social networks, it's not fundamentally different. The language is becoming more reductive and more simplistic, because viral outreach requires it — you can't go viral if you have a complex concept to explain. The character limits that you mentioned are part of that. Hashtags are this genius idea to surface and promote catchy, easy-to-understand ideas, events and trends — but again, it's catchy, it's easy to understand, it's very simplistic. And on these platforms, in the same way, the viral outrage we were talking about before is the two minutes hate translated to social media.
Well, it's not an accident — it's actually a core feature of the medium. The same way this simplification, these easy-to-understand trends — nuance is not rewarded. And it's not an accident; it's a core feature of the business model of social networks.
[00:35:18] Now, can you imagine your 2,000- or 4,000-word prose going viral? You have to sit down, think about what you want to say, and then research it so you're not saying something stupid. Then you have to provide the backup and the facts for why you believe what you believe, and then you have to allow people to come into that conversation and potentially argue their points as an equal. Can you imagine that? It just really doesn't exist. And candidly, it doesn't exist much in journalism anymore either — not on the ad-supported platforms, because the incentive is for engagement, the incentive is for clicks. So we're misaligned on incentives as well. These are really deep and somewhat shocking points, and I'm glad you wrote them, and I'm glad that Fast Company published them. So now let's talk about — OK, I think we've identified some of the problems. I mean, maybe we can just say George Orwell created all these problems — no, I'm kidding. But I'm so glad he thought about this; it's not the only thing he thought about, either — he's thought about a lot of really neat things. But at the end of the day — just taking this down a notch, because I'm crazy about this stuff — you're positive in general about tech; you know there are opportunities in tech to fix this. So we're in the rabbit hole, we're in deep. Facebook is not going away. Google is not going away. Twitter is not going away. LinkedIn is not going away. There are so many of these platforms that connect the planet and do provide some good. So now that we've basically told everybody all these bad things that are happening — I don't think that's good enough, and I know you don't either. So how do we fix it? What are your proposals? What are some specific changes you can put forth to help fix this problem?
[00:37:34] So there is no silver bullet. Then, to answer the question that I didn't answer before, around the book and the timing: I've been working on that book for more than 18 months. I was actually already working on the book when I was at Compass, because I've been talking about this topic of how to put humanity at the center of tech for really quite some time. And I kept talking to people, be it at OZON, Priceline, OpenTable, or Compass, about the fact that to build a great tech company, it doesn't start with a great product. It starts with great people who care about other people. And then these great people go and build great products. And so you've got to always go back to people, in the way you build your company and in the way you think about the product that you're then pushing outside your doors. And so as I was writing the book, what became really obvious as I was researching everything that had been written was that a lot of the articles I was reading in the media, and the vast majority of the books I was reading about tech.
[00:38:49] They were talking about problems. And most of the time, they were talking about problems in a very dark, very one-sided way. I think there is some really good stuff that tech did. It's not just all the ugly; it's the good, the bad, and the ugly. There's definitely some good, but not only. I thought they were extremely one-sided. Like, everything is horrible. Everything is dark.
[00:39:13] We're doomed. But then there was no solution. And back to the headlines.
[00:39:18] Right. And you're talking about the popular media, who have ad-supported platforms. There's a couple who don't. But of course they're just going to give us the worst of it, like the social networks do.
[00:39:32] Right. They do the exact same thing that they say is bad and they're making it worse. Right.
[00:39:39] Yeah. And so I wanted to make sure that I would actually really talk about solutions. And that's part of the reason why the book took so long to write, because I wanted to make sure that at least half of the book was about: OK, so now we all agree that we have a problem, and we all agree about what the problems are and where they're coming from, and again, the good, the bad, and the ugly. Now let's talk about what we're going to try to do to fix it. And again, there is not one single thing that you can do and that's it, you've solved the problem. But there are definitely a series of things that can be done. My overall approach is that I believe solutions will come from a more balanced role between companies and governments, and a heightened awareness from users about what's happening. So I don't believe in self-regulation. I'm not a socialist; I'm not advocating for governments to take over everything. But I have a background in humanities, I've studied history, and I have never seen any company or industry ever self-regulate. It just doesn't exist. And so there's no reason why tech would be different. At the same time,
[00:41:00] I did live in Russia, I spent some time in China, and I've seen firsthand what it is to have a country which is extremely tightly controlled by the government. And I don't want that either. And so the first thing that I advocate very strongly in my book is that there are certain areas where we absolutely need government to step up. And there are topics that need to be at the center of conversations with our elected officials, things like antitrust. There were some hearings not long ago; we can talk or not talk about it. I think we can do a way better job than what we've done so far around antitrust, but also things like taxation. The tax code, both in Europe and in the U.S., is absolutely not adapted to tech companies. And it's too easy to point the finger at them and say they're not paying their taxes; they're actually paying the tax that they're supposed to pay when the tax code is just not optimized the way it needs to be. There needs to be a huge conversation about labor law, what we do with skilled workers, and how we protect them properly. There needs to be a huge conversation around misinformation and civil discourse and whether or not we should be authorizing microtargeting. So one of the really important conversations for me is around how we rein in the surveillance economy: how we make it easier for users to access, manage, and delete all their data; how we make sure that companies keep users more informed about data collection and targeting; and how we make sure that these same companies are more transparent about the users' rights and all the implications of all the likes and the follows. We need to have a conversation about data mining and personalization and whether it should be opt-in or opt-out. We need to have a huge conversation around data brokers, which are at play especially in the U.S., because they collect data
and then sell it without you ever knowing where your data is going.
[00:43:11] They used to call it: we'll aggregate it, and then we'll anonymize it, and then we'll sell it, because maybe that seems better. I want to make two comments on your proposal here. My first comment is yes, I think we should have an opt-in policy, not an opt-out policy. You know, it's funny, as the whole cookie palooza with Chrome and Safari and everything happened, you go to websites now and you get the annoying thing, and you just click it. Nobody reads this stuff, by the way. They didn't really make any significant changes. The tech companies know that all they have to do is go into their user experience lab, look at a million of these things that pop up, and they'll get the customers to say, OK, collect my data. So fundamentally, I don't know where you stand on this, and it doesn't matter; there should be a public discourse, to your point. I think it should be opt-in, not opt-out. And then secondarily, do you really think that a lot of these jackasses in our government even understand what we're talking about right now? These people are 60, 70 years old. They just don't know what's up. And we're asking them to regulate an industry that's probably no more than 20 years old. Right? Facebook was not a publicly held company 20 years ago; now they're one of the biggest, most valuable companies on the planet. Same with Apple. One would argue IBM was around, but I don't see them being the beast that will do these nefarious things. Do you really believe we have the capability in our government, with the people that are in our government, to take this role on as you described?
[00:45:02] So again, I'm an optimist, and as a European, I have a strong belief that governments are actually a really key component of a society. And I'm talking, obviously, about democratically elected governments. I think in the West in general, we have democratically elected governments; we want them to implement a certain vision of society, a certain set of values. And I think that's their role. Now, are the current politicians, the current officials, always up to the task? Unfortunately, no. But I think that we are seeing a new generation of politicians who do understand that this is really a key topic.
[00:45:47] And yes, the 65- and 75-year-olds may be a little behind, but I want to believe that more and more of them are being educated by their teams, because they have teams; it's not just one person. I believe that a lot of them are more and more realizing the impact, but it isn't happening fast enough. I wish it was going so much faster. I'm worried about the fact that our governments on both sides of the Atlantic are not moving fast enough on these topics.
[00:46:26] But I do see movement, and I think we don't have a choice. Like, what is the alternative? No matter how much I want to believe that big tech is going to keep its promises, a lot of what I'm talking about, opt-in and microtargeting regulation, tech companies could actually be proactive about: really think about the end user and the impact they have on humanity, and implement a lot of these changes without the regulator coming in. That would be my dearest wish, that the regulator didn't have to intervene; that would be awesome. But as we've talked about, self-regulation doesn't work, and so the regulator will have to step in. And I think we don't have a choice. We have to be voting for people who understand this topic. We can't just complain about the fact that they're bad at it; let's just go and elect the ones who actually understand it. We don't have a choice.
[00:47:29] No, I agree with you a thousand percent. So one thing I wanted to ask you: the title of your book is Trampled by Unicorns, and then there's a subtitle that says Big Tech's Empathy Problem and How to Fix It. Can you describe the empathy problem statement? Can you go into that a little bit?
[00:47:49] Yes. So it starts with the idea of corporate empathy. I think people don't always understand what I mean by corporate empathy, so let me start with that. Basically, corporate empathy is what I would define as the ability for a company to understand and integrate into its decision making the impact it has on people: its staff, its customers, or human society in general. And the problem with tech is that building empathetic tech has been really difficult, because, one, no new industry has had such a rapid, profound, and extensive impact on humanity, and it's just really hard to understand the impact we have on all these people. Two, tech companies are, in their vast majority, built on the idea, publicly expressed or not, consciously or not, that humans are inefficient and that technology can replace all of this inefficiency, can replace humans; that people are often what I would call an adjustment variable, the thing that tech leaders are trying to get out of the equation. And so we have, by definition, an empathy problem, because, one, we are impacting humanity so fast that we just don't have the time to fully understand how we're impacting it and what's really happening. And two, we are built on this idea that we are going to replace humans, that code is so much more efficient. So why try to really understand the impact we have on the people we're trying to replace?
[00:49:37] And then, you know, it's so funny, you're saying these things and I'm like, well, wait a second, this might be obvious. So that's why you've written a book and I haven't. But these companies are being created by engineers, so of course they're going to tackle these problems through their lens. And of course, when they see an opportunity in user-centric design, it's an opportunity to, dare I say, optimize towards their incentive.
[00:50:08] And I just don't think they have the right incentives, right?
[00:50:13] Totally. So really, I talk in the book about the culture of Silicon Valley, which comes a lot from the fact that at its heart, it has engineers. And look, I've been working with engineers for a really long time. They are unbelievably smart people. They've built life-changing products that we use all the time. And again, I'm an optimist; I really believe that they've made the world a better place on many, many occasions. But they also come with that lens that you're talking about, which is that code is better, that humans are inefficient by nature.
[00:50:48] And so we've got to replace them with code. And then the last thing, and this goes back to your question about what the tech empathy problem is: most of the time, engineers have this strong belief that innovation will make the world a better place, and as a result, all the bumps in the road, all the destruction that innovation creates, are actually for the greater good. And what that means is that if you really deeply believe that innovation drives you to a sunnier, happier place, then the bumps in the road are just that, bumps, and you just have to go through them. You forget that all these bumps and all this destruction affect real people with real families, with their pains and their emotions. They're not just a data point. It's not good enough to say that the world will be a better place in 20 years if it's not a better place right now. And so because of that, there are all these things happening within tech that make empathy such a core missing value in technology.
[00:51:50] Yeah. This is just fascinating.
[00:51:53] And honestly, I mean, you are the expert in this field, but I've never seen anyone... Obviously, I think there are some op-eds and some stuff out there around this, but I've never really seen anyone go out and write a book about it. Have you experienced... because you're in the crowd, right? You're in the group. You're in the cool crowd.
[00:52:19] I don't know that I'm in the cool crowd. OK, yes, let's say that. It makes me feel good.
[00:52:25] And that's kind of what I'm getting at, right? You look at what you've done, you look at the businesses you've participated in. You're part of the group. And now basically what's happening is you're saying: the group that I'm a part of, I think all of this needs to be done a different way. Have you received any criticism or backlash based on these comments? Because if we go in the direction that you're proposing, which I believe we should, by the way, people are going to lose money. I mean, think back to Criteo. I can't imagine what the new Chrome and Safari work, and some of the Apple work they've done, has done to Criteo and a few of these other retargeting companies. I've used them plenty of times as an advertising agency. It must have really, really hurt their business. Have you received criticism over this approach, and how do you deal with it?
[00:53:25] So the book hasn't been published yet, so I expect the avalanche of criticism to happen after people read the book. But because I've been talking about these topics for quite some time, yes, I did receive some criticism, mainly around the fact that people believed somehow that being empathetic means being weak. And I always reply to them: being empathetic doesn't mean that you can't make harsh decisions. You can fire people in an empathetic way.
[00:54:04] You can pivot an entire company, change your business model, and do it in a way that really, really thinks about the impact on people, which was the definition of corporate empathy I was talking about. And I'm yet to find a technology company, by the way, which is too empathetic. Show me one that is really so empathetic that because of it, they can't make a decision, and then I'll be like, yeah, maybe empathy is a problem. But right now, we have a deficit of empathy more than anything. So that's the first argument that I usually have with people who criticize what I talk about, again, well before the book. And then the second one is the one that you've just mentioned, which is: oh, you can't make money. And I actually fundamentally disagree with that. In parallel with the book, we're going to launch a corporate empathy test. You'll be able to go onto my website and take a test for your company, where you'll be assessing your company's corporate empathy score. And over time, I want that data, which will be fully anonymous, by the way (it's not about you, it's really about the company), to figure out: is there a strong correlation between empathy and performance? My understanding, and again, I want to have more data to prove the point, is that when you build an empathetic company, you build a sustainable company. And so maybe short term you are going to grow a little less and be a little less profitable. By the way, I am not advocating for you to become a nonprofit. I'm a capitalist at heart. I believe we don't need less capitalism; we need more empathetic capitalism, capitalism with a conscience. The same way I keep repeating: we don't need less tech, we need more empathetic tech. So I'm a hardcore capitalist.
But I do think that to save capitalism, to have companies that are not going to be just the fad of the moment, that are not going to be there for just 10 or 20 years, we need sustainable companies. And how can you build a sustainable company if you do not care about humanity?
[00:56:28] Yeah, you can't do it. I mean, honestly, if you have kids, or you have moms, or you have dads, or you have aunts or uncles or cousins or nieces, if you have people in your life,
[00:56:42] You should care about other people in a. If you get educated towards this issue. You'll realize very quickly that probably a lot of these folks are being taken advantage of, including ourselves. And there's a lot of smart people out there that have used, you know, different browsers and different ways to avoid all this tracking and all this stuff. But let's let's be let's get real. That's for like the top one percent of tech users who understand that they don't want to be followed around and yet given these messages. So it's a really difficult thing to do. Well, I mean, I got to tell you, we could talk about this all day, but we will have to wrap it up. But before we do that, you've got a ton of experience. This is an executive audience of both founders, entrepreneurs and C Suite in tech building companies. If you had to share one or two pieces of advice with our audience or listeners, what would that be?
[00:57:41] So, not surprisingly, my first advice would be: lead with empathy. And again, it's not a weakness to try to understand people. It's not going to make you make a worse decision. If anything, as you fully understand the impact of your decision, you're going to be able to make better decisions, and you're going to be able to see around the corner, because you'll understand better how people may actually react.
[00:58:06] And your company fundamentally is made up of people. They're the ones who build your product. They're the ones who run your factories, if you have factories. They're the ones who do everything that you have.
[00:58:19] And so lead with empathy. See it as a way to really be smart about understanding the world around you. And then I would say: don't be afraid of the truth. I think a lot of people, when they start leading with empathy, feel the need to be much more transparent and much more open about some of the things that are happening in their company, and to share some of the difficulties so that people understand what's going on. And suddenly there is a reflex, which is: oh, I can't possibly tell my employees X, Y, or Z. And my experience over now a long time, 15 years in tech and twenty-five years working, is that the truth is always, always the best choice. Employees have a capacity to handle the truth. And they know. They know when you lie. They know when you hide something. They don't necessarily know it consciously, but they feel it. You'd be amazed by how employees react when you are transparent with them and say: yes, I screwed up, I made one of the most terrible decisions, I should never have done that. Or: I'm stuck, we have this really terrible decision to make about employment and payroll, or whatever it is that you're dealing with, and it's difficult. And I'm not sharing that so that you feel bad for me; I'm sharing it to explain to you why this decision is difficult to make. I have repeatedly been amazed by the ability to come to much better decisions when you lead with empathy first and truth second. The truth is always the best policy, no matter what. And then, if I can (I know you said one or two, but I'm going to add a third): the best investment you'll ever make is the investment you make in your people. Every single time, whatever it is.
[01:00:27] And I'm not just talking about training, because when we talk about investment in people, it always ends up being: oh yeah, we need more coaching and more training. I'm talking about the time you spend with your employees to understand them, to share with them what's going on, and to hear their feedback. I'm talking about the time as an executive that you're going to spend hiring the right people.
[01:00:49] I'm talking about all the time you're going to spend thinking about the right compensation strategy for your company. This is all about investing in your people.
[01:00:59] In my experience, and I've been an executive for well over ten years: when I was an executive at the Priceline Group, we had sixteen thousand employees, and when I left Compass, we had twenty-five hundred employees. So these are big companies.
[01:01:17] My experience has been that every minute you spend investing in your employees, in one form or another, pays tenfold
[01:01:27] more than any other minute you put into something else. So invest in your people.
[01:01:32] So: empathy, truth, and people. If you could summarize it, that's excellent advice. And think about it, for our audience out there. Think about your significant other. I hope you don't, but I don't generally think that spouses and parents with children walk around lying to their kids all the time, right?
[01:01:57] I mean, usually there are hotly contested and hotly debated, emotional and truthful conversations about where we stand, right? And I think you should invest in that truth, invest in that honesty, and do it with empathy (which, look, all of us could probably work a little bit more on). As for investing in your people: the world has become so decentralized that it's easy to go grab a freelancer, use them for a few weeks or a month, and then let them go. But if you're going to build a sustainable company with a sustainable team, you need all these people in the boat headed in the same direction, believing it can work, believing it can happen, being truthful with each other, operating with honesty and integrity and empathy and values. These are all really neat and important concepts that you've talked about today. So, look, everyone, I'm not going to mess this up; I want to do this right.
[01:03:06] Maëlle is out with her forthcoming book, Trampled by Unicorns: Big Tech’s Empathy Problem and How to Fix It. September 29th, is that right? Exactly, yes. Written by a Silicon Valley insider who has held leadership positions in high-growth tech companies around the world, and one of the few women to rise to the very top of the industry. This is going to be a good one. We're going to post the Amazon links and everything in the show notes, on LinkedIn, and all that stuff.
[01:03:38] She also has a website that you can go to; we'll post all that. Fascinating conversation. Thanks for coming on the show today. I know everyone's going to love it. Thank you. All right. We'll talk to you soon.