NANJALA: There's a lot of money to be made by people who have figured out the internet … We've made it the most sexy and the most profitable industry in the world. So everybody wants a piece, whether they have good intentions or they have bad intentions

Hi! Welcome to The World Unspun - a brand new podcast series from New Internationalist magazine. I’m your host, Maxine Betteridge-Moes.

You just heard from Nanjala Nyabola. She’s a Kenyan writer, political analyst and the author of three books on digital technologies. She also edited the latest issue of New Internationalist magazine on disinformation and is generally a very impressive person.

I was thrilled to interview Nanjala for our first episode of the show. She’s someone I’ve been following for years, and she manages to make this complex and abstract subject matter super accessible and I think presents some original ideas you likely won’t have heard elsewhere.

Throughout our conversation, you’ll hear us make reference to several articles in the issue, including Don Kevin Hapal’s piece about the Philippines Disinformation Machine, Conrad Landin’s piece about the infamous document that brought down Britain’s first Labour government, and Samira Sawlani’s piece on the role of fact-checkers in combating fake news. Subscribers to the magazine can read all of these pieces and more in print and digital.

Nanjala’s overarching argument is that we need to understand how information exists within an ecosystem rather than in isolation. It sounds complicated, but you’ll hear that it actually makes a lot of sense.

Suffice it to say that Nanjala is in high demand. I managed to catch her between moving house and while juggling multiple deadlines including another book project that’s due out next year. Midway through our conversation, you’ll hear her step out to pick up a delivery, before jumping right back into the conversation.

Thanks for listening! We hope you like the show. You can read more of our award-winning, in-depth journalism.

NANJALA: One of the things that we really tried to do with this issue is to tell a global story. We wanted to move away from the framing that came through during the COVID-19 pandemic, when it was like, oh, if misinformation is this bad in the West, how bad is it going to be in Africa? Right? That was the narrative through line. And actually the opposite happened: a lot of the myths and the tall tales, the things that shaped action in Western societies, didn't take hold in African countries, and they didn't take hold in the global majority.

I study society and politics in the digital age, especially electoral politics, and that's kind of how I came to misinformation and disinformation scholarship: really just paying attention to the way political information has changed over the last 30 years, and trying to situate the histories properly so that we can address the issues properly.

And so for me as a scholar, and this is one of the things that I really wrote about in my third book, we don't spend enough time thinking about the soft matter that makes societies function. I'm always fascinated by things like trust and belief and conception, and I think this is the center of what information is. And this is what we really, really tried to do with this issue: the central character is information.

The central character is not the person, it's not the institution. It is really information as a central pillar of the political process. And I said political deliberately, not democratic, because it works on the same principles in authoritarian systems too. And that's, I think, one of the things that comes across in comparing, for example, Conrad's piece to Don's piece: the Philippines, you know, is a complicated democracy. It has a lot of issues with authoritarianism, with state violence, and yet it can still be an epicenter of this global information ecosystem.

So that's really one of the things that we were trying to put across in the issue: that information is the thing to be studied, not the kind of society that people live in, because democratic societies have been incredibly vulnerable to the misinformation tactics that have emerged in the last 30 years with the digital age.

And there's another piece of this that's really important to understand: the impact of digital technology on the way information travels. We tried to really stress in the issue that the speed of connection, and the size and scope of the networks that digital technologies make possible, is what makes what happens online unique. The actual practices are the same. Politicians lie. Politicians have always lied, and politicians will keep on lying.

And the political system, a democratic system, requires a great deal of vigilance; that's never changed. But what digital technologies make possible is, number one, that we are able to connect with each other on a completely unprecedented scale. At its peak, Facebook had about 3.5 billion users; that's about half of the world's population on Facebook at one point, potentially talking to people who are outside the closed network of people that they see or talk to every day. And that shifts our perceptions of the space that we have for political action. We're seeing this in Gaza; we're seeing this global conversation about the rights of Palestinian people, and digital technology is playing a very important mediation role in that.

And then the other thing, as I said, is the speed at which you're able to do that. That's another thing digital technologies do that wasn't really possible before. There's a great saying that I really love: a lie has raced around the world twice by the time the truth sits down to tie its shoes. This is partly because of the economics of online information. Most reputable media houses put their content behind paywalls because they have to pay journalists, and they have to pay editors, and they have to pay fact-checkers, and they have to figure all of that out. But people who are telling lies don't have to invest in any of those systems. They can just buy a website and put whatever they want on it, and people will consume it. And so those are the things that we really wanted to emphasize to readers: these are the key things that digital technologies do differently, and that's where the scope for action lies.

We know that there is a profit-driven model behind the spread of lies, and that politicians and leaders will use misinformation to spread ideologies and propaganda. But do you feel, both through this issue and in your work more broadly, that you've gotten closer to really understanding the root of why misinformation spreads?

NANJALA: I think the profit is definitely part of it because, you know, there's a lot of money to be made on the internet. Let's take a step back: there's a lot of money to be made by people who have figured out the internet. And so one thing that's happened is that we've made this, being adjacent to, being in close proximity to internet-based technologies, the most sexy and the most profitable industry in the world. So everybody wants a piece, whether they have good intentions or they have bad intentions, everybody wants a piece. We're telling people not to go to university to study philosophy or law or sociology or whatever. Don't even go to study medicine; go and figure out how to do something with tech so that you too can be a billionaire. We've shifted the aspirations of billions of people away from value-driven employment or professions, towards this very value-free, or what appears to be value-free, industry. That's the big meta issue.

When you tie that to the information question, or the misinformation question, then you start to see that you've attracted both people who have good intentions and people who have bad intentions, and they're driven by the profit motive. And it's a marriage between those people and the people who have a great incentive to use their money and their influence to shape politics.

Because, I think, you know, neoliberalism is the handmaiden of fascism, right? People who run big businesses don't want high taxes, and most of them don't want to pay fair wages. Having an exploited underclass is how profits are locked in in a lot of these large industries, whether you're talking about the slave trade or colonization, or, in contemporary periods, fast fashion or tech. The ability to have this exploited underclass is kind of a spine that runs through this massive 'let's build these mega businesses that will produce to consume', et cetera.

All of that is a preamble to say that what we've seen, paying close attention to how misinformation works, is that there is a core group of people, usually young men, who begin with misogyny, attacking women online. Because tech companies don't really care about what happens to women online, we struggle to get them to pay attention to these issues. They do that, they build an audience. And then a politician who is aligned with the interests of big business, or who's able to present themselves as being aligned with the interests of big business, comes in and weaponizes those networks to spread misinformation online. And that politician uses that platform to go back to the people who own big business and industry, to say: hey, I am the candidate who will protect your financial interests. You know, having a democratic society is good for business. Having the rule of law is good for business. But there's this idea that if we take the short-term gain, which is profitability, then we don't have to pay attention to the other stuff.

And I'm oversimplifying, but this is a pattern that we've seen in Mexico. We've seen it in Kenya, we've seen it in the US, we've seen it in the UK. It is such a consistent pattern that it is almost like the playbook. And the playbook is this: what digital technology does is make it easy for people to cultivate audiences.

And these audiences might be cultivated with positive intentions or negative intentions. The marriage of that with people who have the financial capacity to put money into these audiences and turn them towards the political process is what makes societies particularly vulnerable to misinformation. How much of a role does money play in your political system? How much tolerance do you have for misogyny online? All of these are factors that are not very obvious, but studying very closely how misinformation behaves, you start to realize that this is how these things go.

I've given you a long answer, but ultimately it is the role that money plays in our political processes and the incentives that it creates for people to build audiences, weaponize audiences, and turn them towards the political process.

NANJALA: I’m going to interrupt for one second, I think my delivery is here … sorry. Give you space to cut …

NANJALA: Alright I'm back.

Cool. So I wanted to touch on something you said around the workers within troll farms, and something that I think comes across well in Don's piece about the Philippines: these workers are basically the ones doing the dirty work of setting up the account pages and spreading the lies. And I think it's very easy for us to point the finger and lay some of the blame on these people, but in actual fact, these people are probably just as much victims of this misinformation ecosystem as the rest of us. I'm wondering if you can help us understand who the actual architects of misinformation are. Can we put a human face to them? And how do we shift that focus away from the little guys and onto the top dogs?

NANJALA: This is, again, another reason why I really encourage systems-level thinking and not crisis-to-crisis or moment-to-moment thinking. Ultimately, the patterns that we're seeing with the misinformation industry are so similar to other industries in other capitalist systems: there is an exploited underclass that does the dirty work, because there are almost no other opportunities available to them to make a meaningful living. We've constrained their scope for economic action to a narrow set of choices for various structural reasons.

And the exploited underclass might not have a full understanding of the impact of what they're doing on the global system. And then you have a handful of middlemen, you know, that sort of merchant class, who are making intermediate profits by aggregating the proceeds from the labor of that exploited underclass.

And then you have a group of people who are making super profits, right? The oligarchs, the people at the top of the system, who are sitting on top of piles and piles of money accumulated through the exploitation of everybody below them in the food chain. And it's sort of very banal and very, you know, capitalism 101. But it's so weird that people have been reluctant to take this systems-level approach to understanding what's going on in the digital age.

Why young men and women, especially in countries like the Philippines? Let's take the Philippines as an example. One of the reasons why the Philippines is such a hot spot for these misinformation networks is that it has this large population of English-speaking young men and women who are too educated to do farm labor, to do the kind of low-level employment that would be made available to them in a capitalist agrarian society. And these jobs are presented to them as tech jobs, as jobs that will put them on the pathway towards the kind of sexy lifestyles that we're told tech entrepreneurs are owed, that will put them on the path to becoming the next Elon Musk or the next Jeff Bezos. And it's just fundamentally not true. You know, it's not a pathway to that.

It is a little bit of dirty work that is needed to make this industry take over. And then the middlemen are usually in third countries: people who have that baseline knowledge of the political systems that they're trying to interfere with. So with the US, for example, we had bot farms in the Philippines that were owned by Russian companies with an interest in the American election; this was the 2016 pattern. With the UK Brexit vote, again, these were bot farms based elsewhere that had an interest in influencing the electoral outcomes. Meta, which publishes these reports on coordinated inauthentic behavior on Facebook and all of its other platforms, issued a report that the misinformation ecosystem in the Central African Republic was actually Russians who were trying to influence it, based in, I think it was a European country, I forget where, but Russian speakers who were putting English-language and French-language misinformation into Central African Republic social media, and then people in France who were putting misinformation into the social media of the Central African Republic, promoting contesting narratives. It was basically a proxy war happening online.

And so the middlemen are usually third-country nationals who are not really political, but who see this as an opportunity to make money by influencing the political situation. And that has very strong connections to the analog influence economy: lobbyists, campaign managers. This is a whole industry.

Then you have the people at the top, the people who make the super profits, the people who own the infrastructures, who own the platforms, the people who make money from the middlemen and the politicians. And for those people, obviously, the interests are very clear, but what makes responding to them really difficult is the fact that they are embedded within the most powerful sectors of society.

Right? Elon Musk: everybody knows that he's a problem. Everybody's aware that what's happening on the site formerly known as Twitter is a problem, but he's going to go to dinner with, you know, Senator so-and-so, and they're going to laugh and pretend that there is no problem there. With the capitalist class, they're going to country clubs together. They're invested in each other's lives. And so the people who make the super profits are able to sanitize the work that they do because of their relationship to power and their proximity to power, and people are willing to look the other way because they're not in the trenches doing the dirty work. They're not getting their hands dirty in the way that this worker in the Philippines is getting their hands dirty.

And that's the structure that makes all of this possible: it's so easy to go after the low-level people, because people can tell themselves that these people are not like us, and they're not of us.

It's a lot harder to go after the people who are making the super profits. And yet all of it is part of the ecosystem. And if anything, if you want impact, logically you should go after the people who are profiting the most from the systems that they've built: the sites, the platforms. Their interest in misinformation is that it drives traffic. The big thing they're doing is not selling social networking; they're not selling web space or whatever. They're selling advertising, and advertising requires people's attention, so whatever will keep people on the site the longest is what is going to fester the longest.

And it's that disparity in interest that also makes it difficult to regulate or to govern or to take meaningful action, because these platforms are trafficking in attention, and misinformation, or certainly lies that are very well packaged, drives attention to the sites, for better or worse.

Does English still have a huge linguistic advantage that excludes key people who could be part of this collective fight?

NANJALA: I think it cuts both ways. From a regulatory perspective, there's the dominance of English. The platforms are the main vectors of the misinformation that we're thinking about or talking about, and the challenge is that the platform owners don't do governance and regulation effectively outside of European languages, and, you know, that includes French, it includes Spanish, it includes Italian, whatever. The European governments have been able to demand a higher level of accountability because they represent such a big market.

But if you think about it, there are more people in the world who speak Swahili than there are who speak Italian. There are more people in the world who speak Swahili than there are who speak German. And there are certainly more people in the world who speak Hindi than there are who speak any European language. And yet you don't see the same investments in content moderation, the same investments in structuring policy in deep and meaningful ways that are rooted in the linguistic context, because it's not just about translation, it's also about context.

Do you know why you are prohibited from sharing certain content and not other kinds of content? That's really the regulatory gap that language presents: there's just this idea that the internet user is primarily an English speaker, or certainly a European-language speaker, who is verbal, who is able to navigate, you know, all of these terms and conditions, and who is able to understand the contextual obligations that arise from using platforms. So, yeah, there's definitely still a linguistic gap. In the work that I was doing with the Kiswahili Digital Rights Project, the motivation for me was that I didn't want to continue a struggle or fight for digital rights that was kind of like Moses coming down from the mountain with, you know, the Ten Commandments, with me or the lawyers or the experts, as the custodians of information, descending towards the people and handing out all of these edicts about their digital rights.

I felt that it was more important to empower people to be able to decide what kind of rights context they wanted to live in, to give them the baseline tools to have that agency and creativity. That was the ambition of the Kiswahili Digital Rights Project. So I'm not going to tell you what digital rights are important.

I'm going to give you the information about digital rights in the language that makes the most sense for you, and you get to decide how you want your digital future to be shaped. What do you want from your institutions? What do you want from your community? What do you want for yourself? Which, I think, is not necessarily the first instinct of people who work in digital technology. Again, the emphasis is always on experts: giving power to the experts and giving space to the experts. And I'm not saying that there isn't room for expertise. I think there's absolutely room for expertise. But experts need to be humble. Experts need to listen. Experts need to make space for people's agency and creativity, and give people the tools to express that agency and creativity in the language that makes the most sense for them.

I think this discussion around communities and experts is so important, and it ties in really well to an example that you cite in your piece: the role of community health workers in the rural public health system in Kenya, particularly during the COVID-19 pandemic. Can you talk a bit more about this? Because it's sort of an anecdote in your piece, but I'm really interested to learn more about your experience with those health workers and the role they play.

NANJALA: I wrote about it extensively in my third book, Strange and Difficult Times, because that was basically a journal of my experience of the pandemic. During that period, we had two very strict lockdowns in Kenya where we weren't allowed to leave the city. And then there was a break of about six weeks, and then we went back into lockdown again. And during that six-week period, I got on my motorcycle and rode it up to the northern border with Ethiopia, to the furthest-north national park in the country. And there are long stretches.

I mean, this is really remote. There's no phone signal; you're off the phone signal for, like, two days. There's definitely no internet, no newspapers. The only real connection to the center is radio, and even in some parts the radio signal doesn't really hold. But everywhere I went, I would stop in these very small towns, and the word town is used very loosely, because it was often, like, two shops and a health center, or three or four shops sort of selling the same thing, and a water point. But I would always see a poster about COVID awareness and the importance of hand washing.

And I would talk to people. Because of the remoteness, and it's semi-desert, on the edge of the desert, you are encouraged, as an outsider, to travel with bottled water. It's a shame it's plastic bottled water, but if you get stopped, or if you see a herdsman or people tending to their camels, you share your water with them, and that generally tends to assure safe passage to where you're going. So we would stop and we would share water and ask about life in these places. And, you know, it'd be like: you're the first person that we've seen since the community health worker was here two days ago. These community health workers are not nurses, they're not doctors, they're not RNs. They're trained in very, very primary health care: managing fevers, managing diarrhea, managing conditions to allow children to survive the first five years of life, to help with safe deliveries, to buy people some time so that they can go to the health center in the major town.

And these people, to me, are the real heroes of why Africa navigated COVID successfully, right? Because they are going into people's homes and they're telling them to wash their hands, and they're telling them about this disease. You don't really have to teach social distancing because you're in the middle of nowhere. And again, I go back to information as the center of the system, as a pillar of the system. How does information travel?

One of the things that happened in the West when the COVID pandemic broke out, and I was in Italy, I left Italy just when they started their lockdown, like I literally left and two days later they were shutting down the country, is that everybody's dependent on the internet to mediate their information.

And that means that if misinformation is seeded, it travels very quickly. It reaches people even in remote places, and it's able to shift people's perspectives towards specific things, in this case the mistrust of vaccines and all of that stuff. But because information is still human-mediated in so many parts of Africa, the community health worker is really the spine of the healthcare system in largely rural societies. If you have a core group of well-trained people who are able to go and have these almost intimate relationships with people, then they're able to slow down the spread of misinformation.

People looked at Kenya and other African societies and said: well, because you don't have XYZ, you are not going to be able to survive this. And we didn't just survive it; economic and social circumstances aside, we came out of it in much better shape than people expected us to. And I think part of that is: what if you looked at how information behaves in specific societies? What are the mechanisms through which information is shared? What are the obstacles and the opportunities? And that's what I'm really trying to encourage people to do with this concept of the information ecosystem: take a step back. Just because a person is educated doesn't mean that they are inherently superior at processing information than a person who isn't educated. This is why I criticize media literacy as the response to misinformation. I think Samira gets to this in her piece as well: people can be smart and misinformed. People can be highly educated and have bad intentions, and people can be uneducated and successfully navigate the misinformation networks.

Formal education is good and it's important, but there are very many smart people who don't have formal education, and our approaches to thinking about misinformation and addressing misinformation cannot be premised on the idea that formal education is going to be 100% bulletproof.

We have to think more holistically about what it means to give people the tools to navigate an information ecosystem, regardless of how much or how little formal education they've had, which I don't think a lot of these media literacy initiatives take into account.

That’s an interesting point as well, because obviously there's a need for media literacy and obviously there are benefits to it, but who is the one delivering that media literacy? It feels like we're at a crossroads where social media is only poised to become a bigger part of our lives. And if these tech companies have their way, even the people in the most rural parts of Kenya that you were just speaking about will eventually be online too, and they'll be on Facebook, and they'll all be connected. Can you maybe lay out what the path, or the paths, in front of us are, and where we go from here?

NANJALA: I don't know that we're going to get to a point where everybody is online. I don't think we are. The more I look at it, the more I think that we might get to almost like a natural J-curve, like we've hit a saturation point. Or maybe everybody will be online, but everybody might not be in the same place, might not be online in the same way. So I feel like it's important to say that our digital future might not be homogenous, and will probably not be homogenous. We might just need to prepare for a future where people are online in different ways.

The second thing is: what do we do next? I think there are calls to action for different groups. As citizens, as ordinary people in the world, you know, the most important thing that we need to do is become more conscious in the way we consume and disseminate political information, and be more aware of how our political behavior is shaped by the information that we consume. And this takes us beyond media literacy, in the narrow way that it is defined, to becoming conscious consumers of information.

I think there was a window in which we were conscious consumers of information, and people read newspapers carefully, and people watched the Nine O'Clock News carefully, and people listened to the radio carefully. And I think we need to bring back that intentionality to the way in which we engage with digital information as part of this broader information ecosystem. I think for those of us who work in digital rights, I think it's very important for us to be more plural in the way we see the world and to sort of deepen our understanding of the global nature of the issues that we're contending with.

And that's really what this issue puts on the table. We're not just narrowly looking at the Western experience and saying, well, this is what happens to the world. And we're not looking at experiences in the global south and saying, well, these are unique, pathological examples, these people are weird and this is why they're doing weird things, which has been the tendency in a lot of digital technology scholarship. We're saying that we are all of the world, and the same phenomena might reveal different outcomes in different contexts, but ultimately we are all human beings, and we're united by that human experience, even if it shows up differently depending on our social and political context.

And so the challenge for digital technology scholars is: what does a truly global approach to misinformation look like? It's a complicated challenge, but I think that's where the solution starts to coalesce. And for those who have power, you know, the governments and the people who are regulating, I think it's really about trying to separate the interests of capital and capitalism from the interests of the people, and being very sensitive to this creeping idea that more money equals more democracy. I think the neoliberal lie that unchecked consumption would lead to more democracy and peace is the narrative that took shape in the 1990s, when the Cold War ended, and we're reaching a point where we're seeing the limits of that approach. The challenge for those who are regulating is to really be more sensitive to the capitalist, neoliberal underpinnings that shape behavior and action. And for those who are profiting from the system, I always like to remind people that you can't out-grind a failing system.

When I see people in the US and in Europe take their democracy for granted by enabling fascism and enabling extremism in the name of 'these guys are making me money, so I'm gonna look the other way': you're playing with fire. These are not the first billionaires in history. We might have the most billionaires in history, but they're certainly not the first, and ultimately, if society fails, all of that money turns into paper and a couple of zeros on a balance sheet.

It doesn't save you. So it's a very grim note on which to end, but that's always my message to the people who are profiting from these systems. You're not going to escape the blowback.

That was Nanjala Nyabola, the guest editor of the latest issue of New Internationalist magazine on disinformation.

New Internationalist is an independent media co-operative based in the UK. We’re proud to be co-owned by our readers and staff. We’ve been publishing award-winning, in-depth analysis on progressive issues since 1973 and we’re really excited to present this new podcast series to you.

If you liked the show and you want us to make more episodes like this, please leave us a review in your podcast app - it really helps other people to find us. And if you don’t already, please consider becoming a subscriber. You’ll gain access to our latest issue on Disinformation as well as our full 50 year archive.

Listeners to the show can also get 20% off their first year of a print or digital subscription to New Internationalist by using the promo code THEWORLDUNSPUN at checkout.

That’s it from us.

This episode was hosted and produced by Maxine Betteridge-Moes. I’m the digital editor at New Internationalist. Co-editors of the magazine are Amy Hall, Bethany Rielly, Conrad Landin and Nick Dowson. Our editorial assistant is Paula Lacey.

Our theme music has been produced by Samuel Rafanell-Williams and our logo design is by Mari Fouz. Audio editing is by Nazik Hamza. Special thank you to Impress, the UK’s independent media regulator.

See you next time.