Security Visionaries Podcast


On the latest episode of Security Visionaries, host Max Havey dives into a conversation about the complex world of election security with Shamla Naidoo and co-host Emily Wearmouth. As a distinguished law professor at the University of Illinois, Naidoo offers a fresh and valuable perspective on the influence of rapidly advancing technology on our electoral processes. The episode looks at aspects of election security such as voter registration and physical controls at polling places. Naidoo's biggest worry, though, is not about compromise at the polls, but the alarming rise of psychological manipulation through misinformation and disinformation, emphasizing the need for consumer education and awareness in combating these deceptive tactics. Tune in to hear critical discussions on combating these threats and tangible tips for consumers on how to adopt a discerning attitude towards information.

As a cybersecurity practitioner, I’ve looked very hard at all the places and opportunities available for compromise, because that’s what I do. And I will tell you that I’ve identified the places and on the day of election, there’s no opportunity for large-scale compromise to the digital systems.

—Shamla Naidoo, Head of Cloud Strategy and Innovation at Netskope

Timestamps

*(00:01): Introduction*
*(01:31): Shamla's perspective on teaching a class about digital election law*
*(04:43): Impact of technology on elections*
*(11:38): Importance of physical controls in polling places*
*(16:34): Challenges of misinformation and disinformation*
*(20:53): Government initiatives in combating misinformation*
*(25:45): Consumer education and awareness*
*(28:06): Tips for discerning consumers*
*(29:10): Significance of informed voting*

 


On this episode

Shamla Naidoo
Head of Cloud Strategy and Innovation at Netskope


Shamla Naidoo is a technology industry veteran with experience helping businesses across diverse sectors and cultures use technology more effectively. She has successfully embraced and led digital strategy in executive leadership roles such as Global CISO, CIO, VP, and Managing Partner, at companies like IBM, Anthem (Wellpoint), Marriott (Starwood), and Northern Trust.

Emily Wearmouth
Director of International Communications and Content at Netskope


Emily Wearmouth is a technology communicator who helps engineers, specialists and tech organisations to communicate more effectively. At Netskope, Emily runs the company’s international communications and content programmes, working with teams across EMEA, LATAM, and APJ. She spends her days unearthing stories and telling them in a way that helps a wide range of audiences to better understand technology options and benefits.


Max Havey
Senior Content Specialist at Netskope


Max Havey is a Senior Content Specialist for Netskope’s corporate communications team. He is a graduate from the University of Missouri’s School of Journalism with both Bachelor’s and Master’s in Magazine Journalism. Max has worked as a content writer for startups in the software and life insurance industries, as well as edited ghostwriting from across multiple industries.


Episode transcript


Max Havey [00:00:01] Hello and welcome to another edition of the Security Visionaries Podcast, a podcast all about the world of cyber, data, and tech infrastructure, bringing together experts from around the world and across domains. I'm your host, Max Havey, senior content specialist at Netskope, and today we're talking about elections. It's the year of the election, so we would be remiss if we didn't dedicate an episode to election security and cyber threats to democracy. And as luck would have it, regular guest and friend of the show Shamla Naidoo is, among many other things, a law professor at the University of Illinois and teaches a class about election security. So, chatting with her ahead of recording another episode, we learned about this and we just had to get her on. So, Shamla, welcome back. Glad to have you here.

Shamla Naidoo [00:00:41] Thank you for having me, Max. It's great to be back.

Max Havey [00:00:44] And for the first time ever, we also have my co-host, the great Emily Wearmouth, as a guest. There have been suspicions that we're a bit like Superman and Clark Kent or Bruce Wayne and Batman, never in the same room together. But we are here to dispel those myths. Emily, welcome to the show.

Emily Wearmouth [00:00:59] It's so good to be here with you, Max. I'm excited. We both exist.

Max Havey [00:01:03] Absolutely. We do indeed both exist. And the reason that we're both here today is, frankly, we both really wanted to get in on talking to Shamla about this, with me sitting here in the U.S. with eyes on the elections in November and Emily sitting in the UK promising to interrogate Shamla about wider international elections and democratic systems. So, without further ado, let's jump in. Shamla, to get things started here: how has teaching a class about election law in the digital world changed the way that you sort of look at election season here?

Shamla Naidoo [00:01:31] You know, Max, one of the things that I do is I teach, and many of the people in my classes are young adults, without kind of the benefit of having 40 years' worth of life and corporate experience to actually look at all of this as a holistic issue. And so I've had to figure out how to break down the problem into small parts, simply because when we talk about elections, people often think about election equals good, bad, you know, compromise, not compromise. Fair, unfair. We think about it in very much a binary way, and we need to be thinking about this in small parts, because an election is an entire process, and it has a workflow that actually extends from before the election, during the election, after the election. And each of those has its own opportunities. They have their own challenges. They have their own digital footprint. So one of the things that I've had to do to teach this topic more effectively is to break this up into smaller parts, so it's consumable and we identify the issues in a very unique way, because when we can identify the issues uniquely, we can come up with solutions. But if we just throw everything and the kitchen sink into this election security or election fraud or election compromise, it seems too simple. And one of the things that I learned most of all is to break this up into consumable chunks of conversation.

Emily Wearmouth [00:03:01] Shamla, how old are the people that you're teaching? I'm wondering whether some of them might never have, participated in a democratic process. Have they all voted before?

Shamla Naidoo [00:03:10] Actually, most of them would have, because this is a kind of postgraduate study program. So most of them would have participated in an election of some sort. However, I do from time to time find students who are foreign students in the US studying postgraduate here. I also find immigrants who have just been naturalized into the US who have never participated, but they are adults, and so there's a good mix. One of the things I do is I have my students do an exercise of saying, go register to vote in an election. And if you've already registered, figure out what happens if you try to register again. I also tell them if you've registered and everything is fine with your registration, go find someone in your community who you can help to register to vote. Somebody who's coming into the process just now, or somebody who's not old enough to understand how to navigate all of the technology that goes with this. So I have them go through and do this step by step to learn and to observe. Right, because it's just very easy for us to talk theoretically; until you actually do it, and you do it with the lens of what could go wrong with what I'm doing, it doesn't enrich the learning process.

Max Havey [00:04:33] Absolutely. And Shamla as you've been teaching this class, what are some of the biggest ways that you're seeing evolving technology impacting elections here? Like are you seeing evolving technologies impacting elections?

Shamla Naidoo [00:04:43] Yes, definitely I see this. But again, let's break it up. And I'm going to talk specifically in the US because that's the system I know best. Right. So in the US before the election, the technology that's used is to help people to register to vote. To capture all of their personal information to make sure that they are eligible. To do all of that, kind of record who you are and express your right to vote. Right now, what you find is you can go online. These are internet-connected systems. You can go online. You can register. You put in all your information. And, you know, identify yourself, etc. Now could you masquerade as someone else? Absolutely. So the opportunity for fraud does exist. Another common way that people register to vote is you might have canvassing teams who go out into the communities and encourage people to register to vote, and they might collect all your information, either on a piece of paper with a clipboard, and pretty much take it back to an office environment where they might have the technology and capture all that information on your behalf. Now nothing stops them from making up some of that information. Nothing stops them from tampering with the information and adding additional people, additional records, etc. So that's that. The registration to vote. Is it open to compromise? Absolutely. Can someone hack into that system? Add records, take records out, change your name from Max to Matthew? Absolutely. Those opportunities exist. But remember, these are kind of preparatory systems. They're not the actual voting systems. So just because that is exposed to some measure of fraud and other types of compromise doesn't make that the end result. So you could end up with a voter roll that looks extremely large or extremely short. But it doesn't mean that that is the authentic way. Remember, when you go into a polling station, you identify yourself. They look for you on the voting roll. Then you really have to take additional steps to validate that you are who you say you are, and that the information captured about you in the voter registration is accurate. If not, there's a whole bunch of administration that goes into correcting that information. So the validation happens on the day of voting, on the day of elections. Prior to this, there's technology that actually helps you do this, helps you do it well. You might have electronic validation, you might have checks and balances, etc., and you know there are systems that are open to compromise as well. On the day of election, though, that's when you have the validation, because only after you pass those checks and balances can you actually go in a booth with the ballot and vote.

Emily Wearmouth [00:07:46] It's interesting, and I know you said you're an expert in the US system, and I'm playing through sort of international translation. We have very similar processes in the UK where the primary digitization opportunity has happened on voter registration, but it's like, on the day you go in, on polling day, it's your multi-factor authentication that you've also walked in. And for the first time last year, we had to take ID, and it was very antiquated before that, that the ID shown in person becomes your multi-factor authentication to protect against risks on the digitization process of registering to vote. It's sort of that multi-touch check that you don't think about when you think about these processes in isolation.

Shamla Naidoo [00:08:25] Absolutely. And so, you know, every election will have some measure of kind of voter registration unless you show up. And I remember doing this right in South Africa in the first democratic election. There was no way to do voter registration. And even if you did, they would just, you know, tens and hundreds of millions of people going to vote. So essentially what happened is when you went to vote, you got a cross put on your skin that wouldn't wash off for seven days. And so once you had that, you couldn't go vote again somewhere else in a fraudulent way. You were done, right. You had voted. And so it's steps like that that countries take in the polling station that actually do, like you talked about, the multi-factor authentication, but there's also, in the US anyway, a number of physical controls. So we depend heavily on physical controls in the polling place. Everything from video cameras that track every single action, that will track you from the moment you walk in till the moment you walk out, making sure that you didn't do anything untoward. There are cameras, there's audio recording equipment, and then there are physical election judges who are neutral. They are not affiliated with any particular party. They are meant to be observers. And then there are some judges who will actually be arbitrators or mediators of any kind of contention. So, for example, if I show up and I said, I'm on the voter roll, here's my ID, I need to vote, and they don't find me there, they are still allowed to give me a ballot so that I can vote. The administrative personnel will take that ballot, and they'll set it aside for extra attention later so that I don't get upset. They don't create contention in the polling place. They give me a fair chance to vote, but that doesn't mean my vote automatically counts. There are additional steps that get taken, so there's a lot of physical checks and balances that occur in the polling place to make sure that it's fair and that it's accessible to everyone who is allowed to. But just because you were allowed to have a ballot doesn't mean that the ballot gets counted. And so I would argue that the physical controls in the polling place are probably the most robust. And I have, for one, observed it. I've been an election judge. I have been a polling place observer and I watched all of these steps. Additionally, as a cybersecurity practitioner, I have looked very, very hard at all the places and opportunities available for compromise, because that's what I do. And I will tell you that I've identified the places, and I'll tell you that on the day of election, there's no opportunity for large-scale compromise to the digital systems. Remember, in the polling place in the US, the voting systems are not connected to the internet at all.

Emily Wearmouth [00:11:37] Right. I didn't know that.

Shamla Naidoo [00:11:38] They might give you a screen, a touch screen to convert your paper ballot into a digital form. There are two things that happen, though. Those ballots get printed for storage and for future reference. And the digital aspect, all it does is it takes what you have on paper and converts it identically to a digital form for the purpose of making counting easier. But it gets reconciled in an auditing process with the paper version. So even if you compromise that, and the systems are local, so even if you compromise a local system and let's say you took Emily's vote and gave it to Max, or you converted everyone's vote to a particular party or a particular candidate, the paper-based ballots will not match. So in that auditing process, it would be very, very easy and very quick to discover an anomaly. And then you go back to the video footage, you go back to all of the steps and all of the records, and you decide whether or not Emily's ballot should be counted, or whether it's compromised in a way that compromises or puts into question the result. So those are things that have been done, and I think they've been done very well. And so I tell people all the time, unless we break this up: on the day of voting, prior to voting, and then after voting. So if there's an anomaly, they set all that aside, and they might set your ballot aside because it's the only one that seems like, you know, we've had situations where people have said, I touched the screen for this candidate, and it actually counted it for a different candidate. Well, we know touchscreens are notorious for being sensitive, and you could touch something or barely touch something and it records it. And so your intention may not have made it to the screen, but it was your action. And so there's a number of validations that say, are you sure you want to vote for this candidate? Are you sure you want to put your checkmark here?

Emily Wearmouth [00:13:46] I find that so reassuring as a fat-fingered person.

Shamla Naidoo [00:13:50] And so I feel very confident that these systems are not open to these large-scale compromises. And if you have an issue, what it's going to be is because I used my finger and I pressed the checkmark in the wrong box, but I get the opportunity to validate that, to confirm it. And if I've confirmed it, well then that's too bad. I just voted for the wrong person because I didn't take enough care. This is not a hacker coming in and changing my vote from one candidate to another, because in the ballot you are in control, the voter is in control, and if the system looks like it's acting up, that's why you have observers and you have judges. You go out and you tell the judge that this thing is malfunctioning, and they will have procedures that are pre-documented to help you resolve the issue, like scrap the whole thing, start again, give you a new ballot, put you in a new booth, give you access to a new system. So there are many, many ways to stop these things from happening. And I feel very assured that when I go into a polling place and vote, that my vote is done exactly the way I intend. But if I have an issue, there are procedures and steps that can help resolve it, either at the moment or later. So, for example, if I wasn't on that voter roll there, so they give me a ballot, I'll still vote. It will not get counted until that extra step has been taken, which is usually after the election. So post-election, all the counting, all the validation. Many times in the US, when the polls close, all of those local manual systems will get converted to a USB drive. And so you'll dump everything that went from your touchscreen to the system that then got counted. It will get put onto a USB stick that then goes with the count. And again, post-election or post-voting, they're looking for any kind of anomaly. If the numbers don't match, if you got more on the USB drive than you actually have printed, than you have counted, then that requires an investigation. There's all kinds of procedures to go back, look at that, investigate it, determine what's the actual fact. And then that's how you come up with spoiled ballots.

Max Havey [00:16:16] So Shamla, knowing that there are all of these analog sorts of checks and balances and processes and procedures here, there is not much of a cybersecurity worry when it comes to a lot of these election processes. But zooming out a little bit on the broader scope of elections, what does worry you about elections in our digital world?

Shamla Naidoo [00:16:34] Here's what I think and what I worry about. And this is what I tell my students to be aware of. Psychological manipulation is a very big threat, simply because it's all techniques of saying, whatever I tell you, whatever you believe, that may be how you will vote, because I cannot hack the election systems if I'm an adversary. I'm not going to be able to go out there and change all the votes to go to one candidate or another, because we talked a little bit about the physical controls, etc. So the way I can do that, though, is if I tell all these people a lie and then have them go in the polling place, they're probably going to vote based on what they know and what they think is factual. So I think that psychological manipulation is probably the biggest challenge to elections. And how do you manage the threat of misinformation? And by that I mean false information, even if you didn't have the intent for distribution in a broad way. The misinformation is false information, and so we do get false information in the hands of a consumer. The second thing would be disinformation in my mind, which is also false information, but it's created with the intention of wide distribution. So you really want the information to just go viral. And those are two things that I think create the psychological manipulation. And today some of those can be so authentic-looking even if they're false. Right. You could get an article that looks like it was published by some major magazine globally. And if I don't know how to validate the publisher or the publication, if I just look at it on its face, it could look like a very professionally typeset article. It could look like it's done with very sophisticated technology, the right kind of fonts and colors, etc., but with messaging that's false, but that looks real. And so to me, I think if you psychologically manipulate societies, you're going to get a result in the polling station that's not what might have been intended. That to me is probably the biggest challenge. And then of course, with the technology, it's so easy to write things that look professional. Right. You just go to any of these open AI systems, even if the information is false, it could look very fact-based and very authentic. So that's a challenge. And now what you're seeing with the deepfakes, you're seeing the person you know and trust telling you, their mouth is moving, the facial expressions are real, they're telling you things that they want you to believe. And because you believe them, you might actually take the words coming out of those videos and other kinds of media as fact, when it was never that person. And maybe those words were never theirs, and maybe it was never true. And so it's becoming so easy to do those things. It's becoming affordable. Anyone can do it with a few clicks, mostly for free. So just think about that and you put some money at it, and you can create an entire media machine that can psychologically manipulate an entire society. Those are the things that worry me more than whether somebody can hack the technology.

Emily Wearmouth [00:20:10] One of the discussions that we're seeing emerge in the UK around an election, when, if we ever get one, is they've changed the spending limits for political parties in the run-up to the election. And one of the big drivers for increasing the spending limit has been social media and digital, and the opportunities and the requirements for parties to properly campaign through those platforms. And I wondered, knowing that this dis- and misinformation is an issue in markets around the world, whether you've seen any particularly interesting moves from governments in any markets to try and mitigate these risks of misinformation and disinformation in the run-up to their elections, whether in the US or anywhere else. Is anyone doing anything clever to try and tackle this?

Shamla Naidoo [00:20:53] I would say there are probably countries out there, but they are tackling this problem not as an election problem, but rather as a misinformation problem. And so many are creating legislation that does a number of things in terms of who creates the information, making sure that people are taking ownership and accountability for the material that they create, the content they distribute. I know there's been some talk in the US about social media companies, in particular, putting metadata tags and labels on these types of videos and information that gets created, telling people or telling the consumer that this was created using this type of technology so that they don't think that their candidate has created it, for example. The other thing that we have seen, in the US in particular, is where a candidate has to endorse a message for it to be considered authentic, but even that could be spoofed nowadays, right? Yeah. Because just like they can go out and tell you things that they didn't intend to tell you that are not factual and not them, you could say, well, I am this candidate and I approve this message. So there's a lot of opportunity there for manipulation. I actually think that it's a global issue. And when we get the AI legislation around the world passed and approved and adopted, we might fix some of this, because this is not just an election issue. This is across the board. What do we do to societies when you keep feeding them information that's not accurate, right? If you just go back and you look at, for example, the Cambridge Analytica scandal, right. Exactly the same thing. They didn't have access to AI, but they had something close. They did exactly this, which is find people's kinds of vulnerabilities and weaknesses, and then just keep playing on that by psychologically feeding them information that would enrage them, information that would force them to act even in ways that were counter to their self-interest. And so we just have a lot of it going on. And as I think about this problem, I look at the cybersecurity technology teams and I look at the precedent we've created, right? Cybersecurity leaders do not always have the opportunity to stop bad things from happening. So just a simple thing like a phishing email can land in your mailbox. And on the technical side, we can try to do as much as we can to stop it. But every so often one is going to creep into your mailbox. How do we then stop the impact? What we do is we create awareness and education programs where we keep telling you, year after year, quarter after quarter. We keep telling people, be aware of these kinds of emails. These are the hallmarks of a phishing email. Don't open it. And if you open it, don't respond to it. And if you respond to it, quickly call for help. Let us help you recover from that. And so we have precedent here for issues that come up where we don't have direct and full control. I think we've got to double down on education and awareness, just like we do in every other cybersecurity topic, which is, you know, let's warn people about the manipulation attempts. Let's make sure we provide them with awareness education of what to do, what kind of tools they can use to resist becoming a victim of this kind of psychological manipulation. How can they triangulate the accuracy of the information? Who should they trust? How should they find either neutral sources of information or trusted sources of information?
So there are a lot of things that we can teach people to do because frankly, in my opinion, the last frontier is the consumer. We could do all kinds of things on the technology side, we could do all kinds of things on the process side. We can even do a lot of things on the regulatory side. But people believe what their friends tell them. And so we need to lean on these practices that we know have helped us before. And we have a playbook for it. Let's start educating people about how to avoid becoming the useful idiot that people take advantage of and use old spycraft tactics on. And so let's find the consumer. Let's elevate the consumer's IQ. Let's figure out how to educate them about being aware of their actions and the information they consume as fact.

Max Havey [00:25:45] Yeah, absolutely. My last question here was going to be about what sort of advice you would offer to the consumers out there in the world of elections. But, I mean, I think that's kind of it: giving them the tools, having the strategies to know how to deal with the sort of disinformation and misinformation that they're seeing, and trying to get that legislation passed through so that we don't have to deal with this, or that this is no longer the primary threat that we're dealing with as we're watching elections. Is there anything else in the advice frame there, Shamla, that you'd want to offer as parting words here?

Shamla Naidoo [00:26:13] For elections specifically, I'd say people should be aware of manipulation. But more importantly, when you look at a piece of information, you need to be a little bit more discerning about who authored this information. Ask yourself, why should I believe this? You know, and what is the interest of this person who supposedly authored this information? What is their interest in this position? And as a consumer, how will my behavior change after consuming this piece? And then I should triangulate that information with other sources, because often what happens is in these pieces, whether they be videos or written articles or just comments from people you know and trust, often what'll happen is you look at that and you think they're trying to convince you that there are only two positions, theirs and the other person's. And so what we need to do as consumers is say, is there a third position? Is there a fourth position? How would I react to all of that? The consumer is powerful, but we need to take back our power. Social media has just created this ongoing scrolling of information. The eyes and the brain begin to relate to things you see over and over again. But if you just take a step back and you say, what is the goal of this person? Why should I believe this? Ask those questions. Who authored this? What's the interest in the outcome? And then ask yourself, as a consumer, what would I do differently after consuming this piece, and then go introspectively to, should I? And the last thing I would say is people should be judicious about what they share. Don't become part of the machine that distributes this information and makes it go viral, no matter how compelling it might seem and no matter how credible it might seem.

Max Havey [00:28:06] A healthy dose of skepticism goes a long way on the internet these days, and I think being judicious about, what are the intentions behind this? How is this impacting me? How is this serving other interests that are not my own? I think that's an important way to think about this, and an important way to move through the world and on the internet and such. That's fantastic advice, Shamla.

Emily Wearmouth [00:28:24] And I think Shamla also hit upon a potential episode title, because I think "How Not to Be a Useful Idiot" would probably be one of our best podcast episode titles ever.

Max Havey [00:28:36] Oh, that is very, very true.

Shamla Naidoo [00:28:38] And the goal is to avoid getting manipulated into amplifying somebody else's false message. That's the only goal. The only role for the useful idiot.

Max Havey [00:28:51] Fantastic.

Emily Wearmouth [00:28:51] Nailed it. Shamla.

Max Havey [00:28:53] Shamla. Emily, this has been a fantastic conversation. I think we could probably keep talking about this for a while, but, I can see our producer waving at us that we need to wrap things up here. So, Shamla, thank you so much for coming on here. This was a delightful conversation and so insightful. Is there anything further you'd like to add before we let the people go?

Shamla Naidoo [00:29:10] Vote. But be an informed voter.

Emily Wearmouth [00:29:12] Yes. Vote. Be an informed voter.

Max Havey [00:29:15] Absolutely. And Emily, thank you for joining as well. It's so rare that we get both of us on an episode like this. So this is a true delight.

Emily Wearmouth [00:29:21] An absolute pleasure. Max, thank you very much. I feel like I'm a guest rather than a host. It's marvelous.

Max Havey [00:29:26] I know, shoe's on the other foot this time. Well, you've been listening to the Security Visionaries podcast, and I've been your host, Max Havey. If you enjoyed this episode, please share it with a friend and subscribe to Security Visionaries on your favorite podcasting platform. There you can listen to our back catalog of episodes and keep an eye out for new ones dropping every other week, hosted either by me or my co-host, the great Emily Wearmouth. And with that, we will catch you on the next episode.
