Connect
Finding Balance Between the Physical and Virtual Worlds
Virtual assistants, social media platforms and other interactive technologies are enhancing our personal and professional lives in many ways. But do they pose a threat? Can they put our personal information and business intel at risk?
As more people and businesses flock to the virtual world, it’s important to ask: Who is actually benefiting? More of us are quickly opting into agreements, freely exchanging data and openly broadcasting our daily activities without stopping to understand the implications.
In this episode of Connect, our guest Tim Wenzel draws upon his experience as former Head of Global Security & Protective Intelligence at Meta, and his time spent with the US Department of State, to share expert advice on leveraging intelligent tech and real-life relationships in order to strike a balance between the physical and virtual worlds.
We’ll also delve into risk management and the benefits it offers to individuals and enterprises alike. We’ll learn why data is the new currency and how it drives everything we do. And, as we speed toward the metaverse, we’ll discuss the stewardship necessary to ensure that technology continues to benefit society and future generations.
For more information about Axis Communications, visit us at www.axis.com
Follow us on social media at
Axis Communications - Home | Facebook
Axis Communications: My Company | LinkedIn
Axis North America (@Axis_NA) / Twitter
Axis Communications USA - YouTube
Scott Dunn (00:05):
From AI and deep learning to cybersecurity and IoT, keeping up with technology can be challenging.
James Marcella (00:14):
Our podcast is not just about helping you keep up, we're inviting you to the precipice of what we now know is possible.
Scott Dunn (00:21):
Join us as we interview industry luminaries and trailblazers to hear how they're leveraging technology, navigating the pitfalls, and predicting the future.
James Marcella (00:32):
Together, we'll explore today's most timely topics, combining human imagination and intelligent technologies to discover new solutions.
Scott Dunn (00:43):
I am happy to introduce my co-host, Mr. James Marcella, security expert and industry association leader.
James Marcella (00:52):
And it's a pleasure to introduce Scott Dunn, technology innovator and award-winning speaker.
Scott Dunn (01:00):
And this is Connect, a bimonthly podcast from Axis Communications. On today's episode, I'll be talking to Tim Wenzel about data protection, risk management, and how technology connects us all. Welcome to the podcast, Tim, it's great to have you here. I'd like to begin, if you could just take a couple of minutes and tell the audience about yourself, your background, and what you're doing today.
Tim Wenzel (01:32):
Well, thanks for having me, I appreciate it. Tim Wenzel, I'm a global security leader. I started off as an Army medic, I did the invasion of Iraq in 2003 and ended up at Abu Ghraib prison after the scandal; our unit actually relieved that unit of duty. Then I went home and worked on an ambulance, because I was a paramedic, and it was super boring. So I went back as a contractor for three years, then I got into executive protection, did some work for the Saudi royal family in the DC area, and then moved on to the State Department, where I taught. And then I ended up spending nine years at Facebook/Meta, where I was until last month.
Scott Dunn (02:15):
That's a pretty fascinating road. In the physical security career journey, it's fairly common to see ex-military; I'm ex-military myself, we see a lot of that in this industry. And you've been doing this for a while, and at one point you were the Head of Global Security and Protective Intelligence for Meta. So tell us a little bit about that, that's pretty fascinating.
Tim Wenzel (02:39):
Essentially, I came to Facebook in 2013 as a consultant with AS Solution to rebuild Mark's executive protection program. And during that rebuild, the CFO at the time asked me, how do we know if Mark and Sheryl's conference rooms are actually private? When they're having conversations about the new markets we want to get into, our new products that are coming up, acquisitions, how we're going to expand the company, how do we know that competitors aren't hearing these things and beating us at our own roadmap? And I said, I don't know, I just got here, how do you know?
Scott Dunn (03:19):
That's awesome. I'm the new guy, I don't know.
Tim Wenzel (03:21):
Exactly. And he laughed and he was like, cute, Tim, why don't you figure that out? And I said, okay. So we started a very basic TSCM program, a technical surveillance countermeasures program, which is something a lot of companies do. But I quickly realized that just doing 'sweeps' of rooms randomly doesn't really ensure a lot, because what happens in the meantime? And so I started developing a program around privacy for some of the people, teams, and company activities that were at the highest risk for what we call physical data loss. Essentially, if you had information that you wanted to keep private, trade secrets and roadmaps, out of the hands of competitors and out of the hands of other governments, how would you go about that? Who has this type of information, how do they work with it, and where do they work with it?
(04:19):
And so I started this program of bringing all the bits and pieces, all the people, all the teams, all the workspaces that house this really sensitive information, and feeding them into this program based on managing the risk around that information, not just making sure that a room was free of bugs, if that makes sense. And that went on for a lot of years. I was actually in the executive protection organization for a very long time, until 2021, when I switched over to the Intelligence and Investigations organization, which is when I became the head of protective intelligence and technical countermeasures.
Scott Dunn (05:03):
I'm sure you've got quite a few stories that would be fascinating to hear, it had to be extremely challenging. I'm curious, how did you interface, so you were protecting information as well as the people, how did you work with the information technology folks? What was that relationship like?
Tim Wenzel (05:21):
Really good. So when I step into a space that I have to share with other people, I take an interesting approach; I think it's interesting, at least. One, I assume I'm the dumbest person in the relationship, because it's mostly true: everybody else is usually way better at their jobs and far more technically competent than I am. And so when I come into a relationship, I work to understand why I am here, what problem I need to solve, and why it involves all of you. And when you bring really smart people to the table and give them a problem that involves them, one that individual departments cannot solve, that we have to do collectively, and you just let smart people do their work, helping them document it and making sure you're hitting your goals and moving forward, apparently smart people get a lot of really good work done.
(06:16):
And so all I really had to do was make sure the right work was getting done, that we were building the program in a way that met the needs of our executives and the different programs, but also the needs of these other departments. And all of a sudden, people really want to work with you when they know, one, they're going to get their credit; two, they're going to do really good work; and three, the work is interesting and it's going to be fun.
Scott Dunn (06:39):
I get that risk management itself isn't one-size-fits-all, but there have to be some commonalities, some basis for risk management that you deploy as a common set of tools and then tweak for each case. So what would those look like? What are the common, foundational things that you assess right away?
Tim Wenzel (07:06):
I subscribe to the program of Enterprise Security Risk Management, ESRM, which became really popular with ASIS around 2017, 2018, and I was actually part of the initiative rolling that out with ASIS, which was a lot of fun. But the reason I really like ESRM is because you do a couple of things. One, you look at the assets that a company, an organization, or a nonprofit has, and you identify and prioritize those assets: what are all the assets we need to protect, which ones are the highest priority, and why? We can collect a list of the assets, but it's the business that has to decide how high a priority they are and what the impact of their loss would be. Then you have to identify and prioritize the risks to those prioritized assets. And after that, you have to mitigate, manage, or treat (the term depends on your industry) those risks to those prioritized assets. All of that is bringing information to the business and having them answer the questions.
(08:14):
And so I really enjoy that type of work, where you're actually working with the company, the executives, the business verticals within the company, asking, how impactful would this be? And then, what's the most appropriate way to treat it? Where security professionals get mixed up is, we think we need to mitigate everything, which is what we want to do; we're in the business of keeping people and things safe. But the business doesn't always see it that way, and the cycle continues and continues, so it's really a cycle of conversations and learning. And that's what I would say: I'm a professional learner. I would rather learn what's going on today and what we're dealing with today than assume I already know based on something I did five years or a decade ago in a different context.
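To make the ESRM steps Tim describes concrete, here is a minimal sketch of a risk register in Python. It is an illustration only: the asset names, scoring scale, and sample risks are hypothetical, not drawn from the conversation, and a real program would capture the business's own priorities and treatment decisions.

```python
# Minimal ESRM-style risk register sketch. Assets and risks are hypothetical.
from dataclasses import dataclass

@dataclass
class Risk:
    description: str
    likelihood: int  # 1 (rare) to 5 (almost certain), set by the business
    impact: int      # 1 (negligible) to 5 (severe), set by the business

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

@dataclass
class Asset:
    name: str
    priority: int  # 1 = highest; the business decides this, not security
    risks: list[Risk]

def prioritized_risks(assets: list[Asset]) -> list[tuple[Asset, Risk]]:
    """Rank every (asset, risk) pair: highest-priority assets first, then
    highest risk score. The ranked list drives the treatment conversation
    (mitigate, manage, or transfer) with the business."""
    pairs = [(a, r) for a in assets for r in a.risks]
    return sorted(pairs, key=lambda p: (p[0].priority, -p[1].score))

register = [
    Asset("Executive conference rooms", 1,
          [Risk("Eavesdropping on acquisition discussions", 2, 5)]),
    Asset("Product roadmap documents", 2,
          [Risk("Leak to a competitor", 3, 4),
           Risk("Accidental public posting", 2, 3)]),
]

for asset, risk in prioritized_risks(register):
    print(f"{asset.name}: {risk.description} (score {risk.score})")
```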
Scott Dunn (09:02):
That's great. I like that 'professional learner' comment, I like that a lot, because a lot of what we do in the physical security space is learning about what the challenges are, and the challenges never end, they continue to evolve. You mentioned AI and how it's used, and it brings to mind this whole concept of data: everything is data today, and coming from Meta, of course, I know you understand that really well. To customers and businesses, it's their lifeblood, it's something that's really sacred. So it begs the question in my mind: if I were meeting with you, I would say, well, what should I be doing? What should every company do to protect their data? How do you approach that type of challenge today?
Tim Wenzel (09:53):
Well, the thing is, data is the new oil, as has been said, and it's one of the hottest commodities, one of the most valuable things. And there's a reason for that: it's more readily available and more freely given away. Oil you find in the ground; you pay somebody for their land and the rights to tap into it, and you pull it out. But here, people are freely giving their data away. So the first thing I would ask an organization is, have you classified your data into types so that we can prioritize those types and then do the risk analysis? Because just like with risk, a lot of security professionals throw the word around, but they don't understand that risk is actually a noun: a person, place, thing, idea, or event that we can actually define. It's not just a tag word.
(10:46):
And data is similar. A lot of companies just think, we have lots of data. Well, that's not helpful, so let's define it a little so we can prioritize it and then do a risk analysis. The second thing I would ask is, what data is not actually yours, but data you are a steward of? Other people's data: people that are using your products and services, volunteering it up somehow. And I would ask, are you stewarding this data well, in a transparent way? And do you have KPIs to measure yourself against? Because most companies aren't getting in trouble for how they manage their own data; they're getting in trouble for how they manage your data. So those are the first two questions I would ask.
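As a rough illustration of those two questions, here is a small Python sketch that classifies datasets by sensitivity and flags the ones a company merely stewards. The classification labels and sample records are hypothetical; any real scheme would come from the business's own data taxonomy.

```python
# Sketch of data classification plus a stewardship flag. All values hypothetical.
from enum import Enum

class Classification(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4  # e.g., trade secrets, roadmaps, regulated personal data

# (name, classification, is_steward): is_steward=True means the data belongs
# to users or customers, so transparency and KPIs apply to how it's handled.
DATASETS = [
    ("Marketing site analytics", Classification.INTERNAL, False),
    ("Product roadmap", Classification.RESTRICTED, False),
    ("User profile data", Classification.RESTRICTED, True),
]

def risk_review_order(datasets):
    """Review the most sensitive data first; at equal sensitivity, stewarded
    data outranks the company's own, since that is where companies most often
    get in trouble."""
    return sorted(datasets, key=lambda d: (d[1].value, d[2]), reverse=True)

for name, cls, steward in risk_review_order(DATASETS):
    role = "steward" if steward else "owner"
    print(f"{cls.name:<12} [{role:7}] {name}")
```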
Scott Dunn (11:42):
I would assume, then, that you're pretty familiar with some of the laws, like GDPR over in Europe, and now we have the California Consumer Privacy Act. Are those effective? Do they help? What is your take on legislation to enforce data protection?
Tim Wenzel (12:00):
And that is it, isn't it? It's an enforcement plan. So I would say they're pretty effective at giving the government, or any regulatory body, recourse. Now, are they clear enough? Are they meant to actually help businesses and consumers (because consumers are in these laws as well) understand the types of data, what the data actually is, how to parse it out, and how to treat it well? Or are they just designed to give regulators recourse to make money? They do a really good job of giving regulators a revenue stream if they want it, and a way to push back on specific industries and companies, and that has to be a part of it.
(12:45):
But I think, if governments were really serious about this and not just looking to make money, they would invest in an education platform for their citizens to understand: how are all these companies getting your data? What are you doing that is giving it away? What are you opting into? What do those clauses in the Google Play Store and the Apple App Store actually mean? Start educating people on this, because the consumer is a big deal. We've been told, look, we're in a free economy: everything is free to you, your email, your messaging, all these things are free, and it's amazing.
(13:27):
When I was at the State Department, a really awesome guy told me about a conversation with his mom. She's like, Gmail's great, it's free. And he's like, Mom, it's not free. And she's like, no, I don't pay for it. And he's like, then what's the product? And she's like, it's email. He's like, no, it's you. You are the product; they're getting you for free.
Scott Dunn (13:51):
That's an interesting point. I've watched some documentaries about these types of things on social media specifically, and it's pretty fascinating. When that realization hits you, that I'm the product, I'm the thing, it's amazing. That's a great comment. We should all maybe wear our own logos as a specific product now, right?
Tim Wenzel (14:14):
There you go.
Scott Dunn (14:15):
That'd be interesting. So since we've talked about all of this data being out there in the world, I would be remiss if I didn't ask you a little about the metaverse, since you've come from that particular world, and I'm sure our listeners will be interested to hear about it. Once it was just a vision: social media was an interesting thing that was going to enrich our communications and create these new communities. I'd like your personal thoughts: has that really come to be? Has the technology delivered on the promise that the metaverse is this wonderful social community place, or not? What is it, what has it become?
Tim Wenzel (15:00):
Well, it's not fully here yet, obviously, but it is here in some environments. You can go to Horizon Worlds, a platform that Meta has, put on your virtual reality gear, and interact with other people; there are places where this exists. And what they're doing right now is testing how good they can make the experience, because the money question is: will people want to spend time here, and when they spend time here, will they spend money here? And when they spend money here, what types of things are useful? But this is obviously going to happen, not just because it's been in movies for decades, but because there's been over a decade of building up to it; this has been on the roadmap for a long time for a lot of companies. And if you think about it, it's not that far off. People are like, am I going to put on goggles and go live in my basement, like that movie Surrogates, where they were lying in those beds that fed them and everything?
(16:03):
But I know people that won't eat a bite of their meal before they take a picture of it and post it. Just think of how much of our lives, preferences, and routines we're already broadcasting. So how far of a jump is it to say, I'll put on these glasses and broadcast most of my day, because it'll make it easier to meet up with friends and contacts? In the LinkedIn app on your phone, you can already turn on your location, and connections, or anybody searching, can see their proximity to you. So if you go to a conference, you can turn that on and track down connections you have: literally, they're over here. Or you can look at who's there, decide, I want to talk to that person, go find them, and then connect with them and talk to them.
(16:51):
All of the building blocks for this are here; it's just a question of how immersive it will become and how quickly that will happen. And that's going to be based on the quality of the technology and the experience, and the number of data centers and PoP (point of presence) sites that can be built, because all of this data has to be processed quickly. Right now, you and I are looking at pretty high-quality video of each other, and that's one thing, but it's only one camera view. If you're going to be immersed in an environment, you have to render almost all of that environment, almost all of the time, really quickly, if I'm going to be turning my head and feeling like I'm there.
(17:29):
There's a lot of infrastructure being built around this, and a lot of programming that has to get good, and the technology has to get good and inexpensive enough for most people to want to use it. But people used to say, well, not everybody's going to have a computer. Well, everybody has a computer; some people have several. Not everybody's going to have a smartphone; well, a lot of carriers don't carry non-smartphones anymore. So it's getting cheaper and more available, and industry and society are pushing in that direction, so it's going to happen.
Scott Dunn (18:01):
I've done some work with the National Center for Spectator Sports Safety and Security out of the University of Southern Mississippi, and I've seen situations where we've brought in different technologies. One of them is actually utilizing social media as a security tool. And of course, we've all heard stories of government agencies tracking people on social media, looking for threats on social media. Is that the good side or the dark side of the metaverse? When we start talking about things like privacy, which we were talking about earlier, how does that all balance? How does that all work in your mind?
Tim Wenzel (18:41):
Well, it starts with understanding what you're opting into. Whether they want to believe it or not, most people know that these smartphones we have are capable of recording everything you say, all the time, and that somebody else can have access and open up that microphone at any time, unless you take the battery out. How many phones can you take the battery out of now?
Scott Dunn (19:06):
You're signaling zero, for those of you who don't have video, which is all of you.
Tim Wenzel (19:15):
And part of that is design, making the phones smaller while the batteries have gotten bigger. But here's the thing: you have to look at what you're opting into and what you are allowing your technology to be around. Right now, you have Siri, you have Google Assistant, you have all of these assistants that are listening all of the time, and then you have these microphones for them that you can place in different parts of your home, so you don't have to have a little individual machine everywhere. I know people that have them everywhere: in their bedroom, in their bathroom. And I'm like, you have literally taken away every bit of privacy from your home. And they're like, well, do you think Amazon's spying on me? I don't think they're spying on you, probably. However, just search 'Siri voice moderators.'
(20:07):
Here's the problem with voice recognition: we don't always speak clearly, and some of us don't have a Midwestern North American accent, we have other accents. So if you speak in a muffled way, or you stutter, or you mumble, or you have an interesting accent, these devices record more of what you say and do around them, and then people go over those recordings to understand: it thought you were saying this, but this is what you were actually saying. They're trying to build their algorithms to make it easier to recognize voices in all sorts of different ways. How much of your meetings is being saved on somebody else's servers?
(20:50):
What happens if you're a lawyer and all of a sudden there's a break-in at your home, and you call the police, because you should? And then the police say, there's a smart speaker here, we're going to ask Amazon what they have; they may have video or audio of the crime that can help. The police are a government agency, and now they have maybe one hour, maybe ten hours of client calls, and some of those clients may have business or court proceedings with that government.
Scott Dunn (21:24):
It's true, it's a really good point. But for me, I struggle with that balance. If it were my child that was missing or someone in my family that had been assaulted or murdered, I would want to use every tool, I would want law enforcement to have every tool at their disposal to find out who did that. And in doing so, they can make us all a little bit safer, but what is that balance between the safety and the privacy? You mentioned opt-in versus-
Tim Wenzel (21:58):
It's you.
Scott Dunn (21:58):
... Yeah. You may not understand what you're really opting into, but even so, like you said, it's listening all the time. I always say, I've been married 48 years, and there's only one person that knows me better than my wife, and that's Jeff Bezos, because I've been on that Amazon thing since it started.
Tim Wenzel (22:19):
And that's the thing, too. People think of technology, even security people who aren't involved in that side, and they're like, man, that's magic, it's awesome that they figured that out. It's not magic; there's a way in which this works, and it's easy to find this stuff out. We have the internet; it's not like you have to find an up-to-date encyclopedia in a library or know somebody to talk to. You can find out lots of things on the internet. And so if you're going to have a Siri or a Google Assistant, they're super helpful: maybe have them in your kitchen, maybe in the living room, but definitely don't have them in your office, the bedroom, et cetera, because you don't really want what's being done in your bedroom or your office repeated.
(23:02):
Craft your life around the question, where do I expect privacy? And then leave these electronics out of it. And I'm not saying that law enforcement and government shouldn't have access in some situations; they totally should. That's why they have legal agreements with these companies that say, in these circumstances, we may request this and blah, blah, blah, and that's why they have lawyers. I'm just saying, you need to understand what you're opting into, because there aren't a lot of people out there looking at Tim Wenzel and thinking, how can I protect him better? There are too many people out there, right?
Scott Dunn (23:39):
Yeah, I think the temptation is too great: you're getting all the data, they've opted in, so that gives me carte blanche to collect. And I think that's where maybe the law could be a little tighter to help us. So that's very interesting. All right, I want to switch gears here for a minute. Do you have any other examples of how companies did, or could, use technology to help the community and to connect with the community? Are there other examples you can think of that-
Tim Wenzel (24:07):
Absolutely. Zoom has become a great enabler of connecting people, whether they're family or not; Skype is another one, enabling people to be together when they can't physically be together. Or go back to Facebook: if there's an earthquake in your area, you get to check in as safe. That's because there were earthquakes and somebody at Facebook said, we have a GSOC full of people that can monitor users who have the app on in a location, and we can communicate with them; why don't we ask them if they're safe? In India, there's difficult access to blood for transfusions; going to the blood bank and putting your pints in isn't really a thing over there. So Facebook, again, created a way for you to say, I need blood and I'm this type, and somebody else can pick that up and say, I'm that type too, I'm going to donate for you. So there are all sorts of ways, if we're thoughtful.
(25:05):
And that's why I say it comes back to you with technology, whether it's privacy or how you use it. Social media is amazing. I was just watching The Last Dance, the documentary about Michael Jordan, Scottie Pippen, and that era of the Bulls. In the final episode, they said it's amazing that Michael Jordan was able to become an international phenomenon, known in 208 countries; everybody knew about him, they bought his gear, and he didn't have the opportunity for any social media. So social media and all of these ways to connect with other people are a great opportunity if you want to use them well. Here's the problem: in this world today, most of the people who are the loudest are not rooting for the wellbeing of their fellow man. They are pushing their agenda, pushing their selfish thing, and it's often not really beneficial for the community at large. So it's really on you and me to say, what can I do to make my world, my community, a better place?
Scott Dunn (26:16):
... Another one that I've seen recently is using social media in the community to warn of safety threats, whether it's a spill, an accident, or power lines down. I've seen people do outreach in their communities; I have one here in the community I live in, where we've got a coyote issue now. So they're warning people: coyotes are in this area, bring your pets in, those types of things. Or even people reaching out and saying, if you have a power failure, you can come to my place. So I see a lot of those really positive community uses of social media today, which I think is just fantastic. That, to me, is the real promise of the metaverse: that it creates a better world, not a worse one.
Tim Wenzel (27:06):
I think you're right. And the other thing is, there's a quote, which I'm probably going to butcher, that says: for evil to prevail, all it takes is for good people to do nothing. So if you are seeing the negativity, you are hearing the negativity, you're hearing the hurtful things, and you say, I'm not going to pay attention to that, and turn it off, that's a great first step. But what are you putting out there as an alternative? And I'm not saying you have to go fight with these people; just give somebody else something they're able to listen to, something that's uplifting, something that's meaningful and helpful. We have to counter-program some of this stuff with good things.
Scott Dunn (27:47):
I agree, I think there are more good people out there that can do that, and there are more good uses. So speaking of that, what does it look like for future generations now, with all this technology? Are they going to figure it out? Are they going to be running around with a phone in their face all the time, or is it going to be a chip embedded in them? I've been seeing this ChatGPT and Bing's AI and all of that, and it's a little scary, so what does it look like for them?
Tim Wenzel (28:20):
It can look like all sorts of different things, but let me bring something back. In the 1960s, how much technology was out there? Not a lot: radio, TV, some walkie-talkies; we weren't even into cell phones yet. And people still had work-life balance problems. People still said, I spent too much time on my hobbies and not enough time with my family. People still had the same problems we have today, with fewer distractions. So the one thing I say for the future generation: if you have kids, if you're around kids, if you're a role model for a kid, start talking about boundaries and healthy ways to invest your time; set up healthy habits.
(29:09):
We teach our kids, you know what? If you're going to have money when you grow up, because none of us believe Social Security is going to be here, you have to save. And so we give them a piggy bank and we start teaching them: spend some, save some. It's the same thing with investing your time; you have to be intentional about how you're going to spend your time and with whom you're going to spend it. So I think that's number one: teach them boundaries and healthy time-management habits.
(29:37):
Two, teach them about their information: how they put it out there and how it gets taken, from social engineering to all of these text messages on our phones to all the things that come in by email. What do we do and what do we not do? And how do we check whether that email from Amazon, telling me to reset my password by clicking a link, is actually from Amazon? Just as we did 'stranger danger,' we still need to do that, but we need to start talking about the virtual part of our life, because more and more, the physical and virtual parts of our lives are going to be integrated rather than separated. So we need to teach kids all about their environment and all about their world in a somewhat different way, but through a very similar lens to how we've done it in the past.
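That "is this reset email really from Amazon?" check can be made concrete. Below is a minimal Python sketch that compares a link's host against the sender's expected domain; the trusted-domain list and sample URLs are hypothetical, and a real mail filter would also inspect message headers and use a proper public-suffix library.

```python
# Sketch of a naive phishing-link check. Trusted domains and URLs are hypothetical.
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"amazon.com"}  # domains we expect genuine reset links to use

def link_matches_sender(url: str, trusted: set[str]) -> bool:
    """True only if the link's host is a trusted domain or a subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in trusted)

# Classic trick: the trusted name appears as a *prefix* of an attacker's host.
print(link_matches_sender("https://amazon.com.evil-site.example/reset",
                          TRUSTED_DOMAINS))  # False: host ends in evil-site.example
print(link_matches_sender("https://www.amazon.com/ap/forgotpassword",
                          TRUSTED_DOMAINS))  # True: subdomain of amazon.com
```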
Scott Dunn (30:29):
That's great advice, I like the way you put that. And you got into the second part of this. The future look for me is: technology's going to continue to evolve over the next 10, 20, 50, 100 years. How do we ensure proper stewardship, with ethics and all the social concerns? And I think you hit upon it: it starts by teaching your children about boundaries, right?
Tim Wenzel (30:54):
It does. It also involves you making a decision: I will protect myself. We teach people, if you're in a dark parking lot, don't walk to your car alone, call a cab; don't put earbuds in when you're walking through a metropolitan area. Well, if you read the terms of service and you say, I will not allow my data to be used for X, Y, Z, and it's in there, then you walk away, even though it's the coolest freaking thing. And so, as things go, there are going to be a lot of popular things that you should teach your kids you're not going to be involved in, because it's just a bad idea. And once you opt in, you can't claw your data back.
Scott Dunn (31:33):
Thank you, Tim, that's all the questions I had for you on this podcast. And I can see that you are a guy who works every day towards making the world a smarter and safer place, I really appreciate you taking the time.
Tim Wenzel (31:44):
Thanks a lot, Scott.
Scott Dunn (31:49):
What a great conversation. I think it's important for us all to continue asking lots of questions and become professional learners as we learn how to better use technology and unlock its potential for connecting us. I'm Scott Dunn, and thank you for listening to Connect, a podcast from Axis Communications.
James Marcella (32:16):
The Connect Podcast is produced in collaboration with Gusto, a Matter company.