Security Exec Track: What the Board Needs to Know: Privacy & Compliance in the Mobile Era

 

Session Description

Privacy & compliance of mobile apps, OTT, SDKs

Session Summary

  • Boards prioritize financial risk mitigation, litigation avoidance, and reputation protection in mobile security.

  • Nearly 30% of mobile apps now integrate AI, increasing privacy and security risks.

  • Mobile apps require unique risk management distinct from web platforms due to complexity and third-party SDKs.

  • Regulatory bodies are becoming more technically sophisticated, focusing on SDK governance and API mapping.

  • Third-party SDKs pose significant hidden risks and require rigorous dynamic and static testing.

  • Visualizing mobile app data flows and third-party interactions can powerfully communicate risk to boards.

  • Empowering privacy and security officers with budget and authority is critical for compliance success.

Session Transcript

 

 

Hey everybody, Andrew Hoog here. I'm the co-founder here at NowSecure, and I'm really excited today to have a fireside chat with Steve. We're going to be talking about boards of directors and mobile security and privacy. So I'm going to hand it over to Steve for a quick intro, and then we'll dive right in.

Andrew, great to be here today. I'm from Norton Rose Fulbright. I'm a partner in the New York office there, and I spend 100% of my time working on privacy and privacy testing issues, and on the ocean of compliance obligations that clients face these days.

Yeah, I'm really excited about today. Steve and I go way back, and he's a very interesting combination: deeply technical and an attorney. We rarely see those two things together, so there are going to be a lot of great insights in the talk today. Steve, I want to start out talking about boards of directors. It's kind of a mystery for a lot of folks. Who are the board of directors? What are their priorities? What do they care about? So I thought we would spend a little time there, and since we're talking today about mobile security and mobile privacy: what do you think is top of mind for them, and how can folks effectively communicate with senior management and boards of directors?

Yeah, I think it's a great question, and it gets to the heart of compliance, because compliance ultimately needs to be a top-down effort if it's going to have budget and teeth. Boards of directors, in my experience, are really concerned about the dollars and cents: not encountering unnecessary litigation exposure or huge bad PR events. One of the things we've seen in the cyber realm that I think we'll also see in the privacy space are shareholder derivative suits.
These are suits where you sue the board itself for lack of appropriate due diligence and oversight over a particular area. If one wants a list of examples, it's a smattering of Fortune 500 companies, and some of the settlements and awards in that area are absolutely staggering. So I think the first thing to bring to the attention of boards of directors and general counsel is that this is possible, and that the size of the exposure these days, whether from cyber or privacy class actions, regulatory actions, shareholder derivative suits, and so on, is potentially huge. I would start there with the board of directors and then move backwards. Their first question should be: well, how do we reduce the risk and comply? That helps drive the discussion and set priorities, and there are plenty of examples right now of companies ringing up really big dollar hits on litigation.

Absolutely, that makes a lot of sense. Boards of directors look at the overall company and steer it at the highest levels, thinking about things like risk. How do you quantify it for the board? Are there particular frameworks or standards? When I was new to joining a board of trustees, I learned about things like enterprise risk management and risk registers. Should mobile be plugging into that common language, or is mobile different, so there's an opportunity to bring something new to the discussion?

I think mobile stands on its own and should be treated as a unique, standalone item.
I also think, and I know we'll probably get into this further in the discussion, but it's worth noting: I think we're about to see an explosion of risk, litigation, and regulatory actions because of the increased use of AI agents and agentic AI workflows that will be baked into SDKs. It's the wild west; this is not well-trodden ground where there's lots of experience. Mobile devices and mobile apps are going to be the key places where this gets implemented, and as opposed to websites, where it's easier to see data flows and test, mobile is tough. It's hard to get compliance right and hard to get testing right. What boards should be thinking about is what's right around the corner, and what's right around the corner is increased risk and regulatory attention on something companies are really ill-prepared to tackle and need to invest time and effort in.

Yeah, that makes a lot of sense. Steve, one of the things we talk about a lot on the board of trustees I'm on is AI, and a lot of people look to me as the director who's going to know it all, because many of these folks come from a financial background and are looking for expertise, whether from other board members, trustees, or outside experts. We've found in our data that there's a lot of traction when talking to executives and boards in bringing up AI, because everybody's talking about it. They know enough to say, "I'm using it, I know there are risks," but they don't know much beyond that. Our data shows that nearly 30% of apps now integrate AI in some form or fashion, and I think that would surprise people.
Nearly one out of three apps: if you've got a hundred apps on your phone, around 30 of them are currently using AI. Are you aware of that? Have you thought through the risks in terms of intellectual property, sensitive data, PII, and so on? That's a really great way to catch people's attention: tie it to something they're thinking a lot about and worried about today, and then help them understand how that risk plays out in a mobile context.

I think that's exactly right. Explaining low-level technical risk, where the rubber meets the road, can be difficult for folks who don't have a technical background, who sit on boards or in legal departments. But with AI, in some cases it's easy to at least conceptualize the risk, because all of these fancy new services being talked about, planned, and rolled out involve taking personal data, which might be content you've interacted with, text you've created, your voice, your image, and applying AI to it. The problem is that this grants access to the crown jewels of any mobile app or company service: the very detailed personal information of the user, the customer, the patient, the account holder. Up till now, security teams have been very invested in guarding that. You build walls around it; you test to prevent access to it. Now caution is being thrown to the wind to roll out these services. But I think it's something that can be more easily explained than some other types of cyber risk.
It's like: you're going to expose all of this to some AI endpoint, and you have to trust them as you would trust the innermost sanctum of your own operation.

Absolutely. I think you're right; it's a great way to explain the overall risk in a way that will resonate. Well, let's zoom in a little, first from a regulatory standpoint, and then we'll step back out. One of the first things boards are going to ask is: what are we legally required to do? What regulatory requirements do we have to comply with? The conversation will naturally go there, so let's start with that. What does it look like out there today, from a regulatory environment and an enforcement standpoint, as it pertains to security and privacy, particularly with mobile?

Yeah, I think it's in the process of being ramped up. One of the cases that came out of California in 2024 actually involved a kids' app, but what's notable is the language the California attorney general used to describe the responsibility of the company with respect to the mobile app and its operation.
They spoke in terms of SDK governance, and that is not a term I was seeing from regulators or plaintiffs four or five years ago. One of the big trends is increasing technical sophistication on the part of regulators, plaintiffs' counsel, and even the media in unearthing and reporting this stuff. When a regulator asks, "What are you doing around SDK governance?", that is to say: what data does this SDK process, what does it do, what does it transmit, what's its purpose? Being able to do change management around that, or periodic testing, or even real-time testing, is a total game-changer. Similarly, the New York AG, in a recent settlement with an insurance carrier, asked why the carrier hadn't done API mapping. API mapping is not a phrase you would have heard come out of the mouth of an attorney general four or five years ago. So they've really gone up the learning curve. They're thinking in terms of SDK governance and API mapping, in terms of the system and its endpoints as well as what's happening locally; I think it was actually a NowSecure blog post that said mobile security begins with the server. And again, we see that come out in recent enforcement actions. We've certainly seen it on the privacy side with lawsuits around mobile apps, where plaintiffs' experts have done dynamic testing to identify data being transmitted off the device that, according to the plaintiff, was not anticipated by the user.
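The dynamic testing described here, capturing an app's network traffic and flagging data a user would not anticipate, can be sketched in miniature. This is a hedged illustration: the request records, the sensitive-field list, and the domain names are invented, and a real pipeline would parse mitmproxy or HAR output rather than inline dictionaries.

```python
# Sketch: flag captured requests that carry sensitive data to third parties.
# The capture format and SENSITIVE_PARAMS list are illustrative assumptions.

SENSITIVE_PARAMS = {"lat", "lon", "imei", "ad_id", "email", "contacts"}

def flag_unexpected(requests, first_party_domains):
    """Return (host, param) pairs where a sensitive field leaves the device
    for a host that is not one of the app's own domains."""
    findings = []
    for req in requests:
        host = req["host"]
        if host in first_party_domains:
            continue  # first-party traffic is expected by the user
        for param in req.get("params", {}):
            if param.lower() in SENSITIVE_PARAMS:
                findings.append((host, param))
    return findings

capture = [
    {"host": "api.example-app.com", "params": {"session": "abc"}},
    {"host": "tracker.adnet.example", "params": {"lat": "41.9", "ad_id": "x"}},
]
print(flag_unexpected(capture, {"api.example-app.com"}))
# → [('tracker.adnet.example', 'lat'), ('tracker.adnet.example', 'ad_id')]
```

Even a toy check like this mirrors what plaintiffs' experts do at scale: enumerate off-device transmissions, then ask which of them the user was told about.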
So it's getting a lot more dangerous out there with the increasing technical sophistication of regulators and plaintiffs.

Yeah, that's fascinating, because we often hear that regulation lags far behind. But I too have had discussions with a lot of regulators who have really dug in deep to understand this. When you think about a mobile app, something like 50 to 70% of it is actually third-party software that your developers put into the app, software that has to be tested, because in the end, when you ship that app, it has your brand name on it. You've shipped it and signed it; it's yours. So you need to think about third-party risk management and recognize that all this software is there. What's it doing? Where is it sending data? Have you disclosed it? We too are seeing regulators be very thoughtful about that, and even if the regulations themselves change more slowly, there's sufficient generality to them that regulators can come back and say: these are the things you needed to comply with, and you haven't done them, even in this new world.

I think when we talk about making this compelling and helping boards understand why it matters, regulation is a very natural place to go. Nobody gets excited about it, but it's natural. There are other impacts, though, and you touched on them earlier. Some of them, off the top of my head: the reputational impact of a breach, which is difficult to quantify but real, and the legal exposure, the class actions and their long tail, the regulatory actions, the shareholder suits.
And then I would classify some impacts as operational. I talked to a friend who worked at UnitedHealth Group, and they said that every meeting they went into after the breach was about the breach, so their ability to move the company forward strategically, to drive value for customers and shareholders, and to retain and hire employees was significantly impaired. So there's this natural tendency to talk about regulation, and then there's this much bigger scope. What are your suggestions, or your experiences, around tying those things together? Does it resonate with boards? Do you try to quantify it? I'm curious about your thoughts on that wider aperture.

Yeah. There is that additional element, and it's large: what's the prospective value of the app, be it in fees or in advertising if it happens to be a monetized app? If you get the ick factor attached to you, even if you haven't been sued, but you've appeared on page A1 of the Wall Street Journal with a big rundown of all the things you're doing that they allege are problematic, there's a sting to that with a long tail. In some senses, with lawsuits there may be a range of settlement values, and that's knowable; but once brand damage happens, it can be just as damaging for the company. It's hard to quantify, but it's important and should be brought forward to boards.
You mentioned before that such a large amount of any mobile app consists of third-party code, as a result of the hugely modular development approach that everyone uses and really has to use. But there seems to be a mindset in companies that since that code doesn't execute inside the perimeter of their environment at large, with their cloud services and the rest, it doesn't get the same level of vetting, and sometimes no vetting at all. And yet that's where all the action is for the risk. Those things are hard to vet. It honestly takes a company like NowSecure with the capability, technical expertise, and depth, if you're talking about obfuscated code and huge amounts of executable code that needs to be digested and analyzed. I know that's not quite the board question, but you had mentioned it, and I think it's such an important part, because the attitude is "it's not in our environment, so it's not our problem," when in fact it's worse: it's in your user's hand.

No, I think that's great. I actually hadn't thought of it that way, and that's really interesting. I always think about how it's bundled into the app and you're responsible for it, and yes to all of that, but folks might put a partition around it: well, if it's not running in our cloud, it's not our problem. But again, it's not only your brand; you've bundled this up and told people to use it, so you have to be thinking about it. That pulls us, I think, into the more technical side. Primarily our audience here is managers, the C-suite, and boards, but you're also very technical. You've done a lot of privacy testing at the website level, and we've collaborated on the mobile side. Sure.
Let's take a couple of minutes and talk through some of the techniques for vetting those third-party SDKs. How much can you say, "this is operating not in my cloud but in their cloud," and does that matter? And if I may throw a third one at you: if they've asserted certain things, if they've said, "oh, this is what we do," can you rely on that third party? We'd love to delve a little more into the technical side of what folks need to be thinking about or doing for third-party SDK risk.

Yeah. To start with the legal lens that then focuses the technical actions: one of the requirements now baked into all these state privacy laws is proportionality, which looks at the nature of the service being provided to the user and the nature of the data, or the data processing, that goes on to provide that service. Am I collecting precise geolocation in order to operate a flashlight app? That fails the proportionality test; it's not anticipated by the end user. Another concept is minimization: are we collecting more data than is necessary to actually carry out the operations of the mobile app's service? How that translates into technical testing is that a company really has to understand what each SDK does, and that's going to involve dynamic testing and network traffic analysis, but also static analysis of the code. One of the really powerful things NowSecure does is getting inside the code of the mobile app even at runtime, during execution, and mapping those processes in real time.
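The dynamic-plus-static testing Steve describes can be combined programmatically: cross-reference the capabilities static analysis finds in an SDK's code against the transmissions dynamic testing actually observed. Anything present in the code but never seen on the wire is a latent, not absent, risk. A minimal sketch, with SDK names and capability labels invented for illustration:

```python
# Sketch: cross-reference static findings (what an SDK's code CAN do) with
# dynamic findings (what was actually observed during testing).

def latent_risks(static_capabilities, observed_transmissions):
    """For each SDK, report capabilities present in the code that never
    fired during dynamic testing."""
    report = {}
    for sdk, caps in static_capabilities.items():
        seen = observed_transmissions.get(sdk, set())
        latent = caps - seen
        if latent:
            report[sdk] = sorted(latent)
    return report

static = {
    "ads-sdk": {"geolocation", "advertising_id", "contacts"},
    "crash-sdk": {"device_model"},
}
dynamic = {"ads-sdk": {"advertising_id"}, "crash-sdk": {"device_model"}}
print(latent_risks(static, dynamic))
# → {'ads-sdk': ['contacts', 'geolocation']}
```

A set difference is all it takes to surface the gap between what was observed and what the code is capable of, which is the gap dynamic testing alone cannot close.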
So I think all of those are necessary to understand what the SDK is doing. One thing I would add about static analysis, the code standing by itself: we can do network traffic analysis and see what data was collected in real time when the core features of the app were used, but with mobile SDKs there can be a latent data transmission risk, where we simply didn't hit the trigger point at which the data would have been transmitted. Static analysis of the code will uncover that latent risk. So between dynamic and static testing, you run the full scope of testing around the SDK. Then, back on the legal piece: we have the requirements of proportionality and minimization, but how do we manage risk from a legal standpoint? Well, now we know what this mobile SDK does and what risks it presents. So we go to the contract with the vendor, and we want obligations that correspond to the risk. You want to try to offload or transfer risk via indemnity provisions, and via other responsibilities in a data protection provision that outlines what the vendor or the SDK can and can't do. That's another important legal piece that sits side by side with the technical analysis.

Yeah, that's really great. Again, you have this legal view and this technical experience, and you combine the two worlds. So when you're thinking about presenting this risk and quantifying it, and we talked a little about this earlier: how do you visualize it? How do you get past the jargon?
Are there certain KPIs, two or three, that really matter? Is there a way you've found to visualize these risks? Are there maps? I remember back in the day there was a map that ostensibly showed attackers and where they were attacking: a map of the world with things shooting this way and that. Everybody went to that dashboard, stared at it for a while, and got excited about it. It was kind of silly, but it captured people's attention; it was a way to visualize things. Do you have techniques for making this consumable and understandable, taking the technical and legal sides and bringing them together into something that boards and C-suites can really appreciate?

Yeah. Even aside from whether anything nefarious is going on, what's visually very compelling for boards and legal departments is showing them a dynamic test on screen in real time. Run a network traffic capture of their mobile app for two minutes and you see 3,000 network requests and responses with 150 different parties: this is what the risk envelope encompasses for you. 95% of the time, people are absolutely shocked at the amount of interactivity the mobile app has with all these different third parties, many of which are anticipated. But we also see, and I'm sure you see this a lot as well, plenty of inadvertent data transmission, sharing, and leakage, and that's part of the ongoing risk management of these things. In terms of illustrating it for legal and for the board, I think there are some compelling visuals like that: for example, the number of different countries their SDKs hit.
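The two-minute-capture demo mentioned here reduces to a handful of board-ready numbers: total requests, distinct third parties, and the busiest hosts. A minimal sketch of that reduction, with all host names invented:

```python
from collections import Counter

# Sketch: distill a traffic capture into the few figures that land with a
# board: request volume, distinct third parties, and the top talkers.

def traffic_summary(hosts, first_party_domains, top_n=3):
    third_party = [h for h in hosts if h not in first_party_domains]
    counts = Counter(third_party)
    return {
        "total_requests": len(hosts),
        "third_party_requests": len(third_party),
        "distinct_third_parties": len(counts),
        "top_talkers": counts.most_common(top_n),
    }

# One simulated capture: 40 first-party calls, 45 third-party calls.
hosts = (["api.example-app.com"] * 40
         + ["metrics.vendor-a.example"] * 25
         + ["ads.vendor-b.example"] * 15
         + ["cdn.vendor-c.example"] * 5)
print(traffic_summary(hosts, {"api.example-app.com"}))
```

Three numbers on a slide ("85 requests, 45 of them third-party, to 3 distinct vendors") carry the point without any of the underlying jargon.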
You think it's only talking to servers in the United States, but you do a reverse lookup on the public-facing IP address and find that this is indeed not the case.

Right. Absolutely.

So that's all part of it. It also depends on the particular mobile app. A financial services mobile app has one set of performance requirements around it, and the company will know how many users it has and roughly what its benefit to the company is. In other industries, where the mobile app is monetized through advertising, there can be a very clean, direct estimate of the risk of a falloff in usage if you take a brand hit. If you've got $150 million of mobile ad revenue every year on an app and you get completely hammered in the press, you can estimate that some portion of that is at risk.

Yeah, quantifying that could be huge. Well, I'm going to put one last question to you, because we have just a couple of minutes left. If you were to sum up, for executive management and boards, what they really need to be thinking about to manage the privacy, compliance, and security risk of mobile: do you have a couple of sage points of advice, or something in the next three or six months that they should task the team to go take a look at?

Yeah, a couple of things. On the technical side, clearly the integration of AI services and processing is going to be a huge risk, and that has to be put on the radar in a big way so that people look at it and deal with it. And there are also the ongoing risks of SDKs as we have come to know and love them over the past decade.
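The $150 million example above lends itself to a back-of-the-envelope formula a board can follow. This sketch is illustrative only: the falloff percentage and recovery window are assumptions, not benchmarks.

```python
# Sketch: rough brand-damage exposure for an ad-monetized app.
# All inputs are illustrative assumptions.

def revenue_at_risk(annual_revenue, usage_falloff, recovery_years):
    """Estimated ad-revenue loss if a brand hit cuts usage by
    `usage_falloff` and the audience recovers linearly over
    `recovery_years` (average shortfall is half the initial falloff)."""
    return annual_revenue * usage_falloff * recovery_years / 2

# $150M/year app, 20% usage falloff after bad press, two-year recovery:
print(f"${revenue_at_risk(150_000_000, 0.20, 2):,.0f}")
# → $30,000,000
```

Even a crude model like this turns "brand damage" from an abstraction into a dollar figure that can sit in a risk register next to litigation estimates.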
Those risks are ongoing and substantial as well. Organizationally, though, there's something I see all the time across all businesses, whether midsize or global: companies sometimes don't do a great job of empowering their chief privacy officer or their CISO with authority and budget so they can go address these things. It becomes trite to talk about a culture of compliance, but it's actually a real thing. You have to want to put this first, organizationally and budget-wise, if you want to see the fruits of it at the end of the day, because if you just give it lip service, or product teams run roughshod over legal departments, it becomes difficult to achieve these objectives. Some of the organizations I've seen do this really well have people on the product side who are high up in the company, who maybe have the ear of the board, and who are really tuned in to these issues; some of them have a cyber or security background, for example. So there's this twin goal: yes, we have to focus on the technical pieces and the risks, but organizationally there are real things a company can do that will advance the ball and make the risk management piece so much easier.

Yeah, that's fantastic. Well, Steve, I knew this would be super interesting: very actionable and really valuable to everybody attending today. So I just want to thank you for your time and your great insights, and I'm looking forward to keeping the collaboration going for the next 10 years.

Thanks for inviting me on, and I look forward to the Frida session that you guys have coming up.
So, thank you. All right, Steve, take care.

Take care. Thank you.

Welcome to the MARM Minute. I'm Alan Snyder, and we're going to talk about step three of the MARM program. This is where you bring together step one, where you defined your impact tiers and the app attributes that make an app a high, medium, or low impact to your business, and step two, where you did the asset inventory: understanding all of your mobile apps, whether ones you built or ones somebody else built that you use and put sensitive or critical information into, and starting to categorize them. This is super important to the program, because how do you know what level of testing to apply unless you understand what category the app belongs in? Now, this requires getting information about the app. You need to understand whether the app has PII, whether it has critical information such as IP or financial transactions, whether it can track geolocation, how many endpoints it has, how many downloads. So you're going to need a lot of information. I highly recommend you use NowSecure: we can tell you pretty much all of those items. We can't tell you brand impact, but we can certainly tell you all of the other attributes of the app and how we would categorize it as high, medium, or low impact to your business. Once you have that, keep in mind that apps change over time. Sometimes they lose functionality and get downgraded; sometimes they gain functionality and information and get upgraded. So this is a continuous process. This has been your MARM Minute.

Welcome to the MARM Minute. I'm Alan Snyder, and we're going to talk about step four of the MARM program.
This is the part where, now that you've got your apps categorized and classified in terms of business impact, you need to give some thought to what the appropriate level of testing is.
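The step-three classification Alan describes, turning app attributes into an impact tier, can be sketched as a simple scoring rule. The attribute weights and tier thresholds below are invented for illustration; in the MARM program each organization defines its own tiers in step one.

```python
# Sketch of MARM step three: score app attributes into an impact tier.
# Weights and thresholds are illustrative assumptions.

WEIGHTS = {"has_pii": 3, "financial_transactions": 3,
           "tracks_geolocation": 2, "has_ip": 2, "high_download_count": 1}

def impact_tier(app_attributes):
    """Map a dict of boolean attributes to 'high', 'medium', or 'low'."""
    score = sum(WEIGHTS[a] for a, present in app_attributes.items() if present)
    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

banking = {"has_pii": True, "financial_transactions": True,
           "tracks_geolocation": False, "has_ip": False,
           "high_download_count": True}
flashlight = {"has_pii": False, "financial_transactions": False,
              "tracks_geolocation": False, "has_ip": False,
              "high_download_count": True}
print(impact_tier(banking), impact_tier(flashlight))
# → high low
```

Because the tier is computed from attributes rather than assigned by hand, re-running the rule when an app gains or loses functionality gives you the continuous re-classification the program calls for.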
