The beginning of 2024 saw approximately 3.4 million apps available for download in the Google Play Store and around 1.9 million in the Apple App Store. That total continues to grow: more than 52,000 new apps were added to Google Play in February alone. Mobile app developers strive to rapidly introduce new features and streamline development across platforms, and one way to do this is to take advantage of iOS and Android Software Development Kits (SDKs).
Mobile SDKs are predeveloped collections of software libraries that give developers a shortcut for introducing new features to a mobile app. These components let developers deploy new features rapidly because they need only minimal knowledge of the codebase being integrated.
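To make “shortcut” concrete, integrating an SDK on Android is often a single build-file declaration. The Kotlin DSL sketch below uses hypothetical artifact coordinates:

```kotlin
// build.gradle.kts — adding a third-party SDK is frequently one line.
// The coordinates are hypothetical; substitute the vendor's published artifact.
dependencies {
    // This single declaration pulls in the SDK and all of its transitive dependencies.
    implementation("com.example.analytics:analytics-sdk:3.2.1")
}
```

That one line is also the moment your app inherits every dependency, behavior and vulnerability the SDK ships with.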
Imagine your mobile app is an automobile. You could certainly design and engineer each and every component from the ground up, but that’s time consuming and incredibly costly. As a manufacturer, you’d rather adopt technology that’s already proven to work, so you bring in components from other manufacturers. These components integrate into your overall design to ultimately produce the final automobile. You save significant time and cost, and many issues have already been worked out for you. That’s exactly what iOS and Android SDKs do for you.
It may surprise you to know that a recent review showed more than 50% of a typical mobile app is third-party code (SDKs); many apps contain as much as 80%. SDKs offer unparalleled convenience, but behind this veil of “plug and play” lies a myriad of independent software components that, if not evaluated or implemented properly, can pose significant privacy and security risks. Because more than half of a typical mobile app is made up of third-party software components, you must understand the iOS and Android mobile security risks they present.
Improper Mobile SDK Implementation
Physical components, much like software components, don’t just drop in place and magically work as intended. They require configuration: bolts must be tightened to specific torque specifications, part housings must be designed to specific tolerances, and so on. Now imagine that the ignition you decided to integrate has three input wires, but the rest of the ignition system in your automobile has only two outputs. If you connect the two and leave the remaining wire unplugged, will it still work? Maybe, maybe not. At best, some features become unavailable; at worst, the car doesn’t start or starts inconsistently. Either way, variables have been introduced into your automobile platform that could cause any number of unexpected problems down the road.
SDKs work the same way. Even the most secure SDKs can be implemented improperly, thereby introducing risk where it wasn’t intended. As an ISO 17025 accredited laboratory for the App Defense Alliance Mobile Application Security Assessment (MASA) program, my team has evaluated countless mobile apps for security and privacy concerns. A worrisome trend we have noticed is that integrated SDKs introduce vulnerabilities not because the SDK is inherently insecure, but because the integration was incomplete or erroneous.
For example, we may see a cryptographic library in use that is perfectly capable of generating sufficiently random numbers as part of the algorithm; however, the mobile app generates those numbers a different way and feeds them to the cipher initialization. In both cases a random number is provided, but in the latter the number isn’t cryptographically strong or generated according to industry standards. Defects as innocent as this are incredibly difficult to pick up in traditional source code review. They may be observed in static binary analysis, but this isn’t always the case and context is often lacking (e.g., is this random number used in an operation that’s relevant to the security or privacy of the user?).
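A minimal Kotlin sketch of this pattern, with illustrative names: both initializations compile and encrypt successfully, which is exactly why the defect survives functional testing.

```kotlin
import java.security.SecureRandom
import javax.crypto.Cipher
import javax.crypto.spec.IvParameterSpec
import javax.crypto.spec.SecretKeySpec

// Weak: kotlin.random.Random is a general-purpose PRNG, not a CSPRNG,
// and seeding it with the clock makes the IV predictable.
fun weakIv(): ByteArray = kotlin.random.Random(System.currentTimeMillis()).nextBytes(16)

// Strong: SecureRandom draws from a cryptographically secure source.
fun strongIv(): ByteArray = ByteArray(16).also { SecureRandom().nextBytes(it) }

fun encrypt(key: ByteArray, plaintext: ByteArray): Pair<ByteArray, ByteArray> {
    // The cipher accepts any 16-byte IV, so swapping in weakIv() is a silent downgrade.
    val iv = strongIv()
    val cipher = Cipher.getInstance("AES/CBC/PKCS5Padding")
    cipher.init(Cipher.ENCRYPT_MODE, SecretKeySpec(key, "AES"), IvParameterSpec(iv))
    return iv to cipher.doFinal(plaintext)
}
```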
Another example was the highly publicized Firebase Cloud Messaging (FCM) misconfiguration in which certain keys included in popular Android apps were actually private access keys, not public ones. Some of these occurrences stemmed from integrated SDKs.
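The shape of that mistake, sketched with an obviously fake key: a server key compiled into the client can be pulled out of the APK and replayed against FCM’s legacy send endpoint by anyone.

```kotlin
import java.net.HttpURLConnection
import java.net.URL

// Hypothetical anti-pattern: an FCM *server* key shipped inside the app binary.
// Anyone who unpacks the APK can recover this string and push messages to every user.
const val EMBEDDED_SERVER_KEY = "AAAA-example-not-a-real-key"

fun proveKeyIsOverPrivileged() {
    // The legacy FCM send endpoint authenticates with the server key alone; if this
    // request would succeed from a device, the shipped key is private, not public.
    val conn = URL("https://fcm.googleapis.com/fcm/send").openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Authorization", "key=$EMBEDDED_SERVER_KEY")
    conn.setRequestProperty("Content-Type", "application/json")
    // Request body omitted: the point is that server credentials never belong in client code.
}
```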
Defective iOS & Android SDKs
Using our automobile metaphor once again, let’s now assume that your integration of components was flawless. You took great care to implement each component exactly to specification and the final product works well, so you rush into production. All good, right? Maybe not. Automotive manufacturers don’t choose components such as tires by chance; they conduct extensive testing and evaluation of different options to ensure they meet performance and durability standards, verifying that the tires have been tested under a variety of conditions. Ultimately, they strike a balance of performance, safety, cost and regulatory requirements to provide the best overall fit for the vehicle.
As they evaluate all these factors, they take great care to examine the component manufacturer’s supply chain: production capabilities, quality control processes and the sourcing of raw materials. Imagine the damage to brand reputation if the chosen tires were defective and ultimately caused accidents. That happened in the late 1990s and early 2000s with Firestone tires that were prone to tread separation, leading to loss of vehicle control, rollovers and crashes. It’s highly probable that additional testing and quality control checks would have prevented these defective components from reaching production.
Now let’s refocus on the SDK. When addressing mobile app supply-chain risk, the problem set can be split into two categories: open-source SDKs and closed-source SDKs. The two are similar but carry nuanced risk differences that shouldn’t be overlooked.
Open-source SDKs offer many benefits, such as cost savings, but they also introduce risks. While many open-source projects have active communities that address security issues promptly, we’ve found many projects with outstanding issues that remain unresolved for lack of contributors or are simply deprioritized for lack of urgency. And while GitHub or GitLab repositories, forums, documentation and community resources may be available, there’s no guarantee of timely assistance; in fact, we’ve seen issues raised in prominent open-source projects as a result of work done by NowSecure that remain open to this day.
Mobile App Dependencies
When the NowSecure AppSec team conducts mobile penetration testing, part of the process includes identifying the SDKs the app has integrated and checking them against the latest available versions. It’s important for us to identify outdated SDKs in use; sometimes we even see versions of projects that are no longer actively maintained or have been abandoned entirely. Projects that are not actively maintained cannot respond to changes in the security landscape, so we recommend prioritizing a migration plan away from them. SDKs that are out of date may be missing relevant security updates released by their developers, so it helps to have the build itself report stale dependencies, as in the sketch below.
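This sketch uses the community Gradle Versions Plugin; the plugin version shown is an assumption, so pin whatever release is current for your build.

```kotlin
// build.gradle.kts — report declared dependencies that have newer releases available.
plugins {
    id("com.github.ben-manes.versions") version "0.51.0" // version is illustrative
}
```

With the plugin applied, running `./gradlew dependencyUpdates` prints which declared SDK versions lag behind their latest published releases.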
Managing an app’s dependencies is critical to functionality and security. Keeping SDK versions current is a major part of that, but it’s easy to forget that the SDK itself has dependencies. Because changes or updates to those dependencies may introduce new bugs or break existing functionality, maintainers may not rush to implement updates. Unfortunately, this lag may leave relevant security issues unaddressed or, even worse, undetected. Regular dependency vulnerability checking is an important preventative measure, but as noted before, it’s only effective when vulnerabilities are reported and addressed in a timely manner.
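To automate that checking, a dependency scanner can run in the same build. This sketch uses the OWASP Dependency-Check Gradle plugin; the version and CVSS threshold are assumptions to adjust for your pipeline.

```kotlin
// build.gradle.kts — scan the full dependency tree, transitive SDK
// dependencies included, against publicly known CVEs.
plugins {
    id("org.owasp.dependencycheck") version "9.2.0" // version is illustrative
}

dependencyCheck {
    failBuildOnCVSS = 7.0f // break the build on findings scored CVSS 7.0 or higher
}
```

Running `./gradlew dependencyCheckAnalyze` produces the report, with the caveat above: the scanner can only flag vulnerabilities that have been publicly reported.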
Another concern is the deliberate insertion of malicious code into an open-source project via a commit to the project’s repository. As noted before, open-source projects often lack a large enough development and review community to effectively vet every commit that’s submitted. This leaves an open door for malicious actors to potentially compromise large numbers of people who use that SDK. In 2022, this exact thing happened (sort of) with the “compromise” of the colors.js and faker.js packages; the developer of the packages committed the offending code themselves. While ultimately more mischievous than malicious, the incident proves the point that unvetted commits can wreak havoc on an SDK’s users.
Closed-source SDKs may offer benefits absent from open-source SDKs, but they still present certain risks. Developers don’t have access to the source code, which can make it difficult to understand how the SDK operates internally, and it limits their options when issues are detected. When selecting a closed-source SDK, review what security testing, if any, the vendor has done. Ask the SDK vendor about any vulnerability disclosure programs it runs and how it will respond to disclosed issues. If the vendor fails to prioritize security or respond promptly to reported vulnerabilities, developers who have integrated the SDK may find themselves at risk.
In some cases, app developers may find themselves directly responsible for integrating malicious code. In 2019, several companies received an enticing offer from a Hong Kong-based advertising and analytics company, Elephant Data: integrate a few lines of code and get paid $1,000 per month for every 100,000 users. It was subsequently discovered that the SDK silently loaded and clicked invisible ads on people’s phones, generating fraudulent ad income and, in some cases, adding charges to users’ phone bills. These malicious SDKs masquerade as legitimate analytics platforms, blending into the larger advertising ecosystem that mobile app users have little control over or transparency into. Unfortunately, this is not an isolated event. While the Apple App Store and Google Play continue to introduce new mechanisms to scan apps prior to release, along with new mobile security and privacy requirements, issues still occur.
Mobile Standards Compliance
One of the most infamous violations of automotive regulations and industry standards came to light in 2015: Volkswagen had installed software in its diesel vehicles designed to cheat emissions tests, resulting in billions of dollars in fines, numerous recalls and legal settlements. In an earlier example, I mentioned Firestone tires. What I didn’t mention was the reputational damage done not only to Firestone but to Ford, which used those tires. Using components that violate industry standards can have an immediate and long-lasting negative impact on brand reputation.
Many mobile app developers are unaware that they are also responsible for the practices of the SDKs they integrate and may face app store enforcement or even legal action if they do not take this responsibility seriously. Even minor issues, such as having an app store submission rejected, can have financial implications. Google makes this expectation very clear to Android developers:
As a Google Play developer, it is your responsibility to ensure that any SDKs you are using do not cause you to be in violation of Google Play’s Developer Program Policies.
Apple reminds iOS app developers via a similar warning:
As a reminder, when you use a third-party SDK with your app, you are responsible for all the code the SDK includes in your app, and need to be aware of its data collection and use practices.
A report from PwC’s Consumer Intelligence Series notes that consumer trust is fading in general. When surveyed, only “25% of respondents believe most companies handle their sensitive personal data responsibly,” and “88% of consumers agreed the extent of their willingness to share personal information is predicated on how much they trust a given company.” It’s no surprise that privacy concerns and data-handling transparency are now just as important as software security in the eyes of consumers.
As a mobile app developer, it may be difficult to know what to pay attention to in the ever-changing world of mobile security standards. To strengthen iOS or Android mobile security, follow the OWASP Mobile Application Security Verification Standard (MASVS). Android developers should also pay close attention to the ADA MASA program, which uses a subset of the OWASP MASVS to support industry-wide adoption of AppSec best practices and guidelines.
Google Play allows app developers to showcase their dedication to security and privacy by undergoing an authorized third-party MASA validation, granting them special badging in the Play Store in return. The NowSecure Services team can conduct this independent ADA MASA security review, and we expect more program developments this year to further incentivize developers to obtain this validation.
All Android developers who publish apps to Google Play must complete Data safety declarations to disclose how their mobile apps collect, share and protect user data. Google emphasizes that if an SDK is included in a mobile app, the developer is responsible for that SDK’s data collection behavior, even if the app never uses the functionality. The Google Play SDK Index provides insight into registered SDKs and often includes information helpful for accurately completing the Data safety form.
If your mobile app accepts payments, you’ve probably integrated a payments SDK to do the heavy lifting. Pay attention to the PCI Mobile Payments on COTS (MPoC) standard, which builds on the existing PCI Software-based PIN Entry on COTS (SPoC) and PCI Contactless Payments on COTS (CPoC) standards. App developers remain responsible for ensuring that their MPoC-certified SDK is integrated properly and meets the requirements covered under MPoC.
OMB Memo 23-16 (and associated CISA software attestation form) stipulates specific requirements for software being sold to the U.S. government:
- Critical software vendors must file the attestation form for ALL releases after June 8 or the federal government cannot use the software
- ALL software vendors must file the attestation form for ALL releases after Sept. 8 or the federal government cannot use the software
The CISA attestation form “must be signed by the Chief Executive Officer (CEO) of the software producer or their designee, who must be an employee of the software producer and have the authority to bind the corporation.” In addition, it requires that “The software producer employs automated tools or comparable processes that check for security vulnerabilities.”
Importantly for SDKs, the form states: “Software producers who utilize third-party components in their software are required to attest that they have taken specific steps, detailed in ‘Section III – Attestation and Signature’ of the common form, to minimize the risks of relying on such components in their products.”
Select Safe iOS & Android SDKs
SDK security testing is essential for mitigating risks, ensuring compliance, maintaining trust with users and protecting the software supply chain. By instituting a program to vet and continuously inspect not only applications, but SDKs, organizations can enhance their overall security posture, reduce the risk of app store blockers and decrease costly bug fixes.
NowSecure offers several solutions and services to help you select safe iOS and Android SDKs and implement them securely. NowSecure Platform automated mobile application security testing and NowSecure Mobile Pen Testing as a Service (PTaaS) uncover security, privacy and compliance issues in the mobile apps you build and deploy. NowSecure Academy training helps upskill developers in secure coding practices. And NowSecure experts can conduct dedicated SDK pen testing to secure your mobile apps against supply-chain attacks. Reach out today to learn more about our offerings.