Security of Mobile/Tablet Devices
Commoditization of security
A lot has changed since the introduction of the first smart phones in the early 2000s. We now integrate many of our everyday needs into these singular devices. Our communications, our calendars, our photos, all of our social interactions are commonly captured within, or managed through, these systems. Increasingly we are adding to these personal aspects more business and commercial functions: payment, ticketing, identity documents, even using them for store management, kiosk functions, or as the human-machine interface for a plethora of other devices and systems.
In this way, the systems that manage and interface with our data have been centralized, commoditized even, into these mobile devices. Devices that were not originally intended to serve such purposes.
Is this a good thing or a bad thing?
Through a series of blogs, we will outline how these systems can be used, how they can be secured, and what unique aspects they have to which it is important to pay special attention. In this particular blog, we discuss the security aspects of Android-based mobile devices specifically. We trust so much of our information to them, but how can we be sure that this information is acceptably protected? As a developer or deployer of applications or systems that use applications, what are the ways in which the security of these systems can be compromised, and how can you best protect against such compromises?
It’s not as bad as you think
The good news is that many of the mobile/tablet systems being used today can be considered some of the most secure consumer-grade devices that have ever been available. They integrate many features specifically designed to protect data and applications which are run on them – mandatory access controls, address space randomization, trusted execution environments and embedded secure elements, pointer authentication controls and no-execute bits … the list goes on.
Therefore, although there is often a lot of commentary about the latest vulnerabilities or application issues for mobile/tablet devices, much of this is interpreted, at best poorly, to indicate that all such devices are insecure – and this is just not correct. ‘Security’ as a concept requires an understanding of what you are trying to protect and what level of protection is required. For the vast majority of users, the mobile and tablet devices they use can be considered ‘secure’ for their purposes – if they’re kept up to date with patches, and if the controls they provide for security use are correctly implemented.
Gartner recently released a document comparing many of the major operating systems in terms of security, and in fact through this their finding was that an up-to-date and patched version of Android is one of the most secure systems you could use (the title of most secure went to Chrome OS). There are certainly points of argument to be had with the methodology used here – it does not consider some of the architectural aspects such as pointer authentication used in more recent iOS systems, and the scope is very specific on the Android side (covering only implementations on Google Pixel devices) – but it does speak to the fact that mobile security has indeed come a long way.
This blog is not intended to cover all of the details of the controls implemented in each mobile operating system – if you are interested, Google has a great document for Android and Apple something similar for their own iOS devices – but instead to look at the implementation details of these controls, what can go wrong, and how to prevent this as a developer or user of such applications.
OK, maybe it is as bad as you think
Unfortunately, implementation errors and a lack of understanding of the controls that these platforms do provide are common. These issues can be exacerbated by the number of un-updated systems that exist in the field – it’s common for Android devices not produced directly by Google (that is, devices other than the Pixel line of phones) to see delays in patches or OS updates. This has become better over time, with Google addressing these issues directly through ‘Project Treble’ in Android 8, the continued split of important functions into components that Google can update directly (through projects like ‘Mainline’), and the creation of new ways to update components that were (for various reasons) remaining unpatched. Google now specifically calls out devices that are implemented to receive more regular updates in its Android One and Android Enterprise Recommended programs.
However, it is simply correct to say that many mobile / tablet systems have a continuing issue with timely updates. The problem with this is that these systems represent a very large and very interesting target for offensive security research and development. As we continue to develop the very science of security, as we continue to discover entirely new types of vulnerabilities, it’s impossible for any system to be considered ‘secure’ for an extended period of time without also ensuring that it is patched against these new types of attacks.
Beyond the Operating Systems themselves, these systems are perhaps more easily defined by the applications that are run on them. Although some of the initial plans for smartphones famously did not include application stores, they are an essential part of any such device today. This has led to common platforms boasting millions of applications, each of which is a potentially hostile program waiting to be executed on a user’s device.
Default applications/excessive permissions
One of the primary security issues with Android, perhaps, is the number and type of applications that are loaded by default onto some mobile systems, so-called ‘bloatware’ that may have increased permissions or security issues. This is not so much an issue with Pixel devices purchased directly from Google, or iOS devices where the OS is tightly controlled by a single product vendor, but it definitely is an issue for many of the Android devices sold around the world (with that issue being more pronounced in some geographies than others).
It is these default applications – applications over whose presence on the device the user has no real say – which often expose the greatest amount of risk. Beyond applications, each mobile system vendor and mobile network operator may manipulate the underlying operating system to a greater or lesser extent: adding their own application store, their own GUI tweaks, their own applications and drivers. Any of these can result in security issues as well.
Many default applications are provided with ‘system’ privileges, or execute under the user-ID of the platform provider, which gives them additional functions that are not usually exposed or available to applications loaded by the user. This means that any security issue that is apparent in these applications can lead to a greater level of compromise over the device.
Vendors and distributors of mobile systems are strongly encouraged not to modify the permissions of default Android applications, and to keep to a minimum any additional applications loaded on the system by default. Also, ensure that any additional applications or drivers can be updated in a secure way.
Device configuration and legacy issues
Modern operating systems are complex, and the very breadth of their security implementations can often make the creation of a custom version or installation of that Operating System even more complex. When a vendor ships a new Android device, it has many different hardware interfaces which must be abstracted by the Hardware Abstraction Layer (HAL), there are many different UI elements that the vendor may want to tweak, and there are new features which the vendor may want to add to differentiate their solution.
All of this customization can often run afoul of the security features that underlie the Android Operating System, and this can lead to disabling or modification of features which otherwise would be left in a more secure state. Vendors of Android platforms should be very careful to reenable any such disabled features, or remove debugging and other functionality, prior to shipping a completed system.
Technical debt / historic APIs
An issue faced by all computing systems, by all implementations of computing systems, is one of technical debt. Because security is a moving target, and because we continually advance the security of our operating systems and computing platforms, applications and the interfaces they use tend to ‘age’. It’s not feasible to ‘turn off’ older APIs or applications simply because they have been found to provide or facilitate insecurities in the platform – doing so would instantly render all existing systems using them inoperative – so there must be some lag between the enablement of increased security and the disablement of older systems that lack this level of security.
This leads to technical debt. We see this in historic support for older applications in desktop operating systems as well, but it’s perhaps more pronounced in mobile systems, entirely because their security levels continue to advance so rapidly. For example, in each new version of a mobile operating system it is common for security features to be added and for restrictions to be placed on the use of what were previously considered to be overly permissive APIs – but there remains a need to support older applications as well.
At the time of writing, Google requires that any application submitted to the Play Store target API level 28, which corresponds to Android 9. However, lower API levels may also be supported, and can be set as minimums if the application is loaded through other means (such as ‘side loading’, or use of a third-party app store). This can allow applications to bypass security features otherwise baked into the Operating System – for example, loading an Android application from outside of the Google Play Store can allow for the use of deprecated accessibility APIs that capture user touch events or provide other unacceptable levels of access that have previously been prevented.
As each version of Operating System becomes more secure, so too do the minimum requirements for the devices that ship with that version of Operating System … sort of. Some of the minimum requirements apply only to those devices that go through their initial Android Compatibility testing with that version; any devices which are upgraded to a new Operating System version after this initial testing may not implement or meet the minimum requirements.
This makes sense, of course, as you don’t want to limit the roll out of the majority of the new features provided by an Operating System upgrade due to some small limitation on existing hardware. However, it does also mean that as an application developer you cannot necessarily rely on a one-to-one mapping of Operating System version to minimum hardware requirements, and, as some hardware features are mandatory even for upgrades, not every device will be able to be updated with a new OS.
Provisioning of user applications
Of course, as noted, technical debt is a problem for general purpose computers as well – perhaps more so, as they have a longer history of programs and systems that they must support. That history is one reason the ‘store’ model of distribution commonly used in the mobile arena has seen little success on the desktop.
The ability to easily, and securely, access a huge range of applications through the OS ‘store’ is definitely one of the greatest benefits of mobile operating systems. It is also a challenge when we attempt to use such applications to replace otherwise bespoke systems, primarily because each ‘app’ always starts out the same when first downloaded. There is then usually a need to personalize that app with data specific to the phone and/or user, usually obtained through a backend host.
This personalization stage is problematic because it is very difficult, perhaps impossible, to authenticate an application as genuine at its very first execution and connection to backend systems. Common solutions tend to use embedded secrets (protected using ‘whitebox’ cryptography) or remote ‘attestation’ systems to authenticate an application instance.
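As a sketch of the embedded-secret approach, the hypothetical helper below computes an HMAC over a device identifier and a server-supplied nonce so a backend can check the request came from a genuine application instance. All names are illustrative, and a real deployment would protect the secret with whitebox techniques rather than storing it in the clear:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical first-run authentication sketch. The embedded secret here is
// a plain constant purely for illustration; in production it would be hidden
// with whitebox cryptography or replaced by a remote attestation flow.
public class ProvisioningAuth {
    private static final byte[] EMBEDDED_SECRET =
            "demo-embedded-secret".getBytes(StandardCharsets.UTF_8);

    // HMAC-SHA256 over the device identifier and a server nonce; the nonce
    // prevents a captured tag from being replayed for a new provisioning.
    public static String authTag(String deviceId, String serverNonce) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(EMBEDDED_SECRET, "HmacSHA256"));
        mac.update(deviceId.getBytes(StandardCharsets.UTF_8));
        mac.update(serverNonce.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(mac.doFinal());
    }
}
```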
Storing data on the phone
Once a device is ‘provisioned’ and being used, it is implicit that there must be some unique data stored by that application. This may be unique identifiers, more complex data structures, or unique application specific cryptographic keys which must be generated and stored on the device. There are many options for the secure storage of these values on a mobile device, from simple file-system level permissions through to protection in separate hardware components on the device.
The simplest access control method uses the file system of the device. Here, care must be taken to ensure that the correct access permissions are applied – setting files as ‘app-specific’ so that they are not world-readable. It is important to note that even when this is done, applications signed with the same developer key and sharing the same user ID execute as the same user, and will therefore be able to access each other’s files even if permissions are limited.
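A minimal illustration of the idea, assuming a POSIX file system (on Android itself, files written under `Context.getFilesDir()` are app-private by default, so you would rarely set permissions by hand):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

// Sketch of owner-only file permissions on a plain JVM. The Android
// equivalent is simply writing under the app-private files directory;
// this shows the same intent with explicit POSIX permissions.
public class PrivateFileDemo {
    public static Path writePrivate(Path dir, String name, byte[] data) throws Exception {
        Path file = dir.resolve(name);
        Files.write(file, data);
        // rw------- : readable and writable by the owning user only.
        Set<PosixFilePermission> ownerOnly = PosixFilePermissions.fromString("rw-------");
        Files.setPosixFilePermissions(file, ownerOnly);
        return file;
    }
}
```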
From Android 7, ‘File Based Encryption’ (FBE) has been available. This encrypts files in storage based on user credentials, so that if the device is a multi-user device, or has multiple user profiles, access to applications and their files can be limited to a specific account or user.
Files located on external storage may be easily extracted and read. More recent Android versions allow for ‘adoption’ of removable media into the on-board storage, and utilize encryption to secure this media. However, ‘adopted’ removable media has only worked with FBE since Android 9, and for this reason (as well as others, such as reliability and speed) the use of external storage for secure files is generally not recommended.
Indeed, it would generally not be considered sufficient to rely solely on file system security to protect sensitive information in an application, and as such the use of additional protection mechanisms – most often application-based encryption – is strongly recommended. However, this type of protection presents a quandary: although the encryption itself can be relied upon to prevent direct ‘reading’ of the file contents, the security provided is only as strong as the protection applied to the cryptographic key used.
Fortunately, Android natively provides methods for the generation and use of keys for file encryption, such that applications do not have to directly access or use the keys themselves.
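To illustrate what such application-based encryption looks like, here is a minimal AES-256-GCM sketch in plain Java. On Android, the SecretKey would typically be generated in the AndroidKeyStore provider so the raw key material never enters application memory; here a throwaway key is generated so the example runs on an ordinary JVM, and class and method names are illustrative:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.security.SecureRandom;

// Minimal authenticated-encryption sketch: AES-GCM with a fresh random IV
// per message, with the IV prepended to the ciphertext blob for storage.
public class AppEncryption {
    private static final int GCM_TAG_BITS = 128;
    private static final int IV_BYTES = 12;

    public static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv); // never reuse an IV with the same key
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        byte[] ct = cipher.doFinal(plaintext);
        byte[] out = new byte[IV_BYTES + ct.length];
        System.arraycopy(iv, 0, out, 0, IV_BYTES);
        System.arraycopy(ct, 0, out, IV_BYTES, ct.length);
        return out;
    }

    public static byte[] decrypt(SecretKey key, byte[] blob) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key,
                new GCMParameterSpec(GCM_TAG_BITS, blob, 0, IV_BYTES));
        return cipher.doFinal(blob, IV_BYTES, blob.length - IV_BYTES);
    }

    public static SecretKey newKey() throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        return kg.generateKey();
    }
}
```

GCM is used here (rather than, say, CBC) because it authenticates as well as encrypts, so tampering with the stored blob is detected at decryption time.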
Key storage and security
Therefore, the use of encryption to secure data can be thought of as deferring the complexity of protection from the file(s) or data elements themselves to the key(s). The good news is that because cryptographic keys are much smaller than most files (often much less than 1KB), they are somewhat easier to store securely. Android provides a number of specific storage methods for protecting cryptographic keys, which applications may use to directly encrypt data for both storage and transport.
The storage method you use will depend on the level of security you want, balanced against considerations of interoperability. Software-only methods, such as whitebox cryptography or software-based TEEs, are the most widely deployable across the population of devices, but provide only a baseline level of security compared to other options. There are many academic papers discussing attacks (and mitigations of those attacks) on whitebox cryptographic systems, and attacks such as differential computational side channels and fault injection are often possible.
Hardware TEEs and software-based enclaves that use hardware features to separate out memory regions (ideally using encryption as well as other memory isolation features) provide an increased level of segregation by ensuring that operations and data manipulated within those environments are protected from access by systems executing in the Rich Execution Environment. However, side channel and remote code execution attacks remain possible on these systems as well.
Because of these types of attacks, storage of cryptographic keys is often best implemented in completely separate silicon – using a discrete ‘chip’ such as an embedded Secure Element or Secure Enclave processor. Devices running Android 9 and above support the use of such an embedded hardware storage system, referred to as ‘StrongBox’, and the requirement for Common Criteria approval to EAL 5+ on such internal security chips provides a very high level of assurance in the security provided.
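As a plain-JVM stand-in for these hardware-backed stores, the hypothetical sketch below keeps a symmetric key in a password-protected PKCS12 keystore file. On Android you would instead use the ‘AndroidKeyStore’ provider and, on Android 9+ hardware that supports it, request StrongBox backing via `KeyGenParameterSpec.Builder.setIsStrongBoxBacked(true)`; the pattern of "ask the keystore, never hold raw key bytes yourself" is the same:

```java
import javax.crypto.SecretKey;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.security.KeyStore;

// Sketch: keep a symmetric key inside a keystore rather than a plain file.
// The PKCS12 file here is a software stand-in for hardware-backed storage.
public class KeyVault {
    public static void storeKey(File file, char[] password, String alias, SecretKey key)
            throws Exception {
        KeyStore ks = KeyStore.getInstance("PKCS12");
        ks.load(null, null); // start with an empty keystore
        ks.setEntry(alias, new KeyStore.SecretKeyEntry(key),
                new KeyStore.PasswordProtection(password));
        try (OutputStream out = new FileOutputStream(file)) {
            ks.store(out, password);
        }
    }

    public static SecretKey loadKey(File file, char[] password, String alias) throws Exception {
        KeyStore ks = KeyStore.getInstance("PKCS12");
        try (InputStream in = new FileInputStream(file)) {
            ks.load(in, password);
        }
        return (SecretKey) ks.getKey(alias, password);
    }
}
```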
Of course, before you store a key, you must have a key to store! Generating, entering, decrypting, or agreeing upon keys in mobile systems can have its own challenges – some standards (such as PCI CPoC) require at least sourcing entropy seed data from external inputs, such as a backend HSM, attached dongle, or various user inputs and device sensors to help with the generation of random data.
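One simple way to mix externally sourced seed material into local key generation is to hash the external seed together with platform randomness, so the result is unpredictable even if one source turns out to be weak. A minimal sketch (the helper name is illustrative):

```java
import java.security.MessageDigest;
import java.security.SecureRandom;

// Sketch of combining backend/sensor-sourced seed data with the platform
// RNG, in the spirit of requirements like those in PCI CPoC. The SHA-256
// hash of both inputs yields 32 bytes of combined seed material.
public class SeedMixer {
    public static byte[] mixedSeed(byte[] externalSeed) throws Exception {
        byte[] local = new byte[32];
        new SecureRandom().nextBytes(local);
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        md.update(externalSeed); // entropy from the backend HSM / sensors
        md.update(local);        // entropy from the platform RNG
        return md.digest();
    }
}
```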
Such sensors can have their own issues, however …
Exploiting sensor data
In the previous section we mentioned side channel attacks as they apply to the extraction or determination of stored secrets such as cryptographic keys. However, side channel attacks can also be applied to recover data as it’s entered into the device – through monitoring of the light sensor, the accelerometer and gyroscope, even the Wi-Fi signal.
Protecting against this sort of leakage during operation is difficult without increasing protections from the Operating System to disable background sensor use. However, it is important to understand that such attacks are possible so you can build these into your threat models as you develop solutions.
Securing the connection
One of the easiest things you can do to secure your application is to ensure that the connection it establishes with backend systems is protected using TLS. Android makes this relatively easy, but you have to remember that TLS is a suite of protocols and connection methods, and not all of these are going to implement the security you want. Where the authenticity of your backend is important, always verify the certificate provided by that backend (in a process called ‘certificate pinning’) to make sure you are not just talking to any TLS host – you’re talking to the host you actually want to talk to!
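The core of a pin check can be sketched as follows: hash the server certificate’s SubjectPublicKeyInfo and compare it against a pin shipped with the app. On Android you would more commonly declare pins in the Network Security Config or via a library such as OkHttp’s CertificatePinner, but the underlying logic looks like this (the helper operates on the encoded public key bytes, i.e. `cert.getPublicKey().getEncoded()` from a real `X509Certificate`):

```java
import java.security.MessageDigest;
import java.util.Base64;

// Sketch of a public-key pin check. Pinning the SubjectPublicKeyInfo hash
// (rather than the whole certificate) survives routine certificate
// renewals that keep the same key pair.
public class CertPinner {
    public static String spkiPin(byte[] spki) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(spki);
        return "sha256/" + Base64.getEncoder().encodeToString(digest);
    }

    // MessageDigest.isEqual gives a time-constant comparison.
    public static boolean pinMatches(byte[] spki, String expectedPin) throws Exception {
        return MessageDigest.isEqual(spkiPin(spki).getBytes(), expectedPin.getBytes());
    }
}
```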
Just as important as the certificate you accept is the ‘ciphersuite’ you use to secure the connection once it’s established. This is chosen based on the common implementations supported between the server and client application (on the mobile device), highlighting that it’s not ‘just’ the mobile application that impacts the security of any implementation – how you configure any backend systems is also vitally important.
For high security implementations, you may want to consider additional encryption of sensitive data beyond the protections provided by the TLS connection itself. This can help isolate that data from exposure in backend systems, where TLS connections would normally terminate at the network boundary, as well as implementing general ‘defense in depth’ security to reduce the risk of exposure if there is an issue caused by exposure of certificate data, or a new vulnerability found in TLS.
Privacy versus attestation systems
As we increase the use of COTS systems in different verticals and with different types of sensitive information, “attestation systems” are becoming more and more popular. These systems collect data from a platform and transfer it to a remote processing environment for analysis – allowing for this data to be compared to similar data from many other systems, or cryptographically verified against a securely held private key.
Google implements its own attestation system known as ‘SafetyNet’. This helps provide information on the underlying platform, and on the app itself, in regards to whether it is considered high risk or not. However, Google specifically calls out that this system should not be used as the sole protection or attestation method for an app – it’s a component of a solution you need to build up yourself.
SafetyNet can also allow for ‘key attestation’ back to Google services. This can be used both to validate the security of keys generated and stored on that device, and to confirm that the phone is a genuine Android device (not an emulator) with an intact chain of trust for the boot process. Very recently, Google seems to be implementing updates to this which may make it more useful for detecting ‘rooted’ phones – which is a good thing for security.
What is good security?
Security is a relative term. What may be considered ‘secure’ for one use may be insecure for another use case. When creating Android applications, Google has some good guidance on things to consider in relation to general security. If you are creating applications that have more stringent security requirements than that of a ‘normal’ application, you may want to consider going beyond this guidance as well.
- Don’t make files world readable
- Don’t use default / common cryptographic keys stored in your application for on-going security
- Do use FBE where possible
- Do use application based encryption for sensitive data
  - Tie this to time and authentication based keys for assets that are only required temporarily (such as keys directly managed by the application)
- Don’t manage keys directly in the application where possible
- Do remember that any other application signed with your developer keys shares the same user ID
- Do use TLS with good ciphersuites and cert pinning
  - But don’t pin to silly cert data!
  - And don’t rely on TLS only in high security implementations
- Do secure your app-store interface and credentials
- Do use SafetyNet and key attestation, as a part of a more complex application security and attestation implementation
- Do plan for OS obsolescence and don’t assume features based solely on OS versions
- Do target the latest APIs
- Do use program obfuscation
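As one sketch of the ‘time and authentication based keys’ item above, a short-lived working key can be derived on demand from a long-term key plus the current date and a user-authentication token, so it naturally expires and never needs to be stored. The names and the HMAC-based derivation here are illustrative assumptions, not a prescribed scheme:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch: derive a daily working key from a long-term key,
// the current date, and an authentication token. The derived key is
// recomputed whenever needed and discarded afterwards.
public class EphemeralKeys {
    public static byte[] deriveDailyKey(byte[] longTermKey, String isoDate, String authToken)
            throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(longTermKey, "HmacSHA256"));
        mac.update(isoDate.getBytes(StandardCharsets.UTF_8));   // e.g. "2021-03-15"
        mac.update(authToken.getBytes(StandardCharsets.UTF_8)); // from user authentication
        return mac.doFinal(); // 32-byte derived key
    }
}
```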
Over this series of blogs, we’ll be going into more depth on some of these topics, what the issues are specifically for various implementations and standards, and revealing some new types of vulnerabilities we have discovered in our ongoing research to help make the (mobile) world a safer place.
The views and opinions expressed in this article do not necessarily represent the official position of UL.