Samsung Keyboard Vulnerability Learnings

There has been a lot of press about the recent Samsung keyboard vulnerability. The vulnerability comes about because new language packs are downloaded in a privileged context over a connection that can be hijacked on the network to run arbitrary code.

Many articles covering this vulnerability are over-sensational, because it’s very unlikely that a user would download a new language pack AND be on a hijacked network exploiting the vulnerability at the same time. However, I think it’s more useful to pull the vulnerability apart and look for simple learnings to apply to other existing and new apps.
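The general lesson is that anything an app downloads and then installs or executes at runtime needs its integrity verified, not just transport over the network. Here is a minimal plain-Java sketch (the class and method names are illustrative, not Samsung’s code) of checking a downloaded pack against a SHA-256 digest obtained separately from the payload:

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class UpdateVerifier {
    /**
     * Returns true only if the downloaded payload matches the expected
     * SHA-256 digest. The expected digest should be shipped with the app
     * or fetched over an authenticated channel -- never delivered
     * alongside the payload itself, where an attacker controls both.
     */
    static boolean verify(byte[] payload, byte[] expectedSha256) {
        try {
            byte[] actual = MessageDigest.getInstance("SHA-256").digest(payload);
            // MessageDigest.isEqual performs a constant-time comparison.
            return MessageDigest.isEqual(actual, expectedSha256);
        } catch (NoSuchAlgorithmException e) {
            throw new AssertionError(e); // SHA-256 is mandatory on every JVM
        }
    }
}
```

Only if `verify` returns true should the download be handed to any privileged installation step.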

There are three main problems…

New Android ‘enjarify’ Decompile Tool

It’s very easy to reverse engineer most Android apps using dex2jar, JEB or Dare, and there are even online tools that can reverse engineer an app without you having to install anything.

Each tool has its own limitations, and those limitations are often exploited by obfuscation tools to make reverse engineering more difficult. However, to make things even easier, Google has ‘unofficially’ released (which presumably means there’s no support) a new tool, enjarify, that claims to resolve many of dex2jar’s limitations, such as support for Unicode class names, constants used as multiple types, implicit casts, exception handlers jumping into normal control flow, classes that reference too many constants, very long methods, exception handlers after a catch-all handler, and static initial values of the wrong type.

I can’t see why Google would want to release such a tool, other than it being the result of a Googler’s 20% ‘free’ time. It will probably encourage more copied apps, IP theft and circumvention of in-app purchasing.

However, it does serve to emphasise how much more sophisticated and easier decompilation has become over time. You shouldn’t rely on decompilation being difficult, nor assume that what might have protected your app in the past will protect it now or in the future.

Protecting Android Java Source Code

A common question I get from clients at the moment is “How do I protect the (Java) source code?” of a shipping app. The short answer is that you can’t. No matter what you do, a very determined hacker can recover something that resembles your code. However, you can make it much more difficult. I have written a lot about obfuscation and re-packaging on my Android Security site. You might also like to read about using the NDK and tamper prevention, as it’s also possible to recover the code from optimised dex/oat files and even from memory.
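A sensible first step for most projects is simply enabling ProGuard in the release build, which both shrinks and obfuscates the Java bytecode. A typical Gradle configuration looks like this (a sketch using the standard Android build-tools names; the rules file contents will be specific to your own project):

```groovy
// build.gradle (app module) -- release builds are shrunk and obfuscated
android {
    buildTypes {
        release {
            minifyEnabled true
            proguardFiles getDefaultProguardFile('proguard-android.txt'),
                          'proguard-rules.pro'
        }
    }
}
```

ProGuard renames classes, methods and fields to meaningless identifiers and strips unused code, which raises the bar considerably but, as noted above, does not make decompilation impossible.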

The thing with this and many other aspects of mobile development (e.g. UI design, testing) is that the chosen strategy should usually depend on the actual project. Some developers tend to be dogmatic, mandating ‘this’ or ‘that’ approach without listening, questioning or assessing. There are many ways to do things, and some might be better than others for a particular project, or might indeed not be worth doing at all.

iOS vs Android: Which is more secure?

Yesterday I posted about company app, platform and device preferences, where Good Technology identified that iOS remains the platform most used by enterprises (companies). One of the reasons for this is that iOS is perceived to be more secure than Android. But is this true?

About a year ago I posted how Marble Labs found that iOS and Android were equally vulnerable to attacks. More recently, Adam Ely of Bluebox published a post on the Bluebox blog asking ‘iOS vs Android: Which is More Secure?’. He explained that while iOS might be perceived to be more secure, it has actually had more vulnerabilities. He also talked about the Android and iOS sandboxes and correctly concluded that…

“With jailbroken devices, counterfeit devices and vulnerabilities, we have to assume in many cases, especially in BYOD environments, that the underlying operating system will be breached just as we assume with the operating systems on our laptops and servers”

The solution to the security problem doesn’t come from answering the question which platform is more secure. As I mentioned last month, you will never have complete app security. We can’t trust any end point and have to instead concentrate on protecting the sensitive data appropriately and as best we can.

New Mobile Security Wiki

If you are interested in mobile security you should take a look at the Mobile Security Wiki. It provides details of forensics, development, static analysis, dynamic analysis and reverse engineering tools, as well as obfuscators, testing distributions and example apps. It also references libraries, best practices, books, papers and presentations.

You will Never Have Complete App Security

When I speak with clients, there always seems to be the impression, on their part, that things are either secure or not secure. Unfortunately, whether it’s desktops, laptops, servers or smartphones, the principle is the same: you will never have complete application security.

It will always be possible to fool users into installing things or doing things they shouldn’t. There will always be vulnerabilities that allow root and hence allow, for example, memory dumps of decrypted data. There will probably always be NSA backdoors and the possibility of eavesdropping on radio frequency (RF) noise. There will always be some users who root their devices, making things considerably easier for attackers.

This doesn’t mean you should give up and not consider security at all. For all apps, simple safeguards, for example, keeping data in the Android sandbox, provide basic protection with negligible extra effort. At the other end of the scale there’s a class of apps, for example banking and payment, that needs to make it algorithmically time consuming (via encryption) or extremely technically difficult (via tamper protection) for attackers to read sensitive data. You will never have complete application security but you can have high security that, for all normal intents and purposes, will keep your sensitive data safe.
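As a concrete example of the ‘algorithmically time consuming’ end of the scale, here is a minimal plain-Java sketch of encrypting sensitive data with AES-GCM before it is written to storage. It is illustrative only: on a real device the key should come from somewhere better than application code, and key management is the genuinely hard part.

```java
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.security.GeneralSecurityException;
import java.security.SecureRandom;

public class SensitiveStore {
    private static final int GCM_TAG_BITS = 128; // authentication tag length
    private static final int IV_BYTES = 12;      // recommended GCM nonce size

    /** Encrypts plaintext with AES-GCM; the random IV is prepended to the output. */
    static byte[] encrypt(SecretKey key, byte[] plaintext) {
        try {
            byte[] iv = new byte[IV_BYTES];
            new SecureRandom().nextBytes(iv); // fresh IV per message, never reused
            Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
            c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
            byte[] ct = c.doFinal(plaintext);
            byte[] out = new byte[iv.length + ct.length];
            System.arraycopy(iv, 0, out, 0, iv.length);
            System.arraycopy(ct, 0, out, iv.length, ct.length);
            return out;
        } catch (GeneralSecurityException e) {
            throw new RuntimeException(e);
        }
    }

    /** Decrypts a blob produced by encrypt(); throws if the data was tampered with. */
    static byte[] decrypt(SecretKey key, byte[] blob) {
        try {
            Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
            c.init(Cipher.DECRYPT_MODE, key,
                   new GCMParameterSpec(GCM_TAG_BITS, blob, 0, IV_BYTES));
            // GCM authenticates the ciphertext: tampering raises AEADBadTagException.
            return c.doFinal(blob, IV_BYTES, blob.length - IV_BYTES);
        } catch (GeneralSecurityException e) {
            throw new RuntimeException(e);
        }
    }
}
```

Because GCM is authenticated encryption, an attacker who can read or modify the stored blob can neither recover the plaintext nor alter it undetected without the key.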

Android App Hacking Getting Easier

In my post with my thoughts on Google’s Android Security 2014 Year in Review, I mentioned that security isn’t only about potentially harmful applications (PHAs) being installed. It’s also about the ability to easily obtain information from stolen devices and reverse engineer apps.

Today I came across AppUse, a tool from AppSec Labs that enables easy offline reverse engineering of apps. It brings together some well-known command line tools used to reverse engineer APKs with a hooked ROM that allows access to things (e.g. files, communication, databases, encryption) you can’t normally see externally. This is all wrapped in an easy-to-use windowed UI. The tool will mainly be used for malware analysis and penetration testing. However, it’s obviously also possible to use it for nefarious purposes.

If your app contains intellectual property, might be copied, or needs to be particularly secure (e.g. banking, payment, enterprise), you will want to look into obfuscation/packing and tamper detection.

Thoughts on Google’s Android Security 2014 Year in Review

I have finally got round to reading Google’s ‘Android Security 2014 Year in Review’ (pdf). I believe it’s mainly a public relations exercise to assure everyone that Android is safe and that Google is being proactive in improving security. Having read the report, it’s easy to come away with the impression that everything’s OK. However, there are a few places in the report where I thought “Yes, but…”.

First, an obvious one. Google say they “provided device manufacturers with ongoing support for fixing security vulnerabilities in devices, including development of 79 security patches”. However, what they don’t say is that very few of those patches made their way onto consumers’ devices.

Google say “Fewer than 0.15% of devices that download only from Google Play had a Potentially Harmful Application (PHA) installed”. This doesn’t sound like many, and as an end user you will probably be comforted by such a statistic. However, what if you are a company with, say, 1,000 employees? Statistically, at least one of them might be leaking company information. What if you are a bank with millions of customers using a banking app? If your app doesn’t adequately secure data then a very large number of people could be affected. I think what this means for developers is that even though there’s a low chance of infection, apps should still take exceptional steps to protect their own sensitive data and not rely solely on the platform being secure most of the time. The fact that Android is “secure most of the time” is only of significance to end users.
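A quick back-of-the-envelope calculation makes the fleet-size point concrete. Assuming (simplistically) that each device is independently affected with probability 0.15%, the chance that at least one device in a fleet has a PHA grows quickly with fleet size:

```java
public class PhaOdds {
    /**
     * Probability that at least one of n devices has a PHA, assuming each
     * device is independently affected with probability p.
     */
    static double atLeastOne(double p, int n) {
        return 1.0 - Math.pow(1.0 - p, n);
    }

    public static void main(String[] args) {
        double p = 0.0015; // Google's "fewer than 0.15%" figure, at face value
        System.out.printf("1,000 devices: %.0f%%%n", 100 * atLeastOne(p, 1_000));
        // With a thousand devices the odds of at least one affected device
        // are already roughly three in four.
    }
}
```

The independence assumption is crude, but it shows why a per-device statistic that reassures an individual user can still be alarming at organisational scale.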


There’s lots of emphasis on Google’s Verify Apps, which checks apps at install time. This won’t catch everything. Attackers are getting good at installing skeleton apps that later download extra functionality once Verify Apps has stopped looking.

Also remember, security isn’t only about PHAs being installed. It’s also about the ability to easily obtain information from stolen devices, reverse engineer apps and carry out other nefarious activities without ever installing an app for Google’s Verify Apps to see.