I have always found app T&Cs to be one of the most contentious parts of an app and one of the things that change most often prior to publishing. Someone tends to draft something and it subsequently gets changed in many subtle ways as it does the rounds in the company.
So how do you get your first draft? What if you are a ‘one man band’ and need something simple and suitable? Ali Muzaffar has a great blog post on the topic. There’s also a free policy generator by Nishant Srivastava based on Ali’s template.
I previously wrote about the new Android M permissions scheme and how, while more transparent permissions are good for everyone, they introduce a lot of complications. These complications are yet another example of something that developers usually know a lot about but that product managers and some designers won't know how to resolve. This leads to lots of trouble agreeing on how screens and messages should look and feel, especially when developers aren't included in the pre-development stages of a project.
There's discussion that even top apps might be avoiding the move to targeting Marshmallow so as to avoid the complications. I have one such client project where I made some small changes to fix problems on M devices, but I didn't have the time budget to go through the new permissions implications with my client, so the app stayed targeting Lollipop. Another client isn't interested in M at all until it sees greater uptake on devices. I suspect these scenarios are typical at the moment. However, most apps will one day have to 'bite the bullet'.
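For readers wondering what "staying targeting Lollipop" means in practice: it comes down to the `targetSdkVersion` in the module's build.gradle. Apps targeting API 22 or lower keep the old install-time permission model even when running on Marshmallow devices. A minimal sketch (the version numbers here are illustrative):

```groovy
android {
    compileSdkVersion 23

    defaultConfig {
        minSdkVersion 16
        // Keeping targetSdkVersion at 22 (Lollipop MR1) means the app keeps
        // the legacy install-time permission model, even on M devices.
        // Raising this to 23 opts the app into runtime permissions.
        targetSdkVersion 22
    }
}
```

Note this is a deferral, not a fix: users on M can still revoke permissions from Settings regardless of the target, in which case legacy apps get empty or stubbed results rather than a SecurityException.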
This week's Android Dev Summit, particularly day 1, had some great recommendations on how to handle the new permissions scheme. However, the real problems are with edge cases that can seriously bazooka your app. The first came up as an Android Dev Summit audience question: what happens if the user taps "Don't ask again" and then has to go to App… Settings to fix the problem? The Android team said that if this happens, you have lost the trust of your user and that's that. However, Manideep Polireddi has some thoughts on how you might recover from this situation. Another edge case is when the user taps "Reset app preferences", as explained on the CommonsBlog.
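Part of what makes the "Don't ask again" case so awkward is that Android never reports it directly; it has to be inferred from `shouldShowRequestPermissionRationale()` combined with a flag the app persists itself recording whether it has asked before. As a rough sketch of the decision table (plain Java, with the Android calls replaced by boolean parameters so the logic stands alone; the class and method names are my own, not from any of the sources above):

```java
public class PermissionFlow {

    public enum Action { PROCEED, SHOW_RATIONALE, REQUEST, DIRECT_TO_SETTINGS }

    /**
     * Decide what to do for a single dangerous permission. The booleans stand
     * in for Android calls: 'granted' for checkSelfPermission(),
     * 'showRationale' for shouldShowRequestPermissionRationale(), and
     * 'askedBefore' for a flag the app stores itself (e.g. in
     * SharedPreferences) after its first request.
     */
    public static Action decide(boolean granted, boolean showRationale,
                                boolean askedBefore) {
        if (granted) {
            return Action.PROCEED;            // permission held; nothing to do
        }
        if (showRationale) {
            return Action.SHOW_RATIONALE;     // denied once, no "Don't ask again":
                                              // explain, then re-request
        }
        if (askedBefore) {
            return Action.DIRECT_TO_SETTINGS; // denied with "Don't ask again": the
                                              // system dialog will never show again
        }
        return Action.REQUEST;                // never asked: just request
    }
}
```

The subtlety is that `shouldShowRequestPermissionRationale()` returns false both before the very first request and after "Don't ask again", which is why the app has to track `askedBefore` itself to tell the two apart. The only recovery from `DIRECT_TO_SETTINGS` is pointing the user at the app's settings screen (an Intent with `Settings.ACTION_APPLICATION_DETAILS_SETTINGS`), which is exactly the trust-losing step the Android team warned about.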
Stepping back, is all this worth it? What is this all about? There’s a very recent article on the Forrester blog where it says…
“Consumers are more willing than ever to walk away from your business if you fail to protect their data and privacy”
Some might say that this is something for older users and that younger millennials have a different attitude to privacy. However, Forrester's research shows this isn't true…
Forrester says that privacy concerns are very much alive and are set to be a competitive differentiator in years to come. For apps, this means that those that do it well will be rewarded with better user retention.
Engadget posted an interesting article today, 'BlackBerry reveals the lengths it went to make Android Secure'. This comes after BlackBerry CEO John Chen revealed that BlackBerry could quit the handset business next year if it doesn't sell enough handsets.
With the new Android Priv device, BlackBerry is trying to use its reputation as a secure device vendor to differentiate itself from the very many other Android vendors.
Security researchers will, no doubt, be eager to test the security claims and hence it remains to be seen if BlackBerry championing a secure Android handset is a wise or foolish endeavour.
This got me thinking about security as a differentiator. It might equally be applied to apps as well as handsets. Consumers are becoming ever wiser to privacy and security concerns, and in the right circumstances security might tip the balance in favour of the app you are creating. However, as with BlackBerry (and indeed Snapchat), it really depends on how secure your solution really is.
Security is becoming more and more important. What with the latest SSL vulnerabilities, NSA/Snowden/GCHQ, user privacy concerns and more sophisticated malware, mobile app developers continually need to put more effort into app security. There's a particular class of apps, for example banking and payment, that must be as secure as possible. I recently came across a great white paper, Secure Development Process (pdf), by Penrillian that nicely defines these 'secure projects' as…
“Projects where someone could get significant benefit illegitimately from a security weakness in the deliverables”
If you are developing such an app then you would do well to take a deep look at Penrillian's recommendations.
I suspect as mobile becomes ever more pervasive, some of these process areas might become standard for a greater proportion of apps and not just ‘secure apps’.
Techcrunch has an interesting article that discusses app information privacy within the context of user information being ‘sold’ as part of a business model that is the hidden cost of free. The argument is that this information is often required to allow service providers to provide targeted ads. However, I see this as only one part of the problem.
What isn't mentioned is the use of data to improve an app. Yes, 40 percent of apps collect information that isn't actually needed for the app to work, but in some cases this data is also used to analyse and improve the application. The most successful applications, including Google's, have depended on extensive data collection to anonymously identify and classify users, work out what is and isn't successful in the app, and all this within specific use contexts. Explaining this to the average user is difficult; they are bound not to fully trust developers' intentions, and will opt out if they can. From a developer's viewpoint, the problem with any scheme that gives the user more choice is that it's likely to cut off opportunities to use such data to improve an app.