There’s an interesting recent article at Technology Review (published by MIT) on How Data Storage Cripples Mobile Apps. It describes how the performance of apps can be limited by the speed of built-in or removable storage rather than by processor or network speed.
In most cases storage throughput is specified and measured for sequential reads/writes, while app access patterns tend to be more random, which can result in performance that is orders of magnitude worse. As the article says, "design choices that app developers make can mitigate the data storage bottleneck". So what are these choices? Here are a few thoughts from apps I have previously worked on…
- Think about caching database and file data in memory.
- It’s often more efficient to load large files via a few large reads rather than many small ones.
- Consider what’s really going on underneath. For example, on Android, shared preferences are saved to an XML file. Reading and writing lots of small shared preferences can be slow. Consider aggregating them and/or caching settings in memory.
- Design for passing information between screens rather than having each screen re-load itself.
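To illustrate the large-reads point, here’s a minimal plain-Java sketch (the class name and buffer size are my own choices, not from the article): it reads a file in 64 KB chunks, so the number of read calls, and the underlying storage operations, stays small even for a large file.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class LargeReads {
    // Read a whole file using large chunked reads rather than many tiny ones.
    static byte[] readInChunks(Path file) throws IOException {
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[64 * 1024]; // 64 KB per read call
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".bin");
        byte[] data = new byte[200_000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        Files.write(tmp, data);
        byte[] back = readInChunks(tmp);
        System.out.println(back.length);                          // prints 200000
        System.out.println(java.util.Arrays.equals(data, back));  // prints true
        Files.delete(tmp);
    }
}
```

A 200 KB file costs only a handful of read calls this way, versus 200,000 if read byte-by-byte.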
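On the settings point, here’s a plain-Java sketch of the aggregation idea. The `SettingsCache` class is hypothetical, and an in-memory map stands in for the slow persistent layer (on Android, the XML file behind shared preferences): individual writes go to memory and a single flush pushes them all out at once.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: batch many small setting updates into one persistent write.
public class SettingsCache {
    private final Map<String, String> store;            // slow persistent layer (simulated)
    private final Map<String, String> pending = new HashMap<>();
    private int writeCount = 0;                          // how many flushes hit "disk"

    public SettingsCache(Map<String, String> store) {
        this.store = store;
    }

    // Cheap: only touches memory.
    public void put(String key, String value) {
        pending.put(key, value);
    }

    // Read-through: prefer the unflushed value, fall back to the store.
    public String get(String key) {
        String v = pending.get(key);
        return v != null ? v : store.get(key);
    }

    // One aggregated write instead of one per setting.
    public void flush() {
        if (pending.isEmpty()) return;
        store.putAll(pending);
        pending.clear();
        writeCount++;
    }

    public int writes() {
        return writeCount;
    }
}
```

Ten settings changed on one screen then cost one write at flush time, not ten.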
All this will not only reduce how long the user waits for long-running tasks but also make for smoother display of information. Fewer database or file accesses also result in simpler code (fewer asynchronous cases), which in some ways balances the extra complexity of having to cache information.
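As a minimal sketch of the caching idea, here’s a bounded LRU cache for database rows or file contents, built on `java.util.LinkedHashMap` so it can’t grow without limit. On Android, `android.util.LruCache` offers a ready-made equivalent.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache: holds at most maxEntries items, evicting the
// least-recently-accessed entry when the limit is exceeded.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true gives LRU ordering
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

A get() counts as an access, so recently read entries survive eviction while stale ones are dropped, which is usually the behaviour you want when caching query results.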
However, always measure first and optimise afterwards, so you don’t end up wasting time optimising the wrong things.