On the Future of Apps
The Death of the App as We Know It
As phone technology progressed, apps became the standard way of deploying neatly packaged software to the end user. With data saved locally and the size of the average app constantly increasing, phone manufacturers were forced to make storage a headline feature. Phones got thinner and storage got bigger – but we’ve reached a standstill that calls for a different approach, not just upgraded parts.
Mobile apps rely on bulky software being stored locally on the phone itself. This client-side focus has major benefits, but at this point I think we’ve outgrown most of them. The biggest argument for client-side apps is availability: because all of the data is stored locally, it can be accessed anywhere. In the days when mobile signal was slow and spotty, this approach made sense – users needed their services to work without a connection, so naturally developers wanted their apps stored locally.
This becomes an issue when the local data is unneeded. For example, you will never use a banking app without a network connection, so why should it take up storage on your phone the rest of the time? It’s dead space for a service that might not be used often, and the same goes for any app that doesn’t serve a purely offline purpose. Quite simply, these apps take up too much space – and we have too many of them. The sheer number of apps we store directly on our phones compounds the problem: many of them have dependencies that directly overlap, yet each app installs its own copy. A typical phone might hold 60 apps, of which only 5 get used daily, maybe 40 more at some point during the week, and 15 that are rarely touched at all. That’s a huge amount of unneeded clutter that a change of approach in software could fix directly.
Now we’ve come to a point where phone users have signal almost everywhere, and in the near future coverage will expand further and get faster. I think we’ll reach a point where satellite data speeds become a practical option for the majority of end users. 4G speeds are already close to typical home Wi-Fi transfer rates – fast enough to cover most media consumption, given that a 1080p video or remote desktop stream needs somewhere on the order of 5–15 Mbps. This kind of streaming approach allows for a much more flexible hardware design.
To paint a clearer picture of how this change in approach might benefit others, here’s an example. I have one powerful desktop hooked up to gigabit Ethernet with fast transfer rates. This computer is always on and handles most of my day-to-day tasks. But what if I’m on the go? I keep a remote desktop service running on the machine, so whenever I need it, I can stream my session from anywhere with a good enough connection. That gives me the power of a full-fledged desktop on any machine with a decent internet link. I could have Raspberry Pis scattered throughout the world, and I’d have a high-performance desktop tailored to my liking wherever I go – a rough sketch of what each of those thin clients might run is below.
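To make the thin-client side concrete, here’s a minimal sketch of the kind of script one of those Raspberry Pis could run: check that the home desktop is reachable, then hand the screen over to a VNC viewer. This is just an illustration of the idea, not a prescription – the hostname is a hypothetical placeholder, and it assumes a VNC server is already running on the desktop and a viewer like TigerVNC is installed on the Pi.

    # Hypothetical thin-client launcher: verify the home desktop is up,
    # then stream the session with a VNC viewer.
    import socket
    import subprocess
    import sys

    DESKTOP_HOST = "desktop.example.com"  # placeholder dynamic-DNS name for the home machine
    VNC_PORT = 5900                       # default VNC display port

    def desktop_reachable(host, port, timeout=3.0):
        """Return True if a TCP connection to the remote desktop succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        if not desktop_reachable(DESKTOP_HOST, VNC_PORT):
            sys.exit("Desktop unreachable – without a connection, this model gives you nothing.")
        # TigerVNC's viewer takes host::port; this call blocks for the whole session.
        subprocess.run(["vncviewer", f"{DESKTOP_HOST}::{VNC_PORT}"])

The point isn’t the specific tooling – any remote desktop protocol would do – but that the client side stays tiny compared to a phone full of locally installed apps.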
This approach is highly scalable, and developers have already used it effectively. Instead of buying an expensive computer that needs replacing or upgrading every few years, customers could purchase a subscription to a powerful server that scales its processing and storage as they need it, so they’re never paying for too much or too little. All the user would need to buy is a much cheaper streaming device with long battery life and a large screen, saving room and money on storage, memory, and processing. The streaming devices themselves could be highly diverse, since remote desktop clients already exist for most operating systems.
So it’s cheaper, faster, and lighter – but at what cost? Security, privacy, and accessibility are the main areas where I think this could cause harm.
This design effectively centralizes all user data with one company, which many would see as a bad thing. It’s like hiding all of your money under your bed – if a thief finds it, you’re screwed. With the recent WannaCry ransomware shutting down hospitals around Europe, it could be argued that a huge remote desktop service would be highly vulnerable to that sort of attack, which could permanently damage customers’ data. That’s true, but providers could guard against it by automatically keeping their customers’ operating systems and security patches up to date – something many customers never do on a physical computer. Seen that way, it’s a trade-off that may actually be safer: individuals would be hacked less often, but the company itself would become a far bigger target.
Another problem with centralizing personal computers is that you no longer truly own your data. At the very least, companies will monitor your activity on these remote computer subscriptions. It could be much worse: if the company decided to turn malicious, that data could be used to blackmail and exploit people. This should be the biggest concern for consumers.
Lastly, accessibility would be hindered. This design is built on accepting that if you don’t have internet, you can’t use your computer. In a world where everyone fell back on satellite streaming whenever a fixed connection was unavailable, this wouldn’t be a big deal – but we’re not there yet, and it definitely isn’t for everyone. Chromebooks carry the same caveat, following a very similar streaming-first design.
I think these negatives are already problems for most users, and because of that, this is a viable design that allows for a lot more mobility at a lower price. Time to say goodbye to apps – the future is streaming.
Let me know what you think!
This YouTube video covers a lot of what I talk about.