I’ve been following the iPad a bit, and I hear a lot about how the iPad will change apps forever.
Then it struck me.
Apps are software. No more, no less. You cram software into a cellphone, and it’s called an app. Sleek, magical, revolutionary (apparently). Then you port it to a tablet, and it’s a bigger app (more revolutionary).
I realized that I’ve bought into the idea that an “app” is somehow different or better than what we call programs. It’s all just crap on a computer. How exactly is this different or better than anything before it?
Well, we gathered tens of thousands of them in one place, made them all bite-sized, put them at your fingertips whether you’re at a urinal or in a pew, and marketed the holy hell out of them.
My scepticism has gotten a bit dull lately. Obviously needs a fresh edge.
Well, it has created a new distribution model for applications, analogous to how distribution of music and video is changing from hard media to on-demand Internet distribution via iTunes. I don’t know if I’d call it particularly revolutionary. Apple maintains very tight control over what is acceptable and how it is tested, so while there are thousands of iPhone/iPad app developers out there, the hurdles for getting an app distributed are still pretty significant — beyond the ability of a hobbyist. Apps are pretty limited in ability, as well. And with a few exceptions, apps don’t really interact with one another, so it isn’t as if they form building blocks of larger computational structures; you can’t network a cloud of iPhones or iPads together and distribute computational tasks or anything like that. Apps are fun and were a brilliant marketing hook, but I don’t see them as the beginning of some stunning new paradigm in computing.
I’m pretty sure it came from the same place as Java apps, then Flash apps, and now AJAX apps, etc. Each of these needs a separate application to actually run it, so they’re more like miniature applications. Apple noticed that the name was catching on and decided to use it for their own miniature applications on the iPod (as the OS itself was more like an application than an OS), and kept the name on the iPhone and iPad.
Because when I recommend that my 70-year-old stepfather download and use Firefox as a “program” on his PC, he won’t have anything to do with it. If it didn’t come from Best Buy, he doesn’t want it because he doesn’t trust it. But I can recommend just about any “app” for his iPhone and he’s all over it, and will pay for the privilege.
Why? Because there’s no fiddling with settings, no drivers to install, no malware to worry about, no scary steps like “installing it” or weird places like “directories” to figure out.
Maybe there is something revolutionary here after all, though not very recently. Once upon a time, the way users were supposed to interact with computers was by writing programs they could apply to their specific needs, and it was programming languages that users would compare for their convenience and so forth. The revolution would have been trying to write a program that could be sold for other people to apply to their own needs, such as word processing or accounting. At the time, this revolution was maybe not as easy to notice as one might think, because (as no doubt was pointed out then) the implementation of a programming language that users would normally use already IS an application program. But the revolution was to try to make it so users did not even have to think about how to solve their own problem.
I still use SAS (once the “Statistical Analysis System,” from SAS Institute) more than weekly, and know that it was available in the late ’60s. It still uses a “cards” statement to signify that data records immediately follow, which used to happen on punched cards. Users write programs of a sort, but also have canned “procedures” for regression, graphing, and so forth. Was this a programming environment, or an application, or an example that the two are no different? What about now?
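For anyone who hasn’t seen it, here’s roughly what that blend looks like — a minimal sketch (the dataset and variable names are made up for illustration), where you write a little program but the statistics come from a canned procedure:

```sas
* The CARDS statement says inline data records follow, a punched-card holdover;
data heights;
  input name $ height weight;
  cards;
Ann 64 120
Bob 70 180
Cal 68 160
;
run;

* A canned procedure: regression of weight on height;
proc reg data=heights;
  model weight = height;
run;
```

The DATA step is clearly programming; PROC REG is clearly an application you’re just invoking — which is exactly the ambiguity being asked about.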