I don’t consider that a wall; I consider it the OS being smarter than not just the average user, but 99% of users. I remember having to configure a USB wireless receiver for my desktop. I’ll take built-in stuff that, once you set the password, you never have to worry about again. And my newer phone is much smarter at this than my older laptop.
I’ve been setting up PCs since AT&T 6300s, and it has gotten a lot faster and simpler, so I can devote my time to copying over my wife’s bookmarks and stuff. And when everyone has cloud OS’s, we won’t even have to do that. I don’t with my thin client at work.
Here’s one thing going on here: The makers of GUI-based systems (which is kinda-sorta what I think of as walled gardens) sat down and tried to think of all the objects a system should have and all the things you might want to do with those objects. Then they gave you icons, menus, and stuff like that to do all the actions they thought of that you might want to do.
With a more flexible UI (and I’m thinking of a Unix-style CLI with lots of general-purpose simple applications), you can do all sorts of more-or-less sophisticated things that the GUI designers never thought of.
Clear example: drewtwo99 has repeatedly asked how one might write a script to do some kind of text processing. Various people have helped him write bash scripts and even fancy one-line awk scripts to do things like scan a directory and perform some action on every file there, or concatenate a bunch of files and do some transformation on every line in those files, and other kinds of stuff like that.
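A minimal sketch of the kind of thing people have been helping him with — the directory and file names here are made up for illustration:

```shell
#!/bin/sh
# Set up a throwaway directory with a couple of files to operate on.
mkdir -p demo
printf 'hello\n' > demo/a.txt
printf 'world\n' > demo/b.txt

# Scan a directory and perform some action on every file there:
for f in demo/*.txt; do
  wc -l < "$f"   # e.g. report each file's line count
done

# Concatenate a bunch of files and do some transformation on every
# line (here: uppercase them) with a one-line awk program:
cat demo/*.txt | awk '{ print toupper($0) }'
```

The point is that neither the loop nor the awk one-liner needed a GUI designer to anticipate this exact task; the pieces compose on the spot.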
I just don’t know how you do things like that, unless you have some kind of CLI-style interpreter to fall back on. Even if you launch your script by clicking on an icon for it, the script is still written in a CLI type of language.
Programs still run other programs by sending them command lines with command-line arguments, even if it’s all done behind the scenes. A user-accessible CLI just makes that interface directly user-accessible. Microsoft, in particular, seems intent on trying to hide this away from view as obscurely as they can.
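To make that concrete — a tiny sketch (the script name is made up) showing that even a program you'd launch "behind the scenes" is just something receiving a command line with arguments:

```shell
#!/bin/sh
# Create a small program that reads its first command-line argument.
cat > greet.sh <<'EOF'
#!/bin/sh
# $1 is the first command-line argument passed to this script.
echo "Hello, $1"
EOF
chmod +x greet.sh

# One program (this shell) running another by handing it a command
# line — exactly the interface a GUI uses under the hood:
./greet.sh world
# prints: Hello, world
```

Whether that invocation comes from a terminal, an icon double-click, or another program, the interface is the same command line.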
Best leave that to the Federal Department of Operating Systems I guess.
I’m a developer, and I work with many open source developers. I’d say 95% of them use Macs (as do I). Mostly I think it comes down to them working well out of the box, without tinkering required - there are no up-front hurdles. But it has a proper Unix command line and all the right tools when they’re needed.
I want my WORK computer to just work, uninterestingly.
I want to play with computers, but that shouldn’t be the same system I have to pay the bills with.
I don’t need all the customizability, but I like it to be there. Because when it isn’t, what happens is that you can’t do things that seemingly should be simple.
Example: I got to use an iPad for a conference a few months ago. They had set it up so you couldn’t install any apps. It just had the built-in items and the app for the conference.
I used it to take some pictures at the conference. There was no way to copy them to a flash drive or upload them in the browser. The pictures taken on the iPad were stuck on the iPad unless you installed an app that could access them.
Very frustrating. The basic operating system of any device that can create files (pictures) should permit the transfer of those files (pictures) to another device.
So… considering you know that the conference organizers deliberately limited the ways in which you could use the iPad, is there any evidence either way to speak to whether they wanted you to be able to take photos at the conference (and just forgot to “turn off” that feature, or were unable to), or whether they wanted you to be able to send whatever photos you took?
As a conference organizer I find this bizarre. I can see crippling functionality for kids, but why would they give you an expensive piece of hardware and then break it? Did you have to give it back? What kind of apps did they have?
In any case, much as I dislike Apple, you can’t blame them for this one. I bet there is some way of doing a hard reboot on the system and getting it back, but I’m no expert on these thingies.
The software side is currently free enough that I don’t much care, as long as I get to stay away from Apple products at work.
The hardware side annoys me more, since I foresee that the move to MCM will be widely adopted, which could slit the throat of my hobbyist PC building.
Walled gardens suck. They don’t just stop tinkering. They cut off legitimate portions of the OS. They funnel what is and is not allowed to come into that garden. I’m all about apps. That’s not tinkering.
Neither Mac OS X nor Windows 8 is a walled garden. Windows RT is a walled garden: desktop apps that you might want to run can’t even be tried in Desktop mode (only Microsoft Office), and you can’t install anything from anywhere but the Microsoft Store. iOS devices also have a walled garden, and it’s worse because you have to pay Apple for the privilege of allowing other people to use your software, even if you want to give it away for free.
A walled garden doesn’t actually protect you. The point is to force you to do certain things a certain way. Security would just as well be accomplished with a locked door instead of a wall. You’d have to deliberately unlock it. Fortunately, the U.S. government currently seems to agree and allows jailbreaking, which is metaphorically installing a door in the wall.
Walled gardens are about control. Wanting not to be controlled in how you use your own property, when it doesn’t hurt anyone else, isn’t just libertarianism. A better metaphor would be civil rights.
Some in this thread, if I interpret these comments correctly, argue or fear that the presence and popularity of walled gardens will eventually cause them to supplant everything else, leaving nowhere for freedom-loving programmers and users to go.