Then we have a problem. If it ever came out that Microsoft was spying on its users then I would either switch to an earlier version of Windows or switch to a Mac.
Been there, done that, three times!
Wow, that describes how I spent last week perfectly. I wanted to install a program, but it told me I’d need some other program, and then that program told me I’d need some third program. I never did get any of it working right, so I ditched it to go back to XP, except RedHat formats the hard drive in some funky way that happens to prevent an XP install. So I dusted off my floppies, found a DOS 6.2 boot disk, ran it, used FDISK to erase the non-DOS partition, and I’m back on XP now.
Life’s too short, man.
The spyware possibilities boggle the mind! :eek: :eek: :eek:
As it happens, I’m going on a four-day Linux administration course starting Tuesday; the last one was really good and got me started (the machine I’m posting this from is dual-boot XP Home and Novell/SuSE 9). Hopefully, this intermediate course will get me past the ‘first simply resolve dependencies, then just compile the makefile and…’ bit that seems to be the stumbling block for most Linux application installs.
Nah, Suse all the way 
I run Ubuntu Linux, and their software update tool, Synaptic, is by far the easiest method of installing software that I have ever seen on any operating system. Everything is taken care of for you. My only complaint is that the list of available software is so large it can be rather daunting. On the other hand, it is a very new distribution, and this has the potential to become its killer application (when combined with the fact that they don’t charge anything for any of it), so I’m sure that all the bells and whistles will get added.
As for MS’s Black Box, I think history has already shown that the average consumer readily trades privacy for convenience.
Indeed, and so do many users, such as myself. Using Linux forces you to consider intellectually what is happening on your computer, instead of groping around in a fake tactile world to get things done. It takes some research and practice – you cannot jump in head first – but ultimately it is the better approach.
As to the ease of installing software, I find Debian’s package system to be not just better but revolutionary in comparison to installing software on Windows: the ease and precision with which it lets you manage the software on your system is amazing. It is something that Microsoft cannot match because its strength derives from the nature of open-source software. And as waterj2 has pointed out, there is a GUI for it called Synaptic which makes it all very straightforward.
Yeah, I miss the old ./configure, make, make install process, but Ubuntu makes it so easy to deal with stuff without trying to find obscure libraries or work out dependencies on your own, and unlike Debian, you have access to new software. And their package database is pretty full, so by the time I’ve heard of anything cool, it’s probably there. The main problem I had when using Debian was that if I tried to get something new, it wouldn’t work with my old libraries, and the older software wouldn’t work with the newer libraries. With Ubuntu, I just fire up Synaptic every six months to update everything, occasionally in between to check for security updates, and any time I think there might be something I don’t have but looks cool.
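For anyone who hasn’t seen the contrast being described, here is roughly what the two workflows look like side by side (the package name ‘foo’ is made up for illustration, not a real package):

```shell
# The classic build-from-source routine, where you chase
# dependencies yourself ('foo' is a hypothetical package):
tar xzf foo-1.0.tar.gz
cd foo-1.0
./configure          # stops here if a required library is missing
make
sudo make install

# The Debian/Ubuntu equivalent, with dependencies pulled in
# automatically from the package database:
sudo apt-get update
sudo apt-get install foo
```

Synaptic is just a point-and-click front end to that same package database, so it performs the second routine for you.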
Really, I think Linux has only a bit further to go before it can claim to surpass Windows outright in this regard. Where it needs to catch up is in hardware support and in the quality of major applications. The Open Office suite, for example, is basically a purposeful clone of MS Office, and its advantages (such as producing files that are zipped human-readable XML text, rather than MS’s proprietary .doc format) pale in comparison to its disadvantages (mostly in the bells and whistles category, and wizards) for most consumers, especially when added to the whole difficulty of migration.
As to a black box in Windows, it seems fairly innocent, merely trying to provide more info to MS about what causes the system to go down. These sorts of bugs can be very difficult to troubleshoot, and an increased database of the exact circumstances of each failure would help fix them. Because Windows is deployed at many large companies on a very large scale, and these companies may very well be concerned about their privacy, I doubt that MS could afford to implement this feature without proving that it does not, will not, and preferably could never mishandle this data, or without offering an opt-out option.
See, i’m not sure i agree with this line of reasoning. Or at least, not for everyone.
Sure, if it interests you then i think it’s great to have a better knowledge of what’s going on in your computer. While i’m far from being an expert, over the past year or so i’ve made a definite effort to understand my computer better, and i like the fact that i can now do things that i didn’t understand before. I’ve even contemplated trying out Linux, and might give it a go when i get some spare time.
But for many people a computer is simply a tool whose only value resides in what it can help them do, whether that be email, or internet browsing, or word processing, or whatever. There really is, for these people, very little added value attached to knowing what is going on beyond the “fake tactile world” of Windows. To make the blanket statement that delving into the workings of the computer itself is “the better approach” ignores the fact that different people use computers for very different reasons. And, for the vast majority of computer users, the convenience of an interface like Windows or the Mac operating system far outweighs any benefit they might derive from the sort of effort required to get Linux up and running.
I must have been unclear, because I mostly agree with this. I was arguing against the apparent presumption of Mangetout that user-friendliness is categorically good and +MDI’s statement that a lack thereof is a “problem.” My point is, or should have been, that for some of us it isn’t a problem, not because we are elitist or “live in our own world,” but because the unfriendly methods let us do things more efficiently and with finer control.
I appreciate that dumbing-down of processes can lead to people attempting tasks that they simply don’t understand (and that this may lead to big problems later on), however…
-Let’s not pretend that even a command line is any less ‘fake’ than a WIMP GUI in terms of what is actually going on inside the computer. A command line is a GUI; just a very rudimentary one.
-But that’s not really what I was talking about; intuitive ease of use is one thing, useful documentation is another. Some of the Linux applications I tried to install on my first time around had installation instructions supposedly targeted at complete newbies, but that missed the mark by a wide margin - I think some of the guys writing the applications have a tendency to project their own idea of ‘simplicity’ onto the minds of their users. “To install, simply resolve dependencies, then just compile, set permissions and install” is not nearly explicit enough.
None of that would matter if there was an easy way to acquire the basics, however it seems like you either have them and assume everyone else does, or you don’t have them and don’t have a clue how to acquire them.
No, a command line is not a GUI.
A command line is a kind of interface, but it is not a graphical one. And to call it “rudimentary” is just ignorant.
As to a substantive discussion of the difficulty of open-source software, you may have a point about documentation, but overall I think you are too negative. Your insistent example is having to compile a program from source, but you ignore the fact that binary package systems render this unnecessary, and actually make installation simpler than on Windows.
It’s a very large problem when an operating system is being targeted at a general audience like Linux is. Linux is no longer the preserve of computer scientists and database administrators; it’s being picked up by companies who are now targeting the desktop market as more and more “regular users” experiment with it. Just count the number of threads started in MPSIMS lately asking for advice on getting started, for instance.
<sigh>
A command line is still a symbolic visual representation of what is going on inside your computer - one that does not directly depict the electronic operations happening in the silicon. OK, clearly my calling it graphical is too easily misunderstood (as indeed ‘rudimentary’, by which I meant ‘of or relating to basic facts or fundamental principles’), but what I’m trying to say is that you’re only seeing a visual interpretation of what the computer is actually doing, regardless of whether you’re using a command line or a Windows-type GUI.
We could probably have an entirely different discussion about which is the more faithful or useful representation.
And since we’re throwing around definitions, I’ll add:
The question is not whether it accurately represents what is happening on your computer. I could never claim to understand how Linux or computers in general “work.” The difference is that command-line work forces your understanding to a certain level of abstraction, whereas a GUI perpetually relies on brute sensory information. It’s like the difference between adding in your head and counting on your fingers – the latter is easier to learn, the former is a better long-term solution.
However, this is straying far from the point. I am not trying to prove that the command line is “better” than a GUI – I was only responding to Mangetout’s apparent disgust with the former (they “actually like the command line”!) with an explanation of why some of us appreciate it.
Indeed, it can be a problem for those for whom gaining market share is a priority. For a humble user such as myself, it is not. I think we are arguing from different premises.
At $1.95 a pop, you might make some substantial money by writing the no-knowledge-or-brains-required version of that.
Of course, it’s really the firewall companies who should quickly offer this functionality.
You have gravely misunderstood the thrust of my statement; I harbour no disgust in this matter. I was merely pointing out that the development of Linux is carried out by those who are very comfortable with the command line and perhaps therefore don’t feel much need for anything else (on behalf of users other than themselves).
There’s a vast body of work in Human-Computer Interaction on command-line vs GUI interfaces, and the general thrust of it is: if you were forced to use command lines until you became an “expert” user, command lines are vastly, stupendously more powerful than GUIs for certain tasks. However, if you’re a novice user, GUIs are vastly, stupendously easier to use, so you never learn to become an expert with command lines.
The reason being, with a mouse, you’re generally presented with 7 ± 2 objects from which to choose, and each choice takes about 100 ms to make for an expert user. With a command line, you have roughly 30 choices per keystroke, and each keystroke takes about 10 ms. So in the time it’s taken you to make one choice out of 7 options in a GUI, you could have made a choice out of several million with the command line. The difficulty is that nearly all of those several million choices are not what you want, which makes it much harder to figure out what you do want.
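Just to put rough numbers on that branching argument (the 7 ± 2 items, ~30 keys, and the timings are the post’s own estimates, not measurements): while a GUI user spends ~100 ms picking one of ~7 items, a fast typist gets in about five keystrokes, and each keystroke picks one of ~30 keys.

```shell
# Five keystrokes at ~30 choices each: 30^5 distinct inputs,
# versus 7 for a single mouse pick in the same time.
echo $((30*30*30*30*30))   # 24300000 - the "several million" above
```

The catch, as the post says, is that almost none of those 24-odd million inputs is the one you actually want, so the raw branching factor only pays off once you already know the command you’re after.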
The thing I love about GUIs is that I’m always reading the TEXT associated with the icon, because there are so dang many different icons these days that I can’t remember them all!
But, as a long-time user/developer of very low-level “stuff” who agrees that command lines are a must at times, and (because I live in this reality) someone who uses the Windows GUI daily, I agree that Linux must be as simple to use as Windows for it to have any kind of chance at all on the desktop.