I also would recommend Ubuntu. Part of the reason you’ve not gotten an answer yet is that nobody here is familiar with Linspire or its package management system. Ubuntu is very easy to set up, is based on Debian (a popular distro) so you’re going to be able to get help easily, and uses the apt package management system. With apt, all you’d have to do is type ‘apt-get install gaim’ and everything you need would be installed for you.
I just now see that Linspire uses its own package system, which they call ‘Click ‘N Run’, with packages available here. It seems you’ll need a subscription after the 30-day trial runs out, though, so unless you’re comfortable paying them for support, I’d go with something else.
As far as I understand it, programs need to be compiled on your own system so that they link against whatever versions of the relevant libraries are current on, and specific to, your machine and the distribution you’re using; the alternative is precompiled packages, but their environmental requirements are often quite specific. I suspect the Firefox installer either picked the most appropriate precompiled package from a large list, or did the actual compilation for you. There are a variety of reasons why not all programs do this, but I’d better not say any more, lest it be misconstrued as a rant.
I’ll add another vote for Ubuntu. In addition to being a (relatively) user-friendly version of Linux, it also seems to have developed an active community around it. The forums are a good resource.
As a non-IT professional who has used Linux on and off for about 18 months, my experience is consistent with GorillaMan’s remarks. Installation and configuration is the Achilles’ heel of Linux on the desktop. That’s why I expect that Linux will break into the mainstream (if it does) through corporate environments, where systems are administered by professionals.
The good news is that using Linux is, in my opinion, just as easy as using Windows. Once you get things up and running, your experience should be much better.
Imagine for a moment a company or organization that provides all the software you need. They give you a URL to point your computer to, or a set of CDs, and you say, “Give me an operating system, a word processor, a web browser, and a drawing program.” That software is then copied straight to your hard drive. They make sure that the programs do what they claim to do, don’t contain malware, and integrate smoothly with the system. They also design their tools so that third parties can use them to expand the system with other software.
It sounds like a brilliant idea, so why haven’t you heard of it? Because with proprietary software, it isn’t possible. If an application is closed source, no one can study it or modify it in any way. Thus, at most, this organization would be little more than a reseller.
But with the advent of open source software, it becomes a real possibility. And this is what Linux distributions do. They collect the software you’re likely to need, put it together into a nice package for you, and provide it to you with the knowledge that it will work and no questions asked.
That is the theory of managing software on Linux. As you can see, there is a raging debate over how well it actually works, and whether current implementations adequately realize the ideal I’ve just explicated. If you want to hear that argument out, post a thread in GD or IMHO, or just search those fora.
Now, if you find a piece of software you want to run, don’t just click “download tarball”; the first thing to do is to go to your distribution’s website or your package manager and search for the program. If it isn’t provided, look for an unofficial package, or else an unofficial repository that you can get it from. If you can’t do any of that, it is probably a very new or very obscure program and you should try to find a better known equivalent. So, unless you are bent on having the very latest version of a program, or on modifying its code, you should only need to compile from source as a last resort.
Moreover, rather than stumbling around the Internet, as you seem to be doing, you can use your package manager to find software. Specifically, Aptitude has an excellent search mechanism and a categorical browser that divides everything into understandable sections – Web browsers, Text editors, FTP servers, etc. It’s ncurses-based and has a slight learning curve, though, so you may be more comfortable beginning with Synaptic.
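To make that workflow concrete, here is a minimal sketch, assuming a Debian-derived system (the search phrase and package name are just examples from this thread, not recommendations):

```shell
# Before hunting the web for a tarball, query the package index.  On a
# Debian-derived system the usual command-line equivalents are:
#
#   apt-cache search "instant messenger"   # keyword search of package descriptions
#   apt-cache show gaim                    # full details for one package
#   aptitude search gaim                   # the same search via aptitude
#
# And a quick check of which of the usual front-ends this machine has:
for tool in apt-get aptitude synaptic; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: installed"
    else
        echo "$tool: not found"
    fi
done
```

The GUI route (Synaptic) and the terminal route (apt-cache/aptitude) are querying the same package database, so whichever you’re comfortable with works.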
Finally, I will echo the others’ suggestion that you switch to a better-known and respected distribution such as Ubuntu.
Another happy Ubuntu user here. In my opinion, Synaptic is the easiest way to add software that I have seen on any operating system. You just look through the list of software available (divided into categories, but still rather large to read through), find what you like, perhaps search for things you want, check the ones you want to install, then hit “apply”. A couple more prompts to OK things, and you get new software downloaded and ready to go. Since Linspire is Debian-based, like Ubuntu, you ought to be able to do the same thing, though I don’t know what sort of repositories Linspire has.
I’m a long-time Red Hat/Fedora user with some Slackware experience and I want to try out the Debian scene with one of the distros from that camp. One thing that is holding me back is my utterly craptastic Internet connection: I am limited to a dial-up that maxes out far below the 56k I’m theoretically entitled to. It also cuts out at odd intervals and I have to redial to reestablish the connection. There is no way in hell I’ll ever be able to download a distro using the connection I have. I have the best possible hardware at my end – an external modem and a fast machine – but there’s nothing I can do about what my ISP is willing to sell me at my price range.
So, would synaptic or apt-get be worth it for me? Fedora has something apparently similar called yum, but I’ve never successfully used it: All the download servers I have access to time out whenever I try to download something.
You probably already know this, but OSDisc.com will send you physical install media for pretty much any distribution you want to try, and their prices look very reasonable - as indeed they should be.
If your internet connection is shaky and slow, this could present problems for any automated installer, simply because some of the packages you’ll be downloading (assuming they’re things that are not bundled on the install CD) might be quite large; I don’t know if any of them use a download manager that can pick up where it left off.
The other thing I forgot to mention (in passing) is that Linspire is the distribution that was originally going to be called Lindows - a distribution of Linux capable of running Windows apps (presumably including WINE in the bundle). Microsoft stopped this from happening and Linspire was the result - a Windows-look distribution, but without any capability to run Win apps.
I’m personally more familiar with cheapbytes.com, but thanks for the reference.
And this is precisely what I was afraid of. wget can pick up where it left off, and that has saved me more than a few times. If none of the package systems use wget or something with similar capabilities, that reduces me to doing dependency-tracking by hand. Not a very enjoyable way to spend an evening.
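For what it’s worth, a sketch of the resume trick (the URL is a placeholder, not a real mirror):

```shell
# wget's -c/--continue flag resumes a partial download instead of starting
# over - handy on a connection that keeps dropping.  Illustrative usage:
#
#   wget -c http://example.com/some-distro-cd1.iso
#
# Adding --tries=0 (retry forever) and --waitretry makes it re-attempt
# on its own after each disconnect:
#
#   wget -c --tries=0 --waitretry=30 http://example.com/some-distro-cd1.iso
#
# Check whether the wget on this machine supports resuming:
if wget --help 2>/dev/null | grep -q -- '--continue'; then
    echo "resume supported"
else
    echo "no wget here, or no resume support"
fi
```

As far as I know, apt-get itself also resumes interrupted package downloads: partial files are kept under /var/cache/apt/archives/partial/ and picked up on the next run, so it shouldn’t reduce you to hand-tracking dependencies.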
Caught it in preview:
You can still download and install WINE on your own, and it isn’t limited to any specific distro.
I just installed Ubuntu on a spare machine here at work; overall, I’m pretty impressed - I’ve toyed with RedHat, Suse and a few others before. This one is pretty smooth - GAIM, firefox, OpenOffice, XPdf and the Gimp are installed by default, as well as all the stuff you’d expect like a mail client, a few text editors etc.
What is really impressive is that, in terms of speed, ease of operation etc, the machine feels like, say, XP running on a 1.5GHz processor; in reality, this is Linux/Gnome running on a Pentium III 233 MHz. Pretty clever.
It detected my graphics card OK, but I don’t seem to be able to change the screen res to anything above 640x480 and I haven’t tried installing anything that isn’t listed in Synaptic package manager yet.
First, many (if not most) programs are hobbyist, not commercial. They are released out of benevolence or shared interest in the application. Speaking as a (sort of) software developer myself, writing a general installation program is tedious and more effort than it’s worth. (Note that I’m talking specifically about hobbyist programs; commercial applications generally do have some sort of installation program.) One of the most notable things about Debian (from which Ubuntu is derived) is the amazing set of packages available for installation in binary form via Synaptic or its kin. When someone creates a package for such a package manager, they declare which libraries the package relies on. When installing, the package manager identifies any conflicts that may exist and installs whatever additional packages are needed. Windows is not immune to these dependency problems either (cf. DLL hell).
Second, Linux users tend to be more computer adept (not necessarily, mind you, just generally). One aspect of this is that you learn not to trust executables without knowing their source. If one compiles a program from source, one can always look at the code to confirm nothing malicious is contained therein. Not that most people actually do that, just that the option is there. Since the option is there, it keeps application providers honest to some degree.
Third, Linux allows you to install only what you need or want. If you don’t want a graphical system, don’t install it, for example. This freedom means that there are absolutely no guarantees about what is installed on someone’s system. Hence, the use of a configure script, which generally checks library availability and will adjust the installation accordingly.
Fourth, compiling from source allows one to tweak the compiler optimizations for a particular system. One sees this often in discussions concerning Gentoo. This is also the reason that distros ship various builds, such as i386 (targeted at any Intel 386-class processor), k7 (targeted at Athlon systems), etc.
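To make the configure-script idea concrete, here is a toy sketch of my own (not real autoconf output) of the kind of probing such a script does on the machine where it runs:

```shell
# Toy sketch of what a generated configure script does: probe *this*
# system for the tools it needs and report what it finds.  Real configure
# scripts are generated by autoconf and check far more (headers,
# libraries, compiler behavior), but the pattern is the same.
check_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo "checking for $1... yes"
    else
        echo "checking for $1... no"
    fi
}
check_tool cc
check_tool make
check_tool install
```

Because the checks run on the target machine, the resulting build adapts to whatever that particular system actually has, which is exactly why no two systems need identical binaries.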
A lot of “Linux” programs are open source, and if you want to distribute the source code, it pays to make installing from source as straightforward as you can. The most popular way of achieving that is the mostly auto-generated “configure” scripts, which provide a reasonably uniform way of compiling and installing programs from source - typically:
./configure [options]
(to configure the program - typically, you need not specify any options, “./configure --help” lists available options if you need them)
make
(compile the program)
make install
(install the program)
Making an automated binary installer takes more work - especially if you want to support multiple Linux distributions, have different build options, or want to support other Unix variants like SunOS, FreeBSD, etc. It’s possible you’d need a different installer for every combination of those factors, which can be completely impractical for a small team of developers to manage.
So yes, it’s usually harder to make a binary installer than a source-code installer, and most open-source projects just don’t have (or don’t want to spend) the manpower to support that extra work. For reasonably popular projects, this means the Linux distributor (i.e. Debian, Red Hat, SUSE, etc.) provides a binary package and the infrastructure to deliver and install those packages.
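For completeness, the whole tarball-to-installed-program sequence looks like this (“foo-1.0” is a stand-in name; the executable part below just demonstrates the unpack step with a throwaway archive, since the real build obviously needs a real tarball):

```shell
# The full source-install dance for a typical tarball:
#
#   tar xzf foo-1.0.tar.gz        # unpack
#   cd foo-1.0
#   ./configure                   # probe the system, write the Makefiles
#   make                          # compile
#   sudo make install             # install (usually under /usr/local)
#
# Self-contained demonstration of the unpack step with a fake tarball:
tmp=$(mktemp -d)
mkdir "$tmp/foo-1.0"
printf '#!/bin/sh\necho configured\n' > "$tmp/foo-1.0/configure"
tar -C "$tmp" -czf "$tmp/foo-1.0.tar.gz" foo-1.0
tar -C "$tmp" -xzf "$tmp/foo-1.0.tar.gz"
echo "unpacked: $(ls "$tmp/foo-1.0")"
```

The `tar xzf` flags mean extract (`x`), decompress gzip (`z`), from file (`f`) - worth memorizing, since nearly every source release ships as a .tar.gz.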
And my cursor hovers over the “Report this post” button.
My suggestion, bouv, is that you wipe your hard drive and reinstall Windows while you wait for Linux to finally leave the hands of the hobbyists. Open source is a great idea but, until it is taken over by people who are interested in your productivity instead of your learning a bunch of arcane commands, it’s a toy.
And it sure does come with a lot of goodies installed, but guess what my kubuntu didn’t come with pre-installed? Compilers. You know, the things I NEED if I want to ./configure something?
WTF? Whose brilliant idea was that?
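(For anyone else hitting the same wall: on Ubuntu and Kubuntu the compilers are deliberately left out of the desktop install, and the standard fix is the build-essential metapackage, which pulls in the toolchain in one go. A sketch:)

```shell
# build-essential is Ubuntu/Kubuntu's metapackage for the compiler
# toolchain (gcc, g++, make, C library headers).  To install it:
#
#   sudo apt-get update
#   sudo apt-get install build-essential
#
# Quick check of whether a C compiler is already on the PATH:
if command -v gcc >/dev/null 2>&1; then
    echo "gcc found: ready to ./configure"
else
    echo "gcc missing: install build-essential first"
fi
```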
Not sure how to; the instructions for adding WINE using synaptic said something about that, but the dialog I got wasn’t anything at all like the one shown in the tutorial. Also, I can’t seem to get this installation to connect to the internet - the network card must be working, as it’s seeing my router and grabbing an IP address, but it seems to be having problems resolving names - it’s pointing at the right DNS, it just doesn’t work.
I’m going to wipe and install Kubuntu tomorrow anyway, as I prefer KDE to Gnome, so it’ll be a day or so before I get to play with it again.
Kynaptic (KDE’s Synaptic) is buggy if you attempt to install stuff from the universe repositories. Also, you have to add the universe repositories from the terminal. If you can do that, though, you can then install Synaptic itself, which lets you do all of this in a GUI and isn’t nearly as buggy.
In Synaptic, crack open the repositories settings and check every box. You should get ten million* more packages - unless, of course, you have to add the repositories in manually. Check out Ubuntu’s site and search for “repositories”; it comes up with many useful links.
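For reference, adding the repositories by hand means editing /etc/apt/sources.list and then refreshing the package index. A sketch of what the lines look like - “breezy” here stands in for whatever Ubuntu release you’re actually running, so substitute accordingly:

```
# /etc/apt/sources.list - add (or uncomment) lines like these,
# then run "sudo apt-get update" to refresh the package index.
deb http://archive.ubuntu.com/ubuntu breezy universe
deb http://archive.ubuntu.com/ubuntu breezy multiverse
```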
I also prefer KDE to gnome, but that may change once I fully exhaust all possibilities for widget like accessories for my desktop. (Speaking of which, does anyone know of a rainlendar-type app for KDE?)
Also, while I’m at it, does SourceForge only offer CVS, not apt, and is that subscription-dependent?
*slight hyperbole