Die-Hard Geeks, Let's Talk Linux!

So long as your time is worth nothing. :smiley:

You can even try it without risking any of your data (or your current OS). Try an Ubuntu live CD, or use the Wubi installer.

I’ve been using Kubuntu for the past 5 years, and while it has sometimes been frustrating, the current *buntu distros are very user friendly.

Microsoft doesn’t play nicely with the other children.

If you want to dual boot, the trick is to install Windows first. Leave the space for Linux unpartitioned (whether it’s going to be another partition on your main disk or another disk entirely). After Windows is installed, then install Linux. The Linux installer will easily find the unpartitioned space and will also set up the boot manager properly for you. It’s relatively painless, and it doesn’t give XP a chance to bitch about it.

If you forgot to leave space for Linux, there are partition managers out there that you can use. They come with all kinds of warnings about how they might completely trash your Windows install, but they’ve worked fine every time for me (YMMV).

The major pros of Linux in my opinion are:

  1. It’s free.
  2. It’s much less likely to be the target of malevolent software than Windows.
  3. It’s incredibly customizable and powerful, which leads to greater efficiency of your workflow and generally greater power (once you get things set up).
  4. You can learn how everything works. The source is available, so you can dig into the inner workings as deep as you care to go.

The cons are basically the other side of each of the pros:

  1. The free support that’s available is often fragmented and difficult to understand unless you have a fairly high level of skill and a thick skin.
  2. There’s less software available for it (although for many uses, the best-of-class software is on Linux).
  3. There are many choices. It can be frustrating to have to make choices that you don’t really understand. (Using a pre-fab user-friendly distro like Ubuntu can help a lot here.)
  4. It’s powerful enough that you can totally screw things up with a very short command if you don’t know what you’re doing.
  5. You might have to figure out how it all works. It can be tricky to figure out what to do when things aren’t working. You might end up spending a lot of time reading man pages and googling error messages you don’t understand.

I like open source software for server-side stuff (not so keen on Linux as an everyday desktop OS). With products that are inherently technical, such as VPN gateways, web servers and so on, the more techie approach of Linux, BSD et al is no disadvantage, because you have to kind of know what you’re doing in the first place. I would rather set up a server via simple config files than have to install some management GUI tool that works like nothing else on Earth, as you often have to do with proprietary equivalents. Everything just seems more straightforward with this type of software in the open source world.

Another advantage of the software being free is that you don’t have to get your head around manufacturers’ Byzantine licensing models. The supplier of our VPN software has an ever-changing licensing scheme so complex and baffling that it makes grown men cry. People end up buying licences they don’t need because nobody really knows for sure what is necessary, and this stuff isn’t cheap. Microsoft, meanwhile, have even organised training events so that you can understand their licensing structure.

All of this is blessedly absent with open source. Download it, install it wherever you like, done.

Yes, you’re right, you can go back and forth easily. Your “host” operating system is still running while the VM is running, and you can run other apps there at the same time. With some proper configuration, you may be able to transfer files directly from the “real” machine to the “virtual” machine and back. Alternatively, if you have a local network, your VM can participate on that net just as if it were a separate machine, and you can transfer files or set up shared directories that way. (Yes, Linux systems are able to play nicely with shared files and directories from Windows machines, and vice versa more-or-less.) You can set it up so that the VM will share the host’s system clipboard, so you can cut-and-paste back and forth also.

Your VM exists in the form of a collection of files in a directory in your “host” file system. If you have enough disk space, you can install as many of them as you like. If you have enough RAM and your CPU has enough penguin power, you can even run more than one of them at the same time, and they can all be “networked” together. You can set up a VM just to experiment with, and when you are done (or if you make a mess of it), you can simply delete it. Just remember to copy any files you want to keep somewhere else first.

The VM can be any operating system you want as long as it runs on the same hardware platform as the host machine. Thus, you can run Linux on a Windows machine, or Windows (even Windows Server) on a Linux machine.

Except as noted above, your VMs are fairly well isolated from the host machine and from each other. That is, nothing horrible that happens in a VM can mess up your host machine or another VM.

There are two major sources of VM software that I know of:
VMware (but I don’t know if they have a free version).
Oracle VirtualBox (originally Sun VirtualBox), which does have a free version last I looked.

ETA: You can even play games like creating a file server or a web server in the VM and putting a web site there, which you can then access from other machines in your home LAN. It’s great if you’re trying to learn about these kinds of things, to be able to give it a try.

UPDATE:

I actually forgot about this thread up until very recently. I went out of town for the week of Christmas, didn’t have regular internet access, and it slipped my mind.

Well, now I’ve remembered this thread, and I am even posting from the Firefox that came with Ubuntu 11.10. I downloaded Oracle VirtualBox and got it all set up yesterday.

So now that I do actually have Ubuntu, may I humbly request some good resources for troubleshooting and some for high quality programs?

And any and all other tidbits you’d all like to chip in are of course always welcome.

Well, I just ran my first two lines of terminal commands!

Not that I haven’t used the command line before in Windows (rooting my phone dontcha know), but to do so in Linux was special.

:slight_smile:

I took a community college class about a year ago in Linux System Administration. It was the first time the class had been offered, so the instructor was mostly making it up as he went. We were required to create THREE Linux virtual machines (Debian, Fedora, and OpenSUSE) to play with. We were also given access to a free public real-life Unix system where we could create our own web sites, and we were asked to create and maintain a detailed “log book” of everything we did with our VMs.

I was the only one in the whole class who did. Everyone else just made some superficial remarks for a few days and then pretty much gave up on it.

My log is at http://doggie.freeshell.org – Take a look. I documented a lot of our assignments, and my work on them, in detailed step-by-step form, with lots of supporting pages too (such as links to shell scripts, transcripts of my working sessions, etc.). By intention, I wrote them up in enough detail that I could (hopefully) repeat the assignments by just following along step-by-step.

Covered_In_Bees, take a look there and poke around to see some of the stuff we did. If you’re just playing around trying to learn what-all you can about Linux, you could try doing some of the same things (like installing Apache / PHP / MySQL and get them all talking to each other, and create a web page that you can get to from your host machine).

You might find that your Ubuntu system comes with sqlite already installed. (It might be called sqlite3 or similar.) This is a simple light-weight SQL database manager you can play with, if you don’t want to go all-out and install MySQL or PostgreSQL yet.
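As an illustration, here’s a minimal sketch of playing with a database from Python’s standard-library `sqlite3` module, which rides along with most Python installs. The table and data here are made up purely for illustration, and using an in-memory database means nothing ever touches your disk:

```python
import sqlite3

# Use an in-memory database just to poke around (pass a filename to persist).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create a toy table and load a few rows.
cur.execute("CREATE TABLE distros (name TEXT, year INTEGER)")
cur.executemany(
    "INSERT INTO distros VALUES (?, ?)",
    [("Debian", 1993), ("Ubuntu", 2004), ("Fedora", 2003)],
)
conn.commit()

# Query them back in release order.
for name, year in cur.execute("SELECT name, year FROM distros ORDER BY year"):
    print(name, year)  # Debian 1993, then Fedora 2003, then Ubuntu 2004

conn.close()
```

The same SQL works interactively in the `sqlite3` command-line shell, so it’s a gentle on-ramp before wrestling with a full client/server database.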

That log is awesome Senegoid, lots of information. I’m bookmarking it for when I’m more ready to go that route.

Question though for anyone with experience with virtual machines: How much RAM should I allocate to the VM? I have four GB installed. Right now I told the VM to use 1GB and things have a bit of lag to them when I try to do things. Sometimes it’s a very small, almost unnoticeable lag; other times I’ll click on something from the launcher or within the application list and have to wait a few seconds for it to come up.

Would it tax my system too much to use 2GB for the VM as long as I’m not running more than one? Right now all I’m really doing within Ubuntu is internet browsing and then on my host machine I’ll have Steam open.

Well, I can’t say much about how much RAM would be optimal. I know that you can create a minimal system with just 500MB. I am working with an old 2.4GHz machine that only has 1GB and about 40GB of disk. I’ve given half of that to my VM. The 500MB is only used when the VM is actually running, of course. And for the disk, I chose “dynamic allocation,” which means the machine “appears” to have 20GB, but disk space is actually only allocated on the fly as it is used. So it’s really slow and clunky. But for my purposes, which is just “fiddling around” with it, it’s fine. If you want to do anything, like, actually useful, maybe not so much.

I also took classes in Windows Server 2008 administration. For that, I installed a VM here with an entire Windows Server system running! Yes, with only 500MB RAM and 20GB disk. Unbelievable that it would actually work, but it did! I have another nearby physical box – an old Win XP machine, also with 1GB RAM. I was able to set it up as a client station administered by the virtual Windows server.

Ah, but that was 2 years ago. I’ve forgotten just about everything by now (although I still have the textbooks and the VM). If I actually somehow manage to get a job doing actual Windows Server or Linux administration, I’d have to learn a lot of this stuff all over again.

ETA: I might also add that my “host” machine runs Ubuntu Linux, with Firefox 3.something, and recent changes in AT&T’s web site have deliberately made it not run with that any more. But I don’t want to mess around with upgrading Firefox right now. So to pay my bill on-line now, I have to fire up my Linux VM which has a newer version of Firefox (it’s called Iceweasel, but actually it’s just a variant of Firefox), and I can pay my bill there.

From what I’ve been taught in my classes, VMs are really the “IN” thing in big data centers these days. You can run several on one physical machine, which supposedly makes fuller use of its resources, using up CPU time that might otherwise be dead. I’m not sure I see the logic of that.

Backups are easy. The entire VM is just a collection of files in a directory. All you have to do is zip up the whole thing and save that somewhere. To do a full restore, just unzip it. Likewise, it’s easy to set up a prototype machine, fully up and running, and save a copy of that. Then you can create all the additional VMs you want by simply copying it. You can move a VM from one physical machine to another by simply copying it – both VMware and Oracle VBox have options to import a complete VM from outside.
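As a sketch of that zip-it-up idea, here’s what a backup and restore might look like using Python’s standard-library `shutil`. The directory and file names below are hypothetical stand-ins (a scratch directory with dummy files plays the role of the VM folder); with a real VM you’d shut it down first and point the paths at the actual directory:

```python
import pathlib
import shutil
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # Stand-in for a real VM directory (hypothetical names for illustration).
    vm_dir = pathlib.Path(tmp) / "MyUbuntuVM"
    vm_dir.mkdir()
    (vm_dir / "MyUbuntuVM.vbox").write_text("<placeholder/>")
    (vm_dir / "disk.vdi").write_bytes(b"\x00" * 1024)  # stand-in disk image

    # Back up: zip the whole VM directory into one archive file.
    archive = shutil.make_archive(
        str(pathlib.Path(tmp) / "MyUbuntuVM-backup"),
        "zip", root_dir=tmp, base_dir="MyUbuntuVM",
    )

    # Restore: unpack the archive elsewhere and the VM directory is back.
    restore_dir = pathlib.Path(tmp) / "restored"
    shutil.unpack_archive(archive, restore_dir)
    print(sorted(p.name for p in (restore_dir / "MyUbuntuVM").iterdir()))
```

The point is just that there’s nothing magic in the backup: it’s an ordinary archive of ordinary files, which is also why copying a VM between physical machines works.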

There is even a standard (several, actually) for the format of the VM files, so VMWare can import an Oracle VM and vice-versa, if you’ve chosen one of the compatible file formats.

The mid-term exam we did was handled that way. The teacher provided us with a pre-built system, and a set of additional things we should add to it. (That’s what you see documented in that mid-term/final log.) For the final exam, we just took that and added still more things to it. To submit it, we just zipped up our whole VM to turn in, and the teacher could run it himself to see what was there.

ETA: And if you look at some of the system configuration files (the ones that the VM application maintains, not the internal ones in the guest machine), you’ll note that they are just XML files that you can open with any editor and read what’s in them.
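To illustrate that point, here’s a short Python sketch that parses a simplified, made-up snippet in the spirit of a VirtualBox `.vbox` file using the standard-library `xml.etree.ElementTree`. The real files carry an XML namespace and many more elements and attributes, so treat the names here as illustrative only:

```python
import xml.etree.ElementTree as ET

# A trimmed, hypothetical snippet in the spirit of a .vbox file.
# Real files are much larger, but they are plain, readable XML like this.
vbox_xml = """<VirtualBox version="1.12">
  <Machine name="MyUbuntuVM" OSType="Ubuntu_64">
    <Hardware>
      <Memory RAMSize="1024"/>
    </Hardware>
  </Machine>
</VirtualBox>"""

root = ET.fromstring(vbox_xml)
machine = root.find("Machine")
memory = machine.find("Hardware/Memory")
print(machine.get("name"), machine.get("OSType"), memory.get("RAMSize"))
```

Which is to say: you can read (and, carefully, script against) these config files with any text editor or XML library; no special tooling required.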

Glad you got it working. Years ago I tried it, got some web browsing, POP email and a few other things working, but it just was not as fluid and nice as XP and had other complications, so I ditched it.

Here is the tribute from back then, titled ‘Every OS Sucks’, which matched my experience.

I think for the most part OSes have gotten somewhat better since then, at least Windows and Mac. I haven’t ventured back into Linux, but I’d expect similar improvement over the years.

Every so often I get it in my head to use Linux (usually Ubuntu) on my laptop. I inevitably end up back with Windows because some program or two is either not available in the *nix world, or what is available is substandard. For example, old console emulators. There are “top notch” emulators out there for Windows and Mac OS X for any one I’m interested in (e.g. SNES, TG16, Genesis), but on Linux, if they’re available at all, they pretty much universally suck. MAME is there, but gmameUI is a slow and buggy POS (I mean, I’ve got an SSD in this thing, it shouldn’t take 15 seconds for the program to become responsive!)

I also find web browsing to be noticeably slower in Linux vs. Windows on the same system, especially if the site is heavy with flash, javascript, and/or video. Firefox just feels less responsive than on Windows; Chrome is less so, but still perceptibly slower.

Then there are the little hardware foibles to address. My laptop is one of those with switchable graphics adapters (Intel GPU integrated, AMD Radeon on the mainboard). X and its drivers don’t work properly with this at all, and I had to manually edit /etc/modprobe.d to disable the Radeon GPU. If I didn’t, it would sit there uselessly burning power while the fans were screaming.

Dual-booting, to me, is a waste of time and disk space. I’d rather just use the OS that does 100% of what I need than switch back and forth.

jz78817: Your mention of responsiveness is akin to the lag I was talking about. Are you sure it’s not a RAM issue or is it truly the OS taking slightly longer to recognize what we’re doing?

And it makes me sad emulators have been so bad for you. This feels like the type of environment where emulators would kick a lot of ass.

Off topic: Speaking of MAME, where are you getting your ROMs these days? A few months ago I was trying to get MAME up and running again and downloaded the newest version, which was a huge mistake. T’was a mistake because none of the ROMs I could find were meant to work with the latest version of MAME.

And yes I could have gone digging for the version the ROMs would all work with but wasn’t that interested.

I’ve been using some flavor of Linux on my home machine for probably 10 years now. I’m far from an expert, but have been able to fix almost any trouble with googling. Started with Suse, now on Ubuntu but thinking about jumping elsewhere because I hate their latest desktop.

My biggest love, over XP, is the responsiveness and workflow of Linux. I still have to use XP at work, and you will not believe how many menu delays, hourglasses, extra unnecessary clicks, inconsistent dialog box behaviors, and file organization quirks are inherent in Windows.

On Linux, with the custom tailoring of every click, window, function behavior and shortcuts to as many virtual desktops as I wish, my workflow feels like I’m willing the computer along by force of thought alone, a mix of The Matrix and Minority Report.

Just go ahead in XP (or Win 7 even), and do the “open control panel” and “add remove programs” navigation. The stupid pauses, thinking, display refresh halting, of a stupidly basic function that should have millisecond response from the OS? Linux fixes that.

I refuse to talk linux unless you say sudo talk linux first.

GargoyleWB: I’m getting plenty of menu delays and hourglasses (well, whatever it is) in Linux too.

Linux can completely eliminate hard drive latency? howzat?