Mac or PC? You make the call.

ummm… yeahh…
Wow, what an incredible response to this thread.
In your OP you said “PC owners prove me wrong”. How about a PC owner proving you right?
I do not own a Mac but my next computer will definitely be one. I went through Win 3.1, Win 95, Win 98, and Win 2000 and I think they got progressively WORSE. Their complexity increases and the ability to figure them out decreases.
I had a bitch of a time just to get 2 Win 2000 computers connected to a router. I was naive enough to think that I would have my own home network, swapping files, sharing printers, etc. It was rough enough just to get both of them hooked up to a modem.
The Win 2000 alleged “help” screen shows 20 topics per window. Making a rough guess, you would have to press the Page Down key 1,000 times if you wanted to look at ALL the topics listed there!!! Don’t you think they could have made the damned system a little more user-friendly?
So, I say the heck with PCs and bring on the Mac!!!

Wolf,

While it’s not that confusing, you should know that the Help program in Jaguar ummm… what’s the word… ah yes, sucks more than anything that has ever sucked before. It’s not just slow, it’s glacial.

Word is that Panther, which has integrated the Safari WebKit, is much better in this area. Hard to imagine it getting worse.

One other little feature that is nice for use in places like, say, the SDMB:

http://homepage.mac.com/komodro/screen2.jpg

System-wide spell check. Any application can have spell check if the programmer wants to, just by enabling it. Stuff like that is what a lot of Mac users mean by thoughtful touches. I’m quite impressed that this thread has yet to become a complete hate fest.

I’m a Mac user, have always been a Mac user, and I don’t foresee any possibility of me switching. I will however salute the PC, which has come a long long way.

I would not want to spend my day using XP, but I could get work done and it wouldn’t be that bad. Unlike the bad old days when your choice was between System 6 and MS-DOS.

The main areas where owning a Mac instead of a PC still really really rocks are:

a) Absence of the “application window”, so you can arrange document windows from different applications on-screen to your convenience. I’ve actually heard some Windows users cite this very difference as an example of Windows superiority, so YMMV, but damned if I can understand what’s inherently great about a multitasking OS in which you can only conveniently view one application at a time?!??

b) Absence of spyware and viruses, for all practical purposes. Yes, it’s a side-effect of low market share for the most part, but man alive I’m glad I don’t have to deal with that stuff!

c) Boot from damn near anything without drivers and without having to change settings and jumpers and stuff. If your main OS gets hosed, boot from something else, simply by popping it in and/or waiting for the hardware to look around and find it when it tries to boot. CD, Zip drive, second internal hard drive or partition, external hard drive, whatever. Boot from your iPod. Boot from an Orb. Boot from a memory stick. Boot from a hard drive attached to a different computer. PCs are more versatile than they used to be about booting from somewhere other than C, but judging from the posts I read it’s still a headache especially for the inexperienced newbie.

KDE, which I use as my GUI in Linux, has this as well. Pretty useful. KDE is filled with little pleasant surprises. I’ve also noticed that Apple’s Safari browser uses Konqueror’s rendering engine. :cool:

Yep, and KHTML is a surprisingly nice rendering engine. There are a lot of really interesting things going on in Linux, and the OS itself is amazing. I just wish that it was a bit more polished and consistent. Still, one can’t complain too much, and the Open Source community as a whole deserves some serious accolades.

Hopefully someday Linux will manage to be a real competitive solution for the desktop. I am considering building another PC out of some older components as a really inexpensive Linux box to just stuff in my closet and work as a server for my home network. It’d be nice to just load the thing with storage space, and use it as a file/music server.

The problem with Linux is that there’s no governing authority to impose a UI and consistent functionality. So long as each Linux program tackles UI aspects like keyboard bindings and the like on its own, Linux will suck for the desktop.

Final Cut Express/Final Cut Pro is not as easy to use as iMovie, but that’s not surprising – FCP is truly an industrial-strength program for folks who do video work for six-figure projects. It is, from what I’ve heard, definitely spanking Avid’s ass in the market, due to its flexibility and affordability.

Meanwhile, iMovie and iDVD kick major ass on the consumer side of the fence. iMovie, in particular, is a system-seller app in my book – it is worth buying a Mac just to run iMovie.

What makes iMovie really cool are the third-party plugins you can buy for additional effects, titles, and what-have-you. I’m tempted to pick up the “laser blast” plug-in and do my home version of Star Wars. :wink:

About the only thing that I’ve seen affect performance in MacOS X is insufficient RAM. Anything under 256MB is tight, and I’d recommend 512MB or more if possible.

I believe the ADC port is actually an industry standard; it’s just not widely used.

I have yet to get a kernel panic on my MacOS X setup, and I’ve been running for over a year now. If it wasn’t for the software upgrades, I would easily have 200+ day uptimes.

As long as we’re turning this into a MacOS X love-fest, :wink: I’ll point out that it’s a perfect platform for software development, especially for internet stuff. Apple gives you a free set of developers’ tools, complete with their Project Builder/Interface Builder IDE, as well as C, C++, Java, Perl, Apache, PHP, and a truckload of documentation. You get all the benefits of running a UNIX workstation or Linux box, along with the benefit of running “mainstream” apps like Microsoft Office or iTunes.
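To give a concrete (and admittedly trivial) sketch of what that means in practice, assuming you’ve installed the free Developer Tools: the bundled gcc will happily compile plain C right from Terminal.

/* hello.c - a trivial sanity check for the bundled compiler.
 * Assumes Apple's free Developer Tools (which include gcc) are installed.
 * Compile and run from Terminal with:
 *   gcc -Wall -o hello hello.c && ./hello
 */
#include <stdio.h>

int main(void)
{
    printf("Compiled on Mac OS X with the bundled gcc.\n");
    return 0;
}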

Governing authority? I don’t quite understand. In the KDE Control Center I can customize keyboard shortcuts/settings and the changes are system-wide. The default choices are pretty much standard: ctrl+X cuts, ctrl+V pastes. I beg to differ that Linux “sucks for the desktop”. I’ve been using it for quite some time, I enjoy using it, and I am satisfied with it. I know people who aren’t even proficient with computers who use Linux and are happy with it.

I am a PC user. OS X looks good enough that I’d like having a Mac to play with, but I have invested way too much time in learning the x86 platform to “switch” now.

The built-in color management features since Windows 2000 (ICM 2) can certainly do what you need for print graphic design. You do need to use decent PC hardware (i.e., none of that brand-name or budget crap) and software, but that’s true for the Mac as well. As a managing editor for a print magazine I worked with people on both platforms, and while my Mac experience was mostly second-hand, I never saw any technical reasons to support the notion that the Mac is significantly better for graphics work. We had more problems with material from PC users, but those were always due to ignorance rather than technical limitations.

I’m pretty sure that most Windows applications are compiled with Microsoft’s compiler.

I have only ever used Macs (except for things like PCs in libraries, etc.), so I lack critical perspective on PCs. However, I think OS X is the greatest thing since sliced bread - with two crucial caveats, which are that it’s slower than OS 9, and that it uses more power (which is an issue when I’m using my laptop on battery).

What I like about it is that it is UNIX, plain and simple. I learned (some) UNIX (Linux, actually) at a job I had, and there were several things about it that I liked. With UNIX, you are in total control of everything, and if you know how (a big “if”), you can save a lot of time and effort by using the UNIX command line. I also like to tinker and mess around with how things work, which is easy to do in UNIX (sometimes too easy, in fact…). None of this was possible in OS 9 (at least as far as I know).

The drawback was that I found (and still do find) UNIX intimidating from a problem-solving perspective: there is so much know-how involved, and the documentation is so decentralized that solving a simple problem can involve a tremendous amount of research (scratching through manuals, googling and reading message boards, etc).

But with OS X, I can have my cake and eat it too. I can boot up in Aqua (Apple’s proprietary GUI), use all the programs which have been designed specifically for OS X, and start up Classic to use my older Mac software. I can boot up to a UNIX console command line, or start up the X11 server and run one of the open-source UNIX desktop environments, such as Gnome or KDE, and use the mountains of free software which accompany those. Or, I can boot up in OS 9 and pretend that OS X never happened. And if I wanted to drop $300, I could buy VirtualPC and run Windows on my Mac as well.

In OS X, I can do any and all of the UNIX tricks that I want, or just treat it like an old-fashioned Mac. I find that I do both: I do a lot of file management from the command line (copying, removing, and renaming large numbers of files), but because I’m lazy, I still use the Finder search utility instead of find on the command line. And if I have a problem, I can either pretend that I’m a UNIX hacker and try to fix it myself, or I can just rely on the Apple documentation or utilities.
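To give a flavor of the kind of bulk file management I mean, here’s a rough sketch in C rather than the shell one-liner I’d actually type; the .txt-to-.bak rename is just a made-up example.

/* rename_suffix.c - rename every "*.txt" in the current directory
 * to "*.bak".  The suffixes are just an example; in practice this
 * would be a shell one-liner.
 * Build: gcc -Wall -o rename_suffix rename_suffix.c
 */
#include <stdio.h>
#include <string.h>
#include <dirent.h>

int main(void)
{
    DIR *dir = opendir(".");
    struct dirent *entry;

    if (dir == NULL) {
        perror("opendir");
        return 1;
    }

    while ((entry = readdir(dir)) != NULL) {
        size_t len = strlen(entry->d_name);

        /* Only touch names ending in ".txt". */
        if (len > 4 && strcmp(entry->d_name + len - 4, ".txt") == 0) {
            char newname[1024];

            /* Keep the base name, swap the suffix. */
            snprintf(newname, sizeof(newname), "%.*s.bak",
                     (int)(len - 4), entry->d_name);

            if (rename(entry->d_name, newname) != 0)
                perror(entry->d_name);
            else
                printf("%s -> %s\n", entry->d_name, newname);
        }
    }

    closedir(dir);
    return 0;
}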

I also find OS X to be very reliable. It does crash from time to time (some issues with my 3rd-party CD-R/W drive and power management), but I crash programs a lot, and it’s nice not to have to restart the whole computer. I also like how the file system is organized. I keep all the files and applications that I wouldn’t want to lose in my ~ (“home”) directory, which is all I need to back up. The rest of the system is more or less disposable.

Also, while I mentioned above that I find OS X to be slower than OS 9, I did find that I can improve its speed by changing some preferences (turning off font-smoothing and the annoying little Finder animations, for example), and using some free- or share-ware applications which optimize different aspects of the system such as application launching. So speed is negotiable, at least to some extent.

And as far as software availability goes, there are HUGE amounts of free or inexpensive software for UNIX being recompiled for OS X every day. One of my favorite time-wasting strategies is to browse for free OS X software, and it’s stunning how much there is out there. I don’t know the proportion of UNIX software in the world compared to Windows software, but Mac users now have access to it, in addition to the traditional Mac-only catalog.

Some Mac users have complained that OS X doesn’t let them see certain files: actually, it does, if you know how to ask it. O’Reilly has some nice books for learning UNIX: I recommend in particular “Learning UNIX for Mac OS X” and “UNIX Power Tools.” I also have several Linux books which I refer to, since Linux and Darwin/BSD (the flavor of UNIX in OS X) seem to be very similar in most respects, so an introductory UNIX book might also be helpful.

I have seen some mention that Apple might port OS X to the PC platform. That would really be having your cake and eating it too: get a really well-designed, stable, and largely open-source operating system on hardware that you can assemble yourself to your specifications.

One other thought: I would recommend Neal Stephenson’s “In the Beginning was the Command Line…” to anyone who likes to debate the Mac vs PC question. He is a former Mac user who now uses Linux, but who has some good things to say about PCs. His complaint about Apple is that they have done an excellent job of marketing themselves as the computer of groovy, creative, individualist types, when in fact they’re total control freaks. Stephenson argues that when all is said and done, the command line is still the most powerful interface to a computer (barring gaming, of course), and that the classic Mac OS went overboard in rejecting it.

That new G5 is the fastest computer in the world.

And will be made obsolete by the end of the day - if it hasn’t been already. :wink:

Yeah, by the G6. HEHHEHEHEHEHEHE.

Macs use a RISC (Reduced Instruction Set Computer, I believe) architecture, which means the processor can process instructions at much faster rates than the CISC (Complex Instruction Set Computer) architecture in PCs. There are many more instructions in RISC, but that doesn’t matter, because a RISC architecture can still process them at much higher rates than a CISC architecture can.
And now that the Mac OS runs on UNIX, it should be even faster and more efficient.

I’ve worked for printers in too many capacities to completely agree with that. You can do it, but your output still won’t be as seamless or predictable as it will from a ColorSynced Macintosh.

ICM 2.0 is worlds better than previous Windows color matching functions, but it’s not yet in ColorSync’s league. ColorSync remains the industry standard because it can guarantee far more accurate color than even ICM 2, and from a far wider range of devices.

While you need decent printers to get the output right, you can use ColorSync just fine on any Macintosh (with the normal LCD caveats applying to Apple’s new iMac).

Well, then the Intel machine advantage might not be as much as I thought, speedwise.

Interface czars. The kind of people Microsoft and Apple have who declare “menus will always work like THIS,” “icons will always work like THAT,” all programs will include X or Y, etc. Try to do that with Linux, and the Linux hackers bitch that you’re going against the spirit of “open source.”

Philosophical nonsense like that can get bent. I want every program on my OS to work like every other program on my OS. Linux, with its dual, competing UIs, is a user nightmare. It doesn’t provide nearly the consistency or predictability of Windows or the Mac OS.

Program widgets differ from app to app; whether a program is MDI or a pale imitation of a Mac-style interface varies from app to app.

To say nothing of user-hostile things like the way OpenOffice.org 1.0 requires you to set the spacing between paragraphs by centimeter or inch or whatever measurement you have chosen for your rulers, and not by the more logical points that Word allows. Gotchas like that abound in open source software, where there’s no czar that forces the hackers to write usable code.
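(If you do have to enter that spacing in centimeters, the conversion itself is at least simple: a point is 1/72 of an inch and an inch is 2.54 cm, so 12 pt is about 0.42 cm. A throwaway C sketch, with a few made-up point values:)

/* pt2cm.c - convert typographic points to centimeters:
 * 1 pt = 1/72 in, 1 in = 2.54 cm, so 12 pt is about 0.42 cm.
 * Build: gcc -Wall -o pt2cm pt2cm.c
 */
#include <stdio.h>

int main(void)
{
    /* A few common "space after paragraph" values, in points. */
    const double points[] = { 6.0, 10.0, 12.0, 18.0, 24.0 };
    const int count = sizeof(points) / sizeof(points[0]);
    int i;

    for (i = 0; i < count; i++) {
        double inches = points[i] / 72.0;   /* 72 points per inch */
        double cm = inches * 2.54;          /* 2.54 cm per inch   */
        printf("%5.1f pt = %.3f in = %.3f cm\n", points[i], inches, cm);
    }
    return 0;
}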

They generally write programs for people who like to fiddle with technology, who like to customize, who think like programmers. Such an interface is worse than worthless to me; it is hostile.

RISC has fewer instructions, but can spin through them faster, theoretically doing more at any given MHz rating than a comparable CISC system. This has been borne out by the early Power Mac vs. Pentium PC experience.

However, the core of a Pentium IV processor is actually a RISC unit, making this a moot point.

I keep forgetting that the Pentium IV has a RISC core in it now. As for RISC, I was wondering if I had that backwards, with it having more instructions.

a) RISC has fewer instructions in and of itself; as a result of that, applications compiled to run on a RISC platform will accomplish tasks using more total instructions (because each task will be broken down into several small instructions rather than a few longer instructions).

b) The Pentium IV has a RISC core, but it still has to deal with the instructions in the actual code, which are still the moldy old x86 instruction set with its variable-length CISCness. To compensate, the P4 devotes a huge amount of attention to decoding and out-of-order guesswork. That it has pulled this off for as long as it has is a real testimonial to the folks at Intel, not a commendation to the legacy instruction architecture itself. Ars Technica goes into excruciating detail about what the P4 has to do before the RISC core gets to play with any code fragments.

There’s so much money to be made in furnishing PCs with fast processors, and the Athlon was putting Intel’s feet to the fire (and vice versa), so competition + bucks to be made resulted in some serious R & D efforts. The results are actually astonishing. It’s like watching the Vikings figure out a viable way of sending wooden rocket ships to the moon, and then doing it so well that their turnaround time has been beating the steel-and-titanium competition for years.

The engineers continue to say that the long-range money is on the steel-and-titanium, though. Regardless of whether that takes the form of PowerPC chips or Itanium chips or other chips that aren’t native to the legacy Intel Architecture code, or all of the above, the general consensus seems to be that future execution of legacy Intel Architecture instruction set code will be through emulation. Never mind that they were saying the same thing as long ago as 1995 (hence the astonishing wooden-ships thing).

Not sure that first part was clear, let me try again. RISC, the instruction set, has fewer instructions. CISC has more of them, and therefore for any given task there may be a CISC instruction for performing that specific task whereas you’d accomplish it in RISC by organizing your smaller pool of available RISC instructions into a sequence that does the same thing. So your application, when compiled to run on a RISC processor, will accomplish tasks using more total instructions, although many of those instructions will be the same instruction appearing over and over again in different sequences.
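To make that concrete, here’s a hypothetical sketch: the same line of C, with comments showing the kind of code a compiler might emit for it on a CISC machine (x86) versus a load/store RISC machine (PowerPC). The exact mnemonics and registers are illustrative, not real compiler output; what matters is the instruction count.

/* cisc_vs_risc.c - one line of C, with sketches (in the comments) of
 * what a compiler might emit on each architecture.  Illustrative only.
 */
int total;

void add_to_total(int x)
{
    /* CISC (x86): memory can be an operand, so one instruction can
     * read total, add x, and write the result back:
     *
     *     add [total], eax        ; read-modify-write in one instruction
     *
     * RISC (PowerPC): a load/store architecture, so the same work takes
     * a sequence of simpler, fixed-length instructions:
     *
     *     lwz  r4, total(r2)      ; load total into a register
     *     add  r4, r4, r3         ; add x (passed in r3)
     *     stw  r4, total(r2)      ; store the result back
     *
     * Fewer kinds of instructions, more instruction instances: exactly
     * the trade-off described above.
     */
    total = total + x;
}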

It’s like the difference between writing things in a character-based system like Chinese and writing them in a phonetic alphabet like the ones English and Latin use. I compose my posts using only 26 characters plus some punctuation marks. Someone participating in a Chinese-language message board composes posts using symbols where each symbol is an entire syllable, and uses fewer total characters as a result, but has to have a keyboard capable of providing input of any of thousands of different possible characters. My keyboard (and typing skills) only have to cope with a tiny set, so I probably type a lot faster.

Or… suppose you set out to code the human genome and you set up a library of codes, each of which coded for a different chemical process. When you got done setting up your library and writing out the genome, you’d have something a lot shorter than DNA in terms of the total number of code segments, but your library of available codes would be huge. And awkward. Compare that to the system in which you just use guanine, cytosine, adenine, and thymine, but use them in different sequences and then use those sequences in different sequences and so forth. That’s RISC.