03-27-2009, 06:23 PM
I had some random thoughts while reading this thread that I'm going to share. If you just want my verdict, scroll down to the bottom of the post.
1. Back in 2006 or so, Apple switched from PowerPC to Intel processors. With that move, basically the only difference between a PC and a Mac, from a hardware standpoint, is a security chip in Macs that OS X checks for when it boots up. This is Apple's way of making sure you only run OS X on Macs (Google "Psystar" or "Hackintosh" for further reading).
2. If you absolutely must run Windows software on a Mac, you can use Parallels or Boot Camp. What's funny about this is that at one time, there was a specific MacBook Pro that ran Vista better than any other laptop on the market. On the opposite end of the spectrum, you can go the aforementioned Hackintosh route if you're prepared to do some fighting and compromising, but you're likely not in the 0.000001% of the population who would read that and go, "Oh, that sounds like a blast! Where do I start?"
3. If you want to be able to easily crack open the case and upgrade parts yourself, get a PC. I've read horror stories about how Apple makes their cases overly complicated to get into. If you want to buy something off the shelf and just use it (like most people), then it comes down to how much you're willing to spend.
4. Apple may have 10% or so of the market now, but that's all they need. They enjoy much fatter margins than other computer companies, and they have just enough market share to force people to support them (there were articles on tech websites about companies being forced to support Macs back when they had 7-8% market share). I really don't think Apple wants to be the only game in town, because right now they're kind of the computer industry's equivalent of a European automaker. They charge a premium, and if they got into playing the volume game to please shareholders, I think that would ultimately hurt them.
5. Linux may have only around 1% market share, but back when Vista was 6 months to a year old (and even now, I would argue), Ubuntu completely stomped it in terms of stability and usability. I switched sometime in the summer of 2007 and haven't looked back since. It definitely isn't for everyone, however.
6. hoshi, one of the problems with Windows is that in the early days, there was no concept of user permissions or of not running as administrator all the time, which all the UNIX-derived stuff has had since at least the early 80s. Don't believe me? Find a Windows 98 box with multiple user accounts and see if you can browse and access everybody else's files. Without getting into too much technical detail, Microsoft decided in the 90s that they were going to encourage programmers to code every app under the assumption that the user was logged in as administrator. That's starting to change with Vista and its UAC feature. You don't even really need anti-virus software anymore, as long as you password-protect your admin account, create a separate standard user account, and only use that user account day to day (a quick example of the setup follows below). If you hit Cancel on any UAC password prompt you weren't expecting, you're pretty much not going to get a virus. Doing this gives you a rough equivalent to sudo on Linux/BSD, only it arrived a little over 25 years later AND you have to actually create another account.
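For anyone who wants to actually set that up, here's roughly what it looks like. This is a sketch from memory with a made-up account name (DailyUser), so double-check the commands on your own machine before running them. From an elevated Command Prompt on Vista:

net user DailyUser * /add
(creates a new account; the * makes it prompt you for a password, and new accounts land in the plain Users group by default)
net localgroup Administrators
(lists who has admin rights; DailyUser should NOT show up in that list)

Then log in as DailyUser for everyday use and only type the admin password when a UAC prompt you were expecting asks for it. The Linux/BSD equivalent is staying a regular user and prefixing the occasional admin task with sudo, e.g. sudo apt-get update.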
My verdict: Do like hoshi said. Either build a mid-range gaming box or have one built for you, and be sure quality parts are used throughout. This is one area where all the OEMs, even Apple, cut corners. Solid hardware may cost you more right now, but it makes it far less likely that some random part takes a dump in the next 6 months. In fact, I'm typing this on a mid-range gaming box that I bought back in 2004, and other than a network card dying, I've had no hardware problems. Just stick to quality components, and you won't be crying about Dell's motherboards being DOA like Hot Wings.