Okay, everyone
stop for just a moment. Take a breath.
There, doesn't that feel better?
The reality is that in the desktop market, Apple really doesn't compete at either the business or the consumer level when it comes to providing cheap and versatile machines. Apple doesn't even seem to want to, and that's fine. What Apple sells is more of a "whole package" deal that, while some PC system builders do offer something similar, just isn't the norm in the more modular and varied PC market. This isn't necessarily a bad thing for Apple as a company, either: because they don't have to compete product-for-product, they can keep much tighter control over the build specifications of their machines, lending a "tighter ship" perception to the product from a consumer's perspective. That Apple also spends to over-engineer some parts of their systems further polishes their products as single appliance units rather than machines that need to be tweaked or have third-party parts added to work the way a user wants. Again, this isn't a bad thing, though it does raise the price to the consumer-- something that a large enough and fairly dedicated number of consumers see as a value-add with little or no equivalent elsewhere. These factors have combined to establish and maintain very strong brand loyalty among many (but not all) of the customers who buy Apple products.
PCs, on the other hand, are a more nebulous creature, ranging from piecemeal systems built by hobbyists and enthusiasts to the "full system" offerings of the big-box sellers (Dell, HP, Lenovo, etc.). That there's such a wide range of tastes and models is due largely to the IBM-clone market that spawned the modern desktop PC market, and that ecosystem, where different vendors make similar-yet-slightly-different hardware, has pushed component makers, system builders, and software developers to keep up with the fastest and newest parts as well as the older and slower ones, which means a whole lot of factoring in lowest common denominators that a less varied platform wouldn't have to face. Some system builders have tried to move toward a "tighter ship" build model, with varying degrees of success-- one could argue that Dell and HP have found a fairly reasonable balance between proprietary design and variety (the ability to upgrade with third-party components), while Sony and IBM have, on different occasions, gone a bit too far, resulting in premium systems that lock customers into proprietary solutions. Suffice it to say, the clone-bred PC market doesn't take well to too much proprietary hardware in its systems. Further, competition among multiple manufacturers has driven a great deal of the advancement in RAM, CPU, and video card technologies, with rival vendors chasing even slight advantages over their counterparts. This has, in essence, been both one of the single greatest benefits of the distributed market and one of the greater hurdles for operating system and software developers trying to keep pace with it.
Brand loyalty has still managed to surface in this market as well, though unlike Apple's "full system" brand loyalty, the PC market's version tends to focus more on component manufacturers (AMD, nVidia) or software companies (Microsoft, various Linux distributions, Blizzard Entertainment), with loyalty to system builders (Dell, HP) being less prevalent at the consumer level (though present at the business level).
It's that brand loyalty on both sides that has really propagated many myths, outright lies, distortions, and uneven comparisons between the Mac and PC crowds over the years-- starting with the "Mac vs. PC" framing itself. After all, Macintosh computers are PCs, just PCs of a different make and model, and this has become even more evident in recent years, as Apple has shifted its internal hardware platform to the same architecture as the rest of the PC market. What the comparison really means is a Macintosh running MacOS versus a non-Apple computer running Windows, which ignores the steadily growing base of Linux users, lumps all system builders (along with the piecemeal vendors) into one monolithic category, and still fails to offer any real 1::1 comparison of qualities. Even now, you're rarely going to be able to make a 1::1 comparison of hardware or software, precisely because Apple still maintains very tight control over its supply line while the rest of the computer market has varying mixes of commodity parts and proprietary systems. Configuring a mid-range Mac and a mid-range Dell or HP won't yield the same hardware, and debates comparing them dig deep into semantics to hide the brand-loyalty influence on the arguments (on both sides). Realistically, the hardware in any two such machines is usually similar enough for a rough, "good enough" comparison outside of brand-loyalty pedantry, but there's always enough semantic wiggle room for brand loyalists on either side to claim superiority, thanks to the lack of an apples-to-apples comparison.
However, even if there were a precise 1::1 comparison, another factor comes into play, one that Apple has been employing at least since the release of the OS X operating system, and one that other software developers (most notably Adobe) have employed for far longer: optimization during development. To keep it simple (though I'm sure there are plenty here who could go very deep on the topic), when developers build the code for a given application, they can optimize it to a reasonable level of specificity depending on the platform they're writing for, the variation in the hardware it must run on, and the sophistication of the compiler that converts the code into executable form. This can provide key advantages when compiling for specific configurations or hardware components-- game developers do this often, usually optimizing for video cards. Adobe has notably been optimizing its creative suite for years, more often for Apple's hardware than anything else, in particular the G3, G4, and G5 computers. Since Apple moved to x86 architecture, I'm not sure how much Adobe has optimized for them, but it is known that Adobe now optimizes for 64-bit processors on Windows. Apple has been optimizing its operating system with each new release (so, practically yearly) across every two to three generations of hardware updates, far more often than its "enemy" Microsoft is known to-- this is actually where the "clock speed isn't everything" meme took root, and that idea is a mixture of fact and subjective opinion. Since Apple's move to x86 computing, however, the amount of optimization that could be done has been limited, since many of their binary (executable) files needed to be backward-compatible to some degree.
Apple thus began facing a conundrum similar to its counterparts in the personal computing field: maintaining reasonable support for vastly different hardware (in Apple's case, two different architectures). Considering the level of challenge, and the fact that there's no real history of a performance slow-down on Macs, it's safe to say the company handled it pretty well. Microsoft, since the change to Vista, has been working on optimizations of its own, the first of which appeared in Vista's Service Pack 1 and the latest of which can be experienced in Windows 7. In fact, Windows 7 has reportedly seen more optimization than is typical for a Windows release, most of it targeting older code that has fallen out of use or more forward-facing technology (like 64-bit computing). Conversely, Apple's newest OS seems to have done much the same in terms of sweeping under-the-hood alterations and optimizations, though with less worry about old code (though some subsystems are gone) and, like Microsoft, a tilt toward forward-looking features such as 64-bit computing. So far the results for both look positive.
That very basic run-down on optimization is just to underscore that slight hardware differences aren't the only thing that can make a big deal out of comparing two systems, as is often done in the Mac vs. PC wars. And honestly speaking, Linux and BSD can go further than MacOS or Windows in terms of optimization, and (Ducky, feel free to correct me here, but) Solaris machines far surpass even Apple when it comes to optimizing for very specific hardware. So when it comes down to comparing just hardware, the comparisons fall short, because the software running on the machines isn't a variable you could swap between them with no change in outcome (in other words, there's nothing to suggest Windows on Apple hardware runs faster than on a PC, or vice versa). Comparisons are always going to be a "total package" (Mac) versus a "compiled package" (PC) and, as such, limited in scope as far as determining value.
Now, when it comes to dealing with close approximations, the perceived values begin changing and the obvious brand-loyalty factors come back into focus. Comparing a Mac to a custom-built PC is going to leave out things like a warranty on the system as a whole; for some this counts as a value, because they can replace parts separately, while for others it's a negative, because they don't have the time or inclination to take their computer apart whenever something goes wrong. Comparing Apple's Mac builds to big-box system builders (like, for instance, Dell) requires comparing like models even when the hardware specs on paper aren't the same. Case in point: you could compare the lowest-priced iMac to the lowest-priced Dell AiO, though you're going to lose some screen size. Bumping the CPU up to 3.16 GHz, the RAM up to 4GB, and the video card up to the same model, and including a wireless keyboard/mouse, brings the cost to $1035 for the Dell Studio One versus $1200 for the Apple iMac-- still an obviously lower cost for the Dell AiO over the Mac, but there are configuration differences (like the screen size) that ultimately come down to a subjective value judgment (are 2" of diagonal space worth the $165 difference?). Moving to the professional models, the Mac Pro would compare to Dell's Optiplex line of computers, but good luck getting a similar enough comparison to be meaningful-- Dell's systems are capable of RAID without an add-in card, Apple's systems come with 3D-centric video cards while Dell's come with 2D-centric cards, and so on-- though in general the costs for a pro system of reasonable specs will come within $100 or so of each other, and in all honesty, with machines like that, purpose matters more to value than a measly few hundred dollars in either direction. The main point is that arguments from both sides are laced heavily with hyperbole and exaggeration-- while Apple's shift to x86 architecture has made Apple computers price-competitive, that does not justify claims that Apple computers are cheaper than like systems from non-Apple sources, nor do the configuration differences justify claims that Apple hardware is exorbitantly more expensive than comparable non-Apple systems. This is brand loyalty at its most obvious.
Now, there are segments of the computing market where Apple leaves some room for criticism-- in my own field (IT), Apple has very little in the way of support tools for managing its machines as workstations within a larger inventory, where HP, Dell, IBM and others have significant advantages-- but in terms of retail consumer computing, the only real factor in choosing between a Mac and a PC is personal taste, and there's nothing wrong with that. If there were really something inherently superior about Macs or PCs, the entire debate would have ended at least a decade ago, if not two. That arguments still persist in using objective criteria to support subjective opinions in the Mac vs. PC battles just supplies more fuel to the fire and more food for people's brand loyalties.