• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you still see problems, let me know.

The Gates legacy

You're calling me computer illiterate? Hilarious. :rolleyes:

User friendly has EVERYTHING to do with operating systems, because operating systems need to be used by USERS.

Yes. Now explain to me why the OS has to be user friendly. Then spend a few minutes talking to someone who knows what an OS is, and they can explain the difference between an OS and the user interface. For (rule 8)'s sake - this is CS 101. They taught this stuff in school in my day.
 
XP was a crashy, unstable piece of crap with compatibility issues until at least SP1, just like Vista. And 98. And 95.

Okay, that's a lie. 95 was never any good, even patched up. ;)

Dang. I say one thing positive of MS and it gets shot down too. :o
 
Drivers. You knew there was a driver for XP just a click away. I think this is why more people have not jumped to Linux, and now to Vista.
I never had problems with drivers for essential peripherals with Linux.

I once installed Windows 2k3 at work for a Samba course. The machine had a common Intel EE100 Ethernet card. W2k3 didn't have a driver for it. So how was I going to get it without network connectivity? :jaw-dropp

ETA: I think the centralized driver development of Linux, where all drivers are part of the kernel, is much better than the decentralized approach of Windows, where hardware manufacturers put out low-quality drivers that run in kernel space.
 
I never had problems with drivers for essential peripherals with Linux.

To be fair, though, who decides what's essential? I know several techie friends who dumped OS/2 (which they otherwise loved) because of driver issues. I'm okay with Linux support for my Canon MP210, but my wife isn't interested in technical details; she just wants it to print.
 
This strikes me as thoroughly hypocritical, essentially saying that Netscape should have had a completely free hand through its "Netscape-enhanced" marketing to break web sites for users of anything other than Netscape's browser.
I don't say that. Browsers send out a "User-Agent" string, and web servers can serve one file or another depending on it. At a time when standards were still very much in flux, a decision to do two designs made sense; e.g., Netscape supported frames from version 3 on. Microsoft willfully used the same User-Agent string as Netscape so that web servers could not distinguish between the two.
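The User-Agent branching described above can be sketched in a few lines of Python. This is purely illustrative - the file names and the version check are hypothetical, not any real server's logic - but it shows both the mechanism and why IE sending a "Mozilla/..." string defeated it:

```python
def choose_page(user_agent: str) -> str:
    # Netscape identified itself as "Mozilla/<version> ...". A server could
    # serve a frames-based page to Netscape 3+ and a plain page to the rest.
    # Once IE also sent "Mozilla/..." (as "Mozilla/2.0 (compatible; MSIE ...)"),
    # this naive check could no longer tell the two browsers apart.
    if user_agent.startswith("Mozilla/"):
        major = user_agent.split("/")[1].split(".")[0]
        if major.isdigit() and int(major) >= 3:
            return "frames.html"
    return "plain.html"

print(choose_page("Mozilla/3.0 (Win95; I)"))                           # frames.html
print(choose_page("Mozilla/2.0 (compatible; MSIE 3.02; Windows 95)"))  # plain.html
print(choose_page("Lynx/2.8"))                                         # plain.html
```

Note that the "compatible; MSIE" UA string still slips past a check keyed only on the "Mozilla/" prefix and version - which is exactly the ambiguity being complained about.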

...while this is just plain historical revisionism by someone who presumably does not remember the controversy Netscape's non-standard tags produced before IE even entered the picture. The whole idea of using HTML to control layout at the pixel level was something Netscape invented, very badly and outside of any standardization process.
What tags are you referring to that "control layout at the pixel level"? Netscape did invent quite a few additional tags, at a time when standards were very much in flux and browser development was faster than standards development.

Let's see what the Wikipedia article on the browser wars says about Microsoft's tactics:
* Microsoft created a licensing agreement with AOL to base AOL's primary interface on IE rather than Netscape.
* Microsoft purchased and released a web authoring tool, FrontPage, making it easy to utilize proprietary extensions and non-standard HTML code in web pages.
* Microsoft included support for CSS in IE. Some web designers found it easier to write their pages for IE only than to support Netscape's proprietary LAYER extensions.
* Microsoft locked up a large portion of the Macintosh browser market in 1997 as part of its agreement with Apple that year. The agreement made Internet Explorer the default browser on the Mac for five years.
And I can add to that that Microsoft arm-twisted OEM PC manufacturers into not installing Netscape on their PCs with Windows pre-installed - see Judge Jackson's Findings of Fact.

I grant you that IE 4 was more standards-compliant than Netscape 4. But the FrontPage business at the same time was perverting standards. As was Microsoft's JScript variation on JavaScript. And the Microsoft Java implementation with unauthorized "extensions" - Microsoft paid Sun $2 billion in settlement for that.

Finally, while IE 4 was more standards-compliant, IE8 still does not implement CSS2, a 1998 standard - and while developing it, Microsoft publicly declared it had no intention of adhering to the standard.

In conclusion, your post strikes me as thoroughgoing Microsoft apologia.
 
To be fair, though, who decides what's essential? I know several techie friends who dumped OS/2 (which they otherwise loved) because of driver issues. I'm okay with Linux support for my Canon MP210, but my wife isn't interested in technical details; she just wants it to print.

See my story. I consider disk and wired network peripherals essential. Once you have those, you can hunt down that obscure audio/accelerated video/wifi/webcam/... driver for as long as you want. Or have a friend help you do that.

However, to be able to use that w2k3 machine on the network, I needed the Linux box next to it to download the driver. I think that's ludicrous.
 
Yes. Now explain to me why the OS has to be user friendly. Then spend a few minutes talking to someone who knows what an OS is, and they can explain the difference between an OS and the user interface. For (rule 8)'s sake - this is CS 101. They taught this stuff in school in my day.
Why the hell did you trim the rest of my post where I explained the rest? Do you just like beating strawmen, or is it that you can't make a coherent argument?
 
Computers also need to be used by power users who do understand the command line. Computers also need to be used by administrators. For such groups, a command line interface is indispensable. A computer needs both. MS Windows has been sorely lacking in the command line interface.
Agreed, but power users are a very small share of the PC market. When Windows came to the scene, it provided a very friendly interface that allowed PCs to be used by the general user, and its integration with DOS programs and backwards compatibility broke very little existing software.

That's why Linux distributions assemble those upgrades and test them before putting them out to the users.
As does Windows (something that Wudang can't seem to grasp, but I guess he's stuck in an uber-power-user "if you can't hop in and rewrite part of the operating system to make it work, you don't deserve a keyboard" mentality)
What's xenophobic about Apple? Its GUIs have been much more consistent than those of Microsoft. And OS X offers the best of both worlds: an absolutely slick GUI with the best command line there is :).
It's xenophobic to non-native hardware and non-approved software. It offers fewer developer tools than Microsoft and is much pickier about what can and can't run on its hardware.

That's why its users have grown to accept things the general market won't. There was much wailing and gnashing of teeth when Vista 64 broke some programs, yet it never broke as much as Leopard did, proportionally.
 
Why the hell did you trim the rest of my post where I explained the rest? Do you just like beating strawmen, or is it that you can't make a coherent argument?

Excuse me? Are you seriously asking me why I trimmed all the waffle where you confused the OS and the GUI, and then asking why I trimmed all the bits where you embarrassed yourself by confusing the OS and the GUI?

Answer: I was trying to save you the embarrassment when you talk to a computer literate friend and find out what an OS and a GUI are.

Here's a hint. I have used several OSes where the GUI was replaceable.
 
It's xenophobic to non-native hardware and non-approved software. It offers fewer developer tools than Microsoft and is much pickier about what can and can't run on its hardware.

That's why its users have grown to accept things the general market won't. There was much wailing and gnashing of teeth when Vista 64 broke some programs, yet it never broke as much as Leopard did, proportionally.

Good point. And with a better division between OS and GUI, and better documentation of that division, less might have broken.
 
See my story. I consider disk and wired network peripherals essential. Once you have those, you can hunt down that obscure audio/accelerated video/wifi/webcam/... driver for as long as you want. Or have a friend help you do that.

However, to be able to use that w2k3 machine on the network, I needed the Linux box next to it to download the driver. I think that's ludicrous.

Yes, but if you don't know what the word driver means (outside of golf) you're stuffed. I know what you mean (having just installed OpenSuse 11 on a PC connected to a Canon MP210 :-) ), and network connectivity is definitely a necessary condition; sadly, it's not a sufficient one.
 
Excuse me? Are you seriously asking me why I trimmed all the waffle where you confused the OS and the GUI, and then asking why I trimmed all the bits where you embarrassed yourself by confusing the OS and the GUI?

Answer: I was trying to save you the embarrassment when you talk to a computer literate friend and find out what an OS and a GUI are.

Here's a hint. I have used several OSes where the GUI was replaceable.
So have I. And the problem with them is, the OS can never break the GUI, or your average consumer will be pissed beyond all belief. That means that large-scale PC operating systems need an integrated GUI and OS (note that Windows, Mac, and Ubuntu have all chosen to deliver a GUI with the OS, and update them together). OS and GUI are synonyms, for all intents and purposes, in the modern PC world.

So go pretend that the 0.5% of the population who can be buggered to figure out what's wrong with a GUI when the OS breaks it is actually relevant for anything other than server applications.


Good point. And with a better division between OS and GUI, and better documentation of that division, less might have broken.
No, because Leopard fundamentally changed the OS (NOT the GUI). It was actually done with 10.1, but they included legacy environments until 10.5. They basically told their stable of developers "Update or your program breaks."

Microsoft never does that. Imagine, for one second, that a mission-critical piece of software breaks because of a new version of an OS. That happened to a small number of applications under Vista and people FLIPPED.
 
Agreed, but power users are a very small share of the PC market. When Windows came to the scene, it provided a very friendly interface that allowed PCs to be used by the general user, and its integration with DOS programs and backwards compatibility broke very little existing software.
Friendly interface??? Compared to what? I used MacOS, Sun OpenLook and various X11 window managers at the time, and they were way more user-friendly.

As does Windows (something that Wudang can't seem to grasp, but I guess he's stuck in an uber-power-user "if you can't hop in and rewrite part of the operating system to make it work, you don't deserve a keyboard" mentality)
Windows does not. Its drivers are written by hardware vendors. Whose software developers are not the most distinguished kernel developers, to say the least.

It's xenophobic to non-native hardware and non-approved software. It offers fewer developer tools than Microsoft and is much pickier about what can and can't run on its hardware.
I can't judge the availability of "developer tools". vim, make, gcc, javac and perl are my developer tools. They're more easily available on MacOS than on Windows (where I'd have to install Cygwin first). If you're looking for something GUI-ish, Eclipse is available for MacOS.
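For what it's worth, whether a given toolset is installed is easy to check programmatically. A small Python sketch (the tool list is just the one from the post above; which of them are present obviously depends on the machine):

```python
import shutil

def missing_tools(tools):
    """Return the subset of command names not found on the PATH."""
    return [t for t in tools if shutil.which(t) is None]

# On a stock Windows install, most of these would be reported missing
# until something like Cygwin is installed; on MacOS or Linux, few or none.
print(missing_tools(["vim", "make", "gcc", "javac", "perl"]))
```

`shutil.which` does the same PATH lookup a shell's `which` command does, so this is a portable way to make the "I'd have to install Cygwin first" point concrete.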

The "xenophobia" towards non-native hardware is actually a plus. It makes the OS more stable. MacOS users have fewer problems with the stability of their machines than Windows users. I wouldn't be surprised at all if the hours spent tinkering with Windows machines over such issues make their TCO actually higher than that of a MacOS box, despite the lower purchase cost.

That's why its users have grown to accept things the general market won't. There was much wailing and gnashing of teeth when Vista 64 broke some programs, yet it never broke as much as Leopard did, proportionally.
Did it? Evidence?
 
Friendly interface??? Compared to what? I used MacOS, Sun OpenLook and various X11 window managers at the time, and they were way more user-friendly.
Compared to any command line interface. Oh and the most common Open Look GUI was called Open Windows. Guess why?
Windows does not. Its drivers are written by hardware vendors. Whose software developers are not the most distinguished kernel developers, to say the least.
Windows comes pre-equipped with most drivers at the time the disc ships, and now checks for driver updates as part of its automatic updating system. Microsoft also works closely with companies to develop driver updates. So overall, I'd say they generally do.

I can't judge the availability of "developer tools". vim, make, gcc, javac and perl are my developer tools. They're more easily available on MacOS than on Windows (where I'd have to install Cygwin first). If you're looking for something GUI-ish, Eclipse is available for MacOS.
Oh for pete's sake. Those aren't developer tools. Those are languages.
The "xenophobia" towards non-native hardware is actually a plus. It makes the OS more stable. MacOS users have fewer problems with the stability of their machines than Windows users. I wouldn't be surprised at all if the hours spent tinkering with Windows machines over such issues make their TCO actually higher than that of a MacOS box, despite the lower purchase cost.
And makes it harder to develop software on the platform. Windows is less stable, but more open, which gave them a huge advantage early on in that more developers chose Windows over Mac for their programs. This produced a positive feedback loop (more programs = more functionality = more programs to develop for) that led to Windows being the primary environment for the home PC.

Windows won precisely because it lacked this xenophobia to both hardware and software. They won because of their developer tools and openness. You don't seriously think they won because of their frontend or because they beat apple to the market (and yes, I heard someone use that argument once), do you?
Did it? Evidence?
http://en.wikipedia.org/wiki/OS_10
 
So have I. And the problem with them is, the OS can never break the GUI, or your average consumer will be pissed beyond all belief. That means that large-scale PC operating systems need an integrated GUI and OS

OFFS! No, no, no! That has nothing to do with it. If you had the slightest shadow of a clue about systems development you would be hiding in a corner crying. Let's take baby steps - do you know the term VM?
 
OFFS! No, no, no! That has nothing to do with it. If you had the slightest shadow of a clue about systems development you would be hiding in a corner crying. Let's take baby steps - do you know the term VM?

Go play in the back with the rest of the 16 year olds who think they're so great because they can do in 10 minutes what I can do in 1, but with more commands.

If the OS and the GUI are developed separately, one day you will have a GUI that breaks or loses functionality thanks to an operating system update. The only way to prevent that is to do the GUI in house with the OS, or accept that you won't release updates frequently or quickly.

Why do you think that every single "Operating system" that has achieved significant usage comes with its own GUI? Ubuntu is an OS and a GUI (technically GNOME, but they install together by default), Mac distributes its OS and GUI together, and so does Windows. And they all have very user-friendly GUIs as defaults.
 
"OS and GUI are synonyms, for all intents and purposes, in the modern PC world. "

CAM and medicine are synonyms, for all intents and purposes, in the modern medical world.

If you mean Homeopathy and Acupuncture, then you better quickly go and tell the GNOME guys that their GUI doesn't do anything at all whatsoever, and that they're wasting their time on stuff that does nothing. :rolleyes:

That'll be funny. Quick, demonstrate more of your amazing knowledge. Wow me!
 
Compared to any command line interface. Oh and the most common Open Look GUI was called Open Windows. Guess why?
Great, MS-Windows 2/3 was better than DOS. Why was the OpenLook GUI called OpenWindows? Maybe because any GUI has windows? The name "windows" is not reserved for M$ Windows.

Windows comes preequipped with most drivers at the time of the disk launch, and now checks for drivers updates as part of its automatic updating system. Microsoft also works closely with companies to develop the driver updates. So overall, I'd say they generally do.
You didn't read my little tale about the Intel EE100 NIC. The idea of "certified drivers" is also a very recent one, by the way.

Oh for pete's sake. Those aren't developer tools. Those are languages.
vim? make? I just gave an impression of my personal idea of program development.

And makes it harder to develop software on the platform.
How the hell does less choice in hardware make it more difficult to develop software? Apart from the fact that hardware should be hidden behind APIs anyway. You're right about one thing: Apple has been pickier about their third-party developers adhering to interface guidelines - which makes for a more consistent user experience. But I've also seen a couple of programs that didn't adhere to that.

Windows is less stable, but more open, which gave them a huge advantage early on in that more developers chose Windows over Mac for their programs. This produced a positive feedback loop (more programs = more functionality = more programs to develop for) that led to Windows being the primary environment for the home PC.

Windows won precisely because it lacked this xenophobia to both hardware and software. They won because of their developer tools and openness. You don't seriously think they won because of their frontend or because they beat apple to the market (and yes, I heard someone use that argument once), do you?
In the beginning, Microsoft certainly wooed their third-party developers. Now, not so much anymore. But the battle between Apple and Microsoft had been decided much earlier, and for two very simple reasons: IBM had introduced the PC, and that's what business went with. The clones gave the PC an affordable price for the home consumer. DOS/Windows was a necessary evil that came with it.

Huh? The only thing I see is that Leopard dropped the Classic emulation. About time too. No need to carry endless amounts of old cruft with you.
 
