Probably a lot of threads like this, but I'd like to hear your opinions.
Personally, I've never touched Windows 7. My experience is in OS X and Unix.
I think that OS X is basically a lesser version of Unix. The OS X layer on top of Unix is very unremarkable: nobody uses or cares about Carbon, Objective-C, Xcode, etc. It adds no functional power to Unix, which is sad because it starts from a very powerful kernel. What OS X does instead is make Unix painless. So it's fine for the end user but useless in a server-side or software-development environment.
Unix, in its unadulterated forms (BSD, Linux, etc.), is small, powerful, and reliable. Its advantage over OS X is basically:
OS X = Unix + Garbage
=> OS X is worse for all its garbage
Unix's advantage over Windows 98, NT, and XP is clear: Unix applications, unlike Windows applications, don't crash the system when they crash. Windows 98, NT, and XP simply allow applications too much access to the operating system, so when they crash, they often drag the OS down with them.
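To make that isolation point concrete, here's a minimal sketch of my own (nothing beyond plain POSIX, not tied to any particular Unix): a forked child writes through a NULL pointer and gets killed by the kernel, while the parent, and everything else on the machine, carries on.

#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();
    if (pid == 0) {
        /* Child: crash on purpose by writing through a NULL pointer. */
        int *p = NULL;
        *p = 42;                /* SIGSEGV is delivered to this process only */
        _exit(EXIT_FAILURE);    /* never reached */
    }

    /* Parent: the child's crash is contained; we just observe it. */
    int status;
    waitpid(pid, &status, 0);
    if (WIFSIGNALED(status))
        printf("child died from signal %d; parent (and the OS) still running\n",
               WTERMSIG(status));
    return 0;
}

The fault never leaves the child's address space; the kernel reaps the process and the rest of the system doesn't notice.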
But what about Windows 7? I've never touched it. If a process is going haywire, can it do a better job of disposing of that process than its predecessors did? Unix manages this by restricting kernel access far more than previous Windows releases did... has Windows 7 done the same? I also wonder how its boot-up and login speed compare with those of Linux.
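For comparison, this is roughly what "disposing of" a runaway process looks like on the Unix side. It's just a rough sketch of my own; the pid argument is a placeholder for whatever process has gone haywire.

#include <errno.h>
#include <signal.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 1;
    }
    pid_t runaway = (pid_t)atoi(argv[1]);

    /* Ask the process to shut down cleanly first... */
    if (kill(runaway, SIGTERM) == -1)
        perror("SIGTERM");

    sleep(2);

    /* ...and if it ignores that, SIGKILL can't be caught or ignored:
       the kernel simply destroys the process and nothing else is touched. */
    if (kill(runaway, SIGKILL) == -1 && errno != ESRCH)
        perror("SIGKILL");

    return 0;
}

The point is that the kernel, not the application, has the final say, so a misbehaving process can always be removed without taking the OS with it. My question is whether Windows 7 gives administrators the same guarantee.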
Furthermore, how is Windows 7 as a server? Does its size (the kernel plus its libraries must be at least twice the size of a Linux kernel) often rule it out of consideration? Clearly, Windows 7 is a good platform for software development, since I've been hearing that .NET is basically the best programming environment (overtaking Eclipse/Spring/Java). But as a server? Another thing that probably rules it out there is security vulnerabilities, from all the application vulnerabilities to something like the NetBIOS spoofing that iago wrote about.