How to make your Windows machine more stable and secure

muzzy:
I have to disagree with you about this. The concept behind open source software is peer review.

Basically, and I am sure you know this, if the source code is open, then potentially thousands upon thousands of people are looking over it, with a view to wiping out any holes, malware, inefficiencies et cetera. With closed source code like MS Windows, only the Microsoft developers get to see it, therefore only they get to bugfix it. Thousands versus perhaps one floor (at the most, I suspect) of nine-to-fivers.

Too bad this doesn't work in practice. People don't read the uninteresting parts at all. Was it PGP, or where exactly, where the key generation wasn't very random at all and nobody noticed? The code was so obviously flawed that everyone should've realized it was broken. Yet nobody noticed, for a full year. Just because the code is available doesn't mean anyone's going to review it. In Linux, this means that rarely used drivers and other rarely used things aren't likely to be read by many people.
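
To give an idea of what "obviously flawed" key generation looks like, here's a minimal C++ sketch of the general problem (not the actual PGP code, and make_session_key is a made-up name, just an illustration): the key material is only as unpredictable as a time-based seed, which anyone reading the function should spot immediately, yet nobody spots it if nobody reads the function.

    #include <cstdio>
    #include <cstdlib>
    #include <ctime>
    #include <vector>

    // Generic illustration (NOT the real PGP code) of weak key generation:
    // every "random" byte comes from rand() seeded with the current time,
    // so the key is only as unpredictable as the timestamp it was made at.
    std::vector<unsigned char> make_session_key(std::size_t len) {
        std::srand(static_cast<unsigned>(std::time(0)));   // guessable seed
        std::vector<unsigned char> key(len);
        for (std::size_t i = 0; i < len; ++i)
            key[i] = static_cast<unsigned char>(std::rand() & 0xff);
        return key;
    }

    int main() {
        std::vector<unsigned char> key = make_session_key(16);
        for (std::size_t i = 0; i < key.size(); ++i)
            std::printf("%02x", static_cast<unsigned>(key[i]));
        std::printf("\n");  // looks random enough, but anyone who can guess
                            // roughly when it was made can regenerate it
    }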

And there's my second point. These open source coders are all (...snip...) doing it for the love of it, while the coders at Microsoft are being paid a salary to do it. Amateurs will naturally have a more personal interest in fixing bugs and making stuff work right. People who have to file paperwork and who will collect their paycheck whatever happens are less likely to be quite so ambitious and successful from the point of view of "good" code, in my opinion.

Just because people are interested doesn't mean they're good. Amateurs write crappy code and don't even realize it themselves. To them, usually the only thing that counts is that the code works. Theoretically this is OK; programming should indeed be goal oriented, and the primary goal is to have something that works. However, just because something seems to work doesn't mean it does work. In many open-source projects I've seen, there are clear indications that the developers don't even know the language they're using. These include C++ projects where pointers are tested for null before being deleted, "OO" code where all the classes are glorified monostate patterns or worse, and all sorts of stuff that just makes you go wtf. I'm well aware that similar stuff happens in commercial products for the same reasons, but professional programmers still tend to write better code than a bunch of amateur geeks.
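
For instance, the null-check habit looks like this (a tiny made-up C++ example, not taken from any particular project). The guard is pure noise, because deleting a null pointer is already defined to do nothing:

    #include <cstddef>

    // What you see in the wild: a redundant guard around delete.
    void release_old(char* buf) {
        if (buf != NULL) {   // pointless: deleting a null pointer is a no-op
            delete[] buf;
        }
    }

    // What the language already guarantees, so this is all you need:
    void release(char* buf) {
        delete[] buf;        // safe even when buf is null
    }

    int main() {
        release_old(NULL);   // fine
        release(NULL);       // also fine, with half the code
        char* p = new char[64];
        release(p);
    }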

Basically, and I am sure you know this too, the whole thing is explained *perfectly* in ESR's book "The Cathedral and the Bazaar", which I cannot recommend enough if you are not familiar with it already.

I'm familiar with it; however, you are making some funny assumptions there. First, you assume that the only difference between a professional and an amateur is that the professional gets paid and the amateur doesn't. A lot of free software development is done by professionals who are developing software as a hobby, too. That stuff tends to be good, and those people tend to know what they're doing. However, a large amount of the amateur development only works because the said amateurs need the software and are going to fix the bugs when they run into them. If you assume that professionals write code as sloppy as the average amateur's but won't fix issues unless they get paid, then obviously you can conclude the very same things you already assumed. See the flaw in the logic here?

Now, even while Linus is taking good care to see that total crap doesn't get into the kernel, the submitted patches are still what they are. There's this great saying, "If operating systems were beer, Linux would be an empty barrel into which everyone could pee", which is something I think about every time I have to go through some of those sources.

Also, if we extend the quality comparison to userland, Microsoft still has professionals writing most of the non-kernel parts of the system, while what you get to run on Linux comes from zillions of sources and is subject to zillions of different programming practices, levels of testing and such. Reminds me of this one "secure finger daemon" and the funny advisory about it on Bugtraq. Whoever wrote it decided to make the socket calls blocking, so the damn thing could be DoS'd by merely opening a single connection against it. Furthermore, it contained serious holes (symlinking .plan to any file, then reading it through finger) and so on. This is the kind of stuff that amateurs write, and being amateurs they have no idea how much their stuff really sucks.
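
To make the blocking-socket point concrete, here's a minimal sketch of that kind of design (hypothetical code and port number, not the daemon from that advisory, error checking left out): a single-threaded server that does one blocking read() per client, so an attacker who connects and then stays silent parks the whole service.

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        int srv = socket(AF_INET, SOCK_STREAM, 0);

        sockaddr_in addr = {};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(7979);            // arbitrary port for the sketch
        bind(srv, (sockaddr*)&addr, sizeof addr);
        listen(srv, 5);

        for (;;) {
            int client = accept(srv, 0, 0);     // blocking accept is fine...
            char user[64] = {0};
            // ...but this read() blocks with no timeout, no fork, no threads.
            // A client that connects and never sends a byte keeps the loop
            // stuck right here and nobody else gets served: a single-connection DoS.
            ssize_t n = read(client, user, sizeof user - 1);
            if (n > 0)
                write(client, "no such user\n", 13);  // pretend lookup/.plan output
            close(client);
        }
    }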

Kintaro:
Too bad this doesn't work in practice. People don't read the uninteresting parts at all. Was it PGP, or where exactly, where the key generation wasn't very random at all and nobody noticed? The code was so obviously flawed that everyone should've realized it was broken. Yet nobody noticed, for a full year. Just because the code is available doesn't mean anyone's going to review it. In Linux, this means that rarely used drivers and other rarely used things aren't likely to be read by many people.

Uhm, PGP isn't open source.

You provide no news sources with any of your arguments either; credibility-wise you're on about the same level as Fox News.

jtpenrod:
Yes, I didn't provide very good reasoning. By the same logic, your story above about Win-XP eating people's work is equally worthless.

No. It is not. It is, indeed, the same point you were trying to make concerning the 2.6.x kernel. Now you can damn Linux all you want for 2.6.x not being "perfect" right from the get-go. However, Windows has the same problem. Testing under fire is really the only way to make certain everything's OK. Given the various combinations of processors, mo-bo's, and miscellaneous hardware, it's a wonder that anything other than Macs runs at all. Given that, I have seen far fewer problems with Linux than with Win-whatever.

Furthermore, with Linux you do not get onerous EULAs, activation headaches, nagware, spyware, an op-sys filled with "daemons" that like to go running home to mother every time you go on-line. This is why I prefer to use Linux. If these things don't bother you, then go ahead and use Win and be happy. But know this: you won't convince me to return to the Redmond fold. I get considerably more value from Linux than I can from any current Microsoft offering. I don't need it; I don't want it.


Dynamic languages don't need to be interpreted. Also, OCaml isn't just an interpreted language. It can be compiled to native code, and I know people who say it's really damn fast. No, I don't have personal experience; that's why I said it's been claimed so. Obviously, benchmarking against C++ compilers would suck because the two languages are just so different. However, let's make those comparisons anyway [...] Go ahead, you'll see that OCaml ranks quite high in the list, even though you can question the methods of benchmarking. You'll also see that Ruby scores quite low.

I knew this already; I said as much.

So, wouldn't the best approach to solving the problem be user education?

Too late for that. The marketing weenies have already convinced all too many users that "education" is not necessary. I don't see this changing any time soon.

muzzy:
Uhm, PGP isn't open source.

You provide no news sources with any of your arguments either; credibility-wise you're on about the same level as Fox News.

Ah, "open source" vs. "Open Source". Ok, so their license doesn't conform to the Open Source Initiative, and their definition of Open Source. Here's a reference:

http://cryptome.org/cipn052400.htm#pgp

My point was that even though the source was available to everyone to read, it doesn't get "peer reviewed" if nobody's interested in reading it.

muzzy:
Now you can damn Linux all you want for 2.6.x not being "perfect" right from the get-go. However, Windows has the same problem. Testing under fire is really the only way to make certain everything's OK. Given the various combinations of processors, mo-bo's, and miscellaneous hardware, it's a wonder that anything other than Macs runs at all. Given that, I have seen far fewer problems with Linux than with Win-whatever.

If we disregard the Win9x series, I've had way more problems with Linux than with Windows. And I mean real problems, such as Netscape crashing the whole of X, strange kernel panics on the same system on which Windows worked fine, etc. On Windows 2000 there were initially some problems with memory management (the "out of buffer space" problem), but those were patched long ago. On Windows 2003, I can't remember having a single problem related to Windows itself, only to third-party apps.


Furthermore, with Linux you do not get onerous EULAs, activation headaches, nagware, spyware, an op-sys filled with "daemons" that like to go running home to mother every time you go on-line. This is why I prefer to use Linux. If these things don't bother you, then go ahead and use Win and be happy. But know this: you won't convince me to return to the Redmond fold. I get considerably more value from Linux than I can from any current Microsoft offering. I don't need it; I don't want it.

I assume you mean GNU/Linux in this context. If you want to use only GNU software, you can avoid all of the above-mentioned crap on Windows as well. You don't have to use any software you don't trust; I definitely don't.

Linux may well be more suitable for you, and as I said, it's probably better for a lot more people because it's simpler than Windows. Windows is more complex and way tougher to learn. Despite Windows being marketed to clueless folk, Windows itself hasn't been designed for newbies. It's a serious OS for serious people, and currently (imo) the biggest problem is the amount of work it takes to configure one properly. If the default installation weren't so braindead, a lot of you guys would appreciate the whole system more.
