MS actually will give you quite good help... if you pony up for the
several-hundred-buck-per-incident support phone lines. :-( (And there's no
charge if the problem turns out to be due to a bug, supposedly.)
Most people need to earn a living. In the long run it would be much better
for everyone if all programmers could be paid to write software AND have
that software be open source.
How much good-quality code has been lost because it was closed
source and was abandoned for one reason or another?
If the situation is such that commercial companies find it to their
benefit to pay for open source development, then in the long run
everyone benefits.
Because they are the ones distributing the binaries. Though I agree it
does not make much difference here - but in general how do we know if
the official sources contain all the modifications needed to make the
software build correctly? It is safest if the code is available, at
least in principle, from a single source (i.e. the same one that
supplied the binaries).
A hobbyist writes some code, and publishes it on his personal website.
A large corporation then distributes binaries with their product, and
refers their users to the author's website for the source. The author's
website either goes down or gets shut down for drastically exceeding its
bandwidth allowance.
A less common issue is that the site hosting the source may be less
accessible than the one hosting the binaries. E.g. the source may only be
available by CVS (which may be a problem if you're behind a firewall or
proxy which only allows web and email), or it may be on a shared community
site which emphasises free speech, and thus finds itself on the wrong side
of filtering proxies.
Hence the GPL's requirement that "equivalent" access means that the source
must be available from "the same place" as the binaries.
I'm sure they tested on "a lot" of platforms, but you can't test *all* of
them. Microsoft has an unfair advantage here: anyone who makes hardware
will test it for Windows. If they didn't, even Microsoft couldn't test
Windows against every piece of PC hardware.
It doesn't help when h/w vendors pull tricks like using the same product
code for a dozen substantially different versions of the hardware.
Of course, they'll adjust the supplied Windows drivers accordingly, but
Linux users are left having to read the part numbers on the chips to
figure out exactly which product they have.
I'm surprised that so many manufacturers seem to do this. I've always
figured it's been marketing-driven -- the marketing guys see that the WRT54G
is selling like gangbusters, so they figure it's "risky" to release, e.g., the
WRT55G. Still, they could at least release a WRT54Gv2, which to most
consumers still says, "better than the first!" and doesn't risk the loss of
brand recognition.
Chip vendors do the same thing. If volume gets high enough
they often switch to a new version of the silicon that is
less expensive to manufacture but still meets the same specs.
I gather that vendors have procedures so major customers get
notified and/or get a chance to test the new version without
getting surprised in case there is some obscure undocumented
feature that they are depending upon which is different in the
new silicon.
In the era of the 6502-based microcomputers (Acorn, Commodore), use of the
undocumented (i.e. undefined) opcodes was so common that some assemblers
included mnemonics for them. If you swap the CPU for a 65C02 (where the
undefined opcodes are all NOPs), a lot of software (esp. games) won't run.
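To make the failure mode concrete, here's a tiny (and deliberately simplified) Python sketch of the dispatch difference. It assumes opcode $A7 behaves as the undocumented "LAX zp" (load A and X from a zero-page address) on an NMOS 6502, and as a do-nothing NOP on a plain 65C02; the program name, memory layout, and two-CPU switch are all invented for illustration, and real parts differ in cycle counts and NOP lengths.

```python
# Sketch: why software using an undocumented NMOS 6502 opcode breaks
# when the CPU is swapped for a 65C02. Not cycle-accurate; $A7 = "LAX zp"
# (load A and X) is assumed for the NMOS part, a NOP for the 65C02.

def run(program, mem, cpu="nmos"):
    a = x = 0
    pc = 0
    while pc < len(program):
        op = program[pc]
        if op == 0xA5:              # LDA zp -- documented on both CPUs
            a = mem[program[pc + 1]]
            pc += 2
        elif op == 0xA7:            # undocumented LAX zp
            if cpu == "nmos":
                a = x = mem[program[pc + 1]]
            # 65C02: treated as a NOP, so A and X are left untouched
            # (real NOP lengths vary by opcode; skipping 2 bytes here).
            pc += 2
        else:
            pc += 1
    return a, x

mem = {0x10: 0x42}
prog = [0xA7, 0x10]                 # LAX $10

print(run(prog, mem, cpu="nmos"))   # A and X both loaded from $10
print(run(prog, mem, cpu="65c02"))  # instruction does nothing
```

A game that initialised X this way would simply see X = 0 on the newer chip and misbehave from there.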