[KLUG Advocacy] Linux tutor.

Robert G. Brown advocacy@kalamazoolinux.org
Fri, 03 Oct 2003 22:47:07 -0400


On Fri, 03 Oct 2003 20:22:52, Adam Williams <awilliam@whitemice.org> wrote:


>>...What did you have to do with these data sets?
>>Merge? Sort? Produce reports? Something else/additional?
>All the above.  Mostly these "catalogs", large files of part numbers,
>descriptions, minimum order quantities, shipping charges, etc...  We
>pull together catalogs from about 900 vendors.  Some vendors format the
>numbers differently (leading zeroes, dashes, etc...) all those were
>normalized to a certain form based upon a set of rules, then all of the
>same parts (available from different vendors) were brought together
>under a stocking code.  Output analyzed for most effective purchasing
>policies, average lead times, etc...

OK, so it was real, honest-to-goodness general nuts-n-bolts kinda stuff.
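
Just to make that normalization step concrete: a minimal sketch in C of
the kind of rule Adam describes (the part-number format, the ten-character
key width, and the rules here are my own inventions for illustration, not
his actual ones). Keep the alphanumerics, uppercase them, drop leading
zeroes, pad to a fixed width, so the different vendors' renderings of the
same part collide on one key:

    #include <stdio.h>
    #include <string.h>
    #include <ctype.h>

    /* Normalize a vendor part number to a canonical fixed-width key:
     * keep only alphanumerics, uppercase them, drop leading zeroes,
     * then left-pad with zeroes to 10 characters.  Width and rules
     * are invented for illustration. */
    static void normalize(const char *raw, char out[11])
    {
        char buf[64];
        int n = 0;
        for (const char *p = raw; *p && n < 63; p++)
            if (isalnum((unsigned char)*p))
                buf[n++] = (char)toupper((unsigned char)*p);
        buf[n] = '\0';

        const char *s = buf;
        while (*s == '0' && s[1] != '\0')     /* strip leading zeroes */
            s++;

        size_t len = strlen(s);
        if (len > 10) { s += len - 10; len = 10; }   /* keep the tail */
        memset(out, '0', 10 - len);
        memcpy(out + (10 - len), s, len);
        out[10] = '\0';
    }

    int main(void)
    {
        /* three renderings of the "same" part from different vendors */
        const char *samples[] = { "00-1234-A", "1234A", "0001234a" };
        char key[11];
        for (int i = 0; i < 3; i++) {
            normalize(samples[i], key);
            printf("%-12s -> %s\n", samples[i], key);
        }
        return 0;
    }

All three inputs come out as "000001234A", which is the whole point: once
the keys match, the merge under a stocking code is just a sort.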

>Now imagine: A PC database (JLT-SQL), Borland's RAD tools, and Excel 5.0
>(I think) running on an 80486...
In the movies. Or with a bunch of C programmers, and a few jars of just 
the right "pharmaceuticals". At least, for the timeframe you're talking 
about. I would have recommended APL, and was getting good productivity
gains with that at the time, but really you're looking at a fairly poor
environment for this kind of thing... everything carries fairly high costs,
and you're not using the processors well at all.

The other problem (at the time) was also that the players were changing a lot 
of the rules, often too quickly to really complete large projects. Some folks
had to start over, sometimes with different performance characteristics. There 
were also a huge number of new tools coming out at the time, like Access, 
Crystal Reports, and SQL Server... too much to evaluate, and a lot of pressure 
to adopt one or more of 'em...


>..- OR - WingZ, awk, postgres, on an RS/6000?
Um, you've got a better shot at things here; people didn't appreciate WingZ
very much. I used it to slice through some problems, and of all the multipage
spreadsheet programs that were really serious, it was the easiest one to code
for at the add-in level. I wrote some stuff in C and C++ that went into WingZ
to give us some pretty powerful magic.

A lot of what commercial UNIX had to offer here was simply the plethora of tools
that were available. Even in the late 80's, you could wander around and 
download all sorts of data handling and manipulation tools off the Internet
(NSF-net at the time, having just moved over from ARPA) from all sorts of 
labs and universities. Licensing? Most of these guys didn't believe in any 
steenkin' licensing... here's a .tar file, read the code, maybe a man page or
an example file, and compile it. You can't? Pity. You had to be a big boy to
play in that sandbox (well, mature enough, and sure of yourself and what you
wanted). Support wasn't a joke, it didn't exist, but the users were mostly
programmers, so support was generated on-site, pretty much. The amount of 
software that worked about right was impressive, actually.


>>>Windows didn't (and doesn't) out-of-the-box 
>>>provide much in the way of useful tools.  The tools provided with the 
>>>$300 Borland C compiler, $199 database, $200 Office suite weren't that 
>>>impressive either.
>>Right, everything is extra, and in general, it still is. Put another way,
>>"Everything" is included in a Linux distribution (I'm thinking about the
>Not to mention the TIME involved to bring a workstation to a state of
>readiness.  On Linux I (a) install RedHat, (b) install Ximian, (c)
>install two Java apps (no reboot or logout required).  That's about an
>hour and a half.  The same task on XP is an entire workday of install,
>reboot, install, reboot, install, reboot....
That's right. In addition, you can make system image CDs completely legally 
if all the stuff is GPL'ed or close, and do a lot of heavy lifting that way.
If there's some stuff for which that can't be done, it's usually very small 
compared to the rest of the system. This tends to further speed things, for
shops that have a lot of the same workstations...

>>"general purpose" distros). It would be a nice exercise to take each
>>(major) component of a (really, any) Linux distro and value it based on
>>(let's say, the average) price of comparable commercial components. I
>>believe you would get a number no one would consider credible, unless
>>they were already familiar with Linux.
>Back in the day AIX was $1,400,  a comparable Windows installation with
>theoretically equivalent tools was about the same. (Early nineties, I've
>actually done that math).
OK, but what was the difference in the cost of the requisite hardware?

>Linux was $199: $10 for the CDs, and $189
>for Word Perfect.  WingZ ran on either Linux or AIX, at no charge.  Now
>Open Office is free (and better than WP ever was, and oocalc can pretty
>much hold its own).  We only purchase one $89 per seat package
>(DbVisualizer).
Driving TCO through the floor. It's interesting to see what's happening to
these numbers in the Windows world. 

I'd like to ask the group a question, and see people chime in with their 
own experience on this... are contemporary Windows installs more stable,
less stable, or about as stable as their counterparts of 5 years ago?

The reason I ask this is that it is the "other" large determining factor
in computing TCO (besides purchase/license costs for software). My analysis
indicates that while there are other costs, they tend to even out.

If I don't get good response, I might even post this as a separate message.

>>>And on a 80386 you needed a program to run for *hours* ...
>>I managed to miss that era, mostly. I was on mainframes and ... I managed
>>to avoid THIS particular nightmare.
>Consider yourself lucky.
Oh, I do! :)

>>I am part of an overseas joint venture that has a number of really old 
>>(486 66 MHz DX2's) computers, with Linux installed, and 24 MB of memory.
>>What we see is kids coming out of schools used to P-II's or better and
>>systems with RAM measured in hundreds of MB. We start them off on the 486's
>>because they have gotten very sloppy with memory use and so on, and that
>>makes for really bad software development. As they get acclimatized, we
>>REMOVE memory, but keep the demands for performance, etc. rather high.
>>When we get down to about 6 MB, with good software coming out, we've got 
>>a winner! Some people drop out, frustrated by the demands. Better to do
>>so in training, rather than on a project....
>That would be nice, there are way too many "declare a huge array" people
>bouncing around out there.
Yes, I recall (early 70's) people telling me that something WAS NOT WORTH 
DOING if it took up more than 32 K... yes, that's K with a "K", not M. I
wrote a lot of stuff in 28 K work areas, and felt that the light of day and
the blessings of the deity [or deities] shone upon me when I could use 128 K to 
do similar stuff.
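
For anyone who never had to work that way, the discipline is mostly about
never holding the whole data set at once: stream it through one small,
fixed buffer. A toy sketch in C (the 128-byte buffer and the
one-number-per-line input are arbitrary choices of mine; the point is the
shape of the code, constant memory no matter how big the input gets):

    #include <stdio.h>

    /* Sum a column of numbers from stdin through one small fixed
     * buffer, rather than slurping the file into a "huge array".
     * Assumes one number per line; memory use stays constant no
     * matter how big the input gets. */
    int main(void)
    {
        char line[128];     /* the entire working set, by design */
        double total = 0.0;
        long count = 0;
        double value;

        while (fgets(line, sizeof line, stdin) != NULL) {
            if (sscanf(line, "%lf", &value) == 1) {
                total += value;
                count++;
            }
        }
        printf("%ld values, total %.2f\n", count, total);
        return 0;
    }

The "declare a huge array" version of this reads the whole file into
memory first and gains exactly nothing for it.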

The last time I heard stories about doing plenty with no memory were as the 
USSR opened up, but before Western technology really got in there. A lot
of the really good Russian guys and the older American and European coders 
were kindred spirits, in that they were simply used to getting things done 
with almost no memory at all...

							Regards,
							---> RGB <---