[KLUG Advocacy] Linux tutor.

Adam Williams advocacy@kalamazoolinux.org
Sat, 04 Oct 2003 07:52:18 -0400


> >>...What did you have to do with these data sets?
> >>Merge? Sort? Produce reports? Something else/additional?
> >All the above.  Mostly these "catalogs", large files of part numbers,
> >descriptions, minimum order quantities, shipping charges, etc...  We
> >pull together catalogs from about 900 vendors.  Some vendors format the
> >numbers differently (leading zeroes, dashes, etc...) all those were
> >normalized to a certain form based upon a set of rules, then all of the
> >same parts (available from different vendors) were brought together
> >under a stocking code.  Output analyzed for most effective purchasing
> >policies, average lead times, etc...
> OK, so it was real, honest-to-goodness general nuts-n-bolts kinda stuff.
> >Now imagine: A PC database (JLT-SQL), Borland's RAD tools, and Excel 5.0
> >(I think) running on an 80486...
> In the movies. Or with a bunch of C programmers, and a few jars of just
> the right "pharmaceuticals". At least, for the timeframe you're talking
> about. I would have recommended APL, and was getting good productivity
> gains with that at the time, but really you're looking at a fairly poor
> environment for this kind of thing... everything carries fairly high costs,
> and you're not using the processors well at all.

Yep, writing C code to transform numbers based on "regular expressions"
(I didn't know I was re-inventing them at the time), and to do other
things of that sort, is really NON-OPTIMAL.  I didn't even know anything
like APL existed.  Sometimes it was easiest to chunk the numbers a few
thousand (that's thousands, not even tens of thousands) at a time
through the spreadsheet.  When I found awk on AIX I thought I'd died and
gone to heaven.  I was really surprised that there were no equivalent
tools on the "PC": "I mean WTF?  Doesn't anybody know about these?"  The
answer was "Apparently not", so risking $10 on some funky OS that
claimed to run on my PC *AND* provide those tools?  Easily worth the
risk.
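
For the curious, a minimal sketch of the kind of normalization awk made
trivial.  The pipe-delimited layout, the field positions, the two rules,
and the file names here are invented for illustration - not our actual
catalog format or stocking-code rules:

  #!/usr/bin/awk -f
  # Hypothetical catalog layout: vendor|part_number|description
  # Normalize part numbers, then collect the vendors offering each
  # normalized number under a single key (a stand-in for the real
  # stocking-code rules).
  BEGIN { FS = "|" }
  {
      part = $2
      gsub(/-/, "", part)     # strip embedded dashes
      sub(/^0+/, "", part)    # drop leading zeroes
      vendors[part] = (part in vendors) ? vendors[part] " " $1 : $1
  }
  END {
      for (p in vendors)
          print p ": " vendors[p]
  }

Run it as "awk -f normalize.awk catalog.txt" and it chews through the
whole file in one pass - the job I had been hand-rolling C for, or
feeding to a spreadsheet a few thousand rows at a time.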

> The other problem (at the time) was also that the players were changing a lot
> of the rules, often too quickly to really complete large projects. Some folks
> had to start over, sometimes with different performance characteristics. There 
> were also a huge number of new tools coming out at the time, like Access, 
> Crystal Reports, and SQL Server... too much to evaluate, and a lot of pressure 
> to adopt one or more of 'em...

Yep, even Borland was mixing it up between every release of their
software (and at $300 a whack).  Access looked way too much like dBase,
and I didn't want to go there; I had enough tools that couldn't deliver
as it was.

> >..- OR - WingZ, awk, postgres, on an RS/6000?
> Um, you've got a better shot at things here, people didn't appreciate Wingz

People didn't, but I did.  It was "odd" about certain things, and they
could have learned some UI lessons from the Excel folks, but it could
process data - which was sort of the point.  Excel could think about,
and possibly daydream about, processing data.

> very much. I used it to slice through some problems, and of all the multipage
> spreadsheet programs that were really serious, it was the easiest one to code
> for on the add-in level. I wrote some stuff in C and C++ that went into Wingz
> to give us some pretty powerful magic.

Never wrote any C code for WingZ; I was happy it didn't puke every
fifteen minutes.

> A lot of what commercial UNIX had to offer here is simply the plethora of tools
> that were available. Even in the late 80's, you could wander around and 
> download all sorts of data handling and manipulation tools off the Internet
> (NSF-net at the time, having just moved over from ARPA) from all sorts of 
> labs and universities. Licensing? Most of these guys didn't believe in any
> steenkin' licensing... here's a .tar file, read the code, maybe a man page or
> an example file, and compile it. You can't? Pity. You had to be a big boy to

Right, you could dial up via someone like Delphi and get into
sunsite.unc.edu.  Gold mine!  I didn't know anything about licensing; I
just knew I could get it, and didn't care.  Although I did learn to
loathe Imake.

> play in that sandbox (well, mature enough, and sure of yourself and what you
> wanted). Support wasn't a joke, it didn't exist, but the users were mostly

And support existed for doing any real work on a PC?  Only for more $$
than the PC cost.

> programmers, so support was generated on-site, pretty much. The amount of
> software that worked about right was impressive, actually.

True, but it wasn't all that bleak.  Those thick white volumes from IBM
documented everything (and I mean everything down to the parameters for
the various system calls).  One could at least take a stab at it.  And
on the PC... "This application has performed an illegal operation" 
(i.e. "Tough cookies").

> >>>Windows didn't (and doesn't) out-of-the-box 
> >>>provide much in the way of useful tools.  The tools provided with the 
> >>>$300 Borland C compiler, $199 database, $200 Office suite weren't that 
> >>>impressive either.
> >>Right, everything is extra, and in general, it still is. Put another way,
> >>"Everything" is included in a Linux distribution (I'm thinking about the
> >Not to mention the TIME involved to bring a workstation to state of
> >readiness.  On Linux I (a) install RedHat, (b) install Ximian, (c)
> >install two Java apps (no reboot or logout required).  That's about an
> >hour and a half.  The same task on XP is an entire workday of install,
> >reboot, install, reboot, install, reboot....
> That's right. In addition, you can make system image CDs completely legally
> if all the stuff is GPL'ed or close, and do a lot of heavy lifting that way.
> If there's some stuff for which that can't be done, it's usually very small 
> compared to the rest of the system. This tends to further speed things, for
> shops that have a lot of the same workstations...

And restore the system image to a different workstation, with different
hardware - Linux cares how much?  You might have to reconfigure X and
sound.  Windows XP?  Hah!

> >>"general purpose" distros). It would be a nice exercise to take each
> >>(major) component of a (really, any) Linux distro and value it based on
> >>(let's say, the average) price of comparable commercial components. I
> >>believe you would get a number no one would consider credible, unless
> >>they were already familiar with Linux.
> >Back in the day AIX was $1,400,  a comparable Windows installation with
> >theoretically equivalent tools was about the same. (Early nineties, I've
> >actually done that math).
> OK, but what was the difference in the cost of the requisite hardware?

You got it there.  RS/6000 530 - $57,000.  Compaq Presario - $2,100.

But about 100 people were using that RS/6000 at the same time I was
doing stuff that would bring the Presario to the ground.

$57,000 / 101 = $564.36 per user.
$2,100 / 1 = $2,100 per user.

Aren't we talking about TCO?  Yeah, you'd have to add $350 per user for
a terminal.  So the RS/6000 came out at $914.36.

Of course you did inherit some limitations - no Excel, etc...  But at
the time these really weren't considered earth-shaking, if most of the
people even knew what a spreadsheet was (cough, most still don't but
think they do).

> >Linux was $199 - $10 for the CDs, and $189
> >for Word Perfect.  WingZ ran on either Linux or AIX, at no charge.  Now
> >Open Office is free (and better than WP ever was, and oocalc can pretty
> >much hold its own).  We only purchase one $89 per seat package
> >(DbVisualizer).
> Driving TCO through the floor. It's interesting to see what's happening to
> these numbers in the Windows world. 

Up, up, and away.

> I'd like to ask the group a question, and see people chime in with their 
> own experience on this... are contemporary Windows installs more stable,
> less stable, or about as stable as their counterparts of 5 years ago?

Oh, much, much more so.  XP doesn't spontaneously crash, or at least
I've never seen it.  Individual processes hang (to where they don't even
paint their windows anymore, just grey), sometimes for minutes at a
time, with no apparent rhyme or reason - and, again, no tools provided
to do any meaningful analysis of the problem (fuser, lsof, etc... but I
think they have netstat now).  I've crunched my data in Excel on XP; it
is WAY beyond where it was.  It is still REALLY slow doing the same task
and consumes enormous amounts of RAM (this is actually Excel's fault, I
think) - but it does work, and RAM is cheap.

> The reason I ask this is that it is the "other" large determining factor
> in computing TCO (besides purchase/licence costs for software). My analysis
> indicates that while there are other costs, they tend to even out.

With twice the hardware and about $1,500 worth of software I could do my
job on XP with no intense pain (other than actually needing to
Edit->Cut, Edit->Paste, all the time).  There are now sufficient command
line tools to do lots of basic tasks quickly (ipconfig, nbtstat, arp,
route, pac, etc...)  I really think the ongoing cost would be roughly
the same, except for the annual upgrade fees.  But I don't use support,
ever.  I have support contracts with IBM for hardware, and Informix for
the RDBMS software, but nothing for PCs.  And I call on those contracts
maybe once a year.

> If I don't get good response, I might even post this as a separate message.

And you might not get one even then.

> >>I am part of an overseas joint venture that has a number of really old 
> >>(486 66MHz DX2's) computers, with Linux installed, and 24 MB of memory.
> >>What we see is kids coming out of schools used to P-II's or better and
> >>systems with RAM measured in hundreds of MB. We start them off on the 486's
> >>because they have gotten very sloppy with memory use and so on, and that
> >>makes for really bad software development. As they get acclimatized, we
> >>REMOVE memory, but keep the demands for performance, etc. rather high.
> >>When we get down to about 6MB,  with good software coming out, we've got 
> >>a winner! Some people drop out, frustrated by the demands. Better to do
> >>so in training, rather than on a project....
> >That would be nice, there are way too many "declare a huge array" people
> >bouncing around out there.
> Yes, I recall (early 70's) people telling me that something WAS NOT WORTH 
> DOING if it took up more than 32 K... yes, that's K with a "K", not M. I
> wrote a lot of stuff in 28 K work areas, and felt that the light of day and
> the blessings of the deity[ies] shone upon me when I could use 128 K to
> do similar stuff.

Was that 128 K without bank switching?  Hallelujah is right.

> The last time I heard stories about doing plenty with no memory were as the 
> USSR opened up, but before Western technology really got in there. A lot
> of the really good Russian guys and the older American and European coders
> were kindred spirits, in that they were simply used to getting things done 
> with almost no memory at all...

Someone should go back to manufacturing the Commodore PET, just for
training purposes.  Talk about agony.  OK, maybe we'll give them a
KayPro II.