Not looking to start a war over which is better, just looking to update my 
knowledge.
I'm also not looking to start an argument, just sharing my knowledge, which 
can be incomplete & flawed.
People start doing things a certain way ... there may be 10 different ways 
out there, but people find a way that works for them & they stick with it, 
until they face overwhelming evidence that it is time to change how things are done.
In the early days of computer development there were many models for 
keeping track of records, some of which had poor ability to recover when 
something went down.
A PC locks up & has to be rebooted ... it was in the middle of updating the database.
The ISP or Ma Bell connection goes down ... a whole bunch of people were in the middle 
of updating the database.
A power supply fails ...
There are ways to protect against these failures, and to do a better job of 
recovering if they happen, but this requires an investment that some 
top management cannot be persuaded to make.
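One common protection is transactional commit with journaling: an update either completes in full or is rolled back, so a crash mid-update leaves the database in its last consistent state instead of half-written. Here's a minimal sketch of the idea using SQLite (my own illustration with made-up table names, not any particular IBM or Microsoft product):

```python
import os
import sqlite3
import tempfile

# Hypothetical example database; SQLite journals each transaction, so an
# interrupted update rolls back to the last committed state on recovery.
path = os.path.join(tempfile.mkdtemp(), "orders.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, qty INTEGER)")
con.execute("INSERT INTO orders VALUES (1, 10)")
con.commit()

try:
    with con:  # opens a transaction; commits on success, rolls back on error
        con.execute("UPDATE orders SET qty = 99 WHERE id = 1")
        raise RuntimeError("simulated power failure mid-update")
except RuntimeError:
    pass  # the half-finished update above was rolled back, not half-applied

qty = con.execute("SELECT qty FROM orders WHERE id = 1").fetchone()[0]
print(qty)  # 10 -- the database came back to its last consistent state
```

The simulated failure stands in for any of the outages above: the key point is that the control information (the journal) lets recovery undo the incomplete work automatically, instead of a day's work having to be re-entered.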
So we have a history of such failures, and certain types of database 
design got a reputation for being very hard to recover after such an outage:
variable field lengths, and chained files where control fields are disconnected 
from their data.
When recovery means data is lost, or a day's work has to be re-entered, 
people want to steer clear of those data designs, even though computer 
evolution may mean that is no longer a risk.
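To see why designs like that got their reputation, here's a toy illustration (my own, not any real file system): variable-length records where each record's length prefix is the control field. Damage one length field and every record after it becomes unreadable, because the reader no longer knows where the next record starts.

```python
import struct

def pack(records):
    # Each record is stored as a 2-byte big-endian length prefix
    # (the "control field") followed immediately by its data.
    return b"".join(struct.pack(">H", len(r)) + r for r in records)

def unpack(blob):
    # Walk the file by trusting each length prefix to find the next record.
    out, i = [], 0
    while i + 2 <= len(blob):
        (n,) = struct.unpack_from(">H", blob, i)
        i += 2
        out.append(blob[i:i + n])
        i += n
    return out

blob = pack([b"alpha", b"bravo", b"charlie"])
print(unpack(blob))  # [b'alpha', b'bravo', b'charlie'] -- all readable

# Corrupt a single byte of the first length field: the reader now
# misreads record boundaries and everything after the damage is garbled.
corrupt = bytes([blob[0], 0xFF]) + blob[2:]
print(unpack(corrupt))
```

A fixed-length design, by contrast, can resynchronize at any record boundary by simple arithmetic, which is part of why recovery-minded shops favored it.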
I am speaking in general terms.  We know IBM has a bunch of competitors 
with similar mindsets.  We know that Microsoft Windows interfaces came 
from the Apple Macintosh, but Apple can't sue Microsoft for intellectual 
property theft, because Apple originally took that technology from Xerox 
PARC.
We IBM customers come from a mission-critical mindset: inaccuracy is 
unacceptable; database integrity is an absolute necessity; dirty data must 
be identified, cleaned out, and actions taken to prevent its regeneration; 
security matters.
Not everyone in the IBM customer base adheres strictly to these principles.
In the early days of the IBM model, computer technology was astronomically 
expensive, while workers were cheap, a dime a dozen.  Convenience & efficiency 
for the work force were not important.  But times are changing.  There has 
been an evolution in IBM interfaces to improve productivity for programmers 
and others, inspired by Microsoft interfaces.
Microsoft's roots were in a mindset of extreme low cost, convenience, nice 
looks, and easy access.  The original applications were not business data but 
home computer games and word processing.  Over time Microsoft moved into the 
IBM customer world, hit problems, and addressed them; some people are not 
happy with the speed of getting to match IBM quality.
Many users get to see both what IBM can do and what Microsoft can do, and they want both.
Some day we will have the best of both worlds.  But we are not there yet.
Al Macintyre
40 year veteran of midrange computers
often X years of experience means repeating the same year X times 
 