Sometimes we geriatrics know things

April 2, 2014

The following observation was made in a Facebook comment responding to someone who had said that her current work at a law firm was related to a course she had done in law school, but that she had forgotten all about it as soon as she closed the books.

At least it isn't outdated before you even graduate, unlike IT

I replied:

If you do a degree in IT and it's all out of date when you finish then you haven't been taught (actually, educated) properly. You have probably just been doing a sort of specialised trade course. The principles haven't changed for decades, just the tools.

The boss of a major retail chain once declared that nobody over the age of 30 was allowed to work in the computer sections of his stores because old people couldn't understand technology. I was grateful to him for this because it made my shopping simpler as I didn't have to consider any of his shops when I wanted to buy anything. And I mean "anything". If he wants to insult my competence at what I do for a living then I don't need to do any business with him at all.

The computer hardware and software business is often described as the fastest-changing business in the history of the world. This is quite possibly true, but the advances have been in how things are made, with those advances in hardware making it possible to do things that could only have been dreamed about before. There is an IT meme known as Moore's Law. Gordon Moore was one of the founders of Intel, and his law states that the number of transistors that can be put on a given area of integrated circuit doubles approximately every two years. This has been remarkably consistent over the last few decades. To put it into a sort of perspective, if you had acquired an Australian 10 shilling note in 1948 (the year the transistor was patented), doubled your money every year, changed over to dollars in 1966 and converted it all to $100 notes when they came out, you would now have a stack of money that would reach to the Sun. 174 times. Approximately.

I recently attended a memorial service for the person who gave me my first job as a programmer. He had worked at Cambridge University on the world's first commercial stored-program computer. One of the speakers at the ceremony had been working at the University of Manchester at the same time on a competing project. (Cambridge and Manchester still argue about who got there first. It seems that both did for certain values of "first".) I commented to someone that my mobile phone has a MicroSD card holding 16 gigabytes of data in a package about the size of the nail on my little finger. (It's that big because making it smaller would make it difficult to handle without tweezers.) The first computer installation I worked in had 504 kilobytes of memory, which might sound like a lot but it was spread across four machines.

Yes, this is a business which has seen enormous changes at an enormous rate over very few years. But as I said above, it has been the advances in hardware which have allowed us to do all the wonderful things we can now do in software. Many of the things we take for granted today were thought of long before they were practical and had to wait until Moore's Law produced the hardware that moved them from idea to reality. Anyone who has done programming for a graphical interface will be familiar with the concept of dragging screen objects to where you want them to appear in the final product and leaving all the messy stuff like producing the computer code to make it all work to some program running out of sight in the background. I've had lunch with the person who invented that idea (and didn't bother to patent it!), but when it was first developed it had no practical use because the computer power needed was too expensive for most potential users. When I did statistics at university I used an analysis program that cost the university $100,000 and needed an expensive minicomputer to make it work. All the functions I used were added as a free update to Microsoft Excel a few years later, once desktop computers had achieved the necessary capability.

I am continually amazed, however, at the ignorance of history shown by supposed IT experts, particularly those who claim to have specialist knowledge. I remember a few years back when the senior IT journalist for a major Australian media company wrote an article ridiculing Byte magazine for giving a lifetime achievement award to Commander Grace Hopper of the US Navy. His argument was that it was highly unlikely that any woman had had anything to do with computers back in the 1950s and 60s, and that even if she had there was no possibility that the Navy could have had any contribution to make. Commander Hopper (later promoted to Rear Admiral) was one of the people responsible for the development of the first higher-level programming languages (which enable programming to be done in a human-readable form rather than in machine code), played a large part in the creation of the FLOW-MATIC and COBOL programming languages, and coincidentally popularised the term "bug" in the lexicon of computers. A professional IT journalist not knowing about Grace Hopper is like a physicist being unaware of Newton. (Even the best can make mistakes, however. I found a picture dated 1952 of Grace Hopper with an early Univac machine on a computer history site. She is holding a COBOL manual. COBOL was not developed until 1959.)

I've been working with computers since before the invention of the Internet (yes, youngsters, there really was a time when no two computers were networked), so I'm entitled to some anecdotes.

Anecdote 1:
In the mid 1990s I was supporting ACT! for a client who sold photocopiers. Colour copiers that could be used as computer printers had just come on the market and the client was marketing them to graphic artists. As part of the campaign they had set up a little facility in their sales display area using an Apple Macintosh computer (because that was what most of their customers and prospects used).

I came in one day and was asked if I knew anything about Macs. They had approached three Macintosh experts, none of whom could solve their problem, which was that Photoshop would not load and start on the computer. Three experts! I fiddled around with things on the screen (I was not a Mac expert) and finally found the Control Panel. I turned on the tick next to "Use virtual memory*", double-clicked the Photoshop icon and there it was – loading slowly, but loading nonetheless. I suggested that they take the machine to someone (other than the three "experts", of course) and have the maximum amount of RAM fitted.

I was asked how I knew about this and replied "Because I worked on IBM mainframes in the 1970s".

(* Virtual memory is the ability to use the hard disk as an extension of RAM. This means that you can effectively load and run things that appear to require more RAM than you actually have. In the early versions of the Mac operating system it was available but turned off by default; Windows from at least 3.1 onwards has had it turned on by default, as does the current Unix-based Mac OS X. In Windows the extension lives in a hidden system file named "pagefile.sys", which is on your C: drive unless some "expert" has moved it.)

Anecdote 2:
I was Boss of Computers at the company that installed the second ever Unix computer in a commercial site in Australia (previous Unix sites had all been in universities or research organisations like CSIRO). I later did consulting work for the first installer. (Being second is sort of a habit – I was at the place that installed the second Australian wide area network to be used for interstate backup (and no, we didn't call it a cloud). We were second because Wang tossed a coin to choose who would be first, then implemented both on the same day.)

Because nobody in my small IT staff had ever used Unix I organised a training course at the University of Technology Sydney. When we turned up at UTS for the week's classes we were simply added to an already-scheduled undergraduate class.

After a couple of days of mind-numbing banality it was announced that we would be spending a day learning "C" programming. I asked why this was necessary in a course about an operating system and was given the non sequitur reply that "Unix is written in C".  In his introduction the lecturer advised us that C was the first programming language to implement data structures.** I put my hand up and told him that not only did both FORTRAN and COBOL (from the 1950s and 60s) have data structure definitions, but in COBOL it was impossible to write any procedural code until the data had been described.

His reply was priceless: COBOL was an old language and therefore could not have had this capability, nobody used COBOL any more***, and (best of all) he didn't know it and he was a university lecturer with a PhD so it couldn't be true. The undergraduates in the class tried arguing with me because I must be wrong!

(** A programmed data structure is a means of describing the characteristics of data so that commands can operate on it. At a trivial level, some languages will not let you add two things together unless they have previously been defined as numbers. You can also handle a collection of related pieces of information as a whole, such as writing a customer record to disk. In fact, all programming languages use data structures, but they don't all use that exact phrase.)

(*** This was in 1987. In 1999 when everyone was panicking about Y2K, COBOL was still the most widely-used programming language in commercial and business sites around the world.)
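To make the double-asterisked point concrete, this is the sort of record description COBOL programmers were writing decades before that lecture. It is only a sketch, and the field names are mine, invented for illustration:

01 CUSTOMER-RECORD.
   02 CUST-NUMBER      PIC 9(6).
   02 CUST-NAME        PIC X(30).
   02 CUST-BALANCE     PIC S9(7)V99 COMP-3.
   02 CUST-LAST-ORDER.
      03 LO-YEAR       PIC 9(4).
      03 LO-MONTH      PIC 99.
      03 LO-DAY        PIC 99.

Every field has its type and size declared up front, related fields are nested into named groups, and the whole record can be moved, compared or written to disk as a single unit – which is exactly what a data structure is.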

Anecdote 3:
Speaking of COBOL and Y2K, one of the things said by "experts" was that years had been stored as only two digits to save storage space. The following two data definitions in COBOL differ in only one way – the way the data is stored.

02 YEAR PIC 99.

02 YEAR PIC 9(4) COMP.

Both store the year in two bytes. The first stores two decimal digits as two characters; the second, because COMP means binary, stores the year as a 16-bit binary halfword (on IBM mainframe computers, where the vast majority of this old software was installed), and a halfword comfortably holds any four-digit number. This had been in the programming language specifications since at least 1963, but I had worked in places where only the first was allowed because otherwise "programmers might be confused".
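For anyone who wants to see those two definitions in context, here is a minimal sketch of a complete program built around them. The data names are mine, the comments are mine, and the two-byte sizes are as described above for IBM mainframes; a modern open-source compiler such as GnuCOBOL should accept it in fixed format.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. YEAR-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-YEARS.
      *    Two decimal digits held as two characters.
           02  WS-YEAR-DISPLAY  PIC 99.
      *    Four digits held in a two-byte binary halfword.
           02  WS-YEAR-COMP     PIC 9(4) COMP.
       PROCEDURE DIVISION.
           MOVE 84   TO WS-YEAR-DISPLAY
           MOVE 1984 TO WS-YEAR-COMP
           DISPLAY "Two-digit year:  " WS-YEAR-DISPLAY
           DISPLAY "Four-digit year: " WS-YEAR-COMP
           STOP RUN.

The only difference between the two fields is the USAGE implied by COMP, so choosing the four-digit form cost nothing in storage – which was the point the "experts" missed.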





