…the novelty of inventing programs wears off and degenerates into the dull labor of writing and checking programs. This duty now looms as an imposition on the human brain. Also, with the computer paid for, the cost of programming and the time consumed, comes to the notice of vice-presidents and project directors.
1951/52 saw the introduction of the first commercial computers: the Ferranti Mark 1, the LEO I, the IBM 701 and the UNIVAC I, which Rear Admiral “Amazing” Grace Murray Hopper was using at Remington Rand to develop the first operational compiler. The dilemma Hopper faced – the one that started her work on the first compiler – is one modern programmers can easily relate to, particularly the “notice of vice-presidents and project directors” concerning the cost of programming and the time consumed.
The novelty of programming had quickly worn off once the programming work moved from academia and pure mathematics to the commercial world, with its repetitive yet deadline-driven demands. Before the compiler, it took literally weeks to write a program (with a piece of paper and a pencil), put the machine instructions onto punched cards via a typewriter, and transfer them to a magnetic tape to be fed into the computer. The compiler – as is obvious today – allowed for “automatic programming” by choosing from and recombining an extensible library (catalogue) of subroutines, without bothering about the particular instructions the machine was using to execute them.
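The core idea – a program assembled by name from a catalogue of pre-written subroutines, with the translator rather than the programmer handling the machine-level details – can be illustrated with a toy sketch. This is not Hopper's actual system; the catalogue entries and the `run` helper are invented purely for illustration:

```python
# Toy illustration (not Hopper's A-0): a "catalogue" of pre-written
# subroutines, recombined by name to form a program.
catalogue = {
    "READ":  lambda state: state.setdefault("items", [12.5, 7.25, 30.0]),
    "TOTAL": lambda state: state.update(total=sum(state["items"])),
    "PRINT": lambda state: print(f"TOTAL: {state['total']:.2f}"),
}

def run(program, state=None):
    """Execute a program given as a list of subroutine names."""
    state = {} if state is None else state
    for name in program:
        catalogue[name](state)  # look up the subroutine and call it
    return state

# The "program" is nothing but a sequence of catalogue names.
run(["READ", "TOTAL", "PRINT"])
```

The programmer writes only the sequence of names; everything below that level of abstraction is the catalogue's (the compiler's) business.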
Obvious as it may seem today, it was one of the major breakthroughs – if not the major breakthrough – in the productivity of programming, cutting programming time down from days and weeks to hours and allowing us to program not only in mathematical terms but in natural, domain-specific language. This concept was first introduced with Hopper’s own FLOW-MATIC or B-0 (B = Business Language) language and compiler, which later became one of the foundations of COBOL, a language Hopper also helped to create.
This journey toward ever more efficient programming techniques is one we are still on today, with automated build systems, domain-specific languages and ever more optimized compilers to educate our machines. Efficiency is – of course – related not only to the time it takes to program but also to the speed at which we can execute an operation. Nobody explained the importance of processing time and latency better than Grace Hopper, who illustrated the difference between nanoseconds and microseconds with lengths of string, each measuring the distance electricity travels in a vacuum (at the speed of light) in a given amount of time:
“Here’s a microsecond. 984 feet. I sometimes think that we should hang one of these above every programmer’s desk – or around their neck – so they know what they are throwing away, when they throw away microseconds.” Grace Hopper explaining nanoseconds and microseconds (YouTube).
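Hopper's string lengths follow from quick arithmetic on the speed of light, which a few lines of Python can confirm:

```python
# Check Grace Hopper's "microsecond" and "nanosecond" string lengths:
# the distance light travels in a vacuum in each interval.
C = 299_792_458          # speed of light in a vacuum, metres per second
FEET_PER_METRE = 3.28084

microsecond_m = C * 1e-6  # distance covered in one microsecond
nanosecond_m = C * 1e-9   # distance covered in one nanosecond

print(f"1 microsecond ≈ {microsecond_m * FEET_PER_METRE:.0f} feet")
print(f"1 nanosecond  ≈ {nanosecond_m * 100:.1f} cm "
      f"(≈ {nanosecond_m * FEET_PER_METRE * 12:.1f} inches)")
```

Light covers roughly 984 feet in a microsecond – Hopper's famous length of string – and only about 11.8 inches in a nanosecond, her much shorter and much more memorable prop.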
So we mark the second diamond jubilee this year, another “crowning” achievement in honour of another one of the pioneers of our craft, and journey on towards better and more efficient programming – a journey on which we would also like to accompany you with your machine-related educational problems.
Contact us if you would like us to have a look at your challenges.