Formal qualification: B.Sc. (Hons) Electrical & Electronic Engineering.
Started my working life in 1971 as an apprentice in London Transport's Signal Engineering department. Won the Robert Dell award for Apprentice of the Year in 1973. Worked on various projects, including the commissioning of the extension of the Piccadilly Line from Hounslow West to Hatton Cross. Testing the signalling systems for even such a short length of track was painstaking in the extreme: as an example, every wire on every single piece of signalling equipment had to be verified against the drawings, otherwise everything stopped while the discrepancy was analysed. Safety is of paramount importance to these guys.
The Signal Department majored in Safety and implemented it in a hierarchical manner: mechanical interlocking (a similar concept to that found in the humble house key), electrical interlocking (using electro-mechanical relays designed to be impervious to mains-borne interference - LT produced electricity at 33.33Hz and 125Hz, frequencies harmonically unrelated to the indigenous 50Hz mains supply), and compressed-air valves to drive the safety equipment. Electronics were treated with great distrust until Robert Dell pioneered the use of resonance and filtering techniques to act, in effect, as a safety switch - the Victoria Line was the first to incorporate this technology. Non-safety equipment used transistor logic gates encapsulated in resin to drive the safety circuitry. I started off designing electronic systems using transistors, then TTL 74xx logic chips, followed by CMOS, op amps and 555 timers (programming was done with a soldering iron and took a lot longer!). The first microprocessor I programmed was an Intel 8008, housed in a blue Intel box (see link for pic of the front panel)
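The "harmonically unrelated" point can be checked with a little arithmetic (my own illustration here, not LT's actual analysis): neither signalling frequency falls on an integer harmonic of 50Hz, so a mains harmonic can never masquerade as a valid signalling tone.

```python
from fractions import Fraction

MAINS = Fraction(50)  # UK mains frequency, Hz

# The two LT signalling frequencies: 33 1/3 Hz is exactly 100/3
SIGNALLING = {"33 1/3 Hz": Fraction(100, 3), "125 Hz": Fraction(125)}

def is_integer_harmonic(f, base):
    """True if f is an exact integer multiple of base."""
    return (f / base).denominator == 1

for name, f in SIGNALLING.items():
    # Both print False: 33 1/3 / 50 = 2/3 and 125 / 50 = 5/2,
    # so neither coincides with 50, 100, 150, ... Hz
    print(name, "is a harmonic of 50 Hz?", is_integer_harmonic(f, MAINS))
```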
One had to program it by entering binary instruction codes into the machine using the rocker switches. There was an inbuilt utility which, once keyed in, would read in a paper tape of instructions. That tape could in turn be run, lifting the machine to a new level of sophistication, such as a text editor or a slightly higher-level programming language to work with (assembler). This is all done automatically in modern machines, but it gives an insight into why PCs take so long to boot up.
Have worked with many systems since then: Motorola's EXORciser (for the 6800), and the Intel MDS together with in-circuit emulation tools (hardware-assisted debugging).
The first lecturer who taught me programming asked us to write a program that solved quadratic equations. So we all did that (in Algol on an ICL mainframe), and he then put "silly" numbers for a, b and c into our Hollerith card decks, which crashed everyone's programs. But this brought home the importance of testing Boundary Conditions. I think that lecturer could teach a certain well-known company a thing or two about how to write more reliable software.
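The lesson generalises: a textbook solver divides by 2a and takes a square root, so it falls over on a = 0 or a negative discriminant. Here's a defensive version of the same exercise - my own sketch in Python, not the original Algol:

```python
import cmath
import math

def solve_quadratic(a, b, c):
    """Solve a*x^2 + b*x + c = 0, surviving the 'silly' inputs too."""
    if a == 0:
        if b == 0:
            # 0 = c: no solutions unless c is also 0,
            # in which case every x is a solution (signalled by None)
            return None if c == 0 else []
        return [-c / b]  # degenerate case: just a linear equation
    disc = b * b - 4 * a * c
    if disc >= 0:
        r = math.sqrt(disc)          # two real roots (possibly equal)
    else:
        r = cmath.sqrt(disc)         # complex roots instead of a crash
    return [(-b + r) / (2 * a), (-b - r) / (2 * a)]

print(solve_quadratic(1, -3, 2))     # [2.0, 1.0]
print(solve_quadratic(0, 2, -4))     # [2.0] - would crash a naive solver
```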
First industrial experience of computing was in 1973, on a GEC4080 using Coral: real-time programming for signal control systems on the Northern and Victoria lines of the London Underground. To edit a file you had to load the editor into a 'shell' (a Background shell to be precise - Background in the sense that the real-time stuff had priority over anything you were doing). Then you had to empty a destination file big enough to accept everything from the source file, and connect the edit streams up: stream 3 to the source file, stream 4 to the destination file. You could then run the editor, which had just a few commands (more primitive than edlin): you could go to a line, string exchange, insert after, or terminate the edit.

Because one was pouring the file from stream 3 to stream 4, modifying it on the way through, if you went past the place you wanted to edit you had to terminate the edit, empty the source file, load up COPY into another shell, connect stream 3 to what was your destination file and stream 4 to the file you wanted to copy to (remembering to empty this first), then run the Copy process. Having done this, you could go back to the Edit process, empty the destination file... After a while it became ingrained in your mind; the fact that I can still remember much of the detail 20 years on is testament to that. With the upgraded system we had VDUs and could move around the file at leisure, but, BUT, in the case of a big file you had to load a dollop in, edit that, write it away, pick up another dollop, edit that - again there was no way to go back, other than starting again, but at least you could work at the dollop level rather than a line at a time.
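That workflow - a one-way pour from the source stream to the destination stream, applying edits on the way through with no going back - can be sketched as a forward-only editor. This is my own toy reconstruction in Python, not the real GEC utility:

```python
def stream_edit(source_lines, commands):
    """Pour source_lines through to a destination, applying edit
    commands keyed by line number. Strictly forward-only: once a
    line has passed, you cannot revisit it (just like streams 3
    and 4 on the GEC4080)."""
    dest = []
    for lineno, line in enumerate(source_lines, start=1):
        cmd = commands.get(lineno)
        if cmd is None:
            dest.append(line)                # pass through unchanged
        elif cmd[0] == "exchange":           # string exchange on this line
            _, old, new = cmd
            dest.append(line.replace(old, new))
        elif cmd[0] == "insert_after":       # add a new line after this one
            dest.append(line)
            dest.append(cmd[1])
    return dest

edited = stream_edit(["alpha", "beta"], {1: ("exchange", "alpha", "gamma")})
print(edited)  # ['gamma', 'beta']
```

Miss your target line and, just as described above, the only recourse is to pour the whole file through again from the top.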
Moved on to train-borne instrumentation (data logging and the like). Messed around with BAe's automatic test equipment system (DUTE) and various Hewlett-Packard computers, plotters, etc. Was then involved with the prototype automatic fare collection system at Vauxhall - which used conventional Intel 808X processors, together with some of their single-chip devices which had EPROM on board (erasable with UV light), programmed in assembler - before moving back to the Signal Department in time to help upgrade their system to the later GEC technology.
In those days IBM DisplayWrite was the word processor of choice, documents were printed on daisywheels, and 8" floppy drives were the norm. What I want to know is whether anyone has recently invoked Inmac's guarantee on their media, which was unconditionally guaranteed for life.
The first PC compatible I bought was a dual-floppy Amstrad 1512. This got upgraded with a 20MB hard drive (I can't remember how much RAM the machine had, but it was measured in KB).
Then went to work for a friend who produced technology for visually impaired users (among other things: the company was also involved with the Anglo-Russian space project). Wrote software that produced Braille bank statements from mag tape for a well-known bank, and Braille payslips for the employees of another well-known bank. The company diversified into mainstream tech support and I became the de facto network support manager for an Italian textile company in the West End (about 30 users). The technology in those days was Thick Ethernet. You had to drill a hole in the network cable and apply a 'bee-sting' unit at the regulated intervals marked along the cable (you couldn't place two 'bee-stings' closer than 2.5m apart, otherwise it wouldn't work). If you had a bunch of PCs together (as they did) you had to have a massive coil of Ethernet cable hidden under the false floor. The cable was thicker than IEEE1284 parallel cables and a lot less flexible. The network operating system was Ungermann-Bass (a derivative of Microsoft's LAN Manager). Software apps were written in Clipper. Everyone's favourite word processor in those days was WordStar, Lotus Symphony was the spreadsheet and Freelance was for graphics.
WordPerfect: Have been a user since version 5.0 (who remembers the UK distributor Sentinel Software in Addlestone? Those were the days). Qualified with them as a "Certified Resource". Ran many intermediate/advanced training courses for corporate clients for all succeeding DOS and Windows versions, up to the point where Novell handed over to Corel.
Since starting up my own business, we as a company have written software for (amongst others) hospitals, industrial psychologists, a well-known international auction house and management consultants: a dividend management system for a stockbroker, stock control systems for a fruit & veg importer and for one of London's top jewellers, and a lettings management system for a London-based vacation rentals company.
Started using Joomla for website designs, here's one of my first attempts:-
[UPDATE: I've become one of the EE Moderator team and I feel this must take priority over regular question-solving. I'm still very active on this site].