As I sat this afternoon waiting for a compile to finish - a really long compile - I was struck by something: the complexity, the sheer number of libraries, patterns, algorithms and tools I use to write a program for the PC is far more in-depth and wide-ranging than I ever imagined whilst training as a programmer.
During my A-Level years, when I first cut my teeth on PC programming - creating programs in Borland's Turbo Pascal 7.0 and later retro-porting them back to Personal Pascal on my Atari ST (because I didn't own a PC and wanted to carry on programming at home) - things were very simple.
I had some memory to work with, usually around 500,000 characters tops, split into - if I remember the excellent Borland compiler's progress output correctly - around 34,000 lines at most...
This was a big program for me back then. It was a stock control program, emulating the features of just such a program I'd seen being used on a Co-op POS machine, which I wanted to ape in an effort to make some money. The program never shipped and never sold; once finished I printed it out and it became part of my project for my A-Level the following year. But it was a whole program, debugged pretty well, with a DOS-based 80x40 character display (so I pretty much wrote my own ncurses before there was an ncurses), and I could supply it supporting a serial-port-driven barcode scanner, or a parallel port scanner and a parallel port printer - though no machine I could find ever had two parallel ports for testing that combination.
I presented this program to two potential customers, small local mini-markets, but they just shrugged at the cost of a whole PC just to act as a till; "We have perfectly working tills and I can add up on a calculator" was the general feeling. So my software never shipped, despite being just £100, at a time when a commercial-scale stock in/out/POS system cost tens of thousands of pounds.
And this whole system was a mere 34 KLOC.
When I went to Uni, one of the first-year modules was the history of computing. They should really have called that module "The History of American Computing", because it took no stock of the input of the Poles pre-1939, nor of Britain through the Second World War, nor of the post-war machines from Manchester and Lyons. It just lectured on about PDP machines, DEC, IBM. We were a generation of programmers, in the embers of the 1990's, still being brought up to think about BIG IRON.
To think about programming for a machine which is going to support hundreds of users, time-sharing. Indeed, for a time I worked with IBM AS/400 machines, and then I saw a sea change: other large-scale servers were simply well-specced PCs (the first Pentium III I ever saw was a Compaq machine put into a 4U server rack and used to run a custom cut of SCO Unix), and these well-specced machines became the lead platform for enterprise serving of data.
At the turn of the millennium I worked on what was then thought to be a hot topic: web servers hosting application platforms. Things like IBM WebSphere, Apache Tomcat and Jakarta. Creating applications melted from very specific, tuned code - efficient, small, effective - into Java...
IDEs went from being text editors and technical manuals to being code-completion-driven, supportive environments; programming became easier. The Script Kiddy was born around this time, and as a qualified Software Engineer I saw a wave of kids out of college who had not learned to program in Pascal, or even Java - they'd learned to throw together VBA inside Access or Excel - being taken on as Java programmers, and not one of them had any idea about proper memory management.
And I have to admit, at the time I fell into the web of lies myself; I relied on the JVM. I moved back and forth between different web-based programming jobs, working for gas companies, not delivering very much for another company doing metering devices before being laid off, and then doing consultancy work for people wanting web sites.
Lucky for me, the .com boom went bust and I had to get a proper job, and that job was C++. But I was out of practice, and some of the software I created at that time was pretty terrible. We're talking about the time in C++ when there were no libraries to help - no STL, no Boost... You were on your own; you want a linked list, well, you'd better fucking write one, son.
But C++, like me, like the PC hardware I write for and use, has matured. I'm sitting now with oodles and oodles of horsepower in my machines; this laptop alone has 8 cores and 8GB of RAM before we even look at the massive nVidia graphics card within its bowels.
And so I find myself reminded that "with great power comes great responsibility", for I'm trying my damnedest to create really good code: thread-safe, memory-leak-free, timely, working code. My chosen medium for this is C++, and I'm proud to have delivered early (yes, a whole 3 days early, but early nonetheless) my latest, and first official, C++ project in about 5 years.
It's a fully C++11-compliant project, using elements of the STL and Boost, and I'm quite proud to say it's a nice piece of software, one I'd be happy to hang my hat on.
However, C++ is not the main focus of my employer. This is good and bad: good because, being the only person working in C++ and the only person working on this project, I was able to inject my own ideals into the mix - I applied my own work ethic, my own design and my own coding style. I've been big on a C++ coding style for a while, and elements of it have worked so well they've moved into the other languages I use.
This whole C++ project, by the way, totals about 224,000 lines of code written by me. It uses around 5 major parts of the Boost libraries, not to mention linking against many Linux libraries, and it uses a plethora of parts of the STL. It compiles on my slowest machine in around 30 seconds; it compiles on the target hardware my employer owns in about 6 minutes - yes, I have nice kit, but they've not bought decent kit in about ten years.
And whilst sat in one of these 6-minute stupors, I got looking at the code files. One of the class support files - a large header-only implementation of around 100 functions, which I port around rather than building libraries because it compiles easily on Linux, Windows, OSX and Android - was over 34.2 kilobytes on its own. My first computer ever, a Commodore 16, only had 16 kilobytes of memory. The first machine I ever programmed could not have held this one code file in memory all at once.
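For the curious, the header-only trick is nothing exotic - everything lives in one .hpp, marked inline so the file can be dropped into any build without a separate library to link. A minimal sketch (hypothetical file name and function, not the real contents of my file) looks something like this:

// utils.hpp - a sketch of the header-only pattern (hypothetical contents).
// Only standard headers, everything inline: the same file compiles
// unchanged on Linux, Windows, OSX and Android.
#ifndef UTILS_HPP
#define UTILS_HPP

#include <string>

// Trim leading and trailing whitespace from a copy of the input.
inline std::string trim(const std::string& s)
{
    const std::string ws = " \t\r\n";
    const auto first = s.find_first_not_of(ws);
    if (first == std::string::npos) return "";
    const auto last = s.find_last_not_of(ws);
    return s.substr(first, last - first + 1);
}

#endif // UTILS_HPP

The inline keyword is what keeps the linker happy when the header is pulled into multiple translation units of the same program.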
This was insane.
This thought immediately grounded me, back to thoughts of writing for BIG IRON. I found myself looking at the code and thinking: has any bloat got in there? Have I fooled myself into thinking the code base is decent? Am I missing the elephant in the room, some fault I simply can't see?
And I realised this was the teaching I had had all those years ago. I'd programmed the Commodore, then moved to the ST and then to the PC, and I learned to program by always taking the work I was doing now back a layer: Commodore BASIC on the C16, then BASIC on the ST, then Pascal on the PC, and then Pascal back on the ST... it's a chain.
So now, with a leap, I have to take my latest C++ and look back to that first time I was learning C: to evaluate the code not for the machines of today, but to make it run that little bit better overall, as if it were running on those limited machines I first worked on.
My first PC was an 80486SX2-50MHz with 4MB of RAM. The SX chip had no maths co-processor on board, so floating-point work was done as a software shim built from additions and subtractions, and even integer division was comparatively slow. I didn't know this at the time; I just organically looked for the best solution to a problem. Like, for example, checking for a remainder to see if something is odd or even...
int i = 245;
int j = i % 2;           // remainder after dividing by two
bool isEven = (j == 0);  // an even number leaves no remainder
On a processor with fast hardware division and a maths co-processor this is perfectly good code; however, on the SX it turned out - back then, with the shitty compiler I had - that stripping away every binary digit above the last one and examining what was left was actually faster...
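In today's C++ that trick would look something like this - a sketch only, since any modern compiler will perform this rewrite of i % 2 for you:

// The old trick: keep only the lowest binary digit with a bitwise AND,
// avoiding the division that the modulo would otherwise cost.
int i = 245;
bool isEven = ((i & 1) == 0);  // last bit 0 means even, 1 means odd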
I can articulate this now; I understand why I did it, and I'm not intimidated by not having known there was a better way. But back on a day in 1996 I was very afraid of not knowing why, or how.
Nowadays you'd just pick up the internet connection and look up such a piece of code, and see tens, perhaps dozens, of suggestions for how to do the task - but not one would teach you that sometimes you have to write for the bare metal. Sometimes you have to optimise for the task at hand, for a specific job.
Languages like Java, C#, Python and others running in virtual machines or interpreters don't teach you to bend your machine to its best ability; they teach you - or nail you over the back of the head with it, time and again - to write code to best fit their runtime, their rules and their reasons.
So, my employer?... Their main effort is C#, their main product is C#. It's slow, it's clunky, and it's really pushed the hardware to its limit...
If the same system were written in C++, the hardware being asked to run it might live another 5, maybe even 7 or 8, years. I told them this a year ago. I've now delivered a highly specialised C++ solution, custom-tailored to one set of hardware - and unlike my C# projects, which spend a protracted amount of time being debugged to suit the C# virtual machine rather than the actual machine we're running upon (a constant machine, its spec never changing), and which seem to take up the maximum amount of time and even then want more, this one arrived early.
Why am I writing all this down?...
To be honest, I don't know. I just had all this junk on my mind; it has no point, no purpose. If you read it, sorry... but sometimes I just need to rant.