Dec. 23, 2004, 6:13 p.m.

Philosophical Discussions About Everything

Warning and Disclaimer: If you are a sensitive reader or easily offended, do NOT read any further.

I have several strong opinions (note: these are all my own personal, humble opinions and should be interpreted that way) on several aspects of society. I want to start with developers. By developers I mean anybody who designs and writes computer programs. When the whole concept of programming started way back in the 60s and 70s (excluding the VERY early days, as I know nothing about those :)), developers were very, very different people from the modern developers of today.

As an illustration: in those days developers did not have the luxury of a PC (Personal Computer). They shared a mainframe on which they developed their respective applications, and you had to book your time on it weeks ahead in order to debug and test your code. This had several important consequences. Because time was a luxury and not a given, it was absolutely imperative for developers to maximise their productivity for each session on the mainframe, and it was obvious that you could not do that by spending the session debugging faulty code. Time was much better spent making sure the program was correct before it ever ran. They managed this by debugging their code using pen and paper. Yes, you heard right. Pen and paper. They used variable inspection tables and lots of different diagrams to trace the dynamics of the application before ever running it on the mainframe. It was also quite expensive to punch the wrong program onto the punch cards in use at the time, so to optimise your efficiency the program had to be as close to perfect as possible the first time.
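For readers who have never desk-checked code, here is roughly what such a variable inspection table amounts to. The routine below is my own made-up example (in Java, purely for convenience - obviously not what anyone ran on a mainframe): you "execute" the code by hand, one step at a time, and write down every variable's value as you go.

    // Desk-checking sketch: a made-up routine, traced by hand instead of in a debugger.
    public class DeskCheck {

        // Sums the squares of 1..n.
        static int sumOfSquares(int n) {
            int total = 0;
            for (int i = 1; i <= n; i++) {
                total += i * i;
            }
            return total;
        }

        /*
         * Variable inspection table for sumOfSquares(3), filled in with pen and paper:
         *
         *   step | i | i*i | total
         *   -----+---+-----+------
         *   init | - |  -  |   0
         *     1  | 1 |  1  |   1
         *     2  | 2 |  4  |   5
         *     3  | 3 |  9  |  14
         *   exit | 4 |  -  |  14   <- expected return value
         *
         * If the table disagrees with what you expected, you have found the bug
         * before spending a single minute of (expensive) machine time.
         */
        public static void main(String[] args) {
            System.out.println(sumOfSquares(3)); // prints 14, matching the trace
        }
    }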

I know times have changed and that we have nice debuggers and so on to assist us, and I am not trying to paint the new technologies as bad. But one thing got lost in the migration from those days to the present: the paradigm that you should pay intense attention to getting it right the first time. HTML development might be the best example of how this paradigm got lost. HTML is not a compiled language; there is no compiler or similar process that forces you to produce a valid HTML page (validating services do exist, but people rarely use them). Instead, most developers use trial and error on whatever browser they happen to run to test whether their pages work. This is BAD. The reason? Browser vendors are notorious for adding proprietary extensions and for implementing the standards incompletely. Testing your HTML pages on the web browsers your clients are going to use is definitely important, for exactly that reason, BUT you should not VALIDATE them this way. Just because a page works in a web browser does not imply it is correct.

Modern developers tend to depend far too much on debuggers and trial-and-error programming. I know several developers who feel strongly against the use of variable inspection tables and self-review of code, mostly because they do not see why they should know how to debug code by hand. Automation helps with debugging, but it causes people to write sloppy, buggy, bloated and unreliable code. People no longer care about writing good code. They just want to get the job finished.

It seems as if software development is no longer the art it used to be. Today it is all just about revenue. About money, money and money. The more applications a company can get out of the door, and the quicker a software project can be completed, the higher the revenue. It no longer seems to matter that developing software means adhering to the rules of performing the art. Imagine if singers no longer sang, but rather screamed and made ugly noises. In the end they would still produce sound waves, but surely people would stop listening, because it would no longer be enjoyable. The same is happening with software development.

The scary thing is that people are actually so used to this that they think it is the norm: that software is intrinsically a grey science, prone to bugs, security flaws, poor performance and so on. The first thing IT people do when a new major version of an application is released is to find out when the first service pack (also called a fix pack) will be available. Windows XP shipped on 25 August 2001 (if I remember correctly), and on the same day a combo update was shipped fixing all the issues encountered between the point where development on Windows XP stopped and the product actually shipping. Why could it not ship with all those fixes already implemented? Because market pressure made it more important to release Windows XP than to release a working Windows XP.

I am not trying to say it is easy (or even possible) to write bug-free systems, but I do believe it is very possible to design and write highly stable, robust, well-performing applications with a clean architecture and minimal bugs. We just need to educate users to expect this kind of quality (and not concede to anything less), and turn around the whole paradigm that it is okay to release incomplete software. This is obviously not practical, so I guess things will never change.

I want to mention some thoughts I have about the quality of today's developers. I believe (and I guess some people would want to sue me for this) that more than 95% of all software developers out there are actually completely incompetent. Many years ago IT and software development were not buzzwords. People in this industry were not highly paid; they were seen in the same light as scientists - which is indeed what they were. It was an art as well as a science. Almost all of these people were in the field because they had a passion for what they were doing. If you have a passion for something, and some brains, it is only natural that you will become extremely good at it, because you will always want to learn new stuff and master the old stuff. You will always want to improve upon your own skills and abilities. You will want to become the best because you have passion for what you are doing. It is not about getting rich.

Today, IT is a buzzword. There are billions and billions to be made in this industry - just look at IBM and Microsoft. This is because everybody has access to a PC. It is not seen as a science anymore. It is an everyday field in which everyday school children, university students and yuppies are working. Anyone spending some time in front of a PC is seen as a computer whiz-kid. Whether they just play games, use it for word processing or actually program on it, they are computer boffins. You do a quick 6-week or 6-month course and you know everything. You spend one or two years doing the same thing over and over again (without learning anything new), and then you have one or two years of "specialist" skills. It is appalling!

Where is the time when programming was considered to be an art and a science? Using RAD (Rapid Application Development) environments anyone can slam together an application. I have nothing against any particular language, but I believe languages such as Java, Visual Basic and Delphi (to name a few) form a big part of this problem. They advocate RAD methodologies (which in principle is not a bad thing), but in the process help people develop highly inefficient applications.

Let me put it another way. Say you are zapped through a wormhole to another, uninhabited planet. You are alone. You decide the only way you can get out is to build a strong radio. Obviously, the only way you can do this is to truly understand (learn) how a radio works. You need an understanding of electromagnetism, physics, electronics and so on deep enough to build such a radio from scratch. But once it has been built, it would be easy for you to fix anything that goes wrong, because you know the whole system off by heart. You would be able to add new features very quickly and efficiently, because you understand the effect any change will have on the rest of the system. And over time, you would be able to build extremely high quality radios, because you have a deep understanding of the underlying technology. Now imagine the same guy (but this time on earth) needs to build the same radio as a school project. He will go to an electronics shop, buy a kit and slam the components together (according to the schematic diagram provided in the package), and he will have a working radio. Same result. BUT ask him to make it work on another frequency. Ask him why it does not amplify sufficiently. Ask him to change anything, and I promise you - he will be stuck. This is because he does not intrinsically understand what he did. Actually, he does not understand at all what he did to get a working radio.

The same is happening with most developers today. They know a bit of something, and that is all. Yet they are employed commercially and write mission-critical systems? Obviously those systems will fail. But that is acceptable, because software is like that - intrinsically buggy. Scary!

I knew a girl who was busy with her B.Sc Honours in Computer Science. She came to me and asked a question about something I cannot remember now. I do, however, remember my answer. I asked her how much RAM she had in her PC. She told me: 200MiB. This was in 1993. To give you an idea, in those days a top-of-the-range desktop PC had 8MiB of RAM installed. Obviously she was referring to her hard disk space. She had been studying for almost 4 years and she did not know the difference between RAM and HDD. How can a plastic surgeon operate on a person if he knows how to work with his scalpel, but does not understand the difference between someone's nose and ear?

To delve a bit into code itself: I remember running lots and lots of assembly-based applications written for the 1994 4K Assembly competition. The competition required you to write a graphical application with nice multimedia effects and cool graphics, where the resulting COM file could not be larger than 4096 bytes. The developers came up with applications that had rotating toruses with light shading and real-time rendering, animated sine fields, dynamically changing shapes and more, all in 4096 bytes of hand-crafted assembly language. Today even a stupid "Hello World" application is about 50KiB in size. Why? Because hard disk space and memory are very cheap - so why bother to use them sparingly? If petrol cost nothing, would you mind driving a V12? No - definitely not. But petrol is expensive, so you do not drive such a car, as it would cost too much. You look at the fuel economy before you buy. Precisely the opposite is happening with software.

It is preposterous that an Operating System should require you to have 2GiB of HDD available. It is absolutely crazy that a basic, simple Hello World application in Win32 should take a minimum of 8MiB of RAM to run. I remember the days I wrote a screensaver in Assembler that was about 4KiB in size, yet had a fully featured screen-saving effect, keyboard intercepts, video memory saving/restoring, TSR hooking/releasing code and more. All in 4KiB. I remember the days I had to decide whether to make a variable a char or an integer, because I could score 1 byte by making it a char (16-bit days...). Developers do not do that anymore. They simply use the biggest variable they can lay their hands on, even if it is a complete waste of space.
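To make the char-versus-integer point concrete with a rough, modern-day sketch of my own (in Java, and with made-up numbers): the trade-off has not gone away, it has simply stopped being considered. Multiply the "biggest variable you can lay your hands on" by a few million elements and the waste becomes very real.

    // A made-up illustration: choosing the smallest type that fits still matters.
    public class TypeSizes {
        public static void main(String[] args) {
            int elements = 1000000;

            // Suppose every value fits comfortably in the range 0..127.
            int[] lazy = new int[elements];     // roughly 4 MB of payload (4 bytes each)
            byte[] frugal = new byte[elements]; // roughly 1 MB of payload (1 byte each)

            System.out.println("int[]  payload: " + (elements * 4) + " bytes");
            System.out.println("byte[] payload: " + (elements * 1) + " bytes");
        }
    }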

Today we have technologies such as .NET and J2EE. In concept they are great - the idea behind them is sane and valid. However, in trying to design the ultimate architecture for enterprise applications, people once again forget that their theoretically sane designs are bound by real-world limitations. It does not help to have a modular, maintainable architecture if it does not work. I am not implying that J2EE or .NET does not work; I merely suggest that they are architecturally flawed. There are so many layers, wrappers and more layers in those technologies that at the end of the day you need a huge server to run a simple application. Take RMI, as used in distributed J2EE applications, as an example. The idea is nice: create a technology that allows transparent marshalling of state between a cluster of servers, so that you get redundancy and load balancing. The problem is that this is prohibitively complex to use and deploy, and the performance suffers so much that you either have to give up proper object-oriented design or buy insane hardware. By adding complexity to increase performance, they actually caused a far worse scenario than would otherwise have been the case. Whereas Visual Basic, Java and Delphi irritate me mildly, for the reasons I mentioned earlier, J2EE and .NET irritate me intensely.
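To show what that "transparent marshalling" looks like in code, here is a minimal RMI-style sketch. The service name and figures are my own invention, and I am assuming a Java runtime that generates the stub dynamically; the point is only that the single method call below looks local to the caller, while behind it sit a stub, serialisation of every argument and return value, a network round trip and the dispatch machinery on the server - which is exactly where the layers, and the cost, hide.

    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.registry.LocateRegistry;
    import java.rmi.registry.Registry;
    import java.rmi.server.UnicastRemoteObject;

    // Made-up remote interface: to the caller, getQuote() looks like an ordinary call,
    // but every invocation is marshalled, shipped over the wire and unmarshalled.
    interface QuoteService extends Remote {
        String getQuote(String symbol) throws RemoteException;
    }

    class QuoteServiceImpl implements QuoteService {
        public String getQuote(String symbol) {
            return symbol + ": 42.00"; // dummy data for the sketch
        }
    }

    public class QuoteServer {
        public static void main(String[] args) throws Exception {
            QuoteService impl = new QuoteServiceImpl();

            // Exporting the object makes RMI create the stub that does the marshalling.
            QuoteService stub = (QuoteService) UnicastRemoteObject.exportObject(impl, 0);

            // Publish it in an RMI registry so clients can look it up by name.
            Registry registry = LocateRegistry.createRegistry(1099);
            registry.rebind("QuoteService", stub);

            System.out.println("QuoteService bound; waiting for remote calls...");
        }
    }

A client then looks the stub up in the registry and calls getQuote() as if it were a local object, paying the full marshalling and network cost on every single call - and that is before any clustering, transactions or container services are added on top.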

So what does the future hold for us? I sincerely do not know, and I really do not want to know. I am too scared. I am actually ashamed to be called a software developer, because the name has become synonymous with incompetence. Speaking of titles - I have a huge problem with the word "Engineer". Nowadays, if you have played a game or two of Doom or Quake, you are a Software Engineer. If you studied at a college, you are an engineer. If you have done a 6-month course, you are an engineer. Strange, then, that I had to complete a 4-year full-time (8-to-5) degree at a university before I could even think of calling myself an engineer. The same goes for the title Systems Engineer. The scary thing is that most people do not even know what a Systems Engineer does...

I am sorry if I offended anyone reading this, but this is how I sincerely feel. It is so sad, because there are so few people left with whom you can share your passion. It is so difficult to keep your own passion alive as increasing market pressure forces you to abandon your artistic and scientific skills and to rush in with RAD technologies to get the software out of the door ASAP, neglecting quality (not that people will ever admit to that).

Software Development started out for me as a hobby which I passionately pursued. It grew into the job that now puts food on my table. The job that I am increasingly starting to detest. Not because of Software Development itself, but because of what people have made of it.

The reality is simple: To survive you need to abandon art, science and passion.