by Jon Foster
Happy New Year 2018!
I wish you all a very blessed new year. I thought it would be fitting to kick the year off by defining software quality, especially since I'll probably be referring to it a lot in future posts. So here goes...
It appears that most people judge the quality of a piece of software by its feature set. But that is a totally subjective analysis. Normally the result is based on how well the software does what the user is trying to do, or how they would like it to work. Still, in some cases a user may get set in their ways, and even if another program provides a better way of doing things it may rub them the wrong way and be labeled "crap". Although the end goal of software is to solve some problem, and the prospective user certainly needs to evaluate a program's ability to meet their needs, that's not quality; that is simply "fitness" for a specific task.
I define "software quality" in 3 metrics (listed roughly in order of importance):
- Bug freeness
- Performance
- Resource consumption
If a program has an epic fail in one aspect, that raises the importance of that metric. But by far the most important is "bug freeness". I define this as the program's ability to perform according to its documented behavior without unexpected side effects, arbitrary limitations, crashes or incorrect results.
As an example: in the mid '90s I was asked to join a software development effort that used FoxPro. Well, M$ had bought FoxPro. I joined the effort just as the first M$ rewrite was released and renamed Visual FoxPro (VFP). There really was much to like about the concept. But the quality was poor.
One bug involved a command called "setfocus", the purpose of which, according to the docs, is to direct keyboard input to a specific location on a form. As an example: you have a form to enter an address book entry. The "last name" field is the first field on the form. It's only natural to want a click on the "new" button to put your cursor there so you can start the new entry with the name. This is stupid simple with most dev tools. Well, it turns out that didn't work in VFP. Whenever the "setfocus" command was used from a button click it was ignored. This bug made the command virtually useless. The docs never mentioned this limitation, and M$ claimed it wasn't a "bug" because it was "by design". Really?!?! That's like having your plumbing back up daily but not seeking a fix because it was designed that way. NOT! I'd fix the design. The bug is in the design!
That's a lot like being told that 0.5 + 0.2 = 0.6999999999. Really?!?! Not when I went to school! This floating point math flaw is designed into virtually everything. It's a bug. Any school student old enough to have dealt with decimal fractions will tell you the answer is 0.7, plain and simple. The common excuse is: "floating point results aren't exact, so adding more error is OK." Umm... many floating point answers are exact. But the whole concept of saying there will be some errors so it's OK to add more really chaps my hide. The goal should be to strive for perfection, not mediocrity.
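To make the flaw concrete, here's a short Python sketch (Python is just for illustration; the same IEEE 754 binary floating point sits underneath most languages). It shows the classic representation error, shows that some floating point results really are exact, and shows decimal arithmetic giving the answer a school student expects:

```python
from decimal import Decimal

# 0.1 and 0.2 have no exact base-2 representation, so their
# sum picks up representation error in binary floating point.
print(0.1 + 0.2)            # 0.30000000000000004, not 0.3
print(0.1 + 0.2 == 0.3)     # False

# Some results ARE exact: 0.5 and 0.25 are powers of two,
# so they (and their sum) are represented perfectly.
print(0.5 + 0.25 == 0.75)   # True

# Decimal arithmetic works in base 10 and gives 0.7, plain and simple.
print(Decimal("0.5") + Decimal("0.2"))  # 0.7
```

The errors aren't random; they come from forcing base-10 fractions into base-2. Decimal (base-10) arithmetic avoids that whole class of surprise, at some cost in speed.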
I think the last two metrics are somewhat obvious, but there will probably be many follow-ups that expand on why they are important. The reality is that they speak to a burden placed on the end user: whether or not they have to wait for results, how often their hardware has to be replaced... and so on. And these burdens multiply alarmingly fast when applied to software serving many users.
For most common business software needs, a 16MHz machine with a handful of megs of RAM could get the job done. It's just that today's software has gotten so bloated that you have to buy so much more hardware, a lot of it consumed by the OS itself. One commonly used exception is the web browser. The demands there are insane, and the standards keep being rewritten to further the bloat.
The bottom line is that if you take my office suite and my code editing, diagramming, sales and general business software, my demands haven't changed since my 286-12 (w/ 4MB of RAM), which handled it all quite well. Yet doing the same things with today's software requires thousands of times more resources (Hz, RAM, storage, ...).
But to be completely fair, things like MP3s, many-megapixel images, video... they do require more resources; they are just big by nature. "More" does require more. And so I'm grateful for multi-GHz and multi-GB machines.
Oh yeah and 3D modeling is fun! Needs oodles of resources!
There are things that need the resources, but a large portion of things that don't need them now demand them. And they shouldn't. This is probably worthy of a whole discussion of its own, but as a picture of bloat, I once tried an experiment. I wanted a simple pop-up one-month calendar display for my desktop. No data storage. Just show me today on a one-month calendar and let me scroll back and forth. I grabbed Lazarus and whipped it out in a few minutes. The resultant binary was ~2MB. I thought: Ug! All the real code is in the external GTK library. What's eating all the space? So I grabbed the GTK reference and rewrote it using just the underlying FPC compiler. The result was ~80K. I loved it. But I wondered: is there anything I can do to save more? With a few extra minutes and one simple tweak I was down to 40K!