Tuesday, May 22, 2012

Quality

It seems universally accepted that good quality is something to aim for and bad quality something to avoid. It also seems like a simple idea to achieve good quality by doing things right. The problems start right after that: it is hard to define what doing it right actually means, and even harder to do it according to the definition.

Here at the issue department we're mostly affected by quality related to software. The literature on software development is full of different process models and methodologies which ultimately aim at pushing the quality of the end product higher. There have been numerous seminars on the topic and masses of warm air have been produced by the advocates of different standards and certifications - but what has been achieved?

The statistics say: not that much.

And that's kind of sad.

Please remember, I'm talking on a generic level: there are certainly good examples of good quality software, but there are also examples of the opposite kind. On the generic level, my guess is that the main reason for the situation is that nobody is willing to pay the price tag of really good quality, so the development (and testing & QA) teams end up with something merely adequate.

This also applies to entire ICT systems: redundant and high-performing iron costs big bucks, not to mention the energy bill for keeping that hardware switched on, even though good quality might already imply low energy consumption. From both the financial and the ecological viewpoint, everything that can be cut from idling ICT reserve is a good idea. Cutting testing resources might not be an equally good idea; at the very least it is much harder to define what is the absolutely required amount and what is extra reserve (in fact it could be argued that everything needed to establish 100% code coverage is absolutely required). And there's always the fact that it is much cheaper to do it right in the first place than to try to fix it later (perhaps even after some costly damage has already been caused by the faults).
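Just to make that last point a bit more concrete, here's a rough back-of-the-envelope sketch in Python. The cost multipliers and defect counts are purely made-up assumptions for illustration, not measured data from anywhere; the only point is that the later a fault is found, the more it tends to cost in total.

# Back-of-the-envelope: cost of fixing the same defects depending on
# the stage where they are found. The multipliers are made-up,
# illustrative assumptions only.
COST_MULTIPLIER = {
    "development": 1,    # caught while writing the code
    "testing": 10,       # caught by the testing/QA team
    "production": 100,   # caught after release, maybe after costly damage
}

def total_fix_cost(defects_found, base_cost_per_fix=100.0):
    # Sum the cost of fixing each defect, given the stage where it was found.
    return sum(base_cost_per_fix * COST_MULTIPLIER[stage]
               for stage in defects_found)

# The same ten defects, found mostly early vs. mostly late:
found_early = ["development"] * 8 + ["testing"] * 2
found_late = ["development"] * 2 + ["testing"] * 3 + ["production"] * 5

print(total_fix_cost(found_early))  # 2800.0
print(total_fix_cost(found_late))   # 53200.0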

Until now I've totally skipped one area of quality: security. Certainly adequate security is an essential part of good quality, as what's the use of a 100% functionally correct system that can easily be made to do things that are essentially against its purpose, right? Reading the discussions in software-related groups on LinkedIn, for example, has taught me that there is a kind of silent battle going on between enthusiastic penetration testers (ethical hackers?) and the more functionally oriented "traditional" testing pros. Or maybe the battle isn't between those two but with the management, since with security testing it is even more difficult to measure the completeness of testing, and one can always argue that the security attained by partial testing is already adequate due to the nature of the system, as with information security we're always dealing with probabilities.

If there is a lurking theoretical possibility of gaining illegal access to the system which is very unlikely to ever be realised, is there any point in putting effort into finding it (not to mention fixing it)? And if the information contained in the system is not valuable or confidential, why bother securing it? On the other hand, what's the use of any computer system which contains or processes only worthless information? Certainly it costs something to develop or acquire such a system, and also to maintain it, so if the information has no value, where's the benefit? Anyway, the recent highly public success stories of crackers show that information security is very weak in some systems which do contain both valuable and confidential information, which leads us back to what I wrote at the beginning of this rant about the state of software quality on the generic level.
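To make that "is it even worth looking for" question a bit more tangible, here's a minimal Python sketch of the kind of expected-loss reasoning I mean. All the numbers (likelihoods, damage estimates, fixing costs) and the vulnerability names are hypothetical assumptions of mine, and real risk analysis is of course much messier than a one-line multiplication.

# Minimal risk sketch: expected loss = likelihood * impact.
# Every number and name below is a hypothetical assumption for illustration.
def expected_loss(likelihood_per_year, damage_if_exploited):
    # Expected yearly loss from a single vulnerability.
    return likelihood_per_year * damage_if_exploited

vulnerabilities = [
    # (name, likelihood per year, damage if exploited, cost to find and fix)
    ("sql-injection-in-login", 0.30, 500_000, 20_000),
    ("theoretical-timing-leak", 0.001, 50_000, 80_000),
]

for name, likelihood, damage, fix_cost in vulnerabilities:
    loss = expected_loss(likelihood, damage)
    verdict = "worth fixing" if loss > fix_cost else "probably not worth chasing"
    print(f"{name}: expected loss {loss:,.0f} vs. fix cost {fix_cost:,.0f} -> {verdict}")

The catch in real life, of course, is that nobody hands you reliable likelihood numbers, which is exactly why the "our security is already adequate" argument is so easy to make and so hard to verify.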

The good news is that bad quality provides work for us issue resolvers. 100% correctness would make all of us unemployed.

Hooray?