Friday, September 21, 2012

Skill set - breadth vs. depth

I came to ponder the value of one's skill set in relation to its breadth and depth, and, human nature being what it is, I also ended up comparing the two options.
In my experience, what is most commonly sought after in job openings is deep knowledge of some limited domain. Only very rarely does somebody seem to be clearly looking for an "ICT handyman". On the other hand, it also seems that sometimes they are after the ultimate expert with deep skills in a dozen domains. Anyway, being more of a broad-skills guy, it is a bit discouraging to find so few job postings with spot-on requirements. Not that it would be crucial at this moment, as the current position feels both suitable and valued, but you know, better safe than sorry, so it's good to have at least a vision of an exit plan.

Whether it is due to diverse interests or just my "professional karma", I seem to end up doing all sorts of things and thus gaining rather broad experience. Having done coding in a dozen languages on at least three OS families, user support, reporting, system administration, DB design, network-related tasks, and all sorts of data extraction and analysis, and having become familiar with various OSs and user interfaces, it is clear that most of that is not such astonishingly deep knowledge that it would make anyone's jaw drop - but rest assured, it will get the job done. It also guarantees that I can throw myself into such obscure things as printer operation on MVS or rescuing a crashed PBX info system on a Digital Unix box. These are challenges that nobody expects to face (both platforms in these examples are long obsolete), but somebody still has to deal with them when the need arises.

This is not in any way to say that somebody with deep skills in something would be somehow worse - on the contrary, I admire people who really know their stuff inside out and from top to bottom. We also need those guys: without people like that, none of the really fancy technical stuff would be here, since all of it requires tackling such obscure issues that no layman can even imagine. I also need those guys, since if they were not there I would have no source of help after running into a wall trying to solve some hard issue.

Basically this seems to boil down to an all-embracing notion that both types are needed to form a good ICT team: those who have tackled the task before and are willing to do so again, and those who have not but are definitely willing to do it nevertheless. The former are more productive at the specific thing, but the latter should be just as good across a broader spectrum of things.

Of course, this is just my personal opinion and as subjective as it gets; opposing views are more than welcome!

Friday, August 3, 2012

Complexity is the key

This was inspired by the slogan of an IT service company promising to make things simpler. To me it looks like things in the IT world are never getting simpler, quite the opposite; we're just trying to make them look and feel simple.

About a year ago I started a personal programming project that is to be implemented as a Java web application. I started with just the Java Servlet API, but having learned long ago that building dynamic web UIs is a pain if you code them directly, I was delighted to see the issue of Java Tech Journal that presented several different Java frameworks for web UIs. I ended up with Apache Wicket, since I didn't exactly want to get involved with anything as expansive as Spring (to avoid having the project stall while I was trying to learn too many new things), and Wicket seemed simple enough for my needs. The project stalled anyway, since I had some more concrete things to learn and do, but that's another story for another blog.
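To give an idea of why it appealed to me, here is roughly what a minimal Wicket page looks like. This is just a sketch; the class and component names are invented for illustration and are not from my project.

    import org.apache.wicket.markup.html.WebPage;
    import org.apache.wicket.markup.html.basic.Label;

    // Wicket pairs this class with a HelloPage.html template that contains
    // an element marked with wicket:id="greeting".
    public class HelloPage extends WebPage {
        public HelloPage() {
            // The framework takes care of rendering and the request cycle.
            add(new Label("greeting", "Hello from Wicket"));
        }
    }

The Java side stays plain object-oriented code and the HTML stays plain HTML, which was exactly the kind of simplicity I was after.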

So, I picked a rather lightweight and simple framework, and still I think I multiplied the size of my application's code base by a factor of hundreds. The Wicket JARs take 3.7 megs (I currently use a tiny fraction of the features provided, and probably will never use even a tenth). Then there's the Servlet API, which seems a lot slimmer on the outside, and there's log4j, which is not that big either, but overall the web archive file is now around 9 megs. I run it on Tomcat, Tomcat runs on the Java Virtual Machine, which in turn runs on Linux. My very rough guess for the total amount of executable code needed for all of this is around 0.5 to 1 gigabyte with all the dependencies - and the application doesn't even do anything useful yet!

So where's the simplicity? The operating system makes things simple for the JVM, which in turn hides the underlying architecture from Tomcat so its developers didn't need to figure out all the details of making it run on different platforms. Tomcat, in turn, adds another layer of simplicity by offering services to the web application so that it can rely on the Servlet API and the like, and finally Wicket makes it so much simpler to build HTML-based UIs. I write little code, but use far more than would have been needed if the whole thing had been written from scratch without any APIs or extra abstraction layers. I save time and effort - more than a lifetime's worth, definitely, I can't deny that. It's just so clear where all the advances in hardware are spent: on making things look simple...
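For comparison, here is roughly what the same greeting looks like when written directly against the Servlet API, the layer Wicket builds on. Again a sketch only, with an invented class name.

    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class HelloServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            resp.setContentType("text/html");
            // All markup is assembled by hand - exactly the pain the framework hides.
            resp.getWriter().println("<html><body>Hello from a raw servlet</body></html>");
        }
    }

Multiply that hand-written markup by every page and form in an application and the value of the extra layer becomes obvious - as does the extra weight it brings along.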

Besides application servers, another good example of simplicity through complexity is the cloud concept. Abstraction upon abstraction, that is. I'm not opposing layered software architecture, by no means; it's a perfectly good concept for decoupling things (and thus controlling some of the complexity), just like modular code was a good "invention" back in the old days. It's just that when building upon n layers of abstraction, you're already at such a high level that to do something that would be a trivial one-liner in, say, C, you need to call a bunch of methods consisting of tens of lines of code, and by the KISS principle that's just ugly.
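To illustrate what I mean (a deliberately mundane example of my own, not anything specific to the cloud): reading a small text file, which in C boils down to a handful of stdio calls, turns into a stack of wrapped objects in plain pre-Java 7 code.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class ReadWholeFile {
        public static void main(String[] args) throws IOException {
            StringBuilder text = new StringBuilder();
            // A reader wrapped in a buffering decorator - one layer on top of another.
            BufferedReader in = new BufferedReader(new FileReader("example.txt"));
            try {
                String line;
                while ((line = in.readLine()) != null) {
                    text.append(line).append('\n');
                }
            } finally {
                in.close();
            }
            System.out.print(text);
        }
    }

Each layer exists for a reason (buffering, character decoding), but the net effect on a simple task is exactly the kind of ceremony the KISS principle frowns upon.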

Now the interesting question: is there an end to all this increased simplicity through complexity? Will the stack of abstraction layers stop growing some day?

Tuesday, May 22, 2012

Quality

It seems like a universal thing that good quality is something to aim for and bad quality is something to avoid. It seems like a simple idea to achieve good quality by doing things right. The problems seem to start right after that with the difficulty of defining what it means to do it right and the even greater difficulty of doing it according to the definition.

Here at the issue department we're mostly affected by quality as it relates to software. The literature on software development is full of different process models and methodologies which ultimately aim at pushing the quality of the end product higher. There have been numerous seminars on the topic and masses of warm air have been produced by the advocates of different standards and certifications - but what has been achieved?

The statistics say: not much.

And that's kind of sad.

Please remember, I'm talking on a generic level; there are certainly good examples of good-quality software, but there are also examples of the opposite kind. On the generic level, my guess is that the main reason for the situation is that nobody is willing to pay the price tag of really good quality, so the development (and testing & QA) teams end up with something merely adequate. This also applies to entire ICT systems: redundant, high-performing iron costs big bucks, not to mention the energy bill for keeping that hardware switched on, even though good quality might already imply low energy consumption. From both the financial and the ecological viewpoint, everything that can be cut from the idling ICT reserve is a good idea. Cutting testing resources might not be an equally good idea; at the very least it is much harder to define what is the absolutely required amount and what is extra reserve (in fact it could be argued that everything needed to establish 100% code coverage is absolutely required). And there's always the fact that it is much cheaper to do it right in the first place instead of trying to fix it later (perhaps even after some costly damage has been caused by the faults).

Until now I've totally skipped one area of quality, namely security. Certainly adequate security is an essential part of good quality: what's the use of a 100% functionally correct system that can easily be made to do things that are essentially against its purpose, right? Reading the discussions in software-related groups on LinkedIn, for example, has taught me that there is a kind of silent battle going on between enthusiastic penetration testers (ethical hackers?) and the more functionally oriented "traditional" testing pros. Or maybe it's not a battle between those two but rather with management, as with security testing it is even more difficult to measure the completeness of the testing - and one can always argue that the security attained by partial testing is already adequate given the nature of the system, since with information security we're always dealing with probabilities. If there is a lurking theoretical possibility of gaining illegal access to the system that is very unlikely to ever be realised, is there any point in putting effort into finding it (not to mention fixing it)? And if the information contained in the system is not valuable or confidential, why bother securing it? On the other hand, what's the use of any computer system that contains or processes only worthless information? Certainly it costs something to develop or acquire such a system, and also to maintain it, so if the information has no value, where's the benefit? Anyway, the recent highly public success stories of crackers show that information security is very weak in some systems that do contain both valuable and confidential information, which leads us back to what I wrote at the beginning of this rant about the state of software quality on a generic level.

The good news is that bad quality gives work to us issue resolvers. 100% correctness would make all of us unemployed.

Hooray?