
Friday, July 3, 2015

The joys and griefs of having a hardware keyboard with a smartphone

My long-awaited TOHKBD arrived among the first batch that was sent out, just before I started my early summer vacation in mid-June. The package actually arrived before the software had even passed QA at Jolla, so I had to wait a couple of days before being able to actually use the keyboard - that certainly was a chilling factor. Another chill was the M key, which I noticed didn't have any tactile response, so I assumed it was faulty. That was confirmed when the SW was made officially available and I was able to test the keyboard for real: the keypress registered only after applying a lot of force. I joined the few others with similar issues by whining about it on the TMO forum, and also reported it by email to FunkyOtherHalf. Dirk van L, the one responsible for coordinating the HW development, responded and asked me to wait and see if it settles with use - OK, fair enough, getting a replacement keypad won't be easy anyway for a gadget that has been manufactured in a small volume, strictly based on orders. As I'm now typing this post with my TOHKBD, it is obvious that things have improved a lot, although there's still a huge difference in the response and the required force. There's also another key which needs more force but has a normal response (if any Os are missing from this post, that's the reason), and that's perhaps even more distracting.

There are still open SW issues even after the couple of updates by Kimmo L, the most prominent being the lack of layouts other than American. Support for other layouts requires changes in Sailfish, and it'll take some more time until Jolla releases an update with those changes.


Despite all that, I'm happy I spent the money on the project. There's no way I could (or would) have written a text this long with the on-screen keyboard. I'm also confident that the SW issues will get resolved sooner or later. Even now it's a delight to use the terminal app, and writing in English is fine already. It seems that not all apps support all keys (tab, home, end and page up/down being the most prominent), but the community will fix that, too, eventually. Jolla, together with the community, will also eventually enable landscape mode everywhere (definitely something to wait for from Sailfish 2.0). The greatest remaining issue is the faulty keys. According to a Kickstarter project page update, all keypads should have been manually tested, but it is evident that if that really was done for all batches (or even for the batch mine came from), it was done badly. I'd hope there will eventually be some sort of compensation, with replacement keypads for those who need them. Hoping for the best!

Oh, one more "grief": the originally slim-fit phone now has a rather bulky look, which will certainly draw comments like "is that from the 90s or what" (hey, it is still thinner than my previous trusted phone, the E90!). Because of that, my Mapbagdrag case doesn't fit anymore, either. Well, maybe someone will, some day, make a nice case for the sturdier TOHs, too.



Wednesday, November 27, 2013

Large and complex projects make a large and complex mess?

I have already written earlier about the commonalities between building houses (maybe it would be better to generalise and say construction projects) and computer systems. It so happens that Tekniikka&Talous writes about the results of a survey by Independent Project Analysis (IPA), which IPA's head of Europe and Middle East, Mary Ellen Yarossi, presented at a seminar held by Bentley in London at the end of October. I couldn't find any international news coverage of the event, and the full survey results are likely not freely available, but I'll recap here some of the highlights T&T published on it.

IPA had analysed almost 17 000 construction projects (world-wide), of which 500 were classified as mega-projects. Of those 500, two out of three had failed, meaning the schedule or budget had overrun by more than 25% or there were still unresolved issues two years after the official end of the project. The survey also concluded that large projects fail more often than small ones, and that projects which use new technology fail more often than the rest.

Now, why doesn't that sound surprising? And from projects with new technology it's pretty easy to draw an analogy to software projects. I mean, new technology is introduced pretty often in this trade (the construction business, in contrast, is usually quite conservative, but the direction is toward ever more challenging environments and larger structures, which require new technologies).

Curiously enough, the survey also concluded that a common problem was that the objectives set for the project were not clear to all parties, or the objectives were not understood the same way, or they were in conflict (analogous to common issues with software requirements). They also noted that in long projects a schedule slip in one phase is thought to be recoverable by shortening later phases, although in reality that only introduces more problems (as far as I know, testing often tends to get this kind of treatment in non-agile SW projects, which was also brought up in the "Obamacare" case lately).

As the seminar was held by a software house, it should be no surprise that there seem to be common factors with the software business, but I find these pretty striking and central. Thus, it might be possible to learn from the mistakes of the other trade when it comes to these things - and it's not just (us) software folks who are not that good with large and/or complex projects (it is us humans who are miserable at sticking to good practices and proper processes).

Friday, June 7, 2013

Of house and systems building

It happened that one evening I was watching an episode of the British TV series Grand Designs, portraying a typically overly-optimistic and imaginative house building project in Exeter (you may be able to view the episode via YouTube). The gotcha of the series is that all the projects are focused on truly grand designs, leading almost always to a doubled budget and a severely stretched schedule. Now, if that doesn't sound enough like an average system development project, let me add that quite many of the building projects also have legacy components (all kinds of conversion and extension projects), the plans are modified along the road to accommodate the failing budget and/or schedule - or the changing vision of the project owner(s) - and the project is not necessarily led by a professional of the building industry or architecture.

This specific episode was somewhat unique in the sense that this time the plans really were changed remarkably, in key points, multiple times as the project went on, and the host of the show looked really doubtful about the owners' chances of ever getting a decent house. As usual in the series, the end result indeed was a decent house, unique for sure but not in a weird way, and definitely not ugly to look at (I wonder if they just leave out all the miserable failures or if the house builders are just so darn lucky).

Software and system development projects have been compared to all kinds of more traditional projects throughout history, and the software industry has also tried to take influence from other industries (like lean, which originates in the car industry). The inherent differences between software and its comparison points that might make the metaphor lame are debatable, as cars and houses (and maybe even more so the huge projects like bridges, dams, etc., not to mention nuclear power plants) really are quite complex and technical these days.

The textbook recipe for success is to have a rigorous working process that is not abandoned when difficulties arise, to avoid touching legacy parts when possible, and to have professionals who are knowledgeable about the task they've taken on, at all levels of the project organisation. Many times it just doesn't go like that, I think. Many times there will be a sort of cowboy mentality, an "agile" - or perhaps better called anti-agile (as it does not have much of anything to do with the definition the signatories of the Agile Manifesto had in mind) - way of working. Many times it is indeed the legacy parts that need to be touched, which is harder, more expensive and more error-prone. Many times the key persons might be professionals of some trade other than software (no-one really expects non-coders to code, but surely any person with a technical mindset can be an architect, and financial skills are enough to lead the project, because then the project will surely be on budget, right?)

The success rate of software projects is not very flattering. I haven't got any recent figures at hand, but Robert Kraut and Lynn Streeter, in their 1995 work "Coordination in Software Development", referred to Robert Balzer's statement dating as far back as 1975 (!) that software is still unreliable, delivered late, does not respond to changes, doesn't hit its performance targets and is expensive. Well, has that changed since then? I cited that paper in my MSc thesis in 2007, and my conclusion is still the same: that still happens in large numbers.

Looking at the news here, I'd say that the success rate of building projects is not very good either, especially for the ones that are not large and demanding enough to clearly require the best practices and the best people. Just like the resulting software is likely to have security issues, the resulting buildings (at least in this country) are likely to have issues with moisture and/or an unhealthy atmosphere inside the building. Neither of those would be hard to avoid, though, either by thinking about security already in the design phase (for software) or by applying basic building physics in the design phase and carefully sticking to the plan through the building phase (in construction work). But like in the TV series, when asked about the expected completion time, it turns out that there is no detailed schedule, the project owners have the habit of changing the design as they go along (each change perhaps leading to other changes), and the project lead might be technical, but not exactly from the right branch of science...

As for the couple featured in the TV show, I have absolutely nothing to say about how they dealt with the house they were building for themselves - it is solely their own decision and risk. After all, they were amateurs when they started the project (as much as I am when it comes to building a complete house), and should thus be allowed to learn as they go. However, the same mentality and way of working would be very strongly objectionable if it occurred in the context of professional work.

So, what all this babble boils down to, I assume, is that some software projects are either led or completely run by a bunch of amateurs, or a bunch of people acting like amateurs, which is sad and unfortunate (both for the industry itself and for the users of the software). I'm not into pointing fingers at people, and I must confess that I've made plenty of amateurish mistakes so far and will likely discover new amateurishness in my actions in the future. Live and learn, or should I say fail and learn...

Since my sources are a bit dated already, I did some quick googling on the current situation:
Why Projects Fail - Facts and Figures
The Failed Record of the Software Industry
2010 IT Project Success Rates

It would seem that there has, indeed, been some improvement, and the agile movement really seems to be able to deliver what it has promised. However, the papers referred to in the above sources still contain the same complaint and worry: that it is the people who make projects fail. People who do not act in a professional way.

From a certain angle, computers creating their own software sounds like a good idea, even if it is quite a sci-fi sort of an idea...


Post addendum

Some more survey results:
http://thisiswhatgoodlookslike.com/2012/06/10/gartner-survey-shows-why-projects-fail/
http://www.galorath.com/wp/software-project-failure-costs-billions-better-estimation-planning-can-help.php


And then, later still (13th Nov 2013):
A good example of a gigantic failure in a gigantic project is the US healthcare.gov project that has been under scrutiny lately.

Thursday, March 28, 2013

How quality degrades

I by no means claim that the following doesn't fall into some sort of logical pit along the way, but I hope my idea will be conveyed anyway.

In a layered architecture each layer has its individual degree of quality, but the total quality is not a sum over the parts but more like a product over them, meaning that adding a top-quality layer does not improve the whole (though it doesn't make it any worse, either) and adding a less-than-perfect one will drag the total quality down somewhat. Actually adding quality would mean fixing a bug in another layer, which is not a good idea to do anyway. The whole can be as bad as the worst part within it.
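
To make the product idea concrete, here's a minimal sketch in Python; the layer names and quality figures are made up purely for illustration:

# Toy model: treat each layer's quality as the probability that the
# layer behaves correctly, and the quality of the whole stack as the
# product over all layers. All names and numbers are hypothetical.
layers = {
    "network": 0.95,
    "browser": 0.99,
    "service backend": 0.999,
    "web UI": 0.98,
}

total = 1.0
for quality in layers.values():
    total *= quality

print(f"stack quality: {total:.3f}")  # ~0.921, worse than any single layer

Adding a perfect layer (quality 1.0) leaves the product unchanged, while any imperfect layer drags it down - which is exactly the intuition above.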

What does that mean, then?

This week I had my first go at a web-based service for creating, editing and publishing video content. All the pain of producing nice enough, coherent and clearly voiced material aside, I also ran into some problems with the tool itself. I do not know if the service itself occasionally had trouble handling all the traffic, or if my network connection was somehow bad (the office WAN connection is known to have latency issues), or if it was the fault of the browser (I tried both IE and Firefox, though). Nevertheless, I ended up logging in and out and restarting the browser, and lost two good audio takes due to all that, which is not adding any glory to the service or its provider.

How do regular users see that kind of thing? They point at the service provider and say "your service is not good", as they do not see the different pieces in the whole. Is the service provider at fault? Many times that might not be the case. Does the reputation of the service provider take a hit? Yes. Does it take the user support person a great amount of patience to get across the idea that the service provider is not (and cannot be) responsible in any way for what lies between the user and the service? Most likely yes.



Thus, it is really important to ensure top quality all over the stack, and also to beware of the pitfalls of having your product stacked with layers you have no control over, since they may affect your brand. There are of course ways to protect against things like network glitches by properly handling all error conditions and taking care to handle communication time-outs gracefully - all signs of good-quality software!
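
As a minimal sketch of what handling time-outs gracefully could look like in practice (the URL and the retry policy here are hypothetical, purely for illustration):

import time
import urllib.error
import urllib.request

def fetch_with_retries(url, attempts=3, timeout=5.0, backoff=2.0):
    """Try the request a few times, backing off between attempts."""
    for attempt in range(1, attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                return response.read()
        except (urllib.error.URLError, TimeoutError) as err:
            if attempt == attempts:
                # Out of retries: report a clear error instead of
                # silently losing the user's work.
                raise RuntimeError(
                    f"service unreachable after {attempts} attempts") from err
            time.sleep(backoff * attempt)  # simple linear back-off

data = fetch_with_retries("https://example.com/api/videos/save")

A transient network glitch then costs a short delay instead of a lost audio take, and a persistent failure produces an honest error message rather than a mysteriously misbehaving service.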

Tuesday, May 22, 2012

Quality

It seems like a universal thing that good quality is something to aim for and bad quality is something to avoid. It seems like a simple idea to achieve good quality by doing things right. The problems seem to start right after that with the difficulty of defining what it means to do it right and the even greater difficulty of doing it according to the definition.

Here at the issue department we're mostly affected by quality related to software. The literature on software development is full of different process models and methodologies which ultimately aim at pushing the quality of the end product higher. There have been numerous seminars on the topic, and masses of warm air have been produced by the advocates of different standards and certifications - but what has been achieved?

The statistics say that not so much.

And that's kind of sad.

Please remember, I'm talking on a generic level; there are certainly good examples of good-quality software, but there are also examples of the opposite kind. On the generic level, my guess is that the main reason for the situation is that nobody is willing to pay the price tag of really good quality, so the development (and testing & QA) teams end up with something merely adequate. This also applies to entire ICT systems: redundant and high-performing iron costs big bucks, not to mention the energy bill for keeping that hardware switched on, even though good quality might already also mean low energy consumption. From both the financial and the ecological viewpoint, everything that can be cut off from the idling ICT reserve is a good idea. Cutting off testing resources might not be an equally good idea; at least it is much harder to define what is the absolutely required amount and what is extra reserve (in fact, it could be argued that everything needed to establish 100% code coverage is absolutely required). And there's always the fact that it is much cheaper to do it right in the first place than to try to fix it later (perhaps even after the faults have caused some costly damage).

Now, until now I've totally skipped one area of quality, namely security. Certainly adequate security is an essential part of good quality, as what's the use of a 100% functionally correct system that can easily be made to do things that are essentially against its purpose, right? Reading the discussions in software-related groups on LinkedIn, for example, has taught me that there is a kind of silent battle going on between enthusiastic penetration testers (ethical hackers?) and the more functionally oriented "traditional" testing pros. Or maybe it's not a battle between those two, but rather with the management, as with security testing it is even more difficult to measure the completeness of testing - and one can always argue that the security attained by partial testing is already adequate due to the nature of the system, since with information security we're always dealing with probabilities.

If there is a lurking theoretical possibility of gaining illegal access to the system which is very unlikely to ever be realised, is there any point in putting effort into finding it (not to mention fixing it)? And if the information contained in the system is not valuable or confidential, why bother securing it? On the other hand, what's the use of any computer system which contains or processes only worthless information? It certainly costs something to develop or acquire such a system, and also to maintain it, so if the information doesn't have any value, where's the benefit? Anyway, the recent highly public success stories of crackers show that information security is very weak in some systems which do contain both valuable and confidential information, which leads us back to what I wrote at the beginning of my rant about the state of software quality on the generic level.

The good news is that bad quality gives work to us issue resolvers. 100% correctness would make us all unemployed.

Hooray?