Wednesday, November 27, 2013

Large and complex projects make a large and complex mess?

I have already written earlier about the commonalities between building houses (perhabetter generalised as construction projects) and computer systems. It so happens that Tekniikka&Talous writes about the results of a survey made by Independent Project Analysis (IPA), which IPA's head of Europe and the Middle East, Mary Ellen Yarossi, presented at a seminar held by Bentley in London at the end of October. I couldn't find any international news coverage of the event, and the full survey results are likely not freely available, but I'll recount here some of the highlights T&T published on it.

IPA had analysed almost 17 000 construction projects (world-wide), of which 500 were classified as mega-projects. Of those 500, two out of three had failed, meaning their schedule or budget overran by more than 25%, or there were still unresolved issues two years after the official end of the project. The survey also concluded that large projects fail more often than small ones, and that projects which use new technology fail more often than the rest.

Now, why does that not sound surprising? And from projects with new technology it's pretty easy to draw an analogy to software projects. I mean, new technology is introduced pretty often in this trade (the construction business, in contrast, is usually quite conservative, but the trend is toward ever more challenging environments and larger structures, which require new technologies).

Curiously enough, the survey also concluded that a common problem was that the objectives set for the project are not clear to all parties, or the objectives are not understood the same way, or they are in conflict (analogous to common issues with software requirements). They also noted that in long projects a schedule slip in one phase is assumed to be recoverable by shortening later phases, although in reality that only introduces more problems (as far as I know, testing often gets this kind of treatment in non-agile SW projects, which was also brought up in the "Obamacare" case recently).

As the seminar was held by a software house, it should be no surprise that there seem to be common factors with the software business, but I find these pretty striking and central. Thus, we might be able to learn from the other trade's mistakes in these matters - and it's not just (us) software folks who are bad at large and/or complex projects (it is us humans who are miserable at sticking to good practices and proper processes).

Tuesday, November 12, 2013

Value in simple tools: psloggedon

Scenario: You notice a stream of requests ending up in a production server error log that clearly hints at a misconfigured software client on a laptop (running Windows) in the company network. Checking the IP address against DNS gets you the computer name, but checking that against the company CMDB only gets you the name of a former employee. The errors in the log are ugly and you want to get rid of them, but as you don't have admin access to laptops, you can't use the regular Windows admin tools to figure out who's using the darn thing. Asking a workstation admin for the information might be one way, but there's also another way...

Solution: Grab psloggedon from the PsTools package (and never mind if your copy is six years old, as mine was). Issue
psloggedon \\hostname.yourdomain.com
on the command line, and there you'll see all the accounts that are currently logged on, so you can contact the user in question.
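The whole lookup, from the IP address in the log to the logged-on user, might go something like this at a Windows command prompt (the command names are real, but the IP address and hostname here are made up for illustration):

```
rem Resolve the offending IP address from the error log to a computer name
rem (10.1.2.34 is a hypothetical address)
nslookup 10.1.2.34

rem List the accounts currently logged on to that machine
rem (LAPTOP-1234 is a hypothetical hostname returned by the lookup above)
psloggedon \\LAPTOP-1234.yourdomain.com
```

Note that psloggedon does its remote query over the network, so it works even without admin rights on the target laptop itself, which is exactly what the scenario above needed.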


I acknowledge there might be plenty of other ways, too (I myself also tried using msg to message the user, but that seemed to be blocked somehow, or maybe it was a conflict between Win7 and WinXP), and you're welcome to share similar stories in the comments.