Saturday, 25 October 2008

The testers of doom!

Gather round, make yourselves comfortable, I have a story to tell.  It's a true story, one that I omitted from my epic "How I got here" story for reasons of brevity.  This particular story, for those interested in continuity, should be inserted at the end of Project Doom (described in Part 15).

This is the story of the unique clusterfuck that was the testing of Project Doom!

Project Doom was not a successful project, not by any definition of the word.  The goals were both trivial and impossible: they would require herculean feats, but the end result was "neh, but so what?".  It was led by an idiot and implemented by morons (except for me, and perhaps a couple of others, if they were in the right mood).  But still, all the design and build problems were nothing (NOTHING!) compared to what happened during the testing phase.

The initial signs weren't good: Paranoid Andy had hired yet another functional analyst (functional analysts being the only group of humans he ever attached any value to) to lead testing.  She was exactly the sort of person Paranoid Andy liked: stupid, arrogant and openly contemptuous of technology.  [I'm trying to think of a pseudonym to give her, for the purpose of this post, but I've completely forgotten what her real name actually was; I don't want to accidentally call her her real name, just in case.] Let's call her Emma.  [I'm sure that's not her real name, it's the real name of someone else I've slagged off elsewhere...]

I have no patience for, sympathy with, or understanding of people who choose a technology-based career but then spend their whole time railing against technology: "oh you want a cron job, can't you just do it manually every morning?"  Get the fuck out!  Marketing is next door.

Emma also had no understanding of testing methodologies.  Unfortunately that is a common disease; about the only place I've worked that did anything approaching thorough testing was Job No. 4, and even there the testers were vetoed by higher management and the product launched to instant, high-profile public failure.  Job No. 4 didn't do anything particularly radical, but they were the only bunch of testers who understood the concepts of releases, environments and re-testing.

She wasted no time in pissing everybody off, and in true Paranoid Andy minion style she pissed the client off too.  She wrote the most patronising "Defect Management Guide" I've ever seen (it consisted almost 100% of bad practices and anti-patterns, and was written in the style of a five-year-old) then spent a whole week talking down to the client about "that's the way it's done".  That particular member of the client's staff had been in the software game for fifteen years; she'd been in it for six months, but that didn't stop her...

Although not in my team, Emma was junior to me in the company grading system, which meant I was asked for input into her performance review.  I tried to make the point that both her technology skills (bad, given her job) and her client skills (bad, given the nature of the business) were lacking.  Paranoid Andy vetoed it, saying: "no, I like the way she forces things through."  (This was long after Project Doom was rejected by the client, but Paranoid Andy was still refusing to see it; he was his own worst enemy, that bloke.  If he had only accepted that he didn't know technology, let go of that side of things, and concentrated instead on his upward management, he would actually have got a lot done.)

She also had a unique approach to writing test scripts.  She didn't believe in testing that the built product complied with the designs/mock-ups/etc.; instead she did two things (at opposite ends of the idiocy spectrum).  She wrote half the scripts by looking at the latest build of the system, and would therefore, by definition, miss any missing functionality.  Then she wrote the others by going back to the client's functional team and asking "so then, if you were designing a system for X, what would go in it?", opening huge cans of worms which would treble the scope - only two weeks were assigned for testing, and there was no iteration two; it was this or the project would be abandoned.  And, once the scripts were written, the expected results could never change; this caused fun when fixing a defect - the result would change, and she didn't like that.

In other words, everything was doomed before testing even began.  Even I didn't expect things to get worse, but they did, mostly because The Puppet got involved too.  Emma or The Puppet on their own were quite dense, each capable of single-handedly crashing a project, but the pair of them together?  They formed a perfectly dense singularity, one that would consume all around it; the slightest bit of sense that went anywhere near it would be lost, never to be seen again.

The Puppet's first decision was to turn off the continuous integration server.  Why?  "Because you'll only be fixing on Wednesdays and Fridays."  The idea he and Emma had come up with was that the developers would do the bulk of the testing, lining up a list of defects to fix, and then we'd fix them all on just two days.  This was nonsense for two reasons: first, developers make bad testers; and second, they obviously didn't understand simple versioning.  They reckoned the test environment would instantly change with each bug fix - "we have to know what we are testing" - and envisioned all the developers, one defect each, committing their fixes at exactly the same moment so as to maintain the integrity of the testing environment.  (I did try to educate people, but they didn't want to know.)

The idea of using developers as testers quickly went wrong too (who'd have thought it!).  Each developer was given an area of functionality to test, usually separate from the one they'd built originally (which was actually a good idea, compared to testing your own work), but then one of three things happened:

  1. The developer got bored, and refused to do it - usually citing a critical issue elsewhere to avoid it.
  2. The developer "tested it", but didn't realise it didn't work - after all, they'd seen the cack-handed decisions that went into the project and assumed that nonsensical results were what the client actually wanted; they hadn't quite figured out that The Puppet was incompetent.
  3. The developer tested it, and raised errors which weren't actually errors.
Not once did a developer find a real defect.  The practice of developer testing soon fell into disuse, with the bulk of testing being picked up by Emma and The Puppet themselves.  There was more than enough fixing to keep the developers busy anyway.

Also quickly abandoned were the clockwork builds.  The original plan was for three builds and three rounds of testing, but not even The Puppet could pretend everything was going to be fixed in three rounds, and a policy of builds-on-demand started instead, to allow a quicker turn-around for retesting problem areas.  Unfortunately the sophistication of the testing management couldn't keep up.  A build number was soon added to the application, but no tracking was done of which build contained which fixes.  This usually resulted in each defect having twenty or so duplicates.  I recommended that duplicates be merged, but Emma was having none of it: "they are both defects".

The first pass quickly showed that nothing worked.  In fact it wasn't really a pass.  Emma put a stop to it after about ten minutes.

Emma: "It doesn't work."

Me: "I know that, have you not been listening to me for the past six weeks?  It doesn't work.  However, as far as I'm concerned they are all design flaws; the build phase is finished.  If you find any coding errors please let me know."

Emma: "It's not really worth my time testing it then."

Me: "Don't test it if you don't want to.  But everything has been built correctly as per The Puppet's micro-management; he's made it quite clear that he doesn't like developers making their own decisions, therefore nothing is going to change unless someone raises a few defects."

Emma: "But this is a waste of my time if it doesn't work."

Me: "I know you know nothing about testing, but you don't have to make it so obvious.  Let me give you a clue.  It's your job to go through the product, and here's the key thing, make a list of the things that don't work and explain the correct behaviour.  Do you see?  Then we will fix things, then it will work a little bit better, and then you test it again.  Do you see?  That's why it's called testing.  Test-ing.  The clue is in the name."

(I did warn you I have no patience for these people, I've been around the block often enough to see that such clueless people have no positive impact on a project.  I'm certainly not going to go out of my way to accommodate them.)

Very quickly we had a very long defect list, although I was surprised how many obvious defects hadn't been noticed.  It was almost 100% cosmetic issues, with a few very obvious ones ("500 - Internal Server Error", etc.); the more fundamental defects, like the fact that it simply failed to show the correct content, were missed.  I'd been in situations before where I'd noticed problems the testers had missed; in those cases I'd wait until a quiet time and then leak the information to the testers - that way it would be fixed before it blew up in production.  This time, however, I had so much latent anger and resentment from the build phase and the previous Paranoid Andy shenanigans that I wasn't going to point anything out; I was going to conspire to keep things hidden if possible.

The Puppet was shocked by the list of defects.  This was partially theatre on his part, to deflect anybody wondering how he let it happen; but mostly he had underestimated the probability that his mandated hacks (that all the developers were against) would blow up.  (He didn't learn either, he'd just make ever more outlandish demands in later projects.)  He was facing the prospect that we wouldn't be able to fix the defects given the time available, so came up with an ingenious solution.  He reasoned that if defects were assigned to the people who were expert on the component which had the bug it would take too long, so instead he sent database defects to the HTML developers, front-end bugs to the database developers, HTML bugs to the back-office team, and so on.  "We have a deadline" was his explanation.

Things, not to put too fine a point on it, weren't going well.  Each developer had several dozen defects in components they didn't understand.  Since my team had originally built most of the internal-facing core of the system, we got all the edge-case public-facing defects to fix.  This meant examining Java code written by people who usually struggled with HTML (and that was as close to a programming language as they got)... ZOMGWTF?  I had never seen anything like it; I'd thought some of the original legacy code was as bad as it got...  I was wrong.

In hindsight it shouldn't have come as a surprise that a group of people who'd never done any programming would write such crap, but I somehow never expected to see anything quite that bad authored by someone who was paid to produce it.  There was one class per HTML page - this was mandated by the framework (which was another clusterfuck, I'll save that for a future story) - but apart from that it was just one blob of code.  Each class had one method, and one method only; similar functionality needed in different classes was mercilessly copied and pasted (and, on occasion, selectively commented out); the bulk of the logic was nested if statements... 20 to 25 levels deep!
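To give a flavour of the style (a made-up miniature, not the actual code - the real thing went 20-25 levels deep, three is enough to make the point), and of what every fix ended up doing to it, here's the nested-if blob next to the same logic flattened into guard clauses:

```java
public class PageRenderer {

    // Before: the nested-if blob (reconstructed for illustration).
    static String renderNested(String user, boolean loggedIn, boolean admin) {
        String result = "";
        if (user != null) {
            if (loggedIn) {
                if (admin) {
                    result = "admin page for " + user;
                } else {
                    result = "user page for " + user;
                }
            } else {
                result = "login page";
            }
        } else {
            result = "error page";
        }
        return result;
    }

    // After: the same logic as guard clauses - each condition exits early,
    // so the happy path reads top to bottom with no nesting.
    static String renderFlat(String user, boolean loggedIn, boolean admin) {
        if (user == null) return "error page";
        if (!loggedIn)    return "login page";
        if (admin)        return "admin page for " + user;
        return "user page for " + user;
    }

    public static void main(String[] args) {
        // Both versions agree on every input.
        System.out.println(renderNested("alice", true, false)); // user page for alice
        System.out.println(renderFlat("alice", true, false));   // user page for alice
    }
}
```

At three levels it's merely annoying; at 25 levels, copied and pasted across classes, it's a couple of days per fix.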

Each fix would take an age, a couple of days usually.  It would require mentally parsing about 10,000 lines of code - and its near-identical clones in other classes - applying the fix, and then trying to identify and contain side-effects (which was near impossible; we're talking about code which hardcoded list indexes - the page title was the 13th item in the list because the database at the time only had 12 pieces of content).
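For anyone lucky enough never to have seen the hardcoded-index trick, it looks roughly like this (again a reconstruction; the real code pulled content rows from the database, and the key name below is made up):

```java
import java.util.List;
import java.util.Map;

public class TitleLookup {

    // Before: "the page title is the 13th item" - true only while the
    // database holds exactly that content, in exactly that order.
    static String titleByIndex(List<String> content) {
        return content.get(12); // breaks the moment a row is added or removed
    }

    // After: fetch the title by a key, so the size and order of the
    // content set no longer matter.
    static String titleByKey(Map<String, String> content) {
        return content.get("page.title"); // hypothetical key name
    }

    public static void main(String[] args) {
        System.out.println(titleByKey(Map.of("page.title", "Polar Bears")));
    }
}
```

The index version doesn't just break when content changes - it breaks silently, handing you the wrong item instead of an error, which is exactly the sort of side-effect that made each fix so hard to contain.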

Amateurish wasn't the word.  I would have sworn it was deliberate sabotage if I hadn't known the authors better.

I would "fix it" by simply deleting the dead code - i.e. the unreachable bits - and then doing things in a more sensible way.  This was how I knew full well that only a fraction of the bugs had been identified: the code that was required for certain features simply didn't exist.  I had deleted it!  (I didn't delete it out of sabotage, but because it was unreachable and got in the way; fixing it would have meant re-writing anyway, but in the meantime it might as well be erased.)

Even more worrying, of course, was that the amateurs who originally wrote this crap were now being assigned fairly big-picture defects.  The damage they were doing... well, actually, they were too scared to make any changes when they were faced with real code, so they didn't do any damage, but they didn't do any good either, the defects came back to my team in the end.

So, with a crap product, shedloads of bugs, and only three people who could fix them, The Puppet took a great deal of time and effort to prioritise defects and manage client expectations...

...only joking!  The Puppet demanded that all defects were fixed, even the most trivial that no-one even noticed - e.g. one page element being 301 pixels long instead of 300.  He also decided to change functionality at random, then change his mind again, then go back to the client functional managers and start free-ranging decisions about how data should be entered/archived/versioned/etc.  We hadn't got enough time to fix the currently known defects, but he was trying to push a version 2 through at the same time.

Not one of the development team knew what the finished version was supposed to look like; it changed so often.  Just when we were getting used to something working a certain way, The Puppet would raise a new defect: "On page X there are three items chosen by relevance; there should be four: two by relevance, and two by fixed assignment."

"Err, since when?" I would say.

"Since this morning, that's what Random Client Person wants."

So I would, with great difficulty (because of the state of the code), make the change.  In the next test pass I would receive a new defect: "Why was this changed?  Put it back."  During the full-scale slanging match that quickly followed, it transpired that The Puppet hadn't actually wanted the first defect fixing.  I was getting quite annoyed, really quite annoyed, but in an eloquent-but-loud way stated that if a defect doesn't want fixing then: 1) it probably shouldn't be logged as a defect, we have a separate category for 'enhancements'; 2) it definitely shouldn't have been assigned to me; and 3) he should have said as much when we were talking about it.  The Puppet couldn't quite see my point of view, and blamed me for wasting time; I then said that, in order to avoid wasting time in future, I would be ignoring all defects unless I had an unambiguous email asking me to fix them, and I'd only accept it if it came from The Puppet.  He said he was too busy.

I don't know if what I did next went too far, or didn't go far enough?  There were only three people who had any chance of making this work, but The Puppet was still determined to treat them as slaves.  I needed to make a stand, I needed to throw some weight about, a way of saying "if you don't have me on side, this project isn't going anywhere."  Originally I was going to reassign the defects back to The Puppet, "send them on to me when you've finished reviewing them".  But I quickly found that my permissions didn't allow that.  There was no way for anyone in the development team to reject a defect, that shit only rolled downhill.  Earlier on in the project this was allowed, but Emma had revoked those permissions after we combined duplicates together, "we're losing defects" she said, "if it's assigned to you, you have to fix it."  However, The Puppet and Emma had forgotten one key fact, the defect tracking database was on a development server.  To cut a long story short, I deleted all defects that were assigned to me and went home early for the weekend, "there's nothing on my list."

In a way it worked.  I came back the following Monday expecting all sorts of shit to be going down, people going mad, and specifically people fuming at me for sabotage with only a week to go.  But instead: nothing.  They hadn't noticed.  So doomed was the project that me just saying I'd done stuff was accepted, no-one had any way of proving otherwise.

This then became my default policy for the rest of the project.  Emma had an annoying habit of raising defects every day, even if we hadn't had a build in the meantime; she would know there hadn't been a build, but still somehow expected the original defect to have gone away.  Every time, she got on a moral high horse about raising defects.  "Hmm, give me two minutes... O.K., that's my defects 'fixed', what's next?"  Defect numbers were assigned sequentially, so I expected someone at some point to realise there were huge chunks missing... but nobody ever did.

I tried to promote this same strategy as a means of stress relief to my more favoured minions.  But I couldn't just tell them to delete defects, that would have gone too far.  Instead I said, "hmm, you need a way of making these problems go away... notice how I didn't use the word 'fix', I said make them 'go away'... incidentally, you have the username and password for that database don't you?"  The poor innocents didn't quite understand what I was saying.

I'm making myself sound really unprofessional here, but you weren't there.  The end result was still better as a result of me performing these shenanigans than it would have been had I not been there at all.  It was a mercy killing.  My conscience is clear.

Vindication of this approach wasn't long in coming.  The Puppet quickly abandoned the idea of ever getting the system working in any meaningful sense of the word and set about his own unique attempt at sabotage.  His focus was the database, and specifically hacking it until the front-end changed how he liked.  The original idea for the project was based on an almost religious belief that simplistic "relevance rules" would produce a cohesive website; they didn't.  Not only that, but because they were all implemented by an idiot - The Puppet's Puppet, to be precise - they effectively produced random results.  A web page with the theme of Polar Bears would have a picture of the Great Pyramids of Giza attached to it (believe me, it took a world-beating amount of cretinism on the part of The Puppet's Puppet to fuck it up that much - the design was bad, but it wasn't that bad).

Instead of fixing the relevance rules, or even better, replacing the rules with a respected published algorithm rather than the home-brew nonsense that The Puppet and Paranoid Andy designed, The Puppet just swapped the picture of the pyramids with a picture of a polar bear - and kept all the metadata intact.  This had some obvious side effects, the article about the pyramids now had a picture of a polar bear on it, but The Puppet didn't notice.  He was "fixing things", so he told everyone.

All these shenanigans started at version 12, with quite predictable consequences when we deployed version 13.  "Why aren't I getting a picture of polar bears anymore?"  So back went the test environment to version 12.  We were expected to fix things without actually making any code changes, which of course didn't work.

So we split the test environment into two: one to stay at version 12, the other to have up-to-date code.  But Emma and her minion could not quite grasp the concept; they kept logging defects (which would mysteriously disappear) stating that the two environments were different, and they wouldn't take "that's because they are different" for an answer.

Emma: "Yeah, this defect you fixed in environment B... you have failed to fix it on environment A, so I've created three more identical defects and assigned them to you."

Me: "As I've explained ten times already, and believe me this is going to be the last time, that's because environment A hasn't been updated due to the express instructions of The Puppet.  Give me the word and I'll update the fucking thing!"

Emma: "We want you to update it, but not change the code."

Me: "Look, shut up, go away."

Emma: "I'm going to keep raising defects..."

Me: "OK, I'll update it, you'll just completely destroy The Puppet's demo tomorrow, I don't care."

Ten minutes later:


The Puppet: "Where are my polar bears! NO!!"

I quickly gathered my coat and went home before he figured it out.

Did I mention it never went live?

The project ended shortly after testing was declared a success.  I still don't know how Paranoid Andy managed to claim that with a straight face.  He had to be a psychopath to be that good at lying.  The internal users loading content was the beginning of the end.

Client: "Hmm, I've just loaded an article and a picture about the Model-T Ford, but I've got a picture of Myra Hindley on the page?"

The Puppet: "Can you put that in the defect tracker?"

Shortly afterwards the client project sponsor quit, a moratorium on new projects (including Project Doom) was introduced, and an announcement was made that the whole department would be closed.  A small price to pay to ensure that Project Doom never saw the light of day!

1 comment:

  1. Just found your blog, attracted by the 'testers of doom' title and have to say that sadly I recognise most of the aspects and characters in it.
    Off to read more of your tales...
