Thursday, November 25, 2010

When to use written test cases?

Recently I've had several talks about the use of written test cases with various testers whom I tutor. The subject has gained remarkable feedback lately, so it looked well-cooked enough to publish.

Manual testing with written test cases tends to be one of the most expensive forms of testing. Why so? Some of the reasons follow.

If running the test (or repeating it) is neither expensive nor extremely time-consuming, then writing down in detail what to test usually takes a lot longer than actually setting the test up and running it. So from a tester's perspective it's not a very wise choice of cost vs value. My personal worst (I'm not proud of it, but my manager at the time required it) was around 5 hours spent writing a test case down in really fine detail, while running it took 5 minutes. What makes it "my personal worst" is that the test case was written for very testable functionality that took only 1 minute to set up and was easy and cheap to run again as many times as required. So in total I spent over five hours on a less-than-5-minute, easily repeatable test.
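To put rough numbers on that trade-off, here is a minimal Python sketch using the figures from the anecdote above; the run count and the minutes saved per documented run are my own made-up assumptions, purely for illustration.

    # Figures from the anecdote; run count and per-run saving are invented.
    writing_minutes = 5 * 60   # time spent writing the detailed test case
    run_minutes = 5            # time to execute the test once

    runs = 10                  # assume the test gets repeated 10 times
    with_doc = writing_minutes + runs * run_minutes   # 350 minutes
    without_doc = runs * run_minutes                  # 50 minutes

    # The write-up only pays off if it saves time on every run.
    saving_per_run = 2         # assumed minutes saved per documented run
    print(writing_minutes / saving_per_run)  # 150 runs to break even

Even with a generous assumption about time saved per run, the document pays for itself only after a very large number of repetitions.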

Someone with a higher level of qualification usually has to write the test case and review it several times, since mistakes in test cases can cause various problems, from false positives (reported bugs that aren't real and appear only because of mistakes in the test case) to false negatives (real bugs that get accepted as correct behavior because the test case instructions were misinterpreted). That means the time of a higher-level specialist, which is usually more expensive, is spent on this task. In addition, it takes a lot of time to think up and write down every possible combination of how to test something, since the number of possible test cases is almost unlimited. And the worst part is that there are usually too many unknowns about the product to reduce the number of possible combinations to a reasonable level.

If the test case is written in advance from some other document (a use case, a requirements document), then some areas are often left uncovered in the test case, either because all possible combinations can't be thought of at that time, because some things are still uncertain when the test case is written, or because some requirements have changed between writing the document and actually doing the testing. In my experience, I usually find the majority of bugs in things that are not written down in the test cases or in any other documentation.

If detailed test cases are written afterwards for someone else, it's usually a good idea to find out for what purpose and what will be done with them. If they exist so that someone can review what has been done, there are usually much cheaper ways to achieve that. For example, use a screen-recording tool to record the tests as they are done. You can even organize the recordings like documents, with versions and so on; 1 GB of hard drive storage costs less than 0.1 USD. Or just write a test report describing the test coverage at the end of each time cycle or mission.

But when should you use written test cases, then?
Written test cases are the most formal way to test and should be used when the highest possible level of formality is required. The reason can be a business decision or a contractual obligation, but in either case the additional time required must be taken into account.

Written test cases are also useful if you can run the tests only a limited number of times and must make sure to cover every possible and required scenario with really limited reruns. And again, the need for extra resources must be considered.

Written test cases might also be needed if something additional is done with them, for example if certain tests need to be automated for any number of reasons. Then the test cases serve as use case documents for the automated tests, in which case they must be maintained and updated like any other product documentation, along with the automated tests, so they can be used properly in the future.

My personal experience is that most software development projects have neither the need nor the budget for fully formal testing, i.e. doing all testing in written test case form.

Thursday, July 8, 2010

How to hide problems in software behind business rules (and what happens afterwards). Case Study part 2

A few weeks have passed, and after working through some very difficult red tape I succeeded in getting the necessary information. It only took me around 6 emails and 3 phone calls to get an answer to the question "How are payments made after 23:00 handled, and why does it take so long for them to reach the third-party system?"
What I found out was a little disturbing: payments made after 23:00 are handled manually by people during the next working day and then sent to the Tax and Customs Board the day after that. This of course explains why a payment reaches the next systems over 24 hours later. The day-to-day human interaction was put in place to handle corner-case problems. I also found out that the Tax and Customs Board is trying to modify their own system to accept these corner cases on their end. I personally don't have much faith in it. I see various problems, like how to separate faulty data with enough certainty, or whether the owners of the first system (the banks) are willing to give out all the necessary information about their system's behavior in sufficiently high detail.

Tuesday, June 15, 2010

How to hide problems in software behind business rules (and what happens afterwards). Case Study

It all started at the end of May 2010, when I found a strange bug (from my point of view) in the self-service system of one of Estonia's biggest banks.

I tried to pay my taxes to the local Tax and Customs Board and noticed that the money wasn't transferred within the time I expected (later I found out it took 2 banking days to arrive). So I started checking what was going on. I found that payments made after 23:00 local time got a timestamp 24 hours in the future. For example, the payment order for a payment made 25.05.2010 at 23:40 got the timestamp 26.05.2010 23:40. So I enquired with the bank and got a nice reply (translated and shortened):

"Thank you for your letter. According to our terms and conditions of account management we quarantee in-bank transfer within 1 hour though usually it takes less. So the money is transfered to another account but has timestamp for the next day of real transfer".

So I pointed out that the money wasn't transferred during the next working day either. After that I got a reply stating: "It is not a bug. According to our terms and conditions, payments forwarded after 10 p.m. shall be executed immediately, but the amount shall be expressed in the account statement of the customer under transactions of the following day".

Okay, I thought. The bank says it's not a bug. So did I really make a mistake this time... It doesn't feel right...

I started enquiring with the Tax and Customs Board. They replied that the amount in question wasn't logged into their system until the morning 2 days later, and according to their information the transfer didn't come through immediately. If I made a payment during the day, the transfer went through the same day, so the issue couldn't have been on their side.

I forwarded that message to the bank, but their reply didn't change much. They still referred to the terms and conditions.

So I drew my conclusion from the information I possessed: because their banking day doesn't match the astronomical day and they couldn't make the two match properly, they applied a business rule that allows this bug to remain (now called a feature). However, now that third-party systems are getting data from their system, something goes wrong because of this feature. My best guess is that due to the wrong timestamp the information isn't forwarded at the correct time, but instead at the time marked on the payment order. Hopefully I'll find out the real reason soon and can close this subject in my mind.
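As a purely hypothetical illustration of that guess (I know nothing about the bank's actual code), a naive banking-day rollover along these lines would produce exactly the timestamps I saw:

    from datetime import datetime, timedelta

    # Hypothetical sketch only: the 23:00 cutoff and the +24h shift are
    # inferred from the observed behavior, not from the bank's real system.
    CUTOFF_HOUR = 23

    def banking_timestamp(real_time: datetime) -> datetime:
        """Stamp a payment with its 'banking day' time."""
        if real_time.hour >= CUTOFF_HOUR:
            # Naive rollover: shift the whole timestamp a day forward
            # instead of only booking it under the next day's statement.
            return real_time + timedelta(days=1)
        return real_time

    payment = datetime(2010, 5, 25, 23, 40)
    print(banking_timestamp(payment))  # 2010-05-26 23:40:00

A downstream system that trusts such a timestamp and forwards the payment at the stamped time would then hold it for an extra day, which matches what I observed.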

However, I must admit: nice idea, using business processes and rules to turn a bug into a feature. I've seen it before in other systems, and it works pretty well. But it might come back to haunt the system when another system needs the correct value instead of the value inserted by the "feature".

PS. I also found a mistake in the Tax and Customs Board's system while looking for this specific one. It's slowly getting out of hand... finding bugs in almost every system I use...

Friday, May 14, 2010

Bugged software checking bugged software for bugs

Some time ago I had a nice conversation with Madis Jullinen about test automation. We started wondering how business people understand automated testing. If they understand that any software has bugs (why else would they approve testing at all), then how come they still believe that automated tests have no bugs of their own? Let's forget all about the theory of models ("All models are wrong, some are useful") and checking vs testing ("automated tests are only useful for checking") to simplify matters, and concentrate only on automated tests as just another piece of software.

We assume that any piece of software has an unknown number of bugs in it. Testers are there to find some of them (usually as many as possible within whatever time is given). Automated tests are another piece of software. Ergo, automated tests have an unknown number of bugs in them, testers should test them as well, and someone has to fix the bugs that are found.

So I started approaching the subject mathematically (and financially)... To create automated tests, you have to treat it as another development project, with its own analysis, development, testing and support phases (plus some project management to fit it into the timetables). You have to analyze what to automate and how, develop the necessary tests, test whether the tests themselves work properly, and support the automated tests throughout the project lifecycle (since the project may change over time). I added up the numbers: it takes n hours to prepare what to test, m hours to develop it, p hours to test whether it works, r hours to fix the bugs found in it, and s hours per iteration to keep the tests up to date. So in general it takes n+m+p+r hours to prepare the tests and s hours every iteration to keep them up to date. Sadly, most business people see the cost of test automation as only the m hours of developing the tests.
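As a minimal sketch of that arithmetic (every figure below is made up for illustration), comparing the full automation cost against repeated manual regression testing shows when automation actually starts to pay off:

    # Illustrative cost model for test automation; all numbers are invented.
    n, m, p, r = 20, 80, 15, 10   # analysis, development, testing, bug-fixing (hours)
    s = 4                         # maintenance hours per iteration
    manual_per_iteration = 12     # hours to run the same regression suite by hand

    upfront = n + m + p + r       # cost before the suite is usable at all

    def payoff(iterations: int) -> int:
        """Hours saved (positive) or lost (negative) after some iterations."""
        automated = upfront + s * iterations
        manual = manual_per_iteration * iterations
        return manual - automated

    for i in (5, 10, 20, 30):
        print(i, payoff(i))       # -85, -45, 35, 115

With these numbers the suite breaks even only after about 16 iterations; whether that point is ever reached depends on how long the project lives and how often the suite actually runs, which is exactly the ROI question below.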

Another bad trend I've seen lately is the urge to create automated tests simply because it is considered a good thing to do. Why is it a good thing to do? To spare the tester some time on regression testing? How do you know it saves time? Has anyone calculated the ROI to see whether it would even be useful to automate? Why bother writing test automation at all if you are not sure how much time and money it would actually save? The worst case is when testers are required to create automated tests because some procedure dictates that n number of tests must be automated.

I won't say I'm against test automation. But everything should be done not because some procedure says it must be done, but because it actually helps to get better results in the long run. So instead of building one bugged system on top of another bugged system and hoping it will find some new bugs... well... I think it would be wiser to spend that time either actually testing or calculating whether it's worth automating those tests after all.

Thursday, May 6, 2010

Having fun with a computer terminal in a bank

I heard this story from my tester Rasmus. He went to a bank to complain that their web application didn't work properly when he tried logging in with his Estonian ID-card. So the bank worker showed him a local computer terminal to see whether it worked there. He inserted his card into the reader and watched what happened. The program waited a little and then opened a pop-up where you had to enter the card's PIN1 to log in. But... there was no OK or Cancel button... And when he tried to enter numbers into the field, that didn't work either.

So he thought he'd have a little fun (read about that kind of fun on James Bach's blog here and watch here). He removed his card from the reader, hoping something fun would happen. And... the computer started shutting down by itself... oops... the whole system started behaving really oddly just from removing the ID card...

That made me think... I hope they are not testing all their systems the way they tested this terminal... Who knows what else he could have stumbled upon if that computer hadn't crashed completely...

Test Camp: puhkaeestis.ee

Another month had passed, and it was time to call for a Test Camp again. This time I tried to do a little more: add more people and more different approaches. One of the participants from the previous time (Madis) recommended a very interesting application to test: a web application without a database. The date of the event was agreed upon as 17.04.2010.

I got permission to use this application (www.puhkaeestis.ee) and started putting together the participant list. Sadly Tanel from Fraktal couldn't come, though he wanted to, but the rest of the group were Oliver and Rasmus from Celeg Hannas, and Madis, Artur, Eero and Polina from Webmedia.

This time we got people with really different approaches and backgrounds to participate: testers, a QA manager, a test manager and even a .NET programmer (Eero). So the results were also quite varied. Eero (the only participant who actually works as a programmer) produced some nice results, and he and Rasmus made a great team, generating ideas and improving on each other: both had programming backgrounds but currently do different types of work, and thus think differently.

Ideas were generated so fast that Madis had a hard time keeping up the pace when everyone was asking additional questions. Again, the average rate of issues found per participant was around 2 per hour. Nonetheless it was fun, and Webmedia is thinking of starting similar events in-house as well. Glad to hear that my idea was well received.

Hopefully everyone also learned some new tricks to try on their own projects. At least I know I did.

Thursday, March 18, 2010

Test Camp: Edicy

A few weeks ago I got an idea (a very dangerous thing, especially when you lack time). The idea was to create a sort of camp-out event for testers to practice our skills, learn new ones and get to know each other. After several days of thinking and of discussing the idea with friends I trust (some of whom are testers as well), I decided to organize an event called Test Camp. The idea was to spend 1-2 days in a row on one specific piece of software and try to find as many bugs (or issues, since not all issues are bugs) as possible. Secondly, I thought it would give participants an excellent chance to see how other testers work and think. Thirdly, the idea was to try new methods and techniques of software testing in order to see which ones work better. Adding some coke and pizza would make the event like camping out with computers: kind of a LAN party, except that instead of playing games, we would test.

So I started moving forward with my idea.
First I put together the list of participants. Since it was a first-time event, I didn't want to make it too crowded, so the first list contained 2 of my friends (and also ambitious testers), Rasmus Koorits (tester at Celeg Hannas) and Madis Jullinen (tester at WebMedia), and of course myself.
Then I chose a website-building tool called Edicy, from the company Fraktal, as the first software to be Test Camped (a term I invented to illustrate that the event is like sending software to boot camp). It had gotten some media coverage here and there, and since my company had decided to build our own website on it, I thought it might be a good choice. So I talked with one of the creators (and owners) of the tool for permission to do it. Not only did he agree with the idea, he also suggested that their tester should participate as well. Wow... that was surprising, but okay... so now there were 4 of us. I named the event Test Camp: Edicy to show which software was being Test Camped. In addition, I talked with a development manager from Webmedia to get feedback on the idea. Another wow moment in that meeting... He also thought it was a good idea and suggested adding another tester from Webmedia... This time I had to decline, since 5 people at an event originally planned for 3 would have meant nearly double the participation... I didn't have much time to prepare anyway, so I figured I couldn't handle that many people at that point.

So... the list of participants was locked, the software to test was chosen, the date and time for the start of the event were agreed upon, and the location was fixed. We decided to start on the morning of Saturday, March 13, in my company's office on the edge of the Old Town of Tartu. After 7+ hours we decided to stop, since all of us had rough workweeks behind us and wanted some rest. We were mentally exhausted and literally physically tired from the event. But it was a success... even for me, though I couldn't try out even a third of the new methodology I had wanted to... We found a total of 72 issues in just 7 hours, people got to know each other better, and everyone probably learned several nice new tricks for generating better test ideas.

The Test Camp's verdict on Edicy: it works pretty well. So, Fraktal, keep up the good work!

I already have an idea for the next Test Camp, and hopefully by then I can increase the number of participants. Until next time, and my thanks to Tanel, Madis and Rasmus. Without you the event would never have been as much fun.