Category Archives: Testing

STANZ speaker profile: Karen Johnson

If you’re interested in testing you’ll want to hear what testing experts have to say on the subject. At this year’s STANZ (Software Testing Australia New Zealand conference) there will be presentations from four international testing experts. Each week we’ll tell you a little bit more about each of them. This week we’re introducing Karen Johnson.

About Karen:

Karen is a software test consultant. She is based in Chicago but travels to speak at conferences around the world and to work with organisations planning their test strategy.

She has worked as a software tester or test manager since 1992 after catching the testing bug (pardon the pun) while writing technical guides.

Karen’s testing history is very varied. She has worked with banking, manufacturing and ecommerce software as well as content management systems, medical software and business intelligence initiatives.

As well as teaching and testing, Karen is a contributing author to the book Beautiful Testing, released by O’Reilly publishers. She has published numerous articles and blog posts about her experiences with software testing.


Posted by on August 11, 2011 in STANZ 2011, Testing



Performance agreements when you are on projects

I was happily running a class when someone asked about how performance agreements work if you are on projects.

“Really well,” I replied. “You just need a new one for each project you are on.”

My answer didn’t seem to go down well though. After all, who wants yet another piece of administration to do?

Then we discussed the problem that both permanent staff and contractors often have with projects. Contractors get no real feedback on how they are going until their contract ends, while permanent staff have to have a series of discussions based on a document that bears little relationship to what they are doing on their projects.



Posted by on August 3, 2011 in Business Analysis, Testing



Learn from international experts. For free.

While you are progressing in your I.T. career you want to hear from the smartest people in the industry, right? Here at SoftEd we sell two- and three-day MasterClasses presented by industry experts, but if all you have is a couple of hours you can still benefit from attending a SoftEd-sponsored meetup. Our speakers present at small, community events in Australia and New Zealand throughout the year; the next two talks will be in Brisbane and Sydney.

For Business Analysts in Brisbane

If you work in business analysis you have probably heard of Alec Sharp, author of the bestselling book ‘Workflow Modelling’. He spends part of his time running his consulting business, Clariteq Systems Consulting, and the rest of it conducting workshops and presenting at conferences. Any time that’s left is dedicated to following his favourite ice hockey team, the Vancouver Canucks (and presumably sleeping). On the 28th of July he’ll be in Brisbane presenting at the Agile Academy Meetup Group, talking about Agile Modelling. There has been a lot of change for business analysts over the past few years as companies have moved from ‘traditional’ software development, which includes a huge amount of documentation, to ‘Agile’ processes, which have cut down documentation (sometimes dramatically). If you’re finding it tricky to negotiate between enough detail and information overload, then Alec’s experience, tools and techniques will help you strike the right balance. As a result, your developers will have a better understanding of your project, your stakeholders will be happier and you’ll have an easier life. What could be better? If that sounds good to you, please RSVP to attend.

Alec is running his Advanced Business Process Management course this July in Auckland, Wellington, Brisbane and Sydney and there are still a few places left if you would like to sign up.

For Testers in Sydney

If you work in testing you may have gone through the ISTQB to get certification, or you may have decided that it was not relevant or necessary and gone without, as one well-known testing expert, James Bach, has done. He is quite a controversial figure in the world of testing and a powerful speaker who doesn’t shy away from challenges and rigorous debate. This means his Rapid Software Testing course is great for testers who want to examine what they do and why they are doing it, helping them concentrate their efforts and improve their confidence. James has partnered with SoftEd for years and we regularly bring him out to this part of the world to run courses. If you’re interested in hiring him to train your testers, you can get in touch with us.

James will be in Sydney next Monday presenting at the Sydney Testers Meetup. Keep your eye on this group for future events because they’ll have more guest speakers throughout the year.

What’s next?

Who knows what the future will hold for you and your career? We don’t have any crystal balls, but we do have the phone numbers of international I.T. experts, and if you subscribe to our blog or follow us on Twitter we’ll let you know the next time they’re in town.


Kiwi Software Testers Unite!

Last weekend SoftEd and James Bach hosted the first Kiwi Workshop on Software Testing (KWST) at the SoftEd offices in Wellington. This event is special because of the very specific way it is set up and run (which I’ll discuss below). This particular event was also very special because of all the people who came along (on a Friday and Saturday) and contributed so much to the discussions we had. It definitely felt like the beginning of a conversation about the future of the testing profession, rather than a stand-alone event, and we can’t wait to see what happens next! For more on the content of the KWST, read Brian Osman’s blog post.


So what are the rules? Firstly, it is by invitation only, with a maximum of 20 participants. This is to ensure a wide range of backgrounds and opinions, but also some shared attributes; in this case all of the participants were testing managers, who would have enough shared experience to understand each other but enough unique experience to learn from each other. Keeping a cap on numbers is also helpful because the conversations can go on for a long time. Even with the 20 or so people we had, KWST could easily have lasted two weeks rather than two days!

Hard at work!

Secondly, the facilitation role is essential (massive thanks to Brian Osman for his heroic efforts there!). This is because everyone who attends KWST can make a presentation or deliver an ‘experience report’, and a discussion involving the whole group can then stem from there. There isn’t always time for everyone to give a presentation, but everyone gets the chance to participate in the discussion, and it is not over until everyone is satisfied. One consequence is that a typical event will cover no more than two or three topics at the most.

Facilitation cards

To aid the facilitator, each participant is given a set of cards. If someone wants to contribute to the current discussion they hold up their yellow ‘same thread’ card. To start a new thread on the same topic they hold up their green ‘new thread’ card. If they have something of high importance to contribute they hold up their red ‘high priority’ card, and finally (perhaps most importantly) if the discussion is going off on a massive tangent anyone can hold up their purple ‘rat hole’ card. There is a fantastic blog post on how to run these events which covers more detail, especially about the role of the facilitator. To be honest, if I were to write any more I’d merely be copying what it said, so if you’re interested please follow this link to read it.

Of course you can also include games, plenty of breaks and delicious food to keep the brain active. For even more info on what was discussed at the event, the best thing to do is read the Twitter feeds of some of the participants: James Bach, Brian Osman, Aaron Hodder, Farid Vaswani, Oliver Erlewein, Richard Robinson and Nadine Brown, or you can do a Twitter search for the hashtag #KWST to see all the news! This new event framework has given us lots to think about. If you have any thoughts or ideas you would like to share, please leave a comment; we’d love to hear from you.

See the rest of the photos from the event on flickr


Posted by on June 28, 2011 in Testing



Exploratory Testing is Dead! Long Live Wombat Testing!

During one of my many customer visits last week I was talking to someone who has been a software tester and business analyst for the last six years. We talked about the variety of training courses available and the benefits they provide (letting them know, of course, that we offer lots of great testing courses at SoftEd!).

This customer hasn’t been on any training courses so far in their career, but they did attend our STANZ conference in 2008 and it proved to be an informative experience. Without any specific guidance, their team at work had come up with their own terminology for what they did, such as “Wombat Testing”. This was the name they gave to the practice of ‘burrowing’ through a system looking for bugs. After attending STANZ they realised that what they did had a ‘proper’ name, Exploratory Testing, and that lots of other test teams use it as well and have had great results.

I thought this was interesting for two reasons. Firstly, I’ve heard people say ‘I don’t have time for training’ so many times; however, when people have been able to go on a course or to a conference, we get an overwhelmingly positive response. This was certainly the case for this particular tester. They were in the middle of a big project when STANZ 2008 was on and had to make a case for attending the conference, but because they were successful they not only got to meet other testers with similar war stories but also acquired new skills to improve their “Wombat Testing”. Secondly, I think “Wombat Testing” is a brilliant name – Exploratory Testing is Dead, Long Live Wombat Testing!

(By the way, this is meant to be a story more than a sales pitch but if you do want to know more about STANZ you can visit our website and if you want help making a business case to secure your attendance this year, get in touch with SoftEd!)


Posted by on June 9, 2011 in Courses, Testing



SoftEd and Revolution IT partner up!

Last month Software Education and Revolution IT became partners, which means we can offer our customers the same fantastic software testing and business training courses, now run even more frequently in Melbourne, Sydney, Brisbane, Adelaide, Auckland and Wellington.

Software Education have been training providers for over 20 years and we pride ourselves on having excellent trainers. Our testing trainers consistently score top marks from our customers for both their knowledge and teaching ability. For more information on our testing courses have a look on our website and get in touch with us to book a place.


Posted by on June 7, 2011 in Testing



Rex Black hosts a free webinar this Friday – Testing Metrics

This Friday at 1.30 (New Zealand) or 11.30 (NSW/QLD) Rex Black will be presenting a webinar entitled Testing Metrics: Project, Process and Product. If you use facts and figures to understand your business or your product, you’ll know how useful they can be. The problem is making sure your metrics are correct and useful. In this webinar Rex looks at how we can use metrics for testing, including which metrics we can use to measure the test process, metrics we can use to measure our progress and what metrics tell us about the quality of a product.

Those of you who know Rex will know that he speaks at several conferences and workshops around the world and has written several books on testing. The most popular of his books, “Managing the Testing Process”, has sold over 25,000 copies and has been translated into several languages. He is President and Principal Consultant of RBCS which is a leader in software, hardware and systems testing as well as past president of the International Software Testing Qualifications Board (ISTQB) and the American Software Testing Qualifications Board (ASTQB) and is co-author of the ISTQB Foundation and Advanced Syllabus. Phew! What a long list!

The other thing you might want to know about Rex is where you can do one of his courses. In Australia and New Zealand Software Education are exclusive partners with Rex so we’re the only organisation that run his courses. Drop Paula (NZ) or Bridgette (Aus) from our sales team an email if you’re interested in finding out more information.

This webinar is one of several FREE webinars which Software Education has hosted. We post all of our upcoming and previous webinars on our site. Some of the webinars are run by our partners (like tomorrow’s webinar) and some of them are done independently by us (these happen once every three months). This week Shane Hastie presented a webinar entitled ‘Building and Working With User Stories’. There were around 300 attendees from 14 different countries, including a handful from the U.S., which is even more impressive when you consider that it was about 5 a.m. for them! We have had some great feedback about the session and are really pleased that it has proved useful for people. At Software Education we’re always looking for ways to help our customers and fix their problems, so do get in touch if you want to suggest topics or just tell us what you think in general 🙂

If you attend tomorrow hope you enjoy it (I’ll be logging in as well!) and if not then I hope one of our upcoming sessions is of interest. Thanks.


Posted by on May 26, 2011 in Testing



Government employees eat cake and talk Agile!

Yesterday Nicki Jones and Kim Partridge headed down to the Government Test Professionals Forum at Te Papa to tell all of the attendees about the fantastic courses and conferences coming up at Software Education. It was a busy day with over 100 testers and test managers getting together to hear presentations from several testing course vendors. Our very own Shane Hastie gave the post-lunch presentation about the work that has been done at the Livestock Improvement Corporation to transform their team from working in a traditional ‘waterfall’ way to taking an Agile approach.

Shane Hastie speaking in the main auditorium

SoftEd had a great day out; we always love the opportunity to meet our customers and to keep an ear to the ground for what’s new in the world of testing. Even better, we got to spend the day at the wonderful Te Papa and were able to indulge in their fantastic catering…

There wasn't much cake left by the end of lunchtime!

If you would like to hear about our testing courses, it’s really easy to get in touch and in the meantime you can check out the rest of our photos on flickr and our twitter stream. We hope your day is full of cake too!


Posted by on May 4, 2011 in Testing



Why so little good Agile?

Today I gave a talk at the Government Test Professionals Forum conference in Wellington.

I presented a case study about how the Farm Systems Division of Livestock Improvement Corporation have adopted Agile methods.
I told the story of Team Awesome (they chose the name themselves), how their practices have changed and the measurable benefits that have resulted from their new way of working:

  • Massive reduction in residual defects
  • Increased team satisfaction
  • Shorter time to market
  • Increased customer satisfaction

I told the story of how the team collaborates and how all the roles work together: how testing is fully integrated into the flow of work, and how the whole team (developers, analysts, product manager, testers) coordinates its activities, starting with expressing the detailed requirements for user stories as test design specifications using the Behaviour-Driven Development model. They have adopted Agile well, and gained the benefits that all the books talk about.
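To make that idea concrete, here is a minimal sketch of a user story's detailed requirement written as an executable Given/When/Then specification. The `Account` class and the numbers are invented for illustration, not the team's actual code:

```python
# Toy domain object standing in for the real system under test.
class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount


def test_withdrawal_reduces_balance():
    # Given an account with a balance of 100
    account = Account(balance=100)
    # When the customer withdraws 30
    account.withdraw(30)
    # Then the balance is 70
    assert account.balance == 70
```

Written this way, the detailed requirement doubles as a test: analysts, developers and testers all read the same Given/When/Then steps, and the test fails until the behaviour is actually implemented.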

After the talk I was chatting with one of the attendees, and she said that my story was unusual – her experience as a tester on “agile” projects has been uniformly negative: testers as victims of development teams who use “going agile” as an excuse for hacking and not bothering with requirements.

This saddened me. Surely we know how to apply these practices properly? Don’t management in our organizations understand that agile means applying the practices in a disciplined way, and that you can’t just pick the pieces that seem easy? Agile works when the practices are applied in combination. For example, user stories are a good technique for identifying features and prioritizing the work to be done, but they must then be supplemented by some technique to define the detailed requirements on a just-in-time basis, just ahead of when the story will be developed. You can’t just adopt user stories and ignore the detailed requirements and testing – that is truly Tragile, but it seems to be oh-so-common.

Building software is hard work; building good software takes a concerted team effort, irrespective of the methodology being used. We know this, and have known this for the last 40 years.

Please stop calling these half-hearted implementations Agile – it’s not, it’s just a continuation of the bad practices that many organizations have followed over the years, just with a different brand.


Posted by on May 3, 2011 in Agile, Testing



What to look for in system testing

Previous readers of this blog will have noticed the intellectual firepower and sheer passion of the contributors who share their views on testing.

They will also have noticed the blatant and shameful lack of authority with which one contributor (me) replies to questions and issues around testing.

What you may not be aware of is that I frequently also share my confident but dangerously simplistic views on testing when running courses, talking in the pub and anywhere else.  So hopefully those better educated in the mysterious ways of the tester will continue to correct my wayward advice before anyone follows it.

So, with that opportunity, here is an email request I recently received:

Dear TWG (Testing-wannabe-guru):

On our last day of our BA course, you mentioned several things a tester should test for in system testing.  Would you be able to send me the list?

Yours – BA who believes in quality

So here is my response, subject to correction from the pros:

Dear BA

In system testing we should test for things that might be wrong with the system.  You would think this means the process, the IT applications, the training material and other things that a business person would think are the “system” we are delivering.  But this would be hard and might result in a lot of work when we realise there are issues.

So we normally only test the IT applications and even then we only test three  things:

  1. Test things the developer should have already thought of when writing the code.
  • Since the developer should have thought of these things, we should not need to test for them at all. So you can assure yourself of the system’s integrity by asking the developer “does the system do what we asked you to make it do?” They will generally say “yes” and look a bit shifty.
  • We then ask the developer “how do you know?” and they will respond “I don’t know – it’s the tester’s job to tell you if the system does what you asked me to make it do”.
  2. Test the developer’s patience. You should now ask the developer if they are a proud member of the professional elite of great developers. They will tend to say “yes”. Then you say: “Excellent, then clearly you would have not only made the system do what we asked you to make it do .. but even suggested innovative new things that nobody else could have made it do … Surely that is the difference between the adequate developer and the great one. So, how can I prove to others that you deserve their respect and even their awe?”
  • They will generally look perplexed or annoyed – so ask them “OK, assuming we had great developers then we should only need adequate testers – what would an adequate tester test for to make sure the system works?”
  • Now ignore what the developer says, because this is the stuff they have thought of and it will generally already be working. This is therefore not what we spend time testing in the system testing phase.
  • Testing the developer’s patience should always be included in your testing. Not only is it fun, but you will soon learn who the good developers are (they actually do know the system does the obvious things they were asked to make it do) and this is a good measure of the quality of the overall system.
  3. Test for bizarre things the developer would not think of.
  • Begin with scenario tests – create some examples of how people might use the system. Don’t just look at one requirement; think up an example where the user would want the system to do several things. For example, to test a banking system, don’t test just whether you can log in. Do an example where someone logs in, looks at an account balance, performs a transaction and then goes back and looks at the account balance.
  • Think of some more examples (scenarios) and spend about 70% of however much time you have available running through these examples to see if they work.
  • Now do some “bizarro testing”. This is often called “exploratory testing”. Think of random things that someone might do if they were dumb/distracted/intelligent but did not understand the system/older than normal/younger than normal/in any way different to how a developer would imagine them to be. Spend the other 30% of your time doing these random-seeming things and see what errors and unpleasant outcomes you can uncover.
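The banking scenario above (log in, look at the balance, perform a transaction, look at the balance again) might be automated roughly like this. The `Bank` class and its credentials are invented stand-ins for illustration; a real scenario test would drive the actual application:

```python
# Toy stand-in for the banking system under test.
class Bank:
    def __init__(self):
        self._accounts = {"alice": 500}
        self._sessions = set()

    def log_in(self, user, password):
        if password != "secret":  # toy credential check
            raise PermissionError("bad credentials")
        self._sessions.add(user)

    def balance(self, user):
        assert user in self._sessions, "not logged in"
        return self._accounts[user]

    def transfer_out(self, user, amount):
        assert user in self._sessions, "not logged in"
        self._accounts[user] -= amount


def test_login_balance_transaction_scenario():
    bank = Bank()
    bank.log_in("alice", "secret")        # step 1: log in
    assert bank.balance("alice") == 500   # step 2: look at the balance
    bank.transfer_out("alice", 120)       # step 3: perform a transaction
    assert bank.balance("alice") == 380   # step 4: balance reflects it
```

The point of chaining the steps is exactly the one made above: each step on its own may pass, but it is the combination (session state surviving the transaction, the balance updating correctly afterwards) that scenario testing checks.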

It’s pretty easy really. The only problem is that you will only have a few days available, and you will need about six months to test the system properly. So, oddly enough, what we pay testers for is not “finding bugs” but:

“Spending very limited time to

  • Help the good developers discover the issues and problems they would not have thought of and
  • Let us know if it is likely that the system will suck in production and embarrass the team, or work as designed and make us look good.

I am not sure how they do this with so little time available but apparently this is their job.

Does anyone know how we can do better in system testing than the above approach? It seems quite straightforward and yet appears to be a real challenge on projects.