A blog on Software Testing, Quality Engineering, Tools, Conferences

Developing & Testing – or the other way around?

This week I attended a local meet-up of software testers. Michael Thiele from Saxonia Systems AG gave a talk about his view on testing and developing (original German title: “Testend Entwickeln – Entwickelnd Testen”).

The main message of the talk was that test-driven development on the unit test level didn’t work for him and his team. Instead, they now follow an acceptance-test-driven approach to development. Starting from the requirements in the specification of the software to be developed, one derives the actual test cases that will later serve as acceptance criteria. Michael emphasized that this is already done in code, not in some test case management tool nobody looks at. Of course, the first description of the acceptance tests is pretty high-level, but it is a good basis for preparing automated checks. One defines the various test cases independently of the technologies that will run them. Then the software or user story is implemented and unit tests are written directly afterwards, followed by integration tests if multiple components are involved.

Michael and his team have had positive experiences with this top-down approach to TDD, especially because developers get used to asking about the acceptance criteria instead of inventing their own. It fosters communication between product owner, testers and developers, and it’s a good way to find bugs and gaps in the requirements early.

What is missing in this approach is a good overview of what actually is and isn’t tested. Michael said they had tried to extract human-readable test documentation automatically from the test code. I still think that the behavior-driven approach is the better way to achieve the same goal. With a language like Gherkin you get the test documentation for free and can still benefit from reusing existing test methods, parameterization and so on. It even has good integration into current IDEs.
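To illustrate the point (my own example, not from the talk): a Gherkin feature file reads as plain-language documentation while still driving automated checks through the step definitions behind it:

```gherkin
Feature: Ordering products
  As a customer I want to order products that are in stock

  Scenario: Order a single product that is in stock
    Given a registered customer
    And a product with 5 items in stock
    When the customer orders one item of that product
    Then the order is confirmed
    And 4 items remain in stock
```

The same file serves both as the executable test and as the overview of what is covered.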

Summing up, it was a very good talk about a very interesting topic that is relevant for so many software development teams.

Upcoming: GTAC 2016

The Google Test Automation Conference (GTAC) 2016 takes place in Sunnyvale, California, USA next Tuesday and Wednesday (November 15-16).

When I attended the 2015 edition in Cambridge (Massachusetts, USA), people told me that there would be no GTAC in autumn 2016 but one in early 2017 instead. Evidently, Google decided to stick to the yearly schedule after all. Unfortunately, I didn’t notice that change of plans and therefore missed the registration deadline.

Too bad, because the schedule looks pretty good. These are the talks I find most promising just from their titles:

  • Using test run automation statistics to predict which tests to run
  • Automating Telepresence Robot Driving
  • Need for Speed – Accelerate Automation Tests From 3 Hours to 3 Minutes
  • Docker Based Geo Dispersed Test Farm
  • How I learned to crash test a server

Let’s see what that’s all about. Also, I’m excited because two of my colleagues got their talk proposal accepted. Dan and Alex will talk about automating audio quality tests: “Can you hear me?” – Surviving Audio Quality Testing. Don’t miss that one. I already know it’s cool stuff because they let me watch the rehearsal.

The good thing is that there will be a live stream of the whole event and recordings of each session afterwards. The nine-hour time difference will make it hard for me to follow the whole event live, so I’ll provide my personal view on the best of GTAC 2016 here in a few weeks.

CAST 2016 – Conference Recap

As I mentioned before, I was lucky to be able to personally attend CAST 2016 in Vancouver. What follows is a quick recap of the sessions I liked most in chronological order:

James Bach: “What Catalyzes Testing? Testability!”

Testability is the quality of being easy to test. After that quick and comprehensible definition, the goal was to find out what it really means. So we – the tutorial attendees – were given the following task: find three things that are really hard to test and three that are really easy. Clearly, James Bach wanted us to discover the core elements of testability on our own. We collected many, many ideas, ranging from easy-to-test calculators and text fields to hard-to-test Docker containers and colors. Finally, James gave an overview of his perspective on what he calls “Practical Testability”.

What I learned in this tutorial is that software testability doesn’t only mean having an interface or API to drive and check your application automatically. Among other things, it also covers subjective and project-related aspects. In the end, testers should be testability-focused, and performing a testability analysis regularly can definitely help the project.

Fiona Charles: “Learning to say NO!”

This was a tutorial in support-group style with lots of role playing. These are some of my learnings on how to say NO better: play it back, use “we”, focus on facts, buy time if necessary, suggest alternatives, be honest, keep explanations to a minimum. The most important conclusion: don’t say YES when you mean NO.

Nicholas Carr: “How automation affects our daily lives” (Keynote I)

I’m not sure about the exact title of this keynote, and because there is no recording I wasn’t able to find it out afterwards. The whole presentation was about how automation affects our daily lives. Nicholas Carr stated that software development and testing is also an ethical challenge. He gave many bad examples of too much automation: plane crashes caused by overburdened pilots, car accidents caused by drivers relying on GPS and ignoring warning signs. Instead of relying completely on automation, we need to develop rich human skills accompanied by software automation. Nicholas Carr proposed the following ideas to reach that goal:

  • only automate tasks after human mastery
  • transfer control between computer and operator (example: planes)
  • allow professionals to assess situation before providing algorithmic aid (example: radiology)
  • allow friction for learning
  • don’t hide feedback from automated tasks

These ideas were presented in a very entertaining way. After listening to his keynote I decided to read at least one of Nicholas Carr’s books:

  • The Shallows: What the Internet Is Doing to Our Brains (2011)
  • The Glass Cage: How Our Computers Are Changing Us (2015)
  • Utopia Is Creepy And Other Provocations (Sep 2016)

Sandor Boros: “Embedded testers are not undercover cops”

This was a personal experience report of being a tester embedded in the development team. If you work closely with the dev team, you definitely need deep technical skills. There is, of course, the problem that you need to wear many hats, i.e. act as both developer and quality guard. Sandor also shared his experience with management wanting the embedded testers to act as spies among the developers. During the talk Sandor recommended doing the following if you work – or want to work – as an embedded tester:

  • take responsibility for the product (like every tester?!)
  • make use of your veto right if necessary (you’ll be respected afterwards)
  • 3 amigo meetings: business analyst, developer, embedded tester
  • introduce testing story points

Peter Bartlett: “Create the change you want”

Pete has worked at his company for many years and has seen people change jobs and companies because they felt stuck in what they do. This talk was about how to learn new skills without changing jobs frequently, especially from a tester’s perspective.

As mentioned in Sandor’s talk, working closely with the dev team as an embedded tester definitely improves your technical skill set. If you’re not already doing it, talk to the developers about what you are testing and learn about development and the programming languages in use. Furthermore, if you think you are an expert in a certain field and are not learning anything new, think about coaching or mentoring others. This really helps you to fully understand and master a certain area or skill. Becoming a coach will probably require you to step outside of your comfort zone, but it’ll allow you to grow your knowledge. Finally, self-reflect on a regular basis: compare where you are with where you want to be, and ask yourself which parts of your job you like most and how you can do more of them.

Sallyann Freudenberg: “Neurodiversity & Software Development. Why the Tech Industry needs all kinds of minds and how we can support them”

This was a very good keynote on a topic that I didn’t know much about before. Sallyann Freudenberg quoted several studies that show a high correlation between autism and STEM education and jobs (science, technology, engineering, mathematics). She stated that the software industry in particular already has lots of people with “different minds” and that many big inventions would not have happened without autistic minds. The main message of this keynote: we need all kinds of diversity to get more productive teams, and neurodiversity is a very important part of that.

Some software companies are already addressing this by changing their processes, e.g. by offering separate integration programs in recruiting. Some companies nowadays also understand that the usual office and work environment doesn’t fit everybody.

Nancy Kelln: “Lessons Learned in Implementing Exploratory Testing”

This was an experience report from a test manager who has worked in various projects. Multiple approaches were presented for transitioning a whole team from ‘traditional’ to ‘modern’ testing techniques. This is what I learned from this talk:

  • Training for all testers usually doesn’t last long.
  • If the trainers / contractors leave, everything will go back to normal.
  • Testers need to want to use modern testing techniques. You cannot force anyone to do so.

Recordings I still need to watch

As there were up to four parallel tracks, I couldn’t cover all talks. However, I still plan to watch the recorded sessions that I missed.


I really had a wonderful time at CAST 2016, listening to and discussing so many talks and meeting lots of interesting people! Once again it has shown me that there is a huge software testing community out there and that lots of things are happening that I would really like to participate in. The conference focuses more on the testing profession, tester mindset, and exploratory and context-driven testing; unfortunately, I didn’t attend any talks on testing tools or concrete automation solutions. Let’s see what’s in the recordings.

Next year there will be two CAST conferences, CAST X17 in Sydney and another one later on in the US.

Tests should be easy to read

Yesterday, I read an article about how to unleash the power of test-driven development*. I liked most of it so much that I would like to share and discuss some of its main thoughts. What I liked most were the concepts for writing tests in a way that is easy to read and understand. To ensure the long-term quality of tests, it’s important to follow some conventions regarding partitioning, naming and structure. I totally agree with the authors that the Cucumber project with the Gherkin language and all its tools is a good reference for easy-to-understand test design. Especially the natural language of the test cases facilitates comprehension for readers, who can then dive into step definitions and the actual test code. Even if you’re not willing to use Cucumber / Gherkin in your test solution, you can benefit from its concepts. If you apply its three phases Given / When / Then to existing test cases, even unit tests get better structure and readability. Additionally, the authors point out the following requirements for clarity in tests:

  1. Tests should be short. Each section should only contain few lines of test code.
  2. Each test should only check one scenario of a feature.
  3. Tests should be concise, only showing the relevant aspects and hiding insignificant details.

Especially to reach the goal of the third point, the authors recommend using builder classes and helper methods from a small test DSL. All these recommendations come together in the following example of a readable test case:

public class OrderTest extends TestBase {
    public void test_order_single_product_in_stock() {
        // Given:
        User user = database().persist(a($User()));
        Product product = database().persist(a($Product()));
        // When:
        Order order = orderService().placeOrder(user, product);
        // Then:
        assertThat(order).isAccepted();
    }
}
Having good readability like in this example results in faster understanding of what is actually tested and better maintainability. More tests should be like this one!
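The builder classes and small test DSL the authors recommend can be surprisingly little code. Here is a minimal sketch of the idea (the Product class and every name in it are my own illustration, not taken from the article):

```java
// A tiny test-data builder: sensible defaults hide insignificant details,
// while named methods make the relevant deviations explicit in the test.
class Product {
    final String name;
    final int stock;
    Product(String name, int stock) { this.name = name; this.stock = stock; }
}

class ProductBuilder {
    private String name = "any product"; // irrelevant detail, defaulted away
    private int stock = 1;               // "in stock" unless the test says otherwise

    static ProductBuilder aProduct() { return new ProductBuilder(); }
    ProductBuilder named(String n) { name = n; return this; }
    ProductBuilder outOfStock() { stock = 0; return this; }
    Product build() { return new Product(name, stock); }
}

class ProductBuilderDemo {
    public static void main(String[] args) {
        Product book = ProductBuilder.aProduct().named("book").build();
        Product gone = ProductBuilder.aProduct().outOfStock().build();
        System.out.println(book.name + " stock=" + book.stock); // prints: book stock=1
        System.out.println(gone.name + " stock=" + gone.stock); // prints: any product stock=0
    }
}
```

A test then only mentions what matters for its scenario, e.g. `ProductBuilder.aProduct().outOfStock()`, and everything else stays at a readable default.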

* original article: “Interne Prüfung. Wie Test-driven Development seine Stärken entfaltet.” by Michael Karneim and Oliver Kraeft in magazine “iX”, Vol. 9 / 2016, pages 84-88.

CAST 2016 in Vancouver

Yeah, I’m so excited about attending the software testing conference CAST 2016 in Vancouver. This is why:

  • AST’s program committee has put together promising keynotes and talks with lots of interesting titles like “Embedded Testers Aren’t Undercover Cops” or “Shifting the Testing Role Pendulum”. Let’s see what that’s all about.
  • The conference is not only about cool topics, but the organizers set special focus on discussing the contents and they foster communication among attendees.
  • I registered for a half-day tutorial by James Bach on “What Catalyzes Testing? Testability!” As I have already watched and liked so many videos of him talking about software testing (e.g. this, this and this), I’m excited to see him live.
  • Vancouver is a beautiful city! The last time I was there, I discovered many wonderful places in the city by bike. This time I plan to enjoy the surroundings on a motorbike.

For those of you who are not lucky enough to attend, I recommend the live stream and the video recordings. You will most probably find me writing a conference recap post here in a few weeks.

Gherkin in Confluence: Test Documentation for BDD

Even if you are – like me – not a big fan of creating huge and detailed test plans upfront, there will come a time when somebody from outside your project asks you what you are actually testing. This has happened to me in various “Testing Reviews” before. If you are using Behavior-Driven Development (BDD) with the Gherkin language – and I recommend you do so! – then you can point that person to your test code repository with all the *.feature files for inspection. These files contain all the user stories you test in a simple and human-readable format.

I admit that repository links to a set of feature files might not always be the best way to present your test cases. Depending on the person you are presenting to, you might want to show the different scenarios at a higher level. That is why there are Gherkin parsers available for several programming languages. You could use any of these to read your Gherkin files and produce a nice test documentation from them.

I worked in a project where people were very used to seeing test plans and test documentation in Confluence, a corporate wiki and collaboration tool. So I decided to somehow import the feature files from the test code repository, extract the necessary information and print it on a wiki page. The goal was a simple and fast solution, which is why I shied away from developing my own Confluence plugin. Instead, I wanted to use one of the available script macros for Confluence. With these macros you can include scripts in Groovy, Gant, Jython, BeanShell and server-based JavaScript (Rhino) in your wiki page. These scripts are executed on the Confluence server before the page is delivered. Unfortunately, for this reason you cannot add your own script libraries – for example a Gherkin parser for JavaScript – to the server-side execution engine. Groovy fitted my needs best as it provides easy means to make HTTP requests and parse JSON. Finally, I came up with this script:

def stashBaseURL = ""   // base URL of the Stash / Bitbucket Server instance (left blank here)
def stashREST    = "/rest/api/1.0"
def stashProject = "/projects/RTCGW/repos/rtcgw-automated-testing/browse/features/"

def responseText = new URL(stashBaseURL + stashREST + stashProject).getText()
def json = new groovy.json.JsonSlurper()
def responseObject = json.parseText(responseText)

def i = 0
print "<ul>"
responseObject.children.values.each { file ->
	if (file.path.extension == "feature") {
		def featureDescription = "Feature description not found"
		def scenarioList = []
		// fetch the file content line by line via the browse endpoint
		responseText = new URL(stashBaseURL + stashREST + stashProject + file.path.name).getText()
		def fileContent = json.parseText(responseText)
		fileContent.lines.each { line ->
			if (line.text.trim().startsWith("Feature:")) {
				featureDescription = line.text
			} else if (line.text.trim().startsWith("Scenario:")) {
				scenarioList << line.text
			}
		}
		def scenarioListId = "cucumberScenarios-" + i
		i++
		def fileLink = stashBaseURL + stashProject + file.path.name
		print "<li><a onclick='\$(\"#" + scenarioListId + "\").slideToggle();'>" + featureDescription + "</a> (see <a href='" + fileLink + "' target='_blank'>" + file.path.name + "</a> for details)</li>"
		print "<ul id='" + scenarioListId + "' style='display:none'>"
		scenarioList.each { scenario -> print "<li>" + scenario + "</li>" }
		print "</ul>"
	}
}
print "</ul>"

The result in Confluence looks like this:


Of course, there are multiple things to improve:

  • evaluate or show tags from features and scenarios – so you can, for example, see which test cases are disabled
  • make the resulting wiki page look better
  • support scenario outlines and backgrounds
  • reduce page delivery times, which increase because of the server-side script execution

For my project this simple script got rid of the manual synchronization effort between test code and test documentation. Page delivery is a little sluggish for about 15 feature files with about five scenarios each, but people are used to slow Atlassian products. 😉

If you want to follow a similar approach, please let me know. Maybe this sample script can serve as a basis for you.

I’ll speak at Agile Testing Days 2016

Yeah… my talk proposal for Agile Testing Days 2016 in Potsdam, Germany was accepted. Together with a colleague of mine, I’m going to talk about “Continuous Large-Scale Testing of Real-Time Communications”.

If you are interested in some more details, check out the conference website or watch the short video I made in preparation of the conference.

Why this blog?

What is this blog about?

As I’m interested in new trends and innovative ideas in software testing, quality engineering and tool development, I really like to read other testing professionals’ blogs and to attend testing conferences. Blogs seem to be the major medium for sharing thoughts in this community, which I want to actively participate in. So, here is my own blog featuring my current ideas and thoughts. 🙂

What can readers and followers of this blog expect?

Please don’t expect weekly posts here. I’ll share my thoughts on Software Testing, Quality Engineering, Tools and Conferences once I discover interesting things and if I find some time to write it down.

Why start a blog years after it was cool? 😉

It seems like I wasn’t cool enough back then. 😉
