I Just Want to be “Accepted”, part-1
Have you heard of the “3 C’s” of User Stories? It’s a mnemonic to remind story writers of the three key aspects of a User Story:
- Card
- Confirmation
- Conversation
There’s quite a bit of debate as to what the most important ‘C’ is. Often in my classes I talk about “conversation” or collaboration being the most critical ‘C’. But to be honest, I have a hard time making a priority distinction between the three components of a user story.
In this article I want to explore an area that is often overlooked: the Confirmation ‘C’. I sometimes refer to it as:
- Acceptance Criteria;
- Acceptance Tests;
- Mini-UAT for each story;
- Or Confirmation Tests.
“Acceptance tests” seems to be the most commonly used term. It leads to the notion of ATDD, or Acceptance Test Driven Development, which can be a powerful side effect of how you approach writing your stories.
So let’s start with an introductory example.
Here’s what I would call an Epic with several related epics derived from it. We don’t have any acceptance tests yet, but we’re starting to develop a related set of epic-level stories.
- As a writer, I want to allow for text font changes; 20-30 different font types and colors, so that I can highlight different levels of interaction with my readers
Variations from this “root story”:
- Allow for various attributes: underline, bolding, sub/superscript, italicize, etc.
- Allow for a form of headings; 3 primary levels
- Allow for indenting of text
- Allow for lists (numbered and bulleted); single level first, then move to multi-level
- Allow for alignment – right/left justified, centered, variable
- Allow for do/un-do to include ongoing text activities
- Establish a paragraph model (or a variety of models)
- Show/hide ‘hidden’ formatting marks
- Establish the notion of a “style set” that can be used to establish a collection of favorites
Let’s expand upon the second Epic:
As a Writer, I want to allow for various attributes: underline, bolding, sub/superscript, italicize, etc., so that I can highlight different levels of interaction with my readers
We’ll start writing acceptance tests for this story. I have a preference for using “Verify that…” phrasing when writing my acceptance tests.
- Verify that underline works
- Verify that bold toggles for all font / color types
- Verify that all combinations of attributes can be applied together
- Verify that font size changes do not impact attributes
- Verify that paragraph boundaries are not affected by attribute changes
- Verify that attributes carry over into pre-text and post-text; for example, if we bold the text of a numbered list item, the number should be bolded as well
You’ll notice in this case that the acceptance criteria are all functionally focused. I don’t think that’s necessarily bad, but it would be nice to include some significant error cases as well. For example, let’s say that sub/superscript is not allowed in headers and footers for some reason. Then I’d expect the following acceptance criterion to be added to the list:
- Verify that super & sub script are not allowed in Header or Footer areas and that an error message is displayed in-line and on the error console
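If your team practices ATDD or BDD, these “Verify that…” statements translate almost directly into automated acceptance tests. Here’s a minimal sketch in Python with pytest to show the shape of that translation. To be clear, the SimpleEditor class below is a hypothetical stand-in I’m inventing purely for illustration; it isn’t the editor from the example above, and a real word processor would look very different.

```python
# A minimal sketch only. SimpleEditor is a hypothetical, in-memory stand-in
# for a rich-text editor, invented to illustrate how "Verify that..." criteria
# can become named, executable acceptance tests.
import pytest


class SimpleEditor:
    """Tiny stand-in for a rich-text editor (illustrative assumption)."""

    def __init__(self):
        self.attributes = set()   # currently active text attributes
        self.region = "body"      # "body", "header", or "footer"

    def toggle(self, attribute):
        # Mirrors the error case from the story: no super/subscript in header/footer.
        if attribute in ("superscript", "subscript") and self.region != "body":
            raise ValueError(f"{attribute} is not allowed in the {self.region}")
        if attribute in self.attributes:
            self.attributes.remove(attribute)
        else:
            self.attributes.add(attribute)


# "Verify that bold toggles..."
def test_bold_toggles_on_and_off():
    editor = SimpleEditor()
    editor.toggle("bold")
    assert "bold" in editor.attributes
    editor.toggle("bold")
    assert "bold" not in editor.attributes


# "Verify that super & sub script are not allowed in Header or Footer areas..."
def test_superscript_rejected_in_header():
    editor = SimpleEditor()
    editor.region = "header"
    with pytest.raises(ValueError):
        editor.toggle("superscript")
```

The implementation isn’t the point; the point is that each “Verify that…” line becomes a named, executable check the team can run every time the story moves toward ‘Done’.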
I hope you see the clarity and value that solid acceptance tests can bring to your story writing. I always describe them as helping the 3 Amigos who are collaborating around story writing:
- From a Development perspective: they should share design hints with the developer(s), exploring what’s important and the business logic behind each feature. They should capture non-functional requirements as well, for example performance requirements.
- From a Testing perspective: they should share some of the ‘How’ and ‘Why’ behind the customers’ usage and their intentions. The tester(s) should use this information to construct a series of tests that exercise the most important bits surrounding customer value.
- From a Product Owner perspective: they are a rich communication landscape to augment the ‘C’ard of the user story. Typically the PO writes them in a grooming session with their team—so they are collaboratively explored and defined. They also serve as an acceptance checklist when the team delivers a ‘Done’ story for Product Owner sign-off.
This combination of roles (perspectives) surrounding the acceptance criteria helps to ensure the customer deliverable meets the need AND that you have a rich set of “tests” to confirm it.
Wrapping up
In part two of this article, we’ll explore how acceptance tests help shape your agile user stories, aspects of meta-acceptance, and an example of acceptance tests for a “technical” User Story. I hope you stay tuned for that.
BTW: I wrote another blog post focused on the 3 Amigos. You can check it out here.
Till then, stay agile my friends,
Bob.