Agile Glossary

Acceptance Testing

What is Acceptance Testing?

An acceptance test is a formal description of the behavior of a software product, generally expressed as an example or a usage scenario. A number of different notations and approaches have been proposed for such examples or scenarios. In many cases, the aim is for these tests to be automatable by a software tool, whether built in-house by the development team or obtained off the shelf.

Similar to a unit test, an acceptance test generally has a binary result, pass or fail. A failure suggests, though does not prove, the presence of a defect in the product.
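A minimal sketch of what such an automated acceptance test might look like, using a hypothetical shopping-cart domain (the `Cart` class and the pricing scenario below are invented for illustration). As with a unit test, the outcome is binary: the assertion either passes or fails.

```python
class Cart:
    """Toy domain object standing in for the real product code."""

    def __init__(self):
        self._lines = []

    def add(self, sku, unit_price, quantity=1):
        self._lines.append((sku, unit_price, quantity))

    def total(self):
        return sum(price * qty for _, price, qty in self._lines)


def test_cart_totals_multiple_items():
    # Scenario: a customer adds two products and the cart
    # shows the combined price.
    cart = Cart()
    cart.add("apple", 0.50, quantity=3)
    cart.add("bread", 2.00)
    # Binary result: pass or fail. A failure suggests, though
    # does not prove, a defect in the product.
    assert cart.total() == 3.50
```

A test runner such as pytest would collect and execute this function automatically; the scenario comment is what the customer reviews, the assertion is what the tool checks.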

Teams that are mature in their practice of agile use acceptance tests as the main form of functional specification and the only formal expression of business requirements. Other teams use acceptance tests as a complement to specification documents containing use cases or more narrative text.

The terms “functional test”, “acceptance test” and “customer test” are used more or less interchangeably. A more specific term, “story test”, referring to user stories, is also used, as in the phrase “story test driven development”.

Acceptance testing has the following benefits, complementing those which can be obtained from unit tests:

  • encouraging closer collaboration between developers on the one hand and customers, users, or domain experts on the other, since writing the tests requires that business requirements be expressed precisely
  • providing a clear and unambiguous “contract” between customers and developers; a product that passes acceptance tests will be considered adequate (though customers and developers might refine existing tests or suggest new ones as necessary)
  • decreasing the chance and severity both of new defects and regressions (defects impairing functionality previously reviewed and declared acceptable)

Expressing acceptance tests in an overly technical manner

Customers and domain experts, the primary audience for acceptance tests, find tests that contain implementation details difficult to review and understand. To keep acceptance tests from becoming overly concerned with technical implementation, involve customers and/or domain experts in their creation and discussion. See Behavior Driven Development for more information.
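One way this plays out in practice is a BDD-style test in which the scenario reads in business language and technical details are hidden inside step functions. The following sketch uses an invented banking domain (the `Account` class and the step helpers are hypothetical, not part of any particular BDD tool):

```python
class Account:
    """Toy product code standing in for the real system under test."""

    def __init__(self, balance=0):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount


# Step functions translate business phrases into calls on the product,
# so the scenario itself stays free of implementation detail.
def given_an_account_with_balance(amount):
    return Account(balance=amount)


def when_the_customer_withdraws(account, amount):
    account.withdraw(amount)


def then_the_balance_should_be(account, expected):
    assert account.balance == expected


def test_withdrawal_reduces_balance():
    # Scenario: withdrawing cash reduces the account balance
    account = given_an_account_with_balance(100)
    when_the_customer_withdraws(account, 30)
    then_the_balance_should_be(account, 70)
```

A domain expert can review the scenario body line by line without ever seeing how `Account` is implemented; tools such as Cucumber or behave formalize this split between readable scenarios and step definitions.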

Acceptance tests that are unduly focused on technical implementation also run the risk of failing due to minor or cosmetic changes that in reality have no impact on the product’s behavior. For example, if an acceptance test references the label of a text field and that label changes, the test fails even though the actual functioning of the product is not affected.
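The “fragile test” problem can be sketched in miniature. In the hypothetical example below (the tiny fake page structure is invented for illustration), one lookup is keyed on the visible label, so a cosmetic rewording breaks it, while the other is keyed on a stable identifier that cosmetic changes do not touch:

```python
# A fake "page": one text field with a stable id and a visible label.
page = {
    "email_field": {"label": "E-mail address", "value": ""},
}


def find_by_label(label):
    """Locate a field by its on-screen label."""
    for field in page.values():
        if field["label"] == label:
            return field
    raise LookupError(f"no field labelled {label!r}")


def fragile_lookup():
    # Brittle: if the label is reworded (say, to "Email"), this raises
    # LookupError even though the product still behaves correctly.
    return find_by_label("E-mail address")


def stable_lookup():
    # More robust: keyed on an identifier, immune to label rewording.
    return page["email_field"]
```

Keeping acceptance tests coupled to stable, behavior-level identifiers rather than presentation details is one common mitigation for this kind of fragility.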

Unlike automated unit tests, automated acceptance tests are not universally viewed as a net benefit, and some controversy has arisen after experts such as Jim Shore and Brian Marick questioned whether the benefits of the practice outweighed the following costs:

  • many teams report that the creation of automated acceptance tests requires significant effort
  • sometimes due to the “fragile” test issue, teams find the maintenance of automated acceptance tests burdensome
  • the first generation of tools in the Fit/FitNesse tradition resulted in acceptance tests that customers or domain experts could not understand.

The BDD approach may hold promise for a resolution of this controversy.

Origins

  • 1996: Automated tests identified as a practice of Extreme Programming, without much emphasis on the distinction between unit and acceptance testing, and with no particular notation or tool recommended
  • 2002: Ward Cunningham, one of the inventors of Extreme Programming, publishes Fit, a tool for acceptance testing based on a tabular, Excel-like notation
  • 2003: Bob Martin combines Fit with Wikis (another invention of Cunningham’s), creating FitNesse
  • 2003-2006: The Fit/FitNesse combination eclipses most other tools and becomes the mainstream model for Agile acceptance testing
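Fit’s tabular notation expressed acceptance criteria as rows of inputs and expected outputs. A rough analogue in plain Python is a table-driven test, shown below with an invented discount rule (the pricing thresholds are hypothetical, chosen only to illustrate the row-by-row style):

```python
def discount(order_total):
    """Hypothetical rule: 5% off orders of 100 or more,
    10% off orders of 500 or more."""
    if order_total >= 500:
        return round(order_total * 0.10, 2)
    if order_total >= 100:
        return round(order_total * 0.05, 2)
    return 0.0


# order_total | expected_discount  -- one row per example, as in a Fit table
EXAMPLES = [
    (50,    0.0),
    (100,   5.0),
    (250,  12.5),
    (500,  50.0),
]


def test_discount_table():
    for order_total, expected in EXAMPLES:
        assert discount(order_total) == expected, (order_total, expected)
```

In Fit proper, the table lived in an Excel-like document (or, with FitNesse, a wiki page) that customers could edit directly, while a thin “fixture” class connected each column to the product code.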

For a comprehensive survey, see Automated Acceptance Testing: A Literature Review and an Industrial Case Study.


