Value-Driven Delivery - Part 5

The illustration above demonstrates the Gulf of Evaluation. Intangible projects are especially prone to this gulf, where what is said and what is understood can differ.

Ensuring the team is building the right thing, and that it works as intended, is a critical agile practice. Frequently verifying and validating value lets the project converge rapidly on the real solution, even when that solution evolves away from the originally stated requirements. An agile project aims to create value in the form of working software and uses techniques that catch and fix issues early, before they grow larger and move up the cost of change curve. Agile methods rely on multiple levels of deliberate checks, tests, and confirmation activities to surface problems as soon as possible and mitigate the communication challenges common in knowledge work.

 

Exploratory and Usability Testing

Exploratory and usability testing encourage agile teams to analyze the product from both inside and outside, aiming to keep mistakes out of the customer's hands. Independent internal analysis by a tester can reveal issues and unexpected product behavior, while scripted testing helps probe edge cases and define system boundaries. In addition to the team's internal testing, it is essential to consider how the product's users will respond. User-centered interaction design can uncover usability and accessibility issues and add value to the project by acknowledging the user's perspective. Together, exploratory and usability testing practices increase test coverage and reduce the risk of product defects.
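As a small illustration of how scripted tests probe edge cases and system boundaries, the Python sketch below checks values just inside and just outside an assumed valid range. The validate_quantity function and its 1-to-100 limit are hypothetical, standing in for whatever business rule the team needs to pin down.

# Hypothetical boundary tests around an assumed order-quantity rule (valid range 1-100).
import unittest


def validate_quantity(quantity: int) -> bool:
    # Assumed business rule: order quantities from 1 to 100 inclusive are valid.
    return 1 <= quantity <= 100


class QuantityBoundaryTests(unittest.TestCase):
    def test_values_just_inside_the_boundaries_are_accepted(self):
        self.assertTrue(validate_quantity(1))
        self.assertTrue(validate_quantity(100))

    def test_values_just_outside_the_boundaries_are_rejected(self):
        self.assertFalse(validate_quantity(0))
        self.assertFalse(validate_quantity(101))


if __name__ == "__main__":
    unittest.main()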

 

Continuous Integration

XP includes overlapping testing and verification cycles ranging from a few seconds to several months; continuous integration is one of the shorter and more frequent of these feedback loops.

Software teams use continuous integration to frequently incorporate new and changed code into the project's code repository. As a practice, continuous integration calls for small code commits, frequent integration, and automated tooling that checks for code compliance and other considerations. It delivers value at regular intervals by encouraging the team to develop, test, and integrate the functionality of each Prioritized Product Backlog Item (PBI) within every sprint, giving the team an early warning and immediate feedback if something breaks or a merge conflict arises. The following are the typical elements of a continuous integration system.

  • Source code control system - The version control software.

  • Build tools - The build tools that compile the code.

  • Test tools - Unit tests that ensure product functionality operates as expected.

  • Scheduler or trigger - Builds are launched on a schedule or under certain conditions.

  • Notifications - An email or instant message reporting on the results of a build.

Continuous integration encourages frequent commits, which keeps changes to the product small, testable, and safe and makes the commit process quick and painless.
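To make these elements concrete, the Python sketch below imitates what a continuous integration job does on each trigger: fetch the latest code, build it, run the tests, and report the result. The git, compileall, and pytest commands are illustrative stand-ins chosen for this example; a real team would normally rely on a dedicated CI tool rather than a hand-rolled script.

import subprocess


def run_pipeline(repo_dir: str = ".") -> None:
    # Source code control system: fetch the latest commits (assumes a git repository).
    subprocess.run(["git", "pull"], cwd=repo_dir, check=True)

    # Build tools: a byte-compile check stands in for a full build step.
    subprocess.run(["python", "-m", "compileall", "-q", repo_dir], check=True)

    # Test tools: run the automated unit tests (assumes pytest is installed).
    result = subprocess.run(["python", "-m", "pytest", "-q"], cwd=repo_dir)

    # Notifications: printing stands in for an email or instant message.
    status = "passed" if result.returncode == 0 else "FAILED"
    print(f"CI build {status} for {repo_dir}")


if __name__ == "__main__":
    # Scheduler or trigger: in practice a commit hook or schedule calls this; here it runs once.
    run_pipeline()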

 

Test-Driven Development (TDD)

Test-Driven Development, also called test-first development, involves writing automated test code first and then developing the least amount of code necessary to pass that test. The project is broken down into small, client-valued features that can be developed in the shortest possible cycle. Based on the client's needs and specifications, tests are documented and used to design and write the production code. Writing a test that initially fails, adding just enough code until the unit test passes, and then refactoring the code is known as "Red, Green, Refactor" or sometimes "Red, Green, Clean."
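The Python sketch below walks through one red-green-refactor cycle for a hypothetical price_with_tax feature; the feature itself and its 10 percent tax rate are assumptions chosen only to show the rhythm of writing the failing test first.

import unittest


def price_with_tax(price: float, tax_rate: float = 0.10) -> float:
    # Green: the least code needed to make the test below pass.
    # Refactor/Clean: the tax rate was then extracted into a parameter with a default.
    return round(price * (1 + tax_rate), 2)


class PriceWithTaxTest(unittest.TestCase):
    # Red: this test is written first and fails until price_with_tax is implemented.
    def test_ten_percent_tax_is_added(self):
        self.assertEqual(price_with_tax(100.00), 110.00)


if __name__ == "__main__":
    unittest.main()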

 

Acceptance Test-Driven Development (ATDD)

Acceptance test-driven development shifts the focus from the code to the business requirements. These tests are captured in initial conversations with business representatives while discussing the intent of the backlog items. The overall process is outlined in four stages:

  1. Discuss the requirements - Developers ask the product owner or other business representatives questions designed to gather acceptance criteria.

  2. Distill the tests into a framework-friendly format - Prepare the tests for entry into the acceptance test tool (a sketch of such a test follows this list).

  3. Develop the code and run the test - The test initially fails because the code has not yet been written; once the code is written and implemented correctly, the test passes and validates the code.

  4. Demo - With the automated acceptance test scripts in place and passing, the working software is demonstrated.
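The Python sketch below shows what a distilled acceptance test might look like at the end of stage 2 and the start of stage 3. The business rule ("returning customers get a 5 percent discount on orders over $50"), the names, and the amounts are assumptions, and plain unittest stands in for a dedicated acceptance testing framework.

import unittest


def order_total(subtotal: float, returning_customer: bool) -> float:
    # Stage 3: just enough production code for the distilled acceptance test to pass.
    if returning_customer and subtotal > 50:
        return round(subtotal * 0.95, 2)
    return subtotal


class ReturningCustomerDiscountAcceptanceTest(unittest.TestCase):
    def test_returning_customer_gets_discount_on_large_order(self):
        # Given a returning customer with an order over $50
        subtotal, returning = 60.00, True
        # When the order total is calculated
        total = order_total(subtotal, returning)
        # Then a 5 percent discount is applied
        self.assertEqual(total, 57.00)


if __name__ == "__main__":
    unittest.main()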

 