The Agile software development approach involves a highly disciplined process with Government representation on the team to set priorities and ensure that working software complies with contract and system requirements.
Agencies need to ensure adequate resources are applied to manage their contracts irrespective of the strategy used; Agile software development is no exception. While the process is highly interactive, the overall amount of work is not greater; it is simply applied differently to produce quicker results.
The Government holds contractors accountable for producing working software consistent with the established sprint/release schedule and within budget.
Documentation & Contractor Performance
Key Question: Without having detailed system requirements documented up front, how will the Government ensure it has appropriate documentation and know whether a contractor is performing?
Agile software development prioritizes working software over comprehensive documentation, but that does not mean no documentation exists. Many agile teams use project boards (also known as Kanban boards) to limit the amount of work in progress during a particular sprint. The information on these boards can serve as documentation of agile performance because they depict discrete tasks that need to be completed during the sprint timeline. Agile project managers maintain these boards, taking notes about the work required and linking to project artifacts within these boards.
While additional documentation may be created from these project boards, government acquisition professionals can structure contracts in a manner that permits project boards to serve as the necessary documentation and evidence of performance. Because the Government can monitor progress through these boards, it is clear whether a contractor's performance is on or off schedule.
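To illustrate how a project board limits work in progress, the mechanism can be modeled as a column with a WIP limit that refuses new cards once full. This is a minimal sketch with hypothetical task names, not a prescribed tool:

```python
class KanbanColumn:
    """One column on a project board with a work-in-progress (WIP) limit."""

    def __init__(self, name: str, wip_limit: int):
        self.name = name
        self.wip_limit = wip_limit
        self.cards: list[str] = []

    def add_card(self, task: str) -> bool:
        # Refuse new work once the column is at its WIP limit.
        if len(self.cards) >= self.wip_limit:
            return False
        self.cards.append(task)
        return True

# Hypothetical sprint tasks; the limit forces the team to finish before starting more.
in_progress = KanbanColumn("In Progress", wip_limit=2)
print(in_progress.add_card("Build appointment form"))    # True
print(in_progress.add_card("Write confirmation email"))  # True
print(in_progress.add_card("Add 508 audit"))             # False: limit reached
```

The refused card stays in the "to do" column, which is exactly the discipline that keeps a sprint's scope visible and bounded.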
Under the Agile methodology, contractor performance is monitored through interactions between the product owner, other stakeholders, and the contractor that take place during the development process through various agile rituals such as sprint planning, product releases, demonstrations, and retrospectives. These activities help to break the Product Vision down into a product roadmap, where technical requirements are developed that reflect validated learning, testing, and feedback from users.
These are some general rules of thumb for understanding how contract performance is monitored:
Working software is produced in short iterations (typically every one to four weeks).
Each iteration produces discrete functionality that meets a definition of done (e.g., end user being able to schedule an appointment online).
Multiple iterations form releases which correspond to the product roadmap.
During release planning, sprints are established to create the technical requirements for individual iterations.
Technical requirements are often identified through “user stories” produced by the product owner to identify and prioritize core requirements that are needed to produce the desired functionality.
Agile teams often create minimum viable product (MVP) versions of the software early in the development process. An MVP contains “just enough” functionality to achieve validated learning from project stakeholders through testing, demonstrations, and user feedback.
Features that may be useful, but not essential, are not considered until the MVP is delivered, and can be stored in the product backlog.
Testing is conducted often, at the end of sprints and prior to release, to determine whether software meets the definition of done. If bugs are found or new features are required, they are addressed in a subsequent iteration.
Product backlogs are maintained to ensure that functionalities not included in early iterations, but which are desired by the product owner and within the scope of the Product Vision, are incorporated into future releases.
In each iteration, the Product Owner may evaluate whether working software is responsive by looking at documentation from user stories, acceptance criteria, tasks to be completed to fulfill the “definition of done,” code quality, and standards compliance.
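The relationship among a user story, its acceptance criteria, and the "definition of done" described above can be sketched as follows. This is a minimal illustration; the story title and criteria are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class UserStory:
    """A product-owner story with its acceptance criteria."""
    title: str
    # Each criterion maps to whether it has been verified (tested/demonstrated).
    acceptance_criteria: dict[str, bool] = field(default_factory=dict)

    def meets_definition_of_done(self) -> bool:
        # A story is "done" only when every acceptance criterion is verified.
        return bool(self.acceptance_criteria) and all(self.acceptance_criteria.values())

story = UserStory(
    title="End user can schedule an appointment online",
    acceptance_criteria={
        "appointment saved to calendar": True,
        "confirmation email sent": True,
        "meets Section 508 accessibility checks": False,
    },
)
print(story.meets_definition_of_done())  # False: one criterion is unverified
```

Because "done" is expressed as a checklist of verifiable criteria rather than prose, both the Product Owner and the contractor can see exactly which item blocks acceptance.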
Agency Resources Required for Agile Success
Key Question: FAR 42.302 lists the contract administration functions to be performed by the Government. When performing contract administration, agencies have noted challenges in committing staff to support Agile software development. Is Agile software development feasible given agencies’ limited resources?
Agencies need to ensure adequate resources are applied to manage their contracts irrespective of the strategy used; Agile software development is no exception. While the process is highly interactive, the overall amount of work is not greater. It is simply applied differently, to produce quicker results. For example, instead of spending time reading weekly reports from the contractor, government team members may instead participate in daily standup meetings, sprint planning sessions, and retrospectives to see demonstrated efforts first hand and provide feedback in real time.
In Chapter 3 of its Agile Assessment Guide, GAO notes best practices for team dynamics, activities, and processes for staff. The guide acknowledges that the Agile process works only if there are appropriately dedicated resources who have experience with agile methods. Adequate resources are critical to the highly interactive and disciplined process associated with Agile. This includes a full-time Product Owner, preferably one who can take on the role for the life of the project, and a dedicated cross-functional team consisting of acquisition professionals, designers, analysts, developers, and testers. It is highly recommended to include acquisition professionals with Level II/III FAC-C, FAC-COR, and FAC-P/PM certifications who have graduated from the Digital IT Acquisition Professional Training (DITAP) program and earned a FAC-C Core Plus Specialization in Digital Services (FAC-C-DS) certificate. While agile processes are not rocket science and can be learned on the job, prior training and knowledge help to promote a healthy working relationship between government and contractor.
As stated above, the amount of work required of the government team is not necessarily greater in agile development; it is just applied differently. As the Agile process matures, the amount of administrative work should decrease, especially for the acquisition workforce. The following tips help agencies identify the right people and put the necessary arrangements and agreements in place in advance to ensure success.
Key Question: What level of government resource support would be considered adequate for agile success?
At the outset, instead of focusing on developing detailed requirements and documentation prior to awarding a contract, focus resources on building a cross-functional Agile team and developing norms of team behavior that identify customer priorities aligned with what can be accomplished in time-boxed sprint cycles. This approach allows the Government to produce faster results with greater success.
Identify high-performing individuals with business expertise to fill the role of Product Owner at the earliest possible time and document them in the product charter. These roles should be built into the career growth of top-performing individuals. The CIO, program leadership, and product owners should establish a memorandum of understanding during development of the acquisition strategy to memorialize common agreement on the time commitment and responsibilities of the product owner.
Utilize Agile coaches to look programmatically and systematically at how to increase business value and bring standard industry practices into the team. Agile coaches are experts in Agile processes; their main responsibility is facilitation.
When agencies have limited resources, they may have the same person fill both the COR and Product Owner roles. However, not all Agile teams need to follow this arrangement. Other agencies have had success keeping the Product Owner separate from the COR. This allows the Product Owner's technical expertise to focus on setting priorities, testing, and collaborating with the team on a daily basis, while the COR takes a higher-level view of the success of the awarded contract.
Key Question: Because Agile software development is a fluid process with technical requirements that are refined as part of the process, how can the Government hold contractors accountable in an Agile environment?
Even though a key principle of Agile software development is that working software is the primary measure of progress, contractors are still responsible for meeting cost and schedule goals. The Government holds contractors accountable for producing working software consistent with the established sprint/release schedule and within budget, as well as with any performance standards, service level agreements, or acceptable quality levels stated in the contract (usually in a quality assurance plan). Working software delivered after each iteration should achieve an established "definition of done" and pass any test scripts that the delivery team requires before making the software available to end users.
The Government also holds the contractor accountable by being involved in and managing the Agile process. With Agile software development, the contractor determines the development processes within each cycle and proposes the way the cycle is to be run within the parameters set by the Government. The Government approves the specific plans for each iteration, as well as the overall plan revisions reflecting the experience from completed iterations. The Government holds the contractor accountable for each iteration and provides input on the “definition of done.”
Agile teams should create a product roadmap that describes at a high level the releases required to incrementally build the final software product. The contractor is bound to deliver the functionality required to satisfy each release, and the quality assurance or performance standards established in the contract hold the contractor accountable. The contractor should be responsible for delivering automated tests that demonstrate the viability of their deployed code.
In summary, contractors are held accountable for delivering a final working software product that realizes the Product Vision, satisfying all definitions of done while meeting the performance standards of the contract.
Tracking Contractor Performance
Key Question: Because Agile software development is a fluid process with technical requirements that are refined as part of the process, how can the Government track contractor progress? Are there consequences for situations in which contractors fall behind?
The Government tracks progress by tracking completed work; in Agile, project status is evaluated based on software demonstrations and confirmed by the commitment of working code to a predetermined code repository such as GitHub. Demonstrations and code commits typically happen at the end of each sprint (or release), making it easy to track progress. Recording agile demonstrations, or taking screenshots of code commits for a memo to the contracting file, are tactics that can be used to formally track progress.
If the contractor is not producing the releases with the required features, the contracting officer should use discrepancy reports or other measures to put the contractor on notice and enforce consequences for poor performance. As stated in FAR 34.2, when an Earned Value Management System is required, the EVMS data also should be used to track progress. Here are some other techniques that can be used to track progress and success of Agile teams.
Promoting Successful Agile Development
Here are some techniques that can be used to promote successful delivery and performance when using agile software development.
Use of Performance Metrics. Agile progress and contractor performance should be tracked using metrics. Consider metrics such as cost variance, schedule variance, the throughput and capacity of the team(s), the number of features completed, bug defects and resolution times, and stability of deployed features over time.
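Cost variance and schedule variance are standard earned value calculations (CV = EV - AC; SV = EV - PV). A minimal sketch with hypothetical dollar figures:

```python
def cost_variance(earned_value: float, actual_cost: float) -> float:
    # CV = EV - AC. Positive means under budget; negative means over budget.
    return earned_value - actual_cost

def schedule_variance(earned_value: float, planned_value: float) -> float:
    # SV = EV - PV. Positive means ahead of schedule; negative means behind.
    return earned_value - planned_value

# Hypothetical status: $80k of work earned, $90k spent, $100k planned to date.
print(cost_variance(80_000, 90_000))       # -10000 (over budget)
print(schedule_variance(80_000, 100_000))  # -20000 (behind schedule)
```

In an Agile context, earned value can be derived from completed, accepted stories per sprint, which keeps the calculation tied to working software rather than paperwork.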
Security. Make security a priority early in the process by starting with a hardened system and regression test every day. Implement continuous monitoring in production and ongoing authorization. Use code scanning tools and processes to allow for ongoing evaluation of security posture. Build any new controls during the regular sprints, test frequently, and be ready to deploy. The system should always be in a deployable state.
Quality Assurance. Change the focus from document compliance to process quality evaluation. Create performance standards and record them in the quality assurance plan. Use automated testing software to pick up defects so that vulnerabilities are addressed in the development phases.
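As a minimal sketch of the automated-testing idea, an acceptance test can encode a story's performance standard directly. The `schedule_appointment` function below is a hypothetical stand-in for the system under test, not a real API:

```python
def schedule_appointment(user_id: int, slot: str) -> dict:
    # Hypothetical stand-in for the system under test.
    return {"user_id": user_id, "slot": slot, "confirmed": True}

def test_end_user_can_schedule_appointment():
    # Acceptance test tied to the story's definition of done:
    # scheduling must return a confirmed appointment for the requested slot.
    result = schedule_appointment(user_id=42, slot="2024-06-01T09:00")
    assert result["confirmed"] is True
    assert result["slot"] == "2024-06-01T09:00"

test_end_user_can_schedule_appointment()
print("acceptance test passed")
```

Run automatically on every commit, tests like this shift quality evaluation from document review to direct verification of the process and its output.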
Service Level Agreements. Service Level Agreements may be used for a variety of purposes such as the conduct of release planning activities and the creation and closure of user stories through completed sprints.
Section 508 Compliance. Accessibility should be addressed from the start, as each deployable increment will have to be Section 508 compliant. This should be done through accessibility testing of each increment and continuous monitoring in production.
It is recommended that a notional quality control plan be submitted with offerors' proposals. This plan should be evaluated to determine whether it will ensure that the performance standards are met. These metrics should cover planning, inspecting, and understanding progress over time, and should correspond with the "definition of done" as proposed in the solicitation. They may include such measures as sprint/release success rates, defect resolutions, time to market, and end user satisfaction.
Tracking Progress During Agile Development
Here are some tips for tracking contractor progress during agile development.
Adhere to Federal Regulations and Guidance. When an EVMS is required per FAR 34.2, the EVMS data will be used to track progress.
Test at every sprint cycle. Testing during software code delivery, instead of after delivery, reduces risk and remediation costs.
Ensure that the “definition(s) of done” is comprehensive and objective. Comprehensiveness includes defining what constitutes a finished product that is packaged, documented, tested, and independently verified. Objective means that it is measurable or verifiable.
Establish and Prioritize Features. In each iteration, features and their prioritization should be established so work can be easily tracked and the most important features completed early in the project.
Utilize burn-up and burn-down charts. Burn-up charts show progress in completing requirements and any changes to the total number of requirements; they show progress toward release completion and project how many of the remaining features will be completed within a given release. Burn-down charts show progress within an iteration by displaying the work remaining against the time left in the iteration.
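The data behind a burn-down chart is simple to compute: the remaining work at the end of each day of the sprint. A minimal illustration with hypothetical sprint data:

```python
def burn_down(total_points: int, completed_per_day: list[int]) -> list[int]:
    """Remaining story points at the end of each day of the sprint."""
    remaining = total_points
    series = []
    for done in completed_per_day:
        remaining -= done
        series.append(remaining)
    return series

# Hypothetical 40-point sprint over a two-week (10 working day) iteration.
print(burn_down(40, [0, 4, 6, 3, 5, 4, 6, 5, 4, 3]))
# [40, 36, 30, 27, 22, 18, 12, 7, 3, 0]
```

Plotting this series against an ideal straight line from 40 to 0 is what makes a slipping sprint visible days before the deadline, which is the chart's value to government monitors.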
Regardless of the contract type, the contractor is still responsible for issuing software releases that meet the Government's requirements as determined at the beginning of the iteration. If features are not included in a release, those features are reprioritized and added to future releases, or disregarded if not needed. The contractor is still required to produce working software at the release dates and adhere to the sprint/release cycle schedule; the actual release's features may differ based on what can realistically be accomplished in the sprint and the Government's priorities.
If the contractor is unable to deliver working software within budget and on schedule, the CO is encouraged to use discrepancy reports and meet with the contractor to determine steps needed to get back on track. This reinforces the need for continual Government involvement on the Agile team to help address the issues.