Our Agile Process
We executed this challenge with an agile approach, delivering working software every four hours and building in the ability to respond to change. Initially the project was planned over a 4-day period, with 2 sprints and 1 release each day. We added two sprints (Sprints 8 and 9) at the end to increase the usability of the prototype.
We had a part-time agile coach to help guide our process and encourage whole-team collaboration.
Sprint Zero
Before we began core development, we spent time in our Sprint 0: setting up our development environment (including a continuous delivery infrastructure) and going through ideation and human-centered design techniques to understand the value we could create and ultimately decide what to build. We also ensured we had a cross-functional team and a productive co-located space to work in.
Releases 1 through 4 took place over a period of 4 days (2 sprints per day), with a release every day. Release 5 took place over a period of 2 days (two 8-hour sprints), with only front-end developers and a usability tester working these sprints. The purpose of Release 5 was to increase the usability of the site based on feedback from usability testing.
Release 1 (Day 1)
- Sprint 1 (4 hours)
- Sprint 2 (4 hours)
Release 2 (Minimum Viable Product) - Day 2
- Sprint 3 (4 hours)
- Sprint 4 (4 hours)
Release 3 (Day 3)
- Sprint 5 (4 hours)
- Sprint 6 (4 hours)
Release 4 (Day 4)
- Sprint 7 (4 hours)
Release 5 - front-end development only, to increase usability based on usability testing
- Sprint 8 (8 hours)
- Sprint 9 (8 hours)
As we executed, we followed a simple pattern:
- Release Planning - Plan each day as a release
- Sprint Planning - Plan and execute two 4-hour sprints per day, using whole-team estimation and buy-in
- Sprint Demonstration - Demo each Sprint's output, receiving feedback from the team and interested parties
- Retrospective - Conduct a retrospective after each release, to adapt our work for the next release
Work was captured as user stories (expressing a need and value) and broken down into technical tasks. We estimated each task and captured the planned user stories for each sprint. We also captured the velocity of each sprint and used it as an input to planning the next one.
User stories are archived on our wiki here: https://github.com/booz-allen-agile-delivery/ads-final/wiki/User-Stories
Team Velocity and User Stories Completed
## Agile Practices Used
- Ideation/visioning session
- Pair programming and peer review
- Kanban board (Physical)
- Sprint planning
- Whole-team planning
- Timeboxed iterations (4 hrs)
- Sprint review
- Co-location/Osmotic communication
- Release planning and review
- User stories
- Prioritized backlog
- API-driven development
- Wireframes/Mockups
- Human-Centered Design (incl. user research, brainstorming, dot-voting, wireframes/mockups, usability testing)
- Regular user-feedback
- Cross-functional team
- Retrospectives (after each release)
- Frequent demos
- Personas
- Product Owner
- Roadmaps
- Agile estimation (story points), whole-team estimation
These activities took place prior to beginning primary delivery efforts:
- Ideation - activities to decide what to build and how we could deliver value
- Establishing continuous integration/delivery infrastructure
- Establishing development environment
- Assembling the team
- Finding the co-located space
- Establishing an initial backlog
One of the back-end developers works with the DevOps team to diagnose an issue with the deployment. The agile Scrum wall can be seen in the background.
In the background, the front-end designer consults with one of the subject matter experts, a 25-year veteran of the FDA, who stopped by to offer suggestions.
The team votes on a name for the product by placing colored stickies onto the product names of their choice.
One of the front-end developers and the DevOps team watch as the back-end team demonstrates one of the new features.
Schedule
The daily schedule posted in the team room, depicting four-hour sprints followed by all-hands demonstrations and retrospectives.
- 8:30 AM Breakfast
- 9:00 AM Sprint 1 begins
- 1:00 PM Demo and Retrospective
- 1:00 PM Lunch
- 2:00 PM Sprint 2 begins
- 6:00 PM Dinner and Retrospective
The team worked on the plan for how to get started on the first day.
Primary target for the first sprint: User story 001: As a consumer, I want to search and select a drug so that I can see more information about that drug.
Planned for this sprint:
- UI for search (2 pts)
- Structure a service to access the FDA API (2 pts) (see the sketch after this list)
- Mockups/styles for Search & Select (2 pts)
- Complete the Digital Service Playbook checklist (1 pt)
- Define git flow (1 pt)
Total planned: 8 points
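As background for the "Structure a service to access the FDA API" task: our drug search wrapped the public openFDA drug label endpoint. Below is a minimal sketch of that kind of query, in Python purely for illustration; the endpoint and field names follow the openFDA documentation, while the helper name is hypothetical.

```python
# Minimal sketch of a drug-label lookup against openFDA (illustrative only;
# our actual service was built into the application back end).
import requests

OPENFDA_LABEL_URL = "https://api.fda.gov/drug/label.json"

def search_drug(brand_name):
    """Return the first openFDA label record matching a brand name, or None."""
    params = {
        "search": f'openfda.brand_name:"{brand_name}"',  # openFDA query syntax
        "limit": 1,
    }
    resp = requests.get(OPENFDA_LABEL_URL, params=params, timeout=10)
    if resp.status_code == 404:  # openFDA returns 404 when nothing matches
        return None
    resp.raise_for_status()
    return resp.json()["results"][0]
```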
Significant events
- During this sprint, we started exploring a new way we could crowdsource keeping drug labels up to date; ongoing conversation and design sessions will drive some future stories
At the Sprint 1 Demo:
- User Story 001 done
- Drug search bar works: you can type into it, and it brings up a drug and a brief description
- Running locally; not deployed yet
- Completed stories:
- UI for search (2 pts)
- Structure a service to access FDA API (2 pts)
- Mockups/styles for Search & Select (2 pts)
- Complete the Digital Service Playbook checklist (1 pt)
- Incomplete stories:
- Define git flow (1 pt)
Feedback:
- Approach Sprint 2 with test-first methods/have more unit tests
- Run Sprint 2 demo on a "Release" environment
Completed 7 points out of the 8 points forecast
User story 002: As a consumer, I want to see a list of reported adverse effects for a selected drug so I can understand the risk of taking the drug.
Stories planned:
- Search by brand name for adverse events (1) (see the sketch after this list)
- Populate list of adverse effects (2)
- Fix search UI bugs (3)
- Unit tests for UI (1)
- 3 environments (3)
- Napkin usability (2)
- Wire for "napkin" (dependency) (2)
Forecasted a total of 14 points for this sprint
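The adverse-event lookup behind user story 002 used openFDA's drug event endpoint, which can aggregate reports by reaction term. A hedged sketch follows (Python for illustration; the count-query syntax is from the openFDA docs, the helper name is ours, and the default limit of 50 mirrors the "return top 50" revision we made in a later sprint).

```python
# Sketch of the adverse-event lookup behind user story 002 (illustrative;
# the real call lived in our application back end).
import requests

OPENFDA_EVENT_URL = "https://api.fda.gov/drug/event.json"

def top_adverse_effects(brand_name, limit=50):
    """Return the most frequently reported reactions for a brand-name drug."""
    params = {
        "search": f'patient.drug.openfda.brand_name:"{brand_name}"',
        "count": "patient.reaction.reactionmeddrapt.exact",  # aggregate by reaction
        "limit": limit,
    }
    resp = requests.get(OPENFDA_EVENT_URL, params=params, timeout=10)
    if resp.status_code == 404:  # openFDA's "no matches" response
        return []
    resp.raise_for_status()
    # Each entry looks like {"term": "NAUSEA", "count": 1234}.
    return resp.json().get("results", [])
```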
At the Sprint 2 Demo:
- Search by brand name for adverse events (1)
- Populate list of adverse effects (2)
- Fix search UI bugs (3)
- Unit tests for UI (1)
- 3 environments (3)
- Napkin usability (2)
- Wire for "napkin" (dependency) (2)
Completed 14 points
Positive:
- Lunch
- Coffee
- Build pipeline
- Accommodate scope change
- SME involvement
- Forecasts worked
- People spoke up
- Demos
- 4 hour iterations
Negative:
- Missing key resource (sick)
- Pivoting conversation may have been interruptive
- Documentation artifacts perhaps behind
Change:
- Start promptly at 9 AM
- Review documents Wednesday afternoon
- Get desks, monitors (Nico)
- Get cleanup
We planned our overall release, along with some additional functionality that would likely come along on Day 3. Prioritized, this became our guiding plan:
Focus for Sprint 3: making our Day 1 progress smoother, better-looking, and easier to use, and beginning progress on the crowdsourcing user story, primarily on the back end.
Graphical refinement for user stories 001 & 002 (landing page, search, list of reported effects)
- Enable search on enter/select (1)
- Filter autocomplete to reduce duplicates (2)
- Review site text, title (2)
- "How to be agile" diagram (2)
- Revise reported adverse effect API to return top 50 (1)
User story 004: As a crowdsourcing user, I want to record adverse effects mentioned on the label, based on a scan/preselection of the label text, using simple clicks, so that I can contribute to the value of the database in this tool.
Tasks planned:
- Graph mockup (1)
- Crowd sourcing wires (2)
- Database/schema for storing label adverse-effects data (1) (see the sketch after this list)
- Describe the FE-BE API for accessing suggested adverse effects from the label (2)
- API for submitting an adverse effect from the label (2)
- Implement search of label text, return suggested (2)
- Rails models and APIs for label adverse effects (1)
Forecasted 19 points
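The database and submission-API tasks above boil down to persisting each adverse effect a crowdsourcing user confirms from a label. Our back end was Rails; the sketch below uses Python and SQLite purely to illustrate the shape of the data, and every table and column name is hypothetical rather than our actual schema.

```python
# Illustrative-only data model for crowdsourced label adverse effects
# (the real implementation was Rails/ActiveRecord; all names are made up).
import sqlite3

conn = sqlite3.connect("crowdmed_sketch.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS label_adverse_effects (
        id INTEGER PRIMARY KEY,
        drug_brand_name TEXT NOT NULL,   -- drug whose label was scanned
        effect_term TEXT NOT NULL,       -- adverse effect found in the label text
        confirmations INTEGER DEFAULT 0  -- crowdsourced "yes" votes so far
    )
""")

def submit_effect(brand_name, term):
    """Record one crowdsourced adverse effect submitted from a drug label."""
    conn.execute(
        "INSERT INTO label_adverse_effects (drug_brand_name, effect_term) "
        "VALUES (?, ?)",
        (brand_name, term),
    )
    conn.commit()
```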
At the Sprint 3 demo:
- Completed 17 points:
- Graph mockup (1)
- Crowd sourcing wires (2)
- Database/schema for storing label adverse-effects data (1)
- Describe the FE-BE API for accessing suggested adverse effects from the label (2)
- API for submitting an adverse effect from the label (2)
- Rails models and APIs for label adverse effects (1)
- Incomplete stories:
- Implement search of label text, return suggested (2)
Primary target for the fourth sprint: continuing work on User story 004: As a crowdsourcing user, I want to record adverse effects mentioned on the label, based on a scan/preselection of the label text, using simple clicks, so that I can contribute to the value of the database in this tool.
Tasks planned:
- Implement search of label text, return suggested (2) (see the sketch after this list)
- UI to show label text & suggested adverse effects (needs wire) (2)
- Submit effects from user to backend (2)
- UI for submitted state (1)
- UI in place for text submission (1)
- Chart add seriousness (2)
- Persona for crowdsourcing foundation/address (1)
- Add generic bottle picture (1)
- Review site text, title (3)
- Mockup (HiFi) crowdsourcing (2)
Documentation planned:
- Continuous integration details (1)
- Digital service play evidence (1)
- Continuous deployment details (1)
- Digital services check list (1)
- Continuous delivery details (1)
- End-to-end continuous delivery with tools and products - draft update (1)
Forecasted 24 points
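The label-scan task amounts to matching known adverse-effect terms against the sentences of a label. A rough sketch of the idea, assuming a precomputed term list (the term list and function name are illustrative; the real implementation was part of our Rails back end):

```python
# Hypothetical sketch of the label scan: split the label into sentences and
# suggest any known adverse-effect terms found in them.
import re

KNOWN_EFFECTS = {"nausea", "dizziness", "headache", "rash"}  # sample terms only

def suggest_effects(label_text):
    """Yield (sentence, matched_terms) pairs for sentences mentioning effects."""
    sentences = re.split(r"(?<=[.!?])\s+", label_text)
    for sentence in sentences:
        words = set(re.findall(r"[a-z]+", sentence.lower()))
        matches = KNOWN_EFFECTS & words
        if matches:
            yield sentence, sorted(matches)
```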
At the Sprint 4 demo:
- Completed stories:
- Graph mockup (1)
- Crowd sourcing wires (2)
- Database/schema for storing label adverse-effects data (1)
- Describe the FE-BE API for accessing suggested adverse effects from the label (2)
- API for submitting an adverse effect from the label (2)
- Implement search of label text, return suggested (2)
- Rails models and APIs for label adverse effects (1)
- Incomplete stories:
- Implementing the search to return suggested adverse effects from the label text
Start:
- Setup Rails environments
Stop:
- Merges right before demos
Do More:
- Front-end unit tests
- Release the Master
- Front end documentation
- Peer Review
Do Less:
- (nothing)
- Site text, vision, message (content); style guide
- Finish submitting suggested label adverse effects
- Submit a new (text entry) adverse effect
- Ensure responsive design
- Deploy continuous monitoring
- Display crowdsource statistics -> filling pill bottle indicator?
- Report and fix bugs
Primary target for the fifth sprint: continuing to work on user stories 004, 005, and 006.
Planned for this sprint:
- Intro purpose statement (1)
- Site text (3)
- Submit effects from user to BE (rec & test) (2)
- UI + text for submitted state (2)
- Determine responsive design concerns (1)
- Record/Publish CI/test metrics (2)
- Integrate Prometheus (3) (see the sketch after this list)
- Create alerts (1)
Forecasted 15 points
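On the monitoring tasks: Prometheus works by scraping an HTTP metrics endpoint exposed by the instrumented process. A minimal sketch using the prometheus_client Python library (the metric names here are hypothetical, not the ones we actually published):

```python
# Sketch of exposing CI/test metrics for Prometheus to scrape; metric names
# are hypothetical examples, not our actual dashboard metrics.
import time
from prometheus_client import Counter, Gauge, start_http_server

builds_total = Counter("ci_builds_total", "CI builds run", ["result"])
tests_passing = Gauge("unit_tests_passing", "Unit tests currently passing")

if __name__ == "__main__":
    start_http_server(8000)  # serves metrics at http://localhost:8000/metrics
    builds_total.labels(result="success").inc()
    tests_passing.set(42)
    while True:
        time.sleep(60)  # keep the exporter alive between scrapes
```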
Significant events
- During this sprint, we decided to shift from an earlier "checkbox" UI approach to a simple "yes-no" question approach, for a better UX. As a result, we rewrote User Story 006 as User Story 007 and realized we didn't need User Story 005.
- We did not finish creating the continuous monitoring alerts in Sprint 5 (1 pt)
- We did complete some unplanned usability testing on our crowdsourcing feature (2 pts)
- Based on usability feedback, we were able to jump ahead to the (unplanned for this sprint) upcoming task of implementing the simpler "yes/no" UI for our crowdsourcing feature (2 pts)
- Based on usability feedback, we completed a change to our site title, from MineMed to CrowdMed (1 pt)
Primary target for the sixth sprint: continuing to work on user stories 004 and 007.
Planned for this sprint:
- Decide on "help" text (1)
- Show 5 sentences of label text at a time (not the whole label) (3)
- Bold/emphasize a found word from the label text (2) (see the sketch after this list)
- Implement usability recs from testing (2)
- Obscure upcoming questions (2)
- Deploy site text onto site (2)
- Define "progress" or "verified" for crowdsourced data (2)
- Chart mockup sync (2)
- Research alerts (2)
- Make load testing available (1)
- Publish/viz more metrics (2)
- Wiki doc for Human Centered Design (3)
Forecasted 24 points
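The "bold a found word" task is a small text transformation: wrap each matched term in the displayed label sentence in emphasis markup. A hedged sketch in Python (our actual implementation was front-end code; the function name is illustrative):

```python
# Illustrative sketch of emphasizing a matched adverse-effect term in a
# label sentence before display (the real version lived in the front end).
import re

def emphasize(sentence, term):
    """Bold every whole-word, case-insensitive occurrence of term."""
    pattern = re.compile(rf"\b({re.escape(term)})\b", re.IGNORECASE)
    return pattern.sub(r"<strong>\1</strong>", sentence)

# Example: emphasize("May cause nausea or severe nausea.", "nausea")
# -> 'May cause <strong>nausea</strong> or severe <strong>nausea</strong>.'
```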
Primary target for the seventh sprint: continuing to work on user stories 004 and 007.
Planned for this sprint:
- Update "help" text (1)
- Write value proposition on why to contribute (1)
- Write instructions on how to report adverse effects (1)
- Fix progress bar on crowdsourcing chart (1)
- Fix weird dates on the frequency chart (1)
- Define "progress" or "verified" for crowdsourced data (2)
- Overall Page Style (10)
- Publish/viz more metrics (2)
- Create Monitoring Alerts (1)
- Wiki doc for Human Centered Design (3)
Forecasted 23 points
Incomplete Sprint 7 items: None
Primary target for Sprints 8 and 9 (Release 5): cleanup and usability
Forecasted 6 points
- Landing page updates (1)
- Update leaderboard (2)
- Graph: color for icons and legend (1)
- Fix tree graph (2)
All tickets were successfully completed (6 story points).