Child pages
  • 2014-01-16 Developers Forum

How to Join

In person

Courtesy, please

If you are joining remotely via telephone, Adobe Connect, or Skype, please use a headset-microphone, or at least earphones. Please use the mute feature when you are not speaking.

Interactive meeting - Adobe Connect

  • We routinely share a screen during the call; you can view it via our Adobe Connect meeting room. For large meetings, the room can also broadcast audio and connect to a telephone-based system, as controlled by the meeting hosts.

By telephone

  • US telephone number: +1-888-510-4073
  • Access code: 24222#

By Browser


  • Chat is available in the Adobe Connect meeting room (see above).
  • A backchannel meta-discussion during the meeting also occurs on IRC.




  • Quickly review previous meeting minutes (5 min)
    • Release Process
    • Testing/Continuous Delivery
  • After-action review & next week's agenda (5 min)


Developers Forum 2014-01-16
  • Lauren Stanisic
  • Michael Downey
  • Roger Friedman
  • Burke Mamlin
  • Jeremy Keiper
  • Ryan Yates
  • Daniel Kayiwa
  • Mhawila Mhawila
  • Elliott Williams
  • Rafal Korytkowski
  • Wyclif Luyima
  • Paul Biondich
  • Chris Power
  • Darius Jazayeri
  • Suranga K
  • Nyoman Ribeka
  • Steve Githens
Agenda & Minutes
  • Review Last Week's TODOs
    • TODO: Need some clear boundaries/goals/stakes for the Order Entry Sprint (ground rules/timelines)
    • Reschedule topics: Road Map & Implementers Wish List (with Chris Power available)
    • TODO: Is there a way to run SonarQube against pull requests (a la Travis CI) rather than commits?
    • TODO: Create a JIRA issue related to dbunit.dataset.NoSuchTableException errors from Bamboo/SonarQube (Darius)
  • Order Entry Sprint Showcase (Wyclif & Burke)
  • Release Process
    • ?
  • Testing/Continuous Delivery
    • QA Recommendations for 2013 were:
      • Emphasize the importance of automated testing at all levels (unit, service, integration, performance). Create a robust testing pyramid
        • Concern that the current testing approach is somewhat cumbersome and focused on component/data interactions
      • Include acceptance criteria review/creation in JIRA curation activities; acceptance criteria and test case scenarios should be mandatory
      • Explore ways to formalize the core<-->module contract using Integration Contract Tests
        • Publish/subscribe contract tests in component pipelines
        • Execute module tests against all supported versions of core; publish test results for community consumption
      • Adopt a scenario-based testing approach to capture high-value paths through the system [especially from implementations]
        • Consider adopting BDD testing tooling; leverage community expertise to increase scenario coverage?
    • QA Recommendations for 2014 were:
      • Manage cycle times to keep feedback loops optimized
        • <5m from commit for unit & component tests
        • <60m from commit for service & integration tests
        • Addition of more deployment targets & test clients may be necessary [parallel execution]
      • Consider automatic inclusion of community modules in the pipeline
    • Goals for CI/CD
      • (stated 2013 CD goals were improvements in Configuration Management, Quality Assurance, and Environments & Deployments)
      • A clear list of community-supported modules and some clearly defined criteria for why modules are or are not in the list
        • Is it just what's in the reference application?
        • TODO: define this process and vet with community (Burke/Darius), along with what gets tested
      • What are the intended targets for CD/CI?
        • Scope: Reference application (API + platform + refapp modules + community-supported modules included in OpenMRS 2.0+)
          • What are community-supported modules?
        • Improving the development process (quicker feedback to devs)
        • Improving the code (reducing defects, increasing quality & reliability)
        • More automation (always increasing the amount of automation)
        • Making "nightly" (current) builds available
          • OpenMRS API jar
          • OpenMRS WAR
          • Each community module
          • NOT currently omod snapshots (other than what is included in the WAR)
        • Deployments included in the CI process
          • Standalone
          • Small system
          • Large system
    • Current issues with CI (action plan)
      • How do we test modules?
      • "Deployments" as part of CI
        • What are we "delivering"?
        • We are deploying to test servers (for automated and manual testing)
        • Formal releases
      • BDD
        • Scripted browser testing
          • WebDriver
          • Selenium
      • Governance (what is included)?
        • Making the decisions/process clear to everybody
    • CI Recommendations for 2013:
      • Migrate from CI and individual builds to a CD pipeline of component and release pipelines
        • Component pipelines for core and individual modules
        • Release pipeline for integration of core and modules
        • Leverage Bamboo automatic plan branching for pipelines
      • Solidify CI practices and incentivize teams to fix broken builds quickly
      • Recognize CI and the pipeline as a production system for releasing software
    • CI Recommendations for 2014:
      • Incrementally add new validations to the pipeline; fail the build for anything that matters
      • Release modules into the OpenMRS module repository upon a successful pipeline run
  • After Action Review
    • What did you expect to happen?
    • What actually happened?
    • What can we do differently?
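The "Integration Contract Tests" proposed above could be sketched roughly as follows: a module publishes the slice of the core API surface it depends on, and a check fails fast when core no longer provides that surface. This is only an illustrative sketch; `CoreApiStub`, `Requirement`, and the method names are assumptions for the example, not real OpenMRS classes.

```java
// Minimal sketch of a core<-->module integration contract check.
// CoreApiStub stands in for a core service class; in a real pipeline the
// check would run against each supported version of openmrs-core.
import java.util.List;

public class ModuleContractCheck {

    // Stand-in for a core service class that a module would call (assumed name).
    public static class CoreApiStub {
        public String getPatient(int id) { return "patient-" + id; }
        public void savePatient(String patient) { /* no-op for the sketch */ }
    }

    // One entry in the module's declared contract: a method name plus the
    // parameter types the module expects core to accept.
    public record Requirement(String method, Class<?>... params) {}

    // Returns true only if every required method exists on the core class
    // with the exact declared signature.
    public static boolean satisfies(Class<?> coreClass, List<Requirement> contract) {
        for (Requirement r : contract) {
            try {
                coreClass.getMethod(r.method(), r.params());
            } catch (NoSuchMethodException e) {
                System.out.println("Contract broken: missing " + r.method());
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        List<Requirement> contract = List.of(
                new Requirement("getPatient", int.class),
                new Requirement("savePatient", String.class));
        System.out.println(satisfies(CoreApiStub.class, contract)
                ? "contract satisfied" : "contract broken");
    }
}
```

Run in a component pipeline, such a check would give a module a fast, versioned signal that a given core release still honors its contract, in line with the recommendation to execute module tests against all supported core versions.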



  • Backchannel IRC transcript
  • Audio recording of the call: Listen online or download - available after the meeting