
How to Join


In person

Courtesy, please

If you are joining remotely via telephone, Adobe Connect, or Skype, please use a headset with a microphone, or at least earphones, and use the mute feature when you are not speaking.

Interactive meeting - Adobe Connect

  • We routinely share a screen during the call. You can view the screen via our Adobe Connect meeting room at http://connect.iu.edu/omrsdf. For large meetings, the meeting hosts can also broadcast audio from the room and connect it to a telephone-based system.

By telephone

  • US telephone number: +1-888-510-4073
  • Access code: 24222#

By Browser

Chat/IRC

  • Chat is available in the Adobe Connect meeting room (see above).
  • A backchannel meta-discussion during the meeting also occurs on IRC.

 

Agenda

 

  • Quickly review previous meeting minutes (5 min)
  • Latest on improving QA processes and practices (CI/CD related testing, Bamboo, SonarQube, etc.). How can we "move up the stack" regarding quality assurance?
  • After-action review & next week's agenda (5 min)

In Attendance

  • You

Minutes

View at notes.openmrs.org

Developers Forum 2014-05-29

 

 

 

Attendees

  • Michael Downey
  • Chuck (Suranga) Kasthurirathne
  • Ryan Yates
  • Elliott Williams
  • Burke Mamlin
  • Wyclif Luyima
  • Nyoman Ribeka
  • Vaclav Krpec
  • Geoff ________________________________
  • Joseph Kaweesi
  • Willa
  • Rafal Korytkowski
  • Darius Jazayeri
  • Daniel Kayiwa
  • Shaun Grannis

 

 

 

Agenda & Minutes

 

  • Review last week (5 min)
    • No TODOs from last week
  • Latest on improving QA processes and practices
    • Late 2012-early 2013 assessment. Presentation details:
      • CI/CD related testing, Bamboo, SonarQube, etc.
      • SonarQube: automated analysis of code
        • Downey: SonarQube currently runs only on staging; we're not sure of its status or usage. (No one is doing much complaining.) The infrastructure team has no information on how it was built and needs that information to move it to production.
        • Rafal & Ozge worked on this in 2013, and Rafal has some details. A follow-up meeting with Rafal and the infrastructure team is planned; after that, we may try to track down Ozge or others.
        • It's still installed on the staging server, but probably didn't get restarted after a reboot. Without the setup details, we don't know how to bring it back up.
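Since the original setup details were lost, it may help to note how SonarQube analysis is typically wired into a Maven build. The sketch below uses the standard `sonar-maven-plugin` goal; the server URL and token are placeholders (assumptions), not the actual OpenMRS staging configuration.

```shell
# Hypothetical invocation: run SonarQube analysis as part of a Maven build.
# The host URL and token are placeholders, not the real OpenMRS settings.
mvn clean verify sonar:sonar \
  -Dsonar.host.url=https://sonarqube.example.org \
  -Dsonar.login="$SONAR_TOKEN"
```

Running this goal from a CI plan (e.g., Bamboo) rather than by hand would make the analysis setup reproducible, so it is not silently lost after a server reboot.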

 

  • Acceptance Criteria
    • Can we create a culture where tickets aren't Ready for Development until acceptance criteria are defined?
    • ThoughtWorks suggested this be part of our triage process
    • The criteria can be listed as 1+ subtasks under a JIRA issue
    • TODO: A dev forum on "Acceptance Criteria"
  • Behavior Driven Development / Test Driven Development
  • Code Review
    • Who does code review?
    • How could 10x as many people be doing code review 2 years from now? What would that take?
    • TODO: Automate basic code review via Travis
    • Automated Code Analysis (SonarQube) +1
    • Profiling
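One way to read "automate basic code review" is to run style and static checks on every commit, so human reviewers can focus on logic. A minimal sketch of the kind of commands a Travis build could run; the choice of the Maven Checkstyle plugin is an assumption for illustration, not a tooling decision recorded in this meeting.

```shell
# Hypothetical CI steps for automated "basic code review" on each commit.
mvn clean test          # run the unit test suite
mvn checkstyle:check    # fail the build on coding-style violations
```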

 

 

 

  • Unit Tests
  • Regression Testing (e.g., Travis)
  • Integration Tests +1
    • Core vs. Modules
  • Scenario-Based Testing
    • Platform +1
  • Release Testing
    • User Acceptance Testing
    • Performance Testing
    • Deployment Testing +1
      • Database Installation Testing
      • Target Environment Testing (e.g., version of Tomcat)
      • Upgrade Testing
      • Deployment Use Cases (e.g., small, medium, and large)
  • Thoughts/Notes
    • Need to think about CI test coverage of the OpenMRS Platform as well as the Reference Application
    • For example, web services has unit tests, but not necessarily integration tests of REST API calls (e.g., create patient)
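The "create patient" example is the sort of call a REST integration test could exercise end to end. A hedged sketch using curl against the OpenMRS REST web services module; the base URL, credentials, and UUIDs are placeholders, and the payload shape follows common REST module examples but may vary by version.

```shell
# Hypothetical REST integration check: create a patient via the web
# services module. Credentials and UUIDs are placeholders, not real
# configuration; the payload shape is an assumption from module docs.
curl -u admin:Admin123 \
  -H "Content-Type: application/json" \
  -X POST http://localhost:8080/openmrs/ws/rest/v1/patient \
  -d '{
    "person": {
      "names": [{"givenName": "John", "familyName": "Doe"}],
      "gender": "M"
    },
    "identifiers": [{
      "identifier": "1234-5",
      "identifierType": "IDENTIFIER-TYPE-UUID",
      "location": "LOCATION-UUID"
    }]
  }'
```

A CI plan could run a handful of such calls against a freshly deployed instance and fail the build on any non-2xx response.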

 

 

  • May 2013 QA recommendations from ThoughtWorks:
    • Include acceptance criteria review/creation in JIRA curation activities; acceptance criteria and test case scenarios should be mandatory
    • Explore ways to formalize the core <-> module contract using Integration Contract Tests:
      • Publish/subscribe contract tests in component pipelines
      • Execute module tests against all supported versions of core; publish test results for community consumption
    • Adopt a scenario-based testing approach to capture high-value paths through the system [especially from implementations]
    • Consider adopting BDD testing tooling; leverage community expertise to increase scenario coverage
    • Manage cycle times to keep feedback loops optimized:
      • <5 minutes from commit for unit & component tests
      • <60 minutes from commit for service & integration tests
      • Additional deployment targets & test clients may be necessary [parallel execution]
    • Consider automatic inclusion of community modules in pipeline
  • How can we "move up the stack" regarding quality assurance?

 

 

  • After Action Review
    • What did you expect to happen?
    • What actually happened?
    • What can we do better?
  • Preview Next Week:
    • WIP: Mirebalais Update

 

TODOs

Outstanding TODOs (0 issues)

Create a TODO: http://go.openmrs.org/todo

Transcripts

  • Backchannel IRC transcript
  • Audio recording of the call: Listen online or download - available after the meeting