--
You received this message because you are subscribed to the Google Groups "OpenLMIS Dev" group.
To unsubscribe from this group and stop receiving emails from it, send an email to openlmis-dev...@googlegroups.com.
To post to this group, send email to openlm...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/openlmis-dev/CAOKb-R4-9LMCvA2ztxcf_kHgFctZioP_rjxFxtjh7p%2BghDwkvA%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.
Thanks Darius!
As Chris mentioned, the team is in the midst of adding documented APIs and tests to prove them out. Chris, team: how can we track and measure the coverage of these?
Sonar can provide some help here, but as noted in OLMIS-848, Sonar is not picking up some tests due to their location in the source tree, which supports Darius’ observations about the integrations-tests directory.
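One possible fix (just a sketch; the exact directory paths are my assumptions, not what's actually in the tree) is to tell Sonar explicitly which directories hold tests, e.g. via analysis properties:

```properties
# Sketch of sonar-project.properties entries; paths are hypothetical.
# sonar.tests takes a comma-separated list of test source directories,
# so a separate integration-test folder can be registered alongside unit tests.
sonar.sources=src/main/java
sonar.tests=src/test/java,src/integration-test/java
```

The equivalent properties can also be set through the SonarQube Gradle plugin if that's how our build invokes the analysis.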
For the sprint starting 7-27, we should normalize our testing setup:
- Fix OLMIS-848
- Organize tests in folders by type (unit, api, etc.)
- Focus on completing any outstanding web API tests
  - Maybe have a team ILL member work with a ToP/AYIC member to jointly write or refactor some tests to hold up as good examples to follow.
- Unit tests: as Darius mentioned, are we neglecting unit tests, or does the “plumbing” code dwarf the business logic code at this point?
- Normalize error returns per the ProgramController discussion below. I can create a story, but I need input from the team on what it should achieve in terms of a consistent error payload. I saw some code-review chatter about error codes, so it seems we need to set some standards.
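To make the error-payload discussion concrete, here is a rough sketch of one possible consistent shape; the class name and fields (code, message) are placeholders for discussion, not something the team has agreed on:

```java
// Sketch only: a minimal, uniform error payload every endpoint could return.
// Field names are placeholders; the real contract is what we need to decide.
public class ErrorResponse {
    private final String code;     // machine-readable error code, e.g. VALIDATION_ERROR
    private final String message;  // human-readable description for the client

    public ErrorResponse(String code, String message) {
        this.code = code;
        this.message = message;
    }

    public String getCode() { return code; }
    public String getMessage() { return message; }

    // Hand-rolled JSON purely for illustration; in the real service the
    // framework's serializer (e.g. Jackson) would produce this from the object.
    public String toJson() {
        return String.format("{\"code\":\"%s\",\"message\":\"%s\"}", code, message);
    }

    public static void main(String[] args) {
        ErrorResponse err = new ErrorResponse("VALIDATION_ERROR", "program code is required");
        System.out.println(err.toJson());
    }
}
```

A single exception-handling layer (e.g. a Spring @ControllerAdvice) could then map exceptions to this shape so controllers like ProgramController all return the same structure.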
I will create issues for these, and would really like to hear additional input from the team on other specific actions.
Rich