Platform test coverage so we don't break Initializer workflows?

Hi team @dkayiwa @jnsereko @kdaud @ibacher @burke @mksrom,

@binduak shared an issue today in the #initializer slack channel, which made me realize: what automated tests do we have in place to prevent, or at least alert us, when changes to the platform would break Iniz workflows? Given that we are increasingly recommending Iniz to more and more distros, this seems like coverage we need.

Specific message from @binduak was:

Hi @Dimitri R @Ian Bacher We are getting the below error with Initializer 2.4.0-SNAPSHOT on an OpenMRS 2.4.2 env. The other module CSV imports (concepts, conceptset…etc) are working fine on the env, but the liquibase domain upload with Initializer is giving the below error. The same functionality worked fine on an OpenMRS 2.1.7 env.

ERROR - BaseFileLoader.lambda$loadUnsafe$3(89) |2022-07-20T15:36:38,585| There was an error while updating the database to the latest. file: /opt/openmrs/configuration/liquibase/liquibase.xml. Error: /opt/openmrs/configuration/liquibase/liquibase.xml does not exist

Even if this specific issue does not turn out to be related to a breaking change in our platform, my question stands: what test coverage do we have in place?

I guess these tests should probably live in Iniz, but we should have them represented in our relevant dashboards (e.g. Bamboo build-time alerts and the OMRS qaframework dashboard).

CC @alaboso @amugume who might find this interesting.


Initializer Validator?

That was supposed to be the answer for validating configs before they fail you on a runtime environment.

See also the unfinished (by me, when I used to do some useful work):

  • SDK-269: openmrs-contrib-packager-maven-plugin to validate OpenMRS configs.

Every module is in the same boat. The typical solution is unit tests for each feature, plus Maven profiles that build and test against different platform versions. The reporting module does this, as you can see here: Reporting Module - Reporting Module 894: Build result summary - OpenMRS Bamboo. As you can also see, this can be difficult to keep up to date and requires active maintenance: the platform versions tested against by reporting appear to be 1.9 - 2.2.
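For anyone unfamiliar with the pattern, here is a minimal sketch of what such per-platform Maven profiles might look like in a module's `pom.xml`. The profile ids, property name, and version numbers below are illustrative assumptions, not Iniz's or reporting's actual build config:

```xml
<!-- Hypothetical sketch: each CI plan variant runs e.g.
     `mvn clean test -Pplatform-2.4` so breakage surfaces per platform version. -->
<profiles>
  <profile>
    <id>platform-2.1</id>
    <properties>
      <!-- assumed property referenced by the openmrs-api/openmrs-core test dependencies -->
      <openmrsPlatformVersion>2.1.4</openmrsPlatformVersion>
    </properties>
  </profile>
  <profile>
    <id>platform-2.4</id>
    <properties>
      <openmrsPlatformVersion>2.4.2</openmrsPlatformVersion>
    </properties>
  </profile>
</profiles>
```

The point is that one Bamboo plan can then fan out into one build per profile, which is exactly where the "requires active maintenance" cost comes in: every new platform release means adding and wiring up another profile.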


:grimacing: @dkayiwa how do you think we could manage this? E.g. do we have a pre-platform-release checklist? Should we make test support for the reporting module a must-have before platform updates? That module is about 2 years out of date with the platform, which feels wrong.

That sounds right for critical modules like Iniz.
