In a nutshell there is:
- A microservice that queries OpenMRS at regular intervals to extract flattened data; that is essentially the ETL process, externalised into that microservice.
- This flattened data is fed to a reporting/analytics platform, Metabase in this case. I guess it assumes that the reporting platform is "nearby", i.e. running locally on the same server.
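To illustrate what "flattening" means here: OpenMRS stores observations in a long/EAV shape (one row per concept value), and the ETL pivots them into one wide row per encounter so a BI tool like Metabase can query them directly. This is a minimal sketch of that idea only; the function and column names are made up and not the actual Bahmni Mart schema or code.

```python
# Hypothetical sketch of the flattening step: pivot long-format observation
# rows (encounter_id, concept_name, value) into one wide dict per encounter.
# Names are illustrative, not the real Bahmni Mart schema.
from collections import defaultdict

def flatten_obs(obs_rows):
    """Pivot (encounter_id, concept_name, value) rows into wide per-encounter dicts."""
    wide = defaultdict(dict)
    for encounter_id, concept_name, value in obs_rows:
        wide[encounter_id][concept_name] = value
    return [{"encounter_id": eid, **cols} for eid, cols in sorted(wide.items())]

rows = [
    (1, "Weight (kg)", 72),
    (1, "Height (cm)", 180),
    (2, "Weight (kg)", 65),
]
print(flatten_obs(rows))
# → [{'encounter_id': 1, 'Weight (kg)': 72, 'Height (cm)': 180},
#    {'encounter_id': 2, 'Weight (kg)': 65}]
```

Each wide row then maps naturally onto a table in the reporting database that Metabase points at.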
@angshuonline and others (@mksrom) to confirm and expand. With Bahmni Mart, the configuration of how the data is flattened is made in point 1 above (the ETL microservice). Right now the out-of-the-box config is the one made for the original requester of this feature (MSF).
P.S. There is no reason for it to be Bahmni-specific, and actually I don't think it is, aside from the branding, but again, others to confirm.
At ICRC we are coming up with a different approach for DHIS2. We use the good old Reporting module to do the ETL. As such it is just a handful of SQL dataset definitions that are run at regular intervals to generate the CSV dump that DHIS2 expects. The clever part was the way we used special concepts and their attributes to parameterise the SQL queries, keeping them very generic and configurable so as to target DHIS2 indicators en masse. This piece of work will be packaged as a reusable module once it is out of its alpha stage.
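The parameterisation idea above can be sketched roughly as follows: attributes attached to special-purpose concepts fill in a single generic SQL template, so one query definition covers many DHIS2 indicators. This is an assumption-laden illustration only; the template, attribute names, and function are made up and not the actual ICRC module.

```python
# Illustrative only: one generic SQL template, parameterised per concept from
# DHIS2 mapping attributes (data element, category option combo). All names
# here are hypothetical, not the real ICRC dataset definitions.
SQL_TEMPLATE = """\
SELECT '{data_element}' AS dataElement,
       '{category_combo}' AS categoryOptionCombo,
       COUNT(*) AS value
FROM obs
WHERE concept_id = {concept_id}
  AND voided = 0
"""

def build_queries(indicator_concepts):
    """Render one SQL query per concept carrying DHIS2 mapping attributes."""
    return [SQL_TEMPLATE.format(**c) for c in indicator_concepts]

concepts = [
    {"concept_id": 1234, "data_element": "DE_MALARIA", "category_combo": "default"},
]
print(build_queries(concepts)[0])
```

The rows each query returns are then written out as the CSV that DHIS2's data value import expects, one line per (dataElement, categoryOptionCombo, value).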
@amine to provide more details on this approach.
P.S. Makes me think that this is yet another case of using a special secondary source of concepts for a very specific purpose, cf. this.