Dockerizing Bahmni

Hey there,

please allow me to introduce myself. My name is Wolf, and I am a software developer employed by ThoughtWorks. Having finished a larger piece of work for OpenMRS last year, I was looking for the next topic and had a chat with @gsluthra, who suggested looking into dockerizing Bahmni.

There are already different approaches to running Bahmni and OpenMRS (standalone) on Docker.

With this post I would like to start a discussion about which principles and guidelines to follow when dockerizing Bahmni.

As a starting point, please allow me to share a few ideas:

  • Provide official docker images for all components that make up Bahmni.
  • These docker images are released by the respective community, i.e. the OpenMRS folks publish docker images for OpenMRS, and the OpenELIS team does the same for their product.
    • The docker images can be built locally to support development and testing.
    • Images that are built locally are not intended to be used in production.
  • The docker image for a component should support the extension points and configuration of that component, but not go beyond that.
    • As an example, it should be possible to add OpenMRS modules to a Bahmni distribution.
    • Docker images must not be used for tweaks and fixes, such as adding a table to the OpenMRS database; such changes should be fully encapsulated by the respective component.
  • A docker based Bahmni distribution contains
    • A docker-compose and .env file
    • A Bahmni distro docker image that contains configuration files only. This image does not contain deployable artefacts of Bahmni components (such as war files).
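To make the distro idea above concrete, here is a minimal docker-compose sketch. All image names, service names and paths are hypothetical placeholders, not actual Bahmni artefacts; the point is only the shape: a compose file plus a .env file, with a config-only image that carries no war files:

```yaml
version: "3.8"

services:
  openmrs:
    # hypothetical official image published by the OpenMRS community
    image: openmrs/openmrs-distro:latest
    env_file: .env
    depends_on:
      - openmrsdb
    volumes:
      # configuration is mounted in from the config-only image's volume
      - bahmni-config:/etc/bahmni-config:ro

  openmrsdb:
    image: mysql:5.7
    env_file: .env
    volumes:
      - mysql-data:/var/lib/mysql

  bahmni-config:
    # hypothetical config-only image: ships configuration files,
    # no deployable artefacts (no wars, no jars)
    image: bahmni/distro-config:latest
    volumes:
      - bahmni-config:/etc/bahmni-config

volumes:
  mysql-data:
  bahmni-config:
```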

It would be great to have a chat about this on one of the next PAT calls and agree on the next steps.

Thank you, Wolf


@wolf : Thanks a lot for kickstarting this. Some thoughts on Dockerizing Bahmni:

Docker images will be helpful in the following situations:

  1. For developers looking to develop and extend Bahmni locally. In this case they would want the ability to start only the services they need (e.g. WebUI, OpenMRS + modules, MySQL, Bahmni Config) and then map their code folder to a docker volume/directory.

  2. For implementers looking to run a local Bahmni instance for demos and to make configuration changes. In this case, they need the ability to map their local bahmni-config changes so they are reflected inside docker.

  3. For system admins and implementers looking to deploy Bahmni either on-premise or in the cloud (test and production instances).

  4. For folks wanting to do multi-site/large-scale deployments of Bahmni on Kubernetes-style elastic containers, to use resources efficiently with multiple sites hosted in a single cloud (a state government, for example). In this case the stateless components (the apps, OpenMRS, etc.) would need to be scaled up (multi-instance) on demand.

  5. We would like to remove the current dependency on CentOS and instead let people choose between lightweight docker base images like Alpine, CoreOS or Ubuntu. The OpenMRS community already has strong knowledge of Ubuntu, so maybe that could be considered first if it helps make setup and adoption easier.

  6. With regards to Odoo, we have been struggling with making the right Python libs available during installation. Creating a base Odoo docker image could be helpful in such cases. Although, as you stated in the principles, it is ideal if we just use the Odoo-provided docker images as a base and then add on the Bahmni modules for Odoo.

Note: The community has also written their own addons for Odoo (and OpenMRS), and will want the ability to easily plug those in when they use Bahmni-provided docker-compose scripts.
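One way the "plug in your own addons" requirement could be met is with a thin derived image: implementers extend the official base image and copy in only their site-specific modules. This is just a sketch; the base image tag and the modules path are assumptions, not confirmed Bahmni/OpenMRS conventions:

```dockerfile
# Hypothetical base image published by the OpenMRS community
FROM openmrs/openmrs-core:2.5

# Drop site-specific OMOD modules into the modules directory.
# The target path is an assumption; check the base image's
# documentation for where it expects modules to live.
COPY ./modules/*.omod /usr/local/tomcat/.OpenMRS/modules/
```

The same pattern would apply to Odoo: start from the Odoo-provided image and copy the Bahmni (or site-specific) addons into the addons path the base image already scans.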


A few extra things:

  1. Databases loaded from volumes
  2. Databases can come with base dumps or without.
  3. A clustered deployment with container orchestration via K8s, although entirely possible, is probably something we can do without to start with. At the same time, I would refrain from using Docker Swarm.
  4. Maybe we should start with a logical diagram of all containers, and then break up the work.
  5. We ought to leverage the good work done by Mekom.
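On points 1 and 2 above: the official MySQL image already supports both patterns out of the box. Data lives on a named volume, and any .sql file mounted into /docker-entrypoint-initdb.d is executed on the first start of an empty data directory, which covers the "base dump or without" case. A compose sketch (service and volume names are placeholders):

```yaml
services:
  openmrsdb:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: ${MYSQL_ROOT_PASSWORD}
      MYSQL_DATABASE: openmrs
    volumes:
      # persistent data on a named volume
      - openmrs-data:/var/lib/mysql
      # optional base dump, applied only when the data directory is empty
      - ./dumps/openmrs.sql:/docker-entrypoint-initdb.d/openmrs.sql:ro

volumes:
  openmrs-data:
```

Leaving the dump mount out gives an empty database; shipping different dump files gives different base states, without any change to the image itself.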