OpenMRS active cloud deployments


Does anyone know of any active deployments that host OpenMRS on the cloud?

@jblaya, saw your post in this Talk thread: OpenMRS in cloud. Could you please elaborate on your experience? What have been your experiences with scaling? How large is your deployment (users, number of patient encounters)?

To the community at large: any experiences with deploying OpenMRS 2.0 on the cloud?

I’m asking because our project is working on developing an Android app that aims to use a cloud-hosted instance of OpenMRS as a back-end.


Hey @ngoel2,

For OpenShift: there is a video made by @cshah that explains how to deploy it there.


There’s also a Docker image, which is documented on the wiki as well!

Here are some examples of Docker images:


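For anyone who wants to try one of those images, a quick start might look like the following. The image name and tag here are an assumption (the reference application distribution published on Docker Hub), and whether the `demo` tag bundles a database is also an assumption, so check the wiki page for the current details:

```shell
# Hypothetical quick start; verify the image name/tag on the wiki.
# The "demo" tag is assumed to ship with a pre-populated database.
docker pull openmrs/openmrs-reference-application-distro:demo
docker run -d --name openmrs -p 8080:8080 \
    openmrs/openmrs-reference-application-distro:demo

# Once started, OpenMRS should be reachable at:
#   http://localhost:8080/openmrs
```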
Thanks @r0bby! Will look into these.

Yesterday, @surangak and I heard about a cloud-based deployment through the service provider Mekom in Laos. I don’t have more info on it.


Sorry for popping up late, I somehow had missed this thread.

Yes, as @terry said, we are offering OpenMRS “as a service” and have a production instance running in Laos (yes, it is possible in Laos!) and two in the pipeline in Cambodia.

We are actually doing a lot of work right now to produce a set of Ansible commands to set up and control those instances from Jenkins or from an administrator’s machine. We are using AWS plus some DNS services but would like to keep the door open to other providers (such as DigitalOcean, etc.).
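For readers unfamiliar with this kind of setup, a minimal sketch of what such an Ansible-driven deployment could look like follows. This is not Mekom’s actual playbook; the host group, file paths, and package names are hypothetical placeholders:

```yaml
# playbook.yml -- hypothetical sketch, NOT Mekom's actual playbook.
# Run with: ansible-playbook -i inventory playbook.yml
- hosts: openmrs_servers
  become: yes
  tasks:
    - name: Install Java, Tomcat and MySQL
      apt:
        name: [openjdk-8-jdk, tomcat7, mysql-server]
        state: present

    - name: Deploy the OpenMRS web application
      copy:
        src: files/openmrs.war
        dest: /var/lib/tomcat7/webapps/openmrs.war
      notify: restart tomcat

  handlers:
    - name: restart tomcat
      service:
        name: tomcat7
        state: restarted
```

The point of such a playbook is that the same few commands can target AWS, DigitalOcean, or any other provider, which is what keeps the deployment provider-independent.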

Don’t hesitate to ask questions.


@mksd is how you did this documented somewhere? How are you ensuring that patient health information remains secure?


@mksd thanks for responding!

A few questions to start with:

  1. What have been your biggest challenges so far?

  2. Echoing @r0bby’s question about security, what are you using to secure the cloud instance? (Have you used fail2ban or other tools? We would love some recommendations.)

  3. What have your experiences been with using OpenMRS over low-bandwidth Internet connections in Laos? We’ve found that a minimum 1 Mbps connection is needed for a smooth user experience.

  4. Experiences with latency using AWS? Where is your server located?

We’re in the process of setting up a “SaaS” instance of OpenMRS for about 10 users for a telemedicine project in India; the intention, of course, is to scale up to hundreds of users. More here:

Thanks Demetri!

Tagging @eeggert1

Hi @ngoel2,

I need to give some perspective on this deployment, our first OpenMRS one. Unlike you, when we came to put all this together, we had literally no time. So there are a lot of areas that were overlooked initially and are still being corrected a year later. It is interesting that your thread is coming up now, because we are putting a lot of work in this summer to figure out the best way forward with OpenMRS Cloud instances. Cf. the remark in my last message about our yet-to-come set of Ansible commands for controlling our deployments. By the way, if you are interested in a collaboration regarding this specific effort… let us know.

But even as rushed as everything was, we have had very few issues with the Cloud part of the instance. That is the great news.

  1. There was honestly no real challenge; the challenge would have been to make this work on local infrastructure. If you operate in an environment that is loose from a regulatory point of view (and that is the case with Laos and neighbouring countries), there is not much of a challenge. The nightmare scenario would be being constrained to keep the data inside the country, since no such possibility exists, Cloud or not. You don’t have this issue in India, since AWS just launched a new data centre in Mumbai (perfect, assuming such regulations exist in India anyway). We did have issues with AWS snapshots; this kind of thing happens from time to time. I don’t know if you intend to rely on them, but for us (and perhaps we have been a bit unlucky there) they were not perfect, and we are still investigating why those issues happen. On the other hand, we want to get away from them anyway and ensure that we can deploy a production instance with a few Ansible commands within a few minutes, thus reducing the need for AWS in-house snapshots to move things around, and operating in a way that does not depend on the Cloud provider.

  2. This is a tricky question, and I am sure it could spark hot debates. Where does one start worrying? If we run OpenMRS over HTTPS, is that enough to say the information is secure? As for us, we just run OpenMRS over HTTPS. The only open port on this Cloud instance is 443; public SSH access is disabled, and the only other access happens through our VPN. So SSH is still possible, of course, but only from inside the VPN. We have not (yet) looked into security enhancements such as Fail2Ban, and like you we would love more suggestions in that area.

  3. In fact the bandwidth in Laos is surprisingly good, and still improving every year. The local infrastructure runs on a dual WAN providing a combined 14 Mbps via two optical fibre connections from two different ISPs. In our opinion, to connect 10 to 20 concurrent users on the same local infrastructure (so the same Internet connection) to an OpenMRS Cloud instance, they should be provided at least 5 to 10 Mbps altogether. But your use case seems different: each of your users has their own Internet connection, correct? Would 1 Mbps then be enough? We would assume that it is a bit tight, but for one user it “should” work. Please do your own trials, though.

  4. Yes, the choice of data centre does make a difference. We initially started with US West (N. California) and later shifted to Asia Pacific (Singapore); that reduced the average ping from 350 ms to less than 100 ms! You may want to do your own trials here again, but it seems pretty clear that the closer the data centre, the lower the latency. Asia Pacific (Mumbai) looks like the #1 candidate for India, obviously.
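As a rough sanity check of the numbers in points 3 and 4 above, here is a small Python helper encoding the rule of thumb we used. The 0.5 Mbps-per-user figure and the latency thresholds are our own assumptions drawn from experience in Laos, not hard limits:

```python
# Rough capacity/latency sanity check. The 0.5 Mbps/user figure and the
# ping thresholds are assumptions from our own experience, not hard limits.

def required_bandwidth_mbps(concurrent_users, per_user_mbps=0.5):
    """Estimated total bandwidth for users sharing one connection."""
    return concurrent_users * per_user_mbps

def rate_latency(avg_ping_ms):
    """Classify a data centre by the average ping measured from the site."""
    if avg_ping_ms < 100:
        return "good"
    elif avg_ping_ms < 250:
        return "usable"
    return "poor"

# 10-20 concurrent users on one connection -> 5-10 Mbps, as in point 3
print(required_bandwidth_mbps(10))  # 5.0
print(required_bandwidth_mbps(20))  # 10.0

# Point 4: US West vs. Singapore, as measured from Laos
print(rate_latency(350))  # poor
print(rate_latency(90))   # good
```

Again, please treat this only as a starting point and run your own measurements against the data centres you are considering.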

I hope this answered some of your questions. Don’t hesitate to ask more. Cheers. Dimitri