For the RPMs alone, yes, it would be fine to download them, probably directly into a local Yum repo, and then run the install. Nothing in the playbooks would even need to be modified: if a package is present in the cache and Yum has no connection, it will probably use the local package (provided it is the right version, of course).
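For reference, a sketch of what "download them directly into a local Yum repo" could look like; the /opt/bahmni/rpms directory and the repo name are assumptions, not part of the Bahmni playbooks:

```shell
#!/bin/sh
# Sketch: expose a directory of pre-downloaded RPMs as a local Yum repo.
# The paths and repo name are assumptions, not part of the Bahmni playbooks.
RPM_DIR="${RPM_DIR:-/opt/bahmni/rpms}"
REPO_FILE="${REPO_FILE:-/etc/yum.repos.d/bahmni-local.repo}"

make_local_repo() {
  # createrepo generates the repodata/ metadata Yum needs for the directory.
  createrepo "$RPM_DIR"
  # Point Yum at the directory via a file:// baseurl.
  cat > "$REPO_FILE" <<EOF
[bahmni-local]
name=Bahmni local packages
baseurl=file://$RPM_DIR
enabled=1
gpgcheck=0
EOF
}

# Usage: make_local_repo, then e.g.:
#   yum install -y --disablerepo='*' --enablerepo=bahmni-local <package>
```

With the repo defined like this, the existing yum tasks should find the packages without any network access at all.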
The issue we have is that, whilst we have an internet connection, it is incredibly slow.
Also, you wouldn’t even need to work with Yum caches, etc., as you could install the RPMs using rpm -ih /local/path/to/package.rpm. This would be quicker and easier to automate, and would (as far as I can see) only require a few changes to the yum tasks in the playbooks.
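A minimal sketch of that approach, assuming the RPMs have been pre-downloaded into /opt/bahmni/rpms (a hypothetical path). Installing them in a single Yum transaction avoids the ordering problems plain rpm -ih runs into when the packages depend on each other:

```shell
#!/bin/sh
# Sketch, assuming the RPMs were pre-downloaded into /opt/bahmni/rpms
# (a hypothetical path, not something the Bahmni playbooks define).
RPM_DIR="${RPM_DIR:-/opt/bahmni/rpms}"

install_local_rpms() {
  # One transaction, so Yum can order inter-package dependencies itself;
  # --disablerepo='*' guarantees nothing is fetched over the network.
  yum localinstall -y --disablerepo='*' "$RPM_DIR"/*.rpm
}

# Usage: install_local_rpms
```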
However, not everything is shipped as an RPM: the Oracle Java VM, OpenMRS modules, the OpenMRS Core WAR file, etc., are components that are fetched during installation.
That’s true, but there’s nothing to say we couldn’t include these files in the distribution, or at least download them into, say, /opt/bahmni/dependencies if they don’t exist, and then ensure that ALL installations of these packages/WARs/omod files etc. are installed/copied from a local path. If a local path can’t be used for whatever reason (e.g. with omod files), we could start a simple Ruby or Python server (e.g. ruby -run -e httpd /opt/bahmni/dependencies -p 4000, or python -m SimpleHTTPServer 4000).
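The download-if-missing idea could be sketched like this; the cache directory and the example URL are assumptions, not Bahmni’s real layout:

```shell
#!/bin/sh
# Sketch of the download-if-missing idea; the cache directory and the
# example URL below are assumptions, not Bahmni's real layout.
DEPS_DIR="${DEPS_DIR:-/opt/bahmni/dependencies}"

fetch() {
  # Download $1 into the cache only if it is not already there.
  mkdir -p "$DEPS_DIR"
  file="$DEPS_DIR/$(basename "$1")"
  [ -f "$file" ] || curl -fsSL -o "$file" "$1"
}

# Usage (hypothetical URL):
#   fetch https://example.org/downloads/openmrs.war
# Then serve the cache for installers that insist on fetching over HTTP:
#   (cd "$DEPS_DIR" && python -m SimpleHTTPServer 4000)
```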
I’ve done this for a copy of Bahmni 0.85 that I’ve been using in a development environment and haven’t had any issues at all.
Or have you considered creating a Docker container all set up the way you want (see Running Bahmni on Docker) and storing the container in a registry accessible offline?
A Docker container on a private registry might work in this instance, but I think it would be great to have a mainstream solution, so that everyone can benefit from the performance and reliability improvements that come from using locally sourced dependencies.
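For completeness, a sketch of the private-registry route; the registry address is made up for illustration and is not a real Bahmni registry:

```shell
#!/bin/sh
# Sketch of the private-registry route; the registry address is made up
# for illustration and is not a real Bahmni registry.
REGISTRY="${REGISTRY:-registry.example.local:5000}"

publish_image() {
  # Tag the prepared image for the private registry and push it, so that
  # offline hosts on the LAN can pull it without touching the internet.
  docker tag "$1" "$REGISTRY/$1"
  docker push "$REGISTRY/$1"
}

# Usage: publish_image bahmni/bahmni:0.85
# On an offline host: docker pull "$REGISTRY/bahmni/bahmni:0.85"
```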