Platform 2.x, RefApp 2.1x Enhancements


Hello all, as we all know, two of the biggest challenges of the Java ecosystem, and of OpenMRS at large, are:

  1. Startup time
  2. Memory Footprint

Thanks to @wolf for the amazing work already done in TRUNK-4830, which tackles issues that generally affect performance and consume memory.

As we approach the Platform / RefApp release season, I envision a faster, smaller-footprint Platform and modules, which would mean an expensive process of:

  1. An attempt to build on top of GraalVM

  2. Among other things, an attempt to progressively introduce Quarkus into our code base for new and/or existing features.

It may not be a community priority, but it could have a place in the backlog for consideration.

Setbacks may include:

  1. Learning curve (though it is quite low)
  2. Lack of developer resource time
  3. Feature priority
  4. Support (I am not so sure about the maturity and support base of Quarkus vs. Spring)

Thanks @tendomart for this, sounds good.

Waiting to hear what others in the community think about this.


Thanks for bringing this up @tendomart. These are indeed things that we should keep an eye on and see if we can leverage to help with OpenMRS.

It’s important to realise, though, that these aren’t just magic (though the documentation and websites often present them as though they are). They’re technologies which could greatly improve performance and memory consumption, but come with potential costs.

GraalVM, for example, can result in much faster startup times and much lower memory usage, but this is primarily brought about by doing compilation ahead of time and producing a native image. This results in quicker startup (and lower memory costs), but can actually result in poorer throughput during application execution. The HotSpot JVM's just-in-time compiler is actually one of the more efficient JIT compilers around and, given sufficient warm-up time, will usually out-perform GraalVM's native images. This is why GraalVM's sweet spot tends to be things like serverless architectures, where startup time is actually one of the most important considerations in terms of throughput.

Quarkus and Micronaut are both JVM microservice frameworks that compete somewhat more directly with Spring, but work by providing a reflection-free, or at least reflection-minimal, dependency injection alternative. They tend to work faster than Spring because they rely on being able to resolve dependencies or dependency trees at compile time, which means that at runtime they can use faster methods to instantiate and inject dependencies. Spring, on the other hand, tends to rely on runtime proxies and reflection, which can be slower, but allows for a great deal more flexibility in configuration and, to some extent, greater decoupling of parts across the application. OpenMRS's modular architecture has tended to encourage modules to take advantage of this flexibility in ways that may not be easy to do in a performant manner using Quarkus or Micronaut, etc.
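To make that distinction concrete, here is a minimal plain-Java sketch. The `GreetingService` class and the two wiring methods are hypothetical illustrations, not OpenMRS or framework code: the first method imitates what a reflection-based container does at runtime (look the class up by name, find a constructor, invoke it reflectively), while the second shows the plain constructor call that build-time frameworks like Quarkus or Micronaut effectively generate ahead of time.

```java
import java.lang.reflect.Constructor;

// Hypothetical bean, used only to illustrate the two wiring styles.
class GreetingService {
    String greet() { return "hello"; }
}

public class DiStyles {

    // Spring-style wiring (simplified): the container resolves the class
    // and its constructor reflectively at runtime, then invokes it.
    static GreetingService reflectiveWire() throws Exception {
        Class<?> clazz = Class.forName("GreetingService");
        Constructor<?> ctor = clazz.getDeclaredConstructor();
        return (GreetingService) ctor.newInstance();
    }

    // Quarkus/Micronaut-style wiring (simplified): dependency resolution
    // happened at build time, so runtime wiring is just a constructor call.
    static GreetingService generatedWire() {
        return new GreetingService();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(reflectiveWire().greet());
        System.out.println(generatedWire().greet());
    }
}
```

Both paths produce the same object; the difference is that the reflective path pays lookup and access-check costs at runtime (and is harder to analyse ahead of time, e.g. for GraalVM native images), while the generated path is ordinary, statically analysable Java.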

I don’t want to discourage anyone from trying out these technologies; they really can provide a better overall development experience and potentially encourage us to write a more correct application. I just want to be sure that we aren’t thinking of these technologies as silver bullets to solve start-up time or memory consumption.


@ibacher I completely agree with your assessment.

GraalVM's improved performance claims, and its polyglot feature to transparently mix and match supported languages, look exciting to try out. But I do not like the fact that GraalVM comes from Oracle :slight_smile:

FWIW, GraalVM and Quarkus are in the ASSESS stage of the ThoughtWorks Technology Radar, while Micronaut is in the TRIAL stage.

@tendomart do you have some time to spike on some of these technologies and share your findings in relation to OpenMRS? Your results could affect their prioritisation within the OpenMRS Community.

Thanks again @tendomart for the research. Keep it going!


Very true, thanks @ibacher.

@dkayiwa yes, sure, thanks. I will take some time off and share my findings here.
