Saving a consultation and opening the patient dashboard take a very long time and throw errors on the Bahmni EMR frontend



We are trying to use our existing OpenMRS DB (around 1.5 GB, already containing a lot of patient, concept, and encounter data) in Bahmni v0.92.

After starting a visit for a patient in the Bahmni EMR, opening their dashboard takes a long time, around 1 min - 1.5 min. The same happens during consultation: saving an encounter always takes more than 2 minutes and then throws an error popup saying “Server error. Encounter information not saved”. In the network tab we see the response “Proxy error 502 on openmrs/ws/rest/v1/bahmnicore/bahmniencounter” for the encounter endpoint call mentioned below. However, the encounter info does get saved on the backend and is also synced to Odoo and OpenELIS after about 3-4 minutes. So altogether this step takes a very long time.

All of this worked fast with the initial Bahmni OpenMRS DB, before we replaced it with our own OpenMRS DB.

The endpoints that take a long time are:
1.) /openmrs/ws/rest/v1/bahmnicore/diagnosis/search?patientUuid=06ddc924-d7e8-4d7e-a282-c4f3e1d480dc
2.) /openmrs/ws/rest/v1/bahmnicore/bahmniencounter

Does the problem lie with the architecture? Urgent help required @gsluthra @swathivarkala @sravanthi17

A couple of questions/pointers:

  1. Check the DB version: is it MySQL 5.6 or MySQL 5.7? In the past, people have complained about 5.7 being slow.
  2. Take the visit/encounter or the patient that you are dealing with. Check the start date and end date in the DB. Also check how many encounters the visit has, and how many obs each encounter has. If you find the visit spanning many days, you may need to configure the visit-close scheduler appropriately. If an encounter is spread across a long time, the encounter session time may need to be reduced.
  3. There were a few bugs closed and improvements made about 2 months back. @binduak, which version did these go into? Also, I think you can just drop in the newer library? @binduak, if you can help out here.
  4. Rebuild index
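The checks in point 2 can be run directly against MySQL. A rough sketch, assuming the standard OpenMRS schema (`visit`, `encounter`, `obs` tables) and with `:patient_id` as a placeholder for the patient's internal id:

```sql
-- How long has the visit been open? A NULL date_stopped means it is still open.
SELECT visit_id, date_started, date_stopped
FROM visit
WHERE patient_id = :patient_id
ORDER BY date_started DESC;

-- How many encounters per visit, and how many obs per encounter?
SELECT e.visit_id, e.encounter_id, COUNT(o.obs_id) AS obs_count
FROM encounter e
LEFT JOIN obs o ON o.encounter_id = e.encounter_id
WHERE e.patient_id = :patient_id
GROUP BY e.visit_id, e.encounter_id;
```

If a visit shows a `date_started` many days in the past with no `date_stopped`, that is the "visit expanding across many days" case mentioned above.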

I tried doing your requested approaches:

1.) The DB version was found to be MySQL 5.7.30, as installed by the Bahmni RPM.
2.) The visit and encounter I am testing were created on the same date, for a fresh patient created via the Bahmni EMR; each encounter consists of one diagnosis and 2-3 lab tests. All the visits span a single day.
4.) I have rebuilt the index by going to OpenMRS > Administration > Maintenance > Search Index > Rebuild Search Index.

I followed all the methods but the issue still remains. Are there any other things I should try? Please suggest.

I would suggest downgrading the MySQL version to 5.6. You can do that via the setup.yml file. If you are doing a fresh install, configure it, rerun the installation, and check the performance. If you are migrating from an existing DB, take a dump (mysqldump) and then go through the WIKI instructions (advanced setup) on how to restore the dump.
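For the migration path, the dump/restore step looks roughly like this (the `openmrs` database name and the file path are assumptions; the WIKI's advanced-setup page is the authoritative reference):

```shell
# On the source machine: dump the existing OpenMRS database
mysqldump -u root -p openmrs > /tmp/openmrs_dump.sql

# On the Bahmni machine, after reinstalling with MySQL 5.6: restore the dump
mysql -u root -p openmrs < /tmp/openmrs_dump.sql
```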

Could v0.93 be released with the default MySQL version set to 5.6? Otherwise every first-time installation will run into this issue. : )

Yep, we can do that. We should really move to a higher version of MySQL (even better, Postgres). For now we can set the default installation to MySQL 5.6. @gsluthra please create a card?

cc @buvaneswariarun

Can you please provide the steps to downgrade the MySQL version from 5.7.30 to 5.6 in a current installation of Bahmni 0.92 (without a fresh installation)? I have already taken a backup of my Bahmni with `bahmni -i local backup`.

In /etc/bahmni-installer/setup.yml, introduce a parameter `mysql_version`, set it to 5.6, and run the install.
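Assuming the installer reads `mysql_version` as a top-level key (check the WIKI for the exact format for your release), the setup.yml change would look something like:

```yaml
# /etc/bahmni-installer/setup.yml
mysql_version: 5.6
```

After saving this, rerun the Bahmni installer as described in the installation docs.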

It’s always a good idea to go through the release notes in detail.

  1. Release notes
  2. The WIKI also details the configurations that can be tweaked for installation

I have created a card for this: [BAH-1178] Ship MysQL v5.6 as default in Bahmni for avoiding performance issues - Bahmni - JIRA

  1. As you suggested, I tried running MySQL 5.6 as well, but the delay in saving an encounter with my database remains unchanged. The downgrade also broke some functionality in Bahmni, so I reverted to v5.7.

  2. I tried changing my MySQL my.cnf configuration by setting the variables innodb_buffer_pool_size = 1G and innodb_change_buffer_max_size = 50. After this, encounters did at least save, but the time taken was still around 1.6 min to 1.8 min with my database, versus 2-3 seconds with the initial OpenMRS DB.

  3. While debugging the Bahmni Core module code, I found that when saving an encounter, most of the time is spent executing line 127 (a method call), which takes around 1.7 min - 1.8 min.
    Link to the Java file: bahmni-core/ at release-0.92 · Bahmni/bahmni-core · GitHub

  4. Further, we analyzed the queries and their execution times during this “saving” of an encounter by turning on the slow-query-log system variable in the MySQL conf.
    Queries log file: bahmni_sql_slow_queries.txt (2.1 MB)
    Queries log file short summary (execution times): list.txt (468.1 KB)

  5. Can you look into this and suggest how we can reduce the 1.6 min - 1.8 min currently being taken while saving an encounter in the Bahmni EMR?
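For reference, the my.cnf changes from points 2 and 4 look roughly like this (the InnoDB values are the ones tried above; the slow-query log file path and the 1-second threshold are assumptions, not values from this thread):

```ini
# /etc/my.cnf
[mysqld]
innodb_buffer_pool_size       = 1G
innodb_change_buffer_max_size = 50

# Slow-query logging used to capture the attached logs
slow_query_log      = 1
slow_query_log_file = /var/lib/mysql/bahmni_sql_slow_queries.log
long_query_time     = 1
```

MySQL needs a restart for the InnoDB buffer pool size to take effect on 5.7.x.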

I am still looking forward to your valuable guidance on this.

As mentioned in my previous comment, we are currently facing a huge delay while saving encounters. For background: we have an existing database running on the OpenMRS platform that we want to migrate to Bahmni and use from here on. The database currently contains over 1 lakh (100,000+) concepts and around 5 lakh (500,000) obs and encounters.

On our side, we have found the main cause to be the queries that fetch concepts while saving an encounter.

We are also attaching a file, analyzed_queries.txt (5.3 KB), which contains the 5 most time-consuming queries; many similar queries together account for the 1.7 - 1.8 min before the green “Saved” toast appears on screen.

Please HELP US! This is urgently required.

What is the value of the setting/global property named search.caseSensitiveDatabaseStringComparison?

Hi Daniel,

I checked the global property search.caseSensitiveDatabaseStringComparison, which was set to “TRUE”.

I changed it to “FALSE” and restarted OpenMRS, and now the save time and the other laggy pages and flows work perfectly, within the intended time. The saving time has been reduced from 1.6 min to 1.8 sec.
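For anyone hitting the same issue: the property can be changed via the OpenMRS admin UI (Administration > Maintenance > Settings), or directly in the database. A sketch assuming the standard OpenMRS `global_property` table:

```sql
-- Check the current value
SELECT property, property_value
FROM global_property
WHERE property = 'search.caseSensitiveDatabaseStringComparison';

-- Set it to false (restart OpenMRS afterwards for it to take effect)
UPDATE global_property
SET property_value = 'false'
WHERE property = 'search.caseSensitiveDatabaseStringComparison';
```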

Again, thanks a ton!
