Hi,
It has been a while since I’ve been on this channel, so please forgive my ignorance of recent progress here.
Context
A hospital that uses Bahmni is running an experiment to see how artificial intelligence can help doctors with consultations. Here is a description of the experiment.
- Nurses collect a detailed history from patients in the form of an audio recording, in Bengali or Hindi
- The audio is sent manually to a Google Gemini “Gem” that takes in this information and suggests next steps
- The output is stored and then compared with the notes of a doctor who subsequently conducts the consultation
- The experiment is also run in reverse, with the doctor talking to the patient first and the nurse second
Based on what I hear, preliminary findings show that AI can assist doctors, and sometimes even outperform them, in constrained settings.
Need
They would like to know whether this mechanism can be tightly integrated with Bahmni. The workflow would look something like this (in my imagination):
- Bahmni “listens in and records” a conversation between a provider and patient
- Once the conversation is over, the audio, along with patient history available in Bahmni, is sent to an LLM.
- The LLM analyzes this and suggests next steps: investigations, drug orders, etc.
- The LLM output is auto-populated into Bahmni as an unsaved draft
- The provider reviews the suggestions, makes any necessary changes, and saves them in Bahmni
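To make the idea concrete, here is a minimal sketch of the middle steps (assemble the LLM request from the transcript plus patient history, then map the reply into an unsaved draft for review). Everything here is hypothetical: the model name, prompt, field names, and response schema are placeholders for illustration, not an existing Bahmni or Gemini API.

```python
import json

# Hypothetical prompt; the real system prompt would need clinical review.
SYSTEM_PROMPT = (
    "You are a clinical decision-support assistant. Given a patient's "
    "history and a consultation transcript, suggest next steps as JSON "
    "with keys: diagnoses, investigations, drug_orders, notes."
)

def build_llm_request(transcript: str, patient_history: dict) -> dict:
    """Assemble the payload to send to the LLM (shape is illustrative)."""
    return {
        "model": "gemini-placeholder",  # assumption: some Gemini model
        "system": SYSTEM_PROMPT,
        "messages": [{
            "role": "user",
            "content": (
                "Patient history:\n" + json.dumps(patient_history, indent=2)
                + "\n\nConsultation transcript:\n" + transcript
            ),
        }],
        # Ask for structured output so it can be mapped onto Bahmni forms.
        "response_format": "json",
    }

def to_draft_suggestions(llm_json: str) -> dict:
    """Parse the LLM reply into unsaved draft suggestions for the provider."""
    suggestions = json.loads(llm_json)
    return {
        "status": "DRAFT",  # never auto-saved; the provider must confirm
        "investigations": suggestions.get("investigations", []),
        "drug_orders": suggestions.get("drug_orders", []),
    }
```

The key design point is the `DRAFT` status: nothing reaches the medical record until the provider edits and saves, which keeps the clinician as the final decision-maker.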
Benefits
- A more comprehensive medical record with less effort
- Suggestions from AI can potentially reduce errors during consultation
Questions
- Is there existing work we can build on for this purpose? I tried searching but may have missed something
- Would anyone else also be interested in a solution like this?
- What would it take to build a native experience for this workflow on Bahmni?