Encounter Audit Module Midterm Presentation

Hi Everyone!

Just wanted to give you all an update on my Encounter Audit Module. Looking through some of the other blogs and mid-term videos, it is so cool to see so many people doing such interesting (and complex) work to advance OpenMRS.

I am embedding my mid-term update video from YouTube below. Thanks so much to my project mentors (Cosmin Ioan and Mike Seaton) - they have been awesome in helping me through this process and integrating me into the OpenMRS community.

Quick Summary: This module is called the Encounter Audit Module and is meant to assess the quality of data in an OpenMRS deployment based on encounter information. The module allows you to randomly sample encounter records based on user-selected criteria (e.g. date, location, user, encounter type). The user then re-enters data from the associated paper chart, and the module compares the old and new records to assess data quality/reliability. This information is stored so that reports on data quality can be generated.
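To make the sampling step concrete, here is a minimal sketch of filtering encounters on user-selected criteria and drawing a random sample. It is in Python for brevity (the module itself is a Java OpenMRS module), and the field names and data are made-up illustrations, not the module's actual schema:

```python
import random

# Hypothetical encounter records; field names loosely mirror the OpenMRS
# encounter table but are illustrative only.
encounters = [
    {"encounter_id": 1, "location": "Clinic A", "encounter_type": "ADULTINITIAL"},
    {"encounter_id": 2, "location": "Clinic B", "encounter_type": "ADULTRETURN"},
    {"encounter_id": 3, "location": "Clinic A", "encounter_type": "ADULTRETURN"},
    {"encounter_id": 4, "location": "Clinic A", "encounter_type": "ADULTINITIAL"},
]

def sample_encounters(encounters, criteria, n, seed=None):
    """Filter encounters on user-selected criteria, then draw a random sample."""
    matching = [e for e in encounters
                if all(e.get(k) == v for k, v in criteria.items())]
    rng = random.Random(seed)
    return rng.sample(matching, min(n, len(matching)))

# e.g. audit two random encounters recorded at Clinic A
sample = sample_encounters(encounters, {"location": "Clinic A"}, n=2, seed=42)
```

The sampled encounters would then each be re-entered from the paper chart and compared against the stored record.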

I recommend viewing the midterm update video with 480p resolution.

Here are some questions for passersby:

  • Do you know of OpenMRS deployments where this module would be useful?
  • What features/functionality should be built into the module?
  • What technologies would you use to store audit data and to compare old/new records?

Here are some relevant links:

  • Code Repo
  • Official Encounter Audit Wiki
  • My Wiki (with timeline)
  • YouTube Video
  • Encounter Audit Blog


@pochedley, thanks for the presentation. Do you think you can get a 1.0 version of your module deployed to the module repository by the end of July?

Hi @burke,

Thanks for taking the time to watch my presentation - I really appreciate it. I think the module release will end up being in August per our timeline as there is still a fair amount of functionality to be built.

Be careful. We have a long, proud history at OpenMRS of GSoC projects that are 95% to 99.9% done by the end of the summer and then fade away, never to be used. This happens because the energy required to complete the last bit of a project that is not yet released is vastly greater than the energy needed to add feature X to a deployed module.

I would suggest working with @cioan to define the absolute bare minimum functionality needed to have anything useful, build that, and get it deployed. For example, a module that works for one simple set of criteria on the simplest forms and provides one very simple report. Focus on getting that deployed into the module repository. All the important features & functionality can be ticketed, and you can knock off as many as you can get to, but then there is no chance of the end of GSoC arriving (it comes quicker than you think) without a working module. In brief: lower your expectations, get something deployed, and then spend the remainder of the summer making it better and defining the path so that others can help in the future.

Hi @burke, your suggestions make sense. Thanks. We will work to release the first version of the module with bare functionality in the first week of August. That still should give us time to fix bugs and implement more features by the end of the summer.

Do what makes sense for your project. I’m just pushing because I want to see all GSoC interns not only having fun and becoming productive open-source developers, but also earning the bragging rights that their module is being used and developed beyond GSoC 2014. :smile:

Thanks for the good work and best of luck with the remainder of GSoC 2014!

@pochedley, nice work so far!

Are you planning to do double-entry of the selected subset of records, or just have the user visually confirm correctness? Any thoughts about where you’re going to store the results of this? i.e. where will we mark the fact that a particular encounter was validated on a particular date?

In the past we’ve discussed adding the idea of Encounter Attributes to the OpenMRS data model, and while that definitely won’t happen on the timeline of this GSoC project, if that’s the sort of thing you’d have wanted to have, it’d be helpful to raise it for future work.

Thanks @darius! I appreciate that you took the time to look at this module.

The plan is to do double-data entry. This is to ensure the auditor doesn’t go on autopilot. After the record is re-entered, differences are displayed (so the auditor can then confirm that there are discrepancies).

@cioan and I plan to store the obs in a new table, encounteraudit_obs, which will list the original obs_id plus the old and new obs values as strings, which can then be used for generating reports. We will also record a status in this table (e.g. records match, old record blank, values are different).

We plan to store encounter information in a table (encounteraudit_encounter) that lists the original encounter_id and a status for the encounter (e.g. audited, not audited, updated).
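To illustrate how the per-obs statuses described above might be derived, here is a hedged sketch (Python for brevity; the status strings come from the description above, but the classification rules are my assumption, not the module's actual implementation):

```python
def audit_status(old_value, new_value):
    """Classify an original obs value against its re-entered value.

    The status strings mirror those described for the encounteraudit_obs
    table; the exact rules here are illustrative assumptions.
    """
    if not old_value:
        return "old record blank"
    if not new_value:
        return "new record blank"
    if old_value == new_value:
        return "records match"
    return "values are different"

# Hypothetical rows: (original obs_id, old value, re-entered value)
rows = [
    (101, "120/80", "120/80"),
    (102, "", "36.6"),
    (103, "45", "54"),
]

# Each audited row keeps the original obs_id, both values as strings, and
# a status -- the shape described for the encounteraudit_obs table.
audited = [(obs_id, old, new, audit_status(old, new)) for obs_id, old, new in rows]
```

Aggregating these statuses per encounter would then drive the encounter-level status (audited, not audited, updated) in encounteraudit_encounter and feed the data-quality reports.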

I don’t see a dedicated discussion of Encounter Attributes on the dev list, but searching “add encounter attribute” brings up a ton of threads, so I may be missing it. Without knowing the details, this sounds like it could be useful in this module (we could define an encounteraudit_status_type table similar to the attribute tables for other entities, e.g. providers). Who would I ping about this (or did I just do that)?

Thanks again!