A Patient-Discharge Registry

This solution records live discharge information into a "Patient Discharge" database table, based on HL7 messages it receives from the network.

Solution Requirement

Based on live HL7 ADT data received from the EHR, gather information about a patient’s discharge and enter it into a “Patient Discharge Registry”, which is a table in a SQL Server database.

The information to be gathered includes the patient’s demographic details, such as name, date of birth, and language, as well as clinical information such as the discharge date, admission ward, and attending doctor.

The solution also automatically creates metadata, such as the record-create timestamp, to support auditing and further data processing.

Input Condition

Network HL7 messages received from the EHR. These include the HL7 Discharge (A03) messages, from which the relevant information for the Registry will be retrieved.

Output Condition

The following information will be extracted from A03 HL7 messages –

  • HL7 Message ID (HL7 message identifier)
  • Patient ID (EHR identifier)
  • Last Name (demographic)
  • First Name (demographic)
  • DOB (demographic)
  • Sex (demographic)
  • Language (demographic)
  • Ward ID (admission ward)
  • Attending Doctor ID (doctor registration number)
  • Attending Doctor Name (full-name)
  • Discharge DT (discharge date-time)

In addition, a “Record Create DT” is generated as the current timestamp when a record is received. All these data elements will be written to a target database table. 

Target Database Environment

The following database parameters are used for building the solution.

  • Table: DischargeRegistry
  • Database instance: Test1
  • Database server: DELL-I7
  • Database login: User1

These parameters are required for constructing the connection string used for establishing the database connection when writing data to the database table. These parameters will need to be changed accordingly when applying the solution to a different database environment.
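For illustration, the connection string can be assembled from these parameters as follows. This is a sketch in Python (the password value is the placeholder used later in the handler config, and the exact string format is dictated by the database driver):

```python
def build_connection_string(server: str, database: str, user: str, password: str) -> str:
    """Assemble a SQL Server connection string from the solution's database parameters."""
    return f"Server={server};Database={database};User Id={user};Password={password};"

# Using the parameters above (the password shown is a placeholder)
conn_str = build_connection_string("DELL-I7", "Test1", "User1", "password1")
```

When moving to a different environment, only the four input values need to change; the string shape stays the same.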

Solution Design

The target table’s structure and its data definitions, together with where each column’s data will be sourced from, are documented in the table below.

To determine a required data element and its “address” in an HL7 message, you may refer to the source system’s interface specification, or to the published HL7 Standard. Alternatively, you may simply browse the message data to look for what you want. You’ll find that using a dedicated HL7 message viewer, such as the SmartHL7 Viewer, makes the browsing a lot easier –

Below are the elements identified (as highlighted) for use in our solution.
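As an illustration of how such an “address” locates data in a message, here is a minimal Python sketch of a PID-5.1-style address resolver. It ignores repetitions, sub-components, and escape sequences, and assumes the default `|` and `^` separators; a dedicated viewer or HL7 library handles the full grammar. The sample message below is made up for demonstration:

```python
def resolve(message: str, address: str) -> str:
    """Resolve an HL7 element address such as 'PID-5.1' against a raw v2
    message. Simplified: no repetition, sub-component, or escape handling."""
    seg_id, _, rest = address.partition("-")
    field_no, _, comp_no = rest.partition(".")
    for segment in message.replace("\n", "\r").split("\r"):
        fields = segment.split("|")
        if not fields or fields[0] != seg_id:
            continue
        # In MSH the field separator itself counts as MSH-1, shifting the index by one
        idx = int(field_no) - 1 if seg_id == "MSH" else int(field_no)
        value = fields[idx] if 0 <= idx < len(fields) else ""
        if comp_no:
            comps = value.split("^")
            value = comps[int(comp_no) - 1] if int(comp_no) <= len(comps) else ""
        return value
    return ""

# A made-up A03 message for demonstration (not real patient data)
pv1 = ["PV1"] + [""] * 45
pv1[3], pv1[7], pv1[45] = "WARD1^101^1", "D123^House^Gregory", "20240101113000"
msg = "\r".join([
    "MSH|^~\\&|EHR|HOSP|FOLDDA|HOSP|20240101120000||ADT^A03|MSG00001|P|2.3",
    "PID|||12345||Smith^John||19800101|M|||||||EN",
    "|".join(pv1),
])
```

With this in hand, `resolve(msg, "PID-5.1")` returns the family name, and `resolve(msg, "PV1-45")` returns the discharge date-time.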

Now that we have designed our target data table and identified where to source the data from, we can design the solution structure using the available handlers.

There are seven steps in our proposed solution; each step is handled by a designated handler. Here are the designed tasks for these handlers –

Now that we have completed our design, we are ready to build the solution.

Building The Solution

As we explained in the post “solution-building overview“, building a Foldda solution requires no dedicated IDE and involves no programming. It involves three simple steps: 1) selecting the required handlers; 2) “stacking” the selected handlers (folders) into a hierarchy reflecting the data processing flow; and 3) configuring each handler in the solution with parameters according to the design requirements.

In our case, here is what’s involved in each of these steps –

1. Select the Required Handlers

2. Build the Data Processing Flow

Building a Foldda solution is just stacking folders. In our solution, it looks like this –

3. Configure each handler according to the requirement.

Changing a handler’s config settings is like this – 

Below are the config settings for each of the handlers in the completed solution:

<?xml version="1.0" encoding="utf-16"?>
<NodeConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Description>HL7NetReceiver listens on a network port for incoming HL7 messages. The transport protocol is MLLP, and the server responds with an ACK to each HL7 message it receives.</Description>
<Handler>HL7NetReceiver</Handler>
<Parameters>
<Parameter>
<Name>server-port</Name>
<Value>1235</Value>
</Parameter>
<Parameter>
<Name>connection-timeout-sec</Name>
<Value>60</Value>
</Parameter>
</Parameters>
</NodeConfig>

<?xml version="1.0" encoding="utf-16"?>
<NodeConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Description>Select only A03 'discharge' messages.</Description>
<Handler>HL7Filter</Handler>
<Parameters>
<Parameter>
<Name>filtering-rule</Name>
<!-- filter these message types -->
<Value>MSH-9==ADT\^A03</Value>
</Parameter>
</Parameters>
</NodeConfig>
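The filtering rule compares the message type (MSH-9) against `ADT^A03`, with the backslash escaping the `^` component separator. In Python terms, the rule amounts to roughly the following check (a sketch of the rule's apparent intent, not the handler's code):

```python
def is_discharge(message: str) -> bool:
    """Mimic the HL7Filter rule MSH-9==ADT\\^A03: pass only A03 'discharge'
    messages. Assumes the default '|' field separator."""
    msh = message.replace("\n", "\r").split("\r")[0]
    fields = msh.split("|")
    # In MSH the field separator itself counts as MSH-1, so MSH-9 is fields[8]
    return len(fields) > 8 and fields[8] == "ADT^A03"
```

All other ADT events (admissions, transfers, updates) fall through the filter and are discarded by this solution.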

<?xml version="1.0" encoding="utf-16"?>
<NodeConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Description>Apply mapping to Sex (PID-8) and Language (PID-15).</Description>
<Handler>HL7Mapper</Handler>
<Parameters>
<Parameter>
<Name>mapping-rule</Name>
<Value>PID-15?=([email protected])</Value>
</Parameter>
<Parameter>
<Name>mapping-rule</Name>
<Value>PID-8?=(F;Female)(M;Male)(*;Unknown)</Value>
</Parameter>
</Parameters>
</NodeConfig>
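The sex-code mapping rule reads as a list of (source;target) pairs, with `*` acting as a catch-all. Here is a hypothetical Python rendering of that rule; it mimics the rule's apparent intent, and the handler's exact semantics may differ:

```python
import re

def apply_mapping(value: str, rule: str) -> str:
    """Apply a mapping rule of the form '(F;Female)(M;Male)(*;Unknown)'.
    '*' is treated as a catch-all default for unmatched values."""
    default = value
    for src, dst in re.findall(r"\(([^;()]*);([^()]*)\)", rule):
        if src == "*":
            default = dst
        elif value == src:
            return dst
    return default

sex_rule = "(F;Female)(M;Male)(*;Unknown)"
```

So a PID-8 value of `F` becomes `Female`, while anything not listed falls back to `Unknown`.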

<?xml version="1.0" encoding="utf-16"?>
<NodeConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Description>HL7ToTabularConverter converts HL7 messages into tabular records.</Description>
<Handler>HL7ToTabularConverter</Handler>
<Parameters>
<Parameter>
<Name>csv-column-hl7-elements</Name>
<Value>MSH-10~PID-3~PID-5.1~PID-5.2~PID-7~PID-8~PID-15~PV1-3.1~PV1-7.1~PV1-7.2~PV1-7.3~PV1-45</Value>
</Parameter>
</Parameters>
</NodeConfig>

<?xml version="1.0" encoding="utf-16"?>
<NodeConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Description>Concatenate $10 and $11 as the "full name" and save to $10; then insert the current timestamp into column $11.</Description>
<Handler>TabularDataTransformer</Handler>
<Parameters>
<Parameter>
<Name>transformation-rule</Name>
<Value>$10=>CONCAT($$,", ",$11)</Value>
</Parameter>
<Parameter>
<Name>transformation-rule</Name>
<Value>$11=>TIMESTAMP("yyyyMMdd HH:mm:ss")</Value>
</Parameter>
</Parameters>
</NodeConfig>
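In Python terms, the two rules do roughly the following to each 12-column record. `$10` and `$11` are 1-based column references and `$$` denotes the column's current value; this is an interpretation of the rules, not the handler's actual code:

```python
from datetime import datetime

def transform(row: list) -> list:
    """Apply the two transformation rules to a 12-column record.
    $10 => CONCAT($$, ", ", $11): join the doctor's family and given names.
    $11 => TIMESTAMP("yyyyMMdd HH:mm:ss"): stamp the record-create time."""
    row = list(row)
    row[9] = f"{row[9]}, {row[10]}"                        # $10 is 0-based index 9
    row[10] = datetime.now().strftime("%Y%m%d %H:%M:%S")   # $11 becomes Record Create DT
    return row
```

Note the ordering matters: the full name is built from `$11` (the given name) before `$11` is overwritten with the timestamp; `$12` (the discharge date-time from PV1-45) is untouched.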

<?xml version="1.0" encoding="utf-16"?>
<NodeConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Description>TabularColumnsSetHeader adds a row of comma-separated (or semicolon-separated) column headers to the top of the input tabular data. If the input data has an existing header, it is replaced with the new header.</Description>
<Handler>TabularColumnsSetHeader</Handler>
<Parameters>
<Parameter>
<Name>column-headers</Name>
<Value>HL7 Message ID,Patient ID,Last Name,First Name,DOB,Sex,Language,Ward ID,Attending Doctor ID,Attending Doctor Name,Record Create DT,Discharge DT</Value>
</Parameter>
</Parameters>
</NodeConfig>

<?xml version="1.0" encoding="utf-16"?>
<NodeConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<Description>Database-loading handler. The column-spec parameters must define the date-time columns with the correct data type and format.</Description>
<Handler>TabularSqlDatabaseWriter</Handler>
<Parameters>
<!-- expect the input data (when input from file) has a header row -->
<Parameter>
<Name>csv-first-line-is-header</Name>
<Value>YES</Value>
</Parameter>
<!-- MS SQL Server connection string -->
<Parameter>
<Name>db-connection-string</Name>
<Value>Server=DELL-I7;Database=Test1;User Id=User1;Password=password1;</Value>
</Parameter>
<Parameter>
<Name>db-target-table-name</Name>
<Value>DischargeRegistry</Value>
</Parameter>
<Parameter>
<Name>column-spec</Name>
<Value>DOB;date-time;yyyyMMdd</Value>
</Parameter>
<Parameter>
<Name>column-spec</Name>
<Value>Record Create DT;date-time;yyyyMMdd HH:mm:ss</Value>
</Parameter>
<Parameter>
<Name>column-spec</Name>
<Value>Discharge DT;date-time;yyyyMMddHHmmss</Value>
</Parameter>
</Parameters>
</NodeConfig>
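The column-spec date formats follow .NET-style patterns. For checking sample data against them outside the handler, a hypothetical translation to Python's strptime formats could look like this:

```python
from datetime import datetime

# Hypothetical .NET-pattern-to-strptime translation for the three column-specs
NET_TO_STRPTIME = {
    "yyyyMMdd": "%Y%m%d",                     # DOB
    "yyyyMMdd HH:mm:ss": "%Y%m%d %H:%M:%S",   # Record Create DT
    "yyyyMMddHHmmss": "%Y%m%d%H%M%S",         # Discharge DT
}

def parse_column_value(value: str, net_format: str) -> datetime:
    """Validate and parse a date-time column value against its column-spec format."""
    return datetime.strptime(value, NET_TO_STRPTIME[net_format])
```

This kind of check is handy when preparing test data: a value that fails to parse here would also fail the handler's type conversion when loading into the `datetime` columns.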

Final Outcome

Below is the final solution in action, with some emulated network input. Please feel free to download it and give it a try.

And the resulting database records –

Here is the database script for creating the target table before testing out the solution. Remember that your database environment details, such as the database name and the login details, will be different, so you’ll need to change the TabularSqlDatabaseWriter handler settings (e.g. the connection string) accordingly.

CREATE TABLE [dbo].[DischargeRegistry](
	[HL7 Message ID] [varchar](50) NOT NULL,
	[Patient ID] [varchar](50) NOT NULL,
	[Last Name] [varchar](50) NULL,
	[First Name] [varchar](50) NULL,
	[DOB] [datetime] NULL,
	[Sex] [varchar](10) NULL,
	[Language] [varchar](50) NULL,
	[Ward ID] [varchar](50) NULL,
	[Attending Doctor ID] [varchar](20) NULL,
	[Attending Doctor Name] [varchar](100) NULL,
	[Discharge DT] [datetime] NULL,
	[Record Create DT] [datetime] NOT NULL,
	[Record Process DT] [datetime] NULL
) ON [PRIMARY]

You may use this sample HL7 data for testing out the solution. To simulate the EHR system as the inbound source, you will need a network HL7 sender tool, such as the SmartHL7 Sender, to feed the data into the solution.

You can certainly use this solution as a base and add your own requirements, such as capturing different events and other data elements. The solution can be expanded into many variations. For example, it can output to multiple tables, or to CSV files instead of a database.

Finishing Touch

Before rolling out your solution into a Production setting, consider the following enhancements and best practices for additional maintenance benefits:

  • Rename each node folder to be shorter and more descriptive;
  • Add comments to each node in its config file’s “description/remark” section;
  • Use “Catcher” nodes to store inbound messages before any translation, and outbound messages before they are sent out. This gives you an audit trail, as well as the ability to replay the messages if required;
  • Run your solution as a Windows Service in the background;
  • Set up logging-based monitoring and real-time email alerts.

Now you have a solution that will make you feel proud.