
Project EmpowerMD: Medical conversations to medical intelligence

EmpowerMD: Accelerated engineering with Azure


By Adi Unnithan, Software Engineer

As a fast-moving team, EmpowerMD needed to be resourceful in building the Intelligent Scribe Service. Using HITRUST-certified and HIPAA-compliant services on Azure, we were able to stay focused on building our product while keeping infrastructure and operational costs low. Here’s an overview of the services we used to get productive quickly.

EmpowerMD Azure Architecture

Azure App Service

We deploy both our frontend and backend to Azure App Service, a fully managed platform for hosting apps. Among many other features, App Service makes it easy to set up authentication with Azure Active Directory (AAD). This feature also enabled us to make seamless, secure calls between our frontend and backend.
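When App Service authentication is enabled, the platform validates the sign-in and forwards the authenticated user's identity to the app, including a base64-encoded `x-ms-client-principal` header. A minimal sketch of reading that identity in a Node.js backend (the helper name is ours, and the exact fields inside the principal vary by identity provider):

```javascript
// Sketch: decode the principal App Service injects after sign-in.
// The header is base64-encoded JSON; returns null for anonymous requests.
function parseClientPrincipal(req) {
  const header = req.headers["x-ms-client-principal"];
  if (!header) return null; // unauthenticated request
  const json = Buffer.from(header, "base64").toString("utf8");
  return JSON.parse(json);
}
```

Because App Service performs the token validation before the request reaches the app, the backend can trust this header without re-implementing the AAD handshake itself.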

Azure Functions

Every API call in our backend is handled by a Node.js-based Azure Function. These API calls are repeatable, stateless, and scalable: we can easily scale up our resources to support a growing user base. By using Azure Functions we don’t have to manage servers, and we only pay when a function runs.
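A handler in this model is a small, stateless function that receives a context and a request. Here is an illustrative sketch of a Node.js HTTP-triggered function; the route, parameter, and response shape are hypothetical, not the service's actual API:

```javascript
// Sketch of a stateless HTTP-triggered Azure Function handler (Node.js).
// Nothing is kept between invocations, so the platform can scale
// instances out and in freely.
async function getEncounterHandler(context, req) {
  const id = req.query.id;
  if (!id) {
    context.res = { status: 400, body: { error: "id query parameter is required" } };
    return;
  }
  // A real handler would look the encounter up in the data store here.
  context.res = { status: 200, body: { id, retrievedAt: new Date().toISOString() } };
}

module.exports = getEncounterHandler;
```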

Durable Functions, an extension of Azure Functions, helps us orchestrate complex scenarios like fan-out/fan-in, where we need to run several operations in parallel and wait for them to complete. Implementing fan-in can be complicated, with many potential scaling, state management, and reliability issues. Durable Functions shrinks this complexity down to a single line of code.
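The fan-out/fan-in shape itself can be illustrated with plain promises; in a Durable Functions orchestrator the analogous aggregation step is `yield context.df.Task.all(tasks)`, with checkpointing and reliability handled by the runtime. This is an illustrative sketch of the pattern, not the service's code:

```javascript
// Fan-out/fan-in sketch: start one async operation per item in parallel,
// then wait for all of them before aggregating the results.
async function fanOutFanIn(items, processOne) {
  const tasks = items.map((item) => processOne(item)); // fan out
  const results = await Promise.all(tasks);            // fan in
  return results.reduce((sum, n) => sum + n, 0);       // aggregate
}
```

What plain promises cannot give you is durability: if the process dies mid-flight, `Promise.all` loses everything, whereas a Durable Functions orchestration resumes from its last checkpoint.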

Microsoft Graph

When dealing with medical data, users should see only the clinical encounters they are authorized to access, based on their membership in security groups within their organization. Depending on their role (for example, doctor, scribe, or transcriptionist) they may have different permissions and capabilities within an encounter. This type of group-based authorization scenario was made much easier with the Microsoft Graph API.
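Conceptually, the check reduces to mapping a user's security-group membership (which an application can retrieve from Microsoft Graph, for example via the `checkMemberGroups` endpoint) onto a set of permissions. A sketch with hypothetical group names and permissions; in production, `userGroups` would come from a Graph call rather than being passed in directly:

```javascript
// Illustrative group-to-permission mapping; the group names and
// permission strings here are hypothetical, not the product's schema.
const GROUP_PERMISSIONS = {
  "doctors": ["view", "edit", "sign"],
  "scribes": ["view", "edit"],
  "transcriptionists": ["view"],
};

// Union of permissions across every group the user belongs to.
function permissionsFor(userGroups) {
  const perms = new Set();
  for (const group of userGroups) {
    for (const p of GROUP_PERMISSIONS[group] || []) perms.add(p);
  }
  return [...perms];
}
```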

Azure Cosmos DB

For storing data we use Azure Cosmos DB, a schema-less NoSQL document database. Entities that would typically be rows in a relational database are instead stored as denormalized JSON documents with no predefined structure or schema. This lets us create entities in our database that look much closer to the objects in our application. As a result, we don’t have to think about the intricacies of working with a relational database, which improves programmer productivity.
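For example, an encounter that would span several joined rows across tables in a relational schema can live as a single denormalized document. The fields below are illustrative, not the service's actual data model:

```javascript
// Hypothetical encounter document: participants and notes are embedded
// directly, rather than normalized into separate tables and joined.
const encounter = {
  id: "encounter-123",
  patient: { name: "Jane Doe", mrn: "000111" },
  participants: [
    { role: "doctor", name: "Dr. Smith" },
    { role: "scribe", name: "A. Jones" },
  ],
  notes: [{ section: "assessment", text: "Follow up in two weeks." }],
};
```

The document mirrors the application object one-for-one, so reading or writing an encounter is a single operation instead of a multi-table join.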

Our team was already proficient with MongoDB, a popular NoSQL database with considerable community support. Fortunately, Cosmos DB provides a MongoDB API, so we didn’t have to learn a new one.

Azure Key Vault

Securely storing secrets, keys, and certificates with auditing can be a difficult and error-prone process. Azure Key Vault removes the need for us to manage this across a range of scenarios.
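The access pattern is simple: the application asks the vault for a secret by name at runtime instead of embedding it in code or configuration. A sketch of a small caching wrapper, where `client` stands in for an Azure SDK secret client (its `getSecret(name)` resolving to an object with a `value` property); the wrapper itself is our illustration, not part of the SDK:

```javascript
// Sketch: fetch secrets at runtime and cache them, so values never
// live in source control. `client` is any object exposing
// getSecret(name) -> Promise<{ value }>.
function makeSecretCache(client) {
  const cache = new Map();
  return async function getSecret(name) {
    if (!cache.has(name)) {
      const secret = await client.getSecret(name);
      cache.set(name, secret.value);
    }
    return cache.get(name);
  };
}
```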

Azure Machine Learning

One of the most common problems in building an AI product is figuring out how to take an ML model a data scientist has built and deploy it to production. We rely on Azure Machine Learning (Azure ML) to handle “MLOps” (DevOps for machine learning) tasks such as this.

Out of the box, Azure ML supports deploying a model with Flask, nginx, and a WSGI server to an endpoint hosted in Azure Container Instances (ACI). Setting up a stack like that by hand can be a hassle, and it is even harder to scale out; Azure ML also lets us configure and run a scalable deployment to Azure Kubernetes Service.

Our data scientists were happy not to have to deal with infrastructure either. Azure ML also helps them automate the training, testing, and deployment of models through Azure ML Pipelines.

Azure DevOps

Combining the power of Azure Machine Learning with Azure DevOps can accelerate a variety of MLOps scenarios. Before Azure DevOps, developers needed to set up and manage separate services to support builds, releases, and code reviews. Maintaining all of these infrastructure pieces, much less configuring them to know about each other, can be frustrating.

Using Azure DevOps, we can submit and review pull requests building on our familiarity with Git. With the push of a button, we can run continuous integration and deployment to test, stabilization, and production environments. Azure Artifacts gives us a private package repository with out-of-the-box support for Python pip and Node.js npm. Azure Boards makes it easy to review and track work items.
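Pointing npm at a private Azure Artifacts feed is a one-file change. A sketch of a project `.npmrc`, with the organization and feed names as placeholders rather than our real values:

```ini
; .npmrc sketch — route installs through a private Azure Artifacts feed.
; <org> and <feed> are placeholders for the real organization/feed names.
registry=https://pkgs.dev.azure.com/<org>/_packaging/<feed>/npm/registry/
always-auth=true
```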

DevOps can weave these pieces of day-to-day development together. As an example, for healthcare compliance and auditability purposes we need to link pull requests and commits to work items.

A DevOps release pipeline can continuously deploy models to Azure Machine Learning. Likewise, when model training code is checked in, a DevOps build pipeline is triggered as part of continuous integration. The CI process can also run an Azure ML pipeline to evaluate a newly trained model against the model running in production.
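A sketch of what such a continuous-integration trigger can look like in an Azure Pipelines YAML definition. The paths, script names, and the evaluation step are illustrative assumptions, not our actual pipeline:

```yaml
# azure-pipelines.yml sketch: run CI when model training code changes.
trigger:
  paths:
    include:
      - training/*

pool:
  vmImage: ubuntu-latest

steps:
  - script: pip install -r requirements.txt
    displayName: Install dependencies
  - script: python -m pytest tests/
    displayName: Run tests
  # Hypothetical script that submits an Azure ML pipeline to evaluate
  # the newly trained model against the one in production.
  - script: python submit_evaluation_pipeline.py
    displayName: Evaluate model
```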

Azure Application Insights

If problems occur in our production services, we need to know right away and quickly determine how to fix them. Azure Application Insights is a powerful tool that alerts us to performance and functional issues everywhere from JavaScript code running in the browser to the calls we make to Cosmos DB.

The Azure platform has been vital to helping our team build a world-class healthcare AI product quickly. It’s stepped in where we needed reliable, compliant services and components. Moreover, it’s increased our product focus by taking care of routine, difficult, and detailed tasks. We’re looking forward to using more of the Azure platform as we continue to build out the Intelligent Scribe Service.