The Federal Data Strategy’s 2020 Action Plan, released in December, set the stage for how government agencies should prioritize data in the coming year. Since then, many agencies have taken aggressive steps to turn their data holdings into strategic assets, with one area of focus being the increased adoption of AI and machine learning technologies. In my role, I work closely with agencies and the data teams on the front lines of this innovation. Early adopters who began their big data journey over the last few years are starting to see how data and predictive analytics can support their mission goals and create additional value for their stakeholders. Here are a few examples from working with teams across federal, state, and local agencies.
How AI is helping the Government
COVID has shone a light on the dire need for population-scale predictive analytics. We are working with large public health entities that use AI to predict disease spread and adapt their pandemic response in real time. By empowering epidemiologists to analyze different public data holdings at scale, these agencies have built disease surveillance systems that can predict a 7-day rolling average for case positivity or test rates. This is critical for deciding when and where to roll out or roll back shutdown orders. We’ve even seen examples of this within the military, where similar machine learning models are being used to predict COVID outbreaks among service members, critical information when determining the mission readiness of their resources.
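To make the metric above concrete, here is a minimal Python sketch of a 7-day rolling average of case positivity. The daily figures are made-up illustrative values, not agency data, and real pipelines would compute this over far larger feeds:

```python
# Hypothetical daily counts: (tests administered, positive results) per day.
daily = [
    (1000, 80), (1200, 95), (900, 70), (1100, 99),
    (1300, 120), (1000, 85), (950, 76), (1050, 94),
]

def rolling_positivity(records, window=7):
    """Rolling average of case positivity (positives / tests) over a window of days."""
    rates = [pos / tests for tests, pos in records]
    averages = []
    for i in range(window - 1, len(rates)):
        averages.append(sum(rates[i - window + 1 : i + 1]) / window)
    return averages

# One value per day once a full 7-day window is available.
print(rolling_positivity(daily))
```

At production scale this same computation would typically run as a windowed aggregation over a distributed table rather than a Python list, but the statistic itself is unchanged.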
One specific example is our work with the Medical University of South Carolina, a leader in providing telehealth services in the US. Chatbots on their website help triage potential COVID patients. They applied predictive analytics to thousands of chatbot records to determine which patients are high risk and require early intervention. This helps expedite patient flow while getting the right intervention to the right person, ultimately saving lives.
Military and Defense
The US Air Force recently spoke at our Spark + AI Conference on how they are using AI to predict part failures across their fleet of jets. They examine wear patterns and flight data to identify when a part might fail. Early identification enables staff to stock the right part at the right time and repair the plane before it has to be pulled out of commission. These types of innovations can yield millions of dollars in savings and improved efficiency across the supply chain.
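The Air Force's actual models are not public, but the core idea of predictive maintenance can be sketched with a deliberately simple linear-wear extrapolation. All part IDs, readings, and thresholds below are hypothetical placeholders:

```python
# Hypothetical per-part readings: total flight hours and measured wear (mm).
parts = [
    {"part_id": "A1", "hours": 1200, "wear_mm": 0.8},
    {"part_id": "B2", "hours": 3400, "wear_mm": 2.7},
    {"part_id": "C3", "hours": 2100, "wear_mm": 1.1},
]

WEAR_LIMIT_MM = 3.0     # assumed wear level at which the part must be replaced
HOURS_MARGIN = 500      # flag parts projected to hit the limit within ~500 flight hours

def flag_for_maintenance(part):
    """Extrapolate a constant wear rate and estimate flight hours remaining."""
    rate = part["wear_mm"] / part["hours"]          # wear per flight hour so far
    hours_left = (WEAR_LIMIT_MM - part["wear_mm"]) / rate
    return hours_left < HOURS_MARGIN

# Parts to stock and schedule for repair before failure.
to_stock = [p["part_id"] for p in parts if flag_for_maintenance(p)]
print(to_stock)  # → ['B2']
```

A real system would replace the linear extrapolation with a trained model over many sensor features, but the operational payoff is the same: knowing which part to stock before the plane is grounded.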
The Los Angeles City Fire Department recently joined us for a panel discussion on the use of big data. They shared how they are using AI for smart routing and optimal placement of emergency resources. Ambulances, fire engines, other vehicles, and emergency response staff all have different capabilities and skill sets. When an emergency occurs, the department needs to get the right resources as close to the incident as possible, quickly and efficiently. Machine learning can help predict where to deploy emergency response resources based on historical patterns to ensure the fastest possible response time.
There are many examples of AI being applied to public transit. The US Department of Transportation’s SWIM project for the FAA uses advanced analytics to optimize airplane routes for efficiency. At the local level, machine learning is being applied to real-time data feeds from public transit vehicles to improve city bus routes and aid in road planning. By predicting traffic patterns and accident hot spots, transit agencies can optimize their transit ecosystems to better meet citizens’ needs.
Being successful with AI
With all these great examples, the question is not whether to invest in AI but how to do it successfully. It starts with federating all your data. AI algorithms are only as good as the data they are fed, and with data assets sitting in siloed source systems and data warehouses, agencies need to consider a new approach to enabling analytics at scale.
The good news: powerful cloud-based solutions give agencies the tooling and scale they need to bring together vast amounts of structured and unstructured data, process that data efficiently in the cloud, and analyze it with powerful analytics and AI tools. By bringing your data together in one place with the scale of the cloud, you can answer critical questions not in days or weeks but in a matter of minutes, dramatically improving the success of your mission.
But getting the data together is only half the problem. Agencies need to consider how to best enable diverse mission teams to collaborate on these data sets. Databricks helps agencies overcome these challenges with a collaborative environment for analytics and AI that allows data engineers to prepare data, analysts and data scientists to build analytics and machine learning models, and functional teams to review findings through published dashboards, all within one platform.
Whether or not Databricks is the solution your agency uses, success with AI hinges on adopting a unified approach to data analytics and AI: one that brings together diverse sets of data, best-of-breed analytics tools, and people from across the agency to drive value from these critical data assets.
Get Started on Your AI Journey
- Join our Industry Leadership Forum to hear from the Air Force, US Army, FBI, and VA on how they are building data-driven organizations
- Download our Comprehensive Guide to Machine Learning for the Public Sector
- Contact us to start a POC or get started with a free trial