Big Data

Data for a Difference


Today’s data-rich landscape provides unprecedented opportunities for government to become more responsive and productive. It would seem like a no-brainer for agencies to jump headlong into adopting data analytics to inform their decision-making. Many agencies stall, however, when it comes to implementing such programs, often due to a common perception that big data is simply too big for users to tackle.


Unfortunately, this perception is rooted in experience. Legacy analytics tools are often nonintuitive, making it hard for the average government employee to use data to identify issues, take corrective action and, ultimately, improve services. Typically, only a skilled data analyst could make effective use of these tools. Based on those past experiences, some agencies believe they must hire teams of advanced data scientists to manage their big data programs and platforms, which simply isn’t feasible given today’s limited budgets and workforce gaps.

Fortunately, that army of data specialists is unnecessary. With the right analytics tools in place, agencies can use the resources and staff they already have to build an efficient, data-driven culture, empowering decision-making without guesswork or specialized expertise, and in a fraction of the time.

The key to success with data analytics is deploying a solution that is user-friendly for all employees, not just those with advanced data-analysis backgrounds. The tool of choice is in-memory processing: by bringing together data from disparate systems and providing visibility across platforms, it can help anyone derive value from their data.
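To make that concrete, here is a minimal sketch of the pattern an in-memory tool automates: pulling exports from two separate systems into memory and joining them on the spot. The file names and columns are hypothetical, and pandas stands in here for whatever in-memory engine an agency adopts.

    import pandas as pd

    # Hypothetical exports from two disparate systems: a case-management
    # system and a call-center log. File names and columns are illustrative.
    cases = pd.read_csv("case_management_export.csv")  # case_id, office, days_open
    calls = pd.read_csv("call_center_export.csv")      # case_id, call_count

    # With both datasets held in memory, joining them is a single step,
    # with no overnight batch job between the question and the answer.
    merged = cases.merge(calls, on="case_id", how="left")

    # Cross-platform visibility: average days open and call volume per office.
    summary = merged.groupby("office").agg(
        avg_days_open=("days_open", "mean"),
        total_calls=("call_count", "sum"),
    )
    print(summary.sort_values("avg_days_open", ascending=False))

The point is not the library but the workflow: once the data lives in memory, an ordinary employee can ask cross-system questions interactively instead of waiting on a specialist.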

Big Data in Law Enforcement

Data’s power to inform decision-making by employees, even those without advanced data-analytics training, is evident in the field of law enforcement. Police departments across the country increasingly rely on data to decide, for example, when and where to deploy officers on the ground. When adding more “boots on the ground” just isn’t an option, placing real-time data in officers’ hands has gone a long way toward managing crime with limited resources.

In Santa Cruz, Calif., the police department adopted a “predictive policing” program to increase the time officers have for proactive policing by cutting out the lag between receiving radio calls and reaching the incident site. Using the advanced probabilistic algorithms in the department’s analytics program, officers in Santa Cruz receive real-time crime data, broken down into four crime categories across multiple 500-by-500-foot zones, at the beginning of each shift. This points officers in the right direction and considerably increases the odds that they will be “in the right place at the right time” to prevent crime or quickly apprehend offenders.
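The department’s actual algorithms are proprietary, but the underlying idea, scoring small grid zones by recent incident history so officers know where to spend uncommitted patrol time, can be sketched in a few lines. Everything below, from the coordinates to the ranking rule, is a simplified, hypothetical illustration.

    from collections import Counter

    # Toy illustration only: the department's actual predictive-policing
    # algorithms are proprietary and far more sophisticated. This simply
    # ranks grid zones by recent incident counts within each crime category.
    ZONE_SIZE_FT = 500  # each zone is a 500-by-500-foot square

    def zone_for(x_ft, y_ft):
        """Map a coordinate (in feet) to its grid-zone index."""
        return (int(x_ft // ZONE_SIZE_FT), int(y_ft // ZONE_SIZE_FT))

    # Hypothetical recent incidents: (x_ft, y_ft, category).
    incidents = [
        (120, 980, "burglary"),
        (450, 700, "burglary"),
        (460, 710, "vehicle theft"),
        (1800, 300, "burglary"),
    ]

    counts = Counter((zone_for(x, y), cat) for x, y, cat in incidents)

    # Zones to highlight in the shift briefing, ranked by recent activity.
    for (zone, cat), n in counts.most_common(3):
        print(f"zone {zone}: {n} recent {cat} incident(s)")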

Santa Cruz Police Deputy Chief Steve Clark explained the department’s move toward utilizing more sophisticated data analytics, stating, “Time is a zero-sum game. I only have the number of officers, times the number of hours that they’re working, to address crime issues in the city…I couldn’t afford more officers, so I had to get smarter about how we were going to deal with our limited time resources.”

When Santa Cruz police officers gained access to aggregated data and were trained to use it in their daily operations, the police department was better able to serve its mission, and the public benefited from improved security.

Big Data to Fight Drug Addiction

The nationwide fight against the opioid addiction epidemic further illustrates data’s power to guide agencies’ policies and strategies for serving the public. A 2016 National Governors Association report revealed that 78 people die every day from an opioid-related drug overdose. Contrary to a common misconception, these deaths are not attributable only to the street drug heroin. Increasingly, many overdose deaths stem from misuse and abuse of painkillers that healthcare providers can legally prescribe for legitimate purposes.

Just as in law enforcement, real-time, aggregated data is at the core of state and local strategies for combating the opioid crisis. Programs that use anonymized patient data, for example, are helping to build profiles that identify the most at-risk patients so providers can intervene early, and Prescription Drug Monitoring Programs (PDMPs) have become indispensable analytical tools in many states for tracking overprescribing, “doctor shopping” and other critical trends.

Dave Hopkins, with the Kentucky Office of the Inspector General, said, “State PDMPs are a very valuable prevention tool. If we can get the information to our prescribers and pharmacists, and they can identify the patient that is at risk earlier on, there is a much better chance of getting them into treatment.”
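As a rough illustration of the kind of screen a PDMP makes possible, consider the sketch below. The records, columns and thresholds are entirely hypothetical; real programs draw on far richer data and state-specific criteria.

    import pandas as pd

    # Entirely hypothetical, de-identified PDMP extract: one row per
    # dispensed controlled-substance prescription in a review window.
    records = pd.DataFrame({
        "patient_id":    ["p1", "p1", "p1", "p1", "p2"],
        "prescriber_id": ["d1", "d2", "d3", "d4", "d1"],
        "pharmacy_id":   ["ph1", "ph2", "ph3", "ph3", "ph1"],
    })

    profile = records.groupby("patient_id").agg(
        prescribers=("prescriber_id", "nunique"),
        pharmacies=("pharmacy_id", "nunique"),
    )

    # A simple "doctor shopping" screen: prescriptions from many distinct
    # prescribers and pharmacies in the same window warrant a closer look.
    # The thresholds are illustrative, not any state's actual criteria.
    flagged = profile[(profile["prescribers"] >= 4) & (profile["pharmacies"] >= 3)]
    print(flagged)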


When agencies take an “analytics for everybody” approach, they fill critical data-skills shortages and empower themselves to run analytics on their own. Agencies no longer need to focus hiring on people with existing data-analysis skills, because regular employees can master the process themselves. With a bit of training and a user-friendly platform that delivers real-time data across the agency, data analysis becomes accessible to everyone. If government can shift its perception of what data is and who can use it, and acquire modern reporting tools that replace cumbersome legacy applications, the possibilities for making decisions based on solid evidence are endless.
