Big Data IT Infrastructure Challenges for Federal Agencies


Analytics is an increasingly important way for agencies to use data to achieve goals and serve constituents. The ability to gain actionable insights from ever-growing data sets not only helps us understand trends and problems we couldn't see before, but also opens the door to new products and services that solve real dilemmas.

The Federal Government not only deals with big data, it has a big responsibility to use it to serve the public. Federal agencies, however, face significant challenges in supporting data analytics workflows. Jessica Davis examined two of those challenges, staffing and IT infrastructure, in her October 2015 InformationWeek Government article, Big Data Poses Challenges for Federal Agencies.

Why Big Data Matters to Federal Agencies

One of the primary reasons for federal agencies to use big data analytics is to reduce operational costs: analytics can help identify errors and determine how to prevent them in the future. Another reason is security. According to Davis’s article, “55% of agencies are using big data to improve their IT security.” One example cited is the use of analytics to identify threats and spot inconsistencies in machine data, reducing the risk of costly and dangerous attacks and in some cases preventing them outright.

Not all agencies are at the same stage with data analytics processes and workflows, but the challenges they face are consistent. Staffing is an important issue, yet IT infrastructure shortfalls for managing big data were identified as the larger problem and, fortunately, are perhaps more easily addressed.

Big Data Analytics IT Infrastructure Challenges

Reporting on the results of a Unisys survey, Davis conveyed the scale of the infrastructure hurdle:

But there were a few other concerns that looked even bigger to agencies with existing big data projects. Of them, 73% said they were concerned about the strain the projects put on their existing IT storage, computer, and networking infrastructures. These executives were worried that getting ready for these projects would mean a long and expensive infrastructure refresh. Civilian agencies were more concerned than defense agencies about this issue.

The conclusion that infrastructure overhauls and complete refreshes are needed to support big data analytics is not always accurate. In many cases, the better strategy is to align with Federal cloud initiatives rather than expand existing ways of managing the environment; refreshes can often be avoided and expansions of owned resources minimized.

In an upcoming webinar, we’ll look at how adding a cloud-enabled caching layer can drastically improve storage performance, reduce network traffic, and put both cloud compute and cloud storage within easy reach. Simple modifications to the existing environment can make virtually unlimited compute resources available to data scientists and data analysts without disruption and without moving data sets off existing NAS.
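To make the caching-layer idea concrete, here is a minimal sketch of a read-through cache sitting in front of an NFS-mounted NAS: hot files are served from fast local storage, and cold files are copied over the network only on first access. The paths and function names are hypothetical placeholders for illustration only; this is a conceptual example, not Avere's product or API.

```python
import os
import shutil

# Hypothetical mount points: an existing NAS share and a fast local cache tier.
NAS_MOUNT = "/mnt/nas"          # existing NAS, mounted over NFS
CACHE_DIR = "/mnt/fast-cache"   # local SSD (or cloud-adjacent) cache tier


def cached_open(relative_path, mode="rb"):
    """Open a file, pulling it into the cache tier on first access."""
    cache_path = os.path.join(CACHE_DIR, relative_path)
    if not os.path.exists(cache_path):
        nas_path = os.path.join(NAS_MOUNT, relative_path)
        os.makedirs(os.path.dirname(cache_path), exist_ok=True)
        shutil.copy2(nas_path, cache_path)   # one-time copy over the network
    return open(cache_path, mode)            # subsequent reads stay local


if __name__ == "__main__":
    # First call copies the data set from the NAS; repeat calls hit the cache.
    with cached_open("projects/analytics/sample.csv") as f:
        print(len(f.read()), "bytes read")
```

The point of the sketch is the access pattern: the data set stays on the NAS, and only the working set travels across the network once, which is how a caching layer can reduce traffic and bring compute closer to the data without a disruptive migration.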

Join us and Avere on June 23, 2016 at 1:00 PM for Modernizing the Government Data Center. In this webinar, you will hear how flexibility is key to extending resources and meeting the needs of constituents and partners so that government services can be delivered quickly and thoroughly. The right tools can be implemented easily, without steep learning curves or additional staff.

Speaker Scott Jeschonek will cover:

  • Why flexibility is important to the modern federal data center and its big data infrastructure
  • How to get immediate performance gains from your existing IT infrastructure
  • How to be ready to add cloud compute and storage resources to meet growing demands
  • How to protect existing resources and minimize resource strain while gaining modern flexibility


This post was originally published by Avere.
