Securing Containerized Applications in Government Agencies

Government organizations, like their private-sector counterparts, are adopting containerized environments at a rapid pace. According to Forrester, 50% of organizations using the cloud across industries will deploy containers by 2022, and agencies from the U.S. Department of Defense to the National Institutes of Health to the U.S. Department of Agriculture have already embraced containers.

There are good reasons for this shift in application development and operations. For development, containers offer advantages over traditional "waterfall" approaches, which organize projects into distinct, linear phases. Containers support agile and DevOps processes, which emphasize automation and collaboration to build applications more iteratively and rapidly.

For operations, containers let you quickly spin up resources to scale compute power to meet new demand. And because containerized applications are typically built on microservices, you don't need to shut down the entire application to resolve an issue with a single component. Instead, you can fix the component while the rest of the application remains functional.

But while containers simplify some aspects of IT, they can complicate others. In particular, containers introduce new cybersecurity challenges. Understanding the unique cyber-risks of containers, along with the tools and strategies for mitigating them, can help you take advantage of the benefits of containers while also keeping them secure.

Containers Are Just One Piece of the Cyber Puzzle

Containers present old and new cyber issues. For starters, container images can contain vulnerabilities. Worse, cybercriminals can design a malicious image to look like a legitimate one. They can then upload it to a public registry such as Docker Hub to trick administrators into deploying the malicious version.
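One common mitigation is to pin images to a vetted digest rather than a mutable tag and to verify that digest before anything is deployed. The minimal sketch below assumes the Docker SDK for Python (`pip install docker`) and a hypothetical `TRUSTED_DIGESTS` allowlist maintained by your security team; it illustrates the idea rather than a complete supply-chain control.

```python
import docker

# Hypothetical allowlist of image digests vetted by your security team.
TRUSTED_DIGESTS = {
    "python": "sha256:0000000000000000000000000000000000000000000000000000000000000000",
}


def pull_if_trusted(repository: str, tag: str = "latest") -> bool:
    """Pull an image and confirm its registry digest matches the allowlist."""
    client = docker.from_env()
    image = client.images.pull(repository, tag=tag)
    # RepoDigests entries look like "python@sha256:<64 hex chars>"
    digests = {d.split("@", 1)[1] for d in image.attrs.get("RepoDigests", [])}
    if TRUSTED_DIGESTS.get(repository) in digests:
        return True
    # Remove the untrusted image so it can't be deployed by mistake.
    client.images.remove(image.id, force=True)
    return False


if __name__ == "__main__":
    print(pull_if_trusted("python", "3.12-slim"))
```

In practice, the same digest pinning is usually enforced in deployment manifests and admission controls rather than in ad hoc scripts, but the check is the same: deploy only what matches a known-good digest.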

Microservices also introduce cyber-risk, because the more microservices you use, the more components communicate with one another. Your agency might run microservices across both on-premises and multi-cloud environments, placing compute in Microsoft® Azure, say, and storage in AWS, and those components need to be tied together in a secure fashion. And if you fix a problem component and redeploy it, the redeployment needs to be based on a secure, up-to-date snapshot.

With containers, infrastructure monitoring becomes more challenging. Containers call for specialized monitoring tools that provide insight into more than the containers themselves; you also need to monitor the rest of your system and network components in the context of those containers. For example, if an application stops working, you need a way to identify the source of the problem quickly and easily, whether it's an application component, the container, or the server or network.

You can address some of these issues with tried-and-true approaches such as vulnerability scans. A security information and event management (SIEM) system can also collect relevant data, such as log files, in a central repository for analysis.
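As a minimal sketch of that central-collection idea, the script below gathers recent logs from each running container and forwards them to a collector endpoint. It assumes the Docker SDK for Python; the `COLLECTOR_URL` endpoint and the JSON payload shape are hypothetical stand-ins for whatever your SIEM actually ingests.

```python
import json
import time
import urllib.request

import docker

# Hypothetical endpoint for your central log/SIEM collector.
COLLECTOR_URL = "https://siem.example.agency/ingest"


def ship_recent_container_logs(window_seconds: int = 300) -> None:
    """Collect the last few minutes of logs from every running container
    and forward them to a central collector for analysis."""
    client = docker.from_env()
    since = int(time.time()) - window_seconds
    for container in client.containers.list():
        log_bytes = container.logs(since=since, timestamps=True)
        payload = json.dumps({
            "container": container.name,
            "image": container.image.tags,
            "logs": log_bytes.decode("utf-8", errors="replace"),
        }).encode("utf-8")
        request = urllib.request.Request(
            COLLECTOR_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(request)


if __name__ == "__main__":
    ship_recent_container_logs()
```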

If your agency is implementing a zero-trust framework, you'll need additional security for your containers, and technology providers are beginning to respond. Container platforms such as Docker and Kubernetes offer greater visibility, further enhancing security. And third-party providers use security analysis tools to proactively look for vulnerabilities in containers as they're being deployed.

Service mesh technologies, which control how application components share data with one another, are also gaining maturity. Software-defined wide area networks (SD-WANs) enable encrypted communications across environments. They let you specify, for example, that certain containers can talk only to certain other containers, or that communication can be one-way but not two-way.
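As a toy illustration of the kind of directional rule these technologies enforce, the Python sketch below checks a hypothetical one-way allowlist before permitting a call between two services. It is a conceptual stand-in, not a service mesh or SD-WAN implementation.

```python
# Hypothetical one-way communication allowlist: (source, destination) pairs.
# ("web", "api") being allowed does NOT imply ("api", "web") is allowed.
ALLOWED_FLOWS = {
    ("web", "api"),
    ("api", "database"),
}


def is_allowed(source: str, destination: str) -> bool:
    """Return True only if traffic from source to destination is explicitly allowed."""
    return (source, destination) in ALLOWED_FLOWS


if __name__ == "__main__":
    print(is_allowed("web", "api"))       # True: one-way flow is permitted
    print(is_allowed("api", "web"))       # False: reverse direction is not
    print(is_allowed("web", "database"))  # False: no direct path is allowed
```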

An infrastructure monitoring and management platform can help you administer and secure your containers. A single pane of glass for managing both on-premises and multi-cloud environments can simplify the security complexity inherent to containers. An effective platform should enable you to do the following (a minimal container-metrics sketch follows the list):

  • Track details such as hosts, host clusters, environment dependencies, and deployments
  • Review metrics for containers, hosts, and other infrastructure elements
  • Analyze container activity in an application-stack management tool
  • Organize containers in a mapping tool for managing the physical and logical relationships among infrastructure entities
  • Display detailed data about individual containers on a single screen
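As a minimal illustration of the per-container data such a platform aggregates, the sketch below uses the Docker SDK for Python to take a one-shot CPU and memory snapshot of every running container. The field names follow the Docker stats API; how you store, visualize, or alert on the values is left to your monitoring platform.

```python
import docker


def snapshot_container_metrics() -> list[dict]:
    """Return a one-shot CPU/memory snapshot for every running container."""
    client = docker.from_env()
    snapshot = []
    for container in client.containers.list():
        stats = container.stats(stream=False)  # single sample, not a stream
        mem = stats.get("memory_stats", {})
        cpu = stats.get("cpu_stats", {}).get("cpu_usage", {})
        snapshot.append({
            "name": container.name,
            "status": container.status,
            "memory_usage_bytes": mem.get("usage"),
            "memory_limit_bytes": mem.get("limit"),
            "cpu_total_usage": cpu.get("total_usage"),
        })
    return snapshot


if __name__ == "__main__":
    for entry in snapshot_container_metrics():
        print(entry)
```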

Technology is Just One Piece of the Cyber Solution

But technology is only part of the solution—people are the rest.

IT functions are often structured with teams for DevOps, which handles application development and operations, and SecOps, which handles cybersecurity and operations. Part of the goal behind DevSecOps (development, security, and operations) is to bring together the brainpower of both teams. Over time, your agency should develop a technical talent pool with diverse expertise and experience to help cover all your cybersecurity bases.

In March 2021, FedRAMP released vulnerability-scanning requirements for containers. A key intention of the requirements is to promote knowledge of best practices for the safe use of clouds. Training in best security practices around containers is as essential for your developers and software engineers as it is for your security pros.

Also, look to your IT providers for their input and expertise. They should be happy to share their knowledge and experience to ensure you get the most from their cloud- and container-focused technologies and know how to implement them securely.

Your organization should meet the FedRAMP requirements for container security, but keep in mind the guidance doesn’t cover every detail necessary to ensure strong security for your containerized environments. After all, cyber vulnerabilities, cyber threats, and your unique cyber risks will constantly evolve. You need continuous monitoring, ongoing analysis, and continuing education for your IT team. And it’s just as important to document processes for extending a cyber-safe culture throughout your organization as you deploy more containers.

 

Visit our website for information on the SolarWinds® Server & Application Monitor solution and how it can help you monitor your containerized applications.

Modernizing Cybersecurity & MultiCloud Services with TMF

What is TMF?

The Technology Modernization Fund (TMF) is a funding vehicle created as part of the Modernizing Government Technology (MGT) Act of 2017 to help agencies accelerate the completion of IT modernization projects. Funding is provided as a loan, repaid under the terms of each project agreement. To guide technology modernization, the Technology Modernization Board (TMB) prioritizes projects that engage several agencies at once, address security gaps, and improve the public's access to services. The TMB is responsible for eighteen different projects across ten federal agencies, seven of which were awarded American Rescue Plan (ARP) funding to address urgent IT modernization challenges.

How it Works

The fund is overseen by the TMB, which is composed of government IT officials and cybersecurity experts with expertise in technology, transformation, and operations. The board reviews IT-related project proposals submitted by government agencies to determine which projects deserve funding, and how much. Priority is given to proposals that meet certain criteria [1], such as improving security, increasing operational efficiency, and adopting scalable technology. Since its inception, the TMF has loaned hundreds of millions of dollars to agencies and programs such as the Foreign Labor Application Gateway and Farmers.gov. Technology modernization proposals are sent to the board through a two-phase approval process.

The first phase is the Initial Project Proposal (IPP). IPPs act as a low-burden prescreening for both agencies and the Board. Agencies submit a rough outline of their project, and only approved, unique projects advance to the Board for full review. In the proposal, agencies must describe a general project plan and state whether project funding has been explicitly denied or restricted by Congress.

The second phase, the Full Project Proposal (FPP), is submitted directly to the Board. FPPs must include a comprehensive description of the proposal, project milestones, and a funding schedule. Agencies should also have a pitch presentation prepared for the Board.

Once funded, TMF projects are reviewed quarterly by the Board to ensure milestones and schedules are met. Corrective action is implemented when necessary to help agencies remain on track, and technical experts are available to support teams, improve capabilities, and troubleshoot issues.

TMF’s Importance Today

The TMF process is helpful because it gives agencies greater flexibility and funds technology modernization efforts with repayment options and payback terms of up to five years. Across the board, government agencies have accelerated modernization efforts because they no longer need to wait for funding; agencies can now act and receive funding as a project progresses. Oftentimes, offices that are overlooked by either the government or their own agency go through the TMF to gain adequate funding for important projects. This financing model shows that accountability and oversight make a difference: it allows agencies to deliver new capabilities in a timely manner in a rapidly changing environment. Without TMF loans, these capabilities, deliveries, and improvements would not be possible.

Modernizing Zero Trust

One of the eighteen funded projects allows the U.S. General Services Administration (GSA) to modernize legacy network systems and implement an advanced zero trust architecture [2]. Through technology modernization funds, the GSA will advance zero trust architecture by improving its zero trust building blocks. First, the GSA will replace directory designs to meet the new expectations of hybrid cloud architecture; these updates will apply across multiple domains and multiple clouds. Second, it will develop modernized enterprise single sign-on that includes multi-factor authentication, improving security through a micro-segmented authentication system that adheres to a zero trust strategy. Last, the GSA will add artificial intelligence and machine learning driven algorithms to help detect threats to its systems. Together, these measures will help protect government clients' sensitive information from bad actors.

Data Modernization

Another project goal is to modernize the Department of Labor (DOL)'s enterprise data management and analytics capabilities. The aim is to improve the availability and analytic value of data for developers, journalists, researchers, and other federal agencies. Currently, the DOL faces issues with data consistency, quality, and availability. Proposed improvements include incorporating predictive analytics software to strengthen the DOL IT department's reporting capabilities and implementing data management capabilities to support application programming interfaces (APIs) that would share data with both the DOL and the public. These efforts could yield cost savings, increased efficiency, and improved services [2].

TMF and Multicloud Services

One area where the TMF has provided funding is cloud-based security enhancements. So far, awards include funding for:

  • The United States Department of Agriculture (USDA), to complete migration to the cloud for all applications ($500 thousand)
  • The Department of Energy (DOE), to migrate enterprise email to the cloud ($3.7 million)
  • The U.S. Department of Housing and Urban Development (HUD), to move critical business systems from on-premises databases to the cloud ($13.8 million) [3]

Single-cloud services offer limited control and less flexibility. By combining two or more public clouds, private clouds, or a mix of both, an agency gains better control and oversight of its cloud environment, which helps protect customers' sensitive information. This is especially important given that 93% of businesses are moving to multi-cloud architecture [4].

The Future of TMF

With the Technology Modernization Fund, government agencies are able to improve their cybersecurity, increase their data management capabilities, and better support the public they were created to serve. The TMF provides a streamlined process for obtaining project funding. Because of the TMF, security improvements such as multi-factor authentication, APIs, zero trust, and segmentation have been implemented across the federal government. Government agencies are now better able to serve customers by keeping their information secure and meeting their constituents' and employees' needs in a modernized, efficient, and scalable manner.

 

View Adobe’s Experience Cloud Demo page for more insights on the Technology Modernization Fund and cybersecurity.

 

[1] “Awarded Projects,” The Technology Modernization Fund. https://tmf.cio.gov/projects/

[2] Miller, Jason. “Special Report: Benefits of Technology Modernization Fund Validated,” Federal News Network. https://federalnewsnetwork.com/reporters-notebook-jason-miller/2020/11/special-report-benefits-of-technology-modernization-fund-validated/?_sm_au_=iVVnDfDJW3W3ZHZskN1JRKsp6MH81

[3] Wiggins, Don. “Advance Your Government Mission with Secure Hybrid Multicloud,” Equinix. https://blog.equinix.com/blog/2021/02/16/advance-your-government-mission-with-secure-hybrid-multicloud/

[4] Parmar, Dipti. “Why Organizations Need a Multicloud Strategy and How to Create One,” The Forecast by Nutanix. https://www.nutanix.com/theforecastbynutanix/technology/why-organizations-need-multicloud-strategy