The Ongoing Quest for Cybersecurity


Government agencies were already under pressure to modernize their cybersecurity strategies before the pandemic hit, and as workplaces closed and government employees struggled to access data and systems from makeshift home offices, the risks grew. U.S. use of virtual private networks rose 124% in the two weeks from March 8 to March 22, 2020, tracking the early spike in COVID-19 cases, according to Statista. Around the same time, the Cybersecurity and Infrastructure Security Agency (CISA) issued an alert titled “Enterprise VPN Security,” which offered both warnings and guidance on handling the surge in usage.

With so many employees logging in remotely, agencies had to shift their focus from securing a well-defined perimeter to securing the data that fuels government operations. In a recent survey of FCW readers, protecting data topped the list of cybersecurity priorities, cited by 75% of respondents. In response to such concerns, CISA released its Ransomware Guide in September 2020. The National Security Agency followed with a paper titled “Embracing a Zero Trust Security Model,” and a few months later, in May 2021, President Joe Biden mandated that agencies adopt zero trust in his Executive Order on Improving the Nation’s Cybersecurity. Read the latest insights from industry thought leaders in Carahsoft’s Innovation in Government® report on cybersecurity.


The Future of Cybersecurity is Autonomous

“Analysts have too much atomic data and not enough context about that data. When they don’t have the full picture, they can’t take appropriate action. Re-creating each attack by hand takes painstaking care. And though analysts often relish this challenge, there’s simply not the time to do so for every presented case. Forward-thinking organizations are using artificial intelligence/machine learning (AI/ML) capabilities to fortify user endpoints and server workloads across an array of operating systems. These automations are designed to monitor the growing number of attack vectors in real time and present the full context of an attack in an easy-to-understand view that’s modeled after a kill chain.”

Read more insights from SentinelOne’s COO, Nick Warner.


Tailoring Zero Trust to Individual Users

“Zero trust is an important construct for helping agencies protect their infrastructure in today’s cybersecurity landscape. It focuses on accrediting individuals and their access to government resources. Agencies should make those decisions about access based on a comprehensive understanding of users. Security policies that treat all users as equally risky can be restrictive. Such policies set the bar high and hamper employees’ ability to work, or they set the bar low, which defeats the purpose of having security. Instead, agencies should evaluate users on an individual basis by taking the time to understand what employees do and how they do it — what’s normal behavior and what’s not. Then they can assess the risk of an individual based on that context.”

Read more insights from Forcepoint’s President of Global Governments and Critical Infrastructure, Sean Berg.


Modernizing Security for a Mobile Workforce

“Securing data and apps begins with positively identifying the user. In government, agencies have used multifactor authentication and all kinds of certificates, but those are simple pass/fail security checks. Once users are allowed to cross the security barrier, they often have wide-ranging access to government resources. This means adversaries and malicious (or careless) insiders passing the security checks receive free rein as well. Government needs to move to a continuous authentication model, which leads to better security and a better user experience. It involves seamlessly authenticating users every step of the way — when they touch the keyboard or scroll through an app on a screen. That activity, down to the microscopic vibrations in a person’s fingertip, can be sensed and understood so that IT administrators can answer the question: Is this really the authenticated user, or is it somebody else?”

Read more insights from BlackBerry’s Chief Evangelist, Brian Robison.


The Dangers that Lurk in Mobile Apps

“Government employees are increasingly reliant on mobile applications to do their jobs. But without formal monitoring programs in place, agencies might be unaware of the risks inherent in commercial and government-built apps. As a result, few agencies are investing resources and time to address a serious problem. The average mobile device has 60 to 80 apps, representing a huge potential for vulnerabilities at agencies whose employees are using those devices for work. Thousands of apps could be tracking employees or intercepting data. NowSecure founder Andrew Hoog has said mobile apps are the ultimate surveillance tool, given the mix of personal and mission activities in one space.”

Read more insights from NowSecure’s Chief Mobility Officer, Brian Reed.


Why Data is a Critical Cybersecurity Tool

“Once agencies have gathered their data in a scalable, flexible platform, they can apply artificial intelligence to derive insights from the data. AI speeds analysis and is particularly effective when agencies move from signature-based to behavior-based threat detection. A signature-based approach is good for detecting threats we already know about, but a behavior-based AI approach can adapt to new threats by looking for anomalies such as changes in the behavior of a server or endpoint device. AI also helps with investigations by reconstructing the sequence of events that happened during an intrusion, which fuels agencies’ ability to prevent future attacks. With AI, agencies can start to apply more sophisticated algorithms in their hunt for vulnerabilities and cyber threats.”

Read more insights from Cloudera’s Principal Solutions Engineer and Cybersecurity SME Lead, Carolyn Duby.
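The shift Duby describes from signature-based to behavior-based detection can be illustrated with a toy statistical baseline. This is only a minimal sketch of the idea — the metrics, sample values and three-sigma threshold below are invented for illustration, not Cloudera's actual models:

```python
from statistics import mean, stdev

def build_baseline(samples):
    """Learn a simple per-metric baseline (mean and standard deviation)
    from historical observations, e.g. requests per minute on a server."""
    return mean(samples), stdev(samples)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag a new observation whose z-score exceeds the threshold --
    a crude stand-in for the behavioral models described in the quote."""
    mu, sigma = baseline
    if sigma == 0:
        return value != mu
    return abs(value - mu) / sigma > threshold

# Normal traffic hovers around 100 requests/minute...
history = [98, 102, 97, 101, 99, 103, 100, 96, 104, 100]
baseline = build_baseline(history)

# ...so a sudden surge stands out even with no known signature to match.
print(is_anomalous(101, baseline))  # in-range reading -> False
print(is_anomalous(500, baseline))  # sudden surge -> True
```

A real deployment would model many signals per entity and learn thresholds, but the contrast with a static signature list is the same: the detector flags deviation from learned behavior rather than matching known-bad patterns.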


Zero Trust Data Management Foils Ransomware Attacks

“Agencies must ensure recoverability because none of these protections matter if they can’t recover data and systems that run their critical missions and operations. Agencies need to gather and protect data at the edges of their networks, in their data centers and across different clouds. And regardless of where agencies decide to store that data, they need to be able to access it instantly. Recoverability service-level agreements of minutes and hours are possible and delivered today across the whole of government and the Defense Department. Gone are the days of weeks and months to get back online.”

Read more insights from Rubrik’s Public-Sector CTO, Jeffrey Phelan.


Reclaiming Control over Complex IT Environments

“When employees were sitting in a government office behind a firewall, IT administrators had a clearly defined perimeter to protect. Now IT administrators are still focused on protecting the agency’s mission and assets, but the responsibility has become more difficult because they’ve lost some visibility and control over the infrastructure. In response, many organizations are moving toward strategies based on zero trust, which requires validating users and devices before they connect to government systems, or least privilege, which involves only giving employees access to the resources and applications they need to perform their jobs. Zero trust and least privilege require continuous monitoring and a risk-based approach to adding or removing authorizations.”

Read more insights from SolarWinds’s Group Vice President of Product, Brandon Shopp.
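The least-privilege model described above boils down to deny-by-default authorization: access exists only where it has been explicitly granted. A minimal sketch, with hypothetical roles, resources and grants (none of these names come from an actual agency policy):

```python
# Hypothetical least-privilege policy: every role, resource and grant
# here is illustrative only.
GRANTS = {
    "analyst": {"case-db:read", "report-portal:read"},
    "admin":   {"case-db:read", "case-db:write", "user-mgmt:write"},
}

def is_allowed(role, resource, action):
    """Deny by default; permit only explicitly granted resource:action pairs."""
    return f"{resource}:{action}" in GRANTS.get(role, set())

print(is_allowed("analyst", "case-db", "read"))   # explicitly granted -> True
print(is_allowed("analyst", "case-db", "write"))  # never granted -> False
```

The continuous-monitoring piece of the quote corresponds to revisiting the `GRANTS` table over time — adding or removing entries as risk assessments change — rather than treating authorization as a one-time decision.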


The Role of Authentication in Data Protection

“Users who need to access low-risk applications and data — for example, publicly available product information — can use an authentication method such as one-time password tokens. But if that same user wants to access higher-value data such as corporate finance records, the required level of authentication should increase, perhaps requiring public-key infrastructure (PKI) authentication with a smartcard. The key is to manage those activities via one pane of glass or one platform that supports the entire risk-based and continuous authentication process. In the past, we’ve been able to base decisions on where users are located — for example, whether they’re accessing data from within the network or remotely via VPN — but that is no longer enough. New technology tools enable agencies to gain a deeper understanding of users’ online behavior so they can make more informed decisions about authentication.”

Read more insights from Thales TCT’s Vice President of Product Management, Bill Becker.
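The risk-based, step-up flow Becker outlines — weaker factors for low-risk data, PKI smartcards for high-value data — can be sketched as a small policy function. The tier names, risk threshold and step-up rule below are assumptions made for illustration, not the behavior of any Thales product:

```python
# Illustrative step-up authentication policy; tiers and thresholds invented.
AUTH_STRENGTH = {"password": 1, "otp_token": 2, "pki_smartcard": 3}

REQUIRED_TIER = {
    "public":    "password",       # e.g. publicly available product info
    "internal":  "otp_token",      # routine internal applications
    "sensitive": "pki_smartcard",  # e.g. corporate finance records
}

def required_method(data_classification, risk_score):
    """Pick the baseline method for the data tier, then step up one
    level when the session's risk score is elevated (> 0.7 here)."""
    method = REQUIRED_TIER[data_classification]
    if risk_score > 0.7 and method != "pki_smartcard":
        ordered = sorted(AUTH_STRENGTH, key=AUTH_STRENGTH.get)
        method = ordered[min(ordered.index(method) + 1, len(ordered) - 1)]
    return method

print(required_method("public", 0.1))     # low risk, low value -> password
print(required_method("sensitive", 0.1))  # high-value data -> pki_smartcard
print(required_method("internal", 0.9))   # risky session steps up a level
```

In a continuous-authentication setting, `risk_score` would be recomputed throughout the session from behavioral signals, so the required method can change mid-session rather than only at login.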


Verification and Validation to Enhance Zero Trust

“Networking teams rely on standard configurations to maintain the security policy. These standard configurations dictate connectivity and traffic flows to ensure users can access appropriate resources while preventing unauthorized access. The idea of a standard configuration seems simple, but maintaining it is extremely difficult. Validating configurations is clearly mission critical, but monitoring and validating network behavior are even more telling and help ensure that policies are not inadvertently being circumvented and that there is no unintended connectivity.”

Read more insights from Forward Networks’s Technical Solutions Architect, Kevin Kuhls.
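At its simplest, the configuration validation Kuhls describes is a diff of each device's running configuration against the standard ("golden") configuration. The sketch below uses invented config keys and device names; real validation would also check reachability and traffic behavior, as the quote notes:

```python
# Minimal sketch of configuration validation: compare each device's
# running config against a golden standard. All names are hypothetical.
GOLDEN = {
    "ssh_version": "2",
    "telnet_enabled": False,
    "mgmt_acl": "10.0.0.0/24",
}

def validate(device_name, running_config):
    """Return a list of drift findings; an empty list means compliant."""
    findings = []
    for key, expected in GOLDEN.items():
        actual = running_config.get(key)
        if actual != expected:
            findings.append(
                f"{device_name}: {key} is {actual!r}, expected {expected!r}"
            )
    return findings

drifted = {"ssh_version": "2", "telnet_enabled": True, "mgmt_acl": "10.0.0.0/24"}
for finding in validate("edge-router-1", drifted):
    print(finding)  # reports the telnet_enabled drift
```

This catches drift in stated configuration; the behavioral validation the quote emphasizes goes further, checking that the network's actual forwarding matches policy even when every device's config looks correct in isolation.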


Extending Zero Trust Down to the File Level

“A software-defined perimeter integrates proven, standards-based security tools to create the ideal foundation for zero trust. When used together, those two approaches give agencies the granularity to customize their security protocols. For example, the IT team could allow USB mice but not USB thumb drives that can store data, and they could block potentially unwanted applications that anti-malware engines might not identify as malicious, such as bitcoin-mining or file-sharing apps. Zero trust is a mindset rather than a specific group of tools. The National Institute of Standards and Technology’s Special Publication 800-207 on zero trust architecture advocates taking a holistic approach to authenticating devices and users and extending that attitude to agency assets, services and workflows.”

Read more insights from OPSWAT’s Senior Director of Government Sales, Michael Hylton.
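The USB example above — allowing mice but blocking thumb drives — amounts to filtering on USB interface class codes, where class 0x03 is HID (mice, keyboards) and class 0x08 is mass storage. A toy allow-list check, not any vendor's actual enforcement mechanism:

```python
# Sketch of a device-class allow list like the USB example in the quote.
# USB interface class 0x03 is HID (mice, keyboards); 0x08 is mass storage.
ALLOWED_USB_CLASSES = {0x03}  # permit input devices only

def admit_usb_device(interface_class):
    """Allow HID peripherals while blocking storage-capable device classes."""
    return interface_class in ALLOWED_USB_CLASSES

print(admit_usb_device(0x03))  # USB mouse -> allowed
print(admit_usb_device(0x08))  # thumb drive -> blocked
```

Real endpoint controls enforce this at the driver or kernel level and also inspect composite devices, but the granularity the quote describes is exactly this: policy decisions keyed to device capability rather than a blanket allow or deny.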


Download the full Innovation in Government® report for more insights from these government cybersecurity leaders and additional industry research from FCW.

Maximizing the Benefits of MultiCloud

The government’s approach to cloud technology has changed dramatically in the years between the 2010 Federal Cloud Computing Strategy, known as Cloud First, and the 2019 Cloud Smart Strategy. The first policy pushed agencies to consider cloud technologies before others, while the second offers actionable advice on how to deploy the technology. Today, 81% of federal agencies use more than one cloud platform, according to a MeriTalk survey. Because of its inherent flexibility and scalability, cloud technology played a key role in agencies’ response to the pandemic and their ability to shift employees to remote work.

Government leaders now recognize that multicloud environments are crucial for ensuring resiliency during a crisis, and the Cloud Smart Strategy explicitly references hybrid and multicloud environments as essential tools for improving mission outcomes and service delivery. Despite those benefits, multicloud environments can present management challenges, such as difficulty migrating mission-critical legacy apps to the cloud or ensuring the interoperability of products and services from multiple vendors. In a recent survey of FCW readers, security was the biggest challenge in managing a cloud ecosystem, cited by 74% of respondents. The Cloud Smart Strategy makes it clear that cloud technology has become indispensable to government agencies, but adopting hybrid and multicloud environments requires thoughtfulness and planning. Read the latest insights from industry thought leaders in Carahsoft’s Innovation in Government® report on multicloud.


Empowering the Government’s Earliest Adopters

“Multicloud environments offer agencies the opportunity to go beyond simply managing data to analyzing it for valuable insights and better decision-making. Cloud technology was created to deal with the exponential increase in data collection and the increasing demands for storage. In other words, cloud was developed to handle big-data challenges. Furthermore, cloud technology offers tremendous opportunities for agencies to off-load some monotonous day-to-day IT management tasks in favor of higher-level activities. If there are only a handful of people in an agency’s IT organization, they could spend all their time creating new storage clusters and provisioning that storage as data collection increases. If an agency can leverage the automation that comes with cloud to store and replicate data and then make sure that data is backed up and protected, the agency can enable those individuals to focus on true data analysis, data science and data discovery.”

Read more insights from Google’s Cloud Engineering Manager, Sean Maday.


Rethinking Legacy App Migration and Software Factories

“Many government agencies have started to build software factories to reduce security risks and greatly improve the innovation cycle. If not implemented well, however, they can increase security risks, especially when each program or project builds its own software factory. Instead of creating more software factories, agencies should move toward centralizing software build environments and rationalizing duplicative processes that can be used for both legacy and modern application development teams regardless of their development methodology. They should strive to standardize all tooling for agile/DevSecOps, create enterprise services that support development teams, and establish policies that monitor for insider threat and eliminate risks during software development.”

Read more insights from MFGS’s Public-Sector CTO, David Wray, and CTO for Alliances and Partners, Kevin Hansen.


Developing a Long-Term Vision for MultiCloud

“A multicloud approach can be a double-edged sword, with benefits and risks. When agencies have access to a cloud environment, it’s easy for them to spin up new compute resources or storage solutions. But this flexibility opens up risks in terms of performance and security. Even when an agency is working with public cloud service providers, it’s the agency’s responsibility to make sure its resources are configured properly. Many data leakage incidents in the cloud are the result of a configuration issue. Furthermore, in a multicloud environment, technologies are created independently of one another and won’t always work well together. Agencies must make sure they have the appropriate visibility across multicloud environments and on-premises systems so they can understand and manage all aspects of their IT systems. This includes controlling costs and decommissioning purpose-built cloud resources when they are no longer needed.”

Read more insights from SolarWinds’s Group Vice President for Product, Brandon Shopp.


Taking a Fresh Look at Cloud’s Potential

“Agencies need to understand the business goals for a particular cloud-based application or workload and then make decisions about the best architectural approach. They also need a comprehensive security model that’s architecturally coherent from a deployment and operations perspective. The model should take into consideration the entire life cycle of applications as agencies modernize into the cloud. By combining the security and compliance aspects of modernization with a coherent IT architecture, agencies can drive down costs for managing those applications in the cloud. The cost savings can allow agencies to fund further modernization efforts or conduct research and development activities around core workloads or advanced capabilities such as artificial intelligence.”

Read more insights from Microsoft Federal’s CTO, Jason Payne.


How Cloud Storage Enables Innovation

“In the early days, cloud storage was designed to be ‘cheap and deep’ — a place to inexpensively store data without worrying about capacity. At the time, cloud could not compete with on-premises storage in terms of access speeds. Thanks to technological advances in the past several years, however, data is as quickly accessible and available in the cloud as it is via on-premises systems. As a result, the number of applications that are eligible for cloud storage has increased dramatically, and cloud has become a primary storage option for enterprises. Beyond backing up data, agencies can use live applications in the cloud for video surveillance or active archiving, for example.”

Read more insights from Wasabi Technologies’ Senior Director of Product Marketing, David Boland.


Raising MultiCloud Management to the Next Level

“Many agencies are using cloud the way they used non-cloud data centers 15 or 20 years ago. But instead of customizing their cloud environments, they should use tools like Terraform, Juju or Pulumi to create, deploy and manage infrastructure as code on any cloud and then enable automation and orchestration in their cloud platforms. In addition to using predetermined, software-defined configurations for cloud deployments, agencies should take a fresh look at how they fund their multicloud environments. Beyond the total cost of ownership, they need to reevaluate how they pay for cloud products and services. They can choose to treat that spending as a capital expenditure (CapEx), which typically has a higher cost of ownership, or as an operational expense (OpEx).”

Read more insights from Dell Technologies’ Cloud Technologist, Patrick Thomas.
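The infrastructure-as-code pattern Thomas recommends declares the desired environment as data and lets tooling reconcile the live environment to match. The toy reconciler below only illustrates that core loop — the resource names are invented, and real tools like Terraform and Pulumi add planning, state tracking and provider plugins on top of it:

```python
# Toy illustration of infrastructure as code: desired state is declared
# as data, and a reconciler computes the actions needed to reach it.
DESIRED = {
    "web-vm":      {"type": "vm", "size": "small"},
    "logs-bucket": {"type": "bucket", "region": "us-east"},
}

def reconcile(current, desired):
    """Compute the create/delete actions that move the live environment
    to the declared state -- the core loop behind IaC deployment tools."""
    actions = []
    for name in desired:
        if name not in current:
            actions.append(("create", name))
    for name in current:
        if name not in desired:
            actions.append(("delete", name))  # decommission drifted resources
    return actions

# A live environment containing one stale VM and none of the desired resources:
print(reconcile({"old-vm": {"type": "vm"}}, DESIRED))
```

Because the desired state lives in version-controlled files rather than in hand-applied console changes, the same declaration can be reviewed, audited and replayed across clouds — which is what distinguishes this approach from managing cloud like a traditional data center.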


The Elements of a Strong Cloud Portfolio

“Custom code is arguably the root cause of most IT challenges in government. For example, the Alliance for Digital Innovation, of which Salesforce is a member, released a study that found the federal government could have saved $345 billion over 25 years if it had embraced commercial technology rather than building systems from scratch. In order to improve customer service and reduce their dependency on custom solutions, agencies should implement a multicloud strategy that is not solely based on rehosting and refactoring applications on infrastructure solutions. Agencies need to make sure they adopt a mix of software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS). And they should consider low-code options within the SaaS and PaaS categories to limit their reliance on custom solutions.”

Read more insights from Salesforce’s Regional Vice President, Public-Sector Digital Strategy, Christopher Radich.


A Framework for Gleaning MultiCloud Insights

“By constantly monitoring compliance, agencies ensure that the cloud environment is safe and productive. In other words, their data is protected and their employees have the ability to use that data to perform their jobs and achieve mission goals. In addition, monitoring compliance and resource optimization is the key to ensuring uptime and appropriate capacity, as well as answering questions about costs. Agencies need to understand how they’re running and operating cloud applications and then make sure they’re applying the right framework for managing security policies. Furthermore, flexibility and efficiency are central benefits of a multicloud environment. Moving on-premises software into such an environment typically requires a complete re-architecting of those applications.”

Read more insights from SAP National Security Services’s CSO, Brian Paget.


Optimizing Cloud Investments with a Digital Twin

“In most agencies, it’s impossible for any person to get an understanding of all traffic flows and behavior. Agencies need access to normalized data presented in easy-to-consume visuals to ensure compliance, reduce outages and prevent incidents. Similarly, multicloud environments incorporate a wide variety of services and products, and it is essential to have a unified view that links what’s in the cloud (or clouds) and what’s on premises. A digital twin can supply that single source of truth and ensure that applications are readable across clouds and on-premises systems and that the network’s security posture is not being invalidated. And just as robust GPS apps will find the most efficient path, a digital twin knows all the possibilities and can answer agencies’ questions about the most efficient, secure and cost-effective way to route cloud activities.”

Read more insights from Forward Networks’s Technical Solutions Architect, Scot Wilson.


Download the full Innovation in Government® report for more insights from these government multicloud thought leaders and additional industry research from FCW.