Locking Down Information Management Security on Campus

According to one report, ransomware attacks against higher education doubled in 2020 compared to 2019, with an average ransom demand of $447,000. Traditionally, criminals tended to be opportunists; they’d strike at random and hope to get lucky. Now they’ve organized into highly sophisticated networks and cartels that deliberately target any entity of substance. Higher ed fits the profile, but some institutions are better positioned to withstand cyberattacks than others. A combination of zero-trust and defense-in-depth allows these schools to defend against malware and ransomware. Ultimately, the job of the cybersecurity professional in higher ed is to “plan for the worst day,” as one cybersecurity expert recently noted during a Campus Technology leadership summit. But how can institutions adapt to this increasingly targeted and threatening cybersecurity landscape? Learn how your institution can safeguard against threats, keep pace with evolving technical demands, and more in Carahsoft’s Innovation in Education report.

 

Gaining Total Visibility

“We can no longer piece together a set of disparate tools to solve acute security or compliance issues. Really, the only way forward is to use a mix of integrated security technologies that deliver, first, a view into traffic and, second, a flexible enforcement model that relies on artificial intelligence and machine learning to identify attacks. The solution starts and ends with visibility. The goal is to understand how data flows through the network, cloud and endpoints so that IT can provide a consistent security view no matter how services are being used. It’s important to understand how your users are tapping those services and to surface those things that traditional tools can’t see. As one example, we have a service called Xpanse, which will take an outside-in view of the network and start to build relationships, looking at how endpoints are interacting with other endpoints that are outside of the network, contributing to the building of a map showing how the institution is connected to the rest of the world.”

Read more insights from Palo Alto Networks’ Security Strategist, Hunter Ely.
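
To make the idea of an outside-in connectivity map concrete, here is a minimal sketch, assuming firewall or NetFlow-style records with hypothetical field names; it illustrates the concept only and is not Palo Alto Networks’ Xpanse.

```python
from collections import defaultdict
from ipaddress import ip_address, ip_network

# Hypothetical campus address space; adjust to your own ranges.
CAMPUS_NETS = [ip_network("10.0.0.0/8"), ip_network("172.16.0.0/12")]

def is_internal(addr: str) -> bool:
    ip = ip_address(addr)
    return any(ip in net for net in CAMPUS_NETS)

def build_exposure_map(flow_records):
    """Group external peers by the internal endpoint that talked to them.

    flow_records: iterable of dicts with hypothetical keys
    'src', 'dst' and 'dst_port' taken from firewall or NetFlow logs.
    """
    exposure = defaultdict(set)
    for rec in flow_records:
        if is_internal(rec["src"]) and not is_internal(rec["dst"]):
            exposure[rec["src"]].add((rec["dst"], rec["dst_port"]))
    return exposure

if __name__ == "__main__":
    sample = [
        {"src": "10.1.2.3", "dst": "203.0.113.7", "dst_port": 443},
        {"src": "10.1.2.3", "dst": "198.51.100.9", "dst_port": 22},
        {"src": "10.9.8.7", "dst": "10.1.2.3", "dst_port": 445},  # internal-only, ignored
    ]
    for host, peers in build_exposure_map(sample).items():
        print(host, "->", sorted(peers))
```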

 

A Unifying Viewpoint for Security

“Automation of the easy security work — known threats, known responses, malware detection, cleanup — addresses both problems, and everybody wins. The campus gains better operational success. And when humans don’t have to intervene with the ordinary, they’re free to do more interesting work. They grow in their positions, because they’re not just clicking buttons all day. Automation is especially important in an era of remote status quo and zero-trust. IT has to assume that there’s a high probability of any authentication request being nefarious. And that means being able to look at data in context: Is this person at a higher risk? Is the laptop or smartphone compromised? Should we let them on the network today? Have we scanned this device in the last three days? Then let’s not allow them access to this HR data. If they get their machine scanned, then they can come back and try again. While higher ed has long been predicated on allowing open access, now that can only happen when it’s the appropriate thing to do. Users have to be classified — student, researcher, staffer — and access has to be controlled. When everything looks normal, they get unfettered access. But when their machine or account is compromised, the access should be denied. Easier said than done, right?”

Read more insights from Splunk’s Minister of Magic, Jesse Trucks.
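
The contextual questions Trucks lists can be expressed as a simple policy check. The sketch below is a hypothetical illustration of that zero-trust decision logic, not Splunk functionality; the risk signals, thresholds and the three-day scan window are assumptions drawn from the quote.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AccessRequest:
    user_role: str            # e.g., "student", "researcher", "staffer"
    user_risk_score: float    # 0.0 (low) to 1.0 (high), from your analytics platform
    device_compromised: bool  # verdict from endpoint protection
    last_device_scan: datetime
    resource_sensitivity: str # e.g., "public", "internal", "hr"

# Hypothetical policy: sensitive data requires a recent scan and a clean, low-risk context.
MAX_SCAN_AGE = timedelta(days=3)

def allow_access(req: AccessRequest, now: datetime) -> bool:
    if req.device_compromised or req.user_risk_score > 0.7:
        return False
    if req.resource_sensitivity == "hr":
        scanned_recently = now - req.last_device_scan <= MAX_SCAN_AGE
        return scanned_recently and req.user_role == "staffer"
    return True  # everything looks normal: unfettered access

if __name__ == "__main__":
    req = AccessRequest("staffer", 0.2, False, datetime(2021, 6, 1), "hr")
    print(allow_access(req, datetime(2021, 6, 3)))   # True: scanned two days ago
    print(allow_access(req, datetime(2021, 6, 10)))  # False: scan is stale, rescan and try again
```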

 

AI and the Carrot Approach to Zero-Trust Network Access

“Some 20 years ago, I was outfitted with a BlackBerry device, and it was the first time I could get e-mail from the road. But it wasn’t the built-in keyboard that made that device so special. It was really the fact that my organization’s IT department trusted the BlackBerry security model so deeply, I could use my device to access sensitive corporate information. BlackBerry’s mission hasn’t changed. But now, that security emphasis is used to secure some 500 million endpoints — including cars — produced by various companies. That’s why higher education has rediscovered BlackBerry. The university IT organization trusts the company to keep devices secure, whether they’re owned by the institution or individual people — students, staff or faculty. And now, without having to use a college-owned device that navigates through the college-owned firewall, users can once again be liberated, just like we were two decades ago, when we first got a taste of the freedom allowed by mobility.”

Read more insights from BlackBerry’s Director of Sales, Chris Russo.

 

Protecting the Campus from the Outside In

“Is it any wonder threats are on the rise? As system and data breaches rack up in higher education, security experts have adopted a defense-in-depth stance. Putting multiple defensive measures in place begins with a baseline security posture that wants to understand everything coming into and going out of the network, preferably in real time. The tricky part is achieving that level of visibility and response when the threats could originate from any one of the many thousands of devices accessing institutional resources. One route is deploying domain name system (DNS) security. Let’s think about DNS for a moment. It may be decades-old but it’s still heavily relied upon; without it, the entire network is shut off from the internet. Regardless of their location, endpoints require DNS to connect to any application, service or data source. And so does malware, which uses DNS at multiple stages of an attack. That’s why DNS is a marvelous transport system for malfeasance. Traditional security mechanisms don’t police it well because there’s so much of it — millions of DNS queries a day for the typical university.”

Read more insights from Infoblox’s Director and General Manager for U.S. Education, Rufus Coleman.
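
One way DNS security tools surface malware’s use of DNS is to flag queries whose names look machine-generated or data-laden, as in DNS tunneling. The sketch below is a generic illustration of that idea, not Infoblox’s product; the label-length and entropy thresholds are assumptions.

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character; random-looking labels score higher."""
    counts = Counter(s)
    total = len(s)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def is_suspicious(query: str, max_label_len: int = 40, entropy_threshold: float = 3.8) -> bool:
    """Flag DNS queries whose leftmost label looks machine-generated or data-laden.

    These thresholds are illustrative; real deployments tune them against
    baseline traffic and combine many more signals (query volume, NXDOMAIN
    rates, threat intelligence feeds).
    """
    label = query.rstrip(".").split(".")[0].lower()
    if len(label) > max_label_len:
        return True  # very long labels are a classic sign of DNS tunneling
    return len(label) >= 12 and shannon_entropy(label) > entropy_threshold

if __name__ == "__main__":
    print(is_suspicious("www.example.edu"))                  # False
    print(is_suspicious("x9f3k2q8z7pm4w1r.badnet.example"))  # True: long, high-entropy label
```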

 

Uncovering the Hidden Costs of Cloud Security

“While the public cloud has been a boon for higher education on many fronts, it has also become a conundrum, especially when it comes to storage for the purposes of security and safety. As the needs add up, so does the expense. The first not-so-hidden cost is the baseline cost of data storage. As an example, think about the capacity required to sustain video recordings of people entering and exiting buildings on campus. A network of 100 cameras, each capturing 8 frames per second with a modest resolution of 720 pixels, operating continuously at just medium quality, would require 200 terabytes of capacity. On Amazon Web Services, the cost for storing 200 TB on S3 would be about $56,000 for the year. If the institution were to upgrade to newer cameras capturing 15 frames per second at 1080 pixels, generating five times as much data — a full petabyte — the expense would quintuple, to about $289,000. Microsoft Azure would be slightly under that ($262,000) and Google Cloud a bit more ($327,000). Second, there is the additional hidden cost of the traditional route those cloud storage providers follow for transactions related to the data. They’ve all predicated the value of their services on fractional pricing (a tenth of a penny for this, a couple of pennies for that) for seemingly insignificant activities, such as egress or API requests.”

Read more insights from Wasabi’s Senior Director of Product Marketing, David Boland.
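
Boland’s figures can be sanity-checked with back-of-the-envelope arithmetic. The sketch below models the same kind of calculation; the per-camera bitrate and the flat per-gigabyte price are illustrative assumptions, not quoted rates from any provider, and real bills add tiering, egress and request fees.

```python
# A back-of-the-envelope model for camera-footage storage costs.
# The bitrate and price below are illustrative assumptions only.

TB = 10**12  # decimal terabyte, as storage vendors bill

def annual_storage_tb(cameras: int, avg_mbps_per_camera: float) -> float:
    """Capacity needed to retain one year of continuous footage, in TB."""
    seconds_per_year = 365 * 24 * 3600
    bytes_total = cameras * (avg_mbps_per_camera * 1e6 / 8) * seconds_per_year
    return bytes_total / TB

def annual_cost(capacity_tb: float, usd_per_gb_month: float) -> float:
    """Flat-rate storage cost for a year, ignoring tiering, egress and API fees."""
    return capacity_tb * 1000 * usd_per_gb_month * 12

if __name__ == "__main__":
    # An assumed ~0.5 Mbps per 720p/8fps camera lands near the
    # 200 TB per year for 100 cameras cited in the quote.
    capacity = annual_storage_tb(cameras=100, avg_mbps_per_camera=0.5)
    print(f"Capacity: {capacity:.0f} TB")
    print(f"At an assumed $0.023/GB-month: ${annual_cost(capacity, 0.023):,.0f}/year")
```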

 

Staying on Top of Cybersecurity: A Conversation with Two University CISOs

“In March 2020, I was feeling more comfortable in terms of what our border looked like and the things that we were protecting our constituents from. Then the pandemic happened and people started grabbing devices off of their desks and old laptops out of storage closets and dragging them home to put on home networks — and who knows how they were being secured, if they were being secured at all. I thought I had a fairly good plan in place and tools deployed across my infrastructure to protect us, but that was all out the window. And so, over the last year we’ve been looking at services and products we can deploy that will protect our users as well at home as we could when they were on campus. And there’s nothing like having a community of your peers to have those conversations with and to learn what they’re doing, how long it took them to get there, what bumps they ran into along the way and ultimately, how they were able to steer around those. That’s significantly beneficial to all of us, and that is a huge value of participating with Internet2 overall and through the NET+ program for specific cloud and security solutions.”

Read more insights from Tom Dugas, CISO for Duquesne University, and Rick Haugerud, CISO for the University of Nebraska-Lincoln.

 

Community-Powered Problem-Solving

“We facilitate the community engaging with each other to identify best practices. For example, let’s say there’s a particular challenge that a campus is trying to figure out. They may go into a community call, where campuses can ask their peers: How do you solve this problem? And then they can get immediate feedback. Or there are many ways institutions collaborate digitally, including e-mail lists, Slack channels and wikis, where they can engage with peers to identify best practices. That is all part of the NET+ program, where advisory boards and community events help to foster more optimal service offerings and benchmarking. And a program manager like myself is engaged with and supports these types of discussions. After a number of campuses have verbalized similar challenges, we’ll realize maybe there’s something there that we need to write up, to share broadly with the community, where they can look at a frequently asked questions repository and find the answers to their questions. And that’s even faster than going and asking their peers.”

Read more insights from Internet2’s Program Manager for Security and Identity, Nick Lewis.

 

Download the full Innovation in Education report for more insights from these cybersecurity thought leaders and additional industry research from CampusTech.

The Best of What’s New in Cybersecurity

 

Cybersecurity reached a tipping point in 2021. One big driver is a wave of disruptive attacks — some targeting critical infrastructure and important supply chains — that has put a national spotlight on this long-simmering issue. These attacks are a wake-up call to elected officials and line-of-business leaders regarding the risk presented by growing cybercriminal activity. That call has gone all the way to the Oval Office, where the Biden Administration issued an executive order aimed at shoring up the nation’s cybersecurity through better sharing of threat information, greater adoption of Zero Trust security architectures and secure cloud services, and other measures. The COVID-19 pandemic has been another important driver, turning up the heat on modernizing security approaches and tools in state and local government. Another critical factor: There’s new money available for cybersecurity modernization. Read the latest insights from industry thought leaders in cybersecurity in Carahsoft’s Innovation in Government® report.

 

Achieving a Sustainable Cybersecurity Strategy

“The pandemic accelerated trends that were already in motion. Digital innovation increased to meet the need for digital interactions when face-to-face interactions weren’t possible. In addition, the massive shift to working from home impacted risk. When the pandemic hit, most organizations didn’t have all the policies, procedures and tools in place to effectively secure those environments. Another disruptor is the changing geopolitical landscape. Cyber warfare is becoming a mainstream weapon for many nation states. And then there is the explosion of fraud as a service. Attackers are taking advantage of the fact that organizations’ defenses are not ready for remote work and these other changes.”

Read more insights from Cloudera’s Field CTO, Carolyn Duby.

 

Intelligent, Ubiquitous Security

“Organizations need prevention and visibility on the endpoints themselves because these devices are in varying risk environments and will eventually be connected to the network, if they aren’t already. Very few sizable breaches occur without accessing or compromising an endpoint. Organizations should focus on prevention first and then visibility because the value of visibility lessens if you don’t have the resources to act on what you see. Preventing an attack early is far less expensive and time-consuming than stopping it later. Organizations need to apply a uniform Zero Trust defense strategy across all devices — mobile included — and personnel.”

Read more insights from BlackBerry’s Vice President of Global Services Technical Operations, Tony Lee.

 

Disaster Recovery in the Age of Ransomware

“One reason cloud storage services are succeeding is because they provide high performance at a much lower cost than the large cloud providers. Many hyper-scale cloud storage providers use service tiers where organizations can store certain data “deep and cheap” for governance or compliance reasons. However, data retrieval can take hours or days and data egress fees can be very expensive. By contrast, a high-performance storage service that doesn’t use service tiers offers a better model for organizations that are fighting ransomware and need active data and a fast response time. Cloud storage services also don’t charge a data egress fee — unlike many hyper-scale cloud providers. This means disaster recovery teams can regularly practice restoring their data without paying a fee every time they do so.”

Read more insights from Wasabi’s Director of Product Marketing, Drew Schlussel.
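
To see why egress fees matter for disaster recovery rehearsals, consider a rough annual cost comparison for repeatedly restoring a backup set. The numbers below are hypothetical placeholders, not any provider’s actual rates.

```python
def restore_drill_cost(data_tb: float, drills_per_year: int, egress_usd_per_gb: float) -> float:
    """Annual data-transfer cost of rehearsing a full restore several times a year."""
    return data_tb * 1000 * egress_usd_per_gb * drills_per_year

if __name__ == "__main__":
    # Hypothetical scenario: 50 TB of backups, quarterly restore drills.
    with_fee = restore_drill_cost(50, 4, egress_usd_per_gb=0.09)    # assumed hyperscale-style fee
    without_fee = restore_drill_cost(50, 4, egress_usd_per_gb=0.0)  # provider with no egress charge
    print(f"With egress fees:    ${with_fee:,.0f}/year")   # $18,000
    print(f"Without egress fees: ${without_fee:,.0f}/year")  # $0
```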

 

Cybersecurity at Scale

“The first thing to understand is whether you’re going to lift and shift on-premises workloads or have everything cloud native moving forward. Understanding your cloud strategy will inform your security approach. For example, if you’re going to lift and shift a data center where applications are hosted on servers, your workload protection needs to be tuned toward server vulnerabilities, which are very different from vulnerabilities on laptops and desktops. Also, it’s not just endpoints that are vulnerable. The automation or orchestration layer can also be an attack vector. Finally, it’s important to have tools that monitor conformance to your cloud governance standards so you can avoid misconfigurations that expose your environment to attack.”

Read more insights from Trend Micro’s Vice President and General Manager for U.S. Federal Business, Chris Radosh.
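
Monitoring conformance to cloud governance standards usually comes down to automated checks of resource configurations. The sketch below is a generic, hypothetical example of such a check over an exported inventory; it is not Trend Micro tooling, and the rules and record fields are assumptions.

```python
# A minimal, hypothetical configuration-conformance check. Real tools pull
# live resource state from cloud provider APIs and evaluate far richer policy.

RULES = [
    ("storage buckets must not be public",
     lambda r: not (r["type"] == "bucket" and r.get("public_access", False))),
    ("volumes must be encrypted at rest",
     lambda r: not (r["type"] == "volume" and not r.get("encrypted", False))),
    ("admin ports must not be open to 0.0.0.0/0",
     lambda r: not (r["type"] == "firewall_rule"
                    and r.get("source") == "0.0.0.0/0"
                    and r.get("port") in (22, 3389))),
]

def audit(resources):
    """Return (resource_id, violated_rule) pairs for any non-conformant resource."""
    findings = []
    for res in resources:
        for description, passes in RULES:
            if not passes(res):
                findings.append((res["id"], description))
    return findings

if __name__ == "__main__":
    inventory = [
        {"id": "bkt-1", "type": "bucket", "public_access": True},
        {"id": "vol-7", "type": "volume", "encrypted": True},
        {"id": "fw-3", "type": "firewall_rule", "source": "0.0.0.0/0", "port": 22},
    ]
    for rid, rule in audit(inventory):
        print(f"VIOLATION: {rid} - {rule}")
```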

 

Download the full Innovation in Government® report for more insights from these cybersecurity thought leaders and additional industry research from GovTech.

Maximizing the Benefits of MultiCloud

The government’s approach to cloud technology has changed dramatically in the years between the 2010 Federal Cloud Computing Strategy, known as Cloud First, and the 2019 Cloud Smart Strategy. The first policy pushed agencies to consider cloud technologies before other options, while the second offers actionable advice on how to deploy the technology. Today, 81% of federal agencies use more than one cloud platform, according to a MeriTalk survey. Because of its inherent flexibility and scalability, cloud technology played a key role in agencies’ response to the pandemic and their ability to shift employees to remote work. Now government leaders recognize that multicloud environments are crucial for ensuring resiliency during a crisis. The Cloud Smart Strategy explicitly references hybrid and multicloud environments as essential tools for improving mission outcomes and service delivery. Despite their benefits, multicloud environments can present management challenges for many agencies, such as difficulty migrating mission-critical legacy apps to the cloud or ensuring the interoperability of products and services from multiple vendors. In a recent survey of FCW readers, security was the biggest challenge to managing a cloud ecosystem, cited by 74% of respondents. The Cloud Smart Strategy makes it clear that cloud technology has become indispensable to government agencies, but adopting hybrid and multicloud environments requires thoughtfulness and planning. Read the latest insights from industry thought leaders in Carahsoft’s Innovation in Government® report on multicloud.

 

Empowering the Government’s Earliest Adopters

“Multicloud environments offer agencies the opportunity to go beyond simply managing data to analyzing it for valuable insights and better decision-making. Cloud technology was created to deal with the exponential increase in data collection and the increasing demands for storage. In other words, cloud was developed to handle big-data challenges. Furthermore, cloud technology offers tremendous opportunities for agencies to off-load some monotonous day-to-day IT management tasks in favor of higher-level activities. If there are only a handful of people in an agency’s IT organization, they could spend all their time creating new storage clusters and provisioning that storage as data collection increases. If an agency can leverage the automation that comes with cloud to store and replicate data and then make sure that data is backed up and protected, the agency can enable those individuals to focus on true data analysis, data science and data discovery.”

Read more insights from Google’s Cloud Engineering Manager, Sean Maday.

 

Rethinking Legacy App Migration and Software Factories

“Many government agencies have started to build software factories to reduce security risks and greatly improve the innovation cycle. If not implemented well, however, they can increase security risks, especially when each program or project builds its own software factory. Instead of creating more software factories, agencies should move toward centralizing software build environments and rationalizing duplicative processes that can be used for both legacy and modern application development teams regardless of their development methodology. They should strive to standardize all tooling for agile/DevSecOps, create enterprise services that support development teams, and establish policies that monitor for insider threat and eliminate risks during software development.”

Read more insights from MFGS’s Public Sector CTO, David Wray, and CTO for Alliances and Partners, Kevin Hansen.

 

Developing a Long-Term Vision for MultiCloud

“A multicloud approach can be a double-edged sword, with benefits and risks. When agencies have access to a cloud environment, it’s easy for them to spin up new compute resources or storage solutions. But this flexibility opens up risks in terms of performance and security. Even when an agency is working with public cloud service providers, it’s the agency’s responsibility to make sure its resources are configured properly. Many data leakage incidents in the cloud are the result of a configuration issue. Furthermore, in a multicloud environment, technologies are created independently of one another and won’t always work well together. Agencies must make sure they have the appropriate visibility across multicloud environments and on-premises systems so they can understand and manage all aspects of their IT systems. This includes controlling costs and decommissioning purpose-built cloud resources when they are no longer needed.”

Read more insights from SolarWinds’s Group Vice President for Product, Brandon Shopp.

 

Taking a Fresh Look at Cloud’s Potential

“Agencies need to understand the business goals for a particular cloud-based application or workload and then make decisions about the best architectural approach. They also need a comprehensive security model that’s architecturally coherent from a deployment and operations perspective. The model should take into consideration the entire life cycle of applications as agencies modernize into the cloud. By combining the security and compliance aspects of modernization with a coherent IT architecture, agencies can drive down costs for managing those applications in the cloud. The cost savings can allow agencies to fund further modernization efforts or conduct research and development activities around core workloads or advanced capabilities such as artificial intelligence.”

Read more insights from Microsoft Federal’s CTO, Jason Payne.

 

How Cloud Storage Enables Innovation

“In the early days, cloud storage was designed to be “cheap and deep” — a place to inexpensively store data without worrying about capacity. At the time, cloud could not compete with on-premises storage in terms of access speeds. Thanks to technological advances in the past several years, however, data is as quickly accessible and available in the cloud as it is via on-premises systems. As a result, the number of applications that are eligible for cloud storage has increased dramatically, and cloud has become a primary storage option for enterprises. Beyond backing up data, agencies can use live applications in the cloud for video surveillance or active archiving, for example.”

Read more insights from Wasabi Technologies’ Senior Director of Product Marketing, David Boland.

 

Raising MultiCloud Management to the Next Level

“Many agencies are using cloud the way they used non-cloud data centers 15 or 20 years ago. But instead of customizing their cloud environments, they should use tools like Terraform, Juju or Pulumi to create, deploy and manage infrastructure as code on any cloud and then enable automation and orchestration in their cloud platforms. In addition to using predetermined, software-defined configurations for cloud deployments, agencies should develop a more strategic approach to funding their multicloud environments. Agencies should also take a fresh look at their cloud funding models. Beyond the total cost of ownership, they need to reevaluate how they pay for cloud products and services. They can choose to treat that spending as a capital expenditure (CapEx), which typically has a higher cost of ownership, or as an operational expense (OpEx).”

Read more insights from Dell Technologies’ Cloud Technologist, Patrick Thomas.
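
The shift Thomas describes, from hand-customized environments to infrastructure as code, rests on a declarative model: describe the desired state and let a tool compute and apply the difference. The toy sketch below illustrates that reconciliation idea in plain Python; it is not Terraform, Juju or Pulumi, and the resource model is made up.

```python
# Toy illustration of the declarative, desired-state model behind
# infrastructure-as-code tools. Not Terraform, Juju or Pulumi.

def plan(current: dict, desired: dict) -> list:
    """Compute the changes needed to move from the current state to the desired state."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(("create", name, spec))
        elif current[name] != spec:
            actions.append(("update", name, spec))
    for name in current:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

if __name__ == "__main__":
    current = {"web-vm": {"size": "small"}, "old-bucket": {"region": "us-east"}}
    desired = {"web-vm": {"size": "medium"}, "logs-bucket": {"region": "us-east"}}
    for action, name, spec in plan(current, desired):
        print(action, name, spec or "")
```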

 

The Elements of a Strong Cloud Portfolio

“Custom code is arguably the root cause of most IT challenges in government. For example, the Alliance for Digital Innovation, of which Salesforce is a member, released a study that found the federal government could have saved $345 billion over 25 years if it had embraced commercial technology rather than building systems from scratch. In order to improve customer service and reduce their dependency on custom solutions, agencies should implement a multicloud strategy that is not solely based on rehosting and refactoring applications on infrastructure solutions. Agencies need to make sure they adopt a mix of software as a service (SaaS), platform as a service (PaaS) and infrastructure as a service (IaaS). And they should consider low-code options within the SaaS and PaaS categories to limit their reliance on custom solutions.”

Read more insights from Salesforce’s Regional Vice President, Public-Sector Digital Strategy, Christopher Radich.

 

A Framework for Gleaning MultiCloud Insights

“By constantly monitoring compliance, agencies ensure that the cloud environment is safe and productive. In other words, their data is protected and their employees have the ability to use that data to perform their jobs and achieve mission goals. In addition, monitoring compliance and resource optimization is the key to ensuring uptime and appropriate capacity, as well as answering questions about costs. Agencies need to understand how they’re running and operating cloud applications and then make sure they’re applying the right framework for managing security policies. Furthermore, flexibility and efficiency are central benefits of a multicloud environment. Moving on-premises software into such an environment typically requires a complete re-architecting of those applications.”

Read more insights from SAP National Security Services’ CSO, Brian Paget.

 

Optimizing Cloud Investments with a Digital Twin

“In most agencies, it’s impossible for any person to get an understanding of all traffic flows and behavior. Agencies need access to normalized data presented in easy-to-consume visuals to ensure compliance, reduce outages and prevent incidents. Similarly, multicloud environments incorporate a wide variety of services and products, and it is essential to have a unified view that links what’s in the cloud (or clouds) and what’s on premises. A digital twin can supply that single source of truth and ensure that applications are readable across clouds and on-premises systems and that the network’s security posture is not being invalidated. And just as robust GPS apps will find the most efficient path, a digital twin knows all the possibilities and can answer agencies’ questions about the most efficient, secure and cost-effective way to route cloud activities.”

Read more insights from Forward Networks’ Technical Solutions Architect, Scot Wilson.
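
Wilson’s GPS analogy maps naturally onto path computation over a model of the network. The sketch below is a toy illustration of that idea using Dijkstra’s algorithm over a weighted graph; it is not Forward Networks’ product, and the topology and link costs are invented.

```python
import heapq

def cheapest_path(graph: dict, src: str, dst: str):
    """Dijkstra's algorithm over a dict of {node: {neighbor: cost}} edges.

    Returns (total_cost, path). A real digital twin weighs links by latency,
    egress charges or policy, and also verifies that the security posture holds
    along the way; the cost model here is purely illustrative.
    """
    queue = [(0, src, [src])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

if __name__ == "__main__":
    # Hypothetical topology: on-premises data center, two cloud regions, a branch office.
    topology = {
        "on-prem": {"cloud-a": 4, "cloud-b": 9},
        "cloud-a": {"cloud-b": 2, "branch": 7},
        "cloud-b": {"branch": 3},
        "branch": {},
    }
    print(cheapest_path(topology, "on-prem", "branch"))  # (9, ['on-prem', 'cloud-a', 'cloud-b', 'branch'])
```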

 

Download the full Innovation in Government® report for more insights from these government multicloud thought leaders and additional industry research from FCW.