Nutanix AHV and Rubrik’s Layered Security – The Key to System Resilience and Efficiency

Protecting critical infrastructure from cyber threats and ensuring business continuity in the face of disasters is a top priority for organizations today. Luckily, Nutanix AHV, a modern, secure virtualization platform for running virtual machines (VMs), can help. Rubrik’s integrated solutions fortify AHV environments against ransomware attacks and enable efficient disaster recovery. By leveraging features like immutable backups, anomaly detection and on-demand cloud-based disaster recovery, organizations can enhance their cyber resilience and minimize the impact of disruptive incidents.

A Simple and Secure Path to VM Management

Nutanix AHV is simple to use and secure by design. The platform is managed through a centralized control plane, with AHV integrated behind a single application programming interface (API), which eliminates complicated setup on the customer side. By unifying management and the virtualization layer, Nutanix AHV allows organizations to focus on mission objectives.

Nutanix AHV includes several built-in security capabilities, such as micro-segmentation, data insights, audit trails, ransomware protection and data age analytics.

Nutanix features:

  • Built-in, self-healing abilities protect against disk failure, node failure and more
  • A vulnerability patch summary automatically alerts users about susceptibility risks and anomalies that need to be addressed
  • A life cycle manager provides readiness and deployment testing
  • More than one copy of backup data, ensuring that users do not lose valuable information
  • Multi-site replication, including to and from the public cloud

Securing data in Nutanix AHV requires more than basic perimeter defenses; it demands a multi-layered strategy. With Rubrik’s data protection capabilities, which include immutable backups, automatic encryption and logical air-gapping, agencies and organizations can recover information within minutes and resume mission objectives in the event of a breach.

Securing Data with Rubrik’s Rapid Recovery Abilities

Rubrik, a security cloud solution provider that keeps your data resilient, enables the near-instant recovery of virtual machines and data within the Nutanix AHV environment. Rubrik provides multiple recovery options within AHV, including file-level recovery, live mount, export, virtual disk mounting and downloadable virtual disk files. Through Rubrik, businesses can recover files from older hypervisors into newer AHV environments without bringing the older hypervisors online. Once granted access to the AHV environment, Rubrik automatically discovers VMs and applies baseline protection policies to them. Rubrik’s recovery process restores data in minutes, regardless of VM size. As VMs grow larger, frequently reaching 50 terabytes, this speed and precision keeps organizations’ incident response plans swift and efficient. After scanning the metadata and running anomaly detection, users are granted file-level recovery with oversight of the affected data.
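
As a rough illustration of how a recovery such as a live mount might be driven programmatically, here is a minimal sketch of a REST workflow. Rubrik does expose a REST API, but the host, endpoint paths and field names below are hypothetical assumptions for illustration only, not the documented interface.

```python
import requests

# Hypothetical host, endpoints and field names for illustration only;
# consult Rubrik's API documentation for the real interface.
RUBRIK_HOST = "https://rubrik.example.gov"   # assumption: cluster address
TOKEN = "..."                                # assumption: API bearer token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def latest_snapshot(vm_id: str) -> dict:
    """Return the most recent snapshot record for an AHV VM."""
    resp = requests.get(f"{RUBRIK_HOST}/api/vm/{vm_id}/snapshot", headers=HEADERS)
    resp.raise_for_status()
    return max(resp.json()["data"], key=lambda s: s["date"])

def live_mount(vm_id: str) -> str:
    """Request a live mount of the latest snapshot; return a job ID."""
    snap = latest_snapshot(vm_id)
    resp = requests.post(
        f"{RUBRIK_HOST}/api/vm/snapshot/{snap['id']}/mount",
        json={"powerOn": True},  # boot the mounted copy immediately
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.json()["id"]

job = live_mount("vm-finance-01")
print(f"Live mount started, job {job}")
```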

As the data that organizations manage grows exponentially, data security becomes critical to business functions. Rubrik offers comprehensive data security, continuously monitoring and remediating data risks within the network.

Rubrik also provides constant monitoring for backups. Typically, businesses do not audit their backup history, which increases the likelihood that they miss attackers who sit in the system environment for a few days before collecting data. With Rubrik’s threat monitoring and hunting, organizations can search through backups and detect when an anomaly entered the environment. Through Nutanix and Rubrik’s integration, IT teams can reduce complexity, gain oversight, cut operational costs and improve resiliency and efficiency.
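
A minimal sketch of what hunting across backup history can look like: scan snapshots oldest-first for a known indicator of compromise (IOC) to pinpoint when it first appeared, which also identifies the last clean restore point. The snapshot structure, helper fields and hash values here are invented for illustration.

```python
from datetime import datetime

# Illustrative IOC list, e.g., the SHA-256 of a known malicious dropper.
KNOWN_BAD_HASHES = {"9f86d081884c7d65..."}

def first_appearance(snapshots, known_bad=KNOWN_BAD_HASHES):
    """Scan snapshots oldest-first; return the date an IOC first appears."""
    for snap in sorted(snapshots, key=lambda s: s["date"]):
        hashes = {f["sha256"] for f in snap["files"]}
        if hashes & known_bad:
            return snap["date"]
    return None

snapshots = [
    {"date": datetime(2023, 5, 1), "files": [{"sha256": "aaa..."}]},
    {"date": datetime(2023, 5, 2), "files": [{"sha256": "9f86d081884c7d65..."}]},
]
# IOC first seen 2023-05-02, so 2023-05-01 is the last clean restore point.
print(first_appearance(snapshots))
```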

Automation: The Key to a Proactive Incident Response

Modern cyber threats require a proactive approach to incident response. With automation and orchestration, facilitated by the combined capabilities of Nutanix and Rubrik, organizations can detect, respond to and recover from cyber incidents more efficiently.

Rubrik has built-in anomaly detection, which searches protected data for unusual behavior, such as mass deletion or encryption. As the volume of data on a network increases, organizations often hold sensitive data they are not actively monitoring, or do not even know may be exposed. Rubrik clusters continuously scan protected data for anomalies, sensitive data and known indicators of compromise (IOCs), allowing customers to select resolution options, such as isolating compromised VMs or restoring production systems from the last known good copies.
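
To make the idea concrete, here is a minimal sketch of the kind of heuristic such detection can apply to per-snapshot backup statistics. The thresholds and field names are illustrative assumptions, not Rubrik’s actual detection logic.

```python
# Flag snapshots whose file deletions or modifications spike relative to the
# previous snapshot, a common signature of mass deletion or ransomware
# encryption. Thresholds are illustrative assumptions.
def flag_anomalies(stats_history, delete_ratio=0.3, change_ratio=0.5):
    alerts = []
    for prev, curr in zip(stats_history, stats_history[1:]):
        deleted = (prev["file_count"] - curr["file_count"]) / prev["file_count"]
        changed = curr["files_changed"] / prev["file_count"]
        if deleted > delete_ratio:
            alerts.append((curr["snapshot_id"], "mass deletion"))
        elif changed > change_ratio:
            alerts.append((curr["snapshot_id"], "possible encryption event"))
    return alerts

history = [
    {"snapshot_id": "s1", "file_count": 10_000, "files_changed": 200},
    {"snapshot_id": "s2", "file_count": 9_900, "files_changed": 6_500},
]
print(flag_anomalies(history))  # -> [('s2', 'possible encryption event')]
```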

Readiness impacts recovery time, and recovery time impacts operations. Nutanix AHV’s recovery orchestration allows IT teams to organize VMs into a set of templates, which can be used to create blueprints and launch application recovery. Nutanix also gives organizations the flexibility to apply policy to each workload, taking control of network security and BC/DR policy with VM-level granularity. By allowing organizations to map out their application owners, Nutanix AHV enables businesses to move from a reactive to a proactive security posture, minimizing the impact of attacks and ensuring swift recovery.

Nutanix and Rubrik’s integration creates a powerful security and operational synergy, giving organizations the tools they need for network safety and, if necessary, a swift and comprehensive restoration of critical systems so they can resume business missions. Nutanix AHV enables organizations to reduce complexity, improve security and achieve a higher level of resilience and operational efficiency.

To learn more about how Nutanix AHV and Rubrik’s integration delivers streamlined data protection, rapid recovery and robust incident response capabilities, watch our webinar, Fortifying AHV: Cyber Recovery and Incident Response with Nutanix and Rubrik.


Software, AI, Cloud and Zero Trust as Top Priorities for the Army and DoD at Large at TechNet Augusta 2023

Many of the major cybersecurity, data, DevSecOps and other trends from the past couple of years continue to grow and be top priorities for every segment of the Department of Defense (DoD). At TechNet Augusta 2023, Government and industry experts shared the specific needs of their organizations across those areas and solutions to help achieve their goals. The main theme of the event was “Enabling a Data-Centric Army” and expanding those principles and their mobilizing technologies to the entire DoD. For the Army in particular, the shift from hardware to software, the use of artificial intelligence (AI), cloud capabilities and Zero Trust were headlining topics at the conference.

Shifting from Hardware to Software

In an effort to increase agility and expand access to resources, the Army is transitioning its equipment from hardware to software. Amending its materiel release process to decouple software from hardware allows the Army to deploy software outside of the long hardware acquisition cycle. To mobilize this endeavor, the Army Futures Command (AFC) is modifying its software requirements to focus on high-level overviews that are then refined by operators. Alongside this shift, the Army and other departments requested that technology providers ensure that their software solutions integrate with each other. Going forward, the Army also asked industry to provide software that is not tied to specific hardware. This separation will be key to establishing data-centricity. Nearly every speaker echoed the importance of this shift for their departments.

Utilizing AI

With this major transition to a software-heavy environment, Army Chief Data and Analytics Officer David Markowitz believes software development will be an ideal use case for generative AI. A controlled software development environment would be easier to govern properly than some of the technology’s more complex applications. As AI usage increases across the DoD, military leaders requested that industry create AI platforms with layered feature complexity, enabling users of any skill level to utilize the technology effectively. In regard to AI applications for data, Army CIO Leonel Garciga stated that additional guidance on “Data Use on Public/Commercial Platforms” would be released soon to clarify its policy. Overall, officials concurred that the DoD is not looking to become 100% reliant on AI but instead to maximize AI’s strengths to augment human critical thinking and empower commanders to make data-driven decisions.

Enabling Cloud Capabilities

Over the past year, the Army has exponentially increased its cloud migration and virtualized capabilities. Housing information in the cloud optimizes data storage and simplifies access, particularly with the increase in data output and the push for AI data analytics and data-driven decisions. Hybrid cloud solutions offer the readiness, adaptability and duplication of vital information necessary for military operations to continue smoothly in any situation. Currently, DoD leaders seek industry solutions for modernizing applications and moving them to the cloud simultaneously. Acquiring technology with this ability would reduce both the security risk and the work required from the military to implement it.

Expanding Zero Trust

Overarching every aspect of the DoD is the critical need for cybersecurity. Garciga plans to emphasize Zero Trust implementation heavily in conjunction with improving user experience and cyber posture. While multi-factor authentication offers a great starting point, military leaders explained that it is not enough and that they look to partner with industry to close virtualization vulnerabilities through continuous monitoring and regular red teaming. At the conference, the Army Cyber Command (ARCYBER) outlined seven principles for IT providers to follow for all capabilities they deliver:

  • Rapidly Patch Software
  • Assess All Production Code for Security Flaws
  • Improve Security of Development Networks
  • Isolate Development Environments from the Internet and from the Vendor Business Network
  • Implement Development Network Security Monitoring
  • Implement Two-Factor Authentication (2FA) on Development Network and Testing Services
  • Implement Role-based Permissions on Development Network

Empowering DoD Success

A consistent thread woven throughout the event was the vital nature of open communication and partnership between the DoD and technology companies to achieve the established goals. Within each of these areas, including the shift from hardware to software, the use of AI, cloud capabilities and Zero Trust, the DoD looks to innovate and explore new methods and solutions to stay ahead on the world stage. Together through collaboration, industry can have a vital role in keeping American citizens safe one technology update at a time.

 

Explore our Federal Defense Technology Solutions Portfolio to learn how Carahsoft can support your organization through innovative, agile defense resources and IT capabilities.

*The information contained in this blog is based on the thought-leadership discussions presented by speakers at TechNet Augusta 2023.*

Overcoming Data Challenges With Virtualization

Despite the variation in their individual mandates, all regulatory agencies have one main objective: to protect the public. However, there are hurdles to this goal. There are heavy costs associated with data warehousing, as large projects require extensive telecommunication and server space. This can be both expensive and time-consuming. Luckily, by implementing data virtualization tools, agencies can overcome these constraints and provide more effective services.

What is Data Virtualization?

Data virtualization is an approach to data management that helps organizations accelerate the turnaround time for converting data into digestible information. Data sources can span a variety of locations, including databases and data stores as well as any documents, emails or spreadsheets an agency holds. With such a wide array of data, accessing and understanding all vital information can be both time-consuming and overwhelming. Data virtualization streamlines access to the answers and information agencies and users require.

How It Works

Data virtualization software begins by creating a layer over or around all existing data sources in an organization. Through a single complementary interface, the software returns the needed information, saving the abundance of time otherwise spent combing through individual sources for a single piece of information.

Another major benefit is that data virtualization software creates a layer of abstraction between the data source and what the user ultimately sees. The software arranges heterogeneous data from all the different sources across an organization and quickly presents it to the user. Because the software interacts directly with each source, every data source is correctly represented, and users receive sufficient context behind the information they are accessing.
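
A minimal sketch of this abstraction-layer idea, assuming invented source names and schemas: each adapter knows how to read one kind of source, and the virtualization layer presents them all through a single query interface.

```python
import csv
import sqlite3

class CsvSource:
    """Adapter that reads rows from a CSV file."""
    def __init__(self, path):
        self.path = path
    def rows(self):
        with open(self.path, newline="") as f:
            yield from csv.DictReader(f)

class SqlSource:
    """Adapter that reads rows from a SQLite query."""
    def __init__(self, db_path, query):
        self.db_path, self.query = db_path, query
    def rows(self):
        conn = sqlite3.connect(self.db_path)
        conn.row_factory = sqlite3.Row
        for row in conn.execute(self.query):
            yield dict(row)
        conn.close()

class VirtualLayer:
    """Presents heterogeneous sources as one uniform, filterable view."""
    def __init__(self, sources):
        self.sources = sources
    def query(self, **filters):
        for source in self.sources:
            for row in source.rows():
                if all(row.get(k) == v for k, v in filters.items()):
                    yield row

# Hypothetical files and schema: one query spans both sources.
layer = VirtualLayer([
    CsvSource("licenses.csv"),
    SqlSource("records.db", "SELECT * FROM applicants"),
])
for match in layer.query(status="pending"):
    print(match)
```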

The Benefits of Virtualizing Servers

Typically, data virtualization exists between the user and their vast array of data sources. Virtualizing tools have several benefits. They:

  • Reduce the processing time and cost
  • Provide a single, consistent way to accomplish a variety of goals and objectives
  • Reduce expenses associated with data integration

In addition to these advantages, virtualized servers carry the same security benefits as any other IT system. Data servers exist on a single network and are isolated from potential threats, and network isolation and segmentation prevent unnecessary crossover of information. With granular access control, users can implement micro-segmentation to strengthen this isolation further. Lastly, by maintaining updates and new security patches, virtualized servers stay current with the latest cybersecurity practices. For a professional licensing agency, it is always beneficial and necessary to secure its software, but beyond these standard practices, no additional steps need to be taken to protect virtualized servers.

Choosing the Right Data Virtualization Software

The process of implementing data virtualization can be daunting at first. As each organization differs in the types of information it collects and how that information is categorized, data virtualization will also differ. However, there are a few elements that regulatory agencies should consider. First, regulators should determine the setup/layout of their existing organization structure. Questions to consider include:

  • What existing technology is owned?
  • What systems are being worked with?
  • What are the agency’s needs?
  • What are the agency’s top priorities?

All these factors contribute to how data virtualization is implemented. Once a regulator reaches a sufficient level of technological maturity, it should begin looking into fully implementing data virtualization. With the proper virtualization software, regulators can swiftly sift through information.

Data Virtualization Servers Reduce Time, Resources and Cost for Regulators

For a variety of agencies, data virtualization can greatly streamline and improve their access to information. By transforming manual systems into a digital, accessible process, virtualization servers reduce time, resources and cost for regulators in their ongoing work to best utilize data to aid the public.

To learn more about Thentia’s data virtualization solutions, visit our website.

The Technology Modernization Fund’s Projects for Zero Trust Innovation

In the wake of the COVID-19 pandemic and its long-lasting effects, most agencies have begun adopting new security standards to meet changing needs. The Technology Modernization Fund (TMF), a government funding program established in 2017, aims to aid federal agencies with funding in order to accelerate the thorough completion of projects. While the TMF did not directly award grants for pandemic relief, many of its cybersecurity projects meet critical needs that were heightened by the pandemic and working from home. Three of its major projects center on helping departments accelerate the transition to zero trust, widely regarded as the most effective security strategy. These projects focus on Zero Trust Architecture at the US Department of Education (EDU), Advancing Zero Trust at the GSA and Zero Trust Networking at OPM.

Zero Trust Architecture

One project is the TMF’s funding of Zero Trust Architecture for the EDU. The TMF will invest $20 million in the department over a two-year span to strengthen its zero trust architecture and increase the security of citizen data maintained by the department.[1] This includes the more than one hundred million students and borrowers that the EDU supports. The project’s goal is to improve data security through strategy, architecture, design and implementation roadmaps. The department will also gain a catalog of services with Secure Access Service Edge (SASE) and Security Orchestration, Automation, and Response (SOAR) technologies.

Advancing Zero Trust

The TMF has awarded $29,802,431 to the US General Services Administration (GSA) for advancements in zero trust.[1] As the GSA’s shared services support millions of users and hundreds of facilities, it is vital for the agency to advance its cybersecurity architecture. Through its TMF funding, the GSA intends to focus on three main areas:

  1. Users and Devices: The GSA aims to replace directory designs to meet the new demands of telework and multi-domain, hybrid cloud architecture. It will implement virtualization with security strategies (including a single sign-on, multi-factor authentication option) in order to:
    • Increase cybersecurity identification and protection
    • Add both equitable, online identity verification and in-person options for improved accessibility to vulnerable populations
    • Reduce the barriers to Login.gov by expanding the website features for both current and future users.
  2. Networks: By leveraging a SASE solution and upgrading public buildings’ security networks, the GSA hopes to add micro-segmentation to secure networks.
  3. Security Operations: The GSA intends to adopt machine learning and artificial intelligence-driven algorithms. This would help connect diverse data sources, highlight system threats and provide managerial oversight.

Zero Trust Networking

Finally, the TMF has allotted $9.9 million toward accelerating the adoption of zero trust networking in the Office of Personnel Management (OPM).[1] The OPM hopes to improve its cybersecurity architecture across the data; identity; devices and endpoints; network and environment; and application workload areas. It aims to use the TMF funds to reduce the attack surface and increase managerial visibility over networks. In doing so, it will improve the security of data and privacy protections for over two million civilian federal employees.

Increased financial assistance allows agencies to implement vital security measures. In 2015, OPM experienced a breach of its personnel systems that led to the loss of more than 20 million personnel records.[2] To solidify its systems, the OPM conducted research on zero trust before ever receiving funding. As a result of the TMF’s aid, the OPM can now implement this security architecture faster than anticipated. OPM has created five teams to execute this work: architecture engineering, cloud operations, service management, service automation and migration.[3]

Government Services for the People

TMF is vital to accelerating and improving agency projects. With the extra funding, organizations can afford better oversight measures, such as outside experts. The security measures that the TMF supports, such as managerial oversight, smaller attack surfaces, micro-segmentation and identity verification, are all vital to the changing work landscape. Even features such as expanded website capabilities will enhance public connection to government agencies in a post-COVID-19 landscape. With this increased connection, the federal government can better achieve its goal of being for and by the people.

 

View Adobe’s Experience Cloud Demo page for more insights on Technology Modernization Fund and cybersecurity.

 

[1] “Investments: Advancing Zero Trust,” The Technology Modernization Fund. https://tmf.cio.gov/projects/#advancing-zero-trust

[2] “The Awards Focus on Zero Trust and Include a Major Investment in the Login.gov Federal Digital Identity Solution,” FCW. https://fcw.com/security/2021/09/7-new-tmf-awards-include-one-classified-project/259192/

[3] “Technology Modernization Fund support awarded to 7 new agency IT projects,” FedScoop. https://www.fedscoop.com/opm-speeding-zero-trust-tmf/

The DoD’s Move to 5G Infrastructure and Devices

 

Over the last several years, the discussion around 5G has moved from hope and planning to pilots and test beds. Now agencies and industry are on the cusp of a 5G reality. Agencies are already spending billions of dollars on these 5G tests, and now the Federal Communications Commission and others are providing more money to further roll out 5G infrastructure. Taken altogether, 5G is close to the tipping point where a technology becomes ubiquitous. The FCC has allocated $9 billion to roll out 5G infrastructure across rural America. Meanwhile, the Defense Department and the Coast Guard are already seeing the benefits of 5G for servicemembers. In the latest Federal News Network Expert Edition report, hear from leaders at DoD, the Coast Guard, FCC and CISA on how 5G can bring new capabilities and innovations that allow agency personnel to experience data, training and operations in ways not possible before.

 

Enterprise-Grade Security Is Vital for Secure 5G Infrastructure

“Top of mind regarding 5G benefits is security. To be fair, 5G also comes with its own risks: The rapid proliferation of endpoint devices enabled by 5G means a massive expansion of the threat surface. And because most of those devices are mobile or sensors, they’re not secure to begin with. But 5G also enables the solution to these problems. For one thing, it adds heightened authentication, which is important because the biggest vulnerability to a network is the user. Users can add malicious software to devices, which can access data they’re not supposed to or influence the way the network operates.”

Read more insights from Palo Alto’s Senior Systems Engineering Specialist for 5G and Mobility, Bryan Wenger.

 

How DoD, IC Can Adopt Commercial Tech in the Mission Space Through Industry Co-Innovation

“From an operational perspective, technologies like 5G are going to exponentially increase the amount of data available within the enterprise, because nearly anything can become a sensor. That means, for example, in the area of contested logistics, the DoD will be able to have greater understanding and visibility into its supply chain nodes. More accurate inventory and consumption levels will provide better insight into the demand signal and allow for automation through a logistics system. It’s a smart depot all the way down to the individual soldier, but this makes it all the more critical to properly manage this data. This is an area where commercial technologies are well established and proven to work.”

Read more insights from SAP NS2’s CTO, Kyle Rice.

 

Neutral Host Networks, Private LTE Can Give Agencies Greater Flexibility, Security

“Neutral host networks can provide agencies with more autonomy and control over their networks. For example, a federal facility can set up a neutral host LTE network to mimic security controls they would usually use on their enterprise Wi-Fi. That also provides an infrastructure separate from service carriers in that area, but that is also capable of supporting and extending the service range of those carriers. In many remote or rural areas, there aren’t enough subscribers to justify investment in a large-scale LTE deployment. Federal agencies could potentially sublease a network as a revenue stream or cost offset. It’s like paving a road with private funds, then setting up a toll booth to cover the cost.”

Read more insights from Dell’s Lead System Architect, Chris Thomas.

 

JMA Brings Savings, Flexibility to 5G with Software Virtualization

“Virtualization is when you take something that used to be done in hardware, and you do it in software. Take your phone as an example: You used to have a dedicated iPod to do your music, and now it’s an application on your phone. The same thing can be said now in mobile wireless. At a cell site, you used to deploy numerous racks of equipment, to do what’s called the RAN function, the radio access network function. We at JMA take those racks of equipment, and we’ve now converted that into a 100% software solution that we call XRAN. Others in the industry have also converted RAN into software, but they still rely on specialized hardware accelerators. JMA’s is unique in that it provides 100% 5G capability in software.”

Read more insights from JMA’s Senior Vice President for the Federal Market, Andrew Adams.

 

Download the full Federal News Network Expert Edition report for more insights on the future of 5G from Carahsoft’s technology partners and leaders at DoD, the Coast Guard, FCC, and CISA.

The Best of What’s New in Hybrid and Remote Work

When the COVID-19 pandemic struck in March 2020, agencies scrambled to expand secure connectivity and acquire mobile devices, but most state and local CIOs say their organizations transitioned relatively easily to working from home on an emergency basis. Now, with COVID-19 cases in the U.S. dropping dramatically and economies reopening, public agencies face a more complicated issue: figuring out where and how state and local government employees will work going forward. A 2020 CDG national survey found almost 75 percent of respondents anticipate hybrid work — where employees work from home at least on a part-time basis — will be their long-term model. The trend is particularly strong at the state level where just 16 percent of respondents anticipate returning to a fully in-person work environment. Read the latest insights from industry thought leaders in hybrid and remote work in Carahsoft’s Innovation in Government® report.

 

Modernizing Contact Centers to Enable Remote Work

“To ensure callers have a secure, fluid and reliable customer experience, agencies must maintain diverse channels of communication. Another challenge is ensuring that contact center agents have secure and timely access to their agency’s database, intuitively orchestrated communications and sufficient bandwidth for reliable connectivity. Organizations also need to minimize the learning curve associated with introducing new endpoints such as Bluetooth-enabled headsets, softphones and web real-time communication (WebRTC), which eliminate the need for traditional desk phones and enable workers to use their laptop for voice or digital interactions.”

Read more insights from Genesys’s Senior Solutions Consultant, Ivory Dugar.

 

The Digital HQ: Flexible, Inclusive and Connected

“What we’ve seen over the past year hasn’t just been about working from home. It’s been working from home during a pandemic. As the pandemic has stretched into its second year, employees are feeling the strain. The data show that even though the work-from-home experience is better than working in the office full time, employee satisfaction with work-life balance has declined and stress and anxiety have increased. A contributing factor to that stress is the pressure to demonstrate productivity. A third of remote workers say they feel pressure to make sure their managers know that they’re working.”

Read more insights from Slack’s Future Forum Senior Relationship Manager, Dave Macnee, and Customer Success Leader for Public Sector, Kevin Carter.

 

Giving Remote Workers Access to Resources They Need

“Centralized IT management and virtualization technology are critical to manage infrastructure and address changes quickly and at massive scale — whether that’s to patch a vulnerability across all user devices, upgrade applications or deploy additional computing resources. IT can make a change once via software and then distribute it to everyone’s device within minutes with minimal downtime. Software can monitor network traffic and resource utilization in aggregate and then automatically allocate resources as needed so organizations don’t have to invest in higher-performance user devices or purchase more hardware. In addition, organizations can isolate workloads and systems for security or other purposes, meaning multiple workloads and operating systems can run on the same device.”

Read more insights from NVIDIA’s Senior Manager of Public Sector, Chip Carr.

 

Managing Process and Cultural Change

“It’s projected that 30 to 35 percent of the public sector workforce will remain remote. A lot of these workers will probably be younger. To attract and engage the workforce of the future, you have to keep systems, processes and tools up to date. Younger people run their lives on their phone. If you expect them to submit to completely manual paper-driven processes, you’ll probably never get a chance to hire them, much less retain them. You also have to find out what they need to be successful in a remote environment; show them a path to promotion; and demonstrate that remote, hybrid and on-prem teams are aware of and understand their value to the organization.”

Read more insights from SAP Concur’s Senior Director of Public Sector, Jim McClurkin.

 

Navigating the New Frontier

“Having more flexibility and removing the location barrier opens up real opportunities, especially when it comes to competing for specialties like IT. Some states prohibit hiring out of state, but organizations can still widen the pool to include candidates beyond their local headquarters. They can recruit candidates who want to reside in areas with a lower cost of living or who don’t have the time to commute, for example. This flexibility also helps attract minorities and women, which in IT work, has been a real challenge.”

Read more insights from CDG Senior Fellow, Peter K. Anderson.

 

Download the full Innovation in Government® report for more insights from these hybrid and remote work thought leaders and additional industry research from GovTech.

Deconstructing the Benefits of Virtualization Infrastructure for Government Agencies

Getting Started with VDI

Different hardware and software components must come together in a coherent way to create a good desktop virtualization experience. Important components, such as GPUs for high-fidelity video, are often invisible to the end user. But government agencies that are assembling VDI solutions must take all of these pieces into consideration and understand how they depend on each other.

When choosing and architecting a VDI solution, the first question to ask is: what is the workload you need to support? Agencies must consider not only which applications users need but also the kind of data they are accessing. Virtual machines come in preset sizes, so agencies can choose the size that best fits the workload, as in the sketch below. You also need the right hardware stack to support that VDI solution and optimize the users’ experience.
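
As a simple illustration of fitting a workload to preset sizes, this sketch picks the smallest size that satisfies a workload’s CPU and memory needs. The size catalog and workload numbers are invented for illustration.

```python
# Illustrative size catalog: (name, vCPUs, RAM in GB), smallest to largest.
VM_SIZES = [
    ("small", 2, 8),
    ("medium", 4, 16),
    ("large", 8, 32),
    ("xlarge", 16, 64),
]

def best_fit(vcpus_needed: int, ram_gb_needed: int) -> str:
    """Return the smallest preset that satisfies the workload."""
    for name, vcpus, ram in VM_SIZES:
        if vcpus >= vcpus_needed and ram >= ram_gb_needed:
            return name
    raise ValueError("workload exceeds largest preset")

# A CAD-style workload needing 6 vCPUs and 24 GB of RAM lands in "large".
print(best_fit(6, 24))  # -> large
```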

Supporting the Edge

One consideration for government agencies is supporting devices at the edge, which carry particular concerns around data sovereignty, latency and accreditation. You also want a solution that can be connected, or disconnected behind an air gap, as needed, so you can support mission workloads and security needs. Agencies should look for the capacity to extend AI to edge devices so they can run advanced analytics closer to where the data is being generated. This allows real-time insight into workloads so users can envision new possibilities.

Image Quality

When you use graphics or video, frame rate and the quality of those frames become very important. The workload determines what capacity you need. Are you using Zoom or Teams? Do you have big graphics-heavy applications like Google Earth? Or are employees using large workstation applications like ANSYS, MATLAB or ArcGIS, which require more hardware? Some use cases, such as a doctor looking at an MRI image, require the highest resolution. 30G is the standard within media entertainment because it gives the best image quality. But images are often created on the server and then encrypted, compressed, and decompressed—processes that can impact the user experience.

Hardware solution architects can help agencies understand the workload and put together the right package to achieve your goals. Requirements for use and quality affect your choices on both the server and the client side, as well as hardware choices in the datacenter; your CPU, GPU, RAM and I/O all need to work together seamlessly.

It is important for your endpoint devices to be capable of decoding the data stream from the server and keeping up with the content. If you have four times the number of pixels, it taxes the monitor hardware and requires the endpoint device to decode more, as the worked example below shows. The datacenter architecture should not be undermined by insufficient endpoints.
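
A quick worked example of that pixel arithmetic, using uncompressed throughput as an upper bound. Real remoting protocols compress heavily, so actual network load is far lower; the point is that the decode burden scales with pixel count.

```python
# Moving from full HD to 4K quadruples the pixels per frame, and raw
# (uncompressed) throughput scales accordingly. 24 bits per pixel assumes
# standard 8-bit RGB color.
def raw_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

hd = raw_gbps(1920, 1080, 30)    # ~1.5 Gbps uncompressed
uhd = raw_gbps(3840, 2160, 30)   # ~6.0 Gbps uncompressed
print(f"1080p30: {hd:.2f} Gbps, 4K30: {uhd:.2f} Gbps, ratio: {uhd/hd:.0f}x")
```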

Hardware

Hardware plays a critical role in VDI and hardware flexibility can profoundly affect the user experience. You want a solution that spans everything from the edge data center out to the cloud—whether it’s on premises, in a public facility, or in a dedicated sovereign cloud. Agencies should look for hardware that can work across the entire range of hardware requirements—including compute, storage, and networking. Solutions should offer managed services regardless of your cloud provider.

Government agencies can benefit from hardware as a service, which allows you to avoid big upfront costs by distributing them over time. You should also seek out robust worldwide services spanning advisory, professional services, cloud consulting and more.

The network is an absolutely critical part of delivering a remote solution. State-of-the-art VDI machines and architecture can be slowed significantly by a bad network. A common challenge is getting the desktop data from the server out to the endpoint, which is a very network-dependent process. The top priority is choosing the right media to support your latency requirement. Then you can leverage in-memory caching capabilities to take it to the next level and lower the latency further.

The Role of the GPU

A GPU is more than just something to generate pixels. It provides high frame rates, lowers latency, and allows a remote user experience to feel local. Government agencies always want to reduce network latency, and encoding on the GPU is one way to accomplish this. Without a GPU, the CPU must do all the graphics work, which requires compression and slows the process. Having the GPU available for graphics work frees up your CPU to run applications.

You need the right GPU to ensure that there are no bottlenecks in the system. It’s critical to get the right number of pixels and refresh rate at the endpoint. There are a number of services that will bring together all the components into different cohesive solutions, creating a turnkey system that can start working immediately.

 

View our presentation to learn more about how virtualization infrastructure can benefit your federal agency.

Tweaking Your Monitoring Strategy to Give End Users a Seamless Experience

Technology and end users have an almost paradoxical relationship: as technology gets more complex, end users expect a more seamless experience. At the same time, end users are also looking for maximized application availability and performance.

How can a federal IT team meet these increasing demands? The answer is monitoring.

Monitoring is obviously not a new concept. Yet today, most monitoring instances have been implemented as an afterthought or as a way to solve a single, specific problem. To provide a seamless user experience while enhancing application performance, federal IT pros must establish monitoring as a core IT function. The key is understanding how to normalize metrics, alerts, and other collected data from different applications and workloads—regardless of their location—to enable a more efficient approach to troubleshooting, remediation, and optimization.
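
As a minimal sketch of that normalization step, the adapters below map two invented source formats onto one common record shape, so downstream alerting and dashboards can treat all workloads uniformly regardless of origin.

```python
from datetime import datetime, timezone

# Each adapter converts one source's native shape into a common record:
# the source formats shown here are invented for illustration.
def from_app_a(raw):   # e.g., {"ts": 1700000000, "cpu_pct": 91}
    return {
        "time": datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        "metric": "cpu_utilization",
        "value": raw["cpu_pct"] / 100.0,
        "source": "app_a",
    }

def from_app_b(raw):   # e.g., {"timestamp": "2023-11-14T22:13:20Z", "cpu": 0.91}
    return {
        "time": datetime.fromisoformat(raw["timestamp"].replace("Z", "+00:00")),
        "metric": "cpu_utilization",
        "value": raw["cpu"],
        "source": "app_b",
    }

records = [
    from_app_a({"ts": 1700000000, "cpu_pct": 91}),
    from_app_b({"timestamp": "2023-11-14T22:13:20Z", "cpu": 0.91}),
]
for r in records:
    print(r)  # identical schema regardless of origin
```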

With this approach, agencies can also benefit from a much more proactive IT management strategy, improved infrastructure performance and security, and reduced costs.

Start With a Modular Platform

The place to start building this monitoring infrastructure is with a platform. Look specifically for a modular platform, one with the ability to add tools seamlessly based on changing needs and capable of growing with your agency. It must also be something different teams from separate IT disciplines can use. Look for a platform designed to simplify integration by providing a common set of services across products, including a unified UI, customizable dashboards, and intelligent alerts and reports.

Consider Cross-Stack Correlation

Cross-stack correlation is another thing to consider. It’s critical for federal IT teams to be able to visualize and correlate data across the entire IT stack. Monitoring networks, systems, virtualization, infrastructure, and storage environments involves collecting and analyzing millions of different metrics; sorting through these disparate data points can be a significant challenge.

This is where cross-stack data correlation comes in. It allows you to correlate data across the application delivery stack to compare all collected metrics regardless of their location, whether they’re physical or within the delivery path. You can also overlay different types of metrics and measure different aspects of the infrastructure on a common timeline within a single view. The whole process should be simple enough to create ad hoc dashboards on the fly.
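
A minimal sketch of that correlation step, using pandas to resample two invented metric streams onto a shared one-minute timeline so a spike in one layer can be compared directly against another:

```python
import pandas as pd

# Invented sample data: network latency and database query times arriving
# on their own irregular schedules.
net = pd.DataFrame(
    {"latency_ms": [12, 14, 95, 13]},
    index=pd.to_datetime(["10:00:05", "10:01:10", "10:02:20", "10:03:15"]),
)
db = pd.DataFrame(
    {"query_ms": [40, 45, 300, 42]},
    index=pd.to_datetime(["10:00:30", "10:01:40", "10:02:05", "10:03:50"]),
)

# Align both series to one-minute buckets on a common timeline.
combined = pd.concat(
    [net.resample("1min").mean(), db.resample("1min").mean()], axis=1
)
print(combined)
print(combined.corr())  # the correlated spike at 10:02 ties the layers together
```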

Filter Out Alert Noise

Federal IT teams should also aim to find a platform providing them with the ability to filter out noise through things like machine learning-based anomaly detection. Filtering out alert noise means speeding up mean time to resolution for a given issue, ensuring minimal impact to application availability and performance.
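
As a simple stand-in for the machine learning-based detection described above, the sketch below alerts only when a reading deviates sharply from its own recent baseline rather than on every threshold breach; the window size and z-score threshold are illustrative assumptions.

```python
import statistics

def filter_alerts(readings, window=20, z_threshold=3.0):
    """Surface only readings far outside their own rolling baseline."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1e-9  # guard against zero spread
        z = (readings[i] - mean) / stdev
        if abs(z) > z_threshold:
            alerts.append((i, readings[i], round(z, 1)))
    return alerts

# Steady traffic with one genuine spike: only the spike surfaces as an alert.
readings = [100 + (i % 5) for i in range(40)]
readings[30] = 400
print(filter_alerts(readings))  # -> [(30, 400, ...)]
```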

Accelerate Troubleshooting With Advanced Reporting Capabilities

There’s one more thing to consider with a unified monitoring approach: advanced reporting capabilities. Look for a solution capable of accelerating troubleshooting across the entire IT stack by sharing data for contextual visibility and relationship mapping. Consolidated metrics and data within a single view puts disparate information into a meaningful context and provides the mechanism for creating advanced alerts and responses for faster root cause identification and automated resolution.

At the end of the day, the goal of the federal IT pro is to deliver—reliably and consistently—applications and services to end users. To meet this goal, it’s imperative to be able to quickly determine the issue and its solution. Having a platform-based resource providing cross-functional collaboration and correlation as well as advanced reporting capabilities is critical to resolving problems quickly and providing a seamless end-user experience.

Visit our webpage to learn more information on key features of multi-vendor network monitoring that scales and expands with the needs of your network.

The Rise of Edge Computing

The proliferation of internet-of-things (IoT) sensors and an increasingly mobile workforce were dispersing government IT operations farther from the data center long before the coronavirus struck. But the pandemic has spotlighted agency employees’ increasing need for robust, secure capabilities in the field — or at home, in the case of remote work — and decision-makers need fast access to data analytics in a wide variety of situations. All those factors are driving interest in computing at the network edge, or processing data at the site of generation rather than storage. Edge computing has profound implications for a wide range of government missions across local, state, and Federal government, and with the emergence of 5G networks, it is becoming easier to incorporate. And if implemented thoughtfully, the benefits can be immense – reduced network stress, increased cybersecurity and savings in cost, time and storage. Read the latest insights from industry thought leaders in edge computing in Carahsoft’s Innovation in Government® report.

 

Streamlining the Adoption of Edge Computing

“Open source is a necessary component of edge computing for two main reasons. First, open source is much more secure than its proprietary counterparts due to the increased transparency. For edge deployments with hundreds or even thousands of sites, initially securing and maintaining them are solved through Red Hat open source. Second, open source supports a level of innovation most proprietary systems simply can’t match. When thousands of people work on a technology, that gives it a substantial advantage in terms of new ideas and accelerated innovation.”

Read more insights from Red Hat’s Practice Lead of OpenShift Virtualization, Storage and Hyperconverged Infrastructure in the North American Public Sector, Garrett Clark.

 

A Unified Approach to Edge Computing

“To avoid piecemeal implementation, edge computing must be part of an agency’s overall IT infrastructure. When done well, it will empower agencies to make more efficient and faster decisions because they’ll be able to harness more data from across the entire landscape. It will also give end users better and faster access to data in the field so they can take advantage of those insights in real time. Edge devices will not replace existing IT but instead will expand on what’s already in place. By incorporating edge computing into enterprise modernization, agencies can also start applying machine learning and other emerging technologies to harness the power of data. However, with edge devices and data now outside agencies’ firewalls, security must be embedded into edge computing. Important tools include automated security and centralized management, perhaps via the cloud.”

Read more insights from Nutanix’s Senior Director of Public Sector Systems Engineers, Dan Fallon.

 

How to Unleash the Power of Edge Computing

“Edge computing holds a great deal of promise as a stand-alone capability, but when paired with technologies such as advanced connectivity and enterprise data platforms, edge computing can fuel new customer and employee experiences at scale. When agencies combine edge computing with advanced connectivity, for example, they can empower rich, personalized experiences for customers as well as employees. Imagine moving from a 2D world of video consumption to a 3D world with immersive experiences personalized at scale for the individual. Edge computing coupled with advanced connectivity and SAP’s data platform can serve as the foundation to bring these new experiences to life. To help fuel this innovation, advanced connectivity such as 5G and Wi-Fi 6 play an integral role.”

Read more insights from SAP’s Vice President, Global Center of Excellence, Frank Wilde.

 

Accelerating Mission Success at the Edge

“Sometimes an agency will want to be in a cloud environment, sometimes it will choose an edge computing environment, and often, it will need both. In that situation, some quick analytics can happen at the edge, but then the data can move to the cloud for a deeper evaluation that will draw out more predictive insights and analytics. There are three key considerations agencies should keep in mind when moving to edge computing. First, they should think about it as part of a larger continuum alongside their core technologies, including cloud. Second, agencies should design for consistency in management and orchestration. Regardless of where a workload is running, a consistent approach helps agencies manage IT resources and costs and allows the organizations to scale and expand. The third consideration is more far reaching, but I encourage agency leaders to think about the opportunities that edge computing opens up.”

Read more insights from Dell’s Global Marketing Director of Edge and IoT Solutions, Kirsten Billhardt.

 

Beyond the Data Center and the Cloud

“We expect the number of connected devices to reach nearly 45 billion by 2025, gathering close to 80 zettabytes. Unfortunately, sending that growing amount of data to the cloud for processing is not always the best option due to bandwidth limitations and cost concerns. Many government systems are also not connected to the cloud and need to process data locally. Edge technology evolved to meet those challenges by bringing the advantages of cloud closer to the edge. Business applications enabled by edge computing include autonomous delivery, machine control, environmental monitoring, fleet vehicle diagnostics, vision-based analytics and defect detection. Edge computing is particularly beneficial in two situations: when a great deal of data needs to be migrated to the cloud for storage but there is little or no bandwidth and when data needs to be collected and acted on quickly at the edge (e.g., autonomous vehicles and drones).”

Read more insights from AWS’s Principal Technical Business Development Leader for IoT in the Worldwide Public Sector, Lorraine Bassett.

 

Edge: The Next Paradigm Shift in IT  

“Agencies can protect their data and applications across any cloud strategy (including on-premises, private, hybrid, multi-cloud or edge computing) with a cloud-agnostic, edge-based Web Application and API Protection (WAAP) solution. A globally distributed WAAP will protect websites, applications and APIs from downtime and data theft due to web attacks and distributed denial-of service (DDoS) attacks. All network-layer DDoS attacks, including those by large IoT botnets, are instantly dropped at the edge because a WAAP functions as a reverse proxy and only accepts traffic via ports 80 and 443. Any application-layer DDoS or web attack will be automatically inspected and stopped at the edge without disrupting access for legitimate users. Additionally, modern application architectures are shifting toward greater use of microservices and away from monolithic pieces of software. Small, independent microservices are assembled into more complex applications so they can leverage fully functional and distributed processes from third-party APIs.”

Read more insights from Akamai’s Senior Vice President of Web Performance, Lelah Manz.

 

Download the full Innovation in Government® report for more insights from these government edge computing thought leaders and additional industry research from FCW.

Best of What’s New in Health and Human Services

The COVID-19 pandemic is forcing dramatic modernization. Driven by urgent social distancing requirements, Health and Human Services (HHS) organizations virtualized an array of services that traditionally have been performed face-to-face, and unlike typical HHS modernization projects, these changes happened with unprecedented speed. And although these moves were made in immediate response to the COVID pandemic, they’re likely to have long-term impacts on the digital experience for HHS clients, how and where HHS staff members work, and how these organizations purchase and deploy technology. Pandemic-driven uptake of virtual work and digital services could have long-term positive impacts on HHS workforces and the clients they serve; internally, these changes could improve employee satisfaction and retention within HHS organizations. Learn the latest insights from industry thought leaders in healthcare in Carahsoft’s Innovation in Government® report.

Focusing on Outcomes that Matter

“One place that organizations get stuck is in ‘good enough.’ Unless something’s horribly broken, they stay with what works today instead of pursuing continuous improvement cycles that include customer satisfaction. Organizations that are satisfied with their current operation and their current level of service tend not to want to adopt — or can’t adopt quickly — opportunities that digital technology can offer. Change is exponentially more difficult to execute without a culture that pursues excellence in service quality. To foster a culture that responds to and embraces change, it’s important to adopt a quality approach like Lean or another continuous improvement cycle.”

Read more insights from Salesforce’s Health and Human Services Industry Executive, Rod Bremby.

 

Using Data to Lead Through Change

“The reality is there will never be a truly perfect dataset. Early in the pandemic, I supported agencies that knew their data wasn’t perfect, but they also knew they had to save lives. They executed without hesitation; they built analytical dashboards and evolved them as processes and data collection capabilities improved. That approach enabled them to make increasingly better, more rapid decisions. Other agencies are still working through multiple iterations to get their data and reporting just right; meanwhile they are not making data-informed decisions. This pandemic has proven that it’s the unknown questions that we discover along the way that create change and ultimately drive progress.”

Read more insights from Tableau’s Senior Manager of Solution Engineering, Anthony Young.

 

Virtualization: Rapid, Flexible and Cost-Effective Path to Digital Transformation

“Organizations that are most effective in modernizing their application portfolios do three things well: 1) crafting an application modernization strategy to identify what to modernize and how to do it; 2) crafting a cloud strategy to determine how to integrate cloud services into their modernization strategy; and 3) standardizing on a single platform to build, run, manage and secure applications running in a multi-cloud environment. This platform provides a single pane of glass through which organizations can develop and deploy modern container-based applications across a multicloud environment. Virtualization technologies for things like cloud load-balancing, firewalls and software-defined networking further enable organizations to integrate cloud services with their on-premises workloads while providing robust end-to-end security.”

Read more insights from VMware’s State and Local Education Strategist, Herb Thompson.

 

Integrating the Continuum of Care

“Enterprise iPaaS helps integrate disparate or hybrid architectures across the continuum of care. It provides a single instance, multitenant architecture that frees organizations from having to do things like manage code versions. iPaaS also lets organizations modernize without replacing everything they currently use. They can augment and move forward to support low code, agility, and intelligence and insights. That creates a very high return on investment because organizations can focus on their business initiatives and clinical or business outcomes instead of undertaking enterprise IT projects.”

Read more insights from Dell Boomi’s Healthcare CTO Evangelist, John Reeves.

 

Improving Citizens’ Digital Journey Through HHS

“The two key pillars of creating exceptional digital experiences are content and data, and artificial intelligence (AI) and machine learning (ML) can help with both. Using AI and ML, organizations can automate repetitive tasks that prevent them from producing and personalizing content at scale and on every single device. For example, organizations can use the Dell Boomi Enterprise IPaaS platform to automate aspects of website design, layout and creation, as well as the conversion of PDFs to adaptive interactive forms. In terms of data, organizations can use AI to sift through volumes of data and unlock insights that help them understand customers, predict trends, monitor unusual activity and act faster.”

Read more insights from Adobe’s Health and Human Services Director, Megan Atchley.

 

Re-Imagining Healthcare

“Organizations can use AI and ML to look at data in its entirety and automate processes that improve the patient experience and patient care. In addition, AI and ML can help healthcare organizations understand and improve revenue cycle management and internal operations. Chatbots are another emerging technology. With the appropriate bot framework, organizations can quickly develop intelligent, automated questionnaires that patients can step through to find out whether they need a COVID test or a checkup, for example. The chatbot uses their responses to move them to the next appropriate step in the care plan. Collaboration technologies also have become more important for effective virtual visits with patients and for virtual consultations between clinicians.”

Read more insights from Microsoft’s U.S. Chief Medical Officer, Clifford Goldsmith.

 

Download the full Innovation in Government® report for more insights from these healthcare thought leaders and additional industry research from GovTech.