Better Together: How HPE, AMD and Nutanix Empower Modern Enterprises

The rapid evolution of enterprise technology has made modernization an urgent priority. Businesses today face challenges ranging from complex infrastructure and escalating costs to the rising demands of artificial intelligence (AI) and hybrid cloud environments. Together, Hewlett Packard Enterprise (HPE), Advanced Micro Devices (AMD) and Nutanix provide unified solutions that simplify operations, strengthen security and deliver unmatched performance, empowering organizations to navigate current demands and prepare for the future.


Addressing Market Challenges with Innovation

In a dynamic market where infrastructure complexity and cost pressures are top concerns, the combined expertise of HPE, AMD and Nutanix is driving transformative solutions. Nutanix’s hyperconverged infrastructure (HCI) simplifies multicloud management, enabling organizations to run workloads across on-premises, public and private clouds or colocation sites. With intuitive tools like Prism, Nutanix delivers flexibility, cost efficiency and robust security.

On the hardware side, AMD’s EPYC Central Processing Units (CPUs) have revolutionized the data center market, achieving a 34% market share through scalability (i.e., higher core-count options that help reduce server footprint). Designed for diverse workloads, including analytics and hybrid workforce applications, AMD solutions like the 4th Gen EPYC CPUs provide outstanding performance while optimizing total cost of ownership (TCO).

Meanwhile, HPE’s ProLiant DX Gen 11 servers offer fast deployment, tailored configurations and scalable options for diverse business needs. Supported by OpEx models like GreenLake, HPE ensures financial flexibility, making modernization accessible for organizations of all sizes.


Unlocking the Potential of AI


AI is reshaping industries, and the HPE, AMD and Nutanix partnership enables enterprises to meet its infrastructure demands. Nutanix’s HCI platform, paired with AMD’s EPYC CPUs, delivers optimized performance for AI and machine learning (ML) workloads. The Nutanix DX 385 model supports up to four double-wide Graphics Processing Units (GPUs), providing accelerated compute for AI-driven environments. With features like network microsegmentation and automated lifecycle management, Nutanix ensures secure, optimized environments for AI applications.

AMD’s EPYC processors are tailored for AI applications, from small-scale enterprise large language models (LLMs) to large-scale generative AI. High core density and features like Secure Encrypted Virtualization (SEV) ensure robust performance and security. HPE complements this with ProLiant DX servers designed for AI workloads, including their “GPU in a Box” model, which simplifies deployment and scales with demand, making it easier for businesses to meet the demands of AI-driven applications. Together, these technologies provide enterprises with the computational power and flexibility to unlock AI’s potential within hybrid cloud environments.


Simplifying Modernization Across Infrastructure

Modernization is no longer optional; it is a necessity in an evolving IT landscape. Businesses face the dual challenge of balancing legacy infrastructure needs with the demands of the future. HPE, AMD and Nutanix simplify this transition by addressing performance, security, management and integration, ensuring organizations modernize effectively while maintaining operational continuity.

Performance

Nutanix software on AMD EPYC-powered HPE ProLiant DX servers handles workloads like virtualization, analytics, big data and AI/ML with exceptional performance. The 4th Gen EPYC CPUs deliver high per-core and per-server performance, reducing infrastructure costs. High-frequency CPU options enable the provisioning of more virtual machines and workloads without increasing physical core counts, ensuring businesses can scale seamlessly as demands evolve. HPE delivers two high-performance NVMe storage options designed to boost data center performance while ensuring reliability and security. HPE NVMe Mixed Use (MU) SSDs use Peripheral Component Interconnect Express (PCIe) Gen4 to boost performance for big data, high-performance computing (HPC) and virtualization with fast transfers and low latency. HPE NVMe Read Intensive (RI) SSDs optimize read-heavy workloads like web servers, storage and caching with high-speed PCIe Gen3 and Gen4.

Security

Nutanix integrates features like automatic auditing, encryption and network microsegmentation to ensure compliance and safeguard IT environments. AMD EPYC processors add another layer of protection with SEV, isolating virtual machines with memory encryption for silicon-level protection. HPE’s Silicon Root of Trust protects firmware from the boot process and continuously monitors the Basic Input/Output System (BIOS), ensuring server integrity and preventing breaches.

Management

Managing modern IT environments is simplified with Nutanix’s one-click updates and lifecycle management capabilities, which integrate seamlessly with HPE’s Service Pack for ProLiant. Nutanix Prism offers a unified management plane, enabling centralized control for clusters, applications and data. The intuitive management interface reduces complexity, empowering IT teams to handle hybrid cloud environments with ease and efficiency.

Integration

Pre-installed with Nutanix Acropolis OS (AOS), HPE ProLiant DX servers offer out-of-the-box solutions optimized for AMD EPYC processors. These systems support diverse hypervisors, including Nutanix Acropolis Hypervisor (AHV) and third-party options, giving businesses the flexibility to tailor infrastructure setups to specific needs. This collaboration ensures workload-specific performance and seamless integration across various deployment environments, helping businesses modernize without disruption.


HPE, AMD and Nutanix demonstrate the power of collaboration by offering a unified approach to modernization. By combining high performance, robust security, streamlined management and flexible integration, their solutions provide businesses with the tools they need to meet today’s challenges and prepare for tomorrow’s demands. Collectively, they simplify the journey to modernization, proving that they truly are better together.


Discover how HPE, AMD and Nutanix are better together in delivering powerful, secure and scalable solutions for modern enterprises. Watch our webinar, “Modernize Your Infrastructure with HPE & Nutanix – Powered by AMD,” to explore cutting-edge innovations and actionable strategies that transform IT environments.


Carahsoft Technology Corp. is The Trusted Government IT Solutions Provider, supporting Public Sector organizations across Federal, State and Local Government agencies and Education and Healthcare markets. As the Master Government Aggregator for our vendor partners, including HPE, AMD and Nutanix, we deliver solutions for Geospatial, Cybersecurity, MultiCloud, DevSecOps, Artificial Intelligence, Customer Experience and Engagement, Open Source and more. Working with resellers, systems integrators and consultants, our sales and marketing teams provide industry leading IT products, services and training through hundreds of contract vehicles. Explore the Carahsoft Blog to learn more about the latest trends in Government technology markets and solutions, as well as Carahsoft’s ecosystem of partner thought-leaders.

Democratizing AI: How Pre-trained Models Plus RAG Can Empower State and Local Agencies

Smaller state agencies need out-of-the-box options that solve immediate needs without a lot of funding or skilled machine learning expertise. Combining RAG with pre-trained LLMs and the agency’s own data accelerates development of AI capabilities and speeds time to value.

In my role at HPE over the last two years, I’ve had meetings with government agencies, defense departments, and research institutions around the world about AI. We’ve discussed everything from how to identify the right use cases for AI, to ethical concerns to getting a handle on the wild, wild west of AI projects across their organizations.

Some of these larger public sector organizations and government agencies have received funding from sources like the U.S. National Science Foundation, U.S. Defense Advanced Research Projects Agency (DARPA), the European Commission’s EuroHPC Joint Undertaking (EuroHPC JU), or the European Defense Fund, which has allowed them to develop AI centers of excellence and build end-to-end AI solutions. They have far-reaching goals — goals such as building the first large language model (LLM) for their native language, becoming the first sovereign, stable, secure AI service provider in their region, building the world’s most sustainable AI supercomputer, or becoming the world leader for trustworthy and responsible AI.


But it takes a lot of resources to train an AI model. The infrastructure needed to train a foundational model may include thousands of GPU-accelerated nodes in high performance clusters. Data scientists and machine learning (ML) engineers are also needed to source and prepare datasets, execute training, and manage deployment.

That’s why many agencies are looking for out-of-the-box options that bring rapid capabilities for solving immediate challenges. Many of these are state and local agencies and higher education institutions. They don’t have the same level of requirements, funding, or expertise to build their own LLMs.

So does that mean the door to powerful AI models is closed on smaller state and local agencies?

No — not if you can gain an understanding of the available pre-trained models that can generate value with AI immediately. There is so much that can be accomplished without ever training a model yourself.

Inference is AI in Action

What exactly is inference? It’s the use of a previously trained AI model such as an LLM to make predictions or decisions based on new, previously unseen data.

Sound complicated? It’s just a fancy way of saying that you’re using an existing model to generate outputs.

In contrast with model training, which involves learning from a dataset to create the model, inference is using that model in a real-world application. Inferencing with pre-trained models reduces both funding requirements as well as the amount of expertise needed to deploy and monitor these models in production.
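The training-versus-inference distinction can be made concrete with a toy example. This is an illustrative sketch only: the hard-coded weights below stand in for parameters someone else already learned during training, and real deployments would load a genuine pre-trained model from a hub such as Hugging Face or NVIDIA NGC rather than a hand-written dictionary.

```python
import math

# A toy "pre-trained" sentiment model: these weights stand in for
# parameters learned during someone else's (expensive) training run.
PRETRAINED_WEIGHTS = {"great": 2.0, "helpful": 1.5, "slow": -1.8, "broken": -2.5}
BIAS = 0.1

def predict_sentiment(text: str) -> str:
    """Inference: apply the existing model to new, previously unseen text."""
    score = BIAS + sum(PRETRAINED_WEIGHTS.get(w, 0.0) for w in text.lower().split())
    probability = 1.0 / (1.0 + math.exp(-score))  # logistic squash to [0, 1]
    return "positive" if probability >= 0.5 else "negative"

# No training loop, no dataset, no GPU cluster -- just using the model.
print(predict_sentiment("the new portal is great and helpful"))  # positive
print(predict_sentiment("the old system felt slow and broken"))  # negative
```

The point of the sketch is the shape of the workflow: all of the cost lives in producing the weights, while inference is a cheap function call on new inputs.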

The pre-trained model market has been steadily growing, as has the number of cloud, SaaS and open source inference options available. OpenAI’s GPT-4o, Anthropic’s Claude, Google’s Gemini and Mistral AI’s models are among the most popular LLMs used for text and image generation. They’re just some of the thousands of models available through libraries like NVIDIA NGC and Hugging Face.

And just last month in Las Vegas, HPE announced its new NVIDIA AI Computing by HPE portfolio of co-developed solutions. These solutions include HPE’s Machine Learning Inference Software (MLIS), which makes it easy to deploy pre-trained models anywhere, including inside your firewall.

Pre-trained Models with Your Data

The advantages of running a pre-trained model with the right platform seem pretty clear — you get the capabilities without the costs of training. However, it’s important to note that a pre-trained LLM excels in general language understanding and generation but is trained on data other than your own. This is great for use cases where broad knowledge is sufficient and the ability to generate coherent, contextually appropriate text is essential.

So what do you do if you need to generate more specific and up-to-date outputs? There is another machine learning (ML) technique called retrieval augmented generation (RAG), which combines a pre-trained LLM with an additional data source (such as your own knowledge base). RAG pairs LLM capabilities with a real-time search or retrieval of relevant documents from your source. The resulting system behaves much like an LLM trained on your own data, with answers grounded in your actual documents for greater accuracy. RAG is particularly useful for tasks requiring specific domain knowledge or recent data.
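The RAG flow described above can be sketched in a few lines. This is a minimal, illustrative example: the tiny knowledge base and keyword-overlap retrieval are stand-ins (production systems typically use embedding-based vector search), and the final `call_llm` step is a hypothetical placeholder for whatever model API an agency chooses.

```python
import re

# A stand-in for an agency's own knowledge base.
KNOWLEDGE_BASE = [
    "Permit applications must include a site plan and a $50 filing fee.",
    "Business licenses are renewed annually before March 31.",
    "Parking fines can be appealed online within 30 days of issuance.",
]

def words(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(question: str, k: int = 1) -> list[str]:
    """Rank knowledge-base documents by word overlap with the question."""
    q = words(question)
    ranked = sorted(KNOWLEDGE_BASE, key=lambda doc: len(q & words(doc)), reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    """Augment the user's question with the retrieved agency documents."""
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("What is the fee for permit applications?")
print(prompt)  # the prompt now carries agency data the LLM never saw in training
# answer = call_llm(prompt)  # hypothetical: send to any pre-trained LLM
```

The pre-trained LLM stays frozen; only the prompt changes, which is why RAG needs no training budget and keeps outputs current as the knowledge base is updated.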

Improving Outcomes for State Agencies

Getting started with AI models begins with understanding which problem you want to solve and whether it is most efficiently and effectively solved with AI. Here are some ways different kinds of organizations can leverage pre-trained LLMs:

Law enforcement agencies can use pre-trained models for incident reporting and documentation, to analyze crime data for predictive policing, or to analyze audio and video transcription for evidence management. They can improve community engagement through sentiment analysis and reduce administrative burdens through automated report generation.

Conversational AI can also make many types of citizen services more efficient and user-friendly — from permit applications to public query engines for local government agencies. And LLMs can automate document processing, reducing manual tasks for government workers and improving speed and accessibility of services to citizens.

LLMs can enhance the education experience for students and reduce the burden on teachers. AI-powered virtual assistants can provide tutoring and study support to students outside of school hours and assist researchers in conducting literature reviews by summarizing academic papers or extracting information.

As you consider leveraging pre-trained LLMs, think about the unique problems your agency or institution faces and how this approach could quickly solve those challenges without the need for extensive expertise or the burden and cost of training a model from scratch.

Final Thoughts

As the world and society evolve, the relationships between citizens and their governments, and between students and their teachers, will evolve too. In fact, they already are. Taking advantage of pre-trained models to solve long-standing automation issues or cumbersome documentation processes can give your organization the catalyst it needs to modernize to meet these new dynamics.

AI is being democratized by a growing number of pre-trained LLMs that are available off the shelf. And you don’t need to have complex data science skills to leverage them, just the right tools.

Part of my job is to understand the challenges and goals of public sector organizations of all sizes when it comes to AI, and one thing is clear: the door to AI is open for state and local agencies, regardless of size or sophistication.

To learn more about HPE Private Cloud AI, visit the Private Cloud AI solutions overview page and contact the HPE team for questions and comments.

This post originally appeared on HPE.com and is re-published with permission.

Better Cloud with Nutanix and HPE

Today, almost everything online is conducted and saved through the cloud. Government agencies face the obstacle of modernizing their software infrastructure and navigating cloud-based solutions to achieve mandates. That’s why Nutanix, an American cloud computing company that unites public cloud simplicity and agility with private cloud performance and security, has taken up the mission to radically simplify and secure how organizations across all industries and sectors run apps and manage data. Through its partnership with Hewlett Packard Enterprise (HPE), Nutanix delivers a private cloud platform that unifies storage, provides database and desktop services, supplies hybrid cloud infrastructure and offers cloud management, with the goal of supporting any application and workload. All these capabilities are combined into one secure, easy-to-use product: Nutanix Cloud Platform.

One Unified Cloud

Nutanix pioneers the cloud market with an adaptable, highly scalable user interface. With its built-in intelligence, Nutanix Cloud Platform can manage apps and data to maximize efficiency and performance. Its features are robust and resilient: the software replicates data in small slices so it can recover efficiently from outages and withstand cybersecurity attacks.


HPE and Nutanix’s global partnership brings customers more options. Unlike other cloud platforms, which have predetermined settings, Nutanix Cloud Platform grants users additional flexibility to adapt the cloud to their needs. Users can customize their clouds, apps and technology stacks with rapid time-to-value. The platform offers the largest breadth of supported platforms of any cloud, the ability to run ESXi and AHV, and the freedom to scale up or down. Nutanix Cloud Platform includes hybrid cloud infrastructure, a unified control plane, unified APIs, a secured base, a built-in hypervisor and built-in lifecycle management.

Nutanix enables every industry to meet its goals. Fourteen different platforms are certified on HPE, giving users the option to choose which solution they use. Over the last 24 months, Nutanix has maintained a Net Promoter Score of 91, well above the typical average of 45, reflecting its satisfied customer base.

Secure with Nutanix

As a leading provider of enterprise cloud software, Nutanix must deliver a product that is not only beneficial but secure. Since multiple Federal, military and intelligence agencies use Nutanix, and many parts of Government standardize on it, its cybersecurity is a matter of national security. Nutanix provides several vital security features, including:

  • Factory security hardening and baseline
  • Automated configuration validation and self-healing
  • Data-at-rest encryption
  • Localized encryption key management built into the system
  • Network segmentation and micro segmentation
  • Multi-factor authentication, role-based access and security assertion markup language
  • Data protection, including snapshotting and multi-site capabilities, synchronized replication and constant availability
  • Back-end security monitoring that inspects the network and investigates violations to ensure continuous compliance with company scanning tools
  • Built-in encryption and cluster lockdown capabilities that ensure data cannot be accessed by outside actors

In addition, at the request of the Government, Nutanix added Kernel-based Virtual Machine (KVM) support, which makes the software substantially easier to use. The cloud platform’s certified solutions and joint engineering encourage users to acquire and expand their capabilities. By automating the process, Nutanix Cloud Platform promotes sustainable lifecycle management.

Nutanix’s cloud is always improving. Manufacturers share testing notes to arrive at the most accurate assessment of the product. There is a dedicated support group for Nutanix and HPE customers that can help users with any issues that arise. Through consistent updates and a shift from capacity-based to processor-based licensing, these cloud providers ensure the product is user friendly and easy to bundle with other products.

Better Together

With Nutanix and HPE’s partnership, the cloud has been revitalized as a user-friendly, unified platform that keeps industries secure and provides a streamlined platform for all workloads and data. With Nutanix Cloud Platform, customers can minimize cost and risk while maximizing performance, all with one product.

View our webinar and dive deeper into the benefits of Nutanix Cloud Platform from Nutanix and HPE’s partnership.

The Best of What’s New in Cybersecurity

In November 2021, federal lawmakers approved dedicated funding for state and local government cybersecurity efforts. The new State and Local Cybersecurity Grant Program — included in the massive Infrastructure Investment and Jobs Act — provides $1 billion for cybersecurity improvements over four years. Then, in March of this year, President Biden signed into law the Cyber Incident Reporting for Critical Infrastructure Act of 2022 as part of the Consolidated Appropriations Act of 2022. Taken together, these laws point toward significant changes in the nation’s historically decentralized approach to cybersecurity. New cybersecurity legislation is being driven by a threat environment that seemingly grows more menacing by the day. It’s likely that state and local agencies will receive additional federal cybersecurity support going forward, along with greater federal oversight. Learn how your agency or municipality can take full advantage of the increased funding to protect against increasing challenges in Carahsoft’s Innovation in Government® report.

 

Navigating Security in a Fast-Changing Environment

“Threat actors are constantly devising new attacks and methodologies, so organizations must stay on top of trends and constantly evolve how they build and secure their software supply chain. It isn’t a ‘set it once and you’re good’ kind of thing. President Biden’s executive order on improving the nation’s cybersecurity and some bills going through Congress will help address some of the issues. Among many things, the executive order mandates service providers disclose security incidents or attacks. It’s also important to establish a community where security professionals across the nation can exchange security and threat information. You don’t want to solve these things in a vacuum. We’re stronger as a community than as individual organizations.”

Read more insights from SolarWinds’ Group Vice President of Product, Brandon Shopp.

 

User Identities in a Zero-Trust World

“State and local governments — which have become top targets of phishing, data breaches and ransomware attacks — must be able to prevent cybercriminals from accessing all endpoints, including those associated with a distributed workforce. Prior to the pandemic, employees primarily accessed databases, applications and constituent data from within the secured network perimeter of an office. Now users are connecting from their home networks or unknown networks — even cafes — that don’t have the security protections that exist within a physical office. That heightens the need for Zero Trust, which has ‘never trust, always verify’ as a main tenet.”

Read more insights from Keeper Security’s Director of Public Sector Marketing, Hanna Wong.

 

Secure Collaboration for the Work-from-Anywhere Future

“The first step is to look at your content governance model. What does that content life cycle look like from ingestion or creation to consumption and archive? Compliance must be part of that entire process. Then, it comes down to your platform and tools. Are you selecting a platform like Box, where your entire content repository is unified and ensures compliance from the point of entry to the point of disposition — all while offering a seamless user experience? Or are you signing up for a disparate and disconnected strategy where you are now responsible for tracking and making sure that different data sources are compliant? Content fragmentation, even in the cloud, can introduce unnecessary exposure and a compliance risk.”

Read more insights from Box’s Managing Director for State and Local Government, Murtaza Masood.

 

What High-Performing Security Organizations Do Differently

“State and local governments are still trying to get a handle on remote access. At the beginning of COVID, most agencies didn’t have a 1:1 ratio of devices to send home with people, so they were forced overnight into a bring-your-own-device support model and virtual desktop infrastructure (VDI) implementation. In many cases, the VDI implementation wasn’t very secure, nor was it optimal. Now agencies are asking how secure their setup is, and they have to go backward to address that, which can cause some real challenges.”

Read more insights from HPE’s Master Technologist in the Company’s Office of North America CTO, Joe Vidal, and Server Security and Management Solutions Business Manager, Allen Whipple.

 

Download the full Innovation in Government® report for more insights from these cybersecurity thought leaders and additional industry research from GovTech.

Safe & Sound Schools: Cybersecurity in K-12

A year ago, IT professionals in K-12 school systems became heroes to their communities when their skills and resourcefulness turned on remote learning for nearly all. But while IT teams were enabling teaching and learning to continue uninterrupted in spite of everything else going on in the world, they were also seeing their systems beset by relentless attacks. More school districts than ever have been victimized by ransomware, data breaches, and other forms of digital malfeasance. While there’s no way to guarantee your schools will avoid all cyber incidents, the preemptive moves you take will make digital and online activities ever safer for your district users. Learn how your institution can adapt to this new environment in Carahsoft’s Innovation in Education report.

 

Closing in on Cybersecurity Stability

“Traditionally, for good reasons, the conversation in K-12 has been focused on education. The priority for spending has been steered toward academics — getting more support and training for teachers and trying to control the classroom size, for example. Technology, and especially cybersecurity, was a scheduled expense, up there with predictable plumbing problems and textbook replacement, but contained within the IT organization. However, IT — and especially cybersecurity — has now become a strategic element for education. Parents, superintendents, board members and executives within administration have realized that keeping data and systems safe can have a district-wide impact. Experience a data breach or a ransomware event and you’ll suffer damages that strike your budget as well as your reputation: Families will leave your schools to go to the district next door that didn’t have a break-in. That means it has become something that should be part of all decision-making.”

Read more insights from Palo Alto Networks’ Cybersecurity Strategist, Fadi Fadhil.

 

Getting Away from the Ransomware Triple Threat

“Even though it’s now a simple matter to go online and learn how to launch a cyber-attack and buy the tools to do so for just a few dollars, ransomware has become a more complicated process, involving triple extortion. Originally, the idea was that the bad guys would get into your computer system, encrypt your data and tell you that in order to get the data back, you’d have to pay x bitcoins. That was pretty direct; you either paid the money and hoped they’d give you your data or you had backups, because a good backup policy would prevent an attack from imposing any lasting damage. So the criminals revised their approach. They turned around and said, ‘OK, we’ve encrypted your data. Pay this amount to get it back. And by the way, we also stole your data. If you want to prevent this data from being made public, you will pay the same amount of ransom, and this is the deadline.’”

Read more insights from HPE’s Distinguished Technologist in Cybersecurity, James Morrison.

 

The Essential Cybersecurity Service You’ve Never Heard Of

“The cybersecurity threat to K-12 educational institutions has been consistently growing since 2018. Unfortunately, for many schools, efforts to protect against cyber-attacks have not seen similar growth. K-12 public schools became the number one target for ransomware attacks across all public sectors in 2020. Meanwhile, less than a quarter of school districts have anyone dedicated to network security, according to the latest CoSN leadership report. And even institutions with dedicated network security staff may struggle with a lack of funding to dedicate to cybersecurity measures. This poses a challenge for schools that cannot build cybersecurity defenses that match the sophistication of the malicious actors intent on attacking their data-rich networks. Fortunately, cybersecurity help is available, and at no cost. Recognizing that schools, along with other state, local, tribal and territorial government agencies, rarely have the resources they need for cybersecurity, the Center for Internet Security, an international nonprofit, offers essential cybersecurity services through the Multi-State Information Sharing & Analysis Center (MS-ISAC).”

Read more insights from the Center for Internet Security’s (CIS) Senior VP of Operations and Security Services, Josh Moulin.

 

Greatness Awaits: Dump the Paperwork

“Envision this scenario: Requests for payment are sent in via online interface or digitized en masse through a designated service center. The data is vetted to make sure vendors are approved and expenses fall within the expected range or amount. The documentation is immediately tagged for the proper workflow, being approved at each level through a mobile app or computer application. Approvers can be added or removed from the workflow list as staffing or delegation needs change. Those who sit on approvals too long can be notified that the clock is running. Likewise, managers can be alerted when people on their team try to shove payments through without adequate controls or documentation in place. As a result, the right invoices are paid on time, without incurring penalties or losing out on possible rebates offered by the vendors. Any physical space dedicated to holding onto paper documentation can be dedicated to other purposes. On the expense side, schools can eliminate adult arts-and-crafts.”

Read more insights from SAP Concur’s Public Sector Senior Director, Jim McClurkin.

 

Virtual is Here to Stay, so Make It Better

“With the return to the physical classroom, you might think schools should tuck away their Zoom licenses for the next time an emergency strikes. But that would be short-sighted. Educators have seen how technology can play a role in delivering learning options for students who can’t attend in person. Now that K-12 administrators are reimagining and redesigning education, school districts would be foolish not to learn from their pandemic experiences. Their big lesson? Schools need virtual options. They need them for students who, because of physical, emotional or mental disabilities, can’t be in the classroom; who have dropped out just shy of a few credits and really want to earn that diploma; who are working to support their families; who are taking care of younger siblings; or who want to participate in dual enrollment and can’t get the unique classes they need through their own schools.”

Read more insights from Class Technologies’ VP of K-12 Strategy, Elfreda Massie.

 

Start with the End(point) in Mind

“While the concept of zero trust serves as a useful framework for understanding the goal of posting a guard at every entry and maintaining clear lines of authorization and authentication, getting it done is another matter. Somebody has to do the work of implementing endpoint management and security. Consider the challenge of mobile endpoint patching. IT churns through cycles continuously applying long lists of patches, mitigating risks for which there may be no exploit and that may not be in line for attack. According to a recent Ivanti report, “Patch Management Challenges,” 71% of IT and security professionals find patching to be overly complex and time-consuming. And the patching efforts may only address district-owned devices along with the small share of end users with their own devices who are willing to go through the patch process. What about everybody and everything else? The key is knowing what patches are crucial and being able to prioritize patch decisions that are going to provide the greatest security. The patch management approach needs to apply threat intelligence and risk assessment. Then it needs to be enabled on all devices — district-owned or not — without the process relying on interaction from users.”

Read more insights from Ivanti’s Public Sector CTO, Bill Harrod.
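The prioritization approach Harrod describes — weighting patches by threat intelligence and real-world exposure rather than applying them in release order — can be sketched roughly as follows. This is a minimal illustration, not Ivanti's actual scoring model; all field names and weights are assumptions.

```python
# Hypothetical sketch of risk-based patch prioritization: rank patches by
# severity, active exploitation (threat intelligence), and how many
# endpoints are exposed. Fields and weights are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Patch:
    cve_id: str
    cvss_score: float         # base severity, 0-10
    actively_exploited: bool  # from a threat-intelligence feed
    exposed_devices: int      # affected endpoints in the fleet

def risk_score(p: Patch) -> float:
    """Weight severity by real-world exploitation and exposure."""
    exploit_multiplier = 2.0 if p.actively_exploited else 1.0
    return p.cvss_score * exploit_multiplier * p.exposed_devices

def prioritize(patches: list[Patch]) -> list[Patch]:
    """Highest-risk patches first, so limited IT cycles go where they matter."""
    return sorted(patches, key=risk_score, reverse=True)

patches = [
    Patch("CVE-A", cvss_score=9.8, actively_exploited=False, exposed_devices=50),
    Patch("CVE-B", cvss_score=7.5, actively_exploited=True, exposed_devices=400),
]
print([p.cve_id for p in prioritize(patches)])  # ['CVE-B', 'CVE-A']
```

Note how the actively exploited, widely deployed CVE-B outranks the higher-severity but unexploited CVE-A — the point being that raw severity alone is a poor queue order.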

 

How to Tame the Cloud with One Call

“K-12 professionals are continually trying to keep their heads above water. They’re drowning in paperwork, processes, regulations and general bureaucracy. And they just need relief. If you’ve got 100 different contracts, every time you touch those contracts to manage them, support them, make amendments, check that they meet state and federal compliance guidelines, and more, it increases the total cost of ownership for every one of those cloud products and services. E&I helps you reduce this work, so that you can spend more time and energy in what you love to do, which is helping students learn.”

Read more insights from E&I Cooperative Services’ Vice President of Technology, Keith Fowlkes.

 

Download the full Innovation in Education report for more insights from these cybersecurity thought leaders and additional K-12 industry research from THE Journal.

Conversations With CXOs: Crash Course on the Future of Government

For government employees looking to build successful and satisfying careers in public service, the curriculum is changing. It’s not enough to develop mastery of agency processes and policies or to stockpile continuing education credits on traditional core competencies. Instead, public servants need to develop a working knowledge of current trends in IT and management that are reshaping how government operates. IT and management: That’s the operative phrase. Technology is continually improving the efficiency of work processes and the productivity of employees. But efficiency and productivity only go so far. It’s at the intersection of technology and management that real change is happening. Agencies are gaining new insights into their operations and services, and using those insights to fuel innovations across their organizations. Government employees at all levels have the opportunity to be part of this transformation, but they need to get up to speed on the key trends. Where are they to begin? Download the guide to read more about four competencies that could be critical to the careers of public servants.

 

Edge Computing Raises Ransomware Risk

“The problem is that edge computing – in which data is being aggregated, accessed or processed outside the network perimeter – is leaving data exposed to cyber criminals who see an opportunity to make money through ransomware schemes. According to Gartner, a research and consulting firm, edge computing will grow 75% by 2025. In government, the surge is being fueled both by a growth in end-user devices in mobile and remote computing and in non-traditional devices associated with the Internet of Things (IoT) and operational technology (OT), such as sensors and cameras. In many cases, agencies support edge computing by moving data into the cloud, rather than requiring end-users or devices to go through the data center. This hybrid cloud environment mitigates performance and latency problems but also makes the network perimeter even more porous.”

Read more insights from HPE’s Distinguished Technologist for Cyber Security, James M.T. Morrison.

 

Agencies Need to Maintain a Sense of Cyber Urgency

“Security isn’t just the responsibility of individuals. Agencies also must ensure they treat security as a top priority. SolarWinds recommends two areas of focus: Prioritize the development of cyber experts. Given the high demand for cyber experts, agencies should focus more energy on developing talent in house. Shopp said one approach is to convert IT professionals, who are already tech savvy, into cyber professionals. Prioritize collaboration between tech pros and leaders. Policies and strategies aimed at reducing risk should reflect both technical and organizational expertise and requirements. Shopp said agencies also should collaborate more with trusted industry partners. SolarWinds, for example, isn’t just a technology vendor; it also has a large development shop, as many government agencies do, and can exchange ideas about cyber strategies, tools, and best practices.”

Read more insights from SolarWinds’ Group Vice President of Product Management, Brandon Shopp.

 

How to Move DevOps from Disarray to Unity

“An agency’s initial forays into integrating their development and operations teams can bear fruit quickly, leading to better quality software produced at a faster clip. The risk is that an organization will treat its initial forays as the endgame, not realizing that a more mature approach, with greater payoffs, is possible. In short, the DevOps initiatives never grow up. GitLab, which has years of experience helping organizations with DevOps adoption, has identified four stages in a DevOps journey, culminating in an approach that delivers even greater benefits than envisioned at the outset.”

Read more insights from GitLab’s Federal Solutions Architect, Sameer Kamani, and Senior Public Sector Solutions Architect, Daniel Marquard.

 

Why Stronger Security Hinges on Identity Data

“To understand the need for an Intelligent Identity Data Platform, consider two scenarios. In the first case, a user logs into an application from her office at 2 p.m. each day. In this case, she will be considered a low risk, based on three factors: Her credentials, her usage patterns and location data. In the second scenario, this same user logs into the application from her office but at 2 a.m. The aberration in her routine (i.e., usage pattern) raises a red flag, as would a change in her location. Even this simple use case requires an agency to have a holistic picture of an end-user, which is not possible without a central platform.”

Read more insights from Radiant Logic’s Vice President of Solutions Architects and Senior Technical Evangelist, Wade Ellery.
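The two login scenarios Ellery describes amount to a simple risk calculation over credentials, usage patterns and location. A minimal sketch of that logic follows — the thresholds, fields and risk tiers are illustrative assumptions, not Radiant Logic's platform API.

```python
# Hypothetical sketch of login-risk scoring: compare a login event against
# the user's typical hours and location, as in the 2 p.m. vs. 2 a.m. example.

from dataclasses import dataclass

@dataclass
class LoginEvent:
    user: str
    hour: int        # 0-23, local time
    location: str

@dataclass
class UserProfile:
    typical_hours: range     # e.g. office hours
    typical_location: str

def risk_level(event: LoginEvent, profile: UserProfile) -> str:
    """Raise a flag for each deviation from the user's established pattern."""
    flags = 0
    if event.hour not in profile.typical_hours:
        flags += 1  # 2 a.m. login when 2 p.m. is the norm
    if event.location != profile.typical_location:
        flags += 1  # unfamiliar location
    return ["low", "medium", "high"][flags]

profile = UserProfile(typical_hours=range(9, 18), typical_location="office")
print(risk_level(LoginEvent("alice", hour=14, location="office"), profile))  # low
print(risk_level(LoginEvent("alice", hour=2, location="office"), profile))   # medium
```

Even this toy version shows why a central identity data platform matters: the function can only run if usage history and location data for the user are available in one place.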

 

The Case for Data Literacy

“Someone who works in national defense requires different data skills from those in environmental or financial management auditing. ‘We firmly believe it’s not a one-size-fits-all approach,’ Ariga said. Training must be catered to tradecraft. It’s the reason GAO is creating its own data literacy curriculum specific to the oversight community, instead of relying on third-party training that focuses on generic, often commercial aims. Additionally, the best time for people to learn data skills is when they actually need them. On-demand tools such as microlearning videos and a walk-in Genius Bar ensure staff can access data solutions and build literacy when they need, instead of waiting months to register for a class.”

Read more insights from the Government Accountability Office’s Chief Data Scientist and Director of the Innovation Lab, Taka Ariga.

 

The Future of AI Hangs on Ethics, Trust

“Over the next five years or so, we could see a revolution in the use of AI, Sivagnanam said. Think about the self-driving car industry. At this point, human drivers are still a necessary part of the equation. But AI pioneers are hard at work trying to change that, and quickly. Similar advances are likely in other applications of AI. Over the next three to five years, Sivagnanam hopes to see the AI industry mature. As part of that, he expects to see the development of regulations and guidelines around AI and ethics, both from the federal government and from industry organizations. That work is already getting underway, and NSF is playing a role. Through a grants program called Fairness in Artificial Intelligence (FAI), NSF supports researchers working on ethical challenges in AI.”

Read more insights from the U.S. National Science Foundation’s Chief Architect, Chezian Sivagnanam.

 

Q&A: Getting Schooled on Zero Trust Security

“Zero trust means zero trust. We’re monitoring your internal systems. To an extent, we are monitoring what individuals are doing. That’s not to say we’re Big Brother. We’re not monitoring the keystrokes of every user in the state or anything like that. For the agencies, multi-factor [authentication] is a huge one. We’ve seen time and time again accounts get compromised because they had a bad username and password. If that’s the only thing protecting a system, that’s not enough. The bottom line is we know people create bad passwords. That’s a given. You can increase awareness about how to create good passwords, and you certainly want to try that. In many cases, people will just figure out ways around complexity requirements to get an easy-to-remember password versus a secure and strong password. You want to encourage people to have unique passwords for every single site. At some point, you need to give them a secure method of being able to remember all these passwords.”

Read more insights from Connecticut’s CISO, Jeff Brown.

 

3 Tenets for Advancing Equity in Your Everyday Work

“If there were one thing you could do to eliminate health disparities or advance health equity, what would it be? This is a question that Dr. Leandris Liburd gets asked often, but it’s not one she’s fond of. The answer isn’t a simple one, and the COVID-19 pandemic has magnified that truth. There isn’t a magic pill to ensure that no one is denied the possibility of being healthy because they belong to a group that has been economically or socially disadvantaged. And measuring success is about more than data points. Choosing one thing to advance health equity ‘is not possible when you’re dealing with these kinds of complexities,’ Liburd said in an interview with GovLoop. ‘So we have to do a lot of things at the same time.’”

Read more insights from the CDC’s Director of the Office of Minority Health and Health Equity, Dr. Leandris Liburd.

 

Download the full GovLoop Guide for more insights from chief information officers, a chief data scientist and other senior leaders in federal, state and local government.

 

The State of Artificial Intelligence in Government

Government agencies have been discussing artificial intelligence (AI) for more than a decade, and as technology and legislation progress, the focus on public sector impacts is stronger than ever. A 2019 executive order highlights American leadership in AI as key to maintaining the economic and national security of the United States. The Trump administration has also issued regulatory guidance on AI, instructing all federal agencies to prioritize and allocate funding for AI programs that serve their individual missions. Numerous national agencies and even multinational partnerships have identified AI as a priority. AI’s similarity to human intelligence means it could potentially impact every corner of society, from cybersecurity to medicine. To learn more about how your agency can use AI to analyze data, recognize patterns and automate manual tasks, get up to date with The State of AI in Government, a guide created by GovLoop and Carahsoft featuring insights from the following technology and government AI thought leaders.

 

AI Requires a New Approach to High-Performance Computing

“High-performance computing (HPC) needs to evolve. The traditional HPC architecture, now decades old, worked well for previous generations of HPC applications. But today’s applications, driven by AI, require a new approach. The problem? The old systems were too static. That wasn’t a problem when applications had static performance requirements. But AI is different. When developing an AI system, the workload changes from one stage of the process to another.”

Read more insights from Liqid’s Public Sector Chief Technology Officer, Matt Demas, and Director of Sales, Eric Oberhofer.

 

Bring AI to the Edge

“Legacy computing structures always glued data scientists to data centers. The two were tethered together, meaning scientists couldn’t work where the data didn’t reside, much like how a lab scientist needs their lab chemicals and instruments. Data science, however, is not entirely like lab science, because endless inputs come outside of a controlled environment. AI models are most effective when exposed to open air. The solution is to bring software-based applications to the edge, except for massive data projects.”

Read more insights from HPE’s Defense Department Account Team Technologist, Jeff Winterich, and Red Hat’s Public Sector Staff Solutions Architect, Ryan Kraus.

 

3 Ways Cloud Improves AI

“Cloud-based AI can help agencies move faster. During the pandemic, it has. One example is automating document workflows so that AI replaces manual data entry and extracts metadata to enhance search capabilities. As a result, AI speeds up timelines for constituents. Without having to wait on employees to manually enter data or respond to simple queries, citizens receive the front-facing information and services they need faster. Agencies can build AI faster in the cloud, too. Developers access capabilities through simple application programming channels, so they don’t have to build or integrate models from scratch. Cloud services like Amazon SageMaker remove the busywork and infrastructure so that data science teams are more productive and efficient when rolling out [machine learning].”

Read more insights from AWS’s Tech Business Development Manager of AI and ML for the Worldwide Public Sector, Joe Pringle.

 

How AI Demands a New Vision of the Data Center

“Technology originally developed to improve PC-based gaming and multimedia applications nearly 30 years ago is now driving advances in the field of artificial intelligence. In the early 1990s, when PC gaming was beginning to take off, the Graphics Processing Unit (GPU) was invented by NVIDIA to render an image by breaking it up into multiple tasks that could be executed in parallel. Today, the same approach accelerates processing for a wide range of applications, not just on PCs but also on the world’s fastest computers.”

Read more insights from NVIDIA’s Vice President of GPU Data Center Architecture, Curt Smith.

 

DoD’s Battle Against COVID-19, With AI at the Helm

“When you’re talking about a domestic threat like COVID-19, for us to, for instance, predict how COVID-19 is going to be affecting a certain military installation, you might need data from things that would be nontraditional DoD data. So, you might need data from CDC, [or] from Department of Labor when it comes to unemployment. So, these sorts of datasets I think are really hard for the DoD to have, because they’re not traditional military data. But at the same time, for us to do accurate modeling, we do need datasets like that. So, this project had a lot more sort of rigorous policy review for data, more so than a project like predictive maintenance, for instance.”

Read more insights from Chief of Policy at the Department of Defense’s Joint Artificial Intelligence Center, Sunmin Kim.

 

Using AI to Improve Veteran Care and Save Lives

“It’s been an amazing journey from a veterans’ experience perspective. The Veterans Experience Office came out of the crisis of Phoenix, when there were the issues with the lists of appointments and veterans were not getting timely appointments – and the data was showing things differently. We did not have the customer datasets. We had a lot of operational data, we had a lot of financial data, but we did not have necessarily the data for [customers]. And I think that from the customer perspective, I think that’s a key aspect with AI. You can’t have AI if you don’t have the right data in place … and that’s something the VA has been very diligently working on.”

Read more insights from the Department of Veterans Affairs’ Chief of Staff at the time of the interview, Lee Becker; Director of Enterprise Measurement, Anil Tilbe; and Acting Executive Director of Multichannel Technologies, Laura Prietula.

 

Improving Public Health Through AI

“Traditionally, public health plays the role of a data aggregator. We’re collecting large volumes of information because we’re interested in understanding how often illnesses or injuries occur, not just at an individual level, but across entire communities or entire populations as a country at large. And we use that information to try to understand why those diseases or injuries occur, and then we use that to take action that will allow us to address really significant threats to the public health at their source. AI can play a role at many different places in that information chain.”

Read more insights from the Centers for Disease Control and Prevention’s Entrepreneur in Residence, Paula Braun.

 

Download the full GovLoop Guide for more insights from these artificial intelligence thought leaders and additional interviews, historical perspectives and industry research on the future of AI.