Quantum Computing’s Latest Breakthrough: Why Government Encryption Standards Face a New, Unexpected Threat

Last week, international researchers made headlines by successfully factoring a 50-bit RSA integer using D-Wave’s Advantage quantum computer. While a 50-bit key is vastly smaller than the 2048-bit keys used in modern RSA encryption, the significance of this achievement lies in how it was done. Unlike traditional attacks, which rely on Shor’s algorithm running on gate-based quantum computers, the researchers used a quantum annealing system designed for optimization rather than direct factoring. This shift in approach raises important questions about the timeline for when quantum computers could crack full-scale RSA encryption, potentially bringing the threat to current cryptographic standards far sooner than expected.


For years, the vulnerability of public key encryption has been understood primarily as a factoring problem, since the security of encryption algorithms like RSA relies on the difficulty of factoring large composite numbers. Shor’s algorithm, widely regarded as the most probable path to breaking public key encryption, is designed specifically to factor these numbers exponentially faster than classical methods, posing a significant future threat to encryption systems. However, in a surprising turn, the international researchers in this recent attack used a quantum annealing computer, which is designed for optimization tasks, not factoring. This innovative approach represents a completely different method of breaking RSA encryption, highlighting that the threat from quantum computing may emerge from unexpected directions, advancing the risk timeline beyond what many experts anticipated.

This breakthrough also underscores the growing versatility of quantum annealing in solving problems once thought exclusive to gate-based quantum computers. Traditionally, annealing systems have been seen as ideal for optimization problems in fields such as logistics, materials science, and machine learning, not for cryptographic attacks. However, the international researchers effectively reframed RSA decryption as an optimization challenge, unlocking new potential in quantum annealing. While quantum annealing computers like D-Wave’s systems were not originally designed for factorization tasks, this achievement raises important questions about their ability to scale to larger key sizes and tackle more complex encryption algorithms. If quantum annealing can be adapted for cryptography at higher levels, it could shorten the timeline for when quantum computers become a real-world threat to encryption standards. Though hurdles remain, this new approach widens the scope of quantum threats to cryptographic systems, showing that the race to quantum-safe encryption may need to accelerate.
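
To make the reframing concrete, the sketch below shows in miniature what it means to treat factoring as an optimization problem: the cost function reaches zero exactly when a candidate divides the target number, and an annealing-style search hunts for that minimum. This is a classical simulated-annealing stand-in offered for illustration only; the function name, cost function, and toy parameters are assumptions, and the actual research encodes candidate factors as binary variables in a QUBO/Ising model for D-Wave’s hardware rather than anything like this loop.

```python
import math
import random

def factor_by_annealing(N, steps=50_000, seed=1):
    """Toy illustration of treating factoring as an optimization problem.

    A quantum annealer would encode candidate factors p and q in binary and
    minimize a cost such as (N - p*q)^2, which is zero exactly when p*q = N.
    This classical stand-in minimizes the simpler cost N mod p with simulated
    annealing; it only works for tiny N and is not the researchers' method.
    """
    rng = random.Random(seed)
    upper = math.isqrt(N) + 1
    p = rng.randrange(3, upper, 2)            # random odd starting guess
    cost = N % p
    temperature = float(N)

    for _ in range(steps):
        if cost == 0:                         # p divides N: minimum reached
            return p, N // p
        candidate = p + rng.choice((-2, 2))   # propose a nearby odd value
        if not 3 <= candidate <= upper:
            continue
        new_cost = N % candidate
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the "temperature" cools, to escape local minima.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temperature):
            p, cost = candidate, new_cost
        temperature *= 0.9999
    return None

# 10403 = 101 * 103 -- a toy semiprime, nowhere near RSA scale.
print(factor_by_annealing(10403))
```

On a real annealer, a comparable cost function is expressed over qubits and minimized by the hardware itself; the open question the recent result raises is whether that formulation can scale beyond toy key sizes.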

In conclusion, this breakthrough in quantum annealing highlights the increasing urgency for federal agencies to prioritize their post-quantum encryption (PQE) transition. The rapid evolution of quantum computing, coupled with the potential for new cryptographic vulnerabilities, underscores the need to meet the milestones set by NSM-10 and OMB M-23-02. Agencies that have not yet initiated or fully engaged in this process risk falling behind as quantum advancements accelerate. The time to act is now: establishing cryptographic leadership, conducting comprehensive inventories, and securing appropriate resources are critical first steps. Preparing today will ensure the resilience of federal systems in a quantum-enabled future.

To learn about the latest standards set forth by NIST and how Marion Square can support your Quantum Computing and compliance initiatives, view our webinar, “Mastering NIST PQE Standards: A Guide for Federal Compliance.”

Leaders In Innovation: Identity and Access Management

Agencies have been learning the importance of identity and access management for nearly two decades, but, as with many technological evolutions, the coronavirus pandemic encouraged adoption on an entirely new scale. As remote work became the norm, agencies adapted to use technology like smart identity cards in new ways, enabling capabilities like digital signatures. These capabilities are secured by the Common Access Card (CAC) in the Department of Defense (DoD) or the Personal Identity Verification (PIV) card in the civilian environment, and all follow the principles and strategies of identity and access management.

Learn more: 8 cybersecurity experts from across the Federal government and industry discuss identity and access management in the latest Leaders in Innovation report.

Shane Barney, the Chief Information Security Officer at the U.S. Citizenship and Immigration Services in the Homeland Security Department, said as agencies move to the cloud, a new common framework focused on data around identity credentialing and access management is necessary.

“I know GSA is working toward that. I’m excited to see where we are heading with that, honestly, because we’ve been working in the identity world for quite a while now, very early on adopting some of those frameworks and trying to figure out a standard and hoping we are getting it right, and I think we’ve made good decisions, we made a couple of errors along the way and more good lessons,” he said in an executive brief sponsored by RSA and Carahsoft.

COVID-19 Has Also Highlighted Challenges

While agencies adapted to renewing or extending smart card authorizations, the pandemic made clear that other form factors must play a larger role in the months and years ahead, especially as agencies move toward a zero trust architecture.

Steve Schmalz, the Field Chief Technology Officer of the Federal Group at RSA, said agencies, like the commercial world, are starting to understand how cloud and remote workers are making the perimeter disappear.

“Zero trust is a fantastic conceptual way of dealing with that and talking about how you have to make sure to authenticate closer to the resource, or make use of attribute-based access control to determine whether or not somebody should be allowed access to a particular resource,” Schmalz said. “That process of implementing attribute-based access control looks like what you would have to do to implement a full zero trust architecture, where before individuals or processes get access to another resource, you have to check, you have to do some authentication.”
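
As a concrete illustration of the attribute-based checks Schmalz describes, the hypothetical sketch below evaluates each request against attributes of the user, the device, and the resource before granting access. The attribute names, policy rules, and thresholds are invented for illustration and are not drawn from any specific agency framework or product.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    # Illustrative attributes only; a real deployment would pull these from
    # an identity provider, a device-posture service, and the resource catalog.
    user_clearance: str          # e.g. "public-trust", "secret"
    user_agency: str
    device_compliant: bool
    mfa_verified: bool
    resource_sensitivity: str    # e.g. "low", "moderate", "high"
    resource_owner_agency: str

CLEARANCE_RANK = {"public-trust": 1, "secret": 2}
REQUIRED_CLEARANCE = {"low": 0, "moderate": 1, "high": 2}

def is_authorized(req: AccessRequest) -> bool:
    """Attribute-based policy: every check runs per request, close to the
    resource, rather than once at a network perimeter."""
    if not (req.device_compliant and req.mfa_verified):
        return False
    if req.resource_owner_agency != req.user_agency:
        return False
    # Higher-sensitivity resources require higher clearance.
    return CLEARANCE_RANK.get(req.user_clearance, 0) >= REQUIRED_CLEARANCE.get(req.resource_sensitivity, 99)

print(is_authorized(AccessRequest("secret", "DHS", True, True, "high", "DHS")))        # True
print(is_authorized(AccessRequest("public-trust", "DHS", True, True, "high", "DHS")))  # False
```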

The Future of FIDO

The changes happening, whether at DoD, the U.S. Army or across GSA’s shared services, are not going unnoticed by the National Institute of Standards and Technology (NIST). David Temoshok, the NIST Senior Policy Advisor for Applied Cybersecurity, said the standards agency is updating the Federal Information Processing Standards (FIPS) 201 document to allow for new kinds of tokens such as those from FIDO Alliance.

“As FIDO continues to mature as an organization in standardizing secure authentication processes, one of the things that they have established is a certification program for devices to both be certified for conformance to the FIDO specifications, but also to evaluate the security because FIDO tokens and the FIDO authentication processes use cryptographic keys for cryptographic authentication processes, which are very secure, very resistant to man-in-the-middle and phishing attacks,” he said. “We would be recommending their use for both external authentication processes, but also internal, where it’s convenient for agencies to use that.”
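
The property Temoshok highlights, phishing resistance through cryptographic authentication, comes down to a challenge-response over an asymmetric key pair: the private key never leaves the token, and each login signs a fresh server challenge. The sketch below illustrates only that core idea, using the third-party Python cryptography package; it is not the actual FIDO2/WebAuthn protocol, which additionally binds assertions to the relying party’s origin, tracks signature counters, and supports attestation.

```python
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Registration: the authenticator (token) creates a key pair and shares
# only the public key with the relying party (the server).
authenticator_key = Ed25519PrivateKey.generate()
registered_public_key = authenticator_key.public_key()

# Authentication: the server issues a fresh random challenge...
challenge = os.urandom(32)

# ...the token signs it with the private key that never leaves the device...
assertion = authenticator_key.sign(challenge)

# ...and the server verifies the signature against the registered public key.
# A captured assertion cannot be replayed, because the next login uses a
# different random challenge.
try:
    registered_public_key.verify(assertion, challenge)
    print("authentication succeeded")
except InvalidSignature:
    print("authentication failed")
```

Because the secret never crosses the network and the signed challenge is single-use, there is no reusable credential for a phishing site or man-in-the-middle to harvest, which is the security property NIST is weighing as it updates FIPS 201.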

Connecting the Dots with ICAM

Along with NIST’s FIPS 201 update, the Homeland Security Department has made identity the center of its continuous diagnostics and mitigation (CDM) program. Rob Carey, the vice president and general manager for global public sector solutions at RSA, said what continues to become clear throughout this discussion and use of identity, credential and access management (ICAM) is that the old “one type of approach for all” model is proving unworkable.

“We’ve used the term ‘any device, anytime, anywhere’ in DoD for probably 20 years now. Now we’re at the precipice of delivering that. As you validate and authenticate, the question is the back end: how are the systems and the business processes embracing this authorization to move forward to allow the right people to access the ERP or the financial management system,” Carey said in a panel discussion sponsored by RSA and Carahsoft. “How are we connecting those dots with this somewhat new and better framework that we’ve talked about, using role-based access and attribute-based access control?”

As agencies continue to prioritize zero trust architecture, identity and access management will only become more prevalent. Download the full Leaders in Innovation report to hear from agency leaders at USCIS, CISA, the U.S. Army, DHS, DoD, GSA and NIST on how they’re tackling the challenges and reaping the benefits of identity and access management.

Best of What’s New in Cybersecurity

For security professionals, the COVID-19 pandemic represents something of a perfect storm. The risk landscape exploded in a matter of days as state and local agencies rapidly sent thousands of employees home to work remotely. At the same time, security personnel and resources were stretched exceedingly thin, with many security teams redeployed from operational tasks to urgent new projects. Now is the time to reevaluate security tools, processes and strategies in light of these massive COVID-driven changes. Immediate steps include understanding and addressing situations where users may be storing sensitive data on insecure home computing devices, as well as dialing back remote access privileges to reduce the risk of inappropriate access or stolen user credentials. Over the longer term, agencies must develop better monitoring capabilities that help them spot threat activity and potentially risky user behaviors. Read the latest insights from industry thought leaders in cybersecurity in Carahsoft’s Innovation in Government® report.

Time to Reevaluate Security Practices

“The bottom line is that even the best tool or approach will not fix a bad process. All the zero-trust technology in the world won’t work if your identity and asset management processes give the system bad data. To fully utilize these approaches, agencies must look honestly at their processes and what they’re doing regarding hygiene, security practices and things like that. Organizations also need to determine what they want from these tools, whether the tools align with their best practices and overall security approach, and how these tools impact the way they perform existing processes.”

Read more insights from McAfee’s Chief Technology Strategist, U.S., Sumit Sehgal.

 

Building Resilience through Digital Risk Management

“Planning ahead for how you’ll address problems and putting contingency plans down on paper is an important risk management process. Organizations need good security workflows and a way to aggregate information about their networks, valuable resources and who is doing what in the organization. Then they need plans for triaging the most devastating risks first. It’s impossible to think of every threat, but organizations can start by considering what types of incidents could interfere with critical capabilities and prevent them from completing their mission. With that information, organizations can put together contingency plans, even when they’re not quite sure what potential threat might bring about that particular loss of functionality.”

Read more insights from RSA’s Federal Group Field CTO, Steve Schmalz.

 

Confronting a New Threat Ecosystem

“Understanding your organization and where it fits into the threat ecosystem is probably among the most effective ways to grapple with this issue. In a purely introspective sense, it’s important to understand your corporate network — you need to know which information assets, individuals and applications are likely to be targeted by attackers and then place a higher priority on security alerts and advisories that impact them. Organizations also can narrow the focus of their detection and threat-hunting efforts by understanding the specific attackers that are known to be interested in their industry and geography, and use this knowledge as a preliminary guide.”

Read more insights from FireEye’s Manager of Mandiant Threat Intelligence, Jeremy Kennelly.

 

Remote Work Is Here to Stay

“The secure access service edge (SASE) model lets organizations apply security no matter where their users, applications or services are located. It dictates that enterprise users need access to a variety of business resources and information. To maintain business operability and meet their missions, enterprises must figure out how to do that securely. Secure remote access — which includes secure connectivity, identity access management, access control, continuous validation of secure connectivity throughout an interaction and more — will be the mark of a functioning cybersecurity apparatus moving forward. The other component is being able to scale cybersecurity talent and resources to accommodate growth.”

Read more insights from Palo Alto Networks’ VP and Field CSO, MK Palmore.

 

Addressing Evolving Application Threats

“No matter who comes through the door, you have to verify everything about them and that verification must follow them through the system. Organizations can’t just check a user’s ID, give them a password and be done with it. It’s a continuous process of authentication. When a user attempts to move from one part of a system to another — for example, if a person applies for unemployment insurance, but they logged in through a parking application — the organization may want to require additional authentication or scrutinize the user more deeply. Access is not all or nothing. There’s a granular dial that you’re turning up and down based on what a user is doing within the system.”
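
Pompon’s “granular dial” can be pictured as a risk score that is recomputed at every transition and mapped to an authentication requirement rather than a yes/no gate. The sketch below is a hypothetical illustration of that pattern; the signals, weights, and thresholds are invented for illustration and do not come from any specific product.

```python
def risk_score(signals: set) -> int:
    """Sum illustrative risk weights for the signals present on this request."""
    weights = {
        "new_device": 30,
        "unusual_location": 25,
        "crossing_applications": 20,   # e.g. parking app -> unemployment claim
        "sensitive_resource": 40,
        "recent_mfa": -35,             # a recent second factor lowers the score
    }
    return max(0, sum(weights[s] for s in signals if s in weights))

def required_authentication(score: int) -> str:
    """Map the score onto a 'granular dial' instead of an all-or-nothing gate."""
    if score < 25:
        return "allow with existing session"
    if score < 60:
        return "re-prompt for password or PIN"
    if score < 90:
        return "require a second factor (step-up)"
    return "deny and route to identity proofing"

# A user who signed in through a low-risk parking application now opens an
# unemployment-insurance claim form on the same portal.
signals = {"crossing_applications", "sensitive_resource"}
score = risk_score(signals)
print(score, "->", required_authentication(score))   # 60 -> require a second factor (step-up)
```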

Read more insights from F5 Labs’ Director, Raymond Pompon.

 

Taking Threat Detection and Response to the Next Level

“A lot of the change comes from having to support a large remote workforce. Regular system maintenance tasks like vulnerability scanning and software patching have changed dramatically. In the past, patching technologies assumed that systems were physically on the same network or would ultimately be connected via a virtual private network. As users’ machines move off the network, they get scanned less often, if at all. Remote work and increasing reliance on SaaS have really highlighted the need for zero-trust networks, where services require not only a trusted user but also protection of the data viewed and saved from these services.”

Read more insights from SecureWorks’ Chief Threat Intelligence Officer, Barry Hensley.

 

 

Download the full Innovation in Government® report for more insights from these government cybersecurity thought leaders and additional industry research from GovTech.

The Best of What’s New in Government Performance and Innovation

The COVID-19 crisis underscores the growing importance of data analytics to state and local governments as they tackle complex challenges. It also shows how technological improvements are making data-driven insights easier to achieve and share. Although the COVID-19 response kicked public sector data analytics efforts into high gear, states and localities have been steadily working to become more data-driven over the past several years. Twenty-eight states now have a chief data officer (CDO), and similar positions are found throughout local government. The rise of the CDO is just one indication of the push among states and localities to use data to improve internal operations, strengthen citizen services, improve safety, and boost transparency and engagement. Learn the latest insights from industry thought leaders in government performance and innovation initiatives in Carahsoft’s Innovation in Government® report.


Meeting the Requirements of the Supply Chain Imperative

IT modernization ranks as a top priority for the federal government, but it also further complicates a concern that agencies have faced for decades: managing the risks to their cyber supply chains. In May 2019, President Trump issued an executive order underscoring the danger that threats to federal information and communications technology supply chains pose to the U.S. Four months later, the Cybersecurity and Infrastructure Security Agency (CISA) published a report identifying nearly 200 security threats to these supply chains, including counterfeit components, poor product designs, and malicious hardware and software. For federal IT supply chains, security missteps can damage the economy, national security and even public health. Learn the latest strategies for managing supply chain risk in “Meeting the Requirements of the Supply Chain Imperative,” a guide created by GovLoop and Carahsoft featuring insights from leading technology thought leaders.