Best of What’s New in Legacy Modernization

 

The pandemic changed the risk equation for state and local governments around technology upgrades. In the past, state and local government CIOs had created orderly multi-year plans to push toward modern technologies, carefully weighing numerous factors and often facing pushback from public officials who didn’t want to fund updates if the old systems were still chugging along. In 2021 – after a vast shift to remote work, the increase in user-friendly digital services, and the innumerable changes to individual agencies brought on by the coronavirus – the modernization of legacy technology is seen through a new lens. Read the latest insights from industry thought leaders in legacy modernization in Carahsoft’s Innovation in Government® report.

 

Moving Modernization Forward in Spite of Disruption

“State and local CIOs are dealing with challenges that none of us ever thought they would have to face. The two most important things they can do are to drive automation and focus on hybrid cloud solutions. We all know the cloud is here to stay. We also know that legacy systems will take too long to migrate completely to the cloud. Embracing a hybrid cloud approach around modern solutions, where you can be partly on-prem as well as in the cloud, is going to help drive modernization and help systems become more effective more quickly. The second piece is automation. Automation has come a long way. It allows organizations to re-factor their workforce into their mission while automating simpler tasks. Artificial intelligence and machine learning are part of this and will become increasingly important as state and local leaders look to improve responsiveness and citizen engagement — both now and in the future.”

Read more insights from Red Hat’s Vice President of State and Local Government and Education, Nancy Bohannan.

 

It All Starts with Collaboration

“A DevSecOps approach takes DevOps culture and methodologies and incorporates security from the very beginning. This brings enormous value to legacy modernization efforts. Many legacy systems were built using waterfall methodologies. That means they may not be regularly scanned for vulnerabilities, or they were simply not built to handle modern scale. DevSecOps helps you avoid these issues. First, you will be more agile, as we’ve seen with DevOps. Second, you will build systems that are inherently more secure. Instead of thinking about security after a system is built and in production, you are doing so from day zero and doing so continuously even after you’ve ‘shipped’ it. This is especially critical in cloud environments where shared resources and multi-tenancy are the norm rather than the exception.”

Read more insights from Atlassian’s State and Local Manager, Shayla Sander, and Solutions Engineer, Ken Urban.

 

Finding Opportunities for Modernization

“The pandemic pushed most organizations into firefighting mode. They don’t have the luxury of doing wholesale rewrites of legacy software, which often take years. At the same time, organizations need to make these systems more efficient in order to serve constituents and improve operations — especially during the pandemic. Instead of replacing systems, organizations are augmenting them by putting new technologies on the front end. These efforts solve some of the immediate problems; however, many legacy challenges remain because organizations just haven’t had time or resources to do the rewrites.”

Read more insights from Dell Technologies’ Chief Strategist and Innovation Officer, Tony Powell.

 

Contact Center Modernization: Raising the Bar on Customer Service

“Modernizing how you serve citizens should be a continuous process. Methods of communication change. Technology improves. A pandemic exposes weakness in an entire process. And all of these things must be addressed in the context of resource constraints. Organizations should look across their constituency and current platform and ask questions such as: Are we communicating effectively? Do we have the necessary tools to properly manage resources? Do we have a business continuity plan? Is owning and managing technology the best use of our resources? Regardless of the question, the key is to be proactive in your evaluations.”

Read more insights from Genesys’s Director of Solution Consulting for the North American Public Sector, Chad Cole.

 

4 Tips for Advancing IT Procurement

“Identifying and analyzing potential risks upfront can give jurisdictions more options to address urgent IT needs when a crisis hits, Paneque says. ‘Through a risk assessment you can delineate potential scenarios you might face in the future and where you can substantiate an emergency procurement to stabilize them,’ he says. ‘Later, you can roll that approach into less urgent kinds of requirements that can either be sourced through existing contracts or with typical methods of procurement like an RFP.’”

Read more insights from former New York State Chief Procurement Officer, Sergio Paneque.

 

Download the full Innovation in Government® report for more insights from these legacy modernization thought leaders and additional industry research from GovTech.

Agencies Build Foundation for DevSecOps Success

Since the development of the internet, IT professionals have been in an “arms race” with bad actors. DevOps emerged as a way to restructure the development process, bringing developers and operations teams together to create new applications and shortening the long cycle of vulnerability discovery and software patching. But security still needed a seat at the table. The newest approach is DevSecOps — both a software engineering approach and a culture that promotes security automation and monitoring throughout the application development lifecycle. DevSecOps is designed to break down barriers to collaboration among development, operations and security teams so they all can contribute to creating new applications. Organizations can deploy new apps with secure, efficient, functioning code — with security as the foundation rather than an afterthought. To learn more about how your agency can use DevSecOps to reduce lead time and mean time to recovery, increase deployment frequency, and cut operating costs almost in half, get up to date with “Agencies Build Foundation for DevSecOps Success,” a guide created by GovLoop and Carahsoft featuring insights from the following technology and government DevSecOps thought leaders.

 

Embracing Machine Identity Management

“One of the advantages of modern IT services is that they leverage both physical machines (computers and other devices) and virtual machines (e.g., applications, containers and code) to exchange data and execute tasks without human intervention. That makes it possible to design services that are fast, flexible and reliable. But it also raises an important security question: How do you know whether those machines can be trusted?  That’s a question of identity management.”

Read more insights from Venafi’s Senior Product Marketing Manager, Eddie Glenn.

 

The Playbook for Innovating Quickly, Expansively and Securely

“Government adoption times can be taken for granted – people aren’t surprised when something takes three years to build or 12 months to implement. Those are common refrains that often go unquestioned. They shouldn’t. Cloud changed the game by allowing agencies to spin up networks instantaneously. And that was just the beginning. Throw in microservices architectures and agile development methods that have security and operations built in; now you’re getting down the court, faster than before.”

Read more insights from SAP NS2’s Cloud Director, Dean Pianta.

 

How Developers Can Become a Security Asset

“When it comes to security, IT experts often talk about the importance of ‘shifting left,’ that is, addressing security earlier in the development lifecycle. But it’s not just security that shifts left with DevOps. In traditional IT environments, developers were expected to adhere to a detailed IT architecture, which was updated periodically. To take advantage of today’s rapid rate of innovation in technologies and architectural approaches, agencies need to give developers more leeway to decide what languages, toolsets and capabilities they might need to build an application.”

Read more insights from Red Hat’s Cloud Native Transformation Specialist, Michael Ducy.

 

Enabling Agencies to Succeed with DevSecOps

“Instrumentation provides benefits both to the application security team and to developers. For the application security team, the tool soup approach often results in so much data, and so many false positives, that they have a difficult time gleaning intelligence from it. The unified picture provided by an instrumentation platform eliminates the noise so that the team can identify and remediate problems quickly. Instrumentation can also provide accurate feedback directly to developers, so that they can fix vulnerabilities as part of their normal work.”

Read more insights from Contrast Security’s Co-Founder and CTO, Jeff Williams.

 

DevSecOps Teams Require a Robust Orchestration Platform

“DevSecOps, by definition, is intended to promote collaboration among the development, security and operations teams. But Chow emphasized that such collaboration needs to begin at the outset of a project, when defining its goals and strategy. The idea is to define the overarching goal or mission of the project, then have each team prioritize its own needs and goals as they relate to that mission, said Chow. Those secondary goals become the building blocks for the strategy and shape the development and orchestration of the application pipeline, he said.”

Read more insights from F5’s Senior DevOps Solution Engineer, Gee Chow.

 

How Culture Drives DevSecOps Success

“‘When people talk about DevSecOps, they often focus on improving communications between developers and the security team. But organizations need to foster open and transparent communications at every layer of management, from the top down,’ Urban said. In particular, developers can benefit from understanding how their work fits into the larger mission – and why particular security constraints are important. ‘Good healthy communication means staying as open and transparent as you can be without compromising that security,’ he said.”

Read more insights from Atlassian’s Public Sector Evangelist, Ken Urban.

 

Modern Cloud Security Requires an Agile Approach

“Automation also paves the way to change how agencies approve IT systems for use. In a standard Authority to Operate (ATO) process, a system owner must implement, certify and maintain required security controls. The problem is that certification is based on a snapshot in time, whereas in modern cloud environments, change is constant. Systems can ‘drift’ from compliance over time as new threats arise. Modern cloud solutions offer architectures leveraging containers that perform discrete tasks within a microservice environment and are in constant flux with application updates, vulnerabilities/threats, policies, etc.”

Read more insights from Palo Alto Networks’ Chief Security Officer of Public Cloud, Matt Chiodi, and Senior Product Manager, Paul Fox.

 

DevSecOps Drives Change at the Air Force

“Another challenge is how to change the culture at government agencies that are not used to major shifts in culture and may actually be averse to it. DoD is still full of silos, he said in October 2020 during Amazon Web Services’ National Security Series. ‘It goes down to even like basic partnerships.… We have so many silos and that’s really part of the reason as to why we cannot really scale things, and why we reinvent the wheel and why we don’t do very well with enterprise services,’ Chaillan said.”

Read more insights from the Air Force’s Chief Software Officer and Head of Platform One, Nicolas Chaillan.

 

Army Futures Command Makes DevSecOps a Long-Term Priority

“For agencies thinking of starting DevSecOps programs, Errico has advice: ‘Spend time conducting industry analysis of use cases both inside and outside the federal space. This is very much an emerging technology, and you have to figure out the right way it will fit for your organization. That takes time and thoughtful, honest analysis.’ Once the commitment is made and a DevSecOps program is in place, he said, comes the challenge of maintaining — and expanding — cultural change.”

Read more insights from the Army Futures Command’s Software Factory Lead, Maj. Vito Errico.

 

U.S. Transportation Command Cultivates a Team Mindset

“Unlike Platform One or the Software Factory, the DevSecOps program at U.S. Transportation Command is embedded in a unified, functional combatant command that provides support to the other 10 U.S. combatant commands, the military services, defense agencies and other government organizations. That means it serves many kinds of military organizations, providing strategic mobility capability through its own vast infrastructure of people, information systems, trucks, aircraft, ships, trains and railcars. It also means the command may consider itself a transportation organization or a strategic logistics organization, but it doesn’t necessarily view software as an essential element of its mission in the way the services do, for instance.”

Read more insights from U.S. Transportation Command’s Chief of DevOps, Christopher Crist.

 

Download the full GovLoop Guide for more insights from these DevSecOps thought leaders and additional government interviews, historical perspectives and industry research on the future of DevSecOps.

Cloud’s Burning Questions: Hybrid, MultiCloud, Emerging Tech, and More

Over the past year there’s been a big shift in federal agency policy from Cloud First to Cloud Smart. Even with the Cloud First mandate, a lot of workloads remain on-premises. Now, agencies want to be smart about moving to the cloud.

Moving to the Cloud

One of the biggest stumbling blocks in transitioning from on-premises to off-premises services is knowing which applications to move first. The Federal CIO Council’s Application Rationalization Playbook encourages CIOs to consider both business value and technical fit when making such decisions, since the best candidates are applications that score high on both. However, you also want to consider applications with high business value and low technical fit — like an old monolithic application that needs to be modernized.
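As a rough illustration of that two-axis triage, here is a minimal Python sketch; the application names and scores are hypothetical, and real rationalization efforts weigh many more factors:

```python
# A minimal sketch of the business-value / technical-fit triage described
# above. All application names and scores are hypothetical.
portfolio = {
    # app: (business_value, technical_fit), each scored 1-10 by stakeholders
    "permit-portal":   (9, 8),  # high value, cloud-ready: move first
    "benefits-intake": (9, 3),  # high value, aging monolith: modernize
    "fax-gateway":     (2, 2),  # low value, poor fit: retire
}

for app, (value, fit) in sorted(portfolio.items(), key=lambda kv: kv[1], reverse=True):
    if value >= 7 and fit >= 7:
        plan = "migrate first (rehost or replatform)"
    elif value >= 7:
        plan = "modernize, then migrate (refactor)"
    elif fit >= 7:
        plan = "migrate opportunistically"
    else:
        plan = "consider retiring or consolidating"
    print(f"{app}: value={value}, fit={fit} -> {plan}")
```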

In addition, COVID may have reorganized your agency’s priorities, so reexamine your existing plans. If you did an earlier rationalization of your portfolio, you should take another look. The unemployment modernization effort that you had on the back burner may need to move to the front burner while other things are pulled back.

Securing the Cloud

Many people still think that the cloud cannot possibly be secure. The reality is that the sheer scale of cloud providers’ systems and security staff helps keep things secure. But agencies must ensure that their security posture is consistent — whether workloads run on-prem or in a public cloud. Automating security controls enforces that consistency, ensuring that you’re not creating holes in one environment while another is secure.
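One way to picture that automation is to define the security baseline once and check every environment against it. The sketch below is a deliberately simplified illustration; the control names and inventories are invented, not any particular product’s API:

```python
# A minimal sketch: one baseline, checked identically everywhere.
# Control names and inventory data are hypothetical.
BASELINE = {"encryption_at_rest": True, "mfa_required": True, "public_access": False}

inventories = {
    "on-prem":      {"encryption_at_rest": True, "mfa_required": True,  "public_access": False},
    "public-cloud": {"encryption_at_rest": True, "mfa_required": False, "public_access": False},
}

for env, actual in inventories.items():
    # Any control whose actual value differs from the baseline is drift.
    gaps = {k: v for k, v in BASELINE.items() if actual.get(k) != v}
    status = "compliant" if not gaps else f"drift detected: {gaps}"
    print(f"{env}: {status}")
```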

Embracing Open Source

Government agencies can be wary of open source applications — but a great idea is a great idea no matter where it comes from, and open source is a great way to share best practices with a community. For one example… [think about] all the taxpayer money that has been spent on locking down a web server running Red Hat Enterprise Linux over and over again in the government. A lot of the “authority to operate” (ATO) paperwork hasn’t been reused at all.

Wouldn’t it be great if that paperwork were available so other agencies could not only use it, but improve upon it and make the security even stronger? That’s what [Red Hat’s] Compliance as Code project is, which allows people to get that ATO a lot faster and for a fraction of the cost – and that’s all thanks to open source.
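As a rough sketch of what compliance as code looks like in practice, the snippet below shells out to the OpenSCAP scanner with a profile built from ComplianceAsCode content; the exact profile ID and datastream path are assumptions that vary by operating system release:

```python
# A rough sketch, assuming a RHEL host with OpenSCAP and the
# scap-security-guide (ComplianceAsCode) content installed. The profile ID
# and datastream path below are assumptions and differ across OS releases.
import subprocess

DATASTREAM = "/usr/share/xml/scap/ssg/content/ssg-rhel9-ds.xml"
PROFILE = "xccdf_org.ssgproject.content_profile_stig"

subprocess.run(
    ["oscap", "xccdf", "eval",
     "--profile", PROFILE,
     "--results", "results.xml",  # machine-readable evidence for the ATO package
     "--report", "report.html",   # human-readable report for assessors
     DATASTREAM],
    check=False,  # oscap exits non-zero when any rule fails; inspect the report
)
```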

Transitioning from Proprietary to Open Source

Agencies expect the divide between proprietary and open source to be more binary than it is. You don’t have to go all open source or all proprietary. Instead, pick the right blend that works for you. For example, you can run a proprietary database on an open source operating system on a proprietary hypervisor. Agencies can do so as well if they decide where to standardize, where to be in the stack and where to lay that open substrate.

Do you want it at the operating system level? At the Red Hat Enterprise Linux level, you could span on-premises data centers, the public cloud, even multiple cloud vendors. Or do you want to go higher up the stack, at say the Platform-as-a-Service layer, where you use OpenShift and Kubernetes? That allows further abstraction and more focus on the actual mission applications themselves. The important thing is making the move to the cloud a conscious decision.

Achieving Success in the Cloud

U.S. Citizenship and Immigration Services has taken its legacy monolithic applications and broken them down into containerized microservices on top of OpenShift, which can run in the public cloud or on premises; the portability is built in.

But the agency did not just lift and shift the application over. It also looked at the people and the processes — like changing from a waterfall model to agile and DevOps. Changing those processes — adding security and shifting it left so it sits with the developers and operations teams from the start instead of being an afterthought — helped foster a very strong culture that encourages employees to focus on the mission.

Visit our website to learn more about the GovForward: Multicloud Series and FedRAMP through our additional resources.

The State of Artificial Intelligence in Government

Government agencies have been discussing artificial intelligence (AI) for more than a decade, and as technology and legislation progress, the focus on public sector impacts is stronger than ever. A 2019 executive order highlights American leadership in AI as key to maintaining the economic and national security of the United States. The Trump administration has also issued regulatory guidance on AI, instructing all federal agencies to prioritize and allocate funding for AI programs that serve their individual missions. Numerous national agencies and even multinational partnerships have identified AI as a priority. AI’s similarity to human intelligence means it could potentially impact every corner of society, from cybersecurity to medicine. To learn more about how your agency can use AI to analyze data, recognize patterns and automate manual tasks, get up to date with The State of AI in Government, a guide created by GovLoop and Carahsoft featuring insights from the following technology and government AI thought leaders.

 

AI Requires a New Approach to High-Performance Computing

“High-performance computing (HPC) needs to evolve. The traditional HPC architecture, now decades old, worked well for previous generations of HPC applications. But today’s applications, driven by AI, require a new approach. The problem? The old systems were too static. That wasn’t a problem when applications had static performance requirements. But AI is different. When developing an AI system, the workload changes from one stage of the process to another.”

Read more insights from Liqid’s Public Sector Chief Technology Officer, Matt Demas, and Director of Sales, Eric Oberhofer.

 

Bring AI to the Edge

“Legacy computing structures always glued data scientists to data centers. The two were tethered together, meaning scientists couldn’t work where the data didn’t reside, much like how a lab scientist needs their lab chemicals and instruments. Data science, however, is not entirely like lab science, because endless inputs come outside of a controlled environment. AI models are most effective when exposed to open air. The solution is to bring software-based applications to the edge, except for massive data projects.”

Read more insights from HPE’s Defense Department Account Team Technologist, Jeff Winterich, and Red Hat’s Public Sector Staff Solutions Architect, Ryan Kraus.

 

3 Ways Cloud Improves AI

“Cloud-based AI can help agencies move faster. During the pandemic, it has. One example is automating document workflows so that AI replaces manual data entry and extracts metadata to enhance search capabilities. As a result, AI speeds up timelines for constituents. Without having to wait on employees to manually enter data or respond to simple queries, citizens receive the front-facing information and services they need faster. Agencies can build AI faster in the cloud, too. Developers access capabilities through simple application programming channels, so they don’t have to build or integrate models from scratch. Cloud services like Amazon SageMaker remove the busywork and infrastructure so that data science teams are more productive and efficient when rolling out [machine learning].”

Read more insights from AWS’s Tech Business Development Manager of AI and ML for the Worldwide Public Sector, Joe Pringle.

 

How AI Demands a New Vision of the Data Center

“Technology originally developed to improve PC-based gaming and multimedia applications nearly 30 years ago is now driving advances in the field of artificial intelligence. In the early 1990s, when PC gaming was beginning to take off, the Graphics Processing Unit (GPU) was invented by NVIDIA to render an image by breaking it up into multiple tasks that could be executed in parallel. Today, the same approach accelerates processing for a wide range of applications, not just on PCs but also on the world’s fastest computers.”

Read more insights from NVIDIA’s Vice President of GPU Data Center Architecture, Curt Smith.

 

DoD’s Battle Against COVID-19, With AI at the Helm

“When you’re talking about a domestic threat like COVID-19, for us to, for instance, predict how COVID-19 is going to be affecting a certain military installation, you might need data from things that would be nontraditional DoD data. So, you might need data from CDC, [or] from Department of Labor when it comes to unemployment. So, these sorts of datasets I think are really hard for the DoD to have, because they’re not traditional military data. But at the same time, for us to do accurate modeling, we do need datasets like that. So, this project had a lot more sort of rigorous policy review for data, more so than a project like predictive maintenance, for instance.”

Read more insights from the Chief of Policy at the Department of Defense’s Joint Artificial Intelligence Center, Sunmin Kim.

 

Using AI to Improve Veteran Care and Save Lives

“It’s been an amazing journey from a veterans’ experience perspective. The Veterans Experience Office came out of the crisis of Phoenix, when there were the issues with the lists of appointments and veterans were not getting timely appointments – and the data was showing things differently. We did not have the customer datasets. We had a lot of operational data, we had a lot of financial data, but we did not have necessarily the data for [customers]. And I think that from the customer perspective, I think that’s a key aspect with AI. You can’t have AI if you don’t have the right data in place … and that’s something the VA has been very diligently working on.”

Read more insights from the Department of Veterans Affairs’ Chief of Staff at the time of the interview, Lee Becker; Director of Enterprise Measurement, Anil Tilbe; and Acting Executive Director of Multichannel Technologies, Laura Prietula.

 

Improving Public Health Through AI

“Traditionally, public health plays the role of a data aggregator. We’re collecting large volumes of information because we’re interested in understanding how often illnesses or injuries occur, not just at an individual level, but across entire communities or entire populations as a country at large. And we use that information to try to understand why those diseases or injuries occur, and then we use that to take action that will allow us to address really significant threats to the public health at their source. AI can play a role at many different places in that information chain.”

Read more insights from the Centers for Disease Control and Prevention’s Entrepreneur in Residence, Paula Braun.

 

Download the full GovLoop Guide for more insights from these artificial intelligence thought leaders and additional interviews, historical perspectives and industry research on the future of AI.

Best of What’s New in Cloud Computing

This may be a make-or-break moment for jurisdictions newly converted to the cloud. As state and local governments scrambled to respond to new COVID-driven requirements, cloud-based contact center platforms, chatbots and web portals helped multiple states and localities quickly scale capacity for unemployment insurance and social services programs. In addition, cloud-hosted video collaboration platforms helped agencies shift employees to remote work on the fly and virtualize public meetings. IT leaders must now evaluate and rationalize the multiple cloud solutions they adopted so quickly. Now is also the time to look at cost optimization for cloud solutions. The COVID response has showcased real-world benefits of the cloud — and that experience is likely to accelerate a trend that was already underway as governments focus more attention on modernizing old systems and applications in the wake of the pandemic. Read the latest insights from industry thought leaders in cloud in Carahsoft’s Innovation in Government® report.

 

Cloud Migration as a Path to Modernization

“While there may be an increase in initial costs associated with modernizing legacy technology, the economics strongly indicate that maintaining dated infrastructure is more expensive in the long term. The biggest hurdle organizations face when migrating to the cloud is unpredictable costs. The cloud offers tools and resources to optimize investments and plan for the costs associated with migration. In addition, properly planning your move to the cloud helps agencies accurately budget for such a transition. When they do this correctly with the guidance of a strong partner, state and local governments see significant cost savings.”

Read more insights from the Partner Development Manager for Carahsoft’s AWS Team, Sehar Wahla, and the Sales Director for Carahsoft’s AWS Team, Tina Chiao.

 

How Does Evolving Cloud Adoption Impact Security?

“One approach is to standardize processes — think NIST or MITRE — so you have a common framework and language for measuring things like risk and attacks. That helps normalize the differences between cloud and traditional security so security teams can better understand what a risk actually means in a cloud environment. On the technology side, traditional threat profiling needs to move beyond the viruses and ransomware conversation and move toward user and entity behavior management, which looks at how users normally access and use an application. Organizations also need to articulate how separate applications securely exchange data for things like enterprise analytics. This is a nascent use case, but it has implications for critical systems where data integrity is important.”

Read more insights from McAfee’s Chief Technology Strategist, Sumit Sehgal.

 


“The biggest challenges include security, cost, having the technical expertise to successfully migrate into these hybrid environments and understanding which applications are best suited to run there. Organizations often spend a lot of time and money and introduce security vulnerabilities because they try to move applications that are not designed to run in a cloud environment. With the pandemic, organizations are under pressure to rapidly move their workforce into cloud environments. There can be a tendency to cut corners to save time, but these sacrifices can also create vulnerabilities.”

Read more insights from SAP NS2’s EVP of Software Development, Bryce Petty.

 

Paving the Way with Open Source

“There’s a realization that the cloud isn’t a silver bullet and that to be successful, organizations need to look at cloud adoption holistically. They need to take best practices into account when it comes to securing the environment, training and enabling staff, and even engaging in the procurement process. Open source supports a cloud smart strategy by helping eliminate vendor lock-in risk and technical debt. By using open source technology and an open source cultural process — where there’s transparency, collaboration and the ability to iterate quickly — organizations can solve their business problems and adapt their requirements based on emerging best practices. They’re not beholden to proprietary systems that may create friction for innovation and are potentially costly to replace, upgrade or move to the cloud.”

Read more insights from Red Hat’s Emerging Technology Lead, Frank DiMuzio.

 

Download the full Innovation in Government® report for more insights from these government cloud thought leaders and additional industry research from GovTech.

The Rise of Edge Computing

The proliferation of internet-of-things (IoT) sensors and an increasingly mobile workforce were dispersing government IT operations farther from the data center long before the coronavirus struck. But the pandemic has spotlighted agency employees’ increasing need for robust, secure capabilities in the field — or at home, in the case of remote work — and decision-makers need fast access to data analytics in a wide variety of situations. All those factors are driving interest in computing at the network edge: processing data where it is generated rather than shipping it to centralized storage. Edge computing has profound implications for a wide range of government missions across local, state and federal government, and with the emergence of 5G networks, it is becoming easier to incorporate. And if implemented thoughtfully, the benefits can be immense – reduced network stress, increased cybersecurity and savings in cost, time and storage. Read the latest insights from industry thought leaders in edge computing in Carahsoft’s Innovation in Government® report.

 

Streamlining the Adoption of Edge Computing

“Open source is a necessary component of edge computing for two main reasons. First, open source is much more secure than its proprietary counterparts due to the increased transparency. For edge deployments with hundreds or even thousands of sites, the challenges of initially securing those sites and then maintaining them can be solved with Red Hat open source technology. Second, open source supports a level of innovation most proprietary systems simply can’t match. When thousands of people work on a technology, that gives it a substantial advantage in terms of new ideas and accelerated innovation.”

Read more insights from Red Hat’s Practice Lead of OpenShift Virtualization, Storage and Hyperconverged Infrastructure in the North American Public Sector, Garrett Clark.

 

A Unified Approach to Edge Computing

“To avoid piecemeal implementation, edge computing must be part of an agency’s overall IT infrastructure. When done well, it will empower agencies to make more efficient and faster decisions because they’ll be able to harness more data from across the entire landscape. It will also give end users better and faster access to data in the field so they can take advantage of those insights in real time. Edge devices will not replace existing IT but instead will expand on what’s already in place. By incorporating edge computing into enterprise modernization, agencies can also start applying machine learning and other emerging technologies to harness the power of data. However, with edge devices and data now outside agencies’ firewalls, security must be embedded into edge computing. Important tools include automated security and centralized management, perhaps via the cloud.”

Read more insights from Nutanix’s Senior Director of Public Sector Systems Engineers, Dan Fallon.

 

How to Unleash the Power of Edge Computing

“Edge computing holds a great deal of promise as a stand-alone capability, but when paired with technologies such as advanced connectivity and enterprise data platforms, edge computing can fuel new customer and employee experiences at scale. When agencies combine edge computing with advanced connectivity, for example, they can empower rich, personalized experiences for customers as well as employees. Imagine moving from a 2D world of video consumption to a 3D world with immersive experiences personalized at scale for the individual. Edge computing coupled with advanced connectivity and SAP’s data platform can serve as the foundation to bring these new experiences to life. To help fuel this innovation, advanced connectivity such as 5G and Wi-Fi 6 play an integral role.”

Read more insights from SAP’s Vice President, Global Center of Excellence, Frank Wilde.

 

Accelerating Mission Success at the Edge

“Sometimes an agency will want to be in a cloud environment, sometimes it will choose an edge computing environment, and often, it will need both. In that situation, some quick analytics can happen at the edge, but then the data can move to the cloud for a deeper evaluation that will draw out more predictive insights and analytics. There are three key considerations agencies should keep in mind when moving to edge computing. First, they should think about it as part of a larger continuum alongside their core technologies, including cloud. Second, agencies should design for consistency in management and orchestration. Regardless of where a workload is running, a consistent approach helps agencies manage IT resources and costs and allows the organizations to scale and expand. The third consideration is more far reaching, but I encourage agency leaders to think about the opportunities that edge computing opens up.”

Read more insights from Dell’s Global Marketing Director of Edge and IoT Solutions, Kirsten Billhardt.

 

Beyond the Data Center and the Cloud

“We expect the number of connected devices to reach nearly 45 billion by 2025, gathering close to 80 zettabytes. Unfortunately, sending that growing amount of data to the cloud for processing is not always the best option due to bandwidth limitations and cost concerns. Many government systems are also not connected to the cloud and need to process data locally. Edge technology evolved to meet those challenges by bringing the advantages of cloud closer to the edge. Business applications enabled by edge computing include autonomous delivery, machine control, environmental monitoring, fleet vehicle diagnostics, vision-based analytics and defect detection. Edge computing is particularly beneficial in two situations: when a great deal of data needs to be migrated to the cloud for storage but there is little or no bandwidth and when data needs to be collected and acted on quickly at the edge (e.g., autonomous vehicles and drones).”

Read more insights from AWS’s Principal Technical Business Development Leader for IoT in the Worldwide Public Sector, Lorraine Bassett.

 

Edge: The Next Paradigm Shift in IT  

“Agencies can protect their data and applications across any cloud strategy (including on-premises, private, hybrid, multi-cloud or edge computing) with a cloud-agnostic, edge-based Web Application and API Protection (WAAP) solution. A globally distributed WAAP will protect websites, applications and APIs from downtime and data theft due to web attacks and distributed denial-of-service (DDoS) attacks. All network-layer DDoS attacks, including those by large IoT botnets, are instantly dropped at the edge because a WAAP functions as a reverse proxy and only accepts traffic via ports 80 and 443. Any application-layer DDoS or web attack will be automatically inspected and stopped at the edge without disrupting access for legitimate users. Additionally, modern application architectures are shifting toward greater use of microservices and away from monolithic pieces of software. Small, independent microservices are assembled into more complex applications so they can leverage fully functional and distributed processes from third-party APIs.”

Read more insights from Akamai’s Senior Vice President of Web Performance, Lelah Manz.

 

Download the full Innovation in Government® report for more insights from these government edge computing thought leaders and additional industry research from FCW.

As Agencies Embrace Container Technology, Monitoring Challenges Emerge

Container technology is catching on big-time in the federal government as agencies such as the USDA and the National Institutes of Health look to containers to simplify software development and reduce costs.

This isn’t a big surprise. Containers offer enormous advantages over traditional “waterfall” application development processes, where monolithic systems are developed and released in a single effort—a time-consuming and onerous task. It’s also a risky approach. Once deployed, it’s impossible to change or upgrade the software without causing downtime and lost productivity.

A containerized approach to application development works quite differently, making it easier for developers to create and deploy software faster and with fewer errors. To understand why, let’s look at what containerization is and the key benefits it delivers. Then we’ll examine some best practices for overcoming one of the biggest challenges containerization poses for IT professionals: performance monitoring.

Containers Help Feds Develop Apps Faster and at a Lower Cost

A container is an entire runtime environment: an application plus all the dependencies needed to run it, bundled in a single package. Multiple containers can reside on a single server, sharing the same operating system kernel along with the server’s CPU, memory, and storage. This leads to markedly lower costs.

Containers allow developers to build applications in a smaller, simpler and more modular way. Rather than run an entire monolithic application inside a single machine, containers enable an application to be split into smaller modules known as microservices. When a new application feature or update is required, the microservice or code update can be easily tested for errors during the development phase and deployed quickly without having to rebuild the entire application. This allows agencies to deliver services with more agility and speed.
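To make that concrete, here is a minimal sketch of one such microservice, using Python and Flask; the service and endpoints are purely illustrative. Because it is self-contained, it can be tested and redeployed in its own container without rebuilding anything else:

```python
# A minimal, illustrative microservice: one small unit of a larger
# application, packaged and deployed in its own container.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/status/<case_id>")
def case_status(case_id: str):
    # A real service would query a database or call another microservice.
    return jsonify({"case": case_id, "status": "in-review"})

@app.route("/healthz")
def health():
    # Lightweight health check for the container orchestrator to poll.
    return "ok", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```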

Containers Aid With Scalability

Elasticity is another big benefit. For instance, during the early days of the COVID-19 pandemic, unemployment agencies experienced a spike in website and phone traffic as millions of citizens rushed to file for benefits. These agencies lacked the resources needed to manage this spike. With containers, however, IT administrators can easily scale their applications up during peak hours or down when they’re not needed. They can do so without having to rearchitect applications to deal with scaling requirements, potentially helping them save a significant amount of money.
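On a Kubernetes cluster, for example, that scaling action can be a single API call. The sketch below uses the official Kubernetes Python client; the deployment name, namespace and replica counts are hypothetical:

```python
# A minimal sketch using the official Kubernetes Python client
# (pip install kubernetes). Deployment name, namespace and replica
# counts are hypothetical.
from kubernetes import client, config

def set_replicas(name: str, namespace: str, replicas: int) -> None:
    config.load_kube_config()  # reads cluster credentials from ~/.kube/config
    client.AppsV1Api().patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )

set_replicas("benefits-portal", "default", replicas=20)  # scale out for peak filing hours
set_replicas("benefits-portal", "default", replicas=4)   # scale back in afterward
```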

Containers Pose Unique Monitoring Challenges

Despite their many benefits, containers occupy an ephemeral, virtualized environment that poses performance monitoring challenges. Open-source container monitoring tools can help, but they require instrumentation to achieve the desired monitoring capabilities. Federal IT pros need specialized tools designed to orchestrate container monitoring in a simple and easy-to-use way.

When considering tools, it’s important to remember true visibility goes beyond insights into containers themselves. IT administrators must monitor containers alongside other network and system components. If an application fails, they need solutions capable of quickly identifying where the problem resides. Is it within the container, the network, or the server? Is it happening on-premises, in the cloud, or in a hybrid environment?

Monitoring can also inform you when a container is reaching capacity. With this understanding, IT teams can spin up additional containers manually or automatically and maintain continuity of service (and the mission). Used historically, this data can also inform optimal capacity planning.
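As a simple illustration, the sketch below polls per-container CPU and memory through the Docker SDK for Python and flags containers nearing capacity; the 80 percent thresholds are illustrative, and a production setup would feed these metrics into a dedicated monitoring tool:

```python
# A minimal sketch, assuming the Docker SDK for Python (pip install docker)
# and containers running on the local host. The 80% thresholds are
# illustrative only.
import docker

client = docker.from_env()

for container in client.containers.list():
    s = container.stats(stream=False)  # one-shot snapshot rather than a stream

    # Docker reports cumulative CPU counters; usage is the delta between
    # this sample and the previous one, relative to total system CPU time.
    pre = s.get("precpu_stats", {})
    cpu_delta = (s["cpu_stats"]["cpu_usage"]["total_usage"]
                 - pre.get("cpu_usage", {}).get("total_usage", 0))
    sys_delta = (s["cpu_stats"].get("system_cpu_usage", 0)
                 - pre.get("system_cpu_usage", 0))
    cpus = s["cpu_stats"].get("online_cpus", 1)
    cpu_pct = (cpu_delta / sys_delta) * cpus * 100 if sys_delta > 0 else 0.0

    mem = s["memory_stats"]
    mem_pct = 100 * mem["usage"] / mem["limit"]

    if cpu_pct > 80 or mem_pct > 80:
        print(f"{container.name}: CPU {cpu_pct:.0f}%, memory {mem_pct:.0f}% "
              "- nearing capacity, consider scaling out")
```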

Beyond capturing performance metrics and alerts, container monitoring tools should also enable code profiling and trace-level visibility so IT teams can understand what’s happening inside their applications, such as instances of code potentially causing system degradation.

Monitoring Tools Enhance Container Performance

Containers offer many advantages to federal agencies: delivering software faster, lowering risk and costs, and easing the path to application modernization. But as container use increases, agencies must find ways to manage and monitor their environments. Without visibility into their containers and their many interdependencies, it’s impossible for IT teams to know if their investments are performing as expected, scaling as they should, and—most importantly—delivering highly available services to users and constituents.

Explore our resources to learn more about Continuous Monitoring and other protective monitoring infrastructure.

Evolving Kubernetes into an Enterprise Container Platform

State agencies and academic institutions are increasingly challenged to keep up with the speed of innovation while meeting stakeholder demands and expectations. By turning to container-based services, organizations enable efficient, affordable application delivery and cloud migration. Kubernetes, an open source platform, is the industry standard in container orchestration technology, but managing and running “do it yourself” Kubernetes is easier said than done.

Running on Containers

It’s almost cliché to say at this point, but we live in a digitally connected world where things are moving faster than ever before. That has never been more true than in 2020, when one challenge after another forced us to change the way we do business. How can government keep up with the speed of innovation and meet the expectations of its constituents?

Containers as a whole represent a fundamental shift in thinking about IT resources. We no longer think about machines or virtual machines, but instead about applications and capabilities. This itself is extremely empowering, and we’ve seen the public sector really start to embrace this idea to support its missions.

Central IT organizations are increasingly experiencing agency requests for container solutions. But implementing containers requires answering a lot of questions. How are they different from virtual machines? How can we use the technology in a secure, multi-tenant way? How can we containerize our applications? How can we shift our current development processes to suit containers?

Taking advantage of all that containers have to offer requires a tool to manage and orchestrate them. This is exactly where a platform like Kubernetes comes into play. It gives organizations and agencies the power to not only stretch their current infrastructures further, but also to pave the way for future innovation by enabling things like microservices or even serverless event-driven architectures.
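To give a feel for what that orchestration looks like in practice, the sketch below asks Kubernetes to run and continuously maintain three replicas of a containerized service through the official Python client; the service name and image are hypothetical:

```python
# A minimal sketch (service name and image are hypothetical) of handing a
# workload to Kubernetes, which then keeps three replicas running and
# reschedules containers whenever one fails.
from kubernetes import client, config

config.load_kube_config()  # reads cluster credentials from ~/.kube/config

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="records-api"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "records-api"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "records-api"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="records-api",
                    image="registry.example.gov/records-api:1.4",
                )
            ]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```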

Open Source, Open Culture

Container orchestration is powerful in what it allows organizations to do. It supplies a common platform where developers, security teams and operations can all work and collaborate. It enables rapid innovation by standardizing and automating based on guidelines that teams themselves can design and implement, ensuring that their best practices and security policies are rigorously followed.

The technology itself is important, but its real impact is the cultural shift that it enables. These tools and platforms allow the conversation to focus on the people and processes that bring the agency’s mission to life: collaboration, openness, transparency, and adaptiveness.

Containers permit you to restructure teams into centers of command, organizing them so they can use their particular expertise to solve problems. This is supported by technology that allows organizational control and guardrails to ensure everything is accomplished safely—providing innovation without sacrificing security and compliance. You can stop thinking about short-term technological necessities; your leadership and visionaries can plan the organization’s mission for years and decades into the future—not weeks and months.

Enterprising Kubernetes

Kubernetes is a powerful piece, but that’s all it is: one piece of the whole puzzle. Kubernetes alone might be sufficient for a small project, but making it a platform for enterprise and government is a very different story. There are a number of additional decisions required to make Kubernetes enterprise ready. And even after you make these decisions, you still have the work of configuring, integrating, operating, and supporting each of these pieces.

Let’s talk about concrete examples. What operating system will your nodes run?  What container runtime will you use? What is your image registry solution?  How about networking, load balancing and routing? Or log management or metrics?

Wouldn’t it be nice to have a product where all these decisions were already made? Wouldn’t it be great if everything was configured, hardened, and then rigorously tested using those configurations? Then as security patches and updates came out, you wouldn’t have to worry about hunting down and patching each component; instead you would have a place to apply those patches and updates with a single button click.

This is exactly what a supported enterprise Kubernetes platform provides. Vendors take the innovation of the open source community around Kubernetes, Prometheus, Grafana, CRI-O, CoreOS, and many other projects—and make these technologies accessible for enterprises and government by essentially taking ownership of these projects and ensuring they are secure, safe, and stable.

What happens if there is an issue with one of these projects? Would you want to rely on a group of volunteers to address your issue? Wouldn’t you rather have a dedicated team of engineers whose sole responsibility is to find and address such issues? The vendor provides the team and process around not only proactively looking for vulnerabilities but also fixing and delivering updates and patches securely to customers.

Doing More with Less

There are many hidden costs with implementing a “do it yourself” Kubernetes solution.  Assembling all of these different projects together by configuring, securing and hardening them requires teams to take responsibility and ownership—not only of the assembly but also of the long-term maintenance efforts. For instance, in just the past 3 years, Kubernetes has had 95% of its code changed. Imagine the time required to vet and then integrate all of these rapid changes into your platform. And don’t forget to account for the time required for responding to critical vulnerabilities discovered in your various projects.

Nevertheless, teams with very talented engineers still ask, “We have the technical know-how and capabilities to stand up Kubernetes as well as all of the other pieces together ourselves. Why wouldn’t we just do that?”

The answer is very simple. “Wouldn’t you rather have your talented engineers work on your organization’s mission—executing and bringing your ten-year timeline to life?” You could devote all or part of your team to building and maintaining a cloud platform, but their time and energy would be better dedicated to serving your constituents and achieving your mission.

In addition, simplifying operations and saving money can be particularly appealing to government and educational organizations during this time of budgetary constraints—as more and more resources are devoted to the pandemic response. Healthcare agencies are overwhelmed with new challenges. Transportation agencies face diminished ridership and revenue. And educational institutions must coordinate their curriculums in entirely new ways. If your organization is being asked to do more with less, shifting your current development processes to suit containers may be the solution.

Reach out today and download Red Hat’s infographic to discover how containers drive application modernization in the public sector.

Building a More Secure Cloud

Government officials nationwide had to accelerate modernization initiatives to ensure that teleworking employees could access networks and data from remote locations. For many agencies, that meant a higher reliance on cloud technology and a possible expansion of their cybersecurity vulnerabilities in an environment already attractive to hackers. In response to the security challenges raised by the cloud, the federal government has provided myriad foundational documents, guidelines and strategies to help agencies create a strong security posture, including the Cloud Smart strategy and Federal Risk and Authorization Management Program (FedRAMP). Cloud technology has a crucial role to play in agencies’ ability to modernize IT systems and take advantage of the latest technological innovations. Given this importance, cloud adoption must keep pace with security efforts. Read the latest insights from industry thought leaders in government cloud security and FedRAMP in Carahsoft’s Innovation in Government® report.

Cloud and the Customer Experience

“The emphasis on user-centered design is changing the way applications are created. In the past, many government applications were built from the perspective of the agency rather than from the perspective of the end user. The flexible, innovative nature of cloud technology makes it easier for agencies to improve the efficacy of their applications and what they ultimately deliver. In addition, cloud technologies can help agencies start getting a 360-degree view of how they interact with citizens, business partners and other agencies and even begin personalizing those experiences. In addition, software that manages, authenticates and verifies people’s credentials can ensure privacy while streamlining the customer experience. IDEA codifies the use of secure credentials across platforms and therefore will accelerate the use of trusted credentials in multiple environments so that people will be even more willing to conduct online transactions with the government.”

Read more insights from Acquia’s Vice President of Federal Sector, Peter Durand.

Why MultiCloud and Zero Trust Are Now Essential   

“The coronavirus pandemic has underscored the government’s need to offer a secure cloud environment that allows employees to access their data and applications anywhere, anytime and at virtually infinite scale. Many agencies found themselves unprepared to support the sudden move to telework in response to the pandemic. Some didn’t have enough VPNs or smart-card readers for their employees’ remote devices, for example. Google Cloud customers that were already using G Suite or Cloud Identity were able to make the transition to telework smoothly without the need for VPNs or other special technology. That was due in part to G Suite’s reliance on a zero trust architecture, which shifts access control from the network’s perimeter to individual users and devices.”

Read more insights from Google Cloud’s Director of Federal, Shannon Sullivan.

The Route to Secure, Fast Cloud Adoption

“SASE and CNAP pull together a number of different technologies and categories. But those are point-in-time definitions. Technologies evolve and their functions change over time, so rather than think about what category of product they need, agencies should focus on what they’re trying to accomplish and the business outcomes they want to achieve. Agencies should look for a platform that was built natively in the cloud. It should apply persistent protection to sensitive information no matter where it goes; offer complete visibility into data, context and user behavior across the entire environment; and take real-time action to correct policy violations and stop security threats.”

Read more insights from McAfee’s Senior Vice President of the Cloud Security Business Unit, Rajiv Gupta.

Cloud Security Considerations for DOD Mission Partners   

“Moving to the cloud requires a considerable level of effort and expense. Ensuring the security of applications or services running in a cloud adds another layer of complexity. When choosing a cloud service provider, organizations need to understand what security controls they will effectively inherit from that provider and what controls they will have to build and deploy on their own. For government agencies, FedRAMP provides a host of security levels and a robust number of security controls in a well-documented package, but Defense Department agencies also need to understand if they have any additional impact-level requirements for their applications and mission-critical data. As mission partners move to the cloud, they need to make sure that approved cloud providers can meet those baseline security and impact-level requirements.”

Read more insights from GDIT’s milCloud® 2.0 Cloud Services Portfolio Lead, Jeffrey Phelan.

The Evolution of Trusted Connections    

“Under TIC 3.0, agencies can still use network proxies, cloud access security brokers, and security information and event management (SIEM) tools to build a strong security framework, but they don’t have to run everything through a TIC. And users don’t have to struggle with increased latency and network complexity. Instead, the end-user experience is streamlined because cloud-native tools are handling processes and workloads. Agencies end up with a clean omnichannel experience for employees because their location no longer matters. Whether they are working on an iPad at home or a desktop computer at a government office, the security level and user experience are the same.”

Read more insights from Okta’s Solution Engineer, Habib Hourani.

Cloud: One Size Does Not Fit All

“Cloud is not a one-size-fits-all solution. Instead, finding the right fit depends on knowing agencies’ customers, the type of information they’re processing and their user base. Then it’s a question of aligning what the customer needs with the cloud offerings that are available. FedRAMP has been very successful at making that fit easier. The program brings transparency and consistency to the government’s use of cloud technology. Agencies know that an authorized company’s product or service has been rigorously reviewed under FedRAMP and that the government’s continuous monitoring program will provide information about how vulnerabilities are mitigated during the term of service.”

Read more insights from SAP National Security Services’ Vice President and CISO, Ted Wagner.

How Cloud Makes Telework Smarter

“Smartsheet Gov enables employees to complete tasks more easily, efficiently and securely by working with systems on an automated or integrated basis. In addition, employees can access Smartsheet from wherever they are. They can share information and the results of their work via dashboards that multiple employees can view at one time and continue that seamless collaboration with their colleagues even when everyone is working from home. Smartsheet datasets are housed in a secure, FedRAMP-authorized cloud environment, which assures agencies that they can adhere to the same security protocols from outside the office. For example, if an agency needs to conduct a yearly audit that would normally take place with all the participants at a physical location, they can do the work remotely using Smartsheet Gov to run the same playbook, the same audit and the same workflow regardless of where those employees reside. Such borderless teams can reduce costs while increasing employee satisfaction and productivity.”

Read more insights from Smartsheet’s Vice President of Security, Risk and Compliance, Ignacio Martinez.

Visibility is Essential for Cloud Security

“The nature of an agency’s mission, data protection needs and other requirements suggest that multi-cloud and hybrid environments will be the norm. As we migrate to these new locales, there is an exponential deluge of data scattered across multiple systems and endpoints. It is critical that agencies have granular visibility into all the devices, workloads and applications running across these environments so that they can gain operational and security insights. The fidelity of data is another crucial factor because without it any technology has its limits and decisions may not ensure successful outcomes. To allay any fears about security, FedRAMP, a standardized framework for security assessments, was introduced. It has grown to be the gold standard for cloud security today.”

Read more insights from Splunk’s Director of Industry Marketing for Public Sector and Education, Ashok Sankar.

How the Cloud is Redefining Security

“The Trusted Internet Connections Initiative was created in 2007 after the Office of Management and Budget conducted a study that found thousands of unprotected internet connections at agencies. Back then, we were using the internet mainly for email and web browsing, so when the government mandated that all internet traffic must go through a trusted connection, it made sense. But over the years, agencies have moved workloads to the cloud, and now employees’ activities rarely travel through an agency’s data center. As a result, TIC became a barrier to cloud adoption. The TIC 3.0 draft guidance, however, is a crucial step toward removing those obstacles.”

Read more insights from Zscaler’s Vice President of Global Government, Stephen Kovac.

 

Download the full Innovation in Government® report for more insights from these government cloud security thought leaders and additional industry research from FCW.

Creating Modern, IDEA-Compliant Citizen Experiences

Federal agencies are no longer expected to be just sources of information and services. They’re now tasked with providing digital experiences on par with those found on consumer sites. This starts with having a website compliant with the 21st Century Integrated Digital Experience Act (IDEA). It also means incorporating useful content, a personalized experience, and data management that allows non-technical stakeholders to update and maintain the site.

Among the many available resources for agencies to achieve IDEA compliance is the U.S. Web Design System (USWDS)—a library of code, tools, and guidance designed to help agency teams build fast, accessible, mobile-friendly websites. With the launch of version 2.0 in 2019, the USWDS became far more adaptable, with improved aesthetics and UX, including expressive theming that lets teams customize while remaining consistent with the system’s best-practice guidance.

Overcoming Challenges to Meet Citizen Expectations

While the USWDS goes a long way toward building better digital experiences, it doesn’t offer the capabilities needed for streamlined, responsive digital engagement. Hindered by legacy systems that lack integration capabilities, agencies often struggle with fragmented and incomplete data, which undermines operational efficiency and the ability to offer a seamless experience. Other challenges include insufficient or outdated information and the manual processing of paperwork, which creates time-consuming hassles for both citizens and government employees. Leveraging a digital experience platform (DXP) that includes USWDS code and guidance is an effective way to build a compliant government website that also delivers a relevant, personalized citizen experience.

The Powerful Addition of a DXP

A DXP provides a modern architecture for agencies to deliver connected experiences. The platform gathers actionable insights to meet today’s citizen expectations and can integrate with both modern and legacy technologies. With an open-architecture DXP, plugins can be applied to the platform to incorporate third-party libraries such as USWDS. In other words, the goals of achieving compliance and improving citizen experiences can be met with a single solution.

  • Bridge Legacy and Modern Systems

With the addition of a DXP, agencies can connect disparate systems with a wide range of tools and APIs, combining data on a single, modern digital platform. The platform also can be scaled to meet the needs of citizens today and tomorrow.

  • Deliver a More Relevant Experience

A DXP can offer the advanced capabilities of persona modeling, journey mapping, responsive layout, and data-driven design to create personalized, relevant interactions. It can learn and adapt to user behaviors and feedback to continuously fine-tune personalization across both live and self-service engagement channels.

  • Provide Ready Access to Current Content

The challenge of finding information and manually processing paperwork can also be overcome. By effectively managing the entire lifecycle of content from creation to publishing, storage, maintenance, and deletion, a DXP enables the ability to immediately provide the optimal content to both citizens and staff.

A Winning Combination

As the push toward digital transformation continues, delivering on the design and functionality expectations of citizens is critical. With Liferay DXP and the Liferay Theme Generator, federal agencies can create IDEA-compliant experiences rather than just disseminate content, delivering thoroughly modern websites. Learn more about how to create better citizen experiences with Liferay, and visit the Liferay Digital Experience Hub for more on how Liferay helps government agencies improve their digital experiences.