The Open Source Revolution in Government

Open source technology accounts for a significant portion of most modern applications, by some estimates as much as 90%, and it is the foundation of many mainstream technologies. Its strength lies in a vibrant ecosystem of developers who contribute to and continually improve the underlying code, keeping the software dynamic and responsive to changing needs. Enterprise open source software augments these community-driven projects with enterprise-grade support and scalability while retaining the innovation and flexibility of the open source development model. By offering the best of both worlds, such solutions give government a powerful set of tools for addressing its most pressing challenges.

In a recent pulse survey of FCW readers, 93% of respondents said they were using open source technology, and more than half see open source as an integral resource for strengthening cybersecurity. Those numbers reflect a positive trend toward a better understanding of open source software’s intrinsic approach to security.

The power of enterprise open source technologies lies in a combination of collaboration, transparency and industry expertise. As agencies expand their use of such technologies, they maximize their ability to achieve mission success in the most secure, agile and innovative way possible. Learn how the combined power of community-driven innovation and industry-leading technical support is expanding the government’s capacity for transformation in Carahsoft’s Innovation in Government® report.


Why Open Source is a Mission-Critical Foundation  

“Open source transforms the way agencies manage hybrid and multi-cloud environments. The most critical technology in the cloud, across all providers, is Linux. Everything is built on top of that foundation — both the infrastructure of the cloud and cloud offerings. Given the right partner, the promise of Linux is that it provides a consistent technology layer for agencies across all footprints, including multiple cloud providers, on-premises data centers and edge environments. From that foundation, agencies and their partners can build portable architectures that leverage other open source technologies. Portability gives organizations the ability to use the same architectures, underlying technologies, monitoring and security solutions, and human skills to manage mission-critical capabilities across all footprints.”

Read more insights from Christopher Smith, Vice President and General Manager of the North America Public Sector at Red Hat.


How Open Source is Expanding its Mission Reach

“The real power of open source technologies was revealed when they cracked the code on being high-powered, mission-specific, distributed systems. That’s how we are able to get insights out of data: by holding it and querying it. Today, open source innovation is being accelerated by the cloud, and the conversation is still changing, with people now demanding that their open source companies be cloud-first platforms. Along the way, the open source technologies that start in the community and then receive a boost of commercial innovation have matured. The most powerful ones are expanding their ability to address more of the government’s mission needs. They are staying interoperable and keeping the data interchange non-proprietary, which is important for government agencies.”

Read more insights from David Erickson, Senior Director of Solutions Architecture at Elastic.


The Open Source Community’s Commitment to Security  

“A central tenet of software development is visibility and traceability from start to finish so that a developer can follow the code through development, testing, building and security compliance, and then into the final production environment. Along the way, there are some key activities that boost collaboration and positive outcomes, starting with early code previews, where developers can spin up an application for stakeholders to review. Other activities include documented code reviews by peers to ensure the code is well written and efficient. In addition, DevOps components such as open source, infrastructure as code, Kubernetes as a deployment mechanism, automated testing, and better platforms and capabilities have helped developers move away from building ecosystems and instead focus on innovation.”

Read more insights from Joel Krooswyk, Federal CTO at GitLab.


The Limitless Potential of an Open Source Database

“One of the most important elements of any database migration is proper planning and due diligence to ensure a smooth and successful deployment. In addition, there are some key considerations agencies should keep in mind when moving to open source databases. It is essential to start with a clear understanding of the business case and objectives for adopting an open source approach. Agencies also need to decide how the database should function and what it should do to support their digital transformation. Then they must choose the optimal method to deploy the database.”

Read more insights from Jeremy A. Wilson, CTO of the North America Public Sector at EDB.


Modernizing Digital Services with Open Source

“A composable, open source digital experience platform (DXP) enables agencies to overcome those challenges. Open source technology is continuously contributed to by a community of developers to reflect a wide array of needs across organizations in varying industries and of varying sizes. A composable approach allows agencies to assemble a number of solutions for a fast, efficient system that is tailored to their needs. When agencies combine a composable DXP with open source technology, they have access to best-of-breed software and the ability to customize the assembly to suit their requirements. An enterprise DXP will enable agencies to achieve a 360-degree view of how constituents are engaging with their digital services and gain valuable data to understand how to enhance their experience. Finally, a composable, open source DXP provides a proactive approach to protecting against security and compliance vulnerabilities.”

Read more insights from Tami Pearlstein, Senior Product Marketing Manager at Acquia.


Creating Secure Open Source Repositories

“Protecting the software supply chain requires looking at every single thing that might come into an agency’s environment. To illustrate that level of visibility, I like to use the analogy of a refrigerator. All the ingredients necessary to make a cake or pie are in the refrigerator. We know they are of good quality, and other teams can use them instead of having to find their own. At Sonatype, our software equivalent of a refrigerator is the Nexus Repository Manager. A second aspect of our offering, called Lifecycle, allows us to evaluate the open source components in repositories at every stage of the software development life cycle. One piece of software can download a thousand other components. How do we know if one of those components is malicious?”

Read more insights from Maury Cupitt, Regional Vice President of Sales Engineering at Sonatype.
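
As a rough illustration of the visibility problem described above, the sketch below walks a Python environment's installed packages and flags any that appear on a locally maintained list of known-bad components. It is a hypothetical example with made-up package names; it does not use Sonatype's products or data, which automate this kind of evaluation at far greater scale with curated vulnerability intelligence.

```python
# Hypothetical sketch: flag installed Python packages that appear on a
# locally maintained list of known-bad components. Illustrative only.
from importlib.metadata import distributions

# Assumed example deny list: package name -> affected version prefix.
KNOWN_BAD = {
    "insecure-widget": "1.",   # hypothetical package names
    "evil-helper": "0.9",
}

def audit_installed_packages():
    """Return installed (name, version) pairs that match the deny list."""
    flagged = []
    for dist in distributions():
        name = (dist.metadata["Name"] or "").lower()
        version = dist.version or ""
        bad_prefix = KNOWN_BAD.get(name)
        if bad_prefix and version.startswith(bad_prefix):
            flagged.append((name, version))
    return flagged

if __name__ == "__main__":
    for name, version in audit_installed_packages():
        print(f"WARNING: {name} {version} is on the known-bad list")
```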


Better Data Flows for a Better Customer Experience

“A more responsive and personalized customer experience isn’t much different from the initial problem set that gave birth to Apache Kafka. When people interact with agencies, they want those agencies to know who they are and how they’ve interacted in the past. They don’t want to be asked for their Social Security number three times on the same phone call. They also expect that the information or service they receive will be the same whether they are accessing it over the phone, via a mobile app or on a website. To elevate the quality of their service, agencies must be able to stream information in a low-friction way so different systems are consistent with one another and up to date at all times, regardless of the communication channel an individual uses. President Joe Biden’s executive order on transforming the federal customer experience is based on this capability. The most successful companies across industries have figured out how to do it, and for the most part, they’ve done it with open source software.”

Read more insights from Jason Schick, General Manager of Confluent US Public Sector.
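
To make the streaming idea concrete, here is a minimal sketch using the open source confluent-kafka Python client to publish a constituent interaction event that other channels (phone, mobile app, website) could consume to stay consistent. The broker address, topic and field names are illustrative assumptions, not a Confluent-prescribed schema.

```python
# Minimal sketch: publish a constituent interaction event to Kafka so other
# channels can stay consistent. Topic and field names are assumptions.
import json
from confluent_kafka import Producer  # open source Confluent Python client

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed broker

def delivery_report(err, msg):
    """Log whether the event reached the broker."""
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Event delivered to {msg.topic()} [{msg.partition()}]")

event = {
    "constituent_id": "12345",   # hypothetical identifiers
    "channel": "phone",
    "action": "address_update",
}

producer.produce(
    "constituent-interactions",  # assumed topic name
    key=event["constituent_id"],
    value=json.dumps(event),
    callback=delivery_report,
)
producer.flush()  # block until outstanding events are delivered
```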


An Open Source Approach to Data Analytics

“For the past 40 years, agencies have used data warehouses to collect and analyze their data. Although those warehouses worked well, they were limited in what they could do. For instance, they could only handle structured data, but by some estimates, 90% of agencies’ data is unstructured and in the form of text, images, audio, video and the like. Furthermore, proprietary data warehouses can show agencies what has happened in the past but can’t predict what might happen in the future. To achieve the government’s goal of evidence-based decision-making, agencies need to be able to tap into all their data and predict what might come next.”

Read more insights from Howard Levenson, Regional Vice President at Databricks.


Download the full Innovation in Government® report for more insights from these open source thought leaders and additional industry research from FCW.

Top 10 Blog Posts of 2020

2020 was an unprecedented year, with certain trends in technology developing practically overnight. IT solutions such as cybersecurity and workflow automation became more important than ever as many across the nation began working from home. During this time, government agencies have become more adaptable, security-focused, and driven to ensure the digital experience has been, and continues to be, successful. Here’s a look back at our Top 10 Carahsoft Community Blog posts of 2020, featuring this year’s most popular IT topics.


1) IT TRENDS IN GOVERNMENT: The Cloud and Electronic Signatures

Digital experiences are at the center of most services that citizens utilize day-to-day, and throughout government they can impact access to important services, such as healthcare, food aid, and housing. In order to ensure that these services are adequately accessible to the public, proper measures must be taken to make content available across devices, adaptable for use by all users regardless of physical ability, and consistent in appearance.

The best way to achieve digital experiences that adhere to the aforementioned criteria is to utilize the appropriate technology, such as form creation software and electronic signature platforms, which are becoming increasingly prevalent. In this post, Carahsoft’s Senior Product Specialist, Ashley Weston, examines two of government’s top IT trends for achieving key digital experiences—form creation and e-signatures.


2) How Federal Agencies Can Achieve Section 508 Compliance

Technology has enabled users with visual or other impairments to more easily navigate the world around them, and government organizations are increasingly expected to abide by basic digital accessibility standards and to comply with federal requirements.

One such requirement is aimed at federal agencies, ensuring the government’s digital presence is accessible to users with disabilities. Section 508, which is part of the Rehabilitation Act of 1973, mandates that all electronic and information technology used by the federal government—including websites, social media, job application portals, and more—must be accessible to the 60 million people in the United States living with disabilities. In this post, Addteq partnered with Atlassian to explain how federal agencies can achieve Section 508 compliance.


3) Tips and Tricks to Establishing a Successful Telework Environment

As swaths of organizations in the United States are forced to shutter their workplaces in the wake of the coronavirus pandemic, unprecedented numbers of employees are conducting business as usual—from the safety of their homes. Some states have placed restrictions on nonessential businesses, and many organizations—including government contractors—have taken the initiative to encourage employees to work from home. In this post, Carahsoft’s Adobe Product Specialist discusses tips and tricks for successfully establishing a large-scale telework environment during the beginning stages of the coronavirus pandemic in the United States.


4) Evolving Kubernetes into an Enterprise Container Platform

State agencies and academic institutions are increasingly challenged to keep up with the speed of innovation while meeting stakeholder demands and expectations. By turning to container-based services, organizations enable efficient, affordable application delivery and cloud migration. Kubernetes, an open source platform, is the industry standard in container orchestration technology, but managing and running “do it yourself” Kubernetes is easier said than done. In this post, Red Hat experts explain how organizations can use container-based services to enable efficient, affordable application delivery and cloud migration.


5) Start Your Agency Off on the Best Cybersecurity Foot With Federal Frameworks

According to the SolarWinds 2019 Federal Cybersecurity survey report, threats posed by careless and malicious insiders and foreign governments are at an all-time high. The report found 56% of federal government IT leaders surveyed considered careless or untrained insiders as the most significant threat to their organizations. Fifty-two percent said foreign governments are the primary menace to their agencies.

Despite this, federal agencies surveyed believe their ability to detect and prevent insider and malicious external threats has improved over the last year. Agencies attribute this confidence to updated federal regulations and mandates that give them the ability to better manage risk as part of their overall security posture. In this post, we spoke with SolarWinds about how agencies can effectively tailor their cybersecurity frameworks.


6) 3 Reasons Federal Healthcare Agencies Need Cloud Computing

It’s been six years since U.S. healthcare providers were required to integrate medical records into electronic systems under the American Recovery and Reinvestment Act. Since then, newer mandates have continued to encourage digital data sharing and interoperability within healthcare organizations.

A natural next step in the digitization of healthcare records is storing that data in the cloud, where it can be securely accessed and updated by healthcare teams. Additionally, when paired with cutting-edge artificial intelligence and machine learning technologies, cloud computing can offer data analysis that facilitates breakthroughs in medical research and patient care. In this post, Google Cloud discusses three essential reasons cloud computing can make a difference in federal healthcare agencies.


7) How AI is Helping Government Agencies Deliver on their Missions

The Federal Data Strategy’s 2020 Action Plan, released in December, set the stage for how government agencies should prioritize data in the coming year. Since then, many agencies have taken aggressive steps to turn their data holdings into strategic assets. One area of focus has been the increased adoption of AI and machine learning technologies. In my role, I work closely with agencies and their data teams sitting on the front lines of this innovation. The early adopters who began their big data journey over the last few years are starting to see how data and predictive analytics can support their mission goals and create additional value for their stakeholders. In this post, Databricks walks us through examples of this implementation with teams across federal, state, and local agencies.


8) Creating Modern IDEA Compliant Citizen Experiences

Federal agencies are no longer expected to be just sources of information and services. They’re now tasked with providing digital experiences on par with those found on consumer sites. This starts with having a website compliant with the 21st Century Integrated Digital Experience Act (IDEA). It also means incorporating useful content, a personalized experience, and data management that allows non-technical stakeholders to update and maintain the site. In this post, Liferay’s Kale Fluharty dives deep into how to create a government-compliant citizen experience using a DXP with USWDS 2.0.


9) How Facial Recognition Can Keep Flexible Workplaces Safe

As state and federal agencies begin exploring hybrid workplace models and planning on how to keep employees safe as the COVID pandemic continues to evolve, compliance is a critical piece of the puzzle. Office reopening plans are only as successful as their implementation, and government organizations must be able to ensure that whatever precautions they put into place—from requiring masks and social distancing to keeping remote or revolving workstations secure—are effective. In this post, piXlogic’s Joseph Santucci explains ways that facial recognition can improve workplace safety, especially during the COVID era, in which employee accountability is imperative.


10) Leaders In Innovation: Identity and Access Management

Agencies have been learning the importance of identity and access management for nearly two decades, but, as with many technological evolutions, the coronavirus pandemic has encouraged adoption on an entirely new scale. As remote work became the norm, agencies adapted to use technology like smart identity cards in new ways, enabling capabilities like digital signatures. These new features are secured by the common access card (CAC) in the Department of Defense (DoD) or the Personal Identity Verification (PIV) card in the civilian environment, and all follow the principles and strategies of identity and access management. In this post, we summarized the full Leaders in Innovation report, which discussed the benefits and challenges of identity and access management.


Though this year presented its challenges, such as many companies moving entirely out of the office due to a global pandemic, government technology has evolved to expand its capabilities. During this challenging time, we’d like to thank all of our authors, contributors and readers for their support within our community. We’re pleased to continue growing our blog and expanding our content, and we look forward to bringing you even more in 2021.

Thanks for checking out our top 10 Community Blog posts for 2020! Come back soon to read our upcoming series on public sector IT trends that will be mission-critical in 2021 – we will be taking a deeper look at Workflow Automation, Artificial Intelligence and Machine Learning, Cybersecurity and Multicloud Technology.

How AI is Helping Government Agencies Deliver on their Missions

The Federal Data Strategy’s 2020 Action Plan, released in December, set the stage for how government agencies should prioritize data in the coming year. Since then, many agencies have taken aggressive steps to turn their data holdings into strategic assets. One area of focus has been the increased adoption of AI and machine learning technologies. In my role, I work closely with agencies and their data teams sitting on the front lines of this innovation. The early adopters who began their big data journey over the last few years are starting to see how data and predictive analytics can support their mission goals and create additional value for their stakeholders. Here are a few examples from working with teams across federal, state, and local agencies.

How AI is helping the Government

Public Health

COVID has shined a light on the dire need for population-scale predictive analytics. We are working with large public health entities to use AI to predict disease spread and adapt their response to the pandemic in real time. By empowering epidemiologists to analyze different public data holdings at scale, these agencies have built disease surveillance systems that can predict a 7-day rolling average for case positivity or test rates. This is critical for influencing when and where to roll out or roll back shutdown orders. We’ve even seen examples of this within the military, where similar machine learning models are being used to predict COVID outbreaks among servicemen. This is critical when determining the mission readiness of their resources.
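
As a simple illustration of the rolling-average surveillance metric described above, a 7-day rolling case-positivity rate can be computed from daily test counts with pandas. The column names and figures below are made-up stand-ins, not real surveillance data or the agencies' actual pipeline.

```python
# Sketch: compute a 7-day rolling case-positivity rate from daily test data.
# Column names and figures are illustrative, not real surveillance data.
import pandas as pd

daily = pd.DataFrame(
    {
        "date": pd.date_range("2020-11-01", periods=10, freq="D"),
        "tests": [500, 520, 480, 610, 590, 550, 530, 600, 620, 580],
        "positives": [40, 45, 38, 55, 60, 52, 49, 66, 70, 63],
    }
).set_index("date")

# Positivity = rolling positives / rolling tests over a 7-day window.
rolling = daily.rolling("7D").sum()
daily["positivity_7d"] = rolling["positives"] / rolling["tests"]

print(daily[["positivity_7d"]].round(3))
```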

One specific example is our work with the Medical University of South Carolina. They are leaders in providing telehealth services in the US. Chatbots on their website help triage potential COVID patients. They applied predictive analytics to thousands of chatbot records to determine which patients are high-risk and require early intervention. This helps expedite patient flows while getting the right intervention to the right person, ultimately saving lives.

Military and Defense

The US Air Force recently spoke at our Spark + AI Conference on how they are using AI to predict part failures across their fleet of jets. They examine wear patterns and flight data to identify when a part might fail. Early identification enables staff to stock the right part at the right time and repair that plane before it must be pulled out of commission. These types of innovations can result in millions of dollars of savings and improved efficiencies across the supply chain.
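
The general technique can be sketched as a binary classification problem: train a model on historical wear features labeled with whether a part later failed, then score current readings to prioritize maintenance. The example below uses synthetic data and scikit-learn; it is an illustration of the approach, not the Air Force's actual models or data.

```python
# Sketch of predictive maintenance as binary classification.
# Features, labels and data are synthetic stand-ins, not Air Force data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic features: e.g., hours since overhaul, vibration level, temperature.
X = rng.normal(size=(1000, 3))
# Synthetic label: parts with high combined wear are more likely to fail.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Probability of failure for each part in the test set; high scores would be
# flagged so the right spare part is stocked before the aircraft is grounded.
failure_risk = model.predict_proba(X_test)[:, 1]
print("Mean predicted failure risk:", failure_risk.mean().round(3))
```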

Emergency Services

The Los Angeles City Fire Department recently joined us for a panel discussion on the use of big data. They shared how they are using AI for smart routing and optimal placement of emergency resources. Ambulances, fire engines, and other vehicles and emergency response staff have different capabilities and skill sets. When an emergency occurs, the department needs to position the right resources as close to the incident as possible so it can respond quickly and efficiently. Machine learning can help predict where to best deploy emergency response resources based on historic patterns to ensure the fastest response time possible.

Public Transportation

There are many examples of AI being applied to public transit. The US Department of Transportation’s SWIM project for the FAA uses advanced analytics to optimize the routes of airplanes and make them more efficient. At a local level, machine learning is being applied to real-time data feeds from public transit vehicles to improve city bus routes and aid in road planning. By predicting traffic patterns and accident hot spots, transit agencies can optimize their transit ecosystem to better meet citizen needs.

Being successful with AI

With all these great examples, the question is not whether to invest in AI but how to do it successfully. It starts by federating all your data. AI algorithms are only as good as the data they are fed, and with data assets sitting in siloed source systems and data warehouses, agencies need to consider a new approach to enabling analytics at scale.

The good news: powerful cloud-based solutions provide agencies with the tooling and scale they need to bring together vast amounts of structured and unstructured data, process that data efficiently in the cloud, and analyze it with advanced analytics and AI tools. By bringing your data together in one place with the scale of the cloud, you can answer critical questions not in days or weeks but in a matter of minutes, dramatically impacting the success of your mission.
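
As a minimal sketch of that "bring the data together" step, PySpark can combine a structured table with unstructured text in a single job. The storage paths, table and column names below are assumptions for illustration, not a specific agency's data model.

```python
# Sketch: combine structured and unstructured data in one Spark job.
# Paths, table and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("unified-analytics-sketch").getOrCreate()

# Structured data: service requests exported from a transactional system.
requests = spark.read.parquet("s3://agency-bucket/service_requests/")

# Unstructured data: free-text call-center notes, one document per line.
notes = spark.read.text("s3://agency-bucket/call_center_notes/")

# Simple signal from the unstructured side: how many notes mention "delay".
delay_mentions = notes.filter(F.lower(F.col("value")).contains("delay")).count()

# Aggregate the structured side: open requests per program.
open_by_program = (
    requests.filter(F.col("status") == "open")
    .groupBy("program")
    .count()
    .orderBy(F.desc("count"))
)

open_by_program.show()
print(f"Call-center notes mentioning 'delay': {delay_mentions}")
```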

But getting the data together is only half the problem. Agencies need to consider how to best enable diverse mission teams to collaborate on these data sets. Databricks helps agencies overcome these challenges with a collaborative environment for analytics and AI that allows data engineers to prepare data, analysts and data scientists to build analytics and machine learning models, and functional teams to review findings through published dashboards, all within one platform.

Whether or not Databricks is the solution your agency uses, success with AI hinges on adopting a unified approach to data analytics and AI that brings together diverse sets of data, best-of-breed analytics tools and people from across the agency to drive value from these critical data assets.

Get Started on Your AI Journey