Top 5 Unemployment Fraud Trends

The economic fallout from the COVID-19 pandemic has created a perfect storm for unemployment fraud, exacerbated by pressure on state agencies to deliver benefits quickly and by inadequate anti-fraud infrastructure within those agencies. Fortunately, there is a clear path forward to combat unemployment fraud.

Here are the top five recent trends in unemployment fraud:

Fraud is easier

Pressure on state agencies to provide monetary relief for families, along with a steep increase in claim volume, has made it easier to succeed at unemployment fraud. The volume of claims alone helps to conceal fraudulent activities. Fraudsters have particularly targeted states without an income tax since those states cannot verify identities with tax records. Many states only learn about fraud when notified by citizens who have discovered fraudulent claims filed in their names.

Some states have slowed claims payments so they can verify information before paying claims. But this delays benefit payments to families in need and adds to their frustration. States are better served by adopting technology to detect and prevent fraud in real time.

Stolen identities are common

The easiest and most frequent way to commit unemployment fraud is with stolen identities. Massive data breaches in 2015, 2017, and 2019 at credit bureaus, healthcare providers, retailers, and credit card companies have compromised the Social Security numbers of virtually every American. A plethora of false identities is available, and they can easily be purchased on the dark web. Online tutorials explain the process of filing a false unemployment claim.

After amassing a list of stolen identities, fraudsters start trying to open new accounts and file unemployment claims. They often use stolen personal data belonging to newborns, the recently deceased, prisoners, or even people who are still employed. Fraudsters also assemble “synthetic identities” by combining information from different individuals to create a false person.

Faking an address

During the unemployment application process, individuals must provide an address. Using real addresses of the victims of identity theft would be too dangerous. Instead, fraudsters list addresses for vacant buildings, frequently filing hundreds of applications with the identical physical address.

CBS Los Angeles discovered that empty mansions for sale often had hundreds or thousands of fraudulent unemployment claims listing them as the physical address. In some cases, illicit couriers visit the properties to pick up debit cards loaded with unemployment benefits.
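As a sketch of how this pattern can be caught, the following Python groups claims by normalized address and flags any address attached to an unusual number of filings. The field names and the threshold are illustrative assumptions, not taken from any state system:

```python
from collections import Counter

def flag_suspicious_addresses(claims, threshold=5):
    """Flag physical addresses shared by an unusual number of claims.

    `claims` is a list of dicts with a hypothetical "address" field;
    `threshold` is an illustrative cutoff, tuned in practice.
    """
    def normalize(addr):
        # Collapse case and whitespace so trivial variations still match.
        return " ".join(addr.lower().split())

    counts = Counter(normalize(c["address"]) for c in claims)
    return {addr for addr, n in counts.items() if n >= threshold}

claims = [{"address": "123 Vacant Mansion Dr"}] * 6 + [{"address": "9 Oak St"}]
print(flag_suspicious_addresses(claims))  # {'123 vacant mansion dr'}
```

A production system would normalize addresses far more aggressively (abbreviations, unit numbers, geocoding), but the core signal is the same: many claims, one address.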

Copy and paste

Fraudsters paste information roughly ten times more frequently than legitimate users. They also tend to open their web browsers only on a portion of the available screen space. The rest of the screen is occupied by a text file to allow copying and pasting. Most applicants don’t copy and paste their first and last names into online forms—unless they’re trying to open hundreds of unemployment claims in other people’s names.
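A minimal sketch of how a paste-heavy session might be scored, assuming hypothetical form telemetry that records each field interaction as typed or pasted:

```python
def paste_risk_score(events, baseline_paste_rate=0.02, multiplier_threshold=10):
    """Score a form session by how often fields were pasted vs. typed.

    `events` is a list of ("type" | "paste", field_name) tuples, a
    stand-in for telemetry a real page would collect. Sessions pasting
    roughly 10x more than the baseline rate are flagged; both numbers
    here are illustrative.
    """
    if not events:
        return False, 0.0
    pastes = sum(1 for kind, _ in events if kind == "paste")
    rate = pastes / len(events)
    return rate >= baseline_paste_rate * multiplier_threshold, rate

events = [("paste", "first_name"), ("paste", "last_name"),
          ("type", "email"), ("paste", "ssn")]
flagged, rate = paste_risk_score(events)
print(flagged, rate)  # True 0.75
```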

Fraudsters love to hide

States are overwhelmed just handling unemployment claims and rarely have resources to investigate the inconsistencies that might indicate fraud. Fraudsters use a variety of techniques to avoid detection. They often use VPNs and cloud infrastructure to conceal their identities—as well as rotating their IP addresses and user agents. However, when they do this, their devices’ time zones frequently don’t coincide with the geolocation for their IP address.
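The timezone check described above can be sketched as follows; the IP-derived offset is assumed to come from a GeoIP lookup, which is not shown, and the tolerance value is illustrative:

```python
def timezone_mismatch(device_utc_offset_min, ip_timezone_offset_min,
                      tolerance_min=60):
    """Return True when the browser-reported UTC offset disagrees with
    the offset implied by IP geolocation.

    Both offsets are minutes east of UTC. The lookup producing
    `ip_timezone_offset_min` (e.g. a GeoIP database) is assumed.
    """
    return abs(device_utc_offset_min - ip_timezone_offset_min) > tolerance_min

# Device claims US Eastern (-300) but the VPN exit node geolocates to
# Central European Time (+60): a six-hour gap, well past tolerance.
print(timezone_mismatch(-300, 60))    # True
print(timezone_mismatch(-300, -300))  # False
```

On its own this signal is noisy (travelers trip it too), so it is typically combined with other indicators rather than used as a hard block.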

In addition, fraudsters tend to use familiar devices. Research shows the same devices accessing a large number of unemployment accounts. It isn’t unusual for a single device to be used to access more than 20 fraudulent accounts. (By comparison, most devices access no more than three accounts.)
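A simple sketch of the device-reuse signal, assuming a device fingerprint is already available for each login (the fingerprinting scheme itself is out of scope here, and the cutoff mirrors the 20-account figure above):

```python
from collections import defaultdict

def devices_over_account_limit(logins, max_accounts=20):
    """Find device fingerprints tied to an unusual number of accounts.

    `logins` is an iterable of (device_fingerprint, account_id) pairs.
    Returns a mapping of offending device -> distinct account count.
    """
    seen = defaultdict(set)
    for device, account in logins:
        seen[device].add(account)
    return {d: len(a) for d, a in seen.items() if len(a) > max_accounts}

logins = [("dev-1", f"acct-{i}") for i in range(25)] + [("dev-2", "acct-0")]
print(devices_over_account_limit(logins))  # {'dev-1': 25}
```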

The pandemic and accompanying economic turmoil continue to create huge challenges. Unfortunately, fraudsters have quickly capitalized on the confusion to take advantage of benefits earmarked for those who really need them. Education about unemployment fraud allows technological solutions to detect and stop it. This can decrease fraud losses and ensure that states successfully direct those funds to the right recipients.

View our resource for more information on how F5 enables State Government Agencies to fight fraudulent claims.

Driving Cloud Computing and Telework Change

 

The Coronavirus pandemic has spurred noticeable changes in the way individuals within most organizations collaborate with colleagues, interface with the public, and get their day-to-day business done. Many have transitioned to telework, brought on virtual training platforms, and secured tools for managing or signing documents electronically. While concerns for employee safety certainly accelerated modernization efforts, especially in the government sphere, the underlying push to modernize and adopt more cloud infrastructure continues in full force.

The New Administration Sets IT and Cybersecurity Plans

As is customary to accompany a transfer of power, President Joe Biden announced his planned initiatives, which include making “federal IT modernization and cybersecurity top priorities during the early days of his administration — second only to COVID-19 response, it seems.” Biden’s transition team indicated that the administration “will provide emergency funding to upgrade federal information technology infrastructure and address the recent breaches of federal government data systems,” as “this is an urgent national security issue that cannot wait.” In short, U.S. cybersecurity capabilities must be strengthened to prevent breaches and avert a potential crisis on top of a pandemic that has already affected 25 million Americans (Billy Mitchell).

CISA Recommendations for Remote Work

With no immediate safe end to remote work for those who are able to do so, the Cybersecurity and Infrastructure Security Agency is also warning that poor cyber standards can put an organization at risk for a major attack. As such, an analysis report released on January 13th by CISA outlines security practices. The report indicates, “These types of attacks frequently occurred when victim organizations’ employees worked remotely and used a mixture of corporate laptops and personal devices to access their respective cloud services.” It is therefore recommended that organizations establish a solid baseline for remote work and use platforms that allow for customized security settings (Sara Wilson).

IT Infrastructure Requirements

Major American government agencies are steadily stepping up to modernize, but the requirements for cloud solutions are strict. Platforms must adhere to compliance frameworks such as SOC or achieve certifications such as FedRAMP, and tech companies are answering the call. In turn, by leveraging VPNs, setting up servers, and accessing cloud services, agencies from the state and local level to the federal level are seeing the benefits of these solutions in remote work. Navy Vice Adm. Nancy A. Norton, Defense Information Systems Agency director, says, “The COVID-19 emergency drove us to enhance our telework tools for our workforce” and from here on out, “I think the world has probably recognized the value of telework and the ease at which we can telework.” DISA, which is on the digital frontlines in cyber and information technology, “enabled a more than 1,000% increase in telework connections for joint mission partners around the globe” through VPN functionality (David Vergun).

Modernizing Form Workflows

Aside from the infrastructure changes that remote work entails, it also drives changes to internal and external workflows, particularly those involving forms and signatures. With a large portion of the workforce operating remotely and citizens continuing to require public services, end-to-end solutions that allow for hosting documents, signing electronically, and tracking from start to finish become essential. Agencies are encouraged to secure their internal operations and make their public-facing operations fully digital and accessible across devices. The pandemic has quickened the call to action for government entities to modernize, but the efficiency of telework and the ever-present threat of cyber-attacks indicate that this trend will continue.

From top government organizations ramping up their IT infrastructure, to state and local governments adopting hosted solutions for day-to-day operations, the overall success of remote work that the pandemic necessitated shows that “we have learned how to work with our workforce in ways that we never did before,” says Norton, “and I think this is something that […] is going to continue.”

Looking to accelerate document processes within your agency? Discover how Adobe is paving the way for a digital document revolution in our 8-part webinar series, Integrated Paperless Processes From Start to Finish!

Best of What’s New In Law Enforcement

In July, USA Today reported that the combination of pandemic-induced economic woes and the national movement to “defund the police” could lead to the biggest budget cuts for law enforcement agencies since the Great Recession of 2008. For police departments facing growing demands and tightening budgets, using technology to increase the impact of existing staff and resources will be a game changer. Luckily, autonomous technologies, better connectivity, and more sophisticated video and surveillance analytics tools are available to fill in the gaps. Read the latest insights from industry thought leaders in law enforcement in Carahsoft’s Innovation in Government® report.

 

Managing Cyber Exposure in Law Enforcement

“A law enforcement agency can face a variety of issues. It may need to address issues related to who has access to what information based on their role. It may need to segment its network — for example, to separate CJIS lookups from other areas that are open to the public. Law enforcement organizations may also be connected to other municipal departments such as the Department of Public Works or even other departments outside the municipality. Addressing these potential attack vectors requires security expertise, which in many cases is not on the agency’s priority list or in its budget. As a result, these agencies become even more susceptible to attack.”

Read more insights from Tenable’s Senior Director of Marketing, Michael Rothschild.

 

Using Blockchain Analysis to Fight Crime

“It comes down to having the right data and making it actionable. Specifically, law enforcement should be interested in a partner with data attributing services, which attribute addresses to the clusters — that is, the entities — that control them. In this case, those would be clusters associated with criminal activity and their cashout points. The historical data behind this capability is an important differentiator. Chainalysis is the only company that has systematically collected information that links real-world entities to blockchain transactions since 2014. This allows the software to accurately distinguish different clusters of entities and attribute more data than can be seen on the blockchain.”

Read more insights from Chainalysis’s Director of Market Development, Don Spies.

 

Cloud: The IT Force Multiplier

“Storing, managing and effectively using an ever-increasing volume of digital data presents multiple challenges. Buying and maintaining hardware for data storage is expensive and challenging and diverts resources from the core mission of public safety. Then, agencies must manage stored data so it is discoverable, retrievable and in compliance with legally mandated retention policies. Without a sound digital evidence management solution and automated life cycle retention solutions, data management is nearly impossible. Finally, because data is produced in multiple systems, integrating and normalizing that data so it can be searched, analyzed and shared is challenging. Without a strong data management approach and systems, agencies must access multiple systems to discover data that is in different formats, making it very difficult to integrate and gain insights from that information.”

Read more insights from Amazon Web Services’ Public Strategy Lead, Ryan Reynolds.

 

Supporting the Law Enforcement Community During COVID-19 and Beyond

“COVID-19 created an unprecedented urgency for state, county and municipal workers to operate remotely whenever possible. This caught many agencies by surprise. Although these organizations moved with commendable speed to equip staff to work from home, the needs of the public only increased. Law enforcement agencies had to quickly adapt to the dangers of a pandemic amid calls for police reforms. These officials had to balance protecting the public, themselves and their colleagues in an ever-changing environment. Many departments have come to appreciate how technology enabled them to address these critical priorities.”

Read more insights from the Director of the Law Enforcement Team at Carahsoft, Lacey Wean.

 

Technology is Key to More Efficient and Effective Law Enforcement

“The pandemic decreased proactive activities. There are fewer cases where an officer might stop you for speeding 10 mph over the speed limit, for example. Departments have to weigh whether it’s worth the risk to stop a car to issue a traffic ticket and potentially be exposed to COVID-19, or to reserve their exposure time for things that are a matter of life or death. The impact of that is reduced revenue generation. COVID-19 also impacted morale. More law enforcement personnel have died from COVID-19 this year than have died in the line of duty. That impacts a police department and its morale — people work longer shifts, and health often suffers.”

Read more insights from the former Senior Adviser for the U.S. State Department’s Antiterrorism Assistance Program and Senior Law Enforcement Adviser for the 2012 Republican National Convention, Morgan Wright.

 

Download the full Innovation in Government® report for more insights from these law enforcement thought leaders and additional industry research from GovTech.

Cloud’s Burning Questions: Hybrid, MultiCloud, Emerging Tech, and More

Over the past year, there has been a major shift in federal agency policy from Cloud First to Cloud Smart. Even with the Cloud First mandate, many workloads remain on-premises. Now, agencies want to be smart about moving to the cloud.

Moving to the Cloud

One of the biggest stumbling blocks in transitioning from on-premises to off-premises services is knowing which applications to move first. The Federal CIO Council’s Application Rationalization Playbook encourages CIOs to consider both business value and technical fit when making such decisions, since the best candidates are applications that score high on both. However, you should also consider applications with high business value but low technical fit, such as an old monolithic application that needs to be modernized.
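The quadrant logic behind that guidance can be sketched as a simple bucketing function. The scores, thresholds, and application names below are illustrative assumptions, not taken from the playbook itself:

```python
def rationalize(apps, threshold=0.5):
    """Bucket applications by business value vs. technical fit,
    in the spirit of a rationalization quadrant.

    `apps` holds (name, business_value, technical_fit) tuples with
    scores in [0, 1]; everything here is a hypothetical example.
    """
    buckets = {"migrate_first": [], "modernize_then_migrate": [],
               "low_priority": []}
    for name, value, fit in apps:
        if value >= threshold and fit >= threshold:
            buckets["migrate_first"].append(name)           # high value, high fit
        elif value >= threshold:
            buckets["modernize_then_migrate"].append(name)  # high value, low fit
        else:
            buckets["low_priority"].append(name)
    return buckets

apps = [("benefits-portal", 0.9, 0.8),
        ("legacy-monolith", 0.9, 0.2),
        ("old-intranet", 0.2, 0.3)]
print(rationalize(apps))
```

The point of the sketch is the decision structure, not the scoring: real assessments weigh many criteria per axis before an application lands in a bucket.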

In addition, COVID may have reordered your agency’s priorities, so reexamine your existing plans. If you rationalized your portfolio earlier, take another look. The unemployment modernization effort you had on the back burner may need to move to the front burner while other initiatives are pulled back.

Securing the Cloud

Many people still think that the cloud cannot possibly be secure. The reality is that the scale of cloud providers’ systems, and the number of security professionals behind them, helps ensure security. But agencies must keep their security posture consistent, whether workloads run on-premises or in a public cloud. Automating security enforcement provides that consistency, ensuring you are not creating holes in one environment while another remains secure.

Embracing Open Source

Government agencies can be wary of open source applications — but a great idea is a great idea no matter where it comes from, and open source is a great way to share best practices with a community. For one example… [think about] all the taxpayer money that has been spent on locking down a web server running Red Hat Enterprise Linux over and over again in the government. A lot of the “authority to operate” (ATO) paperwork hasn’t been reused at all.

Wouldn’t it be great if that paperwork were available so other agencies could not only use it, but improve upon it and make the security even stronger? That’s what [Red Hat’s] Compliance as Code project is, which allows people to get that ATO a lot faster and for a fraction of the cost – and that’s all thanks to open source.

Transitioning from Proprietary to Open Source

Agencies expect the divide between proprietary and open source to be more binary than it is. You don’t have to go all open source or all proprietary. Instead, pick the blend that works for you. For example, you can run a proprietary database on an open source operating system on a proprietary hypervisor. Agencies can do the same by deciding where to standardize, where in the stack to sit, and where to lay that open substrate.

Do you want it at the operating system level? At the Red Hat Enterprise Linux level, you could span on-premises data centers, the public cloud, and multiple cloud vendors. Or do you want to go higher up the stack, say at the Platform-as-a-Service layer, where you use OpenShift and Kubernetes? That allows further abstraction and more focus on the actual mission applications themselves. The important thing is making the move to the cloud a conscious decision.

Achieving Success in the Cloud

U.S. Citizenship and Immigration Services has taken its legacy monolithic applications and broken them down into containerized microservices on top of OpenShift, which can run in the public cloud or on-premises; the portability is built in.

But the agency did not just lift and shift the application. It also looked at people and processes — like moving from a waterfall model to agile and DevOps. Changing those processes — adding security and shifting it left so it sits with developers and operations teams instead of being an afterthought — helped foster a strong culture that encourages employees to focus on the mission.

Visit our website to learn more about the GovForward: Multicloud Series and FedRAMP through our additional resources.

The State of Artificial Intelligence in Government

Government agencies have been discussing artificial intelligence (AI) for more than a decade, and as technology and legislation progress, the focus on public sector impacts is stronger than ever. A 2019 executive order highlights American leadership in AI as key to maintaining the economic and national security of the United States. The Trump administration has also issued regulatory guidance on AI, instructing all federal agencies to prioritize and allocate funding for AI programs that serve their individual missions. Numerous national agencies and even multinational partnerships have identified AI as a priority. Because AI can emulate aspects of human intelligence, it could impact every corner of society, from cybersecurity to medicine. To learn more about how your agency can use AI to analyze data, recognize patterns and automate manual tasks, get up to date with The State of AI in Government, a guide created by GovLoop and Carahsoft featuring insights from the following technology and government AI thought leaders.

 

AI Requires a New Approach to High-Performance Computing

“High-performance computing (HPC) needs to evolve. The traditional HPC architecture, now decades old, worked well for previous generations of HPC applications. But today’s applications, driven by AI, require a new approach. The problem? The old systems were too static. That wasn’t a problem when applications had static performance requirements. But AI is different. When developing an AI system, the workload changes from one stage of the process to another.”

Read more insights from Liqid’s Public Sector Chief Technology Officer, Matt Demas, and Director of Sales, Eric Oberhofer.

 

Bring AI to the Edge

“Legacy computing structures always glued data scientists to data centers. The two were tethered together, meaning scientists couldn’t work where the data didn’t reside, much like how a lab scientist needs their lab chemicals and instruments. Data science, however, is not entirely like lab science, because endless inputs come outside of a controlled environment. AI models are most effective when exposed to open air. The solution is to bring software-based applications to the edge, except for massive data projects.”

Read more insights from HPE’s Defense Department Account Team Technologist, Jeff Winterich, and Red Hat’s Public Sector Staff Solutions Architect, Ryan Kraus.

 

3 Ways Cloud Improves AI

“Cloud-based AI can help agencies move faster. During the pandemic, it has. One example is automating document workflows so that AI replaces manual data entry and extracts metadata to enhance search capabilities. As a result, AI speeds up timelines for constituents. Without having to wait on employees to manually enter data or respond to simple queries, citizens receive the front-facing information and services they need faster. Agencies can build AI faster in the cloud, too. Developers access capabilities through simple application programming interfaces, so they don’t have to build or integrate models from scratch. Cloud services like Amazon SageMaker remove the busywork and infrastructure so that data science teams are more productive and efficient when rolling out [machine learning].”

Read more insights from AWS’s Tech Business Development Manager of AI and ML for the Worldwide Public Sector, Joe Pringle.

 

How AI Demands a New Vision of the Data Center

“Technology originally developed to improve PC-based gaming and multimedia applications nearly 30 years ago is now driving advances in the field of artificial intelligence. In the early 1990s, when PC gaming was beginning to take off, the Graphics Processing Unit (GPU) was invented by NVIDIA to render an image by breaking it up into multiple tasks that could be executed in parallel. Today, the same approach accelerates processing for a wide range of applications, not just on PCs but also on the world’s fastest computers.”

Read more insights from NVIDIA’s Vice President of GPU Data Center Architecture, Curt Smith.

 

DoD’s Battle Against COVID-19, With AI at the Helm

“When you’re talking about a domestic threat like COVID-19, for us to, for instance, predict how COVID-19 is going to be affecting a certain military installation, you might need data from things that would be nontraditional DoD data. So, you might need data from CDC, [or] from Department of Labor when it comes to unemployment. So, these sorts of datasets I think are really hard for the DoD to have, because they’re not traditional military data. But at the same time, for us to do accurate modeling, we do need datasets like that. So, this project had a lot more sort of rigorous policy review for data, more so than a project like predictive maintenance, for instance.”

Read more insights from Chief of Policy at the Department of Defense’s Joint Artificial Intelligence Center, Sunmin Kim.

 

Using AI to Improve Veteran Care and Save Lives

“It’s been an amazing journey from a veterans’ experience perspective. The Veterans Experience Office came out of the crisis of Phoenix, when there were the issues with the lists of appointments and veterans were not getting timely appointments – and the data was showing things differently. We did not have the customer datasets. We had a lot of operational data, we had a lot of financial data, but we did not have necessarily the data for [customers]. And I think that from the customer perspective, I think that’s a key aspect with AI. You can’t have AI if you don’t have the right data in place … and that’s something the VA has been very diligently working on.”

Read more insights from Department of Veterans Affairs Chief of Staff at the time of the interview, Lee Becker; Director of Enterprise Measurement, Anil Tilbe; and Acting Executive Director of Multichannel Technologies, Laura Prietula.

 

Improving Public Health Through AI

“Traditionally, public health plays the role of a data aggregator. We’re collecting large volumes of information because we’re interested in understanding how often illnesses or injuries occur, not just at an individual level, but across entire communities or entire populations as a country at large. And we use that information to try to understand why those diseases or injuries occur, and then we use that to take action that will allow us to address really significant threats to the public health at their source. AI can play a role at many different places in that information chain.”

Read more insights from the Centers for Disease Control and Prevention’s Entrepreneur in Residence, Paula Braun.

 

Download the full GovLoop Guide for more insights from these artificial intelligence thought leaders and additional interviews, historical perspectives and industry research on the future of AI.

Making the Most of Content Discovery Software

Images and videos have long been key to critical workflows carried out by government and law enforcement organizations. Detecting, recognizing, matching, correlating, and understanding what is in a picture or a video sequence are areas where automated software solutions can save time and improve quality.

By necessity, analysts and investigators need to be creative in their work, often generating and following leads from different angles and points of view. While “point solutions” such as facial recognition are available, more is needed to truly facilitate investigative work. What is required is software that integrates not only facial recognition but also object recognition, text recognition, automatic labeling, visual search, and other related computer vision and AI technologies, all working together to provide a multifaceted view of the data.

Search and Discovery

Investigators, analysts, and knowledge workers often have to make sense of, and discover patterns in, data that comes from many different sources. Bits of information from structured and unstructured data can then form the basis for actionable conclusions, strengthened by supporting evidence. When the data comes from images and videos — whether archived or collected in real time — the user is often unsure of exactly what they are searching for or trying to discover. They are looking for patterns that lead to information pertinent to the task at hand, but it is hard to know in advance what information will be needed to identify those patterns.

In many cases, software automation tools on the market today focus on “matching” things, such as faces. Some extend the capability to cover “search,” such as matching a face but only under user-defined selective conditions. Even fewer tools support “discovery” functions—using other clues to help figure out which face the user should be interested in searching for and eventually matching to. All of these capabilities need to be packaged in a productive interface that lets users effectively communicate with the software to express their search intent. Further, the software is required to work at scale.

Data Types and Technologies

The ability to detect and recognize faces is a helpful feature in many applications. Capabilities of facial recognition technology have grown rapidly over the past few years, ranging from the ability to identify people in passports or access control situations to identification at scale or in more general settings, including, more recently, the ability to identify people wearing COVID-style face masks. Requirements continue to evolve, but there are at least some vendors pushing the state of the art in this regard.

Recognizing objects within images or video has improved in recent years through the application of deep learning techniques. This technology is most successful when applied to generic classification tasks such as detecting the presence of a “cat,” a “person,” or a “car.” Unfortunately, the technique requires significant quantities of pre-labeled data, which can be expensive and time consuming to acquire, and in some use cases may not be possible to obtain. Due to these and other technical reasons, commercial suppliers of object detection services on the web tend to limit their offerings to a few hundred common, recognizable object classes.

Many use cases require higher levels of specificity, such as a search for a specific type of car, not just any car. This specificity requirement is important in the context of discovery. Further, the scope of things or objects that a user may be interested in is far greater than a few hundred or a few thousand, and often it is not possible to know beforehand what will be of interest. Visual search technologies are required to bridge the gap and support these use cases. This capability lets users search for things that are seen but for which a name is not available, such as an image of a person whose face is completely obscured—visual search can still find useful information in the image, such as matches to the clothes the person is wearing. There are very few vendors that provide solutions in this space.

The ability to recognize a string of text that appears in the image is also crucial to the search and discovery process. Text can often add specificity to the formulation of a search query, such as searching for a car with a certain set of characters in the license plate. Text recognition and partial-string text search is often used together with other search module types.
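As an illustration of partial-string matching over recognized text, the sketch below searches simplified OCR output for a plate fragment. Both the data shape and the matching rule are assumptions for the example; real OCR output would also carry confidences and bounding boxes:

```python
def partial_plate_search(detections, fragment):
    """Match a partial license-plate string against OCR output.

    `detections` maps a hypothetical image/frame ID to the list of
    text strings recognized in it. Matching ignores case and spaces.
    """
    fragment = fragment.upper().replace(" ", "")
    hits = []
    for frame_id, texts in detections.items():
        for text in texts:
            if fragment in text.upper().replace(" ", ""):
                hits.append((frame_id, text))
    return hits

detections = {"cam1-f042": ["7ABC 123", "STOP"],
              "cam2-f007": ["8XYZ 994"]}
print(partial_plate_search(detections, "abc 1"))  # [('cam1-f042', '7ABC 123')]
```

In an integrated system, a hit list like this would then feed the other modules, narrowing a face or vehicle search to the frames where the partial plate appears.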

An Integrated Package Approach

The integration of the above technologies in a productive user interface provides significant benefits to the work done by investigators and analysts. The goal is to let them discover more, faster, and more thoroughly. Integrating the capabilities means that in the same environment a user can operate on visual data from multiple points of view. Looking down the road, an integrated approach will enable new functionalities, including predictive analytics. These tools will let users identify patterns that would otherwise be exceedingly difficult for humans to spot, and thereby enable proactive responses.

Visit our website to learn more about an integrated content discovery software package.

Best of What’s New in Cloud Computing

This may be a make-or-break moment for jurisdictions newly converted to the cloud. As state and local governments scrambled to respond to new COVID-driven requirements, cloud-based contact center platforms, chatbots and web portals helped multiple states and localities quickly scale capacity for unemployment insurance and social services programs. In addition, cloud-hosted video collaboration platforms helped agencies shift employees to remote work on the fly and virtualize public meetings. IT leaders must now evaluate and rationalize the multiple cloud solutions they adopted so quickly. Now is also the time to look at cost optimization for cloud solutions. The COVID response has showcased real-world benefits of the cloud — and that experience is likely to accelerate a trend that was already underway as governments focus more attention on modernizing old systems and applications in the wake of the pandemic. Read the latest insights from industry thought leaders in cloud in Carahsoft’s Innovation in Government® report.

 

Cloud Migration as a Path to Modernization

“While there may be an increase in initial costs associated with modernizing legacy technology, the economics strongly indicate that maintaining dated infrastructure is more expensive in the long term. The biggest hurdle organizations face when migrating to the cloud is unpredictable costs. The cloud offers tools and resources to optimize investments and plan for the costs associated with migration. In addition, properly planning your move to the cloud helps agencies accurately budget for such a transition. When they do this correctly with the guidance of a strong partner, state and local governments see significant cost savings.”

Read more insights from the Partner Development Manager for Carahsoft’s AWS Team, Sehar Wahla, and the Sales Director for Carahsoft’s AWS Team, Tina Chiao.

 

How Does Evolving Cloud Adoption Impact Security?

“One approach is to standardize processes — think NIST or MITRE — so you have a common framework and language for measuring things like risk and attacks. That helps normalize the differences between cloud and traditional security so security teams can better understand what a risk actually means in a cloud environment. On the technology side, traditional threat profiling needs to move beyond the viruses and ransomware conversation and move toward user and entity behavior management, which looks at how users normally access and use an application. Organizations also need to articulate how separate applications securely exchange data for things like enterprise analytics. This is a nascent use case, but it has implications for critical systems where data integrity is important.”

Read more insights from McAfee’s Chief Technology Strategist, Sumit Sehgal.

 

Overcoming Hybrid Cloud Challenges

“The biggest challenges include security, cost, having the technical expertise to successfully migrate into these hybrid environments and understanding which applications are best suited to run there. Organizations often spend a lot of time and money and introduce security vulnerabilities because they try to move applications that are not designed to run in a cloud environment. With the pandemic, organizations are under pressure to rapidly move their workforce into cloud environments. There can be a tendency to cut corners to save time, but these sacrifices can also create vulnerabilities.”

Read more insights from SAP NS2’s EVP of Software Development, Bryce Petty.

 

Paving the Way with Open Source

“There’s a realization that the cloud isn’t a silver bullet and that to be successful, organizations need to look at cloud adoption holistically. They need to take best practices into account when it comes to securing the environment, training and enabling staff, and even engaging in the procurement process. Open source supports a cloud smart strategy by helping eliminate vendor lock-in risk and technical debt. By using open source technology and an open source cultural process — where there’s transparency, collaboration and the ability to iterate quickly — organizations can solve their business problems and adapt their requirements based on emerging best practices. They’re not beholden to proprietary systems that may create friction for innovation and are potentially costly to replace, upgrade or move to the cloud.”

Read more insights from Red Hat’s Emerging Technology Lead, Frank DiMuzio.

 

Download the full Innovation in Government® report for more insights from these government cloud thought leaders and additional industry research from GovTech.

As Agencies Embrace Container Technology, Monitoring Challenges Emerge

Container technology is catching on big-time in the federal government as agencies such as the USDA and the National Institutes of Health look to containers to simplify software development and reduce costs.

This isn’t a big surprise. Containers offer enormous advantages over traditional “waterfall” application development processes, in which monolithic systems are developed and released in a single effort, a time-consuming and onerous task. It’s also a risky approach: once deployed, the software is difficult to change or upgrade without causing downtime and lost productivity.

A containerized approach to application development works quite differently, making it easier for developers to create and deploy software faster and with fewer errors. To understand why, let’s look at what containerization is and the key benefits it delivers. Then we’ll examine some best practices for overcoming one of the biggest challenges containerization poses for IT professionals: performance monitoring.

Containers Help Feds Develop Apps Faster and at a Lower Cost

A container is an entire runtime environment consisting of an application and all the dependencies needed to run it, packaged in a single unit. Multiple containers can reside on a single server and share the same operating system kernel and underlying hardware resources, including CPU, memory, and storage. This leads to markedly lower costs.

Containers allow developers to develop applications in a smaller, simplified, and modular way. Rather than run an entire monolithic application inside a single machine, containers enable an application to be split into smaller modules known as microservices. When a new application feature or update is required, the microservice or code update can be easily tested for errors during the development phase and deployed quickly and easily without having to rebuild the entire application. This allows agencies to deliver services with more agility and speed.

Containers Aid With Scalability

Elasticity is another big benefit. For instance, during the early days of the COVID-19 pandemic, unemployment agencies experienced a peak in website and phone traffic as millions of citizens rushed to file for benefits. These agencies lacked the resources needed to manage this spike. With containers, however, IT administrators can easily scale their applications up during peak hours or down when they’re not needed. They can do so without having to rearchitect applications to deal with scaling requirements, potentially helping them save a significant amount of money.

Containers Pose Unique Monitoring Challenges

Despite their many benefits, containers occupy an ephemeral, virtualized environment that poses performance monitoring challenges. Open-source container monitoring tools can help, but they require instrumentation to achieve the desired monitoring capabilities. Federal IT pros need specialized tools designed to orchestrate container monitoring in a simple and easy-to-use way.

When considering tools, it’s important to remember true visibility goes beyond insights into containers themselves. IT administrators must monitor containers alongside other network and system components. If an application fails, they need solutions capable of quickly identifying where the problem resides. Is it within the container, the network, or the server? Is it happening on-premises, in the cloud, or in a hybrid environment?

Monitoring can also inform you when a container is reaching capacity. With this understanding, IT teams can spin up additional containers manually or automatically and maintain continuity of service (and the mission). Used historically, this data can also inform optimal capacity planning.
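
The capacity-driven scaling decision described above can be sketched as a simple threshold check. The metric names and thresholds here are hypothetical assumptions; a real deployment would read them from an orchestrator or monitoring agent:

```python
# Illustrative sketch of capacity-based scaling decisions for containers.
# Thresholds and metric values are hypothetical; a real deployment would
# read them from an orchestrator or monitoring agent.

def scaling_decision(cpu_pct, mem_pct, scale_up_at=80.0, scale_down_at=20.0):
    """Decide whether to add or remove container replicas."""
    if cpu_pct >= scale_up_at or mem_pct >= scale_up_at:
        return "scale-up"      # container is nearing capacity
    if cpu_pct <= scale_down_at and mem_pct <= scale_down_at:
        return "scale-down"    # capacity sits idle; reclaim resources
    return "steady"

print(scaling_decision(cpu_pct=85.0, mem_pct=60.0))  # prints "scale-up"
```

In practice this logic runs continuously, so the same historical metric stream that triggers scaling also feeds capacity planning.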

Beyond capturing performance metrics and alerts, container monitoring tools should also enable code profiling and trace-level visibility so IT teams can understand what’s happening inside their applications, such as instances of code potentially causing system degradation.

Monitoring Tools Enhance Container Performance

Containers offer many advantages to federal agencies: delivering software faster, lowering risk and costs, and easing the path to application modernization. But as container use increases, agencies must find ways to manage and monitor their environments. Without visibility into their containers and their many interdependencies, it’s impossible for IT teams to know if their investments are performing as expected, scaling as they should, and—most importantly—delivering highly available services to users and constituents.

Explore our resources to learn more about Continuous Monitoring and other protective monitoring infrastructure.

Leaders In Innovation: Identity and Access Management

Agencies have been learning the importance of identity and access management for nearly two decades, but, as with many technological evolutions, the coronavirus pandemic has encouraged adoption on an entirely new scale. As remote work became the norm, agencies adapted to use technology like smart identity cards in new ways, enabling capabilities like digital signatures. These new features are secured by the common access card (CAC) in the Department of Defense (DoD) or the Personal Identity Verification (PIV) card in the civilian environment, and all follow the principles and strategies of identity and access management.

Learn more: 8 cybersecurity experts from across the Federal government and industry discuss identity and access management in the latest Leaders in Innovation report.

Shane Barney, the Chief Information Security Officer at the U.S. Citizenship and Immigration Services in the Homeland Security Department, said as agencies move to the cloud, a new common framework focused on data around identity credentialing and access management is necessary.

“I know GSA is working toward that. I’m excited to see where we are heading with that, honestly, because we’ve been working in the identity world for quite a while now, very early on adopting some of those frameworks and trying to figure out a standard and hoping we are getting it right, and I think we’ve made good decisions, we made a couple of errors along the way and more good lessons,” he said in an executive brief sponsored by RSA and Carahsoft.

COVID-19 Has Also Highlighted Challenges

While agencies adapted to renewing or extending smart card authorizations, the pandemic made clear that other form factors must play a larger role in the months and years ahead, especially as agencies move toward a zero trust architecture.

Steve Schmalz, the Field Chief Technology Officer of the Federal Group at RSA, said agencies, like the commercial world, are starting to understand how cloud and remote workers are making the perimeter disappear.

“Zero trust is a fantastic conceptual way of dealing with that and talking about how you have to make sure to authenticate closer to the resource or make use of attributes and entry based access control to determine whether or not somebody should be allowed access to a particular resource,” Schmalz said. “That process of implementing attribute-based access control looks like what you would have to do to implement a full zero trust architecture, where before individuals or processes get access to another resource, you have to check, you have to do some authentication.”
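
The attribute-based access control Schmalz describes can be sketched as a per-request policy check. The attribute names and the sample policy below are illustrative assumptions, not a reference to any agency's actual rules:

```python
# Minimal sketch of attribute-based access control (ABAC): every request is
# re-evaluated against a policy over user, resource, and context attributes.
# All attribute names and the sample policy are hypothetical.

def abac_allow(user, resource, context):
    """Authenticate close to the resource: each access re-checks attributes."""
    return (
        user["clearance"] >= resource["sensitivity"]      # clearance high enough
        and user["agency"] == resource["owning_agency"]   # right organization
        and context["device_compliant"]                   # healthy endpoint
        and context["mfa_verified"]                       # fresh strong auth
    )

request = {
    "user": {"clearance": 3, "agency": "DHS"},
    "resource": {"sensitivity": 2, "owning_agency": "DHS"},
    "context": {"device_compliant": True, "mfa_verified": True},
}
print(abac_allow(**request))  # all attribute checks pass
```

The key zero-trust property is that the check runs on every access, close to the resource, rather than once at a network perimeter.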

The Future of FIDO

The changes happening, whether at DoD, the U.S. Army or across GSA’s shared services, are not going unnoticed by the National Institute of Standards and Technology (NIST). David Temoshok, the NIST Senior Policy Advisor for Applied Cybersecurity, said the standards agency is updating the Federal Information Processing Standards (FIPS) 201 document to allow for new kinds of tokens such as those from FIDO Alliance.

“As FIDO continues to mature as an organization in standardizing secure authentication processes, one of the things that they have established is a certification program for devices to both be certified for conformance to the FIDO specifications, but also to evaluate the security because FIDO tokens and the FIDO authentication processes use cryptographic keys for cryptographic authentication processes, which are very secure, very resistant to man-in-the-middle and phishing attacks,” he said. “We would be recommending their use for both external authentication processes, but also internal, where it’s convenient for agencies to use that.”

Connecting the Dots with ICAM

Along with NIST’s FIPS 201 update, the Homeland Security Department has made identity the center of its continuous diagnostics and mitigation (CDM) program. Rob Carey, the vice president and general manager for global public sector solutions at RSA, said that as the discussion and use of identity, credential and access management (ICAM) continues, it has become increasingly clear that the old “one approach for all” model is unworkable.

“We’ve used the term to any device, anytime, anywhere, and DoD for probably 20 years now. Now we’re at the precipice of delivering that. As you validate, authenticate, the question is the back end, how are the systems and the business processes embracing this authorization to move forward to allow the right people to access the ERP or the financial management system,” Carey said, in a panel discussion sponsored by RSA and Carahsoft. “How are we connecting those dots with this somewhat new and better framework that we’ve talked about using role-based access, attribute-based access control?”

As agencies continue to prioritize zero trust architecture, identity and access management will only become more prevalent. Download the full Leaders in Innovation report to hear from agency leaders at USCIS, CISA, the U.S. Army, DHS, DoD, GSA and NIST on how they’re tackling the challenges and reaping the benefits of identity and access management.

How Facial Recognition Can Keep Flexible Workplaces Safe

As state and federal agencies begin exploring hybrid workplace models and planning how to keep employees safe as the COVID pandemic continues to evolve, compliance is a critical piece of the puzzle. Office reopening plans are only as successful as their implementation, and government organizations must be able to ensure that whatever precautions they put into place—from requiring masks and social distancing to keeping remote or revolving workstations secure—are effective.

One emerging solution employs facial recognition software to ensure that COVID-era guidelines are being followed. This low-budget solution takes advantage of existing cameras within the workspace and emerging facial recognition technology that distinguishes individual faces with and without masks, providing users with automated reports and insights on the safety of their workspaces.

Monitor mask compliance

Wearing masks in an office environment is one key to facilitating a safe return to in-person operations, but monitoring and compliance are critical to ensuring the policy is effective, especially in large government buildings. Establishing checkpoints is one way to enforce compliance, but that approach isn’t necessarily efficient or effective: it requires staffed stations throughout the facility, and employees can still remove their masks once they’ve passed the checkpoint.

Instead, workplaces can turn to cutting-edge facial recognition software—which, in the time of COVID, has been adapted to recognize masks as objects and can differentiate individuals both with and without masks. This technology can extract valuable information from the video feeds of existing cameras and can operate around the clock without the need for additional personnel.

Such software enables users to compile reports on mask-wearing within the workplace, pinpoint areas or situations where employees are more likely to remove their masks, and identify individuals who repeatedly remove their face coverings. These insights allow organizations to shape and strengthen their mask-wearing policies and demonstrate compliance within the workplace.
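
The kind of compliance report described above can be sketched as a simple aggregation over detection events. The event records here are illustrative assumptions; a real system would emit them from the video-analytics pipeline:

```python
# Sketch of compiling a mask-compliance report from detection events.
# The event records are illustrative; a real system would emit them from
# a video-analytics pipeline.
from collections import Counter, defaultdict

events = [
    {"person": "emp-101", "location": "lobby",     "masked": True},
    {"person": "emp-102", "location": "lobby",     "masked": False},
    {"person": "emp-102", "location": "breakroom", "masked": False},
    {"person": "emp-103", "location": "breakroom", "masked": True},
]

def compliance_report(events):
    """Summarize mask-wearing rates per location and flag repeat offenders."""
    per_location = defaultdict(lambda: [0, 0])   # location -> [masked, total]
    unmasked = Counter()
    for e in events:
        per_location[e["location"]][1] += 1
        if e["masked"]:
            per_location[e["location"]][0] += 1
        else:
            unmasked[e["person"]] += 1
    rates = {loc: masked / total for loc, (masked, total) in per_location.items()}
    repeat_offenders = [p for p, n in unmasked.items() if n > 1]
    return rates, repeat_offenders

print(compliance_report(events))
```

The per-location rates highlight where masks come off most often, and the repeat-offender list supports the policy follow-up the article describes.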

Ensure social distancing

Another key COVID-era workplace policy is social distancing, which presents its own challenges in bustling hallways, conference rooms, and other communal areas. Proper distancing can be especially hard to enforce in situations where employees are moving about or passing through.

Software automation can be used to identify infractions and bottlenecks by analyzing camera feeds for recurring instances of congregating. Workplaces can use this insight to pinpoint areas where social distancing is hard to maintain and implement changes to reduce bottlenecks or manage the number of people in one part of the facility.

Identify remote workers

While state and federal organizations are working towards bringing more of the workforce back into the office, flexible and remote work will continue to be critical for keeping employees safe and healthy. Most organizations have implemented telework policies including a VPN and secure authentication, but facial recognition software can ensure that only the authorized individual is sitting at the terminal for the duration of the session.

While a user is logged into a secure session, the software can use the computer’s camera to send regular images of the user to the server for automated monitoring. If the user steps away from the computer and there is no face detected in the images, the software can automatically terminate the session. Similarly, if the user steps away and another person takes their place, the software can identify the switch and end the session as well, ensuring that only the authorized user has access to the VPN.
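
The session-guard behavior described above might look like the following sketch. The per-snapshot results are assumed to come from a hypothetical facial recognition call, with `None` meaning no face was found in that image:

```python
# Illustrative sketch of face-presence session monitoring. The per-snapshot
# results are assumed to come from a hypothetical facial recognition call;
# None means no face was detected in that snapshot.

def session_guard(authorized_id, detected_ids):
    """Evaluate a stream of per-snapshot recognition results."""
    for face_id in detected_ids:
        if face_id is None:
            return "terminated: no face detected"   # user stepped away
        if face_id != authorized_id:
            return "terminated: unauthorized user"  # someone else sat down
    return "active"

print(session_guard("alice", ["alice", "alice", None]))
```

In a real deployment the loop would run against live webcam snapshots and call a VPN or session-control hook to end the session, rather than returning a string.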

Like many organizations, government agencies have had to pivot their operations to keep employees safe during COVID. As workplaces start figuring out how to safely bring more employees back into the office, policies and compliance must work in tandem to reduce the spread of COVID as much as possible. Facial recognition software is an easy way to use existing infrastructure to assess how safe the workplace is, whether in the office or remote, and adjust policies as needed.

Software company piXlogic has adapted its facial recognition technology to overcome the challenge of detecting masked faces and can identify individuals, with or without masks, with a high degree of reliability and accuracy. piXlogic has structured its software to seamlessly count masked and unmasked individuals and provide reports on compliance with COVID precautions within the workplace.

Learn more about how piXlogic is helping organizations return to work safely.