The Roadmap to a Modern Analytics Platform

Just about every organization needs to do a better job of exploiting its data. High-quality data fuels insights that help organizations respond faster and move from a reactive to a proactive mode. A modern data analytics platform needs to be central to any major modernization initiative and needs to be an early focus of that initiative.

Limitations of Conventional Platforms

Most conventional analytics platforms favor governance and trust over agility and user empowerment. They focus on ingesting data from structured, internal application data sources. Yet there is huge, pent-up demand to mash up all kinds of data from a wider variety of sources, beyond the walls of internal organizations. Users want answers to their burning mission questions in more responsive ways. Conventional platforms are also optimized for functional uses, creating architectural walls at many points in the data flow that introduce latency.

What do we want in a Modern Platform?

A modern platform offers capabilities that let users work at the speed of the business and overcome latency. A modern architecture has four dimensions:

  1. Infrastructure: Users can securely connect to any data source: on-premises, in another cloud or organization, or across multiple clouds.
  2. End User Data Prep: The consumers of the data must be able to prepare and model the data they need, and catalog it, so it can be shared with limited IT staff engagement.
  3. Telling a Story: Rather than just displaying a dashboard, users want the power to actually tell the story while sharing data findings.
  4. Governance: Governance is the backdrop. Modern platforms enable governance controls and standards as the underpinning of every layer.

In addition to these dimensions, a modern platform must enable specific capabilities:

1. The ability to stream data in near real time as it changes in the real world, so data is immediately available to decision makers.

2. Triggers, so that when particular events happen in the outside world, the platform can rapidly expedite the flow of data from the source to users (see the sketch after this list).

3. The ability to rapidly curate and enrich data, as appropriate for the type of data and the decisions that need to be made, using the specific technologies and tools of a modern platform.

4. The ability to connect to data virtually and determine what data needs to be physically stored. Modern platforms do not require you to lift and shift all data from a source into a physical warehouse.

5. Self-service data preparation. With conventional platforms, data preparation is done through an IT department, but a modern platform allows individual consumers and decision makers to prepare the data themselves.
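
To make the first two capabilities concrete, here is a minimal sketch of a near-real-time change stream with a trigger that fires on a watched event. The queue-based event source and the notify_users() hook are hypothetical stand-ins, not any specific platform's API:

    import queue
    import threading
    import time

    # Stand-in for a streaming source such as a message bus (hypothetical).
    events = queue.Queue()

    WATCHED_EVENT = "inventory_below_threshold"

    def producer():
        # Simulate source-system changes arriving as they happen.
        for payload in ({"type": "sale", "sku": "A1"},
                        {"type": WATCHED_EVENT, "sku": "A1", "on_hand": 3}):
            events.put(payload)
            time.sleep(0.1)

    def notify_users(event):
        # Hypothetical hook: push the fresh data straight to decision makers.
        print(f"TRIGGER fired -> expedite {event['sku']} data to dashboards")

    def consumer():
        while True:
            event = events.get()
            # Every change lands immediately (capability 1)...
            print(f"ingested change: {event}")
            # ...and watched events expedite delivery (capability 2).
            if event["type"] == WATCHED_EVENT:
                notify_users(event)
            events.task_done()

    threading.Thread(target=consumer, daemon=True).start()
    producer()
    events.join()  # wait until both events have been processed

In a production platform, the queue would be a message bus or change-data-capture feed, and the trigger would push refreshed data to dashboards or alerts.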

Elements of a Modern Platform

The modern platform has the capabilities to monitor, explore, investigate, and automate. The tools are very sophisticated, and self-service is available, enabling citizen data scientists in addition to more traditional data scientists.

The Analytics Workbench allows consumers to touch the data directly. For example, at a monthly meeting, executives view data about the organization and ask, “Why did that happen? What if the numbers looked this way?” They want to explore the data. In a conventional platform, you would have to get back to them with that information at a later date. The analytics workbench allows that interaction to happen right in the meeting, so you can drill down and answer the “why” questions.

Exploring: The platform takes the terms entered in the search bar and actually creates the SQL query, going down to the database on the backend. It runs a complex query across multiple joins, tables, and data sources. The end user just asks a simple question. The system takes that question and handles the complexity on the backend without the user’s knowledge.
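
As a toy illustration of that translation step (this is not ThoughtSpot’s actual engine, and the table and column names are hypothetical), a search phrase can be mapped to generated SQL roughly like this:

    # Hypothetical semantic model: which search terms map to which SQL fragments.
    MEASURES = {"revenue": "SUM(s.revenue)", "units": "SUM(s.units)"}
    DIMENSIONS = {"region": "r.region_name", "month": "s.sale_month"}

    def search_to_sql(terms: str) -> str:
        tokens = terms.lower().split()
        dims = [DIMENSIONS[t] for t in tokens if t in DIMENSIONS]
        meas = [MEASURES[t] for t in tokens if t in MEASURES]
        # The user never sees the join; the backend assembles it automatically.
        return (
            f"SELECT {', '.join(dims + meas)}\n"
            "FROM sales s JOIN regions r ON s.region_id = r.region_id\n"
            f"GROUP BY {', '.join(dims)};"
        )

    print(search_to_sql("revenue by region by month"))

Running the example turns a plain-language phrase into a joined, grouped SQL statement the user never has to write.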

Data Science Laboratory and AI Hub: The data science laboratory allows experts to build algorithms that predict outcomes and provide prescriptive information with the data in the environment. The AI hub provides capabilities for very sophisticated simulations and algorithms using natural language processing. It automates by embedding the results into a business process or an operational system as quickly as possible.
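
Here is a minimal sketch of that laboratory-to-automation flow, using scikit-learn’s public API; the demand data and the reorder rule are hypothetical:

    from sklearn.linear_model import LinearRegression

    # Hypothetical history: weeks 1-4 of unit demand.
    X = [[1], [2], [3], [4]]
    y = [120, 135, 151, 166]

    model = LinearRegression().fit(X, y)  # predictive step
    forecast = model.predict([[5]])[0]    # expected demand in week 5

    # Prescriptive step, embedded directly in an operational process:
    REORDER_THRESHOLD = 160
    if forecast > REORDER_THRESHOLD:
        print(f"forecast {forecast:.0f} units -> auto-create replenishment order")

The point is the last step: the model’s output feeds an operational decision automatically rather than sitting in a report.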

Roadmap

Organizations need an explicit roadmap, a defined strategy, for moving quickly but at low risk from today’s legacy data and analytics environment to the target environment. A roadmap helps you achieve mission outcomes, reduce latency, and cut costs and vulnerabilities. Here are the elements of a successful roadmap to a modern platform:

  1. Outcomes: Focus first on organizational objectives. People often look at modernizing platforms from the perspective of IT vulnerabilities or saving money. But you also need to ask, “What outcomes do we need to achieve?” Look at the capabilities and dimensions listed above and establish which ones you most need.
  2. Prioritize: Based upon your outcomes, prioritize which capabilities and dimensions you actually need.
  3. Select: Choose infrastructure based on your priorities. You will require a portfolio of tools, not a single tool, to accomplish what you need in your roadmap. As a first step, consider augmenting what you already have. Modern tools that align with your existing technology will get quick results and build momentum.
  4. Schedule: Document how and when you will deploy each new capability and tie it back to business outcomes.

IT modernization will be more successful if organizations focus on data and analytics early in the process. This is the heart of your transformation. A modern platform requires multiple dimensions and capabilities that enable work at the speed of business. Focus on the business outcomes you want to achieve and develop a roadmap for how to meet your mission objectives quickly, at low risk.

Request a demo today to learn how ThoughtSpot can help your team achieve faster insights, and check out our on-demand webinar, “IT Transformation: Roadmap to Data Driven Agency Powered by Modern Architecture.”

Best of What’s New In Data, Identity and Privacy

Last year, state lawmakers across the nation introduced hundreds of privacy bills. One of the most prominent pieces of legislation — the California Consumer Privacy Act (CCPA) — took effect in January, marking the first of potentially many state-level attempts to emulate the European Union’s groundbreaking General Data Protection Regulation (GDPR), which gave EU residents more control over how organizations use their personal information. All of this points to a dramatic shift in how state and local government agencies must manage and protect data. Fortunately, technology tools available to help the public sector address privacy challenges are growing smarter and more sophisticated. Learn the latest insights from industry thought leaders in Data, Identity and Privacy in Carahsoft’s Innovation in Government® report.

Protecting the Data That Matters Most

“Organizations should avoid the temptation to skip requirements and get things out there quickly. This crisis forced organizations to establish work-from-home policies overnight. Work-from-home technologies — whether employee-owned or government issued — must incorporate the organization’s security processes and policies around sensitive data. Government-issued laptops should have remote access capability to keep OS and security product patches up to date, ensure VPN connections are working and generally maintain security standards. It’s also important to conduct and continually reinforce security awareness training focused specifically on working at home or remotely. Then, make the new normal as simple as possible; have everything in place for users to just basically turn on their laptop and log into the system.”

Read more insights from Dell Technologies’ Chief Strategy and Innovation Officer of State and Local Government, Tony Encinias.

 

Simple, Smart and Fast: Search-Driven Analytics for Data Privacy and Compliance  

“Clearly defined use cases are critical. What questions do agencies need to answer to fulfill their mission, and what data do they need to obtain those answers? Once you find that data, how do you store it, and how do you track compliance requirements on that data? How do you enable data sharing and transparency without interfering with privacy and security? Another critical piece is the criteria and best practices used for tool selection. Can you get to granular levels of data and customize security clearances down to the role level or column level so you can govern who’s seeing what without having to create duplicate data lakes for each department? That can create a lot of economies of scale and enable organizations to more easily and confidently share data across agencies.”

Read more insights from ThoughtSpot’s Senior Director of Global Public Sector and Industry Alliances, Helen Xing.

 

Using a Data-Centric Approach to Reduce Risk and Manage Disruption  

“AI and ML have a lot of potential to streamline privacy and compliance, but they also come with certain risks. For example, AI/ML require systems to be trained. If systems are trained inadequately or with inaccurate data, the result may be poor decisions that ultimately cause more damage than good. This is why, as discussions about the use of AI and ML continue, we expect to see more emphasis on accountable development and usage. In practice, this means having requirements around transparency of AI usage, decisions and data quality, as well as robustness in terms of AI security and resilience.”

Read more insights from Broadcom’s Global CTO and Chief Architect for Symantec Enterprise Division, Paul Agbabian.

 

Leading Through Change  

“People have been self-servicing analytical needs for years because they need to answer their own questions rapidly. But are people asking the right questions and are they doing all that in the most efficient digital forms? Proficiency is one of the core capabilities defined in the Tableau Blueprint, which is a prescriptive, proven methodology for becoming a more data driven organization. Proficiency speaks to the need to educate people to see and understand data for decision-making. That includes educating them on how to work with data, measuring the value that they derive from their use of data, and institutionalizing best practices that drive behavior change and informed decision-making.”

Read more insights from Tableau’s Senior Manager of Customer Success, Jeremy Blaney.

Download the full Innovation in Government® report for more insights from these Government Data, Identity and Privacy thought leaders and additional industry research from GovTech.

The Next Phase of AI in Government

For government, artificial intelligence (AI) promises to streamline operations, facilitate decision-making and improve customer services in ways that weren’t possible before. Agencies have already begun using machine learning, robotic process automation, the internet of things and other AI tools to improve operations, but in many ways, AI’s potential is still untapped. In a recent survey of FCW readers, 72% said their agencies have not begun deploying AI-based tools, and 70% said their teams had no training in data science or AI. Learn the latest insights from industry thought leaders in artificial intelligence in Carahsoft’s Innovation in Government® report.