Doug Bordonaro

The Future of Enterprise Analytics is AI

Despite technological advances, the analytics workflow remains cumbersome, mirroring processes from 40 years ago. However, AI, particularly large language models (LLMs), offers a transformative shift. By prioritizing the human experience, AI can directly provide accurate results based on user queries. At Numbers Station, the emphasis is on genuine knowledge management, underscoring that the future of analytics lies in optimizing the human experience with data.

My first exposure to analytics was as a young boy, accompanying my mother into work one day when she couldn’t find a babysitter. She was a COBOL programmer in those early computing days. I remember her feeding stacks of punch cards into a giant beige mainframe computer that took up almost the entire room. She let me play with the used cards, fascinated by the little rectangular holes, and dodge the piles of green bar paper that came pouring out of the attached printer with the printed results.

Fundamentally, analytics today works much the same as it did on that long-ago day. While archaic languages like COBOL and esoteric machines like AS/400s have largely disappeared from modern IT environments, and the endless piles of green bar paper have been replaced by sleek Tableau dashboards viewed on tablets and monitors, the underlying workflow remains remarkably similar.

From an end-user perspective, getting meaningful information today mimics the clunky process used over 40 years ago: submit a ticket requesting a new report, wait for IT to assign it to someone, define the specific requirements, wait for development resources to become available, test the first iterations, provide feedback, and finally see it deployed into production. Modern analytics training programs and technologies have streamlined many of these steps, but fundamentally it is still that same cumbersome workflow, especially from the end user’s perspective.

Why did I recently join Numbers Station, a company focused on using data-aware AI to solve the analytics bottleneck? This childhood story of green bar reports and COBOL perfectly captures the heart of it. For decades, companies have repeatedly promised that the latest wave of technology would revolutionize enterprise analytics and reporting. They touted the transformative power of dashboards, mobile access, real-time proactive alerting, self-service analytics, natural language search, and many other innovations over the years. Yet through it all, the core workflow most end users follow to get the information they need has seen little substantive change. The pain points remain.

Artificial intelligence, specifically large language models (LLMs) like GPT-4, is arguably the first category of technology poised to fundamentally alter these ossified analytics workflows.

What makes this disruption more interesting is how similar AI-driven advances are already changing workflows and processes across many other domains outside of analytics and business intelligence. Nearly every tool and system I use in my own daily work and personal life is now augmented by some form of artificial intelligence or machine learning. In some cases it’s merely a parlor trick or gimmick so software vendors can claim that their products now include “AI capabilities”. But when implemented thoughtfully, these technologies can fundamentally transform how we get tasks done.

Let me provide some real-world examples that showcase the power of AI:

  • A forward-thinking colleague recently introduced me to a new AI-enhanced web browser called Arc, which I now use daily. One incredibly useful feature is the ability to hold down the SHIFT key and hover over any link on a webpage to immediately generate a short summary of what that linked page contains—without needing to actually click through to it. Arc also lets you get an instant one-paragraph synopsis of whatever page you currently have open. And it can smart-rename downloaded files based on content so you can always find what you need later. I highly recommend it; it’s changed the way I use the web.
  • Many of the non-technical people in my personal and professional networks now use a ChatGPT subscription in their daily workflows, whether they are teaching university classes, analyzing financial documents, compiling market research, or any number of knowledge worker tasks. The days of tediously assigning research papers and then manually grading them late into the night are quickly coming to an end. For both students and professors, such assignments are becoming more akin to math problems – understand the fundamentals of good writing, but pull out an AI calculator to handle the actual hard work of producing an essay.
  • In my own job, I have regularly used automated call transcription services that can capture video recordings and semi-accurate transcripts of important work meetings and calls. But today’s most advanced tools like Read.ai go far beyond just transcription. They can automatically summarize key action items, highlight important discussion points, analyze how much time each participant spent talking, and even discern nuances in individuals’ speaking styles and cadences. This helps me focus directly on the conversation during a meeting, while still being able to efficiently review and search the content afterwards without having to re-watch a long video or parse through a verbose wall of text.

We are now seeing similar game-changing advances emerge in the world of enterprise analytics with the application of LLMs. Rather than merely building yet another layer of abstraction or interface on top of data to try to make it more accessible and consumable for end users, AI flips the entire paradigm by focusing primarily on improving the human experience. These models are capable of directly understanding what a person is asking or trying to learn from their data, and then generating the appropriate database queries to return accurate results, all within the context of an ongoing conversation.
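
To make the pattern concrete, here is a minimal sketch of natural-language-to-SQL in Python. It is not Numbers Station’s implementation: it assumes the OpenAI Python client and an illustrative SQLite database, and the schema, prompts, and questions are all hypothetical.

```python
# Minimal sketch: LLM-driven natural-language-to-SQL with conversational context.
# Assumes the OpenAI Python client (pip install openai) and a local SQLite file;
# the schema, model choice, and prompts are illustrative, not a real product.
import sqlite3
from openai import OpenAI

client = OpenAI()                 # reads OPENAI_API_KEY from the environment
db = sqlite3.connect("sales.db")  # hypothetical database

SCHEMA = """
CREATE TABLE orders (id INTEGER, customer TEXT, region TEXT,
                     amount REAL, order_date TEXT);
"""

# The system prompt carries the schema so the model can ground its SQL in it.
messages = [{
    "role": "system",
    "content": "You translate questions into SQLite SQL for this schema. "
               "Reply with a single SQL statement and nothing else.\n" + SCHEMA,
}]

def ask(question: str):
    """Send the question plus all prior turns, run the generated SQL, return rows."""
    messages.append({"role": "user", "content": question})
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    sql = response.choices[0].message.content.strip()
    # Keeping the model's answer in the history is what makes follow-up
    # questions like "now break that down by region" resolve correctly.
    messages.append({"role": "assistant", "content": sql})
    return db.execute(sql).fetchall()

print(ask("What were total sales last quarter?"))
print(ask("Now break that down by region."))
```

Even this toy version illustrates the shift: the user expresses intent in plain language, and the conversation history, not the user, carries the technical context forward.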

At Numbers Station, we have seen incredibly high accuracy and relevance rates by leveraging AI, thanks to our relentless focus on data-driven use cases, fine-tuning models on customers’ own metadata, and a remarkable engineering team building unique technologies. But it’s often the end-user workflow improvements enabled by this technology that come up most frequently during our product design process.
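
As a purely illustrative sketch of what fine-tuning on metadata can look like in practice (this is not a description of our pipeline, and every table, column, and metric name below is hypothetical), warehouse metadata can be converted into supervised question-to-SQL training examples:

```python
# Illustrative only: turning warehouse metadata into fine-tuning examples so a
# model learns an organization's own tables, columns, and vetted metric names.
# The metadata shape and the JSONL format (OpenAI-style chat fine-tuning) are
# assumptions for the sketch, not a real pipeline.
import json

metadata = [
    {"table": "orders", "column": "amount", "metric": "total_sales",
     "definition": "SUM(amount) across all orders"},
]

examples = []
for m in metadata:
    examples.append({
        "messages": [
            {"role": "system", "content": "Translate questions into SQL."},
            {"role": "user", "content": f"What is {m['metric'].replace('_', ' ')}?"},
            {"role": "assistant",
             "content": f"SELECT SUM({m['column']}) AS {m['metric']} FROM {m['table']}"},
        ]
    })

# Chat fine-tuning APIs typically expect one JSON object per line (JSONL).
with open("train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```

The point is less the mechanics than the principle: the model is taught the customer’s own vocabulary, so “total sales” means what that organization has verified it to mean.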

Providing accurate, relevant answers is just table stakes. The real magic happens on top of that. How can you smoothly guide users to the right data source if they are inadvertently querying the wrong dataset or database for their intended question? How do you build effective guardrails against nonsensical inputs without limiting flexibility? If users can now freely ask anything they want of any data they have access to at any time, how do you simultaneously promote and enable discovery while still guiding them to use vetted, verified metrics and dimensions? When new data becomes available, how does the system cue users to relevant insights they may have missed if they queried earlier?
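
These are open design questions, but the guardrail half of them maps to a familiar engineering pattern: validate model-generated SQL before executing it. Here is one hedged sketch using the open-source sqlglot parser; the vetted-table allowlist and the guidance messages are hypothetical, and a production system would check far more.

```python
# One possible guardrail: before running model-generated SQL, confirm it is
# read-only and touches only vetted tables. Uses the sqlglot SQL parser
# (pip install sqlglot); the allowlist and guidance text are hypothetical.
from sqlglot import exp, parse_one

VETTED_TABLES = {"orders", "customers"}  # curated, verified datasets

def check_sql(sql: str) -> str:
    """Return the SQL if it passes the guardrails; otherwise raise with guidance."""
    tree = parse_one(sql, read="sqlite")
    if not isinstance(tree, exp.Select):
        raise ValueError("Only read-only SELECT queries are allowed.")
    tables = {t.name for t in tree.find_all(exp.Table)}
    unknown = tables - VETTED_TABLES
    if unknown:
        # Guide the user toward vetted sources rather than silently failing.
        raise ValueError(
            f"{sorted(unknown)} are not vetted datasets; "
            f"try one of {sorted(VETTED_TABLES)} instead."
        )
    return sql

print(check_sql("SELECT region, SUM(amount) FROM orders GROUP BY region"))
try:
    check_sql("SELECT secret FROM payroll_raw")
except ValueError as err:
    print(err)  # nudges the user back toward vetted data
```

Rejecting a query outright is the crude version; the interesting product work is turning that rejection into a redirect toward the right dataset.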

Unlike mundane challenges such as performance optimization or building yet another metadata layer, these questions bear directly on a human user’s overall experience of interacting with data. Rather than forcing humans to conform to the constraints of databases and analytics tools, the technology now shapes itself to the user.

This knowledge-focused engineering approach exemplifies the breakthrough inherent in LLMs, and in what Numbers Station offers. We are finally crossing the threshold into the long-heralded era of true “knowledge management,” rather than repeatedly engineering slightly better mousetraps: tools that grant incrementally more refined but increasingly brittle and narrow access to data, in service of specific limited use cases rather than general human exploration.

Artificial intelligence represents a seismic shift for the world of enterprise analytics, unlocking vastly more capability than any previous wave of innovation. While accuracy and relevance will always remain critical, the focus must expand to the total end user experience. LLMs like ChatGPT can synthesize remarkably robust queries from simple natural language questions. But effective solutions consider the entire journey: guiding users to reliable data sources, preventing nonsensical or harmful inputs, and streamlining access to the latest metrics. The future of analytics is AI because it focuses first and foremost on optimizing the human experience rather than forcing humans to conform to the constraints of data.