S3T Playbook: Gaining Buy-in for Data Modernization

Embracing Data Modernization: Strategies for Success

GenAI puts a new spotlight on the need for data modernization - which is no easy journey. Here's how to prepare yourself and your team.

Executive Summary

The rapid advancement of Generative AI (GenAI) and increasingly powerful large language models (LLMs) is driving a sense of urgency among leaders who do not want to get left behind. But many of these same leaders are now confronted with the unpleasant reality that their data enterprises are not ready to participate in the AI revolution - and won't be - without significant investment in data modernization.

GenAI apps using foundation LLMs pre-trained on Internet data have captured imaginations, but that data represents only a small percentage of the world's most valuable "data fields" (the data analogy to "oil fields": locations where high-value concentrations exist). Across multiple industries, vast concentrations of high-value proprietary data sit locked inside legacy data warehouses or dispersed across departmental databases. These data assets are underutilized, pinned down by access rules and sustained at barely viable levels on cost-prohibitive legacy platforms, by cost-prohibitive processes and vendors.

The net effect is that this highly valuable data is accessible only to the teams with the skills to keep it locked in place; other teams, with the skills to unlock its value using more modern technologies, are not allowed to get near it.

Astute data professionals recognize a data strategy trifecta for unlocking the maximum value of data:

  • the largest, most unfiltered data sets
  • the most powerful AI processing
  • the top software and data talent

The challenge for change leaders: how to inspire and guide an organization to leave behind its status quo and embark on a data modernization journey toward this trifecta?

Significant investment in data modernization is not just an option but a necessity

Data locked up inside legacy systems, cycled through convoluted, poorly managed processes, or governed by strict access policies may be unusable in Generative AI or Large Language Model processes until significant changes or investments are made.

But data modernizations are fraught with complexity, costs, and, perhaps worst of all, politics. When a large data asset migrates from one generation of technology to the next, decision authority and budgets inevitably change hands. Departmental data stores, owned and operated for years or even decades by a comfortably entrenched panoply of teams, vendors and systems, must now be transitioned to new platforms managed by teams and vendors with newer skills. Underneath all of this lurks the ever-present risk that
