The conversation we need to be having in RevOps right now is bigger than tools, bigger than dashboards, and bigger than whatever AI pilot your CRO saw on LinkedIn last weekend.

It starts with data. It ends with judgment. And everything in between is about building the kind of structure that lets you run a business well.

A bit about me first. I work at Teradata, where we service the world's biggest banks, healthcare companies, and telcos. If there's an app on your phone, there's a decent chance Teradata is powering the data analytics behind the scenes.

We do around $1.5 billion in ARR with a go-to-market team of about 700 people handling large, complex deals. You have to be pretty big to buy what we sell.

My job title has changed, as it usually does in RevOps. Today, I'm in charge of revenue strategy, planning, and performance inside the wider revenue operations and strategy team.

Which means I get to do the things nobody else wants to do: turning complex ideas into systems we can actually execute, building management cadence, proving things work, getting buy-in from leadership, and making decisions from data that's rarely as clean as we'd like.

Because here's the truth. Most of our data is bad. Some of it is great, but mixed in with a load of junk generated by salespeople, systems, and the natural messiness of doing business. If you've got perfectly clean data, I genuinely want to meet you.

The career shift that taught me everything

I got hooked on transformation while I was still a salesperson at National Instruments.

I led a major change program, shifting from seven different sales processes across the world and four different CRMs to a single Salesforce instance with a common global process.

Two thousand people involved. Eighteen months of my life. At the end of it, I knew I never wanted to do sales again. I wanted to do strategy. I wanted to work with cool data tools. That's where I've been for the last ten years.

Along the way, I learned something that matters more now than ever: human judgment is still the most valuable thing you bring to the table.

Some of the best Excel users in London are probably reading this. Technical skills, though, are going to get eaten by AI within a few years. My Excel skills are great, and I know Claude or OpenAI or something else will match them soon.

What I have that AI can't easily replicate is the ability to gather loads of context about a business, infer things, and come up with judgments at a scale the tools struggle with. Our brains are context machines. And context is everything in our world.

So really, I'm good at making informed guesses and persuading people to follow me based on them. That's a skill worth protecting.


The monster coming over the hill

Here's what I want you to walk away thinking about.

You can't stay a dashboard builder. Dashboard building will go the way of Excel wizardry. There's already a new AI plug-in shipping every week that'll build the perfect dashboard faster than you can. If your CRO is keeping you up at night demanding the latest view, that's a bad position to be in long-term.

And there's a monster coming over the hill. AI has the potential to build a new world for RevOps or destroy it entirely. I asked ChatGPT what to call the monster. It said Claude. I asked Claude. It said Gemini. I asked Gemini, and it said Grok. I asked Grok, and it told me there was no monster and Elon Musk is the best person ever. So, you know, take that as you will.

The monster is real. And the way to tame it is to stop being purely technical experts and start being signal architects. People who define the signals the business needs, separate them from the noise, and control what the AI gets fed. Feed it the right things, control it, and you've got an amazing beast. Feed it junk, and you've got chaos.

Call it go-to-market engineering if you like. The title matters less than the mindset. Your job is to translate what your business needs into something sales, marketing, and customer success can actually use.

How to think about data (hint: like food)

If I told you today's lunch was going to be some flour, some eggs, a pot of boiling water, and some stinky cheese, you'd be worried. If I said fresh pasta with the finest parmesan on top, you'd be delighted. Same ingredients. Completely different framing.

That's how data works. If we're going to feed the monster and feed our sales teams, we have to feed them good things. So the starting point is how we think about data.

I borrowed this framing from our VP of financial services, Ben, who was talking about how financial services organizations should think about data. It applies cleanly to RevOps, too.

Start with your data foundations. What's your revenue model? Product-led growth, sales-led growth, the revenue bow tie, a funnel, a flywheel. You'll have a CRM, marketing automation, revenue intelligence platforms, customer success platforms, planning tools, and engagement tools. And hopefully you've got a data store somewhere. A warehouse, a data mart, a lake, a lakehouse sitting on the lake.

Stored data matters because your CRM is a snapshot. It forgets. Salesforce drops a lot of field history after 12 to 18 months. I've got a time machine on my data lake. I can go back five years and see exactly what Salesforce looked like at any given moment. That's the kind of capability you can't replicate from live systems alone.
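That time machine can be surprisingly simple. Here's a minimal sketch of the idea, with invented record IDs, field names, and dates: keep dated snapshots of CRM fields in the lake, then reconstruct what a record looked like on any past day.

```python
# Sketch of a CRM "time machine": dated snapshots of key fields,
# queried as-of any date. All IDs, fields, and values are invented.

snapshots = [
    {"opp_id": "006A", "date": "2022-03-01", "stage": "Discovery", "arr": 200_000},
    {"opp_id": "006A", "date": "2023-01-15", "stage": "Negotiate", "arr": 350_000},
    {"opp_id": "006A", "date": "2023-06-30", "stage": "Closed Won", "arr": 340_000},
]

def as_of(opp_id, date):
    """Return the latest snapshot of the record on or before `date`.

    ISO date strings compare correctly as plain strings.
    """
    rows = [s for s in snapshots if s["opp_id"] == opp_id and s["date"] <= date]
    return max(rows, key=lambda s: s["date"]) if rows else None

# What did Salesforce show for this opportunity in March 2023?
state = as_of("006A", "2023-03-01")
```

The live CRM would only show the final row; the snapshot table lets you replay the deal's history long after the system itself has forgotten it.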

Then you've got the focus areas you engage around. For me, it's forecasting, planning, performance, enablement, strategy, and cross-functional alignment. Each of those needs tools to help you run the business.

Data products are the unlock

Here's a term that deserves more attention in our world: data products.

Your raw data needs to be turned into something structured and reusable. Think of it as a recipe book. You might have a data product around opportunity forecasting, account prioritization, or adoption and usage. That structured set of data then drives use cases that deliver value.

For forecasting, the use case might be me looking at the forecast and saying, "Nope, this isn't right." Or it might be AI-driven split analysis through a tool like BoostUp, which has genuinely gotten good at predicting whether a deal will slip based on activity patterns. Territory optimization. Lead prioritization. The point is to find things of value for the business.

A data product is a governed, reusable building block. It has data contracts, inputs and outputs, clear ownership, interfaces (APIs, models, dashboards), and documentation.

That last part matters. I once threw a large dataset into an LLM and asked it to run queries against it, and the response was essentially "I have no idea what any of this means."

Feed it structured data with definitions, and you can ask natural-language questions like "tell me the best-performing AE based on the forecast," provided you've explained that the account owner is the AE, the unit path is the forecast category, and ARR matters more than total contract value.

Documentation turns a dataset into something reusable.

And it doesn't have to be complex. A data product can be an Excel file you generate once a week. Or it can be a full system. What matters is knowing where it lives, who owns it, and how it's managed.
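One way to picture a data product, whatever its size, is as a small spec that carries its ownership and its definitions with it. This is an illustrative sketch, not a real schema; every name and field here is hypothetical.

```python
# Illustrative data product definition. All names (owner, source,
# column definitions) are invented for the example.

from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str
    owner: str        # who is accountable for this product
    refresh: str      # how often it is rebuilt
    source: str       # where it lives (a warehouse table, or an Excel file)
    definitions: dict = field(default_factory=dict)  # the documentation that makes it reusable

forecast_product = DataProduct(
    name="opportunity_forecast",
    owner="RevOps",
    refresh="weekly",
    source="warehouse.revops.opportunity_forecast",
    definitions={
        "account_owner": "The AE responsible for the account",
        "forecast_category": "Commit / Best Case / Pipeline, as set by the AE",
        "arr": "Annual recurring revenue; preferred over total contract value",
    },
)
```

Because the definitions travel with the data, a colleague, or an LLM, can answer "who is the best-performing AE?" without guessing what each column means.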

You start with reporting and dashboards exported into Excel. You move into governed analytics, KPIs, and refreshed data. Then predictive intelligence and embedded products. Tools like Clari sit at the top of this maturity curve around opportunity data.

Building data products gives you a shared language for your business. You stop saying "look at the raw data" and start saying "here's the data product that explains this area of the business, and here are the use cases we can build from it."


Who needs your data products

The CRO wants answers, though he'll say he wants data. Finance needs additional calculations and will never fully trust your numbers, because that's finance's job. Product needs customer insight, win/loss patterns, and usage behavior.

Your vendors need structure, too. You can't just plug in tools and expect them to deliver value without a defined integration path.

Strategy building needs it. Sales teams need insight and sometimes a kick.

And AI needs it most of all. AI needs context, and context is structured knowledge. AI has a limited context window. It can't process everything in the world at once. It's trained on structured memory, which it can access quickly. If you want to build useful AI applications, you have to feed it high-quality structured knowledge with whatever situational data helps it do the job.
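Because the context window is finite, something has to decide which structured facts make the cut. Here's a hedged sketch of that idea, with an invented character budget and invented account facts; real systems rank and truncate far more cleverly, but the shape is the same.

```python
# Hypothetical sketch: pack the highest-value structured facts into a
# limited context budget before calling an LLM. Budget and scoring
# are illustrative, not any vendor's actual mechanism.

def build_context(facts, budget_chars=2000):
    """facts: list of (priority, text) tuples. Greedily keep the most
    important facts that fit; drop the rest rather than overflow."""
    packed, used = [], 0
    for priority, text in sorted(facts, key=lambda f: -f[0]):
        if used + len(text) > budget_chars:
            continue  # lower-value fact doesn't fit; skip it
        packed.append(text)
        used += len(text)
    return "\n".join(packed)

facts = [
    (3, "Account ACME: ARR $1.2M, renewal in Q2, health score amber."),
    (2, "Open service cases: 4, two escalated."),
    (1, "Last QBR notes: customer exploring a competitor for analytics."),
]
context = build_context(facts, budget_chars=120)
```

With a 120-character budget, the renewal and service-case facts survive and the QBR note gets dropped. Deciding those priorities is exactly the signal-architect job: you, not the model, choose what it gets to know.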

Context windows fill up. Tools either warn you or just start hallucinating. Hallucinations happen elsewhere, too. I was with our enterprise architecture team recently, and we realized the finance team had been hallucinating.

They'd made up account numbers because real ones didn't fit their models, and they hadn't told anyone. That's why we couldn't match 30 accounts across our systems. Finance hallucinates. Maybe they're AI.

A real example: restructuring 700 people in seven weeks

Let me tell you what this looks like in practice.

Last year, a new CRO. Announced in April. We'd planned the year in January with a consistent territory structure we'd spent three years building. Then in April, he told me he thought we had inefficiencies in how we ran the business, that our geographic approach might be wrong, and that he wanted the go-to-market organization restructured by July.

Seven weeks to do something that normally takes six months.

First thought: Oh no. Second thought: this is exactly the kind of thing I love.

We had to figure out whether geographies, industries, or segments should be the organizing principle. All three have valid arguments. Geography works until you've got one person in Thailand who's somehow also supposed to be an industry specialist.

Industries work until you realize we're huge in financial services, and Wells Fargo alone spent $400 million on AI infrastructure, so we're nowhere near saturated. Segments work when the long tail is companies like British Airways and HSBC.

Then there's decisioning. Which industries have growth potential? Which has size? Are we actually penetrating them? Can a team in one country effectively manage an account in Japan, another in the US, and another in Germany, with all the language and procurement differences?

Procurement in France, for example, is nothing like procurement in the UK. In France, procurement is a senior engineering role. Walk into a procurement meeting in Paris expecting to talk numbers, and they'll quiz you on specifications. You have to understand how countries work.

Then productivity. I spend a lot of time on this. What's the best use of $100,000? US reps are expensive but sell big. India is more affordable, and some of our India AEs sell as much as our best Europeans at a fraction of the cost. Switzerland is terrifying. I need my Swiss teams to be more productive than two Americans each just to maintain the same ratios.
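The $100,000 question reduces to a simple ratio: bookings generated per fully loaded dollar of rep cost. A back-of-envelope version, with every figure invented for illustration:

```python
# Back-of-envelope productivity comparison. All costs and bookings
# are made-up numbers; plug in your own.

reps = {
    "US":          {"cost": 300_000, "bookings": 2_400_000},
    "India":       {"cost": 100_000, "bookings": 1_500_000},
    "Switzerland": {"cost": 450_000, "bookings": 2_700_000},
}

for region, r in reps.items():
    ratio = r["bookings"] / r["cost"]
    print(f"{region}: ${ratio:.1f} of bookings per $1 of rep cost")
```

With these toy numbers, India returns $15 of bookings per dollar, the US $8, Switzerland $6, which is exactly why the Swiss teams have to outsell everyone just to hold the same ratios.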

We were in a good position because we had years of data products already built. Some were formal systems (Gainsight for customer health, structured productivity models, forecasting pipelines, long-range planning, quota models). Some were Excel files on my laptop, which I'd built three years earlier. Either way, they were documented and usable.

We blended those data products into scenario models. Within a couple of weeks, we could look at potential around every account and every resource. My Excel file broke the machine every time I ran it, but I could model scenarios, build new tabs, and sit down with the CRO to say: Here are three options. I think option one is best. You might like option two. You'll hate option three, but it saves the most money.
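The blending itself can be sketched as a scenario comparison: each option pulls numbers from several data products (account potential, headcount cost, coverage), and a scoring rule makes the trade-offs explicit. Every figure and the penalty weight below are invented; the shape of the comparison is the point.

```python
# Illustrative scenario model blending outputs from several data
# products. All revenue, cost, and gap numbers are hypothetical.

scenarios = {
    "option_1": {"projected_revenue": 1.55e9, "cost": 2.10e8, "coverage_gaps": 3},
    "option_2": {"projected_revenue": 1.52e9, "cost": 2.00e8, "coverage_gaps": 5},
    "option_3": {"projected_revenue": 1.45e9, "cost": 1.80e8, "coverage_gaps": 11},
}

def score(s, gap_penalty=5e6):
    # Net upside after cost, minus a penalty for each account
    # the structure would leave poorly covered.
    return s["projected_revenue"] - s["cost"] - gap_penalty * s["coverage_gaps"]

ranked = sorted(scenarios, key=lambda k: score(scenarios[k]), reverse=True)
```

The model doesn't make the decision; it makes the options comparable so a human can argue about the penalty weights in the CRO's office.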

We went somewhere between options one and two. We built a structure with industry units spanning the world, geographic units with AEs who specialize in industries within their region, and a dedicated emerging markets unit for LatAm, the Middle East, Africa, and Asia, where we needed a scrappier partner-led approach rather than our usual streamlined enterprise model.

Eighteen months in, some units are delivering major benefits, others have raised hard questions about whether we got the split right. That's fine. The point is, we built fast because we had data products to build with.

Six rules for working with AI

AI is useful. AI is also a monster. Here are the rules I work by.

One: No single platform does everything well at scale. If you're small, something might cover most of what you need. As you grow, you'll find platforms are great at one thing and bad at others.

Watch out for vendor lock-in, especially when a vendor holds your data and gives you back only the outputs. We saw this with conversation intelligence tools that ingest your calls but won't hand back the raw transcripts. Now you can't do anything else interesting with that data.

Two: protect your IP. Your IP, both as a business and as a RevOps team, is extremely valuable. Understand where your data and your data products live. If the data product sits inside someone else's platform, it's their data product now.

Three: lead with use cases. Use cases drive value, and value drives adoption. Build for fun, and nobody will use it. Build for the question "what's in it for me?" and people will. Then find parallel use cases that reuse the same data product. The most interesting companies are blending your data with genuinely new approaches.

Four: AI needs knowledge, and should be treated like an unfriendly, possibly incompetent coworker. Would you give a toddler Salesforce admin access? No. Would you trust an AI with full automation across your systems and delete permissions?

Jason Lemkin of SaaStr recently did something similar with vibe coding and discovered his production marketing database had been wiped. Scope what you give AI, define it tightly, test everything, and establish trust gradually. There's a documentary about a guy called John Connor you should watch. Search for Skynet.

Five: share knowledge carefully. Build a knowledge hub that combines data products and feeds them to stakeholders. In my business, the knowledge hub is a person, Jessica Lee, who controls the critical Excel files that go to different teams. It can also be a system. Either way, retain control of what you value.

Six: build for speed. Unstructured data is slow. Cool, but slow. Passing a thousand PDFs through an LLM one by one takes ages. Imagine asking someone to go into Waterstones and find the book with the best cacio e pepe recipe.

They'd know roughly which section to look in, but they'd still be pulling books off shelves for hours. Structure matters, even for unstructured content. Vectors help: they turn text, images, and PDFs into numerical representations you can compare against each other, stored in a database built for similarity search.

That's what a lot of the best vendors are doing under the hood.
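The core trick under the hood is simple: embed everything once, then comparison is cheap arithmetic. A minimal sketch with hand-made three-dimensional vectors (a real system would use an embedding model and thousands of dimensions):

```python
# Toy similarity search. Vectors here are hand-crafted stand-ins for
# real embeddings; only the comparison mechanics are genuine.

import math

def cosine(a, b):
    """Cosine similarity: how aligned two vectors are, ignoring length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

docs = {
    "contract_a": [0.9, 0.1, 0.0],
    "contract_b": [0.8, 0.2, 0.1],  # terms similar to contract_a
    "cookbook":   [0.0, 0.1, 0.9],  # the cacio e pepe recipe
}

query = [0.85, 0.15, 0.05]  # "contracts that look like this one"
best = max(docs, key=lambda d: cosine(query, docs[d]))
```

The cookbook scores near zero against a contract-shaped query, which is the whole point: no shelf-by-shelf search, just one comparison per document.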

What we're actually building

We're a knowledge and AI platform company with a long history in data, so we're eating our own dog food. Our data lakehouse in the cloud, called Transcend, holds structured, semi-structured, and analytical data. Our team built a Model Context Protocol (MCP) server so AI can talk to our systems and rapidly convert natural-language queries into SQL.

On top of that, our enterprise architecture team built an interface with automation and an LLM (we use Claude). It pulls from Salesforce, ServiceNow, parts systems, telemetry, ERP, EPM, ZoomInfo, annual reports from the web, and content from Seismic, blending it into a structured context layer.

Three live applications as of Q4:

An automated account planning engine. Every five to twenty minutes, my phone pings with another email as the engine works through each customer, pulling telemetry, service cases, 10-K reports, investor materials, and behavioral data, then inferring what the account team needs for planning.

A week's work of research delivered in minutes. It won't replace account planning, and plenty of its suggestions will be wrong. What it does is kickstart the process.

Vectorized contracts. I can compare thousands of contracts in seconds. "Show me contracts that look like this one" returns instant results. That unlocks use cases for renewals and customer success teams around what customers are contracted for versus what they're actually doing.

Automated business value assessments. Our customer solution architect team has built a system where the account team talks to Claude, Claude asks for specific documents, and it returns an assessment of the value the customer is getting and the areas similar customers exploit that they don't. This used to take months.

None of this is built by our go-to-market team. Enterprise architecture stood it up with domain experts contributing context. My team and stakeholders from ops translated the business context into prompts. We're editing IT's original prompts to include messaging for 2026 and specific business logic. Scrappy, experimental, no formal process. We pick a deadline, release, and iterate.

Where this leaves you

If you take one thing away, make it this: stop building dashboards as your main output and start architecting the signals your business runs on.

Build data products. Document them. Keep ownership of what matters. Feed AI structured, high-quality knowledge and treat it with appropriate suspicion. Lead with use cases that have clear value. Be the person who translates business complexity into systems people can actually use.

The monster is coming. You can either feed it well and ride it, or you can keep pulling all-nighters building dashboards until the tool that replaces you ships next quarter.

I know which one I'd pick.