Data mesh without the theatre

Most data mesh efforts fail.

If your board is asking what is data mesh, the useful answer is not a four-pillar diagram. It is a working data product: one owner, one contract, one catalogue entry, shipped in the stack you already paid for.

Built by Data Dune — BI consultants for Pfizer, McKinsey, Siemens.
See how it works ↓
7 traits of a real data product
3 practices that make it real
6 Monday steps before buying tools

"[They] consistently brought strategic thinking on architecture, data mesh, governance, and cost optimization — reducing our dashboard delivery from weeks to days."

— Simon Skurikhin, Data Engineer @ Pfizer

Sound familiar?

The idea is not wrong. The starting point usually is. Most teams optimise for architectural purity while the people opening Tableau are still asking which number they can trust.

"Six months in, page one of the contract template is still blank."
Process-heavy mesh turns into thirty-page templates, quarterly councils, and nobody brave enough to ship the first domain. Governance becomes a waiting room.
"Decentralised. Federated. Polyglot. Lovely. What number do I use?"
Theoretical mesh sounds clever in steering committees and useless on Monday morning. Three words your Sales VP will never say do not fix three definitions of Active Opportunity.
"We'll do mesh after the warehouse migration."
Replatforming-first mesh is how good ideas go to die. This works on your warehouse, your BI tool, Salesforce, Data Cloud, Tableau, Agentforce, and whatever else finance already signed for.

What is data mesh, actually?

Treat data like a product. Owned by the domain that produces it. Discoverable by everyone else. Then keep the implementation brutally small: catalogue, contracts, ownership, one domain at a time.

// six Monday steps
01
Pick one domain
Clean cut. Recognisable. At least two stakeholder groups already consuming the data.
02
Name one owner
A person, not a team alias. Someone who can explain, approve, and answer the phone.
03
Write one contract
Schema, refresh, SLA, consumers, owner. One screen. YAML in Git if you want it tidy.
04
Add one catalogue entry
Confluence is fine. Data Cloud catalogue is fine. The point is one link, not another portal safari.
05
Version it and keep a changelog
Every schema change bumps a version and lands in the changelog. Trust is not never change — it is never surprised.
06
Stop, gather feedback, repeat
Live with it for three weeks. Listen to consumers. Fix the rough edges. Then package it for the next domain.
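The one-screen contract from step 03 can look like this. Every product name, field, and value below is an illustrative sketch, not a template your team must adopt:

```yaml
# Illustrative one-screen contract: names, values, and fields are
# examples, not a standard.
product: sales_pipeline
version: 1.2.0
owner: jane.doe@example.com      # a person, not a team alias
refresh: hourly
sla: available by 07:00 local, Monday to Friday
consumers:
  - weekly pipeline call
  - finance forecast
schema:
  opportunity_id: string         # shared key, same format as CRM
  active_opportunity: boolean    # the one agreed definition
  amount: decimal
security:
  row_level: reps see their own region
  pii_fields: []
```

YAML in Git means every change is a diff someone reviewed, which is the whole point of step 05.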
VALUABLE
decision
Names the consumer and the decision it supports. If nobody can name both, bin it before it becomes another unused dashboard.
"Sales forecast used by the weekly pipeline call"
DISCOVERABLE
catalogue
One catalogue entry, one link, one source of truth. People should find the product before they rebuild it badly.
Catalogue + contract, not Slack archaeology
ADDRESSABLE
stable URI
Has a permanent, unambiguous address — a URI, a table reference, a catalogue link. People and pipelines point at it without "which one was the right one again?".
stable://sales_pipeline/v1.2 — same address next quarter
TRUSTWORTHY
contract
Breakage is announced, not discovered. Trust is not never change — it is never surprised.
Versioned contract + changelog
SELF-DESCRIBING
schema + semantics
Carries its own schema, field meanings, and calculation notes. Consumers understand what "Active Opportunity" means without ringing the owner at 9pm.
Schema + field descriptions + calculation notes shipped with the data
INTEROPERABLE
shared keys
Plays nicely with other products. Shared dimensions like customer_id line up across domains so cross-product questions don't need a migration project.
Same customer_id format in CRM, Service Cloud, and Finance
SECURE
access by contract
Access rules live in the contract from day one — not bolted on after a board scare. PII, regulated fields, and row-level security are stated, not assumed.
Row-level security in the contract; PII fields tagged and masked
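What "versioned contract + changelog" can mean in practice, as a hedged sketch; the versions, dates, and changes below are invented for illustration:

```yaml
# Illustrative changelog entries: versions, dates, and changes are invented.
- version: 1.2.0
  date: 2025-01-15
  breaking: false
  change: added stage_age_days column; no consumer action needed
- version: 1.1.0
  date: 2024-11-02
  breaking: true
  change: renamed opp_id to opportunity_id
  notice: consumers told two weeks before release
```

The `breaking` flag and the advance notice are what turn "never surprised" from a slogan into a habit.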

What this costs your team right now.

The Sales Pipeline war story is the proof. This was not a slow-query problem. It was a leadership team that quietly stopped using the platform.

14 → 1
dashboards collapsed into one certified contract-backed product
9 → 1
builders replaced by one named owner everyone could find
3 → 1
conflicting definitions of Active Opportunity made boring
0 → 1
catalogue entries, backed by a contract and a changelog
Before: dashboard sprawl
Sales pipeline reporting: 14 dashboards
People editing logic: 9 builders
Active Opportunity: 3 definitions
Named owner: 0 people
Executive behaviour: Sales VP in Excel
After: data product pilot domain
Sales pipeline reporting: 1 contract
Accountability: 1 owner
Active Opportunity: 1 definition
Findability: 1 catalogue entry
Executive behaviour: Sales VP back
6 weeks
to first working domain. The Sales VP came back. Excel binned.

Questions directors face every week.

Data mesh only matters if it changes decisions. Here is the boardroom version: fewer arguments about definitions, more confidence in the numbers, and less begging central data teams for every answer.

Director of Analytics
"Which dashboards are safe to use for the Monday revenue meeting?"
Start with catalogue coverage and certified contracts. The answer becomes a short list of owned products, not a folklore tour through abandoned workbooks.
catalogue · contracts
VP of Data
"How do we scale ownership without turning governance into a ministry?"
Domains own the data. Platform owns the rails. Central teams provide templates, contract standards, and guardrails; they do not approve every bloody field change.
ownership · platform
Head of BI
"Why are we still rebuilding the same metric in five places?"
Because nobody made the trusted version discoverable and contract-backed. One product with listed consumers stops duplication before it becomes another Tableau junk drawer.
contracts · catalogue
Chief Data Officer
"Are we AI-ready or just putting agents on top of dodgy data?"
Agentforce, Copilot, and custom agents multiply the blast radius of bad data. Contract-backed products give agents the same trusted definitions your humans use.
AI-ready · ownership

Start with the cheatsheet. Go to production with us.

The free playbook helps your team answer what is data mesh without buying another platform. The paid work turns one domain into a working proof people can copy.

Free cheatsheet
Free
One-page data product checklist
Seven-trait scorecard to grade any dataset before you call it a product
Three practices that work without buying a new platform
Six concrete steps your team can run on Monday morning

1-day discovery workshop
Pick the pilot
Map candidate domains and consumers
Find the first valuable data product
Agree owner, contract shape, and catalogue home
Pilot-domain selection with your stakeholders
Contract template adapted to your stack
Works with your warehouse and BI tool

6-week pilot delivery
One domain, end-to-end
Everything from discovery
One working contract-backed data product
Catalogue entry, changelog, and ownership handoff
Data Dune coaches your team while shipping
Team coaching over six weeks
Production implementation included
Package the playbook for the next domain
Executive readout with next-step roadmap

Questions buyers ask.

The practical bits: replatforming, ownership, regulated data, and how quickly you can get something real into the hands of one domain.

What is data mesh, in one paragraph?

Data mesh treats important datasets and dashboards as products: owned by the domain that understands them, described by a clear contract, and discoverable by everyone else. The grown-up answer to what is data mesh is not decentralisation theatre; it is a working pattern that makes trusted data easier to find, change, and use.

We're not Salesforce-shaped — does this still apply?

Yes. Salesforce, Data Cloud, Tableau, and Agentforce are one example stack, not the only stack. The same pattern works with Snowflake, BigQuery, Databricks, Power BI, Looker, dbt, APIs, and the BI tool your teams already open every morning.

Do we need to replatform first?

No. Please do not make this another two-year migration dependency. Start with one domain and one product on what you already own. If the pilot proves a platform gap, fix that gap with evidence instead of vibes.

How small can we start?

One dataset or dashboard. One owner. One contract. One catalogue entry. Use it for three weeks, listen to the consumers, then repeat. If nobody is already using the data, pick a different domain; empty demand makes fake products.

Who owns what — domain teams or central platform team?

Domains own the data because they understand the meaning, trade-offs, and changes. The platform team owns the rails: templates, infrastructure, shared governance, observability, and the standard that keeps domains from inventing chaos in five dialects.

How is this different from a data catalogue tool?

A catalogue tool is a place to list things. Data mesh is the operating model that makes the listed things worth trusting. You still need ownership, contracts, consumers, versioning, and a habit of announcing changes before people find broken dashboards.

What about regulated data, GDPR, or SOX?

Regulation makes ownership and contracts more important, not less. A contract should record classification, allowed consumers, retention expectations, lineage, and change controls. The pilot does not bypass governance; it makes governance concrete enough to inspect.
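A hedged sketch of how those records might sit in the contract itself; the field names and values are illustrative examples, not a compliance standard:

```yaml
# Illustrative governance fields for a contract; names and values
# are examples, not a compliance standard.
classification: confidential           # e.g. public / internal / confidential
allowed_consumers:
  - finance forecast                   # a named list, reviewable at audit
retention: 7 years
lineage: crm.opportunities -> warehouse.sales_pipeline
change_control: two weeks' notice for breaking changes
pii_fields:
  - contact_email                      # tagged, masked for most consumers
```

An auditor can read this in one sitting, which is more than can be said for most governance wikis.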

How long until we see results?

A focused workshop can pick the right pilot in a day. A six-week delivery can produce one working domain with a contract, owner, catalogue entry, and handoff. That is fast enough to prove the pattern and slow enough not to ship bollocks.

A pattern takes weeks.
Trust takes a working contract.

Take the cheatsheet. Pick one domain. If you want the first data product shipped properly, Data Dune will help you build it with your team, in your stack.

Download the data mesh cheatsheet

Talk to us about implementation
No replatforming sermon. No governance theatre. One useful domain first.