What I’ve learned writing 20+ case studies for dbt Labs

The data issues companies face, what drives them to change course and what can turn data teams into profit centres.

Barbara Galiza
Sep 05, 2024
I took on my first “fully content” project nearly two years ago. I wanted to flex my writing muscles. dbt Labs wanted someone who got data and could explain the value of their product to COOs and CFOs. Fast forward to today and I have ~30 customer stories written for dbt Labs, with featured clients ranging from Nasdaq to McDonald’s to Whatnot.

Did I improve my writing along the way? For sure. Most importantly, I learned a lot about the data industry.  

In practice, that meant conducting dozens of interviews with companies of varying sizes about how they struggle with data and what they're doing to fix those issues.

Case studies are a window into understanding the problems a company faces

These studies are more than just stories; they're a window into the real-world struggles and triumphs of companies dealing with data. In this writing process, I’ve learned:

  1. The Puzzles Companies Face: What specific data problems do businesses encounter and how do the companies themselves understand/communicate said problems?

  2. The Moment of Action: What (internal or external) triggers a company to seek out new tools or workflows to solve these problems?

  3. Evaluating the Toolbox: What criteria do businesses have to evaluate different data solutions in the search for the right fit?

  4. Defining Success: What does data success look like?

In this article, I cover two of the problems organisations face (poor data quality and a disconnect between marketing and data teams) and their consequences.

Don’t forget to subscribe to 021 Newsletter to learn more about the intersection between marketing and data.

The ubiquitous problem: poor data quality creates a shaky foundation for businesses

Data quality is a pervasive problem, brought up by nearly every (if not every) customer I interviewed. It has a domino effect, creating consequences that affect the productivity of the data team and how the whole organisation leverages data.

Poor data quality most commonly shows up as:

  • Unstable pipelines: Data pipeline breakdowns often go undetected when there’s no alerting or observability in place. This leads to decisions made on faulty data (because dashboards or metrics are incorrect) or to missing historical data. A minimal example of the kind of check that catches this follows this list.

  • Metric discrepancies: When different teams calculate essential metrics like Monthly Recurring Revenue (MRR) or activation rates differently, it's like they’re speaking different languages – nothing aligns.  
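To make the first point concrete, here is a sketch of the kind of basic freshness and volume check that catches silent pipeline breakage before anyone reads a stale dashboard. It’s illustrative only: the table name, thresholds and alerting destination are my assumptions, not anything from dbt Labs or its customers.

```python
# A minimal sketch of a freshness / volume check that catches silent pipeline
# breakage. The table name and thresholds are illustrative assumptions.
from datetime import datetime, timedelta, timezone


def check_table_health(
    table: str,
    last_loaded_at: datetime,
    row_count: int,
    expected_min_rows: int,
    max_staleness: timedelta = timedelta(hours=24),
) -> list[str]:
    """Return human-readable alerts; an empty list means the table looks healthy."""
    alerts = []
    now = datetime.now(timezone.utc)
    if now - last_loaded_at > max_staleness:
        alerts.append(f"{table}: no new data since {last_loaded_at:%Y-%m-%d %H:%M} UTC")
    if row_count < expected_min_rows:
        alerts.append(f"{table}: only {row_count} rows loaded, expected at least {expected_min_rows}")
    return alerts


# Example: yesterday's ad spend load arrived, but with far fewer rows than usual.
alerts = check_table_health(
    table="stg_ad_spend",
    last_loaded_at=datetime(2024, 9, 4, 6, 0, tzinfo=timezone.utc),
    row_count=120,
    expected_min_rows=5_000,
)
for alert in alerts:
    print("ALERT:", alert)  # in practice this would go to Slack or an observability tool
```

In a dbt project, much of this lives in tests and source freshness checks rather than a standalone script; the point is simply that the check exists and runs before anyone trusts the dashboard.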

Poor data quality wastes everyone’s time and erodes all trust in the data

Consequence #1: Lack of trust leads to ditching the data altogether

Poor data quality creates significant trust issues among stakeholders. In my experience as a consultant, this pushes marketing teams to either:

  • Get their own data, manually exporting from multiple sources (ad platforms, etc.) and hacking together solutions, like spreadsheets, that can’t be audited or scaled. In the best case the data is correct, but it takes loads of time to produce. More often, because the process is so manual and error-prone, it leads to faulty data anyway.

  • Ditch data altogether. When data is “broken”, some marketers are more than happy to lean on their hunches, intuition or best practices. Defining strategies can become political, since it’s all “just opinions”. Decisions take ages.

Consequence #2: Rabbit hole searches and reactive troubleshooting

When data is needed but quality is poor, the pursuit of accuracy turns into a time-consuming endeavour. Analysts find themselves in an endless loop of verification, slowing down decision-making. It’s like chasing a rabbit down a hole; you keep going deeper without getting closer to the solution.

The kind of message that gives analysts PTSD

Consequence #3: Slow value add, which then leads to stack changes

The situation creates a loop: data analysts are stuck investigating why the data says what it says, while data engineers reactively fix pipelines. The data team as a whole starts to move painfully slowly, adding little value to the business. This can be the straw that breaks the camel’s back, pushing the team to search for a more scalable solution: a new data platform, a new data stack and a new way of working.

The hidden problem: data teams are not working closely enough with marketing on their needs and use cases

The data team identifies the data quality problem and goes searching for a solution. dbt Cloud, with engineering best practices like CI/CD and observability, often fits the bill. The team then picks complementary tools to form its “modern data stack” and starts building models for dashboards, reporting, etc.

Even though marketing is one of the core departments responsible for making decisions based on the data, they’re often not involved at all in the stack selection phase. (And I’m focusing on marketing here because that’s my area of expertise, but other business stakeholders—from product to sales—are affected similarly.)

The lack of marketing involvement delays value creation (e.g. capabilities like reverse ETL and custom, auditable attribution models are only added much later), but it also has additional consequences for the data team and the organisation as a whole:

Consequence #1: A poor job showcasing the business value of data

Data teams often measure success through technical metrics like warehousing costs and uptime. However, to transition from being seen as cost centres to profit generators, these teams need to align closely with business objectives. This is more important now than ever, due to the post-ZIRP / layoffs phase we’re in.

When data teams work more closely with the business, they can directly drive and demonstrate revenue uplift with activities like:

  • Building churn prediction models: increasing ARPU and overall revenue

  • Customising customer onboarding: increasing conversion to purchase and decreasing CAC

  • Loading data signals for marketing and sales tools: decreasing CPC or CPLs by leveraging data for audience targeting and custom messaging

  • Improving decision making: leveraging custom attribution models and audience segmentation for allocating budget with confidence

  • Maintaining marketing data quality: generating valuable insights by guaranteeing marketing data quality with products like naming convention validators and discrepancy dashboards.
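As an example of that last item, here is a minimal sketch of a campaign naming convention validator. The convention itself (channel_region_objective_month) is made up for illustration; any real team would define its own.

```python
# A minimal sketch of a campaign naming convention validator.
# The convention (channel_region_objective_month) is a made-up example.
import re

# e.g. "paidsearch_uk_prospecting_2024-09"
NAMING_PATTERN = re.compile(
    r"^(paidsearch|paidsocial|display|email)"  # channel
    r"_[a-z]{2}"                               # region (two-letter country code)
    r"_(prospecting|retargeting|brand)"        # objective
    r"_\d{4}-\d{2}$"                           # launch month
)


def invalid_campaign_names(campaign_names: list[str]) -> list[str]:
    """Return the campaign names that break the naming convention."""
    return [name for name in campaign_names if not NAMING_PATTERN.match(name)]


campaigns = [
    "paidsearch_uk_prospecting_2024-09",
    "FB Summer Sale!!",  # breaks the convention
    "paidsocial_de_retargeting_2024-08",
]
print(invalid_campaign_names(campaigns))  # ['FB Summer Sale!!']
```

Run on a schedule against the campaign names coming out of the ad platforms, a check like this is what keeps downstream attribution and reporting joinable.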

Data teams need to work with business teams (e.g. marketing) to turn into profit centres

Consequence #2: Troves of valuable data just sit there

Many companies sit on a goldmine of data without realising its potential. Below are two examples of how data can directly generate revenue:

  • HR SaaS Deputy successfully monetised its data by selling insights to consultancy firms. Data selling is becoming more common with the growth of AI.

  • In my first growth role at Her, we leveraged user data to create compelling PR narratives. 

Anonymised data can be a valuable asset, not just for marketing and strategic decision-making, but also for direct monetisation.

Learn more about how to turn data into business value by subscribing to the newsletter (for free).

Consequence #3: Unoriginal analysis because domain experts don’t get a chance to contribute

One principle I firmly believe in is empowering domain experts with data access and participation in data transformation. These experts are the keys to unlocking insights that can significantly impact revenue. It's like giving a master chef the best ingredients; they know best how to create a masterpiece.

Let’s look at marketing in particular. Marketing data is not maths; it’s analytics. Analytics is about context and storytelling, and the most original and powerful analysis comes from those with expertise in the topic.

Tools like Google Analytics offer event-level data that marketers can model and explore. Much better than what’s available in the UI
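To give a sense of what “event-level data that marketers can model” looks like in practice, here is a minimal sketch of querying the GA4 BigQuery export. The project and dataset names are placeholders; the events_* tables, traffic_source fields and _TABLE_SUFFIX filter are part of the standard export schema.

```python
# A minimal sketch of querying event-level GA4 data from its BigQuery export.
# Project and dataset names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")

query = """
    SELECT
      event_date,
      traffic_source.source AS first_touch_source,
      COUNT(DISTINCT user_pseudo_id) AS purchasers
    FROM `your-gcp-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240801' AND '20240831'
      AND event_name = 'purchase'
    GROUP BY event_date, first_touch_source
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.first_touch_source, row.purchasers)
```

From there, a marketer comfortable with SQL can segment, rename and join this data in ways the GA interface simply doesn’t allow.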

Obviously, finding the sweet spot between governance and accessibility is crucial. Over-restricting data access can stifle analysis, while too much freedom can lead to data quality issues and redundancy. 

It's about setting the right guardrails, with data contracts and governance in place, to maintain data integrity while enabling meaningful business user involvement.

Conclusion: data quality issues must be tackled and teams must work together to drive business value

Data quality should be tackled by the data team. But, after this step is completed, teams need to work together. Educating marketing stakeholders on data, like teaching them SQL and the “whys” behind their data sources, is the best way forward. A marketing data consultant or agency, cough me cough, can also bridge this gap.

I’d like to end this piece with a shoutout to Sofia from dbt Labs, who has been ace at managing the case study project with multiple freelancers and tens of customers.

And for those eager to dive deeper, there's a whole world of data tools out there, like Snowflake, Snowplow, Fivetran, and Hex, each with its own set of fascinating case studies that you can explore.

And, if you enjoyed this article and you want to read more about the intersection of marketing and data, don’t forget to subscribe for free. 021 Newsletter publishes roughly one article a month on all things marketing data.
