How to Run a CDP Assessment: The Practitioner's Playbook
A step-by-step playbook for running a CDP assessment. 13 sessions across data, product, marketing, and tech teams. What to cover, who to involve, and what to deliver.

Picture this. Your VP of Marketing wants a CDP. Your data engineer wants to know what systems it needs to connect to. Your legal team wants to know what data you'll be storing. And your CEO wants to know what it costs and what you get back.
Everyone is asking a different question. Nobody has a shared process.
That is the problem most CDP assessments never solve. They jump straight to vendor demos and feature comparisons. Then, six months into implementation, they hit the walls that a proper assessment would have flagged on day one.
This playbook gives you a structured CDP assessment process. Thirteen sessions. Clear owners. Defined outputs. It works for organizations doing a first-time CDP evaluation and for those questioning whether an existing tool is still the right fit.
Why Most CDP Assessments Fail Before They Start
The standard approach goes like this. Someone builds a feature matrix. Vendors get invited to demo. Scores get tallied. A winner gets chosen.
That process has a flaw. It evaluates the tool before you understand your organization.
A CDP is not a plug-and-play solution. It connects to your data sources, your activation channels, your team workflows, and your governance policies. If any of those are broken or misaligned, the best CDP on the market will underperform.
Best practice is to assess your organization first, your data second, and your vendor options third.
This playbook follows that order.
Who Needs to Be in the Room
A CDP assessment is not a marketing project. It is a cross-functional process. You need four groups involved.
Marketing and growth. They define the use cases. What decisions do you want to make faster? What campaigns do you want to run that you cannot run today?
Data and engineering. They know what data exists, where it lives, and how hard it is to move. Their input shapes your technical requirements.
Product. If your product generates behavioral data, product needs a seat at the table. Especially for SaaS and e-commerce companies.
Legal and compliance. Data collection and activation touch privacy law. Bring them in early, not at the end.
If you skip any of these groups, you will find out why they mattered during implementation.
The 13-Session CDP Assessment Process
Structure your assessment across four phases. Each phase builds on the last. Do not skip ahead.
Phase 1: Organizational Readiness (Sessions 1 to 3)
This phase answers one question. Are we actually ready to implement a CDP?
Session 1: Use Case Definition Workshop
Bring marketing, growth, and product together. Ask them to write down the top five things they wish they could do with customer data that they cannot do today.
Group the responses into themes. Common ones include: unified customer profiles, real-time personalization, suppression lists for paid media, lifecycle triggers, and churn prediction.
Rank those use cases by business impact and implementation complexity. This becomes your north star for the rest of the assessment. Every decision you make should trace back to these use cases.
Output: A ranked use case list with at least three "must-have" outcomes defined.
Session 2: Organizational Readiness Audit
This is the session most teams skip. Do not skip it.
Ask each stakeholder group to answer honestly.
Marketing: Do you have a documented customer lifecycle model? Do you know what a "resolved" customer identity looks like for your business?
Data and engineering: How many source systems contain customer data today? What is your current identity resolution approach? Do you have a data dictionary?
Legal: Have you completed a data inventory? Do you have consent management in place? What jurisdictions apply to your customers?
Leadership: Is there a named owner for this project post-implementation? Is there budget for ongoing operations, not just the license?
If more than half of these answers are "no" or "we're working on it," document that. It does not mean you stop. It means you know what needs to happen before go-live.
Output: Organizational readiness scorecard with clear gaps identified.
Session 3: Measurement Model Definition
Before you pick a tool, decide how you will measure its success.
This sounds obvious. Almost nobody does it before vendor selection.
Define your primary metric. Is it resolved customer profiles? Email revenue per recipient? Time to first activation? Customer lifetime value by segment?
Pick one or two. Write them down. Attach a baseline number to each. You cannot measure CDP ROI without a starting point.
Output: Two to three defined success metrics with current baseline values.
Phase 2: Data and Technical Assessment (Sessions 4 to 7)
This phase answers the question: what data do we actually have, and what will it take to move it?
Session 4: Data Source Inventory
Pull together every system that contains customer data. CRM, email platform, e-commerce platform, mobile app, website, support tool, POS system. All of it.
For each source, document three things. What customer identifiers does it use? How frequently does it update? Who owns it?
You are looking for two things. First, how many identifiers exist across systems. Second, how much overlap exists between them. This tells you how hard identity resolution will be.
If you have five different email formats across five systems and no shared customer ID, your implementation timeline just got longer. Better to know now.
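If you want to quantify that overlap rather than eyeball it, a small script goes a long way. Below is a minimal sketch in Python, assuming you can export a sample of email identifiers from each system. The system names and sample values are illustrative placeholders, not a reference to any specific stack.

# Sketch: estimate identifier overlap between customer data sources.
# All system names and sample values below are hypothetical.

def normalize_email(email):
    """Lowercase and trim so trivially different values can match."""
    return email.strip().lower() if email else None

sources = {
    "crm":       {"jane@example.com", "John.Smith@example.com"},
    "ecommerce": {"JANE@EXAMPLE.COM ", "maria@example.com"},
    "support":   {"john.smith@example.com", "lee@example.com"},
}

# Normalize before comparing, otherwise overlap is understated.
normalized = {
    name: {normalize_email(e) for e in emails}
    for name, emails in sources.items()
}

# Pairwise overlap: shared identifiers as a share of the smaller set.
names = sorted(normalized)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        shared = normalized[a] & normalized[b]
        ratio = len(shared) / min(len(normalized[a]), len(normalized[b]))
        print(f"{a} <-> {b}: {len(shared)} shared ({ratio:.0%} of smaller set)")

Low overlap between your highest-value sources is the earliest warning that identity resolution will dominate the implementation timeline.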
Output: Data source inventory with identifier mapping and gap analysis.
Session 5: Data Quality Review
Take a sample from your top three customer data sources. Run a basic quality check.
Look at completeness. What percentage of records have a valid email? A phone number? A name?
Look at consistency. Does "John Smith" appear the same way across systems or as twelve variations?
Look at freshness. When was the last update on your oldest active customer records?
Data quality problems are not blockers. They are information. They tell you how much remediation work sits between today and a functional CDP. Build that into your timeline and budget.
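All three checks are simple enough to script against a sample. Here is a minimal sketch, assuming you can pull records into a list of dictionaries; the field names and sample data are assumptions for illustration, and real consistency checks on names usually need fuzzy matching beyond what fits here.

import re
from datetime import datetime, timedelta

# Hypothetical sample pulled from one source system.
records = [
    {"email": "jane@example.com", "phone": "+1-555-0100",
     "name": "Jane Doe", "updated_at": "2024-11-02"},
    {"email": "", "phone": None,
     "name": "J. Doe", "updated_at": "2019-03-14"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def completeness(rows, field, valid=lambda v: True):
    """Share of rows where the field is present and passes the check."""
    return sum(1 for r in rows if r.get(field) and valid(r[field])) / len(rows)

print(f"valid email: {completeness(records, 'email', EMAIL_RE.match):.0%}")
print(f"phone:       {completeness(records, 'phone'):.0%}")
print(f"name:        {completeness(records, 'name'):.0%}")

# Freshness: share of records untouched for more than two years.
cutoff = datetime.now() - timedelta(days=730)
stale = sum(1 for r in records
            if datetime.strptime(r["updated_at"], "%Y-%m-%d") < cutoff)
print(f"stale (>2y): {stale / len(records):.0%}")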
Output: Data quality summary for primary sources with remediation estimates.
Session 6: Integration Architecture Review
Bring your data and engineering teams together with a simple question. What would it take to pipe each of these data sources into a central platform?
Document the connection type available for each source. Native connector, API, file export, or custom build. Note which sources require real-time streaming versus batch.
Also document your activation destinations. Where do you want data to flow out? Email platforms, paid media, CRM, personalization engines. These outbound connections matter as much as inbound.
This session is where you discover the technical debt hiding in your stack. A system running on a legacy API with no documentation is a six-week engineering project, not a quick integration.
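It helps to capture the result as a structured map the team can version and argue over. A minimal sketch follows, where every system, connector type, and complexity rating is invented for illustration.

# Sketch of an integration map: sources in, destinations out.
# Systems, connector types, and ratings are illustrative only.
integration_map = {
    "sources": [
        {"system": "crm", "connector": "native",
         "mode": "batch", "complexity": "low"},
        {"system": "mobile_app", "connector": "api",
         "mode": "streaming", "complexity": "medium"},
        {"system": "legacy_pos", "connector": "custom",
         "mode": "batch", "complexity": "high"},
    ],
    "destinations": [
        {"system": "email_platform", "connector": "native",
         "mode": "batch", "complexity": "low"},
        {"system": "paid_media", "connector": "api",
         "mode": "streaming", "complexity": "medium"},
    ],
}

# Surface the expensive work first.
for direction in ("sources", "destinations"):
    hard = [s["system"] for s in integration_map[direction]
            if s["complexity"] == "high"]
    print(f"{direction} needing custom builds: {hard or 'none'}")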
Output: Integration map covering data sources and activation destinations with complexity ratings.
Session 7: Privacy and Compliance Requirements
This session belongs to legal, with support from data and engineering.
Document every regulation that applies to your customer data. GDPR, CCPA, CPRA, HIPAA if applicable. Note the jurisdictions your customers live in.
Then document your current consent management setup. How do you collect consent? Where is it stored? How do you honor deletion requests today?
A CDP will process personal data at scale. You need to know what constraints apply before you evaluate any vendor. Some CDP architectures handle consent natively. Others require external tools. That is a selection criterion, not an afterthought.
Output: Privacy requirements document with consent management gap analysis.
Phase 3: Vendor Evaluation (Sessions 8 to 11)
Now you are ready to look at vendors. You have use cases, data inventory, technical requirements, and compliance constraints. That is your evaluation filter.
Session 8: Requirements Prioritization
Before sending out an RFP, convert your outputs from phases one and two into a requirements list.
Separate requirements into three tiers. Must-have. Should-have. Nice-to-have.
Must-haves are non-negotiable. If a vendor cannot meet them, they are out. Common must-haves include: specific identity resolution capabilities, native connectors for your core systems, and compliance with your applicable regulations.
Should-haves differentiate your shortlist. Nice-to-haves inform scoring but should never drive a decision.
Output: Tiered requirements document used as the RFP foundation.
Session 9: Vendor Shortlisting
Use your requirements list to filter the market. You are not looking for the most popular CDP. You are looking for the best fit for your use cases, your data complexity, and your team's capability.
Do not start with analyst rankings and work backwards. Start with your requirements and see who meets them.
Send your requirements document to four to six vendors. Ask them to self-score against it before any demo. This immediately surfaces misalignment and saves everyone time.
Output: Shortlist of three to four vendors with self-scored requirements responses.
Session 10: Structured Vendor Demos
Run every demo against the same script. This matters. Vendor demos are designed to show strengths. Your job is to test for fit.
Build three demo scenarios based on your ranked use cases. Ask every vendor to walk through the same three. Score them on the same criteria.
Pay attention to what they skip or redirect. If your top use case is real-time event-triggered activation and the vendor keeps steering toward batch segmentation, that is a signal.
Also ask each vendor this question directly: "Tell us about an implementation that did not go well. What went wrong and why?" How they answer this question tells you more than any feature demo.
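To keep the comparison honest, compute a weighted score from the same criteria for every vendor. A minimal sketch, where the criteria, weights, and demo scores are all placeholders you would replace with your own:

# Sketch: weighted vendor scoring against a shared demo script.
# Criteria, weights, and scores are illustrative, not recommendations.
weights = {
    "identity_resolution": 0.35,
    "realtime_activation": 0.30,
    "native_connectors":   0.20,
    "consent_handling":    0.15,
}

# Scores on a 1-5 scale, recorded by the same reviewers for each vendor.
demo_scores = {
    "vendor_a": {"identity_resolution": 4, "realtime_activation": 2,
                 "native_connectors": 5, "consent_handling": 4},
    "vendor_b": {"identity_resolution": 3, "realtime_activation": 5,
                 "native_connectors": 3, "consent_handling": 3},
}

for vendor, scores in demo_scores.items():
    total = sum(weights[c] * scores[c] for c in weights)
    print(f"{vendor}: {total:.2f} / 5.00")

When two vendors land close on totals, the demo notes and documented red flags are what should break the tie, not a decimal point.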
Output: Vendor scoring matrix with demo notes and red flags documented.
Session 11: Reference Checks and Proof of Concept
Do not skip reference checks. Ask for customers similar to your organization in size, industry, and use case complexity.
Ask references three questions. How long did implementation actually take? What did you wish you had known before you started? What would you do differently?
For your top two vendors, run a lightweight proof of concept if budget allows. Take one of your actual data sources. Connect it. Build one real use case. Activate it to one real channel. Four to six weeks of real-world testing is worth more than any demo.
Output: Reference check notes and proof of concept results for finalist vendors.
Phase 4: Decision and Implementation Readiness (Sessions 12 to 13)
Session 12: Total Cost of Ownership Review
Vendor pricing is rarely the full cost. Build a complete TCO model before final selection.
Include the license or platform fee. Include implementation services, either internal engineering time or external consulting hours. Include the cost of data quality remediation identified in session five. Include any infrastructure changes, like upgrading your data warehouse or event collection layer.
Then include ongoing operational costs. Who manages the CDP post-launch? Is that a new hire, an existing team member with added responsibility, or an agency partner? That cost recurs every year.
A CDP that looks affordable at contract signing can be expensive by month twelve if the operational model was never costed.
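A worked sketch of the arithmetic, with every figure invented for illustration:

# Sketch: five-year TCO for one finalist. All figures are placeholders.
one_time = {
    "implementation_services":  120_000,
    "data_quality_remediation":  45_000,  # from the session 5 estimates
    "infrastructure_changes":    30_000,
}
annual = {
    "platform_license": 90_000,
    "cdp_operations":   70_000,  # named owner or agency partner, recurs yearly
}

years = 5
tco = sum(one_time.values()) + years * sum(annual.values())
print(f"{years}-year TCO: ${tco:,}")
print(f"effective annual cost: ${tco / years:,.0f}")

In this invented example, the recurring license and operations lines come to roughly four times the one-time costs over five years, which is exactly the dynamic the paragraph above warns about.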
Output: Five-year TCO model for finalist vendors.
Session 13: Go or No-Go Decision
This is the session where you make the call, as a group, with documented rationale.
Present the use case fit, technical fit, compliance fit, and total cost for each finalist. Show your measurement baseline and confirm everyone agrees on how success will be evaluated.
Then make the decision. Document it. Write down what you expect to achieve in six months, twelve months, and twenty-four months.
If the assessment revealed that your organization is not ready, this session is where you say that out loud. It is better to delay a CDP implementation by ninety days to fix foundational gaps than to launch and spend twelve months fixing them under contract.
Output: Final vendor decision with documented rationale, success criteria, and implementation readiness checklist.
When the Answer Is "Not Yet"
A well-run CDP assessment sometimes produces a "not yet" answer. That is a valid outcome.
If your data quality is poor, your identity resolution strategy is undefined, and you have no named owner for post-launch operations, you are not ready. Buying a CDP will not fix those problems. It will make them more expensive.
The right next step might be a data quality initiative. Or hiring a marketing ops lead. Or completing a consent management implementation first.
Knowing that before you sign a contract is the point of the assessment.
Running This Assessment With Support
If you have the internal capacity to run all thirteen sessions, this playbook gives you the structure to do it. If you do not, or if you want an independent perspective on your organizational readiness and vendor fit, that is exactly what the team at House of MarTech does.
We run CDP assessments as structured engagements, with defined deliverables at each phase and no vendor relationships that influence our recommendations. The goal is always the same: help you make a better decision, faster, with fewer surprises.
The CDP assessment process is not complicated. But it requires discipline, the right people in the room, and a willingness to hear what the data tells you, even when it pushes your timeline back.
Do that work upfront. Your implementation will be better for it.