Integration projects fail because of data, not technology. In our experience, data quality issues add 30-50% to integration timelines and budgets. Yet data quality assessment is often an afterthought in technical due diligence.
Why Data Quality Matters in M&A
Every integration plan assumes data will flow from the acquired system to the buyer's systems. Customer records will merge. Products will synchronize. Financial data will consolidate.
These plans fail when:
- Customer records have inconsistent formats, duplicates, or missing fields
- Product data doesn't map to the buyer's taxonomy
- Financial data has unexplained discrepancies
- Historical data is incomplete or corrupted
The Data Quality Assessment Framework
1. Completeness
How complete is the data?
- What percentage of required fields are populated?
- Are there systematic gaps (e.g., no data before a certain date)?
- How are missing values handled?
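To put numbers on these questions, a field-level completeness profile is usually the first artifact to produce. A minimal sketch, assuming the target's extract has been loaded with pandas; the column names, sample data, and 95% threshold are hypothetical:

```python
import pandas as pd

# Hypothetical extract of the target's customer master (illustrative data only).
customers = pd.DataFrame({
    "customer_id": [1001, 1002, 1003, 1004],
    "email":       ["a@example.com", None, "c@example.com", None],
    "phone":       [None, None, "555-0100", "555-0101"],
    "created_at":  ["2019-03-01", "2021-07-15", None, "2023-02-20"],
})

# Field-level completeness: share of non-null values per column.
completeness = customers.notna().mean().sort_values()
print(completeness)

# Flag fields that fall below an assumed 95% required-completeness threshold.
print(completeness[completeness < 0.95])
```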
2. Accuracy
Is the data correct?
- Do values fall within expected ranges?
- Do relationships between fields make sense?
- Can accuracy be validated against external sources?
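Range checks and cross-field checks can be scripted against the same extracts. A minimal sketch with hypothetical order data, assuming pandas; the bounds and tolerance are illustrative:

```python
import pandas as pd

# Hypothetical order extract (illustrative data only).
orders = pd.DataFrame({
    "order_id":    [1, 2, 3],
    "quantity":    [5, -2, 10],         # a negative quantity is suspect
    "unit_price":  [20.0, 15.0, 0.0],   # a zero price may be a data entry gap
    "order_total": [100.0, 30.0, 90.0],
})

# Range checks: values outside expected bounds.
out_of_range = orders[(orders["quantity"] <= 0) | (orders["unit_price"] <= 0)]
print(out_of_range)

# Cross-field check: does quantity * unit_price reconcile to the stored total?
mismatch = orders[
    (orders["quantity"] * orders["unit_price"] - orders["order_total"]).abs() > 0.01
]
print(mismatch)
```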
3. Consistency
Is the data internally consistent?
- Do the same entities have consistent identifiers across systems?
- Are formats standardized?
- Do aggregations match detailed records?
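Two quick consistency tests, identifier overlap across systems and detail-to-header reconciliation, sketched here with hypothetical data and a 1-cent tolerance:

```python
import pandas as pd

# Hypothetical extracts: the same customers as seen by the CRM and the billing system.
crm     = pd.DataFrame({"customer_id": ["C-001", "C-002", "C-003"]})
billing = pd.DataFrame({"customer_id": ["C-001", "C-002", "C-004"]})

# Identifier consistency: records present in one system but not the other.
only_in_crm     = set(crm["customer_id"]) - set(billing["customer_id"])
only_in_billing = set(billing["customer_id"]) - set(crm["customer_id"])
print(only_in_crm, only_in_billing)

# Aggregation check: do invoice line items roll up to the invoice header total?
lines   = pd.DataFrame({"invoice_id": [10, 10, 11], "amount": [50.0, 25.0, 40.0]})
headers = pd.DataFrame({"invoice_id": [10, 11], "total": [75.0, 45.0]})
rollup  = lines.groupby("invoice_id")["amount"].sum().rename("line_sum")
recon   = headers.set_index("invoice_id").join(rollup)
print(recon[(recon["total"] - recon["line_sum"]).abs() > 0.01])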
4. Timeliness
Is the data current?
- When was data last updated?
- Are there stale records that should have been updated?
- What's the data freshness for key entities?
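Freshness is straightforward to quantify once update timestamps are available. A minimal sketch, with a hypothetical assessment date and a two-year staleness threshold chosen purely for illustration:

```python
import pandas as pd

# Hypothetical customer extract with last-touch timestamps (illustrative data only).
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "updated_at":  pd.to_datetime(["2024-11-02", "2021-05-14", "2019-08-30"]),
})

as_of = pd.Timestamp("2025-01-01")  # assumed assessment date
age_days = (as_of - customers["updated_at"]).dt.days

# Freshness distribution and the share of records not touched in over two years.
print(age_days.describe())
stale_share = (age_days > 730).mean()
print(f"Share of records stale > 2 years: {stale_share:.0%}")
```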
5. Uniqueness
Is there unwanted duplication?
- What's the duplicate rate for key entities?
- How are duplicates currently handled?
- What's the impact of duplicates on analytics and operations?
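A crude duplicate-rate estimate can be built from a normalized match key; a full assessment would add fuzzy matching and manual sampling. A sketch with hypothetical records and an invented normalization rule:

```python
import pandas as pd

# Hypothetical customer extract with near-duplicate entries (illustrative data only).
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "name": ["Acme Corp", "ACME Corp.", "Globex LLC", "Acme Corporation"],
})

# Crude match key: lowercase, strip punctuation, drop common legal suffixes.
match_key = (customers["name"].str.lower()
             .str.replace(r"[^a-z0-9 ]", "", regex=True)
             .str.replace(r"\s+(corporation|corp|inc|llc)$", "", regex=True)
             .str.strip())

# Duplicate rate: records beyond the first occurrence of each match key.
dup_rate = match_key.duplicated().mean()
print(match_key.value_counts())
print(f"Estimated duplicate rate: {dup_rate:.0%}")
```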
High-Risk Data Areas
Customer Data
Customer master data is critical for integration. Common issues:
- Duplicate customer records (10-15% of records in most systems)
- Inconsistent naming conventions
- Missing or invalid contact information
- No clear golden record when customers exist in multiple systems
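Where customers exist in multiple systems, even a provisional survivorship rule (for example, most recently updated record wins) helps quantify the golden-record problem. A sketch with hypothetical CRM and ERP records; a real merge also needs field-level rules and a manual review queue:

```python
import pandas as pd

# Hypothetical: the same customer appearing in both the target's CRM and ERP.
records = pd.DataFrame({
    "source":      ["crm", "erp", "crm"],
    "customer_id": ["C-001", "E-778", "C-002"],
    "email":       ["ops@acme.com", "ops@acme.com", "it@globex.com"],
    "updated_at":  pd.to_datetime(["2024-06-01", "2022-11-12", "2023-04-03"]),
})

# Provisional golden record: for each matched email, keep the freshest record.
golden = (records.sort_values("updated_at", ascending=False)
                 .drop_duplicates(subset="email", keep="first"))
print(golden)
```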
Product Data
Product and pricing data integration challenges:
- Different product hierarchies that don't map cleanly
- Historical pricing data gaps
- SKU proliferation and rationalization needs
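Mapping coverage is worth measuring early: what share of the target's SKUs land in the buyer's taxonomy under a draft crosswalk. A sketch with hypothetical categories and crosswalk entries:

```python
import pandas as pd

# Hypothetical target SKU list and a draft crosswalk to the buyer's taxonomy.
target_skus = pd.DataFrame({"sku": ["T-100", "T-101", "T-102", "T-103"],
                            "category": ["Widgets", "Widgets", "Gadgets", "Misc"]})
crosswalk = pd.DataFrame({"target_category": ["Widgets", "Gadgets"],
                          "buyer_category":  ["Hardware > Widgets", "Hardware > Gadgets"]})

# Coverage: which SKUs land in a buyer category, and which fall through the mapping.
mapped = target_skus.merge(crosswalk, left_on="category",
                           right_on="target_category", how="left")
unmapped = mapped[mapped["buyer_category"].isna()]
print(f"Unmapped SKUs: {len(unmapped)} of {len(target_skus)}")
print(unmapped[["sku", "category"]])
```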
Financial Data
Financial data quality issues that impact the close:
- Revenue recognition inconsistencies
- Chart of accounts mapping challenges
- Inter-company transaction complexity
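A simple cross-system revenue reconciliation often surfaces these issues quickly. A sketch with hypothetical monthly figures and a 1% tolerance chosen for illustration:

```python
import pandas as pd

# Hypothetical monthly revenue as reported by the GL and the billing system.
gl      = pd.DataFrame({"period": ["2024-10", "2024-11"], "revenue": [1_200_000, 1_150_000]})
billing = pd.DataFrame({"period": ["2024-10", "2024-11"], "revenue": [1_200_000, 1_090_000]})

# Reconcile the two sources and flag periods where the gap exceeds a 1% tolerance.
recon = gl.merge(billing, on="period", suffixes=("_gl", "_billing"))
recon["gap"] = recon["revenue_gl"] - recon["revenue_billing"]
recon["gap_pct"] = recon["gap"] / recon["revenue_gl"]
print(recon[recon["gap_pct"].abs() > 0.01])
```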
Quantifying Data Quality Impact
Data quality issues have real costs:
- Integration delays: Each major data quality issue adds 2-4 weeks to integration timelines
- Remediation costs: Data cleansing and normalization projects typically cost $100K-$500K
- Ongoing operational costs: Poor data quality creates ongoing manual work and errors
- Analytics impact: Unreliable data means unreliable insights and decisions
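For illustration, using the midpoints of these ranges: a deal that surfaces three major data quality issues would face roughly nine additional weeks of integration timeline and on the order of $300K in remediation spend, before counting ongoing operational drag.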
Case Study: The Customer Data Disaster
A strategic acquirer purchased a competitor to consolidate customer relationships. The integration plan assumed a 6-month customer migration timeline.
Reality:
- 42% of customer records couldn't be matched automatically
- Customer contact data was 3+ years out of date for 30% of accounts
- Contract terms were stored in unstructured notes, not standardized fields
- Revenue attribution was inconsistent between CRM and billing system
The 6-month integration became 18 months. The data remediation project alone cost $800K. Three major customers were lost during the confusion.
A proper data quality assessment would have identified these issues pre-close, enabling realistic planning and appropriate purchase price adjustment.