7 Takeaways from the SOCAP Data & Reporting Workshop

We recently had the chance to attend and facilitate a session at the annual SOCAP Data & Reporting Workshop in Chicago. With over 20 of our client companies in attendance, it felt a bit like an extended family reunion, complete with a birthday celebration!

We also heard contributions from a variety of companies and led a discussion on data quality. I am pleased to share a few of the key points from that discussion (which was anchored by a survey that 30+ participants completed before the event), along with a few broader takeaways.

1 - Data quality plans and targets: Only half of the companies surveyed have a formal data quality plan and target.

2 - But standards are high: Those that do have targets for accuracy set their sights high, with targets ranging from 90% to 97%+.

3 - Accuracy definitions vary: While some accuracy checks are common ('did the code we assigned to the contact match what the consumer said?' is nearly universal, and certainly makes sense), there is no consensus on what data accuracy covers.
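To make the near-universal check concrete, here is a minimal sketch (field names and codes are hypothetical, not from any company's actual program) that scores accuracy as the share of reviewed contacts where the agent-assigned code matches the code a reviewer derived from what the consumer said:

```python
# Hypothetical reviewed-contact records: each pairs the code the agent
# assigned with the code a reviewer derived from the consumer's own words.
reviewed = [
    {"assigned_code": "PKG-DAMAGE", "consumer_code": "PKG-DAMAGE"},
    {"assigned_code": "TASTE", "consumer_code": "TASTE"},
    {"assigned_code": "FOREIGN-OBJECT", "consumer_code": "PKG-DAMAGE"},
    {"assigned_code": "TASTE", "consumer_code": "TASTE"},
]

def accuracy_rate(records):
    """Share of reviewed contacts where the assigned code matched."""
    matches = sum(1 for r in records if r["assigned_code"] == r["consumer_code"])
    return matches / len(records)

print(f"{accuracy_rate(reviewed):.0%}")  # 3 of 4 match -> 75%
```

A team targeting 90%+ would compare this rate against its target each review cycle; the broader point of the discussion is that which checks feed the numerator varies widely from company to company.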

Elements of Data Quality

 

4 - Not all contact types are measured: Frankly, I was surprised by how many teams seem to review only complaints for accuracy. This is likely driven by the fact that in some companies only complaints are ever really reviewed by other departments (i.e., quality, regulatory) and used formally in business processes.

Type of contacts reviewed

 

5 - Not all channels are scored: Just as with contact types, not all contact channels are reviewed for data accuracy. Based on the discussion, this is clearly a function of a) making choices with scarce resources and b) the fact that not all consumer care teams take contacts beyond phone, letter, and email.

Channels reviewed

6 - Reporting that drives action is personalized: It is clear from the consumer care teams that drive action with consumer feedback that the days of sending an Excel file with a few graphs and every consumer contact record are pretty much over. Making a positive change seems to work best when either a) only information relevant to one team is shared, with qualitative and quantitative information used together to tell a story, or b) tools to analyze consumer contact data are shared broadly with analysts who look at a variety of data sources. In either case, the need for investment is clear.

7 - Agent monitoring frequency: While not everyone has a data quality program, the practice of monitoring agents for feedback and coaching is universal. For anyone curious as to how often everyone does this: 4 contacts per agent per month seems to be THE standard!
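One simple way a team might implement that cadence, sketched here with entirely hypothetical agent names and contact IDs, is to draw four random handled contacts per agent each month:

```python
import random

# Hypothetical map of agent -> contact IDs handled this month.
contacts_by_agent = {
    "agent_01": [f"c{n}" for n in range(1, 41)],
    "agent_02": [f"c{n}" for n in range(41, 66)],
}

SAMPLE_SIZE = 4  # the cadence noted above: 4 contacts per agent per month

def monthly_review_sample(contacts_by_agent, k=SAMPLE_SIZE, seed=None):
    """Draw up to k random contacts per agent for quality review."""
    rng = random.Random(seed)
    return {
        agent: rng.sample(contacts, min(k, len(contacts)))
        for agent, contacts in contacts_by_agent.items()
    }

sample = monthly_review_sample(contacts_by_agent, seed=2024)
for agent, picks in sample.items():
    print(agent, picks)
```

Random sampling avoids the bias of reviewing only an agent's most recent or most memorable contacts; the `seed` parameter is just there to make a given month's draw reproducible for audit purposes.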

 

 

