Using temps to clean sales data – do's and don'ts
We have worked in organizations that tried to use temps for sales data cleansing, and we now have clients that do this. We know why you do it and what you seek to achieve. But too often it's not done well, and as a result it doesn't get you where you want to go.
We offer database and sales analytics software and services. We clean customer data not as a stand-alone service, but to add more value so that our predictive models are more accurate and actionable. We use structured cleansing and matching processes, B2B databases, and rigorous profiling and analytical validation procedures.
Here, for your benefit, we'll share our observations and tips about using temps. We do this because we are passionate about data and have seen many instances where data cleansing was not done right and we later had to work harder to fix it. We sincerely hope you find these tips valuable.
IT, Sales and Marketing should inform each other before data cleansing starts.
Just like you wouldn't dig a trench in your yard without calling JULIE first – we hope! – the group that cleanses data must be transparent with, and inclusive of, the other departments that use the data. It's not about "who owns the data," but "who knows the data." Depending on the fields or tables, that can be IT, sales or marketing.
Give each temp clear instructions to follow.
They must be very familiar with the instructions and with the nuances of specific fields and circumstances, because there will be an unavoidable level of subjectivity. Feed in sample records and validate that they were corrected according to the instructions.
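One simple way to run that check is to seed a handful of records whose correct values you already know and compare the temps' output against them. The sketch below is only an illustration; the field names, record IDs and expected values are hypothetical, not from any particular system.

```python
# Minimal sketch: seed known records and check whether the temps'
# corrections match what the instructions call for.
# Field names and expected values here are hypothetical examples.

expected = {
    101: {"title": "VP of Sales", "country": "US"},
    102: {"title": "Chief Financial Officer", "country": "CA"},
}

def audit_sample(cleaned_records, expected):
    """Return the records where the temp's output deviates from the answer key."""
    mismatches = []
    for rec_id, expected_fields in expected.items():
        cleaned = cleaned_records.get(rec_id, {})
        for field, want in expected_fields.items():
            got = (cleaned.get(field) or "").strip()
            if got != want:
                mismatches.append((rec_id, field, got, want))
    return mismatches

# Example: record 102's title was abbreviated, which the instructions disallow.
cleaned = {
    101: {"title": "VP of Sales", "country": "US"},
    102: {"title": "CFO", "country": "CA"},
}
for rec_id, field, got, want in audit_sample(cleaned, expected):
    print(f"Record {rec_id}: {field!r} was {got!r}, expected {want!r}")
```

Even a small seeded sample like this, audited early in the project, tells you whether the instructions are being applied consistently before thousands of records are touched.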
Determine which fields are appropriate for temp cleanup.
The usual fields for temp cleanup are contact (name, title, company, country), market (industry and employee size), transaction data (e.g., missing values), and internal data (rep names/IDs, validations, etc.). But manual cleaning adds the most value on internal data that requires knowledge of your own processes. Postal and market data can be appended from other sources more consistently and cost-effectively.
Recognize that the same fields from different sources are not all the same.
Fields like SIC codes (industry) differ – sometimes significantly – based on the compiler's algorithms and information. Avoid mixing and matching if you can get high coverage from a single source.
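A quick way to decide whether one source is enough is to measure how many of your accounts each compiler actually covers, and how often the compilers disagree where they overlap. This is a rough sketch with made-up source names, accounts and codes, shown only to illustrate the idea.

```python
# Rough sketch: compare SIC coverage across compiled sources before
# deciding whether to mix and match. All data below is made up.

accounts = ["acme", "globex", "initech", "umbrella", "stark"]

source_a = {"acme": "3571", "globex": "2834", "initech": "7372"}
source_b = {"acme": "3570", "umbrella": "2834"}

def coverage(source, accounts):
    """Share of accounts for which the source supplies a SIC code."""
    return sum(1 for a in accounts if a in source) / len(accounts)

print(f"Source A covers {coverage(source_a, accounts):.0%} of accounts")
print(f"Source B covers {coverage(source_b, accounts):.0%} of accounts")

# Where both sources code the same account, check whether they even agree --
# compilers often assign different SICs to the same company.
overlap = set(source_a) & set(source_b)
disagreements = [a for a in overlap if source_a[a] != source_b[a]]
print(f"{len(disagreements)} of {len(overlap)} overlapping accounts disagree")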
Use multiple methods of validation in-process and post-process.
Address fields can be run through CASS and NCOA for a fraction of the cost of, for example, dialing to get the right ZIP Code. Incorporate validation by temps, reps and external sources.
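The same layering principle – run the cheap automated checks first and send only the failures to people – can be sketched in a few lines. To be clear, the snippet below is only a basic ZIP format screen, not actual CASS or NCOA processing, and the record layout is hypothetical.

```python
import re

# Rough illustration only: a cheap ZIP format screen run before any manual
# follow-up. This is NOT CASS or NCOA; the record layout is hypothetical.

ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")

records = [
    {"id": 1, "state": "IL", "zip": "60601"},
    {"id": 2, "state": "TX", "zip": "7701"},       # too short
    {"id": 3, "state": "CA", "zip": "94105-160"},  # malformed ZIP+4
]

def needs_followup(rec):
    """Flag records whose ZIP fails the cheap automated check."""
    return not ZIP_RE.match(rec.get("zip") or "")

flagged = [r["id"] for r in records if needs_followup(r)]
print(f"Send {len(flagged)} of {len(records)} records for manual follow-up: {flagged}")
```

Only the flagged records need a temp's or rep's time; everything that passes the automated screen stays out of the manual queue.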
Define the way forward.
Before you start correcting the data, think through and set guidelines for future corrections – frequency, volume, and how your standards are expected to evolve.
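Those guidelines can live in a written policy, but capturing them in a structured form keeps later cleanup cycles consistent. The sketch below is purely illustrative; the field names and thresholds are made up, not a standard.

```python
from dataclasses import dataclass

# Sketch only: the fields and thresholds below are illustrative assumptions.

@dataclass
class CleanupPolicy:
    frequency_days: int         # how often a cleanup cycle runs
    max_records_per_cycle: int  # volume each cycle is allowed to touch
    standards_version: str      # which edition of the instructions applies

policy = CleanupPolicy(frequency_days=90,
                       max_records_per_cycle=5000,
                       standards_version="v2")

def due_for_cleanup(days_since_last_run: int, policy: CleanupPolicy) -> bool:
    """A cycle is due once the agreed interval has elapsed."""
    return days_since_last_run >= policy.frequency_days

print(due_for_cleanup(120, policy))  # True with these illustrative numbers
```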
So, yes, you can use temps, but be careful about your approach, scope, consistency and validation. Also factor in the cost of alternative methods: out-of-pocket spend, time (including opportunity cost), anticipated future costs, and coverage trade-offs.

President & CEO at Valgen
Rainmaker extraordinaire for our clients. Turns databases into gold. Analytics executive and entrepreneur with a track record of producing significant and sustained revenue gains for sales teams in fleet, transportation, high tech & financial services.