Keep Salesforce Running Smoothly With Data Quality Monitoring
- AUTHOR Luke Duncan
- November 14, 2013
One of the hardest challenges facing salesforce.com administrators is keeping their data clean. Our CRM systems have become complex, and it's not unusual to have data flowing into the system from multiple sources while being constantly changed by users and automated processes alike. In such a chaotic environment, some dirty data is inevitable, even with strict precautions in place.
While you can and should try to limit data quality issues and address them at the source when you find them, you also need a process for spotting potential issues before they get out of hand. I recommend monitoring the specific areas where you have historically seen problems. Create reports that highlight data errors and schedule an email to notify you each day. If your reports come back empty, things are running smoothly. If they surface issues, you can investigate the cause, fix the data, and, ideally, fix the underlying problem.
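The "empty report means a quiet day" idea can also be scripted outside of Salesforce's built-in report scheduling. Below is a minimal Python sketch of just the notification logic; the record dictionaries stand in for rows returned by whatever monitoring query you run, and the function name and fields are illustrative assumptions, not from the original post.

```python
def build_alert(report_name, records):
    """Return an email body if a monitoring query found dirty data,
    or None when the report is clean (so no email gets sent).

    records: list of dicts, one per row from the monitoring query.
    """
    if not records:
        return None  # nothing to report -- the quiet day is the good day
    lines = [f"Data quality alert: {report_name} returned {len(records)} record(s)."]
    for rec in records:
        # Each rec is a dict of whichever fields the query selected.
        lines.append(" - " + ", ".join(f"{k}={v}" for k, v in rec.items()))
    return "\n".join(lines)
```

Returning `None` on a clean run is the key design choice: you only get an email when something actually needs attention, which keeps the daily check from becoming noise you learn to ignore.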
The monitoring reports you create will vary based on your business processes and the systems integrated with your Salesforce org. However, here are a few reports that I've commonly found useful:
1. Leads/Contacts owned by Inactive Users
2. Leads in an open status owned by Queues
3. Critical fields missing (even when a validation rule or page-layout requirement is in place)
You'll need to use a custom report type to reach the owner's user information and determine whether they are active or inactive.
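If you like to prototype these checks in SOQL before building the reports, the queries might look roughly like the sketch below. The object choices, "open" status values, and which fields count as critical are all assumptions for illustration; adjust them to match your org.

```python
# Hypothetical SOQL for the three monitoring reports above.
# The objects and standard fields are real Salesforce names, but the
# status values and "critical" fields are assumed examples.

OPEN_LEAD_STATUSES = ("Open - Not Contacted", "Working - Contacted")

def contacts_owned_by_inactive_users():
    # Contact.OwnerId always references a User, so Owner.IsActive
    # can be filtered on directly.
    return ("SELECT Id, Name, Owner.Name FROM Contact "
            "WHERE Owner.IsActive = false")

def open_leads_owned_by_queues():
    statuses = ", ".join(f"'{s}'" for s in OPEN_LEAD_STATUSES)
    # Lead.Owner is polymorphic (User or Queue); Owner.Type
    # distinguishes the two.
    return (f"SELECT Id, Name FROM Lead "
            f"WHERE Status IN ({statuses}) AND Owner.Type = 'Queue'")

def leads_missing_critical_fields():
    # Which fields are "critical" depends on your business process.
    return ("SELECT Id, Name FROM Lead "
            "WHERE Email = null OR Company = null")
```

Each query is the record-level equivalent of one scheduled report: if it returns zero rows, that check passes for the day.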
In some cases you may not be able to solve the issue directly, but you can monitor and fix errors as they come up. This tends to happen in large, complex organizations when custom Apex code hits DML lock errors and does not recover.
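When you can get at the code, the usual fixes are locking parent records up front (`SELECT ... FOR UPDATE` in Apex SOQL) or catching the lock failure and retrying. The retry pattern is language-agnostic; here is a Python sketch of it, with a stand-in exception class since this isn't running inside Salesforce.

```python
import time

class RecordLockError(Exception):
    """Stand-in for Salesforce's UNABLE_TO_LOCK_ROW DML error."""

def with_retry(operation, attempts=3, delay=0.01):
    """Run operation(), retrying on lock contention with a short backoff.

    The Apex equivalent is catching DmlException on the failed DML and
    re-issuing it, rather than letting the transaction die unrecovered.
    """
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except RecordLockError:
            if attempt == attempts:
                raise  # out of retries; let monitoring surface it
            time.sleep(delay * attempt)  # back off before trying again
```

Even with retries in place, the monitoring reports stay useful: they catch the cases where every attempt fails and a record is left in a bad state.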
You may not know what to check for at first, but if you make a habit of creating a report whenever you notice a data anomaly, you will quickly build up a set of reports that helps you catch data issues much sooner.