Data Quality Management is a hot topic, and for good reason. Tags are difficult to deploy and even harder to maintain, and this contributes to headaches and data quality problems across a wide range of digital marketing efforts. When it comes to the practical work of managing data quality, several layers of analysis and protection are useful.
In the white paper Data Quality and the Digital World, Eric Peterson warns that companies have started to leak data through the multitude of tag-based data collectors they have deployed across their digital properties.
In Data Quality Management Process Part 2, I covered logical segmentation techniques and technical configuration inside ObservePoint.
Data quality management best practices include the creation and maintenance of documentation. This may include the Solution Design Documents for your analytics deployment, change logs in your TMS, or other files that map the strategy to the deployment of tags across your digital assets.
Best-in-class digital analysts have become accustomed to complex, dynamic tag deployments that populate variables in specific ways based on how web pages are accessed. Testing the proper functionality of this tooling has historically been a laborious process requiring specialized browser-based tools, careful interpretation of raw data, and manual logging in purpose-built Excel files.
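To illustrate how that manual inspection can be automated, here is a minimal sketch that validates a captured analytics beacon against expected variable values. The beacon URL, the variable names (pageName, v1, events), and the expected values are all hypothetical, not tied to any specific vendor or to ObservePoint itself.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical example: an analytics beacon URL captured from a proxy or a
# browser's network log. The domain and variable names are illustrative only.
beacon = (
    "https://metrics.example.com/b/ss/prod/1/JS-2.22.0"
    "?pageName=home&v1=logged-in&events=event1"
)

# The variables this page should populate, per the Solution Design Document.
expected = {"pageName": "home", "v1": "logged-in"}

# Flatten the query string into a simple dict of single values.
params = {k: v[0] for k, v in parse_qs(urlparse(beacon).query).items()}

# Collect any variable that is missing or holds an unexpected value.
problems = {
    var: params.get(var)
    for var, want in expected.items()
    if params.get(var) != want
}
print(problems)  # an empty dict means the beacon matched expectations
```

The same comparison that once lived in an Excel checklist becomes a repeatable script that can run against every audited page.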
In previous posts, we’ve discussed timing strategies for tag audits. What content to audit is another vital part of the conversation.
Intranet sites can be an extremely valuable tool for your company, with many possible uses.
Two of the most important questions data quality analysts ask themselves are what to audit and when to audit.