
Assuring High-Quality Marketplace Data

At Ascendant Group, we specialize in helping you achieve marketplace data excellence!


What is “marketplace data”? Why is it important to manage it properly? And how do we improve our data integrity efforts?


Great questions, I’m glad you asked! 😊


As with many acronyms, we try to be as contextually precise as possible. In today’s technically diverse world, we toss around buzzwords that often don’t hit the mark. "CRM" and "ERP" are good examples: spelled out, these terms are at once ambiguous and misaligned with the actual solution space.


Definition: Our definition of “marketplace data” also describes Ascendant Group’s business focus. We define it as all structured information (database records) that allows the business to manage its marketing, sales, and customer retention/servicing over time.


Each business model will have a different mix of data management needs, different terminology, and different data relationships. But in the end, it’s about prospects, leads, customers, messages, campaigns, conversions, competitors, partners, brands, marketplace categories, buying and selling transactions, correspondence, and all the context (demographics and firmographics) that helps any organization comprehend the marketplace and make better business decisions. It implies a strategy and tool set to unify and perpetually maintain this information in support of improved demand management.


Customer Science: This information, and the ability to manage it, is the subject of our Customer Science™ service set. It is usually managed in the CRM domain, but marketplace information is also captured by digital applications and services, marketing tools, third parties, ERP/financial systems, ecommerce, and supply chain solutions. Tracking this information is vitally important for reducing sales costs, errors, and missed opportunities.

The challenge we help you overcome is making marketplace data holistic and reliable for re-use in business processes and decision support.


Keys to Marketplace Data Quality Management: Here is the rubric you should use both to identify your own data weaknesses and to shape your approach to meeting the challenges of data dysfunction.


◾Accuracy: Is the data accurate? Meaning, does the data at any point in time reflect objective reality? Does it specify something meaningful? Since non-historical reality is subject to change, does the database reflect the most current changes? If you create a field called “age” and a birthday or anniversary passes, the age is no longer accurate, which is why we never do that. Instead, we store the birth year and calculate age based on the current data context, as in the sketch below.
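To make that concrete, here is a minimal sketch in Python (field and function names are ours, for illustration only): store the point-in-time fact, here a full birth date for day-level exactness, and derive the age at read time so it can never go stale.

```python
# A minimal sketch: store the birth date, never the age, and derive
# the age from the current date so it always reflects reality.
from datetime import date

def current_age(birth_date, today=None):
    """Derive age at read time instead of storing a stale 'age' field."""
    today = today or date.today()
    # Subtract one year if this year's birthday hasn't passed yet.
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

print(current_age(date(1985, 6, 15)))  # always reflects today's reality
```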


◦ Precision: Related to accuracy is the notion of precision, but never confuse the two. Precision is a way to express certain kinds of data in the least ambiguous terms possible. Precision creates larger data distributions, so in analytics we often aggregate and make data groupings less precise so we can come to terms with them. But at the record-retention level, we often want better precision. So rather than maintaining a “top customer” field by hand, we might measure YTD, LYTD, and LY revenue and revenue growth to the exact total invoice amount, score it, create a normalized distribution, and then use analytics to say the top 20% of scores are our “top customers” (see the sketch below). Business decision makers are often impressed with precision and treat it as a proxy for accuracy. It isn’t.
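A minimal sketch of that derived “top customer” idea, with an illustrative record shape and threshold:

```python
# A minimal sketch: keep precise revenue figures per record, then derive
# the fuzzy "top customer" label analytically instead of storing it as a
# manually managed field. Record shape and threshold are illustrative.
def top_customers(records, top_fraction=0.20):
    """Rank customers by exact YTD revenue; label the top 20% by score."""
    ranked = sorted(records, key=lambda r: r["ytd_revenue"], reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))
    return {r["customer_id"] for r in ranked[:cutoff]}

records = [
    {"customer_id": "C-1", "ytd_revenue": 125000.00},
    {"customer_id": "C-2", "ytd_revenue": 18000.00},
    {"customer_id": "C-3", "ytd_revenue": 86500.00},
    {"customer_id": "C-4", "ytd_revenue": 4200.00},
    {"customer_id": "C-5", "ytd_revenue": 51000.00},
]
print(top_customers(records))  # {'C-1'} -- derived, not stored
```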


◾Validity: Validity is about record parameters and whether fields contain data as designed. Since databases are just records and fields managed by manual and automated processes, fields sometimes get populated with information that is not expressed the way the field was designed to retain it. For example, we might define a rank field as valid only when it holds “1, 2, or 3”, and yet a few records somehow receive “a”, “b”, or “c” values instead. A simple check like the one below can flag these violations.
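A validity check for that rank example might look like this (the record shape is illustrative):

```python
# A minimal sketch of a validity check: the rank field is designed to
# hold 1, 2, or 3, so anything else is flagged for correction.
VALID_RANKS = {1, 2, 3}

def invalid_rank_records(records):
    """Return records whose 'rank' value violates the field's definition."""
    return [r for r in records if r.get("rank") not in VALID_RANKS]

records = [{"id": 1, "rank": 2}, {"id": 2, "rank": "a"}, {"id": 3, "rank": 3}]
print(invalid_rank_records(records))  # [{'id': 2, 'rank': 'a'}]
```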


◾Completeness: How much of the relevant data has been collected? One of the biggest problems in extended, integrated, participatory software applications is the inability to enforce complete data. Comparisons between records become difficult, and full context becomes elusive. Many processes and tools are used to either infer data attributes or acquire them. Completeness includes both a complete set of records and a complete set of attributes per record: enlargement is the aggregation of more records, and enrichment is making data less porous. Measuring the porosity, as sketched below, is the first step.
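Here is one minimal way to measure that porosity; the field list and record shapes are illustrative:

```python
# A minimal sketch of a completeness measure: the fill rate per required
# attribute across a record set. Low rates point to enrichment targets.
REQUIRED_FIELDS = ["name", "email", "industry", "employee_count"]

def fill_rates(records):
    """Share of records with a non-empty value for each required field."""
    total = len(records)
    return {f: sum(1 for r in records if r.get(f) not in (None, "")) / total
            for f in REQUIRED_FIELDS}

records = [
    {"name": "Acme", "email": "ops@acme.com", "industry": "mfg", "employee_count": 250},
    {"name": "Bluebird", "email": "", "industry": None, "employee_count": 40},
]
print(fill_rates(records))  # e.g. {'name': 1.0, 'email': 0.5, 'industry': 0.5, 'employee_count': 1.0}
```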


◾Consistency: Related to all three concepts above is the idea that data across records and over time is acquired, measured, and maintained in a similar way, so the records are comparable and therefore reliable. Consistency has everything to do with business process. An example is your opportunity or quote probability assignment. If sellers manage this, it’s important to enforce (by convention or by systematic validation) the same probabilities for the same sales circumstances. Better yet, as with a score, let the system assign the probabilities based on circumstantial data in the record, as sketched below.
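A minimal sketch of that systematic assignment; the stage names and probabilities here are illustrative, not a recommended model:

```python
# A minimal sketch: the system derives opportunity probability from
# circumstantial data (the sales stage), so the same circumstances
# always yield the same value regardless of which seller owns the record.
STAGE_PROBABILITY = {
    "qualified": 0.10,
    "demo_completed": 0.30,
    "proposal_sent": 0.55,
    "verbal_commit": 0.80,
}

def assign_probability(opportunity):
    """Systematic assignment: stage drives probability, not seller judgment."""
    return STAGE_PROBABILITY.get(opportunity["stage"], 0.0)

print(assign_probability({"id": "OPP-7", "stage": "proposal_sent"}))  # 0.55
```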


◾Timeliness: Time is our adversary. Not only does data decay over time, but the database at any moment needs to be “fresh”, not imposing a lag that makes even valid, accurate information impractical. Data needs to be applied to the database as quickly as possible, preferably approaching “real time”. Timeliness and consistency go together: consistent data refreshment is critical, so the updates give rhythm to decision-making. An example is a regular ETL update to your customer data platform each night at midnight, so everyone knows the refresh timing and the lag. A freshness check like the one below makes that lag measurable.
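A simple freshness check might look like this (the 24-hour lag threshold is illustrative, matching a nightly ETL rhythm):

```python
# A minimal sketch of a freshness check: flag records whose last update
# lags beyond the agreed refresh rhythm (here, a nightly ETL cadence).
from datetime import datetime, timedelta

MAX_LAG = timedelta(hours=24)  # illustrative: nightly refresh at midnight

def stale_records(records, now=None):
    """Return records updated longer ago than the accepted lag."""
    now = now or datetime.now()
    return [r for r in records if now - r["updated_at"] > MAX_LAG]

records = [
    {"id": 1, "updated_at": datetime.now() - timedelta(hours=2)},
    {"id": 2, "updated_at": datetime.now() - timedelta(days=3)},
]
print([r["id"] for r in stale_records(records)])  # [2]
```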


◾Uniqueness: Records in a database reflect something “real”, or recognized as such by the data user community. But in participatory systems and omnichannel, multi-application organizations, record “duplicates” will always occur. This means the same real-world entity or event is measured or captured more than once, undermining the integrity of the database. Each duplicate may contain part of the complete data, or duplicates may contain conflicting information. Because duplicate detection is an inferential process, detection and removal is a fuzzy, semantic science unto itself, but a very important ongoing effort, as the sketch below suggests.
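A minimal sketch of fuzzy duplicate detection using only Python’s standard library; real matching would weigh more attributes (address, domain, phone) and tune the threshold carefully:

```python
# A minimal sketch of fuzzy duplicate detection on company names. The
# similarity threshold and record shapes are illustrative assumptions.
from difflib import SequenceMatcher
from itertools import combinations

def likely_duplicates(records, threshold=0.85):
    """Pair up records whose normalized names are suspiciously similar."""
    pairs = []
    for a, b in combinations(records, 2):
        ratio = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
        if ratio >= threshold:
            pairs.append((a["id"], b["id"], round(ratio, 2)))
    return pairs

records = [
    {"id": 1, "name": "Acme Corporation"},
    {"id": 2, "name": "Acme Corporaton"},  # typo-driven duplicate
    {"id": 3, "name": "Bluebird LLC"},
]
print(likely_duplicates(records))  # [(1, 2, 0.97)]
```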


◾Accessibility: Is the data available where needed, when needed? This is not about the database itself, but about the demand the user community places upon it. The data is the payload, but what brings it to decision-makers at the proper moments so business transactions can be properly enabled and influenced? This is the role of software and device interfaces. But devices and UIs are tricky and picky about how they acquire and present data. The data must be prepared, structured, and secured in a way that assures high availability to real-world business processes. Customer Data Platforms can be a fantastic way to assure data availability, and the sketch below shows the barest shape of that idea.
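As the barest illustration of making prepared records available to other tools, here is a sketch of a read-only JSON lookup service using only Python’s standard library. The records, fields, and URL scheme are hypothetical; a real Customer Data Platform adds authentication, caching, and a proper data store:

```python
# A minimal sketch: expose prepared customer records to downstream tools
# as a read-only JSON lookup. Everything here is illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CUSTOMERS = {  # hypothetical, pre-prepared records
    "C-1001": {"name": "Acme Corp", "segment": "enterprise", "ytd_revenue": 125000.0},
    "C-1002": {"name": "Bluebird LLC", "segment": "smb", "ytd_revenue": 18000.0},
}

class CustomerHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expect paths like /customers/C-1001
        parts = self.path.strip("/").split("/")
        record = (CUSTOMERS.get(parts[1])
                  if len(parts) == 2 and parts[0] == "customers" else None)
        body = json.dumps(record if record else {"error": "not found"}).encode()
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CustomerHandler).serve_forever()
```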


At Ascendant Group, Inc., we have the tools and processes to help you improve your data’s overall integrity, so your data is useful and reliable at the moment it is needed. Let us help you grow your business with better data. Contact us at info@ascendant-group.com

