Large enterprises in the retail, financial, and logistics industries face a pressing need to interact with end users in multiple ways and on various platforms so as to serve them better. Thanks to the internet and other technologies, enterprises today can provide information back to consumers in a far more contextualized, personalized, and interactive way. For example, Google analyzes every keystroke and auto-suggests results that you might be looking for. Another example is e-commerce websites, which search for products based on your criteria, recommend products of a similar type and price range, provide product ratings, vendor information, prices, and delivery times, and even let you compare your selected products with others anytime, on any device of your preference.
This information transformation (from static, to dynamic and interactive, to a decision-making tool) has evolved to provide customers with a great experience. Every enterprise, at any scale, wants to serve its customers better and is putting its best foot forward to offer a best-in-class customer experience.
“So what does your customer want? How can you serve them better?”
As long as this self-probing question is being asked, information transformation will thrive and grow. New research and ideas on information services and data analytics will continue to emerge, and new technologies and services will blossom. Today, many enterprises are building systems that analyze social media, weblogs, mobile apps, cloud apps, etc., to understand user activity, and are applying analytics to derive customer behavior. Every single click, post, image, video, and audio clip is collected for analysis. In a traditional BI setup, all this data would be collected, cleansed, transformed, processed, and analyzed; analytics would then be performed to provide results to management. But that is traditional!
The latest trend is to provide useful information back to users for better decision making. This means that the data has to be analyzed in real time and information has to be served to the customer within seconds. This transformation of big data into insight at a faster pace is called “Fast Data”, an emerging trend in the information transformation and services space. But what good does analytics do if it is performed on unclean data? Accurate analytics, and thus accurate insights, can only be obtained with clean, correct, and relevant data. This is where Master Data Management (MDM) becomes critical even for “Big Data” and “Fast Data”. Without MDM it would be garbage in, garbage out.
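As a toy illustration of “garbage in, garbage out”, the following sketch (all names and records are hypothetical) shows how duplicate, unstandardized customer records inflate even the simplest metric, a distinct-customer count, until a minimal cleansing step is applied:

```python
# Hypothetical raw records: the same customer captured three different ways.
raw_customers = [
    {"name": "John Smith",  "email": "john.smith@example.com"},
    {"name": "JOHN SMITH ", "email": "John.Smith@Example.com"},
    {"name": "J. Smith",    "email": "john.smith@example.com "},
    {"name": "Jane Doe",    "email": "jane.doe@example.com"},
]

def standardize(record):
    """A minimal cleansing step: trim whitespace, lowercase the email."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].strip().lower(),
    }

# Before cleansing: naive analytics sees four "distinct" customers.
dirty_count = len({r["email"] for r in raw_customers})

# After cleansing: deduplicating on the standardized email finds two.
clean_count = len({standardize(r)["email"] for r in raw_customers})

print(dirty_count, clean_count)  # 4 distinct raw emails shrink to 2 real customers
```

Any downstream analytic built on the dirty count (revenue per customer, churn rate, and so on) would be off by a factor of two for this tiny sample; at petabyte scale the distortion is simply harder to see.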
MDM has been helping enterprises cleanse, standardize, match/merge, and govern data for over a decade now, but mostly in a near-real-time mode with structured data. Performing MDM on petabytes and exabytes of structured and unstructured data in real time may be practically impossible with the traditional MDM model. This may require a synergy of MDM with newer big data technologies. Emerging technologies like in-memory computing and certain big data platforms may have to be integrated with data quality technologies, and new paradigms like “In-Memory Data Quality”, “Distributed Data Quality”, and “Real-Time Data Sync” may emerge. When these technological advancements become a reality, MDM could be served on the fly.
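To make the match/merge step concrete, here is a minimal sketch using Python's standard-library difflib for fuzzy name matching. The similarity threshold and the "fill in missing fields" survivorship rule are illustrative assumptions, not a production MDM algorithm:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Fuzzy similarity between two strings, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_merge(records, threshold=0.85):
    """Match records whose names are similar above the threshold,
    then merge each match into a surviving "golden" record by
    filling in any fields the golden record is missing."""
    golden = []
    for rec in records:
        for master in golden:
            if similarity(rec["name"], master["name"]) >= threshold:
                for key, value in rec.items():
                    if not master.get(key):   # survivorship: keep first non-empty value
                        master[key] = value
                break
        else:
            golden.append(dict(rec))  # no match found: new golden record
    return golden

# Hypothetical sample: two spellings of one customer, plus a second customer.
customers = [
    {"name": "Jon Smith",  "phone": "555-0100", "city": ""},
    {"name": "John Smith", "phone": "",         "city": "Boston"},
    {"name": "Jane Doe",   "phone": "555-0199", "city": "Austin"},
]

masters = match_merge(customers)
print(len(masters))  # two golden records: one merged Smith, one Doe
```

A real MDM hub would use far richer matching (phonetic keys, address standardization, probabilistic scoring) and configurable survivorship rules, and the "Distributed Data Quality" idea above amounts to running this kind of logic partitioned across a big data cluster rather than in a single loop.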
What have been your experiences in this space? How have you been able to handle such data in the context of MDM, Data Quality, Data Governance and Data Archiving?