Customer Data & Analytics Blog

How the World of Customer Analytics is Changing

Iqbal Kaur | 6 minute read


My analytics career started more than 15 years ago, at General Electric’s financial services business. Prior to that, I had worked in various consumer-lending roles within and outside GE, but analytics was a new beast to me, and a very exciting one. Those were the days of SAS, SQL, and structured databases. Social media wasn’t even on the horizon. We considered ourselves lucky to work at a company that had invested so much in data, analytical software, visualization tools, and rule engines. 

Five years later, when I joined Target, visualization tools had become more advanced, social media had picked up, and we had started to hear about open source analytical software and even a new way of storing and processing data. Things had started to look different in the world of analytics, but few of us realized that these were no minor changes, and that over the next decade the face of consumer analytics would change dramatically. 

Let us look at what has changed, and how it impacts enterprises today.

Data Platform:


From relational and object-relational databases to NoSQL and NewSQL – database technology has undergone a huge change. This has allowed us to manage large quantities of structured data, while also storing and processing unstructured data that is being generated to the tune of exabytes every single day. 

Hadoop has taken storage and processing to a different level through distributed file systems and the MapReduce programming model. We now live in a world where we can store and process humongous amounts of data in extremely short time frames. Gone are the days when extracting 50 variables for 10 million customers could take hours. 
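To make the MapReduce idea concrete, here is a minimal, single-machine sketch of the classic word-count pattern in plain Python. Hadoop distributes the same map and reduce phases across a cluster of machines; the function names below are illustrative, not Hadoop’s actual API.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce: sum the counts for each word (pairs are grouped by key)."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

docs = ["the cat sat", "the cat ran"]
counts = dict(reduce_phase(map_phase(docs)))
print(counts)  # {'cat': 2, 'ran': 1, 'sat': 1, 'the': 2}
```

The power of the model is that the map and reduce steps are independent per key, so a framework can shard them across thousands of nodes without changing the logic.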

Cloud storage and cloud computing have also contributed significantly to this change. Today, one no longer needs to own and manage data servers, or maintain on-premise installations of analytical software and visualization tools. All one needs are APIs, or applications that use them. Also on the rise is the hybrid cloud – a mix of on-premises infrastructure, managed cloud, and public cloud whose components remain distinct entities but are bound together, offering the benefits of multiple deployment models.

Analytical Tools and Languages:


SAS was the undisputed market leader in the commercial analytics space for many years. But open source languages like R and Python have overtaken the expensive SAS in the era of machine learning. 

All three languages cover the basic required functionality, but access to up-to-date techniques matters a lot if your work demands it. Because R and Python are open source, they gain the latest technologies and features more quickly than SAS. 

While R has a much larger library of statistical packages, Python is better for building analytics tools. If you’re doing specialized statistical work, R packages cover more techniques and allow for better visualization, but Python is more effective for deep learning. Overall, R has an edge in statistics and visualization, whereas Python has an advantage in machine learning and building tools. 

Open source R has allowed many companies to start doing analytics without having to invest in an expensive SAS environment. And Python is the reason behind the plethora of analytical tools in the market today. 

Analytics Approach, from Statistics to Machine Learning:


At the heart of it, analytical techniques haven’t changed a lot, but there is a significant difference between the techniques used in the past and those used now. 

The most common supervised learning techniques used previously were linear regression, Naïve Bayes, logistic regression, CHAID, and time series methods like smoothing, ARIMA, ARIMAX, etc. The unsupervised methods were mainly K-means clustering, PCA, etc. This was primarily because of the availability of products and resources. 
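As a reminder of how simple these classical workhorses are, here is simple linear regression fitted from scratch with the ordinary least squares closed form, in plain Python (a toy sketch with made-up data, purely to illustrate the technique):

```python
def fit_simple_ols(xs, ys):
    """Fit y = a + b*x by ordinary least squares (closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance(x, y) divided by variance(x)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    # Intercept: the line must pass through the point of means
    a = mean_y - b * mean_x
    return a, b

a, b = fit_simple_ols([1, 2, 3, 4], [2, 4, 6, 8])
print(a, b)  # 0.0 2.0  (the data lie exactly on y = 2x)
```

Models like this were transparent and cheap to score, which is why they dominated the era before machine learning tooling matured.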

For basic optimization problems, Excel solvers were used, and SAS OR wasn’t the best for complex optimization issues due to the coding needed, so specialized software, like Dash Optimizer, was used. 
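The kind of small allocation problem once handed to an Excel solver can even be brute-forced in a few lines of Python. The production-mix numbers below are invented purely for illustration:

```python
# Toy linear program: maximize profit 3x + 5y over integer x, y
# subject to x + 2y <= 14, 3x - y >= 0, and x - y <= 2.
# A solver would use the simplex method; here we simply enumerate.
best = None
for x in range(0, 15):
    for y in range(0, 8):
        if x + 2 * y <= 14 and 3 * x - y >= 0 and x - y <= 2:
            profit = 3 * x + 5 * y
            if best is None or profit > best[0]:
                best = (profit, x, y)

print(best)  # (38, 6, 4): profit 38 at x=6, y=4
```

Enumeration only works for tiny integer problems, which is exactly why complex optimization needed specialized software.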

Neural networks were being used in areas of fraud detection, but again needed specialized tools. 

Today, advancements in machine learning allow for the use of “deeper” techniques that can uncover complex patterns, non-linearity, variable interactions, etc., with much higher accuracy. For instance: 

  1. Supervised – linear classifiers such as Naïve Bayes, the perceptron, and Support Vector Machines; k-nearest neighbours; Random Forests; neural networks
  2. Unsupervised – matrix factorization, autoencoders, Restricted Boltzmann Machines, self-organizing maps (SOM)
  3. NLP & text mining – Latent Dirichlet Allocation (to generate keyword topic tags), and Maximum Entropy, SVM, or neural-network models (for sentiment classification), often built with frameworks such as TensorFlow
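As a small illustration of one technique from the list, here is a multinomial Naïve Bayes classifier for sentiment, written from scratch with Laplace smoothing. The training phrases are made up; a real pipeline would train on thousands of labeled documents.

```python
import math
from collections import Counter, defaultdict

def train_nb(labeled_docs):
    """Estimate log priors and per-class word log-likelihoods."""
    class_counts = Counter(label for _, label in labeled_docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in labeled_docs:
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    total = sum(class_counts.values())
    model = {}
    for label in class_counts:
        n_words = sum(word_counts[label].values())
        model[label] = (
            math.log(class_counts[label] / total),
            {w: math.log((word_counts[label][w] + 1) / (n_words + len(vocab)))
             for w in vocab},
            math.log(1 / (n_words + len(vocab))),  # fallback for unseen words
        )
    return model

def predict_nb(model, text):
    """Pick the class with the highest posterior log-probability."""
    scores = {}
    for label, (log_prior, log_lik, unseen) in model.items():
        scores[label] = log_prior + sum(
            log_lik.get(w, unseen) for w in text.lower().split())
    return max(scores, key=scores.get)

model = train_nb([
    ("great product love it", "pos"),
    ("awful service hate it", "neg"),
    ("love the great service", "pos"),
    ("hate this awful product", "neg"),
])
print(predict_nb(model, "love this service"))  # pos
```

The “naïve” assumption (word occurrences are independent given the class) is what keeps training and scoring this cheap.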

A special class of machine learning is deep learning: algorithms that learn multiple levels of features, or representations, of the data. Higher-level features are derived from lower-level features to form a hierarchical representation. Deep learning algorithms transform their inputs through more layers than shallow learning algorithms, and power applications such as automatic speech recognition, image recognition, and NLP. 
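The layered idea can be seen in a toy example: a two-layer network computes XOR, a function no single linear layer can represent, because the second layer combines features extracted by the first. The weights below are hand-chosen for illustration, not learned.

```python
def relu(v):
    """Rectified linear unit, the standard deep-learning activation."""
    return max(0.0, v)

def xor_net(x1, x2):
    """Two-layer network: hidden features feed a second layer.
    Weights are hand-chosen to illustrate hierarchy, not trained."""
    # Layer 1: two hidden features derived from the raw inputs
    h1 = relu(x1 + x2)          # fires if either input is on
    h2 = relu(x1 + x2 - 1.0)    # fires only if both inputs are on
    # Layer 2: combine the hidden features into the output
    return h1 - 2.0 * h2

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))  # XOR truth table: 0.0, 1.0, 1.0, 0.0
```

Real deep networks stack dozens or hundreds of such layers and learn the weights from data, but the principle of building higher-level features from lower-level ones is the same.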

Data Visualization:


I don’t think many of us have the abbreviation “MIS” in our vocabulary any more. The emergence of BI tools, with their focus on presenting data in the most effective manner, played an instrumental role in making analytics accessible to everyone. These tools take complex data and, through data visualization, present the information in an easy-to-interpret visual format. 

No one likes to use basic bar graphs and pie charts any more. Today information is displayed using sunburst charts, box plots, cartograms, Sankey diagrams, zoomable area charts, fisheye distortion, hive plots, etc. Even more characteristic of this era is the advent of infographics. 

Modern BI tools also enable data visualization that extends across multiple devices, allowing people on different devices to look at the same information in real time. 

Customer Interactions:


I started my analytics career when direct marketing was the key method of interacting with customers. The data would be uploaded into a customer data warehouse on a weekly basis. We would then extract it, build predictive models, segment customers, and push the list of selected customers through campaign-execution tools. 
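That batch workflow can be sketched in a few lines: score each customer with a predictive model, rank, and extract the target list for the campaign tool. The records and scoring rule below are invented stand-ins, not a real propensity model.

```python
# Toy customer records; in practice these came from the weekly warehouse extract.
customers = [
    {"id": 1, "recency_days": 10, "spend": 500.0},
    {"id": 2, "recency_days": 200, "spend": 50.0},
    {"id": 3, "recency_days": 30, "spend": 300.0},
]

def propensity(c):
    """Stand-in for a predictive model's score (higher = more likely to respond)."""
    return c["spend"] / (1.0 + c["recency_days"])

def build_campaign_list(customers, top_n):
    """Rank customers by score and return the top-N IDs for campaign execution."""
    ranked = sorted(customers, key=propensity, reverse=True)
    return [c["id"] for c in ranked[:top_n]]

print(build_campaign_list(customers, 2))  # [1, 3]
```

The crucial limitation was cadence: the whole loop ran weekly, against a static snapshot, which is exactly what today’s omni-channel world has left behind.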

But in today’s omni-channel world, there are many ways to interact with customers. The purchase paths today are no longer simple and linear – they are longer and can be influenced by many factors. 

Customer experience in this Big Data age calls for relevance at every touch point.  Making customer experience relevant, more targeted, personalized, and customized is extremely important for customer retention and monetization. And this customer relevance cannot be episodic – maintaining relevance continuously as customers move seamlessly from one touch point to the next is a new demand of the digital age. 

To deliver these relevant experiences at scale, companies need to tap into the wealth of data sources that provide valuable signals about what attracts and holds customer attention at any moment in time: not just data about geo-location, gender, age, product preference, or purchase, but also, more importantly, ongoing behavior, intent, interests, and whom customers talk and listen to. Of course, there is also a need to integrate these data points in near real time, using systems that can handle the multiple platforms your customers are using to connect with you. 

In summary, a lot has changed in the area of customer analytics – be it customers’ interactions, their expectations, or the data and technology to support those expectations. But what hasn’t changed is that customer analytics continues to be as exciting as ever. In fact, with all the advancements in technology, it is expected to become even more interesting. Imagine ongoing real-time customer behavior data blended with psychographics, intent, and demographics. Deep learning can then deliver far more accurate propensity estimates, and businesses can review each customer, push individualized offers, retain customers proactively, and focus on customer lifetime value. 

At ZyloTech, with an AI-powered platform, we automate the entire analytics supply chain, from data collection to ongoing data curation to embedded analytics, and power a near-real-time customer intelligence dashboard as well as individualized customer calls to action.

Here’s to the exciting future of customer analytics, and the key role it will continue to play in the enterprise!

Topics: Customer Analytics