15 Critical Data and Analytics Mistakes to Avoid


Do you believe you are closing all of your data analytics gaps to run your operations successfully? Do you know the biggest areas data professionals tend to overlook? In this blog post and supporting podcast, we dive deep into the 15 critical data and analytics mistakes and how to avoid them.

Listen to the 15 Critical Data and Analytics Mistakes to Avoid

Show Summary:
1. De-prioritizing business outcomes in favor of analytics projects

Analytics professionals tend to get busy with the day-to-day asks in the organization. Fire drills from senior leadership, customer reports, and stakeholder asks all increase the complexity of prioritization, and each of these asks starts to become a project in itself. In my current situation, for example, roughly 80% of the requests coming in are housekeeping items, and they turn into big projects because some of them get really complex and we have to pull in IT and other departments to sort them out. But the remaining 20% is what should get 80% of the focus for most teams: the business outcomes. What exactly is going to drive the next set of growth for the company? How are we going to improve the performance of marketing? How are we going to change things that will have a positive impact on close rates and lead velocity? We end up de-prioritizing those business outcomes just to manage the pile of analytics projects that takes up all our time.

Whoever leads the data and analytics team should reconsider, re-jigger the team, and put the focus back on business outcomes. There is no shortage of analytics projects in an organization; there will always be something going on at any given time. So if you really want your group to expand, if you want to make a business case for adding team members, tools, technology, or datasets, then you have to focus on business outcomes.

2. Not having a centralized approach to analytics management

For this one in particular, it starts with the idea of a center of excellence. If you look at companies that do analytics well, they create a center of excellence. It's not just having a single person that everybody reports to, but a strong cadence between different groups: representatives from different parts of the organization who are responsible for analytics management, all with a seat at the same table, where those discussions can happen on a consistent basis. Everyone shares the same understanding of what needs to happen and what's prioritized, and things stay centralized, so you don't have one team reporting one number and another team saying, "No, no, no. You're about five points off."

Additionally, here is what I've seen from experience leading analytics and demand gen teams: when you start an organization, specifically a new marketing organization, analytics tends to be a small functional unit within the demand generation group, maybe a couple of people reporting to a demand generation lead and running analyses. But eventually you start seeing the opportunity; once the company has really aggressive goals and starts to grow, that's where centralization of analytics comes into the picture. Because if the team is attached to one specific set of demand generation projects, then most of the work that analytics team does will be focused on that particular team's needs. It won't be central; it won't be a layer across marketing.

It will be siloed to that team. When that happens, it's time for leadership and all the different leaders in the group to consider: "Hey, now is the time for this operations or analytics group to become a separate entity, reporting directly to the CMO, so it can function, expand, and grow based on its own needs, not the needs of the particular team it used to be part of." So it's very critical to have a centralized approach to analytics that can serve all the different segments of marketing independently. I'm talking specifically about marketing analytics, but the same concept applies to sales operations, financial operations, and HR operations; all of those areas need to be centralized. Again, it depends on how big or small the organization is. If you're a three or four person team, I'm not saying you should go create a completely separate analytics department.

But if you’re a decent-sized organization and you know the needs are becoming larger and you want to function independently and serve everyone equally, then you have to have this centralized group.

3. Too much data and not enough insight

We discussed the data volume issue in our previous podcast, Five Actionable Lessons from Big Data Survey, where we talked about why the variety of data is more important than the velocity or volume. When we talk about the increase in data, we're not just talking about volume, which we all know is growing significantly; IDC predicts that by 2020, 1.7 MB of new data will be created every second for every person on earth, and the digital universe will reach roughly 40 zettabytes (a zettabyte is a trillion gigabytes). Volume is one big problem, but variety is another. There's data coming from the call center, from your web analytics platform, CRM data, social media data; all sorts of data coming in, adding more complexity. So people tend to spend a lot of time trying to figure out where the data is coming from and how much data is available, and the whole project becomes a data project rather than an actionable insight project. That is exactly what we're seeing here.

If we go back to mistake number one, the de-prioritization of business outcomes, it connects directly with number three. If we have prioritized our business outcomes correctly, then we know what types of datasets we need to work with, which platforms we're going to interact with, what the essential outcome should be, and what our recommendation should be. It's tightly connected with the prioritization of business outcomes: we don't focus just on the variety and velocity of data; we focus on actionable insights.

4. Performing analytics on raw data alone

I think this is a very critical one and a problem I come across consistently. Because data is being created at such an enormous pace, it's really hard to keep up. So once you start seeing the data, you want to jump in and take action right away, before going through the entire process of extract, transform, and load (ETL): cleaning the data, making it usable, connecting it to the data warehouse, which is more of a BI role and an IT role. This is where balance comes in. Once you've built your analytics infrastructure, you want to leverage your IT department. IT is your liaison; they are your friends and allies in this process.

That way, you're not spending a significant amount of time making the data usable. You're leveraging their help to provide the infrastructure, and they can set up a cadence where you have the right technology and the right infrastructure, so you have data in a usable format.

I'm not saying you completely leave it up to them, because you know your data better than anybody else. Work with them in tandem to make the data operationally usable, and then take action on it. If you just take raw data and start acting on it, you can run into lots of issues: null values, incorrect categorization, unstructured data, and more.

Even data scientists sometimes struggle with null values, because it's really hard to decide: "What do we do with them? Should we keep them? Should we take them out of the data? Should we normalize them?" There are lots of things you can do. The other consistent problem I've noticed with raw data is structure. If you get your call center data, which is acoustic data, what are you going to do with it? How are you going to convert it to a text format? Do you have any text analytics capability?

The same applies to Twitter data. We do a lot of Twitter analysis. How are you going to take that data and convert those tweets into a readable format so your machine learning algorithms can run properly? Mistake number four, performing analytics on raw data, is dangerous because it can produce insights that lead you down a completely wrong path in your decision-making.
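
To make this concrete, here is a minimal sketch, using made-up lead records and pandas, of the kind of cleanup that should happen before any analysis runs on the data (column names and replacement rules are illustrative, not prescriptive):

```python
# A minimal cleanup pass before analysis, instead of acting on raw data.
import pandas as pd

raw = pd.DataFrame({
    "lead_source": ["PPC ", "adwords", "Email", None],   # messy free-text categories
    "region": ["east", "west", "east", "west"],
    "deal_size": [12_000, None, 8_000, 5_000],
})

# Null values: decide explicitly how to handle them instead of letting
# them silently skew averages (here: fill with the median).
raw["deal_size"] = raw["deal_size"].fillna(raw["deal_size"].median())

# Incorrect categorization: normalize free-text source names.
raw["lead_source"] = (
    raw["lead_source"].str.strip().str.lower()
       .replace({"ppc": "paid search", "adwords": "paid search"})
)

# Drop rows that are still unusable after cleanup.
clean = raw.dropna(subset=["lead_source"])
print(clean)
```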

5. Overlooking key takeaways and making judgmental decisions

I think the key word here is 'balance'. I'm not saying an experienced person, an experienced leader, shouldn't have a say in the direction we want to go with a specific project or with the company itself. But when we have a team responsible for identifying trends and key takeaways using data, we need to leverage it to its full potential. I'm also not saying that if the data tells you the company isn't going to grow, you just stop and start worrying: "Hey, what are we going to do about it?" You definitely use your hunch, your experience, your vision and direction to guide the company. But don't override what the data tells you with too many judgment-based decisions. We have seen a lot of companies go in a completely negative direction because they relied on judgmental decisions instead of their data.

Judgmental decisions can also come from a limited scope of data: hyper-focusing on a very finite slice and not looking at how that data's attribution affects other tools and other channels.

The best example I can give is from a company I used to work for. There was a hyper-focus on ROI; our goal was a 1.5 to one ROI, which is common and average. We were stack-ranking all the different marketing channels we were using and found some that were below a 0.5 ROI, giving us a terrible return on investment. Based on our marketing mix modeling, those were not channels you wanted to increase investment in. So the decision was to pull budget from them, but too much was pulled. The lesson: it's easy to miss how the performance of one channel affects the performance of another. Too much money was pulled out, and it impacted the performance of the rest of the business.

If the ROI for one particular channel is not optimal, what we need to understand is how that channel impacts your lead funnel, your lead progression, or your customer's journey.

There are areas we know are not performing optimally, but our analysis also shows they're somehow impacting the overall progression of leads from one stage to the other, and of opportunities from one stage to the other. So don't just look at the data in a small segment; look at the big picture as well.
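
To make the arithmetic concrete, here is a minimal sketch with made-up numbers that stack-ranks channels against the 1.5:1 goal and the 0.5 floor from the story above (the thresholds and spend figures are illustrative):

```python
# Stack-rank channels by ROI, the way the example describes.
import pandas as pd

channels = pd.DataFrame({
    "channel": ["paid search", "display", "email", "social"],
    "revenue": [300_000, 40_000, 150_000, 60_000],
    "spend":   [150_000, 100_000, 60_000, 90_000],
})
channels["roi"] = channels["revenue"] / channels["spend"]
channels["flag"] = channels["roi"].apply(
    lambda r: "cut candidate" if r < 0.5 else ("below goal" if r < 1.5 else "healthy")
)
print(channels.sort_values("roi", ascending=False))
# Caveat from the story: a "cut candidate" may still feed other channels,
# so check cross-channel effects before pulling all of its budget.
```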

6. Taking too long to switch from old to new analytics platform

I believe everyone is somehow guilty of this one. Every time you go to these conferences, you sit there inundated with new tools and new analytics platforms and you get excited. You come back and say, "Let's go buy this new tool. Let's go adopt this new technology." But that's not the correct methodology. There's a science to this.

I think it's a combination of art and science. Your marketing stack needs to define what your marketing analytics technology needs are. How do you want to shape your marketing? What are the different components, the different products? Are they products that make your data visualization easier, or products that make your data available? There are different types of products for each and every thing.

Now, the challenge is that organizations sometimes have legacy solutions they're holding onto for whatever reason. Maybe they're just comfortable with them; maybe they're worried other people won't like the change. I've seen that as well. Make sure you don't take too long to switch from the legacy solution to the new solution that's already in place.

Because first off, you've spent money on that new solution that's already in-house; you've bought into it, whether you built it or bought it. Second, you've already built all the customizations your company needs and taken the different stakeholders' requirements into consideration. Now you're just waiting, and waiting, and waiting for the new solution to somehow magically take over and the old one to go away.

That's where it needs to happen quickly. Once you have enough information and stakeholder buy-in, and you know the new solution is going to be better for your organization and provide much more robust analytics, just switch over. Flip the switch. Don't wait too long.

There are a lot of considerations in play, and much of it comes back to: What was your goal? Is it truly integrated analytics, or integrated data across channels? Is it journey visualization? Is it scalable? Is it a tool where you can keep adding capabilities to scale with the potential growth of your company over the next two to three years? There's a ton of considerations.

7. Not communicating the actionable insights clearly and frequently

This one in particular has to do with your methodology and where the analytics team sits within the org. Truly meaningful pieces of data don't come along often, and being able to identify and communicate them matters. A lot of teams just focus on, "I did my work. I got my daily report out and I'm done." What ends up happening is that you're not creating reports with insights that actually get acted on. You end up spending all this time trying to explain what you need to get done, or what needs to get done, without really understanding what all the departments you work with are trying to do. Then you just cause more problems and more confusion.

The other thing this mistake comes down to is repetition. If you and your team have done the hard work and you have really solid recommendations, then you need to follow through. Say you gave the presentation, everybody bought in, and you're happy with how the analysis landed; you still need to follow through to make sure the analysis is turned into action. Because people are busy, generally. People tend to get lost in their day-to-day stuff.

They might go in a different direction. Priorities may change, things may change, people may change, new leadership may come in. From an analytics leadership standpoint, it is the responsibility of the leader of that group to push this forward to leadership again and again.

Once you have a powerful set of recommendations and actions, you need to push them through consistently, with laser focus, so you can see the results of your actions. Because typically, once you finish an analysis and send it over to a specific department without following through on what actions were taken, you never get closure on what you did. You don't know what's going on, whether it was good or bad, whether you'd want to do it again in the future, or whether it benefited the organization.

The communication actually goes two ways. It's not only communicating the actionable insights clearly and frequently to your stakeholders, but also getting feedback: having them communicate back to you whether those actionable insights created a positive return on investment or a positive outcome.

8. Drawing conclusions without statistical confidence

I think this mistake is also extremely important, and it gets to the sophistication piece. A great example is testing one message against another in your creative. If one version seems to be performing better, you get excited. This happens very frequently when a test has just launched: you see the variant you hoped for performing better, you want to quickly declare it the winner, and you want to push that creative forward without considering the possibility that the other variant might still overtake it.

You always want to know with very high confidence that you have a winning outcome and that your campaign has a better chance of converting. However, the results must be backed by statistical confidence.
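
As an illustration, here is a minimal sketch of such a significance check, using made-up test counts and the statsmodels library (a two-proportion z-test; your own stack may use a different test or tool):

```python
# Check whether variant B really beats variant A before declaring a winner.
from statsmodels.stats.proportion import proportions_ztest

conversions = [180, 210]   # variant A, variant B
visitors = [4000, 4100]

stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
if p_value < 0.05:
    print(f"Significant difference (p={p_value:.3f}); safe to pick a winner.")
else:
    print(f"Not significant yet (p={p_value:.3f}); keep the test running.")
```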

The other important point: even in general day-to-day analysis, when you're identifying the different impacts on your marketing budget or marketing performance, you should take the confidence interval into consideration, so you're predicting with higher accuracy, your algorithms are more sophisticated, and you're comparing your algorithms' output against the known answers in your training sets.

To be an effective analytics professional, to attach statistical confidence to what you're doing and communicate it across the company's stakeholders, you need to know a little bit about finance, about the marketing channels you support, about e-commerce funnels, about everything. You have to be able to translate statistical confidence into language that meets the needs of your stakeholders. If you can't do that, you're basically working in a vacuum.

9. Not focusing on the demand funnel progression metrics

I think a lot of this applies to companies that have attribution models built out but focus only on first-click and last-click attribution and don't really look at anything else. They're not building out a true funnel and looking at metrics through it: how your marketing channels convert, or how visitors progress through your website from landing page to shopping cart, and how those convert through the funnel.

From my observation working on a variety of projects, people generally worry about the top and the bottom of the funnel more than anything. What are you adding to the top of the funnel? How much traffic are you driving? How many campaigns are live? How many leads are you creating? What's your marketing budget? Then we worry about ROI, conversion rate, close rate, how many opportunities have been created, how many deals have come out of it.

In doing so, we tend to forget what happens in the middle. For example: what happened to the leads you created last quarter? Where are they sitting now? Are they moving down the funnel? Are salespeople following up on those leads and driving them all the way to conversion before you start adding a new pipeline of leads? Maybe leads are just sitting in a specific stage for a long time, and you need to tweak the funnel to push them forward. Not focusing on funnel progression metrics and stage velocity in the lead pipeline is detrimental to your outcome, even if you're on top of the top and the bottom of the funnel.
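
Here is a minimal sketch, with made-up stage counts and dwell times, of the middle-of-funnel metrics this section argues for: stage-to-stage conversion and velocity (the stage names and numbers are illustrative):

```python
# Stage-to-stage conversion and velocity: the metrics that expose the middle.
import pandas as pd

funnel = pd.DataFrame({
    "stage": ["inquiry", "MQL", "SQL", "opportunity", "closed-won"],
    "leads": [10_000, 2_500, 900, 300, 90],
    "avg_days_in_stage": [2, 14, 21, 35, 0],
})

# Conversion rate from each stage to the next (last row has no next stage).
funnel["conversion_to_next"] = funnel["leads"].shift(-1) / funnel["leads"]
print(funnel)
# A low conversion_to_next combined with a high avg_days_in_stage marks the
# stage where leads are stalling and the funnel needs tweaking.
```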

There is an art and science to funnel management too.

10. Picking wrong charts or visuals for your presentations

The last thing you want is to be in a room with 10 or 15 people, including senior management, trying to explain something while the chart just doesn't explain what you're solving for. All it does is create more questions.

If you're trying to communicate a trend but you show it in a bubble chart just because you think bubble charts are really cool, that's not going to solve for it. You need a chart type suited to showing trends: an area chart, a simple trended bar chart, or a line chart. A lot of this is common sense, but it also comes with experience. I'd suggest passing the chart to your peers and colleagues and getting their take: "Hey, does this make sense? Does it give you the insight really quickly, or am I missing something?" It's always helpful to do that.
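
As a quick illustration, here is a minimal matplotlib sketch, with made-up monthly lead counts, of showing a trend as a plain line chart rather than a fancier chart type:

```python
# A trend belongs on a line chart: one glance tells the story.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
leads = [120, 135, 150, 170, 160, 190]  # hypothetical monthly lead counts

plt.plot(months, leads, marker="o")
plt.title("Leads created per month")
plt.ylabel("Leads")
plt.show()
```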

Another simple approach: if you have a data visualization tool such as Tableau, it generally does a really good job of quickly suggesting which chart types fit the data you're presenting. Visible is an iPad app you can download to browse chart types and get automatic recommendations. Always try to use the best possible chart to present your data and your story.

A good exercise is to look at an old presentation, or an old set of charts, that worked for the audience you're targeting. There are some people who only prefer bar charts.

11. Spending too much time picking the right analytics tools

I think this one applies to all types of organizations. Even in big organizations that already have an analytics infrastructure, there's always a need for a different type of tool that can provide a different type of analytics. With all the startups focusing on data analytics and visualization, there's always going to be a shiny new tool that you think is going to solve a bunch of your problems. The idea here is: once you've understood the market, done your RFP analysis, compared a good set of five to six tools, and done your due diligence, don't spend too much more time asking, "OK, which one should we go with?" Sometimes, especially with multiple stakeholders, it can take six months to a year just to settle on the right solution, which is a complete waste of time. Some companies also have a stringent compliance process that can add another one to two months to the tool selection and implementation timeline.

The trick is to go to your procurement and management teams, figure out what their goals and asks are, and go from there.

12. Running analytics in silos without proper integration

I think this comes back to number two, where we talked about not having a centralized approach. A lot of it depends on the size of the company, but it also comes back to performing analytics on raw data, or having too much data and not enough insight. When you analyze data within silos and don't properly integrate it, you lose all the other extremely important data points we talked about. The hard part is that you're not making informed decisions based on the full picture, and you don't see the effect of what could happen if you make the wrong decision.

You also spend more time looking at different datasets and trying to figure out how they all fit together. The biggest pain point I had, and the reason a lot of analytics and marketing professionals have multiple screens at their desk, is that they have to open three or four different tools across those screens, download the raw data, and upload it into a tool like Tableau or a simple Excel spreadsheet to find trends. It's a difficult process, and you're actually hurting yourself by doing it.

The idea again goes back to the story you are trying to tell with the data. If you have a big data problem, then go create a data hub or data lake. Work with your IT team, and then perform the integration across multiple segments. It is very important that you tie your data together to build a story.
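
Here is a minimal sketch of that tying-together, with made-up records: joining a hypothetical CRM export and a hypothetical web analytics export on email, so one query can span both silos (the column names are illustrative):

```python
# Join two siloed datasets so cross-silo questions get one answer.
import pandas as pd

crm = pd.DataFrame({
    "email": ["a@x.com", "b@y.com", "c@z.com"],
    "lead_stage": ["SQL", "MQL", "SQL"],
    "deal_size": [20_000, 5_000, 12_000],
})
web = pd.DataFrame({
    "email": ["a@x.com", "c@z.com"],
    "sessions": [14, 3],
})

combined = crm.merge(web, on="email", how="left")

# One cross-silo question, answered in one place:
# do more engaged visitors sit in later stages with bigger deals?
print(combined.groupby("lead_stage")[["sessions", "deal_size"]].mean())
```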

A lot of this also goes back to the provider, whether it's Google, IBM, Adobe, you name it. They have use cases that show proper integration across their tools to break down the silos. Challenge the provider to share successful use cases of how that integration can happen, and use their expertise to your benefit.

13. Not using unstructured data for your analysis

The unstructured data we're talking about here is essentially any data that isn't organized into rows and columns; it requires lots of cleanup and is not readily available for analysis. For example: your videos, your imagery, acoustic data, Twitter data such as tweets and social media updates, and the images you post on social media. All of these are different types of unstructured data.

Unstructured data is easy to ignore, because cleaning and using it becomes a complex challenge if you don't have in-house data science sophistication. To make this data usable, you have to parse it, remove null values, and give it more structure, for example by organizing the words used in your tweets. There are lots of pieces that come into play when adding structure to data. If you don't have the resources, either outsource the work to get your data structured, hire people who can help you structure it, or leverage IT and BI to make it possible.
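
As a small example of that structuring step, here is a minimal sketch that normalizes raw tweet text into tokens a model can consume (real pipelines would add language detection, emoji handling, stop-word removal, and so on):

```python
# Turn raw tweet text into clean tokens for downstream analysis.
import re

def clean_tweet(text: str) -> list[str]:
    text = text.lower()
    text = re.sub(r"https?://\S+", "", text)   # strip URLs
    text = re.sub(r"[@#](\w+)", r"\1", text)   # keep mention/hashtag words
    text = re.sub(r"[^a-z\s]", " ", text)      # drop punctuation and numbers
    return text.split()

print(clean_tweet("Loving the new #analytics dashboard from @ExampleCo! https://t.co/xyz"))
# ['loving', 'the', 'new', 'analytics', 'dashboard', 'from', 'exampleco']
```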

14. Lack of hypothesis-driven approach and too much data exploration

This comes down to the basic strategy of first asking, "What is the goal?" It sounds like such an easy concept, but really: what are you trying to solve for, and what hypothesis can you form about the possible outcome of the data you're exploring? It's almost as simple as that. Without it, you're just going down a long path of analysis that isn't going anywhere.

This tends to happen pretty frequently when you gain access to new tools or datasets. There's always exploratory analysis: looking into your analytics platform, identifying patterns, and developing recommendations from them. Hypothesis-driven analysts, by contrast, come in with a predefined set of hypotheses before performing any analysis. In business language, that means having the right set of business outcomes and the possible results of those outcomes, then using the hypotheses to analyze the data and validate or invalidate them. Exploration is great if you have a large enough amount of data, but you should also consider a hypothesis-driven analytics approach rather than doing ever more exploration.

15. Failing to integrate customer data for analysis

This goes back to getting rid of those silos, to not running your analytics without integration, and there are different ways to do it. Some people run great analyses within their various tools and channels, but at the end of the day they're working with multiple KPIs and never integrating the outcomes of those datasets to identify the true effect of the decisions they're making and how those decisions affect the bigger picture.

This also goes back to using your customers' data so you can have better insight into who your target market is. The idea here is to take your customer data and develop insights, so that when you go to market you know your customers' purchasing patterns, the industries they're from, their employee sizes, and the different segments of customers, and you go attack that.
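
Here is a minimal sketch, with made-up customer records, of turning integrated customer data into a go-to-market view by industry and employee size (fields and segment labels are illustrative):

```python
# Summarize customers by industry and size to guide targeting.
import pandas as pd

customers = pd.DataFrame({
    "industry": ["SaaS", "SaaS", "Retail", "Retail", "Finance"],
    "employee_size": ["1-50", "51-200", "1-50", "51-200", "51-200"],
    "annual_spend": [12_000, 40_000, 6_000, 18_000, 55_000],
})

segments = (
    customers.groupby(["industry", "employee_size"])
             .agg(count=("annual_spend", "size"),
                  avg_spend=("annual_spend", "mean"))
             .sort_values("avg_spend", ascending=False)
)
print(segments)  # the segments most worth going after first
```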

Now take action, and listen to the podcast for a meatier, more detailed action plan.

