Why Many Marketing Automation Projects Go South
As a data and analytics consultant, I often get called in when things do not work out as planned or expected. I suppose my professional existence is justified by someone else's problems: if everyone followed the right path from the beginning and everything went smoothly all the time, I would not have much to clean up after.
In that sense, maybe my role model should be Mr. Wolf in the movie “Pulp Fiction.” Yeah, that guy who thinks fast and talks fast to help his clients get out of trouble pronto.
So, I get to see all kinds of data, digital, and analytical messes. The keyword in the title of this series "Big Data, Small Data, Clean Data, Messy Data" is definitely not "Big" (as you might have guessed already), but "Messy." When I enter the scene, I often see lots of bullet holes created by blame games, and traces of project participants who have since departed. Then I wonder how things could have gone so badly.
There are so many ways to mess up data or analytics projects, be they CDP, Data Lake, Digital Transformation, Marketing Automation, or whatever sounds cool these days. First off, none of these is simple to develop, or something you just buy off the shelf. Even if you did, someone would have to tweak more than a few buttons to customize the toolset to meet your unique requirements.
What did I say about those merchants of buzzwords? I don’t remember the exact phrase, but I know I wouldn’t have used those words.
Like a veteran cop, I've developed a sense for figuring out what went wrong. So, allow me to share some common traps that many organizations fall into.
No Clear Goal or Blueprint
Surprisingly, a great many organizations get into complex data or analytics projects with only vague ideas or wish lists. Imagine constructing a building without a clear purpose or a blueprint. Who is the building for, and what is it for? Is it a residential building, an office building, or a commercial property?
Just as a building is not a simple sum of raw materials, a database is not a random pile of data, either. But do you know how many times I get to sit in on a meeting where "putting every data source together in one place" is the goal in itself? I admit that would be better than data scattered all over the place, but the goal should be defined much more precisely: how the data will be used, by whom, for what, through what channel, using what types of toolsets, etc. Otherwise, it just becomes a monster that no one wants to get near.
I've even seen so-called data-oriented companies go out of business thanks to monstrous data projects. Like any major development project, what you leave out is as important as what you put in. In other words, the sum of absolutely everyone's wish list is no blueprint at all, but the first step toward the inevitable demise of the project. The technical person in charge must be business-oriented, and be able to say "no" to some requests, looking 10 steps down the line. Let's just say that I've seen too many projects hopelessly stuck thanks to features that would barely matter in practice (as in "You want what in real time?!"). You might as well design a car that flies, too.
No Predetermined Success Metrics
Sometimes, the project goes well, but executives and colleagues still call it a failure. For instance, a predictive model, no matter how well it is constructed mathematically, cannot single-handedly overcome bad marketing. Even with effective marketing messages, it cannot keep doubling the performance level indefinitely. A huge jump in a KPI (e.g., doubling the response rate) may be possible for the very first model ever, compared to previous campaigns without any precision targeting, but no one can expect such improvement year after year.
Before a single byte of data is manipulated, project champions must determine the success criteria for the project, in terms of coverage, accuracy, speed of execution, engagement level, revenue improvement (by channel), etc. Yes, it is harder to sell an idea with lots of disclaimers attached to the proposal, but not starting the project at all may be better than being called a failure after spending lots of precious time and money.
Some goals may be in conflict with each other, too. For instance, response rate is often inversely related to the value of the transaction. So, if the blame game starts, how are you going to defend the predictive model that is designed primarily to drive the response rate, not necessarily the revenue per transaction? Set clear goals in numeric format, and more importantly, share the disclaimers upfront. Otherwise, "something" will look wrong to someone.
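To make that trade-off concrete, here is a back-of-the-envelope sketch. All the numbers (audience size, response rates, dollar values) are made up purely for illustration, not drawn from any real campaign, but they show how a model tuned for response rate can still lose on total revenue:

```python
def expected_revenue(audience, response_rate, revenue_per_response):
    """Expected campaign revenue = audience x response rate x revenue per responder."""
    return audience * response_rate * revenue_per_response

audience = 100_000  # hypothetical mailing size

# Model A: optimized for response rate (tends to attract bargain hunters)
rev_a = expected_revenue(audience, 0.040, 50.0)   # 4.0% response, $50 per order

# Model B: lower response rate, but higher-value transactions
rev_b = expected_revenue(audience, 0.025, 95.0)   # 2.5% response, $95 per order

print(f"Model A: ${rev_a:,.0f}")
print(f"Model B: ${rev_b:,.0f}")
```

Model A "wins" on response rate, yet Model B produces more total revenue. Unless the success metric is agreed upon upfront, each camp can point at a different number and declare the project a failure.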
But what if your scary boss wants to boost acquisition rate, customer value, and loyalty all at the same time, no matter what? Maybe you should look for an exit.
Top-Down Culture That Trusts Gut Feelings Over Data
By nature, analytics-oriented companies are flatter and less hierarchical in structure. In such places, data and empirical evidence win the argument, not the organizational rank of the speaker. It gets worse when the highest-ranking officer has very little knowledge of data or analytics in general. In a top-down culture, no one would question that C-level executive in a nice suit. Foremost, the executive wouldn't question his own gut feelings, as those gut feelings put him in that position in the first place. How can he possibly be wrong?
The trouble is that the world is rapidly changing around every organization, and monitoring the right data from the right place is the best way to stay informed and act preemptively. I haven't encountered any gut feeling (including my own) that stood the test of time better than data-based decision-making.
Now sometimes, though, a top-down culture is a good thing. If the organizational goals are clearly set, and if the top executive supports the big data project (no pun intended here) instead of launching blame games, then countless inter-departmental conflicts will be mitigated upfront (as in, "Hey, everyone, we are doing this, alright?").
Conflicts Among Teams — No Buy-in, No Use
But no amount of executive force can eliminate all infighting that easily. Some may say "Yeah, yeah, yeah" in front of the CEO or CMO, but sabotage the whole project behind the scenes. In fact, I've seen many IT departments get in the way of the noble idea of "Customer-360."
Why? It could be a data ownership issue, security concerns, or a lack of understanding of 1:1 marketing or advanced analytics. Maybe they just want the status quo, or see any external influence on data-related matters as a threat. In any case, imagine a situation where the very people who hold the keys to the source data are NOT cooperating with data or analytics projects meant to benefit other departments. Or worse, maybe you have "seen" such cases, as they are so common.
Another troublesome example would be on the user side. Imagine a situation where sales or marketing personnel do not buy into any new way of doing things, such as using model scores to understand the target better. Maybe they got burned by bad models in the past. Or maybe they just don't want to change things around, like those old-school talent scouts in the movie "Moneyball." Regardless, no buy-in, no use. So much for that shiny marketing automation project that sucked up a seven-figure budget to develop and deploy.
Every employee puts prolonged employment over any project, dumb or smart. Do not underestimate people's desire to keep their jobs with minimal changes.
Players Haven’t Seen Really Messy Situations Before
As you can see, data or analytics projects are not just about technologies or mathematics. Further, data themselves can be a hindrance. I’ve written many articles about “good” data, but they are indeed quite rare in real life. Data must be accurate, consistent, up-to-date, and applicable in most cases, without an excessive amount of missing values. And keeping them that way is a team sport, not something a lone tech genius can handle.
Unfortunately, most graduates with degrees in computer science or statistics never see a real bloody mess before they get thrown onto the battlefield. In school, problems are nicely defined by the professors, and the test data are always in pristine condition. I don't think I have seen such clean, error-free data since my school days, which were indeed a lifetime ago.
Dealing with organizational conflicts, vague instructions, and messy data is part of the job for any data professional. It requires quite a balancing act to provide "the least wrong answers" consistently to constituents with vastly different interests. If the balance is even slightly off, you may end up with a technically sound solution that no one adopts into their practices. Forget about full automation of anything in that situation.
Already Spent Money on Wrong Things
This one is a heartbreaker for me, personally. I arrive on the scene, examine the case, and provide step-by-step solutions to reach the goal, only to find out that the client company has already spent its money on the wrong things and has no budget left to remedy the situation. We play with data to make money, but playing with data and technology costs money, too.
There are so many snake oil salespeople out there, over-promising left and right with lots of sweet-to-the-ears buzzwords. Yeah, if you buy this marketing automation toolset armed with state-of-the-art machine-learning features, you will get actionable insights out of any kind of data in any form through any channel. Sounds too good to be true?
Marketing automation is really about the “combination” of data, analytics, digital content, and display technologies (for targeted messaging). It is not just one thing, and there is no silver bullet. Even if some other companies may have found one, will it be applicable to your unique situation, as is? I highly doubt it.
The Last Word on How to Do Marketing Automation Right
There are so many reasons why marketing automation projects go south (though I don’t understand why going “south” is a bad thing). But one thing is for sure. Marketing automation — or any data-related project — is not something that one or two zealots in an organization can achieve single-handedly with some magic toolset. It requires organizational commitment to get it done, get it utilized, and get improved over time. Without understanding what it should be about, you will end up automating the wrong things. And you definitely don’t want to get to the wrong answer any faster.
Stephen H. Yu is a world-class database marketer. He has a proven track record in comprehensive strategic planning and tactical execution, effectively bridging the gap between the marketing and technology world with a balanced view obtained from more than 30 years of experience in best practices of database marketing. Currently, Yu is president and chief consultant at Willow Data Strategy. Previously, he was the head of analytics and insights at eClerx, and VP, Data Strategy & Analytics at Infogroup. Prior to that, Yu was the founding CTO of I-Behavior Inc., which pioneered the use of SKU-level behavioral data. “As a long-time data player with plenty of battle experiences, I would like to share my thoughts and knowledge that I obtained from being a bridge person between the marketing world and the technology world. In the end, data and analytics are just tools for decision-makers; let’s think about what we should be (or shouldn’t be) doing with them first. And the tools must be wielded properly to meet the goals, so let me share some useful tricks in database design, data refinement process and analytics.” Reach him at email@example.com.