Marketing Data Archives - Chief Marketing Technologist

5 data assumptions that marketers should avoid
Fri, 07 Apr 2017
Data Assumptions Marketers Should Avoid

The following is a guest post by David Dunne, CEO of Velocidi.

Marketers using data to inform strategy is not a novel concept. Yet the process for harnessing this information and the amount and type of data available to marketers have converged to redefine data-driven marketing in 2017.

Data that is siloed by applications, teams, and legacy systems has resulted in incomplete or incongruent resources. Access to total enterprise data, including customer data, sales data, and research, has widened the scope of the marketer’s reach. And this sheer volume has made the process of funneling data through to actionable insights more complex and challenging.

Brand managers, striving to maximize the performance of their spend, have an opportunity to embrace advanced marketing analytics and positively impact collaboration, real-time decision-making, and revenue. This starts by removing the perceived barriers to entry for working with data.

In order to maximize the value of marketing data, here are five assumptions marketing leaders should avoid:

Assumption 1: Data is not forward-looking; I can’t be responsive with data.

A 2016 Forrester study found that marketers are struggling to be responsive in real-time interactions with customers, as they focus on traditional performance measurements such as customer retention and acquisition.

The same study identified big data technologies as the bridge between customer experience and these marketing priorities. Many marketers limit data to postmortem evaluation of campaign performance. Outdated data-management processes are often to blame — the laborious tasks of managing and reporting leave little time for forward-looking analysis.

But marketing technology that streamlines multi-channel performance measurement can provide near-real-time analysis that enables greater responsiveness. For example, automated visualization and predictive modeling help marketers identify patterns within data that can inform real-time decision-making.

Assumption 2: Marketers must become mathematicians to use AI.

Last year, Demandbase and Wakefield Research found that 80% of B2B marketers believed AI would “revolutionize” marketing by 2020. However, only 26% reported confidence in their own understanding of AI’s applications. Furthermore, more than half cited technology integration and staff training as barriers to entry.

Contrary to popular belief, marketers do not have to be familiar with advanced mathematics and algorithms to make use of machine learning. The best marketing intelligence platforms — that is, the applications and infrastructure that enable access to and analysis of marketing data — will seamlessly integrate AI technology via reporting, visualization, and insight-generation, giving marketers more time to discover trends and opportunities for growth.

Assumption 3: Data undermines creativity.

Effective marketing is a combination of data-driven insights and creative thinking. Data-in, data-out is not the be-all and end-all of strategic output. Marketing will always be a creative industry; but marketers who embrace data-led creativity will have the upper hand.

Integrating audience, brand, and sales data to predict customer behavior may tell you what you need to communicate, but not how — that personalization comes from knowing your audience. Data science, then, complements the marketer’s own intuition; it doesn’t replace it.

Think of data as the layer of insight that fuels your own creative thinking.

Assumption 4: Someone else on my team will take care of the data.

Last year, Forbes Insights and Oracle Marketing Cloud reported significant struggles in brand-agency collaboration due to evolving roles within each organization.

Data management is surely a casualty of this dynamic. Collaboration between brand and agency teams is extremely important, but somehow data seems to get pushed off from one team to another.

If marketers want to maximize collaboration and, therefore, boost the value of marketing performance, closing the data literacy gap across both brand and agency teams is key. Proximity to data is essential to faster, timelier decision-making. When marketers put the right tools and processes together for collaborative data practices, there is tremendous opportunity for growth.

Assumption 5: I’ve invested too much into data management to disrupt this system.

A December 2016 study by Econsultancy found that while marketers have access to multi-channel data, their “methodologies are still forcing everything through the same outdated, siloed processes.”

Obtuse systems for data management and reporting have the unfortunate effect of denying marketers insights that could impact revenue, customer retention, and media spend. As data accessibility grows, marketers are in a position to weigh the costs of retaining these outdated systems.

Luckily, disruption is more manageable when your objectives are clear. For a faster line from data to insights, marketers should look for a system that provides a single view of all data in near-real time.

Thanks, David!

The rise of the marketing data lake
Thu, 09 Mar 2017
The Marketing Data Lake

The following is a guest post by Doug Kessler, co-founder and creative director of the B2B marketing agency Velocity. They worked closely with Informatica, one of their clients, to produce educational content on marketing data lakes, including a book by Franz Aman and Anish Jariwala. It’s some of the best writing on the topic that I’ve come across.

It’s way too late to call the rise of the marketing operations function. But as the role matures, we’re starting to see more and more ops people spending more and more of their time on one thing: data wrangling.

A recent long-form article by Informatica, What Is Marketing Operations?, gives a thorough update on how the role is changing. And it’s all about data. Data integration between cloud and on-premise applications (or cloud-to-cloud integrations); data quality; master data management…

You’d expect Informatica to feel this way, since they make data management software. But even three years ago, it was a rare marketing department where you’d hear conversations about this kind of thing. Today, it’s a rare one that isn’t talking about them.

Having gone wild on all the things that marketing apps let us do, we’ve all now hit the wall. Because each of these apps generates its own Mississippi of data. And each tags and structures that data in its own special way (including a unique convention for customer IDs). So when it’s time to bring all those data puddles together — for analysis, modelling, segmenting, whatever — it’s an unholy mess.

The number one imperative for marketing ops

This is the single most important imperative for every marketing operations team: to fight against data fragmentation and find ways to unite the whole marketing stack at the data layer.

The traditional way to do that was to extract, transform, and load (ETL) everything into a marketing data warehouse. And that still works for lots of use cases.

But data warehouses were invented way before things like cloud apps and social media feeds. So instead of being optimized for the kinds of data that marketers deal with every day (high-volume, unstructured, multi-sourced), they're optimized for fewer, well-structured sources of data.

Because of this, marketing operations teams often find warehouses expensive, hard to work with, and a nightmare to change (to add new data sources, for instance).

Just when we thought we’d escaped the clutches of IT, the data warehouse forced us back into the IT queue to do even simple things like run new reports. And skyrocketing data volumes cause costs to skyrocket too.

Enter the marketing data lake

Here’s where the data lake comes in. Specifically, the marketing data lake.

A data lake is a data store, just like a warehouse, but with some important differences:

                  DATA WAREHOUSE                               DATA LAKE
Data types        Structured                                   Unstructured or multi-structured
Database schema   Schema-on-write                              Schema-on-read
Cost              Expensive storage                            Low-cost storage
Ideal for         Penny-perfect, super-secure                  Agile marketing analytics
                  financial reporting                          and decision-making
Agility           Difficult to add new reports and queries     Easy to add new reports and queries

Source: The Marketing Data Lake by Franz Aman and Anish Jariwala

Data lakes have no problem with unstructured or multi-structured data. That’s what they’re for. Just dump in any source and worry about it later. They run on low-cost storage (usually in the cloud, like in AWS) so you can keep more data.

The big news about data lakes, though, is that line about database schema — the way a database structures its data. A data warehouse is “schema-on-write,” meaning you need to know how you want to structure your data before you build and load the warehouse. It’s a fundamental decision that you need to make right up front. Changing it sucks.

A data lake is “schema-on-read,” which means you worry about structuring the data only when you’re making a query, running a report or building some kind of new dashboard or use case.

That small-sounding difference is actually huge for marketers. Because it means we can capture more data from more sources — even before we know how we might use that data.
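To make the schema-on-write vs. schema-on-read distinction concrete, here is a minimal Python sketch. The field names, record shapes, and in-memory lists standing in for a warehouse and a lake are illustrative assumptions, not any particular product's API:

```python
import json

# Schema-on-write: structure is fixed before loading; records that don't
# match must be transformed (or rejected) up front.
WAREHOUSE_SCHEMA = ("customer_id", "channel", "spend")

def load_into_warehouse(record, table):
    # Raises KeyError if a field is missing -- the schema is enforced on write.
    table.append(tuple(record[field] for field in WAREHOUSE_SCHEMA))

# Schema-on-read: dump raw events as-is and decide on structure at query time.
def dump_into_lake(raw_event, lake):
    lake.append(json.dumps(raw_event))      # no schema enforced on write

def query_lake(lake, wanted_fields):
    """Apply a schema only when reading; tolerate missing fields."""
    for raw in lake:
        event = json.loads(raw)
        yield {f: event.get(f) for f in wanted_fields}

warehouse = []
load_into_warehouse({"customer_id": "c1", "channel": "email", "spend": 50.0}, warehouse)

lake = []
dump_into_lake({"customer_id": "c1", "channel": "email", "opens": 3}, lake)
dump_into_lake({"cid": "c2", "tweet_id": "t9", "likes": 12}, lake)  # new shape, still accepted

rows = list(query_lake(lake, ["customer_id", "likes"]))
# rows[0] -> {'customer_id': 'c1', 'likes': None}
# rows[1] -> {'customer_id': None, 'likes': 12}
```

The second event would have been rejected by the warehouse loader; the lake accepts it and lets a later query decide which fields matter.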

Garbage In, Garbage Out (GIGO)

Of course, it still pays to put clean, high-quality data into your data lake (or you’ll spend 80% of your time on housekeeping later instead of actually putting the data to work). The need for good data governance doesn’t go away with data lakes. If anything, it’s even more important.

So it helps to standardize important things like campaign codes and consistent definitions of things like “customer,” “sales region,” and “product” before you go too far. But with basic governance in place, a marketing data lake will do all kinds of good things for your marketing operations:

  • Make it easier to combine data sources for easy analysis — like syncing your web analytics with your marketing automation or CRM or email performance data or…
  • Give you the foundation for that elusive “single view of the customer” – the holy grail of marketing.
  • Stay agile — spinning up new use cases and adding new data sources quickly and easily.
  • Make your data more reliable — so your models are better and your predictive analytics actually predict.
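As a sketch of what that "single view of the customer" can look like once governance has standardized a shared customer key, here is a minimal merge across sources (source names, fields, and values are invented for illustration):

```python
# Each source keeps its own shape; governance only standardizes the join key.
web_analytics = [{"customer_id": "c1", "page_views": 14}]
crm = [{"customer_id": "c1", "lifetime_value": 480.0},
       {"customer_id": "c2", "lifetime_value": 120.0}]
email = [{"customer_id": "c2", "opens": 7}]

def single_customer_view(*sources):
    """Merge every source's fields into one record per customer."""
    view = {}
    for source in sources:
        for record in source:
            cid = record["customer_id"]
            view.setdefault(cid, {}).update(
                {k: v for k, v in record.items() if k != "customer_id"})
    return view

view = single_customer_view(web_analytics, crm, email)
# view["c1"] -> {'page_views': 14, 'lifetime_value': 480.0}
```

Without the agreed-upon key, the same merge degenerates into the fuzzy-matching housekeeping described above.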

Bottom line? A marketing data lake lets you really take control of your own marketing data instead of leaving it held hostage inside all the apps you use.

Make nice with IT

That crack about “escaping the clutches of IT” — I was just kidding. The new world of data-driven marketing operations demands a close relationship with the folks over in IT.

Even if you knew how, would you really want to spend your time spinning up a Hadoop cluster on AWS, building your data lake, and integrating all the different data sources you’ll be working with?

This is the stuff modern IT folks are great at. And, as long as you work closely with them, they’ll get you there much, much faster and in far better shape.

So go make friends with IT. Talk to them about a marketing data lake. And become the master of your own marketing data.

Thanks, Doug!

A brief introduction to network science math in marketing
Mon, 16 Jan 2017
Network Science in Marketing

The following article is a guest post by Tyler Foxworthy, chief scientist at DemandJump. I loved how this article connects several fields of math to marketing challenges that are being addressed with artificial intelligence. I’ve added several links to Wikipedia pages, if you want to geek out on this like I have.

The field of network science, which is rooted in statistical physics and graph theory, has emerged in recent years to help us understand how complex networks such as the internet form and evolve over time.

Social networks, the financial system, airlines, communication systems, the Internet — these complex networks are the substrate and substance of modern life.

Networks such as the internet form through a process referred to as preferential attachment, whereby the competitive dynamics of the free market shape the network's topology the way a glacier cuts a valley through a mountain range.

Preferential attachment is often referred to as "the rich get richer" phenomenon — i.e., the more popular a node in a network (e.g., a domain, website, or social influencer), the more likely it is that a newcomer to the network will link to it.

Over time, preferential attachment results in clustering among related web pages in a way that makes modern search engines possible. Data-driven marketers recognize these clusters as competitive ecosystems. Arguably, Google succeeded where earlier search engines failed because of its innovative understanding of the topology of scale-free networks.
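Preferential attachment is easy to simulate. The following sketch is in the spirit of the Barabási–Albert model (a generic illustration, not the author's own model): each newcomer links to an existing node with probability proportional to that node's current degree.

```python
import random

random.seed(7)

def preferential_attachment(n_nodes):
    """Grow a network where newcomers favor already-popular nodes."""
    degrees = {0: 1, 1: 1}        # start with two linked nodes
    # Each link contributes both endpoints to this list, so sampling it
    # uniformly is exactly sampling nodes in proportion to their degree.
    endpoints = [0, 1]
    for new in range(2, n_nodes):
        target = random.choice(endpoints)
        degrees[new] = 1
        degrees[target] += 1
        endpoints.extend([new, target])
    return degrees

degrees = preferential_attachment(2000)
hub = max(degrees, key=degrees.get)
# "The rich get richer": a few early nodes end up as heavily linked hubs,
# while most late arrivals keep a degree of 1.
```

Run repeatedly, the degree distribution develops the heavy tail characteristic of scale-free networks: a handful of hubs and a long tail of barely connected nodes.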

The proliferation of marketing data and rapid advances in hardware allow marketers to map and measure the dynamics of their market ecosystems. The methods of network science allow them to comprehend it.

A holistic, network-level understanding of a marketing ecosystem provides countless benefits for marketers and, in fact, will permanently alter the course of marketing. It represents a critical opportunity for marketers to shift away from last-click attribution and other simplistic models. Today, marketers can identify which nodes (domains, websites, social influencers) are affecting traffic and revenue — multiple steps before customers reach their own domains or their competitors'.

Graph theoretic methods enable savvy marketers to not only construct highly accurate maps of their market, but to track competitors, identify exactly where wins and losses occur, and predict where the greatest revenue opportunities can be captured.

It’s the difference between the first cartographers trying to map the geography of the Earth and the 2,000-plus satellites that orbit our planet today. Having a holistic, network-level view allows us to understand our surroundings with far greater precision.

Holistic Networks in Marketing

When coupled with artificial intelligence techniques such as reinforcement learning, these methods lead to strategies that optimize budget allocation and, ultimately, drive revenue.

Harnessing the Power of a Network: Propagation Value

Intuitively, marketers know that connections in a competitive ecosystem are not static. Instead, they fluctuate in response to seasonality, trends, and the marketing effects of one’s competitors, among other factors.

Although significant insights can be obtained from a static connectionist view of the network, these insights become significantly more predictive if we are able to understand how connectivity patterns within competitive ecosystems persist into the future.

Using advanced mathematical techniques (including Markov Chain theory and algebraic topology), it is possible to determine both the direct traffic contributions of a particular web domain and its indirect traffic driving potential within the network the marketer or brand is trying to dominate.

We refer to this indirect (or “pass-through”) traffic as the propagation value of a source. Differences in connectivity and temporal traffic patterns between domains can lead to drastically varying propagation values. This means simply that while two sources might drive the same direct volume of traffic to a website, one of the sources often drives traffic to additional sites in the ecosystem that are of more value to a specific marketer or brand.
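One way to make propagation value concrete with elementary Markov chain machinery — a sketch with made-up transition probabilities, not the author's actual method — is to model click-through between sites as a transition matrix and sum a source's expected multi-hop visits to the target domain:

```python
# Sites: 0 = blog A, 1 = review site B, 2 = our brand's domain.
# P[i][j] = probability a visitor on site i moves next to site j
# (rows may sum to < 1: the remainder is the visitor leaving the ecosystem).
P = [
    [0.0, 0.5, 0.1],   # blog A sends half its readers to review site B
    [0.0, 0.0, 0.6],   # review site B converts heavily to our domain
    [0.0, 0.0, 0.0],
]

def mat_vec(P, v):
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

def propagation_value(P, target, steps=50):
    """Expected eventual visits to `target` per visitor starting on each site,
    counting indirect (multi-hop) paths, not just direct clicks."""
    reach = [1.0 if i == target else 0.0 for i in range(len(P))]
    total = [0.0] * len(P)
    for _ in range(steps):
        reach = mat_vec(P, reach)   # probability of being on target after one more hop
        total = [t + r for t, r in zip(total, reach)]
    return total

value = propagation_value(P, target=2)
# Blog A's direct click-through to us is only 0.1, but its propagation
# value is 0.1 + 0.5 * 0.6 = 0.4 -- it also feeds us through review site B.
```

This is the distinction the paragraph above draws: blog A and a hypothetical source with a flat 0.1 click-through drive the same direct traffic, yet blog A is worth far more because of where else it sends visitors.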

The best marketers will seek to understand the direct traffic contribution of each source — as well as the indirect propagation value of the same sources — to uncover the greatest sources of traffic and revenue generation.

The marketer that has visibility to the most powerful direct and indirect sources can anticipate competitive shifts, seize share, and win at the traffic and revenue game.

By prioritizing opportunities in the ecosystem with higher propagation value, marketers will see greater gains in traffic across multiple sites than they would from selections motivated solely by direct or last-click traffic.

Optimizing Budget Allocation with Artificial Intelligence

The proliferation of marketing channels has resulted in an abundance of media and marketing investment opportunities, along with a glut of data. When coupled with the continuous shift in consumer preferences, trends, seasons, and other variables within an ecosystem, it has become impossible for marketers without sophisticated AI tools to detect, predict, or respond in real time to opportunities or competitive threats.

In marketing, impactful AI implementation comes down to one goal — real-time, actionable results.

While AI is cast in the media as a singular, ominously powerful technology, the reality is that there are many varieties of machine learning algorithms, and each is suited for specific tasks.

In fact, the art of AI is rarely in building the algorithms themselves. Rather, it is in the ability to define problems in such a way that the algorithm can identify patterns and “learn” the best solution, given the data and computational resources available.

In much the same way that self-driving cars have internal models for identifying lanes and other vehicles, temporal network models are able to form a context from which to define the majority of marketing problems and optimize predicted outcomes across a wide array of possible marketing actions.

While marketers can certainly benefit from other well-known varieties of artificial intelligence — such as those devoted to natural language processing or image classification — the future of AI in marketing will center on dynamic optimization methods such as reinforcement learning, which learn equally well from historical data and from real-time performance.

Reinforcement learning (RL) is a modern artificial intelligence technique that can address problems intractable for traditional optimization methods. RL continuously adapts itself based on new information, making it ideal for optimization problems embedded within dynamic networks, such as budget allocation and display targeting.
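As a toy illustration of that idea, the simplest RL setting — a multi-armed bandit — can stand in for channel budget allocation. The channel names and per-dollar returns below are invented; an epsilon-greedy rule shifts spend toward whichever channel currently looks best while still exploring the others:

```python
import random

random.seed(42)

channels = ["search", "social", "display"]
true_return = {"search": 1.3, "social": 1.1, "display": 0.8}   # invented ROI per dollar

spend = {c: 0.0 for c in channels}
revenue = {c: 0.0 for c in channels}

def observed_roi(c):
    return revenue[c] / spend[c] if spend[c] else 0.0

def allocate_dollar(epsilon=0.1):
    """Epsilon-greedy: usually exploit the best-looking channel, sometimes explore."""
    if random.random() < epsilon:
        choice = random.choice(channels)           # explore
    else:
        choice = max(channels, key=observed_roi)   # exploit current best estimate
    spend[choice] += 1.0
    # Noisy return models day-to-day fluctuation in real performance data.
    revenue[choice] += random.gauss(true_return[choice], 0.2)

for _ in range(5000):
    allocate_dollar()

best = max(channels, key=lambda c: spend[c])
# Spend should concentrate on the channel with the highest true return.
```

Real budget-allocation RL must also handle the moving-target environment described here — shifting returns, competitors, seasonality — which is what separates production systems from this static sketch.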

Moving Forward

Marketers have much to look forward to from AI in the years to come. Graph theory and algebraic topology are not artificial intelligence; they are the foundation upon which intelligent systems for online markets should be constructed. The environment is a moving target, and multiple competitors, shifting trends, multiple scales, and incomplete data sets make these problems especially difficult.

For the next several years, we will likely see more co-pilot-type systems, rather than autopilot (“set it and forget it”) systems. Gradually, as scientists learn to build better models for capturing market dynamics, these systems will take on more of the mechanical load of marketers, leaving them time to focus on the more strategic and creative aspects of their jobs.

Thanks, Tyler!

A quick tour of measurement for video campaigns
Tue, 22 Nov 2016
The following is a guest post by Steven Wastie, the CMO of Origami Logic. His epic Periodic Table of Marketing Signals guest post was one of our most popular ones this year.

Video is an effective mechanism for amplifying a brand message and engaging with consumers in ways that “connect” with them. In a recent analysis we did on Rio Olympic campaigns, the brands with popular videos/commercials drove the most amplification across social platforms. However, according to a 2016 IAB Video Upfronts study on advertisers and agencies, 42% of digital marketers have trouble measuring and proving the ROI of their video campaigns.

Why is that the case? Well, there are many reasons why measuring the performance of video campaigns is not easy. A couple of the primary reasons are:

  • Many different platforms are being used today to publish videos. Brands not only publish videos on traditional platforms like YouTube but now post them natively on Facebook, Instagram, Snapchat, Twitter, and others. Measuring the effectiveness of a video campaign that spans a variety of publishers, channels, and devices is not easy.
  • The measurement and execution of digital videos are tightly dependent on each other. The goals of a video campaign and the metrics that need to be measured have a significant impact on the video format that should be used. This is not the case with most other digital mediums.

So what do marketers need to do to run an effective video campaign? Here are some tips:

  1. Establish campaign goals, metrics and benchmarks early. Like other mediums, video campaigns can have different goals — awareness, engagement, conversions, advocacy. While deploying video ads across multiple channels at scale to drive brand awareness is the most effective approach to digital video campaigns, video can also successfully engage customers, move them along a consideration path, and even drive immediate conversions. Regardless of the campaign goal, it is important to measure the right metrics. Here are some of the key metrics for different goals (note that there is some overlap in the metrics used for each goal):
    Video Metrics by Goal

  2. Closely review how metrics are defined by publishers, especially in newer social channels. One of the downsides of having so many different video publishing platforms to choose from is that platforms are inconsistent in how they define their metrics. In fact, many platform providers’ definitions are not even consistent with the industry definitions set by organizations like the Interactive Advertising Bureau or the Media Rating Council.

    A great example of this is how an impression is defined:

    • Interactive Advertising Bureau — “measurement of responses from a Web server to a page request from the user browser, which is filtered from robotic activity and error codes, and is recorded at a point as close as possible to opportunity to see the page by the user” — this definition is more aligned with traditional display.
    • Facebook – “each time an ad can be viewed when it enters a person’s screen on the Facebook Audience Network.”
    • DoubleClick Campaign Manager — “does not log an impression until the video has buffered enough to start playing.”
    • Twitter – video_total_views, which are views “at least 50% in-view for 2 seconds.” This is the most in line with the MRC (Media Rating Council) standard.

    It is important, therefore, to understand how each platform defines key terms. We all take the “impression” for granted, since it is the most ubiquitous term in digital advertising, yet it is the basis for how we pay for media and measure reach. Based on these publisher metric definitions, Twitter’s video impressions seem to be the most meaningful, since viewability is factored in. Compare video metrics and what they mean across different publishers in our Video Metrics Index.

  3. Select the video ad format that tracks the metrics you care about. Once you know your goals and the metrics you want to track for a campaign, then you need to choose an ad format. Different ad delivery tags are used to deliver different ad formats. Picking the right delivery tags is important because ad delivery tags determine which metrics can be tracked. Also make sure that publishers being considered support the appropriate ad formats/ad delivery tags and can report against them.

    Below are details on ad delivery tags and what they support:
    Video Campaign Delivery Tags
  4. Pay particular attention to viewability and fraud. It is important to be proactive in making sure that your video ads are viewed by humans and not fraudulent bots. Here are some steps that should be taken:

    • Implement third-party viewability tracking tags and measure viewability with MRC (Media Rating Council) accredited metrics.
    • Work with inventory partners to understand what 3rd party viewability vendors they support. Also, handshake on viewability rate expectations before launching a campaign.
    • If you are buying from a publisher or an ad network, ask them how they prevent fraudulent traffic on their network.
  5. Establish an internal process to activate against your video metrics and campaign insights.

    Providing team-wide visibility into important metrics like video completion rate (VCR) will bring digital planning and creative production teams together to ensure that creatives are resonating with target audience segments. Planning teams can also use detailed metrics to course-correct and optimize or reallocate budgets across publishers to make sure marketing dollars are being spent wisely.

    Included is a suggested internal process for planning and optimizing video ad campaigns:

    Video Metrics Cycle
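Because publishers count an “impression” under such different conditions (see the definitions above), cross-publisher reporting benefits from an explicit normalization step. A sketch, with illustrative thresholds and event fields rather than any platform’s real API:

```python
def mrc_viewable(events):
    """Re-count raw ad events against one common, MRC-style standard:
    at least 50% of the ad's pixels in view for at least 2 seconds."""
    return sum(1 for e in events
               if e["pct_in_view"] >= 50 and e["seconds_in_view"] >= 2.0)

# Hypothetical raw events, already pulled from each publisher's reporting.
raw_events = [
    {"publisher": "facebook", "pct_in_view": 100, "seconds_in_view": 0.4},
    {"publisher": "facebook", "pct_in_view": 80,  "seconds_in_view": 3.1},
    {"publisher": "twitter",  "pct_in_view": 60,  "seconds_in_view": 2.5},
]

# Under "entered the screen" rules, all three events could be billed as
# impressions; under the common standard, only 2 of the 3 qualify.
comparable = mrc_viewable(raw_events)
```

Re-stating every publisher’s numbers against one shared definition is what makes side-by-side comparison, and budget reallocation across publishers, defensible.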

Measuring the performance of video marketing campaigns is not easy, but it is worth the effort. A key benefit of digital video is that it offers more trackable user-engagement metrics, like video completion rate, than other types of digital ads. This helps marketers gain a better understanding of how individual consumers are reacting to their brand message.

Ultimately, by measuring the right metrics and having performance benchmarks that are based on the goal of a campaign, marketers can make in-flight adjustments by updating the creatives or reallocating budget among video ad publishers. In doing so, budgets and ROI can be maximized.

Analytics CEO makes a passionate case against marketing attribution
Sun, 30 Oct 2016
Marketing Attribution Myth

The following is a guest post by Sergio Maldonado, the founder and CEO of Sweetspot Intelligence. The above cartoon by Tom Fishburne was not part of the original article.

I have seen the Emperor walking naked for too long, and I wish I could be that naive kid in the crowd. I do not believe in marketing “attribution”. Beyond the combined power of algorithms, data, software and professional know-how, the concept is — at its foundation — flawed.

Here’s an attempt at explaining my standpoint, although it’s worth noting that more scientific avenues have already been explored with similar conclusions.

1. A BLACK HOLE

It all started with a beautiful idea. Cross-channel attribution (or “multi-touch attribution”) became a popular concept at the time when web analytics had just completed its journey from IT to the marketing department (circa 2008).

What was it about? Essentially, our ability to assign a specific value to each touchpoint or event — often a paid ad view or click — contributing to a final business outcome or conversion. This would allow us to stop giving undeserved credit to the last or first campaign or touchpoint logged in the chosen system of record.
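For illustration, the difference between last-touch credit and one simple multi-touch alternative can be sketched in a few lines; the journey and conversion value below are invented examples:

```python
# Illustrative sketch of the credit-assignment idea behind attribution.
# The journey and conversion value are made-up examples.
journey = ["display_ad", "paid_search", "email", "direct_visit"]
conversion_value = 100.0

def last_touch(journey, value):
    """All credit goes to the final touchpoint before conversion."""
    return {tp: (value if i == len(journey) - 1 else 0.0)
            for i, tp in enumerate(journey)}

def linear(journey, value):
    """Equal credit to every touchpoint: one simple multi-touch model."""
    share = value / len(journey)
    return {tp: share for tp in journey}
```

Last-touch hands all $100 of credit to the direct visit; the linear model spreads $25 across each of the four touchpoints. Multi-touch attribution vendors replace the equal split with weighted or algorithmic splits, but the underlying accounting is the same.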

Increasingly sophisticated techniques for integrating owned, paid, and earned media touchpoints within first-party analytics environments have subsequently expanded our capabilities, eventually spinning off a software category of its own.

What is not to like about the whole concept? It embodies everything marketers ever wanted to get out of data!

The cross-departmental synergies it creates cannot be ignored either, starting with a translation of marketing outcomes/touchpoints/reach into dollars — a long-dreamed-of bridge between the CMO and CFO camps — and continuing with an engineer-friendly understanding of the marketing process, which is music to the ears of CIOs and marketing technologists. Plus, CMOs can now be held accountable, making CEOs happy. In other words, killer fuel for company politics.

But there is of course an ultimate promise: a mastery of the formula results in consistently getting more money out of it than the amount originally invested. Which is truly unbeatable.

Why question it, then?

Well, attribution — even when solely focused on digital channels — comes with a very tall order of prerequisites:

  1. It requires linearity. There is no single, common timeline sheltering all possible initiatives in the vast realm of marketing. Even less so one in which such investments and our desired business outcomes coexist. A fictional “snapshot” is required when cause and effect reside in parallel realities — one that is defined by our even more arbitrary decision-making milestones.
  2. It requires causality. Causality does not happen in aggregate, but at a very granular level — more specifically, at the customer, user, fan, visitor, or lead level. This means cross-device and cross-media integration at a user level are imperative if attribution is to work. In other words, attribution requires the famous 360-degree view of a statistically significant number of potential customers. More on that separate fiction in a minute.
  3. When pertaining to humans (always the case thus far), the understanding of attribution is limited by the understanding of the human mind. Our culture, life experiences, perceptions… affect a system (human brain) which we do not entirely comprehend. Now, can the human mind ever understand its own intricacies?

Lizard Brain

Okay, this was somewhat vague and abstract — highly relevant picture of the lizard trying to understand the lizard brain aside — but the ground is now paved for a more straightforward explanation.

So here’s a second try. Attribution will not happen because:

  1. There is no common timeline. Many campaigns, channels, or media have a longer-term impact on true business outcomes than we can even measure. Are you prepared to maintain never-ending conversion funnels so as to properly take into account the long-term impact of social reach and cultural associations? Will you really be able to compute all possible conversions?
  2. There is no single customer view. People are less “digitally unique” and traceable every day. First because they choose to be. Second, because they scatter their attention and touchpoints across multiple, isolated environments. Third, because privacy standards or data protection laws prevent further integration.

Sure, I will have to explain myself much better on this latter point:

  1. Useful as it is for its primary purposes, cross-device identification is not enough to get us to a single view of even a fraction of our potential customers. And it will only get worse unless we bring supercookies back into the picture, which is highly unlikely given the next point below.
  2. We easily brush privacy compliance aside, but the EU’s upcoming General Data Protection Regulation (“GDPR”) [PDF] will draw a red line in the sand (or the Atlantic Ocean), with the many US companies participating in the Privacy Shield program most likely dragged into a much harsher reality. Furthermore, the FTC has identified the compliance of cross-device identification activities as one of its top priorities, while the Federal Communications Commission (“FCC”) has just introduced an opt-in approach in its own Broadband Consumer Privacy Rules.
  3. Social trends run counter to a single identity. Surely much more important than regulatory limitations, as these only follow social unrest. But society keeps finding much more effective and enforceable means of defending itself against lack of transparency.

“But what about new models combining deterministic information (truly integrated at granular level) with probabilistic data? Have we not overcome technical and legal constraints with smart algorithms?”

For starters, even though I proposed this myself as a solution at the time, probabilistic models are now facing the same legal challenge: the EU’s GDPR will label this non-PII data as “pseudonymous” (rather than anonymous) if it can be used for profiling purposes, and the collection or processing of such data will be subject to the very same limitations/burdens as of May 2018. And this month’s ruling on IP addresses by the European Court of Justice will ensure that the very concept of PII as a threshold for compliance becomes a thing of the past well before that.

Secondly, does it really matter that you put together the best-sounding algorithms and weight distribution alternatives when all you ponder are touchpoints within your sphere of control? The core limitations have not changed one bit, and yet we place our faith in the more sophisticated blend. Do we simply want to believe in magic?

Now, as convinced as I am that attribution does not work in itself, I can surely appreciate that attribution efforts (i.e., investments in the pursuit of such nirvana) do in fact produce positive and tangible results.

2. THE LEGITIMATE PURSUIT: FINDING ROI

Taking it from that point, if attribution is impossible, useless, even illegal (!)… then why do we spend fortunes on this mission? Look at the amount of effort that has gone into the said blending of space and time:

eMarketer estimates that over 50% of US marketers are using digital attribution models in 2016, with over 60% expecting to expand attribution to offline channels in 2017.

A recent report by eConsultancy concluded that 43% of organizations reported having a single customer view — yet only 12% claimed to have the required technology in place. As pointed out by David Raab (see his comment on Scott Brinker’s post), the companies who say they have built a complete view without the technology “are either magicians or fooling themselves.”

Perhaps more relevant are the findings of the multiple workshops and roundtables on attribution taking place across uncountable marketing events worldwide. Having attended a few of them myself over the years, I can confirm that the following summary from a recent (eConsultancy) Digital Cream gathering could just as well have come from any other:

“Marketers found that offline data was very difficult to match up with online data as there was a lack of customer identification at offline touchpoints. This meant that measuring ROI for online campaigns in the offline space was nearly impossible.”

Has anybody actually got this thing to work? (Successful readers, please do comment!)

From a very cynical point of view, were attribution achievable or had it ever been attained by anyone alive, the never-ending impact of eternal ROI would have drained every other source of income on the face of the planet. Those in possession of such a formula would drive unlimited resources towards one offering after another, making the legend of Midas pale in comparison.

All of which leads me to conclude that there is no such Holy Grail.

A few recent studies would confirm that I am not alone: Gartner’s recent Hype Cycle for Digital Marketing and Advertising (2016) saw attribution sliding down the Peak of Inflated Expectations deeper into the Trough of Disillusionment. According to Gartner, reasons for the descent range from unrealistic expectations to vendor hype. In contrast, old-fashioned, aggregate data-based marketing mix modeling was considered to remain in the Plateau of Productivity.

Where, then, is the ROI of attribution? Despite all of the above, I could never understate the beneficial side effects of this effort. To name a few:

  1. A welcome understanding of campaign naming conventions across the entire organization (and its multiple agencies).
  2. An imperative to audit the organization’s martech and adtech stack to ensure maximum interoperability, discarding legacy solutions unable to provide reliable reporting APIs or bulk access to client data.
  3. A push for brand-driven data governance and first-party measurement as opposed to agency-driven, platform-driven, or media-driven measurement.
  4. Ensuring full control over the “data layer” for normalized data collection purposes.
  5. Insights on the sequence/story that customers have chosen to put together with the myriad of digital experiences that we enable, providing basic visibility on previously hidden touchpoints.

Do any or all of these justify your investment? Perhaps, if you ponder the amount of money you could have wasted on useless campaigns that you cannot truly measure anyhow.

But hardly so if you compute the huge opportunity cost incurred. An attribution project happens at the expense of many others. And the side effects listed above can become far more agile stand-alone endeavors when purposely tackled.

(To be completely fair, some of these side-effects are already identified as the final goal in certain, definitely more sensible, approaches to attribution).

3. WHAT IS THE ALTERNATIVE?

As a summary of all of the above, cross-channel attribution serves a much higher purpose than the attainment of ROI in itself. The alignment of resources and minds towards such common purpose results in a powerful driving force. Quite the paradox: emotions driving the quest to encapsulate emotions.

Should we then find a new dream? Is there an alternative source of inspiration that is actually attainable? Something that has indeed been achieved by colleagues or competitors and in fact resulted in a tangible competitive advantage?

Equally important: does turning our backs on attribution imply a denial of the possibilities of data? Should we stop believing in its ability to determine our priorities, discover anomalies, validate hypotheses, or unveil truly useful insights?

Not at all. There simply is something very different to “attribution” as the pinnacle of data-driven marketing.

How about starting with the acceptance of this new demand-led reality, in which you cannot expect to shape or understand each customer journey, but you are finally able to obtain a single view of your own business? A single view of your brand. A single view of the experiences you provide. A “brand journey” for your customers.

Opening ourselves to this simple premise paves the ground for three more thoughts:

  1. Customer-level intelligence belongs in the space of data-powered automation/optimization/activation (data as an engine), not data-driven decision-making (data as a witness). In other words: Customer Data Platforms (CDPs) and Data Management Platforms (DMPs), not decision-support systems. As for the mentioned limitations of cross-media and cross-device identification, they will matter little when the focus is placed on the collection and storage of permission-based first-party data. That is, until it is customers who voluntarily store and share their own data at every demand-led interaction.
  2. We love to understand the inner components of every process because that is the world that many of us grew up in. But discerning the pieces is no longer needed to work with the whole. Aggregate data is not inherently inferior to granular data. And there is great power in correlations.
  3. The very nature of digital data (mostly unstructured or semi-structured) has provoked a database and data management revolution. The new models — or rather the fact that models are no longer a prerequisite for data collection — result in a peaceful coexistence between an in-depth understanding of the few (customers) and a shallow understanding of the many. This is aligned with the said new mindset of letting go of the search for never-ending causality/structure and, I believe, completely disrupts the traditional approach to “business intelligence.”

There are, in sum, enough open fronts to entertain the most hyperactive and ambitious of CMOs looking to reallocate a sizeable portion of their budgets — only this time backed by first-hand experience.

So, here are a few potential alternatives if you find yourself in the said Trough of Disillusionment, with the essential question being: what were you truly seeking in attribution?

  1. Was it all along about maximizing return on investment, understanding for “return” a very clear set of short-term measurable outcomes? Then the answer could be in marketing mix modeling (using aggregate data).
  2. Was it about making the most of the time and resources at hand? How about adopting agile marketing methodologies? (Get yourself a copy of Scott Brinker’s own Hacking Marketing book.)
  3. Were you hoping to find golden insights in the advanced analysis of multiple sources of data? Great, but why make that “data lake” project the new center of the universe when both decision-making (data-as-a-witness) and data activation (data-as-an-engine) can still fly much faster and further on their own?
  4. Were you instead looking for medium-specific optimization through behavioral, cookie-based analytics and testing? Digital analytics may then be what you need.
  5. Was it about nurturing one-to-one relationships with your customers? Do you really need ongoing, human decision-making for that? Customer data platforms (CDPs, primarily first-party data) and data management platforms (DMPs, primarily third-party data) will let you activate customer data in any content personalization or media buying platforms you may want to plug into them.
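As a sketch of alternative 1 above, marketing mix modeling at its simplest is a regression of an aggregate outcome on aggregate spend, with no user-level tracking involved. The weekly figures below are invented for illustration:

```python
# Minimal illustration of the aggregate idea behind marketing mix modeling:
# fit sales = intercept + slope * spend over weekly totals (invented data).
weekly_spend = [10, 20, 30, 40, 50]       # e.g. $k of TV spend per week
weekly_sales = [120, 135, 155, 160, 180]  # e.g. $k of sales per week

def fit_line(xs, ys):
    """Ordinary least squares for one driver: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

intercept, slope = fit_line(weekly_spend, weekly_sales)
# slope estimates incremental sales per unit of spend -- no cookies,
# cross-device identity, or touchpoint-level journeys required.
```

Real mix models add more channels, lags, and diminishing-returns curves, but the point stands: the inputs are aggregates, so none of the identity prerequisites of attribution apply.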

If, however, what you are really looking for is the said holistic view of your owned, paid, and earned media, then we are back in the realm of human-driven decisions, information delivery, and performance management. And this requires special treatment.

4. A NEW HOPE

We have discussed the growing importance of a 360-degree view of your brand and the experiences it creates in the face of the marketing revolution that a demand-led world has brought about. This purpose is fully aligned with the promise of omni-channel intelligence and not contaminated with customer-level data integration imperatives.

It is at such decision-making level, where ideas meet data, that the most crucial things happen. For the executive decision is the one step that defines the business and cannot be automated.

This concept, somehow sitting “between traditional BI reporting and advanced analytics,” with the specific challenges of digital data and marketing agility in mind, has finally been recognized as a space in its own right by Gartner’s recent Guide for Marketing Dashboards.

And even though “dashboard” is a very generic word that pretty much applies to everything these days, I believe the Gartner team has done a great job at both differentiating “pure players” from generic self-service BI or digital analytics solutions and setting all of them apart from data visualization (or visual discovery) offerings — something long overdue.

Most importantly, the authors mention a few of the things that are unique at this intersection of data/science/performance and ideas/decisions/collective intelligence:

  • Data connectivity for multiple owned, earned, paid media sources
  • Built-in marketing templates and metrics
  • Workflow features allowing the dissemination of insights

And I certainly believe that some of the promises of this new, growing space are just as exciting as those of attribution. To name a few:

  1. An aggregate summary of investments and partial outcomes. Knowing how you are doing halfway into the race should be a good indicator of eventual success. Not to mention the cost savings associated with putting an end to countless hours of manual reporting work.
  2. Anomaly detection models that take advantage of a very large amount of behavioral and media performance data. Cost savings come in this case from even more expensive analysts’ hours.
  3. “Contribution models” or “journey distribution flows”, computing aggregate reach, response, or behavioral milestones along the path of online and offline experiences offered by the brand.
  4. A combination of social media metrics with intentional and brand equity benchmarks originated in trusted third parties better positioned to gather consumer feedback than most individual organizations.

All of them cost significantly less than attribution projects and happen within weeks. But given that I am biased — Sweetspot is both focused on the above and one of the four solutions listed as “pure players” in the Gartner guide — you should not just take my word for it.

There you have it, if you must: your ROI.

I very much expect a heated debate. Please bring it on.

Thank you, Sergio. Readers: do you agree or disagree? Share your viewpoint in the comments below, or if you have a longer rebuttal for marketing attribution, let me know — I’d be happy to publish a well-argued counterpoint. A more extensive report from Sergio’s perspective with examples is available on Sweetspot’s website.

The absolutely epic Periodic Table of Marketing Signals https://chiefmartec.com/2016/08/epic-periodic-table-marketing-signals/?utm_source=rss&utm_medium=rss&utm_campaign=epic-periodic-table-marketing-signals https://chiefmartec.com/2016/08/epic-periodic-table-marketing-signals/#comments Mon, 29 Aug 2016 10:15:56 +0000 https://chiefmartec.com/?p=1669 The following is a guest post by Steven Wastie, the CMO of Origami Logic. While it naturally aligns with his company’s solution, I think his Marketing Signals Framework — and this absolutely epic Periodic Table of Marketing Signals — transcends their own product and is a powerful concept for marketing technology management. CLICK ABOVE FOR A LARGER VERSION As a modern marketer, you’re overwhelmed with information, all the time — and there’s no end in …


The post The absolutely epic Periodic Table of Marketing Signals appeared first on Chief Marketing Technologist.

Periodic Table of Marketing Signals

The following is a guest post by Steven Wastie, the CMO of Origami Logic. While it naturally aligns with his company’s solution, I think his Marketing Signals Framework — and this absolutely epic Periodic Table of Marketing Signals — transcends their own product and is a powerful concept for marketing technology management. CLICK ABOVE FOR A LARGER VERSION

As a modern marketer, you’re overwhelmed with information, all the time — and there’s no end in sight, as complexity and the pace of change show absolutely no sign of letting up.

Your job looks nothing like it did five years ago — even one year ago. Yet despite this permanent state of change, you are, more than ever, accountable for knowing exactly what’s happening across your organization’s eclectic mix of marketing activations at all times. So you’d have no problem explaining how your campaigns are performing right now, today. Right?

It’s a simple question, but for most, it’s incredibly difficult — if not impossible — to answer. Mastering measurement has always been hard and, for many, even aspirational. As measurement vectors and approaches become increasingly diverse and sophisticated, many marketers are more than a little overwhelmed.

Marketing Signals: Bridging a Critical Gap

As channels, platforms, media and devices proliferate and fragment, there are more ways for consumers and brands to interact than ever. This is causing an explosion in the number, types, and velocity of marketing interactions, making it difficult for marketers to understand which of them matter, which are relevant, and which warrant a response. With no respite in sight, what’s the solution?

It’s time to start thinking in signals.

Marketing signals make the critical connection between data and insights. They are responses to your marketing activity, such as ad clicks, social interactions, or website visits, along with derived performance indicators provided by marketing platforms such as click-through rate or cost-per-conversion. These signals provide the best picture of audience interest and intent, as well as marketing performance, across paid, earned and owned media.
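For example, the derived indicators mentioned here are simple ratios over raw response counts; the figures below are hypothetical:

```python
# Raw response counts from one (invented) paid campaign.
raw = {"impressions": 50_000, "clicks": 1_250, "conversions": 50, "spend": 2_500.0}

# Derived performance signals, as commonly defined:
ctr = raw["clicks"] / raw["impressions"]                 # click-through rate
cpc = raw["spend"] / raw["clicks"]                       # cost per click
cost_per_conversion = raw["spend"] / raw["conversions"]  # cost per conversion
```

Both layers are signals in the framework's sense: the raw counts are direct responses, while the ratios are derived indicators a platform typically computes for you.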

Right about now, you’re probably thinking, “This sounds great, but how do I actually select the right signals?” A good question, as there are countless signals clamoring for marketers’ attention. Making things more difficult, there’s no such thing as a “top signal” for the masses, or a handy list of prioritized signals to follow. By definition, any signal can be important — each unique campaign dictates the significance of the signal.

To make the proliferation of marketing signals actually work for you, you must first decide on a disciplined way to select, collect, organize, integrate, and analyze them. Only then will you be able to measure and optimize campaigns in a strategic, standardized manner.

No More One-Size-Fits-All Approach

A new, company-agnostic Marketing Signals Framework provides marketers with a “method to the madness.”

By organizing key signals across all marketing channels, this framework helps marketers understand which signals are key to their specific campaigns’ success. Selecting signals according to campaign objectives instead of predefined sets of metrics enables marketers to better align their marketing measurement towards their goals, and generate insights more efficiently (click for a larger version):

Periodic Table of Marketing Signals

The Science of Signals

The Periodic Table of Marketing Signals is a visual representation of the Marketing Signals Framework and organizes signals across marketing channels according to three specific dimensions:

  • Channel: The single channel where the marketing activity is executed. It can be digital (e.g., display, paid search, social) or offline (e.g., print, TV, radio).
  • Objective: The specific business objective of the campaign or activity. Signals are categorized by the following objectives: awareness, engagement, conversion, loyalty, and advocacy.
  • Signal Type: The approach used to quantify campaign success. Signals are organized by volume, quality, efficiency and shares.

Using this framework, marketers can easily identify the signals that matter based on the campaign objective or marketing activity, channels used for activation, and the approach used to quantify success.

Now let’s see it in action.

Imagine your company is launching a product in a new market, with a cross-channel campaign activating across TV, display, and social channels. As this is an initial entrance focused on growth, you do not expect to show ROI immediately. The primary objective is to garner as much awareness as possible, with media efficiency also an important factor — as usual, you have a limited budget. Additionally, you are interested in measuring website engagement volume to assess the increase in brand awareness.

Using the periodic table, you can browse through the red “Awareness” block, to search for volume and efficiency signals in the relevant channels: TV, DPL, and SOC.

Based on this exercise, you end up with the following set of signals to measure:

  • TV: Gross Impressions, Net Reach, GRP, TRP, Cost per Point, CPM
  • Display: Impressions, Reach, Display Time, CPM
  • Social: Impressions, Unique Impressions, CPM

Next, to find Engagement signals, you move to the “Engagement” block, colored in orange, and locate the relevant WEB volume signals:

  • Web: Visits, Visitors, Page Views, Time on Site, Actions
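The selection exercise above amounts to filtering a tagged catalog along the framework's three dimensions. The small catalog below is a hypothetical subset of the periodic table, and `select_signals` is an illustrative helper, not part of the framework itself:

```python
# A hypothetical slice of a signal catalog, tagged along the framework's
# three dimensions: channel, objective, and signal type.
CATALOG = [
    {"name": "Gross Impressions", "channel": "TV",  "objective": "awareness",  "type": "volume"},
    {"name": "CPM",               "channel": "TV",  "objective": "awareness",  "type": "efficiency"},
    {"name": "Impressions",       "channel": "DPL", "objective": "awareness",  "type": "volume"},
    {"name": "Visits",            "channel": "WEB", "objective": "engagement", "type": "volume"},
    {"name": "Shares",            "channel": "SOC", "objective": "advocacy",   "type": "shares"},
]

def select_signals(catalog, channels, objective, types):
    """Return the names of signals matching all three dimensions."""
    return [s["name"] for s in catalog
            if s["channel"] in channels
            and s["objective"] == objective
            and s["type"] in types]

# Awareness volume/efficiency signals for the TV and display channels:
awareness = select_signals(CATALOG, {"TV", "DPL"}, "awareness", {"volume", "efficiency"})
```

Against this toy catalog, the awareness query returns Gross Impressions, CPM, and Impressions; swapping in the engagement objective and the WEB channel would return the web volume signals instead.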

The periodic table represents just one portion of the entire marketing signals framework. Layers above include compound signals, such as cross-channel impressions, or total social engagement across all engagement types. These compound signals help to simplify measurement by reducing the total number of signals and can be customized based on campaign and context.

Anatomy of a Marketing Signal

Each marketing signal has a number of characteristics that describe how it behaves, where it is applied, how it breaks down, how it interacts with and affects other signals, and how it combines to form compound signals. These characteristics are key to understanding how each signal should be analyzed and interpreted.

Here are some examples:

  • Signal Properties: Understanding detailed signal properties is important for a number of reasons. Some signals are only available for a subset of media or publishers or may represent slightly different types of response depending on the media or platform source.
  • Marketing Signal Properties

  • Signal Interaction: Signals affect one another and some can be derived as a calculation of other signals. Understanding how signals interact helps greatly in analyzing marketing results and identifying the root cause of performance changes.
  • Marketing Signal Interaction

  • Signal Breakdown: Certain signals are themselves aggregates of smaller sub-signals. As the marketing landscape shifts and measurement standards change, sub-signals can gain greater adoption to become independent signals in their own right.
  • Marketing Signal Breakdown

  • Signal Aggregation: Compound signals are formed by aggregating multiple signals within or across channels. They help simplify measurement by reducing the total number of signals required, but should be used with caution as they can be misleading given the wrong context.
  • Marketing Signal Aggregation
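A minimal sketch of signal aggregation, with invented counts: a compound signal such as cross-channel impressions or total social engagement is simply a sum over its component signals:

```python
# Hypothetical per-channel impression counts for one campaign.
impressions = {"tv": 2_000_000, "display": 450_000, "social": 300_000}

# "Cross-channel impressions" aggregates across channels, trading
# per-channel detail for a single top-line number.
cross_channel_impressions = sum(impressions.values())

# A compound engagement signal aggregates across interaction types
# within one channel (again, invented counts).
social_engagements = {"likes": 1_200, "comments": 150, "shares": 90}
total_social_engagement = sum(social_engagements.values())
```

The caution in the text applies directly: each sum discards the breakdown that produced it, so a compound signal can mislead if read without its context.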

Indicators, Reimagined

It’s important to note that marketing signals are not new indicators. Rather, they include and build upon existing measures of performance.

Take ROI, for example. In many cases, ROI is an important metric to use in evaluating the success of a campaign. However, it does not paint a complete picture, as the underlying data is often inaccurate, incomplete, or out-of-date. Additionally, ROI metrics may lack the context of individual needs and campaign objectives.

Marketing signals — unlike traditional metrics or data — include multi-dimensional measures of quality and relevance to ensure they generate the best and most valuable insights. Additionally, calculations of KPIs specific to campaign objectives and categorization of performance by brand, country, product, etc., add context to the data and make results more relevant to the business. Finally, marketing signals blend science (numbers and metrics) with art (creative, copy, metadata, and strategy) to comprehensively illustrate what is working.

Simply put, marketing signals go beyond representing results. They reach further, bridging the gap between raw data and insights, allowing marketers to gain immediate and clear direction on where opportunities lie and how to further improve results.

More than the Score

Just as traditional indicators are still very relevant, so too are campaigns. Despite catchy headlines trumpeting the “death of the campaign,” campaigns will never cease to exist, as consistent messaging, over time and across channels, has a greater impact on brand awareness than “random acts of marketing” ever will.

However, in the new world of marketing signal measurement, campaigns must be continuously measured, adjusted, and optimized mid-flight. What happened last week or last month is increasingly insufficient as consumers demand immediacy and relevance. Signals provide marketers with new opportunities to engage consumers in the moment — when they are most receptive.

The marketing discipline continues to evolve and has become a complex, rapidly changing, and noisy daily battle. Cutting through this noise and measuring the signals that matter has become a strategic priority — not just for marketing, but for the business.

It’s not just about keeping score or vanity metrics anymore. It’s about finding new ways to win. Using a standardized framework to organize and analyze signals provides a way to not only understand what your score is right now, but also to distinguish “real” goals from “vanity” goals, find ways to score goals faster and better, and make sure that the goals you are scoring are relevant and understood by the business.

Thanks, Steven!

Looking for more insight on marketing technology management? Be sure to check out our upcoming MarTech Europe conference in London, November 1-2. You can use the code “MarTechInsider” (case sensitive, without the quotes) to receive an extra 10% discount.

A Rosetta Stone of marketing technology terminology https://chiefmartec.com/2016/08/rosetta-stone-marketing-technology-terminology/?utm_source=rss&utm_medium=rss&utm_campaign=rosetta-stone-marketing-technology-terminology https://chiefmartec.com/2016/08/rosetta-stone-marketing-technology-terminology/#comments Thu, 25 Aug 2016 15:27:30 +0000 https://chiefmartec.com/?p=1664 Eskimoes have a large number of words for “snow” — or so goes a cliché meme that’s been around for decades. Exactly how different those words are, compared to the languages of any other culture in the world, is a matter of debate. But it’s a remark that almost always elicits a chuckle, since one pictures Eskimoes having a lot of snow in their lives (global warming notwithstanding). So it seems natural that they would …


The post A Rosetta Stone of marketing technology terminology appeared first on Chief Marketing Technologist.

The Marketing Technology Rosetta Stone

Eskimos have a large number of words for “snow” — or so goes a cliché meme that’s been around for decades. Exactly how different those words are, compared to the languages of any other culture in the world, is a matter of debate.

But it’s a remark that almost always elicits a chuckle, since one pictures Eskimos having a lot of snow in their lives (global warming notwithstanding). So it seems natural that they would acquire a large vocabulary to describe the many different contexts and characteristics of snow.

So, how many different words do you think marketers have for key data records in CRM and marketing automation systems?

Not surprisingly, quite a few.

Also, not surprisingly, they vary from one martech vendor to another. What HubSpot calls a Contact, Marketo calls a Lead, Salesforce Pardot calls a Prospect, and now ABM-darling Engagio calls a Person. So you can really confuse the hell out of someone by asking if they’ve made contact with a lead and determined if that person is a prospect.
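That vendor-to-vendor naming difference can be captured in a small translation table. The mapping below covers only the four examples named in the post, and `vendor_term` is a hypothetical helper:

```python
# The vendor terms mentioned above, all naming the same canonical
# "person record" in a CRM or marketing automation system.
CANONICAL_PERSON = {
    "hubspot": "Contact",
    "marketo": "Lead",
    "pardot": "Prospect",
    "engagio": "Person",
}

def vendor_term(system):
    """Translate the canonical person record into a vendor's own term."""
    return CANONICAL_PERSON[system.lower()]
```

An integration layer syncing these systems has to maintain exactly this kind of table, in both directions, for every record type it moves.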

rosetta_stone

But I didn’t fully appreciate the scope of this martech-rose-by-any-other-name phenomenon until I ran into Zak Pines, VP marketing at Bedrock Data and author of the Moneyball Marketer blog, at the #FlipMyFunnel event in Boston a couple of weeks ago.

Zak showed me a preliminary version of the epic infographic at the bottom of this post that attempts to align the different names for key records across 67 marketing technology vendors.

It’s like a Rosetta Stone of martech terms!

I love stuff like this, because it illuminates the challenges we’re wrestling with in marketing technology management — and nomenclature is no small one — but also helps improve the situation by revealing some measure of order below the chaos.

I asked Zak to share a few thoughts on this:

1. For context, what does Bedrock Data do and what’s your role at the company?

Bedrock Data automates cloud integrations. Our customers use our product to connect and synchronize multiple cloud systems in real-time. It’s a key technology for getting customers to a single view of the customer and helping them become more agile with their processes and reporting.

My role is to head up marketing, and creating content of value to our prospects and customers is an important part of our marketing.

2. It’s pretty self-explanatory, but can you describe this infographic and how it came about?

I have the good fortune of sitting right next to the sales team, on the phone with customers every single day. The conversations always center around which systems a customer is looking to connect — sometimes three, four, even five systems.

It was clear from these customers that every system has its own unique nuances — what types of data it centers around, how that software vendor chose to organize its system, and what it calls things.

3. What are some of the most interesting takeaways from this graphic for you?

First off, it reinforced some of the lessons of your marketing technology supergraphic — that there are many, many cloud systems out there, and that data is often dispersed across those systems.

It’s also a good visual to demonstrate why it’s so hard for companies to get to a single view of the customer. With so many systems, many of which are performing departmental or specific sub-departmental functions, and each with their own unique terminology, you get a sense of the challenge.

The other takeaway, which I mentioned in the article published alongside it, is that although vendors may want it to be the case, these systems don’t map to the customer journey in a linear fashion. If you plot the customer journey to these systems, it will bounce all over. So when thinking about aligning this data, it’s not about moving data from point A to point B; it’s about bringing all this data together for a single, consistent view of the customer.

4. Even though the labels vary from one system to another, how similar are the underlying record structures? How big of a problem is this at the next level down, the individual fields associated with things like contacts, companies, and opportunities?

At a basic level, there’s similar data being managed that people care about. People have names and email addresses and phone numbers. Opportunities have a dollar amount, a status, and a close date.

But you’ll find that each software company has solved it in its own unique way. It was probably driven at the inception of that product by a product manager or an engineer, and the nomenclature for that system has stuck.

5. Do you think this genie can ever be put back in the bottle to define one common data language for modern marketing?

I think that’s very ambitious for the industry, Scott.

I think step one, and probably the most important step, is for individual companies to have standard terminology and definitions that they are using across their business.

The different system terminology adds to the confusion, but I’ve seen many companies where not only are the systems using different nomenclature, but the teams are too in their day-to-day interactions. You’ll hear questions like, “What’s a lead? What’s a contact? What’s a campaign? What’s a program?”

Tying back to the infographic, the line across the top is the Bedrock Data term that we use as a standard. So when we are working with a customer and connecting multiple systems, data is getting normalized across all of the connected systems using a common Bedrock term.
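The normalization Zak describes can be pictured as a simple lookup table. The sketch below is purely illustrative: the vendor labels come from the examples earlier in the post, but the mapping structure, the chosen common term, and the function name are my own assumptions, not Bedrock Data's actual schema.

```python
# Illustrative only: vendor labels are from the article; the mapping
# structure and common term are assumed, not Bedrock Data's product.
VENDOR_TERM_MAP = {
    "hubspot": {"Contact": "Contact"},
    "marketo": {"Lead": "Contact"},
    "pardot":  {"Prospect": "Contact"},
    "engagio": {"Person": "Contact"},
}

def normalize_record_type(vendor: str, record_type: str) -> str:
    """Translate a vendor-specific record label into the common term."""
    try:
        return VENDOR_TERM_MAP[vendor.lower()][record_type]
    except KeyError:
        raise ValueError(f"no mapping for {record_type!r} in {vendor!r}")

print(normalize_record_type("marketo", "Lead"))    # Contact
print(normalize_record_type("pardot", "Prospect")) # Contact
```

In practice a sync product would also have to normalize field names and values, but the core idea (one common term per record type, applied at the point of connection) is the same.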

Matrix of Marketing Technology Terminology

Thanks to Zak and the team at Bedrock Data for pulling this together!

And if you’re looking to make more sense of marketing technology management, be sure to check out our upcoming MarTech Europe conference in London, November 1-2. You can use the code “MarTechInsider” (case sensitive, without the quotes) to receive an extra 10% discount.

]]>
https://chiefmartec.com/2016/08/rosetta-stone-marketing-technology-terminology/feed/ 1
Storytelling and data: when beautiful metrics can’t beat words https://chiefmartec.com/2016/04/storytelling-data-beautiful-metrics-cant-beat-words/?utm_source=rss&utm_medium=rss&utm_campaign=storytelling-data-beautiful-metrics-cant-beat-words https://chiefmartec.com/2016/04/storytelling-data-beautiful-metrics-cant-beat-words/#comments Wed, 13 Apr 2016 13:34:04 +0000 https://chiefmartec.com/?p=1543 The following is a guest post by Sergio Maldonado, founder and CEO of Sweetspot Intelligence. Family storytelling die-hards may have heard of Bill Gordh, an award-winning, banjo-playing storyteller that has performed with the New York Philharmonic as well as at the White House Easter Egg Roll. He has also traveled extensively around the world. I had the opportunity to attend one of Bill’s storytelling workshops a few weeks ago. Although he did not go into …

The post Storytelling and data: when beautiful metrics can’t beat words appeared first on Chief Marketing Technologist.

]]>
Storytelling and Data

The following is a guest post by Sergio Maldonado, founder and CEO of Sweetspot Intelligence.

Family storytelling die-hards may have heard of Bill Gordh, an award-winning, banjo-playing storyteller who has performed with the New York Philharmonic as well as at the White House Easter Egg Roll. He has also traveled extensively around the world.

I had the opportunity to attend one of Bill’s storytelling workshops a few weeks ago. Although he did not go into dissecting the specific tricks of his art, a few things caught my attention:

  • He turned a very basic script into an exciting adventure, peppering a rhythmic journey with various amusing stopovers that guaranteed engagement. Then he rounded it off by making a dramatic scene out of a simple ending.
  • He ensured consistency through repeated structural elements across his story’s timeline. This allowed children to easily follow and participate in a few guessing games.
  • He added images to reinforce key episodes in his story.

When the workshop was over (and we proudly walked away with our own visual storyboard), I started thinking about all the misconceptions that now exist around storytelling in the marketing technology space.

Are we really using storytelling techniques in our regular exchange of data-driven insights? Can data visualization amount to storytelling, in itself? What are the missing pieces, if not?

Here’s a quick analysis of the very concept of storytelling as it applies to marketing and business in general, followed by a few thoughts on how to make the most of it in a data-rich environment.

Storytelling, deconstructed

Combining multiple alternative definitions into one, I would put this forward: Storytelling is the act of communicating ideas through the use of narrative elements with the purpose of educating, entertaining or influencing others.

A few key elements, therefore:

  • Communicating: With the yet unanswered question as to whether human language has to be present at all, in oral or any other form. More on this in a few paragraphs.
  • Ideas and purpose: These two come closely intertwined. Although a sequence of events may amount to a story, it is the underlying ideas that allow us to achieve our purpose in telling it.
  • Narrative elements: Ideas alone are not stories. They must be tied to events or facts, which are put into narrative format to ensure emotional impact and future recollection.

An easy way to see these elements evolving and joining forces is to look back at the history of storytelling, as many of the ancient cultures we have come across left evidence of its usage as a means to transfer knowledge, explain the unknown or project authority.

The documentation of storytelling in our modern, business context can be traced back to Dale Carnegie’s theories on public speaking. Mr. Carnegie argued that stories are key to making ideas clear, interesting and persuasive. He also covered various techniques to succeed with real-life illustrations, including the focus on details and the use of words that allowed the audience to visualize “pictures” (such as proper nouns or figures, as opposed to abstract concepts).

This can easily be connected with the thoughts of Stanford University’s Jennifer Aaker in our current times: we extract meaning from the personal connection we make with stories, and this connection brings about an emotional side of decision-making that, we now know, precedes logic.

Storytelling techniques have abounded in multiple areas of business management. From David Ogilvy’s ads to Steve Jobs’ pitches, many well-known business leaders have successfully leveraged them. More traditional purposes have simply been replaced (disguised?) by our everyday business needs: selling, building strong teams, defining brands… or acting on data.

Together with these new purposes, the means by which storytelling can be delivered have also undergone their own evolution in the new, digital context of business. It remains to be seen, however, whether we will eventually be able to find a clear break between form and function.

Form vs. function

Revisiting the first and last points in our definition of storytelling (communication, narrative): Is human language required? Is there storytelling beyond oral expression? Do podcasts and video qualify? Where do images, symbols and metrics fit?

Let’s try to answer these questions:

a) Narrative as a basic requirement of form
Narrative, as a representation of connected events, is perhaps the most crucial form factor in storytelling. Although it does not require words (think of a comic strip without dialogue), narrative does demand a sense of progress over time: events are placed along an imaginary timeline, even though flashbacks and parallel sequences may be at play.

Arguably, elements of consistency should also be present in narrative. These elements keep events tightly connected to each other regardless of the manner in which chronology is used. Though, of course, these elements are more tied to semantics than structure and, as a result, dependent on language, our next item.

b) Human language
One of the first books I happily bought in my early twenties was a bilingual (English-Spanish) copy of Anna Livia Plurabelle, the most widely discussed chapter in James Joyce’s last book, Finnegans Wake.

I was absolutely fascinated by the mere idea of the author making up his own language through conceptual shortcuts, and even borrowing terms from five or more languages as he saw fit to best express a given thought (which makes for a rather hopeless translation). I wondered:

Could human language become akin to “object-oriented” programming? Could we start adding levels of abstraction until the evolution of our species morphed our heads to look like those of the Martians in our comic books?

Joyce’s original idea was simple, but extremely powerful: language determines the limits of our thoughts, and it is only through a richer representation of ideas/concepts that we can boost our own understanding of every experience.

Thousands of essays had been written before on the nature of language from multiple angles. While Kant considered it the ultimate representation of thought, Rousseau called it an “instinctive expression of emotions”. In more recent times, Wittgenstein has defended the theory that language shapes our experience of the world, an approach that can easily be connected with Joyce’s: the limits of your language are the limits of your world.

Our very culture and experiences are reflected in the manner in which we employ or process words. While metrics and their various representations can be largely culture-neutral (though never entirely: take colors!), words are undeniably partial.

When we hear the word “subtle” we do not expect violent turns. When the word “problem” appears in a phrase, our brains prepare for impact. So, even prior to dealing with elements of narrative, how can the choice of words not prove essential to obtaining a desired effect?

In other words (paradoxically enough), human language is not just form, but also function, when it comes to appealing to the emotions of others. Which in turn makes language itself unavoidable if storytelling is meant to have an emotional and memorable impact.

c) The language of images
If language is a combination of words and symbols (as these may simply represent such words or the letters they are made of), where do images belong? Do they not share the same roots? After all, symbols, as a representation of reality, were all we had before language even existed.

We could argue that many symbols would qualify as images as long as they remain “visual” representations of something (take hieroglyphs). But just as many symbols remain dissociated from real objects, images can easily fall short of becoming a symbol (e.g., by representing someone in merely descriptive terms).

As human beings, our ability to create images preceded written text by 32,000 years (records of the latter will “only” take us 8,000 years back), so it would make sense to accept that images played the role of written language while specific words were slowly being codified. Which in turn would separate our ability to understand the world around us from our capacity to asynchronously (not being present) communicate such understanding.

This denial of an “objective” understanding in the absence of words, coupled with a denial of images as a direct shortcut to the meaning of objects or scenes being represented, is fully consistent with our everyday experiences. Suffice it to quote digital analytics guru Avinash Kaushik in relation to using “stock photos” when presenting data insights: “Photos are very personal. We bring our biases, our life experience […]. You lose control of the story”.

But surely we cannot abandon our faith in images. After all, they are worth “a thousand words”, or so we often repeat without much questioning. Do we refer to their ability to communicate while being presented? Are we talking about our ability to remember instead? How about their potential to educate, entertain or drive action?

I was very happy to come across a study that finally dared to explore this assertion further. In Reduction and elimination of format effects on recall, Paul W. Foos and Paula Goolkasian focused solely on memory and concluded that an image was in fact worth 1.5 printed words (and little more than a single spoken word).

This is all very interesting, but we would still need to draw the line between images replacing “language” (photographs, conceptual or descriptive drawings) and images representing numbers. Do we process metrics (and their graphical representation) differently or do we transform them into words/language?

d) The language of numbers
“Number sense” is the term regularly used to describe the intuitive understanding of numbers by human beings. This entails our ability to count, but also any animal’s ability to perceive changes in the number of things in a collection.

But back to our question: Do we process numbers independently from language?

Only to a very limited extent. As some studies show, it is through language that we are able to link up our small, exact number abilities (naturally being able to count up to three) with our large, approximate number abilities (naturally being able to understand that we have “many” things). In other words, mathematics requires abstraction, and abstraction is built on language and symbolic representation (a numeric system).

This is even supported by archeological evidence that language predates numeracy. Not to mention the many illustrations of language influencing mathematical ability.

So we must accept a close relationship between human language and mathematics, but, as discussed earlier, numbers and their symbols are actually language-neutral. And insofar as they allow us to build a substitution layer that is common to multiple languages, they provide a means of record and communication that is dissociated from human language… and the empathy that comes with it.

Jennifer Aaker’s thoughts come in very handy once again. Speaking at the 2013 Future of Storytelling Summit, the social psychologist explained (and illustrated with data) that stories are memorable, impactful, and personal in a way that “statistics” are not.

e) Audio and video
Recent decades have taken recorded human language beyond text and images, to audio files and video recordings. Digitalization has made both of them widely available to anybody with as little as a mobile phone. Neither podcasts nor video files have stopped growing in popularity in the business context.

On the basis of the previously mentioned Foos and Goolkasian study, a podcast would have a stronger impact than written text, if only in terms of our ability to remember the underlying message.

Video format can take this even further, allowing us to support such recorded speech with visual elements, while avoiding the loss of control over the manner in which message and visualization are processed together. In other words, video provides a synchronous illustration of recorded speech that becomes the closest thing to oral communication. As a disadvantage, however, all supporting elements of the story would have to be embedded within the video, forming a self-contained piece instead of an effective complement to other environments where, for instance, data becomes the primary context.

Storytelling meets data

In summary, data visualizations cannot amount to storytelling. Void of language, they are unable to independently transmit ideas, clear purpose and emotional impact.

The very concept of “data storytelling” seems to me rather far-fetched as a result.

This said, as business management becomes increasingly data-driven, metrics are indeed becoming a crucial part of any story, and data visualizations do provide excellent support when they are. We could ask ourselves, however, on a case-by-case basis:

  1. Is the data shown at the heart of the ideas we aim to communicate?
  2. Is having an impact on the metrics being represented the purpose of the story?

Storytelling is often put at the service of entertainment or coverage of current events when none of these conditions (ideas and purpose, as per our prior definition) is met, with data playing the role of providing additional context or back-up information. Multiple examples of this can be found in the media. As such, supporting data visualizations will range from static point-in-time charts to open, reader-driven visual discovery widgets (providing a tool for the audience to retrieve valuable related information).

If only the first condition is met, storytelling will be at the service of information delivery (an equally legitimate scenario in data-driven management) or the exercise of influence, having an impact on other business metrics not directly related to those on display (e.g., higher-level objectives in the organization).

If, however, both conditions are met, storytelling will have been put at the service of data actionability. The impact of human language is in this case required to provoke a measurable reaction on the part of internal stakeholders in the face of a particular set of metrics. A combination of author-driven data visualizations, language and images will be at play in this case.

This (“action-driven storytelling”) could happen as a stand-alone effort or as part of a predefined process. The first can be as simple as an infographic (combining words, symbols, images and numbers in a logical arrangement that favors sequence). The second would be best understood in the context of Insight Management methodologies.

Under the original Digital Insight Management principles laid out by Eric T. Peterson in 2012 (in a paper sponsored by Sweetspot), digital analysts are provided with a means to accompany their insights with recommended actions (a “bottom-up insight delivery” process). Once acted upon (by decision-making data consumers), the impact of those actions is recorded along the timeline provided for each KPI.

This data-driven optimization workflow was soon followed by an alternative top-down approach to insight management (a “flagging” system), built on the premise that an effective distribution of metrics results in management being in the best possible position to dynamically define the priorities of the analyst’s job.

But there still was room for improvement. As businesses demand open, interoperable ecosystems, Insight Management had to become a natural part of existing enterprise collaborative and generic workflow environments. Furthermore, together with KPI updates, insights had to become ubiquitous, permeating other layers of internal communication or information delivery. And this had to happen in a way that supported author-driven narrative features.

This led to what we now call Integrated Insight Management, bringing about a new perspective of storytelling that is focused on performance, with words and narrative put at the service of data actionability.

Integrated Insight Management

Machine learning, automation and scalability

An elementary axiom supports all three insight management approaches (bottom-up, top-down, integrated): data analytics cannot happen without the intervention of data analysts or data scientists, no matter how sophisticated our tools or how clean our data.

Palantir’s Peter Thiel has supported this assertion better than anyone else in Zero to One (2014):

“We have let ourselves become enchanted by big data only because we exoticize technology. We’re impressed with small feats accomplished by computers alone, but we ignore big achievements from complementarity because the human contribution makes them less uncanny.”

But we are never free of hype and fads (the new wave of storytelling has a lot to thank them for too), and so we are now faced with the collective illusion of “self-service business intelligence,” on the basis that data consumers — business stakeholders — can themselves be empowered with unlimited data exploration and forecasting capabilities across a myriad of structured, unstructured and semi-structured data sources, regardless of how incompatible or poor the various data models or data sets involved may be.

Besides a large number of limitations (technical, performance-related and data governance-related), these attempts relegate storytelling and insight management to ad hoc reports and meetings aimed at obtaining buy-in for a given action. In other words, they deprive organizations of the progress already made in connecting the dots between performance management, data governance and insight actionability.

But it’s not all bad news when it comes to automation and machine learning: a repeated “data»insight»action»measurement” process against a common set of metrics in a well-delimited business domain will result in a valuable repository of the best potential courses of action when faced with future challenges.

And there is much more within reach today:

  • Automated generation of textual summaries of current performance against goals or past periods by combining the “what” that regular KPI updates embody with the “why” that their most closely associated dimensional breakdown represents.
  • Text analytics to predict the potential impact of a given set of words.
  • Automated reshuffle and update of pre-built infographic modules to shape stories that drive action.
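As a hedged sketch of the first item above, here is what a rule-based textual summary of a KPI might look like. The metric name, figures, and phrasing are invented for illustration; a real system would draw them from the KPI repository described earlier.

```python
def summarize_kpi(name, current, goal, previous):
    """Turn a KPI reading into a one-line textual summary."""
    vs_goal = current / goal - 1          # the "what" vs. target
    vs_prev = current / previous - 1      # the "what" vs. prior period
    direction = "up" if vs_prev >= 0 else "down"
    status = "ahead of" if vs_goal >= 0 else "behind"
    return (f"{name} is {current:,}, {direction} {abs(vs_prev):.0%} "
            f"vs. the prior period and {abs(vs_goal):.0%} {status} goal.")

print(summarize_kpi("Qualified leads", 1200, 1000, 1100))
# Qualified leads is 1,200, up 9% vs. the prior period and 20% ahead of goal.
```

Pairing a template like this with the dimensional breakdown behind each KPI is what would supply the “why” alongside the “what”.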

It is certainly time to bring some of these together with our current storytelling and insight management capabilities.

Final thoughts

The division of roles in the marketing data space (analyst-data consumer-other stakeholders) is unstoppable, no matter how powerful our algorithms and grandiloquent our defense of “self-service BI”.

This plurality results in many data analysts and even more data consumers quickly becoming “data ambassadors” and “information delivery experts” in need of communication tools that ensure the impact of metrics (and their insights) on the broader organization, starting with the unmatched power of words.

Storytelling takes these words one step beyond, provoking emotional connections that drive action. I strongly believe that insight management methodologies provide the best possible grounds today for this powerful tool to thrive.

Data integration and analysis endeavors have already taken irrational amounts of budget and time for the little real impact they have had on large organizations. I believe it is time to put a fraction of such investments into the effective delivery of metrics and data insights. And I suspect this people-focused layer holds the answer to justifying every other prior effort.

Thank you, Sergio! For those of you who would like to read more about this topic, Sweetspot Intelligence offers a white paper that combines in-depth coverage of these and other points with case studies and product screenshots, available here: The Marketer’s Path to Data-driven Storytelling and Actionable Insights.

]]>
https://chiefmartec.com/2016/04/storytelling-data-beautiful-metrics-cant-beat-words/feed/ 4
Why data-driven marketers shouldn’t trust the data fully https://chiefmartec.com/2013/11/data-driven-marketers-shouldnt-trust-data/?utm_source=rss&utm_medium=rss&utm_campaign=data-driven-marketers-shouldnt-trust-data https://chiefmartec.com/2013/11/data-driven-marketers-shouldnt-trust-data/#comments Tue, 12 Nov 2013 13:37:02 +0000 https://chiefmartec.com/?p=599 There’s a great new report out by the IBM Institute for Business Value, Analytics: A blueprint for value, that’s well worth reading. The above graphic — stating that 66% of leaders from their research are “confident” in their data and “trust” it — is excerpted from it. However, this post is not about that report (well, not really). Let me first disclaim that the context of that graph, as best as I can tell, is …

The post Why data-driven marketers shouldn’t trust the data fully appeared first on Chief Marketing Technologist.

]]>
In Data We Trust

There’s a great new report out by the IBM Institute for Business Value, Analytics: A blueprint for value, that’s well worth reading. The above graphic — stating that 66% of leaders from their research are “confident” in their data and “trust” it — is excerpted from it.

However, this post is not about that report (well, not really).

Let me first disclaim that the context of that graph, as best as I can tell, is that leaders are confident about using data in day-to-day decision-making. They trust the quality of the data, where quality is a function of timeliness and accuracy within a set of enterprise-class data standards. They have faith in the capabilities of their analytics teams and technologies.

And that’s all good.

But the choice of labels on that graph — admittedly, taken out of context — triggered an important tangent that I think is worth discussing. It’s good to be confident in embracing data-driven management, but great data-driven managers should always be skeptical of the data.

That isn’t a contradiction, even if it sounds like one.

A tale of two strategies and the data that stood between them

Let me start with a real-world example, which a director of marketing at a high-tech company recently shared with me (with some identifiable details changed for anonymity). They sell an IT infrastructure product, but one that is typically evaluated by front-line IT managers rather than the CIO. However, as part of their growth strategy, they sought the CIO’s attention to sell broader solutions.

As one way to accomplish that, the director proposed making changes and additions to their website to address CIOs more directly. Yet he got tremendous pushback from the web marketing team who, referring to their web analytics, insisted that the data showed they shouldn’t do that. The deeper and more technical their content was, the more visits and shares it received. Their data-driven conclusion was that they should create more front-line manager content, rather than content for the CIO.

Since the web marketing team was evaluated by metrics such as visits and shares — chalk one up for data-driven incentives — they were reluctant to pursue any initiatives that didn’t linearly extrapolate from their existing strategy. Note the circular nature of this argument:

  • We produce web content for front-line IT managers.
  • This gets us visits and shares from front-line IT managers.
  • Therefore, we should produce more content for front-line IT managers.

This isn’t being data-driven. It’s more like being data-enslaved.

At no point in this debate did anyone doubt the “accuracy” of the data. But that data exists in a context. The director was interested in a different context. He was asking questions in the context of CIOs, but the web team was answering with data in the context of front-line IT managers.

Looking at this from afar, it may seem obvious — there are many options for how to resolve this dispute. But strategic inflection points like this are a natural place for such disconnects to occur. In some ways, this is Clay Christensen’s classic innovator’s dilemma: what worked in the past holds us back from what needs to change in the future. Data from the previous strategic context can further calcify that prior worldview. To break out of the dilemma, you need to consider different data.

It’s not a question about whether the data is right, but whether it’s the right data.

By the way, this story has a happy ending. They set up a different site for content marketing to CIOs — an experiment to see if this marketing director’s hypothesis could be proven — which grew to be very successful. It never received as much absolute traffic as the site that served front-line IT managers — after all, there are many, many more front-line IT managers than CIOs out there. But their pipeline for CIO-level solution sales grew tremendously, as did their brand reputation at the C-suite level.

Being skeptical of the data, not skeptical of using data

Data was supposed to settle arguments, to get us out of situations where people just argued from their gut, and eventually somebody pulled rank (the HiPPO).

Ironically, however, it can have the opposite effect:

  1. People develop a belief.
  2. They find data that correlates with that belief.
  3. They use that data to cement their belief into an unassailable tower.

One need look no further than the rancor of partisan politics in the U.S., and the ways in which opposing nuggets of data are used to cudgel the other side rather than objectively seek the truth, to see this folly in action. The two problems at the heart of this dilemma are:

  1. There’s a nearly unlimited amount of data to choose from out there.
  2. There’s a nearly unlimited number of narratives one can build around that data.

Once we recognize that, we realize that data doesn’t necessarily settle arguments. Data can tell us what, but it can’t inherently tell us why. To uncover the why, we need a very specific kind of data: data from controlled experiments. Since why is often more important than what for deciding a future course of action, experimentation is a more powerful management tool than data analysis.
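To make that concrete, here is a minimal sketch (in Python, with entirely hypothetical traffic numbers) of the kind of controlled experiment described above: split visitors randomly between two treatments, then check whether the observed difference in conversion rate is larger than chance alone would plausibly produce.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment (hypothetical data)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 5,000 visitors randomly assigned to each arm.
z = two_proportion_z(conv_a=400, n_a=5000, conv_b=470, n_b=5000)
print(round(z, 2))  # ≈ 2.48
```

If |z| exceeds roughly 1.96, the difference is unlikely to be chance at the conventional 95% level. In practice you would reach for a library such as statsmodels rather than hand-rolling the test; the point is that randomized assignment, not the arithmetic, is what lets the result speak to *why*.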

This is why I believe that big testing is more important than big data.

Of course, not all of the data we deal with comes from controlled experiments. For that matter, it's very hard to run a perfectly controlled experiment in the real world. We have to deal with data that is imperfect in helping us answer why. But imperfect data can be far better than using no data at all.

The caveat of “can be” in that last sentence boils down to using data to make decisions, while simultaneously remaining skeptical of the narrative, implied or explicit, surrounding that data.

You can have confidence in using data while not fully trusting the data. It’s not that you doubt its accuracy. But you’re skeptical of data’s context, relevance, and completeness in the conclusions that are being drawn from it. (Note that skepticism is not cynicism — it’s about questioning for the purpose of progress, not questioning for the purpose of being obstinate.)

I’ve come to believe that the best way to work with data strategically is to adopt a Bayesian worldview, which has been advocated by two of my favorite marketing strategy thinkers, Gord Hotchkiss and Greg Satell.

In Bayesian strategy, it’s not about data being used conclusively, but data being used probabilistically to continuously update a mental model of the market. It’s not about certainty, but relative certainty — with an open mind to data that may alter one’s Bayesian beliefs. You proactively look for signals of change.
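As a toy illustration of that worldview (all numbers here are hypothetical), this sketch shows how a Bayesian marketer might revise confidence in a market hypothesis as each new signal arrives, rather than treating any single data point as conclusive:

```python
def bayes_update(prior, p_signal_if_true, p_signal_if_false):
    """Posterior probability of a hypothesis after observing one signal."""
    numerator = p_signal_if_true * prior
    return numerator / (numerator + p_signal_if_false * (1 - prior))

# Hypothetical: we start 60% confident that a new content strategy works.
belief = 0.60

# Each signal: (P(signal | hypothesis true), P(signal | hypothesis false)).
# The third signal cuts against the hypothesis.
signals = [(0.7, 0.4), (0.8, 0.3), (0.2, 0.5)]

for likelihood_true, likelihood_false in signals:
    belief = bayes_update(belief, likelihood_true, likelihood_false)
    print(round(belief, 3))  # 0.724, then 0.875, then 0.737
```

Note the last step: contradicting evidence lowers the belief instead of being explained away, which is exactly the "open mind to data that may alter one's Bayesian beliefs" described above.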

“Fact-based decision-making” is a term that’s growing in popularity. And while I laud the intent behind it — data over opinions — I remain cautious that “facts” are a slippery thing, especially when you go from the what to the why. Opinions can be all too easily disguised with data to look like facts.

Good fact-based decision-making reveals itself by being as open to data that contradicts the current direction as it is to data that confirms — or can be interpreted, with some creative narrative, to confirm — the status quo.

P.S. For a shorter set of heuristics for thinking about data, see my post from earlier this year on 14 rules for data-driven, not data-deluded, marketing.

The post Why data-driven marketers shouldn’t trust the data fully appeared first on Chief Marketing Technologist.

2 human strengths that are also 2 data weaknesses
https://chiefmartec.com/2013/10/2-human-strengths-also-2-data-weaknesses/
Tue, 01 Oct 2013

Two of the greatest strengths of the human mind in marketing:

  1. Our ability to detect patterns from imperfect information.
  2. Our ability to tell stories that resonate and spread.

Ironically, these are also two of our biggest weaknesses in data-driven marketing.

I’m presenting today at DemandCon on a panel about effective marketing management with data. The official title of the session is “measuring marketing impact.” Yet while I am a firm believer in the power of data in modern marketing, I am cautious of the many ways in which data can be abused.

So I may be playing the foil on today’s panel.

Most of what I’ll be discussing is covered in two posts that I wrote last month:

But I have two new images that I’ll be sharing with the audience to illustrate the points at the top of this post.

Patterns are everywhere, but dead men don’t wear plaid

First, there’s this one, which came from a set of my daughter’s pre-school flip cards:

See the pattern?

What shape comes next? It’s almost reflexive: circle.

Our brains are wired to naturally detect these inductive patterns. It’s a feature. But it’s also a bug. Because, when you think about it, that’s an incredibly small sample of data that we’re looking at. How can we conclude the answer is “circle” so swiftly and confidently?

How do we know the pattern isn’t one triangle, one circle, two triangles, one circle, three triangles, one circle, and so on? That’s certainly a possibility. And if that were the rule, a larger sample of the data would lead us to instantly conclude that the next objects would be four triangles followed by one circle, and so on.

Yet we can invent many other patterns too — an infinite number, actually.
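A tiny sketch makes this underdetermination concrete. The two hypothetical rules below both reproduce a five-shape sample perfectly, yet disagree about the very next card:

```python
# A hypothetical observed sample of five shapes from the flip cards.
PREFIX = ["triangle", "circle", "triangle", "circle", "triangle"]

def rule_alternating(i):
    """Hypothesis A: the shapes simply alternate forever."""
    return "triangle" if i % 2 == 0 else "circle"

def rule_repeating_block(i):
    """Hypothesis B: the five observed shapes repeat as a block, forever."""
    return PREFIX[i % len(PREFIX)]

# Both rules fit every shape we actually saw...
assert all(rule_alternating(i) == PREFIX[i] == rule_repeating_block(i)
           for i in range(len(PREFIX)))

# ...but they diverge on the very next card.
print(rule_alternating(5), rule_repeating_block(5))  # circle triangle
```

Five data points cannot distinguish between these hypotheses, and infinitely many other rules would fit the same prefix. Only more data, or a deliberate test, can separate them.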

We also have to consider noise. What if the sample we saw contained errors or misreadings? Someone recorded a triangle where there should have been a circle, or accidentally drew a hexagon instead of a circle. How robust would our pattern detection be to such errors? The answer: it depends. It depends on the pattern, the noise, and frankly, on whether there really is a pattern at all, or just something that coincidentally looks like one.

It’s a powerful gift we have to detect patterns in data. But we can deceive ourselves if we simply take patterns at face value. The pragmatic marketing approach is to “detect but verify.”

We can seek to triangulate the pattern with other sources of data. Or, better yet, we can run experiments to test our hypotheses. While we can never get a perfect guarantee, we can use such techniques to dramatically increase the likelihood that the pattern we’ve identified is real (and meaningful).

All data-driven marketers are storytellers

The second image is this exchange I had with Jeff Cram a few months ago on Twitter:

Data != Truth

Jeff was admonishing a big data panel for using outdated data to make an important point; more recent data actually contradicted their argument. As Robert Plant said, that’s “pumping irony.” So I made a wisecrack that data isn’t truth. Jeff quipped back, “Not until it’s retweeted.”

And that’s the problem in a nutshell.

Marketers, perhaps more than anyone, are excellent storytellers. Seth Godin has made that point eloquently. As it turns out, we’re also highly susceptible to good stories.

So we can take an anecdote of data that we hear once — perhaps from some survey that someone ran years ago, with who-knows-what kind of methodology, that produced a tantalizing soundbite that has bounced around the web with various interpretations and embellishments — and accept it as gospel. We can then weave it into our own sermon. We believe it. And with enough passion, we can persuade others to believe it.

But that doesn’t necessarily make it true.

At least not true in the present, at a different time, under different circumstances.

Don’t get me wrong. Raw data is boring as hell. To leverage data to accomplish good marketing, we have to put it in the context of a meaningful story. But the more scientifically accurate word should be “hypothesis,” not story.

As with our prone-to-misfiring pattern-matching talents, we should listen to such stories, but verify them, even our own. If it’s a story that we’re going to hang any important initiative on (or our reputation as an authority on a conference panel), it’s worth verifying that we have the latest and most relevant source.

Where feasible, the best thing we can do — again — is to run an experiment to verify that hypothesis in the native context where we expect to apply it. The power of experimentation to separate data myths from data reality is the key reason why I continually advocate for big testing over big data.

In our digital age, perhaps the biggest mistake marketing can make is to ignore the data. But the second biggest mistake may be to trust the data too much.

The post 2 human strengths that are also 2 data weaknesses appeared first on Chief Marketing Technologist.
