“We don’t market to people anymore. We market to algorithms.” And therein lies a great danger, warned Kodi Foster (senior vice president of data strategy at Viacom) at PSFK CXI 2018, Innovation in the New Consumer Experience, in New York.

Foster passionately presented a warning about AI-driven tech/media platforms that seek to change human behavior. From a martech standpoint, “We don’t market to people anymore. We’re marketing to the devices that stand between us and people. We’re trying to game the algorithms to get our marketing messages, content, whatever, in front of a human being.

“We’re not trying to understand human beings; we’re trying to understand the biases of algorithms. Whether you’re in content creation, marketing or advertising, that’s your job. Understanding human beings is NOT your job. Just accept that.”

Those laboring in the data science community to create predictive algorithms understand that the majority of their input is worthless. “Think about that when you are trying to make predictions based on algorithms, because most of the input is s@#t,” said Foster. “Even though they tell you they are trying to make you more connected, they are really working to promote something, a wildly important nuance.”

Beware Curated Promotion

Since all promotion is curated, the question becomes: who is providing the curation? Within these platform ecosystems, curation becomes a self-fulfilling prophecy, as algorithms display only the content they want you to see. When you engage, the algorithm concludes it was right and serves up more of the same. “But how does the algorithm know that was the stuff you wanted in the first place? The more you click, the more you get similar content,” added Foster.
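A minimal sketch makes the self-fulfilling loop concrete. The Python below is purely illustrative (the five topics, the weights and the click model are invented; no platform’s actual recommender works this way): the algorithm infers interest solely from clicks, so every click is read as confirmation of its own serving decision.

```python
import random

# Toy sketch of the click-feedback loop Foster describes. Everything here
# is invented for illustration: no real recommender works from a
# five-topic dictionary.
TOPICS = ["politics", "sports", "music", "science", "cooking"]

def simulate_feedback_loop(rounds=20, seed=42):
    rng = random.Random(seed)
    weights = {t: 1.0 for t in TOPICS}                   # algorithm starts neutral
    true_interests = {t: rng.random() for t in TOPICS}   # hidden from the algorithm

    for _ in range(rounds):
        # Serve whatever the algorithm currently believes the user wants most.
        shown = max(weights, key=weights.get)
        # The user clicks with probability equal to their real interest,
        # and every click is read as confirmation.
        if rng.random() < true_interests[shown]:
            weights[shown] *= 1.5
    return weights, true_interests

if __name__ == "__main__":
    weights, interests = simulate_feedback_loop()
    print("what the algorithm serves:", weights)
    print("what the user actually likes:", interests)
```

With this particular seed, the loop locks onto a single topic and never shows the one the simulated user prefers most: the algorithm’s “rightness” is manufactured by its own serving decisions.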

The problem is that this approach creates social contagion. Apply the same kind of cascade model the U.S. Centers for Disease Control and Prevention uses to forecast the spread of the flu, and what happens across social media platforms and messaging tools looks strikingly similar: information propagates the way an infection does. “This is largely what has led to the spread of the fake news problem,” declared Foster, “because we assume that everyone in our network believes the same things that we do and we, therefore, assume that our belief is bigger across the world than it really is. We believe that more people share that belief because everyone we’re around believes the same thing. That is the danger of the social echo system.”
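The epidemiological analogy maps directly onto a standard model from information-diffusion research. The sketch below is a minimal independent-cascade simulation in Python; the graph, seed node and adoption probability are invented for illustration, not anyone’s actual forecasting code.

```python
import random

# Minimal independent-cascade simulation, the same family of models used
# to forecast flu outbreaks. The graph, seed node, and adoption
# probability are invented for illustration.
def independent_cascade(neighbors, seeds, p=0.5, seed=1):
    rng = random.Random(seed)
    adopted = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for node in frontier:
            for nb in neighbors.get(node, []):
                # Each exposed neighbor adopts the belief with probability p,
                # exactly as an epidemic model propagates infection.
                if nb not in adopted and rng.random() < p:
                    adopted.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return adopted

# A small, tightly knit cluster: one seed is enough to reach everyone.
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2, 4], 4: [3]}
print(independent_cascade(graph, seeds=[0]))  # -> {0, 1, 2, 3, 4}
```

In a densely connected cluster like this one, a single seed reaches every node, which is how everyone around you can end up sharing a belief while the wider world does not.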

This problem is actually more frightening than we presently realize; witness Facebook’s recent problems with the US government. The issues are starting to come into focus, but we’re not yet fully aware of the ramifications. “This is a big deal because if you are trying to understand people and audiences and how to engage them, how can you deal with them if the data isn’t accurate?

“We believe the Mark Zuckerbergs of the world when they tell us that data is really trying to connect us, when actually they have other goals. Only recently have we begun to understand what happens as the tissues of human civilization become more and more fragmented. How do you create a predictive algorithm when there are tens of thousands of ‘realities’? Whose reality do you follow?” bemoaned Foster.

While data scientists busy themselves trying to create a theory of everything to reach Millennials or Gen Z, how can they succeed when the data pulled from social platforms is hardly the best proxy for understanding people? “Ultimately, how are we going to entertain people in meaningful and authentic ways, when we are essentially using lies to connect with them?” asked Foster.

While the data community tries to make sense of it all, one must ask whether they are actually contributing anything useful or merely taking advantage of people who don’t quite understand what they do. These issues should be of paramount importance to brand stewards, especially in light of what we have witnessed in Europe with the implementation of GDPR.

Cover image: Markus Spiske