Tuesday 2¢: A Semantic Database, Artificial Intelligence and Why Marketers Should Care

Ian Truscott, July 25, 2017
  • Digital Marketing
  • Technology

Welcome to the Tuesday 2¢. It’s Tuesday, the weekend is a distant memory and it’s time to let off some steam and give our 2 cents on a hot industry topic. This week Ian Truscott gets his geek on, talking about semantic databases and why marketers should care about them.


As you’ll read on this website, one of the things that is different about censhare is that it’s developed around the concept of a semantic graph database, rather than the relational database management system (RDBMS) that content management systems have used for decades.
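To make that distinction concrete, here is a minimal sketch, in plain Python, of the graph idea: content and its relationships stored as subject-predicate-object statements that can be followed directly, rather than as rows reassembled through joins. The asset names and relationship labels below are hypothetical illustrations, not censhare’s actual data model or API.

```python
# A minimal sketch (not censhare's data model): content assets and their
# relationships expressed as subject-predicate-object triples, the basic
# shape of a semantic graph, rather than as rows joined across tables.

triples = [
    ("article:42", "hasTopic",     "topic:running-shoes"),
    ("article:42", "hasImage",     "asset:hero-shot-17"),
    ("article:42", "translatedAs", "article:42-de"),
    ("offer:7",    "promotes",     "product:trail-shoe-x"),
    ("product:trail-shoe-x", "belongsTo", "topic:running-shoes"),
]

def related(subject, predicate):
    """Follow one kind of relationship from a node in the graph."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Which topic does this article cover, and which offers promote products in it?
topic = related("article:42", "hasTopic")[0]
offers = [s for s, p, o in triples
          if p == "promotes" and topic in related(o, "belongsTo")]
print(offers)  # ['offer:7']
```

The point of the sketch is that a question like “which offers relate to the topic of this article?” is answered by walking existing links, rather than by stitching a query together across tables.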

There are a lot of technical buzzwords in that description. It’s something we have put a lot of R&D investment into, and we have a patent pending on the latest iteration of this technology. It is the “Smart” in our Universal, Smart Content Management Platform. But why take this approach, and what does it mean for marketers?

The simple answer is that it’s extremely fast at delivering content, particularly content based on complex relationships for use cases such as personalization, and it scales massively. The bigger motivation is to build something that looks forward, into the future of content management and its delivery.

To describe this forward-looking position, I often talk about semantic technology being the foundation for the next wave of the customer experience: Artificial Intelligence. As we can see from consumer devices like Google Home and Amazon Echo, AI is now more than a future trend; it is a reality in consumer content consumption.

Virtual assistants are also playing a greater role on the front line of customer interactions, with automated call handling in call centers and chatbots appearing on websites that attempt to triage and deal with simple customer queries, or establish basic information before handing over to a real, live operator.

Those are the obvious use cases for AI, simulating a human interaction. But let’s consider a less “in your face” form of intelligent customer interaction: the marketer’s current challenge du jour, personalization. Or, to be more specific, the delivery of relevant content and offers to the consumer across multiple channels, using the plethora of big-data breadcrumbs the consumer is sharing with us.

The business case for personalization is proven: it improves every metric a marketer has, from open rates to engagement, and in B2C it fuels cross-sell, upsell and repeat business. Yet the history of delivering relevant, personalized content has traditionally been based on creating simplistic, hand-cranked rules.

Meanwhile, the context of the consumer into which we want to serve content is ever more complicated: more devices, more touchpoints and a more fragmented consumer journey. Consumers are also increasingly savvy about what a good experience should be, what their data is worth and what marketing technology is capable of. And this isn’t just B2C; B2B buyers are expecting the same experience.

Therefore, the assumptions in these simple personalization rules are very easy to get wrong, and the consumer can see it when it happens.

Everyone has a story about terrible personalization, such as buying a gift online for a niece and then having a digital customer experience that treats them like a 13-year-old girl, with product recommendations that follow them around on sites such as Facebook.
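For illustration only, here is a minimal sketch of the kind of hand-cranked rule that produces that experience; the rule and the visitor attributes are hypothetical, not taken from any particular personalization product.

```python
# A minimal sketch of a hand-cranked personalization rule (hypothetical names,
# not any specific product's rule engine). A single purchase is enough to trip
# the rule, which is exactly how the gift-for-a-niece mis-targeting happens.

def pick_banner(visitor):
    # Rule: anyone who has bought from the teen-fashion category sees teen offers
    if "teen-fashion" in visitor.get("purchased_categories", []):
        return "banner:teen-fashion-sale"
    # Fallback: everyone else gets the generic banner
    return "banner:generic"

uncle_buying_a_gift = {"purchased_categories": ["teen-fashion"]}
print(pick_banner(uncle_buying_a_gift))  # banner:teen-fashion-sale (a wrong guess)
```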

The need to apply some intelligence is apparent.

Therefore, content delivery driven by some form of Artificial Intelligence is the next step in the evolution of our systems of engagement and clearly needs to be considered a component of a digital experience strategy.

And, according to Virtual Strategy Magazine, it starts with the underlying database or data model:

[Artificial Intelligence] ...starts with the basic principles of semantic technologies and data linking, which create a basic framework for artificial intelligence and cognitive computing to exist. The linking of data in a declarative, standardized manner is integral to the forms of pattern recognition and machine learning that define cognitive computing.
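As a rough illustration of what “linking data in a declarative, standardized manner” can mean in practice, here is a small sketch that stores facts as triples and answers questions by pattern matching over them. The data and the tiny match helper are hypothetical stand-ins for standards such as RDF and SPARQL, not a real implementation of either.

```python
# A minimal sketch of declarative, linked data: facts as triples, queried by
# pattern rather than by hand-written joins. Names are illustrative only.

facts = [
    ("visitor:88", "viewed",   "article:42"),
    ("visitor:88", "viewed",   "article:43"),
    ("article:42", "hasTopic", "topic:trail-running"),
    ("article:43", "hasTopic", "topic:trail-running"),
    ("offer:7",    "targets",  "topic:trail-running"),
]

def match(pattern):
    """Return variable bindings for one triple pattern ('?x' marks a variable)."""
    out = []
    for fact in facts:
        binding = {}
        for p, f in zip(pattern, fact):
            if p.startswith("?"):
                binding[p] = f
            elif p != f:
                break
        else:
            out.append(binding)
    return out

# Which topics has this visitor shown interest in, and which offers target them?
topics = {b["?t"]
          for v in match(("visitor:88", "viewed", "?a"))
          for b in match((v["?a"], "hasTopic", "?t"))}
offers = [b["?o"] for t in topics for b in match(("?o", "targets", t))]
print(offers)  # ['offer:7']
```

Because the relationships are explicit and uniform, software can discover a pattern like “this visitor keeps viewing trail-running content” without anyone hand-writing that rule.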

This is even before we get to how this technology can scale, as content storage demands become ever more complex and we consider voice, virtual reality, augmented reality and 3D models as “content” to be managed and served to the consumer.

Our opinion is that if you are going to choose a technology to invest in and put at the centre of a 21st-century content operation, with all of this to come, then a semantic database is the sensible choice.

Ian Truscott
Ian Truscott has a passion for creating ART (Awareness, Revenue and Trust) for B2B software companies as a marketing leader and is a censhare alumnus. Wanting to connect a like-minded community and share something useful, he founded Rockstar CMO, a monthly digital publication, and is currently helping B2B companies create ART at appropingo.
