Guest Blogger

Personal data is the mother’s milk of all nascent intelligent assistants. That image may be a little more graphic than most of us would like, but it frames a discussion of the sanctity of the relationship between an individual and a virtual agent. Agents, both virtual and live, serve customers best when they have a decent amount of “context” to work with. In the enterprise setting, that starts with strong confidence that the individual on the line is the person he or she claims to be. That makes identification and verification a key component of the customer engagement model – but that’s a topic for another day.

Once a customer’s identity is established, the quality of service and engagement depends on the amount of accurate data or metadata that can be brought to bear to anticipate questions, recommend next actions and provide the best answers. Knowledge of past purchases and payments, loyalty program status and transcripts of past conversations can be culled from internal systems. Other data and metadata, such as current location, credit scores and other items of interest, can be obtained from third parties.

All these informational elements, while vital for establishing productive conversations, are also classified as personal or personally identifiable information (PII). For many years, enterprises have been bound by the European Union’s Data Protection Directive 95/46/EC (aka “The Directive”) in ways that varied slightly from country to country but were often distilled to “thou shalt only collect personal data for a specific purpose and with the informed consent of the person providing such data.”

In addition to the challenges The Directive creates around data acquisition, the costs that arise from data breaches – in direct expense and in lost reputation – are formidable. Under the current law, CEOs must have deep domain knowledge about all things data. What does the company collect about its customers? Where is it stored? What is done with that data? How long is it kept? Can customers request changes to, or removal of, data about them?

Failure to comply with The Directive has resulted in modest fines – now seen as a slap on the wrist. But the implications are much greater in brand and marketing terms. Highly visible breaches and compromises of personal data can cost jobs in the C-suite, and worse. And if you think that is bad, the advent of the General Data Protection Regulation (GDPR), which goes into effect in May 2018, will bring multi-billion-dollar fines and set a global scope for enforcement efforts.

This publication has featured a number of excellent articles about why companies must comply with GDPR and what it will take to do so affordably and effectively. Keith Dewar, a Group Marketing and Product Director, posted a very interesting set of thoughts not too long ago. He posited that the idea of gaining consent could be used to “re-establish trust of organizations.” He closed his article by noting that “[p]lacing individuals in control of their own data is the start of the trust relationship,” and then added that this would lead to “greater engagement between organizations and their customers.”

Keeping Control of One’s Personal Data

I agree with his idea, but I would take “control of personal data” one step further. It’s time to think about how engaged companies can put tools into customers’ hands that give them mechanisms for controlling when, how, to whom and under what conditions they release or reveal personal information. My recommendation is that this become a set of functions that a personal virtual assistant performs.
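To make that concrete, the kind of owner-controlled release described above could be modeled as a per-item consent record that the assistant checks before disclosing anything. This is a minimal sketch under stated assumptions – the `ConsentPolicy` structure, field names and recipients here are hypothetical illustrations, not any vendor’s actual API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentPolicy:
    """Hypothetical per-item consent terms an assistant could enforce."""
    data_item: str                      # e.g. "home_address"
    allowed_recipients: set[str]        # parties the owner has approved
    purpose: str                        # the specific purpose consented to
    expires: Optional[datetime] = None  # consent lapses after this moment

    def permits(self, recipient: str, purpose: str,
                now: Optional[datetime] = None) -> bool:
        """Release only if recipient, purpose and expiry all match the owner's terms."""
        now = now or datetime.now(timezone.utc)
        if self.expires is not None and now > self.expires:
            return False
        return recipient in self.allowed_recipients and purpose == self.purpose

# Example: the owner consents to sharing an address with one retailer, for delivery only.
policy = ConsentPolicy(
    data_item="home_address",
    allowed_recipients={"acme-retail"},
    purpose="delivery",
)
print(policy.permits("acme-retail", "delivery"))   # True
print(policy.permits("ad-network", "marketing"))   # False
```

The point of the sketch is that the decision logic lives with the individual’s agent, not with the requesting company – the assistant answers “may I release this?” on the owner’s terms.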

Managing one’s digital identity, keeping track of log-ins and passwords, asserting primacy in a loyalty program and even attaching terms and conditions to the use of one’s data are tasks that many individuals find so onerous that they avoid them entirely. As they surf the web for bargains, for instance, they will often just “click the button” to log in with Facebook, Twitter, Google or another popular source of social sign-on.

I’d like to see individuals turn to a selected personal assistant – be it Alexa, Google Assistant, Cortana, Bixby, Arbo (from Panasonic), Frank (from MyWave) or others yet to be named – as a trusted entity that navigates digital commerce domains securely and in a way that conforms to their expressed preferences regarding the treatment of personal data. It’s a tall order, and one that requires more discussion and specification.

Fostering plans for Intelligent Assistance that establish “self-sovereignty” as the governing principle for treating personal data in the world of digital commerce is something to aspire toward. It is also a major topic for discussion at the Intelligent Assistants Conference in London in May.

