Crossing the privacy line: a human perspective

As businesses continue to take advantage of consumer data, questions remain about how dealers should handle this delicate issue while ensuring a good customer relationship and experience.

Consumers are becoming increasingly aware of privacy these days — and it’s not just who has information on them, but how that information is used.

In a July article for this publication, I summarized a model of the three key segments of consumers when it comes to privacy:

Pragmatists: those who willingly share their data if they perceive the end result will benefit them;

Fundamentalists: those who are very guarded with their data and refuse or prefer not to share; and

Unconcerned: the “whatever!” people, or those who don’t care.

The segmentation was done by global consulting firm Acxiom/DMA in late 2018, and I believe it still holds up, even at the rate things change these days.

The segmentation becomes even more useful when we begin to think of the advances being made in the application of AI (artificial intelligence) capabilities to the management of customer information. With those advances comes a much greater need for any organization using AI to ensure the boundaries are clear: which customer data is being used, and how.

Avoiding the “creep factor” with your customers

There are two sides to the data/privacy issue. One is the legal side, over which a business has little control and which it must respect and adhere to. The other is the more human side — how your customers feel about the fact that you have data on not just their vehicle, but them as well, and that you likely use AI to your (and hopefully their) advantage in managing your relationship with them. That is the ethical side.

With more and more dealerships beginning to deploy AI tools to manage their customer relationships, it’s especially important to pay close attention to this human or ethical side. Just as you leave a footprint that can trigger targeted messages when you go online to buy merchandise or check out hotels or travel destinations, smart CRM systems today are able to target messages to your customers that are more appropriate, effective and timely. Your customers likely appreciate this, and it is an essential part of relationship building.

But there is a danger that the line is crossed and the customer starts to feel that you know them too well — it gets creepy. It would be similar to the way you feel when you’ve just done a quick search for a hotel in Spain and suddenly you’re inundated with ads for all things Spanish.

Recently, I came across a very interesting article that talks about this, along with what you can do to avoid it — or at least minimize the customer’s level of discomfort. It’s based on a presentation given by Katrina Taylor, head of user experience at the U.S.-based fashion rental firm Armoire — itself an innovative business highly dependent on strong customer relationships and word-of-mouth. The article, written by Amy Gesenhues, appeared on the online site Martech in November 2019.

In the context of AI, Taylor makes the point that “we want machine learning to be more human, but then we’re freaked out when it is.” She offers six pointers that organizations can use to minimize the “creep factor” that sophisticated AI might engender with customers.

These pointers assume a fairly high level of integration of AI within an organization’s customer data. Not every dealership may have that level of integration, but the CRM systems commonly used in our industry are capable of it, and they are increasingly in use across the dealership population.

Probably the one pointer that stood out most to me was number five — keeping humans in the loop. I’ve seen dealership technology (sales and service) evolve substantially over the years and there is one common thread in determining successful deployment: the accompanying human involvement to enhance the capabilities of technology. Taylor makes the point that “the purpose of tech (in retail) is to advance the human experience, not the other way around.”

Privacy is as much about the ethics as it is about the data

In Canada, we may think our regime of privacy laws and protocols is strict, but we’re still way behind other jurisdictions like the European Union (EU) with its GDPR (General Data Protection Regulation) regime.

We are headed in that direction and the new privacy environment will create challenges for Canadian businesses such as new car dealers, which collect, store and manage customer data of many kinds. Not to mention the data that will potentially be accessible via systems within the vehicles themselves, and which can reveal much about each owner.

I had a discussion about AI and privacy with Dr. Ann Cavoukian, a three-term Information and Privacy Commissioner of Ontario and currently Executive Director of the privacy consulting firm Global Privacy and Security by Design (Toronto).

Dr. Cavoukian is quite clear — there are significant privacy and ethical considerations that will become much more pertinent as businesses, large and small, expand their use of AI in managing their customer data. While serving as Expert-in-Residence at Ryerson University’s Privacy by Design Centre of Excellence, she outlined seven essential elements of a privacy policy for companies that deploy AI and need to address the ethics of doing so:

Transparency and accountability of algorithms are essential;

Ethical principles must be applied to the treatment of personal data;

Algorithmic oversight and responsibility must be assured;

There must be respect for privacy as a fundamental human right;

Data protection and personal control of privacy must be the default;

Companies must proactively identify security risks, minimizing harm to customers; and

There must be strong documentation to facilitate ethical design and data symmetry.

This might seem like overkill to some businesses, but as the legislative environment evolves, these steps will become the price of entry.

Dealerships using CRM vendors or systems with embedded AI capabilities must acknowledge their responsibility for safeguarding customer privacy and for taking an ethical approach to designing and implementing solutions. We tend to overlook the fact that developing AI capabilities not only involves customer consent, but also requires close attention to how the algorithms are “trained”. Does the forming and training of the algorithms meet ethical guidelines? The end user will likely be held accountable if it does not.

What about younger customers? Do they care about privacy?

Going back to the segmentation mentioned earlier, there are some clear patterns in how consumers in different age groups view, and are prepared to trade off, aspects of their personal data. Not surprisingly, almost six in ten consumers in the 18-24 age group are Pragmatists — the highest of any age group. Only 12 per cent are Fundamentalists (very guarded) and 31 per cent are Unconcerned.

We need to be careful, however, not to interpret these numbers as indicating that younger consumers don’t care about privacy. They do, but while they are more likely to accept the fact that many companies have their personal or secure data, they are also likely to demand that these companies protect it.

A recent blog by Savannah Peterson, founder of Savvy Millennial, paints a picture of the very different world that these consumers have experienced from a very young age. They recognize that for social media (and therefore, for many companies) “their most valuable asset is user data and it’s only now that they’re starting to capitalize on it. I feel like my data is more susceptible to being sold now more than ever.”

Another quote from Peterson’s article sums up the feeling: “I think our secure data floating around with so many different companies gives it a bigger chance of being stolen.”

We might think that dealerships carry relatively little personal data on customers, but that’s less and less the case these days. Personal information provided by the customer, communication via email, text or social media, financial information and information gleaned from the vehicle are now much more easily brought together. Layer on the rapid development of AI, and the potential to “creep” a customer in very specific ways is there.

What younger consumers will demand is that companies employ sophisticated and effective tools to protect their data and prevent misuse. These consumers are growing up in a world where concepts like blockchains are commonplace and are seen as the next level in data security. Fancy password protection or user access code policies won’t cut it — not even close!
