The economics, structure, and behavior of platform ecosystems and organizations

Customer and partner data will be some of the glue holding together ecosystems

There are two existing models of trust that are relevant to business. Let’s call them “experience trust” and “emotional trust.” We are going to explore two new models of trust, explain why they are so disruptive, and create a strawman as a way of thinking about the way forward.

Experience trust is simple to grasp. Think about using your bank card, pushing the brake pedal in a car, getting on a plane, charging your phone, posting a picture on Facebook, using a vape pen, drinking water, taking a taxi, texting, and so on. Every time you do something, the ‘experience’ functions, within reason, as you expect it to. Expected feedback loops reinforce the message that whatever you use can be trusted. Society depends on experience trust; it makes life simple and convenient. As the old advert goes: “it does what it says on the tin.” Virtually every brand and every company has close to 100% experience trust, as without it there are unlikely to be new, let alone repeat, customers. Rules, regulations and standards make services repeatable anywhere, at any time, from any provider. In essence, experience trust makes usage and choice easy.

Emotional trust is more subtle: “Do I believe that the company I am about to use has my best interests at heart?” Your bank can make the payment (experience trust: I know the payment will happen as promised), but that is not the same as trusting the bank to sell the best products, service or advice, or to reward a long-standing customer for loyalty in the way providers bend over backwards to entice and reward new customers. Whilst there are always a few exceptions, the reality is that pharma, government, the church, charities, banks, social media, medical, insurance, CPE, retail, gaming, media, and auto have eroded our natural goodwill, our emotional trust. As consumers we feel that “brands don’t have our best interests at heart.” Yes, you can use any service and trust it will do what you want (the joy of regulation and standards), but we have generally lost faith in companies’ purpose, ethics, morals, and integrity. To hide this stark reality, many of the world’s biggest brands spend vast sums on marketing and branding to keep consumers focused on the utility of the experience as the reason to trust them. We have little option but to fall back on experience trust as our best mechanism for selecting one brand over another, leaving emotional trust out of the decision-making cycle.

However, this is not healthy. We don’t ‘emotionally trust’ digital brands at one level because we know that our data is being used and abused as the mechanism to make money, to create value and to generate wealth for the business and its shareholders in exchange for the service. We read this motive into every piece of communication with us: their gain comes at our loss on some level, however subtle.

Something disruptive and novel is happening, and it opens up a whole new component of trust that was either lost, forgotten or perhaps never existed. This will, I am sure, be debated. For want of a better way to describe it, I will label it enablement trust (keeping everything to E: the third E of trust!). Before we describe enablement trust, we need to remind ourselves of the context of data portability and data mobility, where the consumer can ask for a copy of their data back. The wider concept (which is not new) is that the user is the best person to keep a copy of their own data, the base thinking being that the user will want the data to be correct and that there is only one source. Up to now (2019) this has been a very hard concept to grasp and evaluate, but companies are emerging who make this ideal simple, such as digi.me. As an analogy for the way it works, consider Microsoft Outlook: we don’t understand how email works, but we use it. Likewise, we don’t need to understand how giving users control of and consent over their data works, but it must be secure, private, trustable and simple.

Why disruptive? BigTech, banks, and corporates collect and control your data in their silos. They offer users products based on their limited view of the user (a subset of all user data). From this limited position they offer and sell products and services for which there is a good business case, but which may not be in your best interest because, for example, the business case for the service that would have matched your needs did not work at scale. In taking the scale option they erode your emotional trust in them. As we have become increasingly digital, brands are collecting and keeping more data on and about you, and through the privacy statements we all sign up to they have the legal right to reach you. Owning your data gives them the power in the relationship, and they hope that experience trust and marketing are enough to keep you “loyal.”

Were the data back in your own “care,” or were you at least to have greater control of it (recent legislation is designed to do this: GDPR, PSD2, data portability and data mobility, which together add up to the data economy), the user could decide who looks at their data and who can provide products and services. When the user controls not just the data but also the consent (the ability to switch on and off who has access, and to what), the power balance shifts.
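The consent switch described above can be sketched in a few lines of Python. This is a minimal illustrative model of a user-held consent register, not any real product’s API; the class and method names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRegister:
    # Hypothetical sketch: maps provider -> set of data categories
    # that provider is currently allowed to read.
    grants: dict = field(default_factory=dict)

    def grant(self, provider, category):
        # The user switches a provider on for one category of data.
        self.grants.setdefault(provider, set()).add(category)

    def revoke(self, provider, category=None):
        # The user switches a provider off, for one category or entirely.
        if provider not in self.grants:
            return
        if category is None:
            del self.grants[provider]
        else:
            self.grants[provider].discard(category)

    def can_access(self, provider, category):
        return category in self.grants.get(provider, set())

# Usage: the user, not the corporate, flips the switches.
register = ConsentRegister()
register.grant("bank_x", "transactions")
print(register.can_access("bank_x", "transactions"))  # True
register.revoke("bank_x")
print(register.can_access("bank_x", "transactions"))  # False
```

The point of the sketch is the asymmetry: the corporate can only read what the register currently grants, and the user can withdraw that grant at any time.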

“Brands have largely paid lip service for the past 30+ years to customer first, customer centric and all that thinking” (Anthony Thomson). When data is back with its creator (the user), the customer genuinely comes first, as the corporate now has to ask the customer for permission to look at the customer’s data. This changes the relationship.

Enablement trust changes a vital aspect of the trust relationship. If the corporate fouls up in the current model, the user has very few real alternatives. In the new model, the user can turn off the feed to their data. The corporate that works out that putting the user first, and genuinely creating amazing customer experiences, is the one the user may choose to keep using and to allow access to ever more new and rich data. This enables the corporate to re-establish emotional trust by showing the user that it has the user’s best interests at heart: because it has to if it wants to compete, and because it is the smart thing to do.

This concept does not wipe out or destroy the existing model or Brands. It allows a few Brands to move into a new position. “First Movers” will be rewarded.

The idea of giving control of data back to users could induce two kinds of response. The first is the immediate response. ‘Lock down, never going to happen. This weakens our position. We have invested to create this data. The value is in control. We know better than the user.’ In summary: defense and defend. This position will survive and companies who are here will still flourish.

The second response, more subtle, is that we have a chance to change the game: we can be the first mover; we can win by doing what we have always said we would do, putting the customer first, every day, in everything we do.

Brand Values for Banks

One argument from banks and the wider fintech market is that they are the safest place to store data. However, when we centralise value (money, for instance), it becomes attractive to rob, pinch, steal or walk off with, and protecting it comes at an ever-increasing cost.

In the old language the rationale could be summed up as “dynamite and vault”; in our modern language, “very attractive hacker economics.” The centralised deposits become of great interest to those who want to take (pinch) user data, control it, and see it as a way to increase their own value. Centralisation works first for the institutions and second for the consumer.

A more subtle part of the conversation (should we trust banks with our data in a big vault?) turns to where modern-day value comes from with data, and that is in the sharing of data. The essential value of data is relational. Data in a vault with no access (other than you with your key) has limited value. There is some value possible for an individual if they want the bank to monetise their data in very small increments (which might “add up” nicely over time), but that is a different story. This area will pit banks and BigTech against each other.

The issue banks have lies in the inherent tension between their brand value and the data economics we have been discussing. Their brand value today is built on being a trusted champion of ‘our’ data security and privacy. Their charter with us is to defend, hold, keep, store and protect: your cash will be protected; in fact, we are so sure, we will give you a guarantee.

However if the value for your data comes from the sharing, how does this align to the lock it up, store it, and protect it way of building brand value?

Data from “The 15 Most Common Brand Positions in Retail Banking”: if you use one of these 15 common themes, you’ll have to apply your entire organisation to it with gusto. That means 100% at every customer-facing touch point. You have to go above and beyond in order to stand out. If you don’t align every aspect of your organisation around your brand, you’ll just end up being another commoditised “also-ran.” That’s what separates “great service” from “lip service.”

However, not one brand value is about sharing your data! Worth pondering is this: if banks focused on transactions/payments as a brand value, and not on safekeeping, protection and guarantees, the argument would shift. Payments work because sharing is fundamental to them. You need to share your card, share details, share the payment processes; payments align far better with new digital banking thinking.

Before we look at the new model for sharing data from banks and brand alignment, we need to revisit the four kinds of trust we have been building up. We know experience trust is good, we know emotional trust is broken, we can guess enablement trust is coming (probably from a growth fintech rather than an incumbent); however, where is the fourth one, Tony?

The fourth trust we need to unpack is about the employees and directors of an institution having trust in their own system and systems. This is not experience trust, which is about how the user perceives what the company does and how it works. System trust is about governance: not corporate governance in the comply-or-explain sense, nor the dark arts of data governance, which is a whole different topic. ‘System trust’ is about the corporate governance of data, so that directors and officers can be held accountable for the systemic way in which data is ‘managed’.

Imagine you are the CEO of a bank. I am going to ask you a few questions over a coffee. Let’s see how we get on.

Hello Jenny (CEO of bank x), have a think: do you trust your CFO? But please don’t answer yet.

Do you meet your CFO pretty much daily and talk about finance and numbers? “Yes I do.”

Do you have a monthly board meeting and spend 50/60% time talking finance? “Yes we do.”

Do you have an accounting system? “Yup.” Are your monthly P&L, balance sheet and cash flow generated from the system? “Of course.”

Do you have independent NXDs (non-executive directors) for remuneration and audit? “That’s just good practice.”

Do you have an external auditor who you churn every now and again? “Again, just good practice.”

Right, back to the CFO. Do you trust her? Let’s be honest: you trust the system that the CFO is responsible for. You trust the CFO for analysis, experience and insight.

Do you believe that data is already or going to be a core/key/critical asset for your business survival and growth? YES!

Phew. That always worries me.

Who is accountable for data: your CTO, CDO, COO, CIO? Do you trust your CxO? Please don’t answer yet. Let’s assume that at least one of them is accountable. Do you meet your CxO all the time and talk about data and analysis? “Actually, no.”

Do you have a monthly board meeting and spend 50/60% time talking data? “No.”

Do you have a system for meta-data reporting across the organisation, including suppliers, customers and employees? “No.”

Do you have any data generated about data from a system that you report on? “No.”

Do you have independent NXD for data, ethics, privacy? “Er, no.”

Do you have an external auditor who looks at your data? <rueful smile> “I think you already know the answer to that one, Tony!”

Do you trust the CxO? Let’s be honest, you have no idea and you are not alone!

How are you reporting on consent? How do you know where the data used for marketing comes from? How are you tracking how you track people? How are you ensuring that data given to suppliers is used in accordance with the terms of the contract? How do you test data provided to you for use, to check that you are compliant with your own terms? Is privacy consistent? How are you analysing test data for automated decision-making? How many automated decisions are you making? Who is checking for bias in your data and automation? Do you know where your data came from? How do you know that your data is real data? How are you checking that the HR software you use does not bias against anyone?
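As a sketch of what board-level “data about data” could look like, the snippet below summarises some hypothetical consent records by source and legal basis. All field names, values and the report shape are illustrative assumptions, not any real reporting standard.

```python
from collections import Counter

# Hypothetical consent records: where each item of personal data came
# from, the legal basis claimed for using it, and whether it has been
# passed to a supplier. Purely illustrative data.
records = [
    {"source": "web_signup", "basis": "consent", "shared_with_supplier": False},
    {"source": "data_broker", "basis": "legitimate_interest", "shared_with_supplier": True},
    {"source": "web_signup", "basis": "consent", "shared_with_supplier": True},
]

def metadata_report(records):
    # A minimal board-level summary: counts by provenance and legal
    # basis, plus how much data has left the organisation.
    return {
        "by_source": dict(Counter(r["source"] for r in records)),
        "by_legal_basis": dict(Counter(r["basis"] for r in records)),
        "shared_with_suppliers": sum(r["shared_with_supplier"] for r in records),
    }

print(metadata_report(records))
```

Even a toy report like this would let a director answer “where did our data come from?” from a system rather than from anecdote, which is the substance of the questions above.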

Why is this important to get our heads round? Decisions and working relationships are increasingly shaped by data and now of course by AI, which ‘feeds’ on data like whales feed on phytoplankton. How can we have faith in the decision if the data is not trustworthy and we do not have a line of sight into how the new elemental force that is data is shaping the working practices of the company? How can the Board do its work in this new area without that line of sight?

How does this story so far offer value and insights into the problem and solution?

Let’s start from the user centric view of the market. The two axes selected for this viewpoint of the market are about the user having control and the user experience of trust/exploitation. Why these two axes? If we want to grow the data economy, an assumption is that growth will come from the user having more trust and more control, which means that there has to be better regulation, standards and governance.

Plotting these as X and Y, we can show the existing market models, the effect and perception of branding, and where the regulator wants the market to operate.

First up: the majority of existing models for data sit where the user does not feel in control and where the user feels exploited.

Branding then takes this bottom-left existence and makes users (consumers) believe that we have more control, or are less exploited, than we actually are. For example: Facebook gives the user preference controls over their data. This does not shift its actual business and data model, but many users feel more in control. Google and Twitter provide services for free as part of a value exchange; the user trusts the service (email, say) but rarely thinks about how their data is exploited for commercial gain. Apple’s position is interesting: the user feels trusting and in control, but is neither. Apple is in total control and exploits the user through branding, experience and lock-in. Apple is a walled garden, and we regulated walled gardens into new models starting in 1990/92; just think of AOL, or the unbundling of IE4.
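The two-axis map can be sketched as a simple classifier. The quadrant labels, the 0.5 thresholds and the example scores below are illustrative assumptions, not measured data; the only point is to make the axes concrete.

```python
def quadrant(control, trust):
    # Classify a brand by where it sits on the two axes: how much
    # control the user actually has (0..1) and how much trust the
    # user feels (0..1). Thresholds are arbitrary for illustration.
    if control >= 0.5 and trust >= 0.5:
        return "target: user in control and trusting (where the regulator wants the market)"
    if control < 0.5 and trust >= 0.5:
        return "branded comfort: user feels trust but is not in control"
    if control >= 0.5 and trust < 0.5:
        return "in control but wary"
    return "status quo: user feels exploited and not in control"

# Illustrative, made-up placements of the models discussed above.
examples = {
    "typical data silo": (0.1, 0.2),
    "walled garden": (0.2, 0.8),
}
for name, (c, t) in examples.items():
    print(f"{name}: {quadrant(c, t)}")
```

Branding, in this picture, moves a brand’s *perceived* coordinates up and to the right without moving its actual ones; regulation tries to move the actual ones.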

The regulator’s desire is to provide a framework for a market where the user is in control and also feels trust in the service (for the right reasons).

The regulator has a big tool bag to try to drag the models from the existing position to the new one. However, the companies have an equally large toolkit to ensure that the regulator remains frustrated and to prevent the move, because the model they have delivers super profitability and growth, and the regulatory model will probably deliver neither.

Health and Safety (H&S) provides an interesting case study. For a long time the regulator encouraged change in behaviour with best practices, rules, regulations and even fines. What created change was directors becoming accountable and responsible for H&S with the introduction of the Health and Safety at Work etc. Act 1974. The key was to prevent delegation of responsibility and to make a breach of H&S a criminal matter. https://en.wikipedia.org/wiki/Health_and_Safety_at_Work_etc._Act_1974

What changed was that there was suddenly a reason and a motivation to change.

Current regulation around privacy is too weak. My view is that we should follow the H&S framework: extend the new Australian “abhorrent violent material” laws to include privacy and make a breach of privacy a criminal offence.

However, this is probably not enough on its own. We need to think about, and discuss, how to upgrade our corporate governance processes to incorporate data and its implications. Data is not oil; data is data. We have, since 1992, been improving reporting, the management of information and the oversight of process to deliver corporate governance.

The “new” force of data creates the need for an upgrade to the existing approaches, because data is unique. It is not like finance, cash, assets, people, suppliers or operations.

The proposal is that best practice governance will form a third committee for an organisation that has to report to public shareholders. This third committee would stand alongside the remuneration committee and audit committee and would be something like a “Privacy and data ethics committee.”

We have to help companies take a different approach from finding ways to avoid regulation. We also need to help ourselves, as there is increasing confusion due to the volume of conflicting regulation, by creating simple accountability and responsibility metrics that rest with the board. As with H&S and fraud, the directors cannot then say it is someone else’s problem or that there was a process problem.

Holding directors to account for data means that we can have trust in the systems. This will mean that some players can be more transparent, which is everything that BigTech does not do today. Why will that help? Winning more customers.

Trust and transparency in the system, and taking the customer on the journey through storytelling (imaginative TV ads), is one way that springs to mind for existing banks with brand values to compete in this new world. There will be other ways, and the introduction of more personalised engagement helps to show that a brand really cares about me and puts my best interests first. But if there is no faith in the system, that is a difficult customer “promise” to deliver on. If not overseen at board level with a meaningful flow of “management information,” such ideals risk looking shallow, and could be misinterpreted by customers, and maybe some employees, as a new kind of manipulation. This would actually worsen the trust gap. Don’t make promises you cannot keep, or at least cannot show, through process and review, that you have every intention of keeping. The system will need to move swiftly, and in line with the brand promise, when things go wrong.

The more everybody in the organisation can trust the way in which data is managed and “responded to,” the better the relationship will be between human judgment and accountability on one side, and data and its sibling “AI” on the other. Better data will then lead to better decisions and more margin.

We have covered experience trust (it works); emotional trust (do they work in your best interest? broken); enablement trust (give the user their data and ask them to trust you); and system trust (making directors accountable and responsible for data, which changes the culture and attitude towards data).

The difference data makes is that we can now see who does what they promise and who does not — then we can trust. How much work is a Board really willing to do to be seen as “Trustworthy?”

The first two trust models worked in a pre-data world; we now need to think in a data-driven world.


Originally published at https://medium.com on May 31, 2019.