As ‘gatekeepers’, financial institutions have a legal obligation to prevent money laundering and the financing of crime and terrorism. It is their job to know their customers and to monitor their transactions. As Kees van Dijkhuizen, CEO of ABNAMRO, recently rapped over the knuckles for this by the Dutch authorities, put it: "We have to know precisely which customers bank with us and what they do with their accounts. We also need to assess the risks of every customer, both in retail and in business. Ultimately, this is important to detect money laundering and to assess the risk of financing terrorism."
Segment of one
All banks are looking for ways to use their data to improve their offerings. In a bank’s massive databases lies the promise of personalizing services to market segments and even to individual customers. Banks plan to go beyond personalization by segment and develop individualized communication and experiences for the ‘segment of one’. Enabled by data, advanced analytics and digital technologies, this is the ultimate level of innovative personalization to boost the core business.
On June 6th ING presented a first step towards profile-based services in a revision of its privacy statement. "From your bank statement," argues Annet van der Hoek, director of ING’s Customer Services, "we know you a little. With this knowledge we are able to advise you, or, at the right time, offer you one of our own ING products." Think of a suggestion to open a savings account if the bank notices the payment of a first child allowance, indicating family expansion.
Flooded with comments from angry customers, and after critical reviews by the Dutch Consumers’ Association, the Dutch Data Protection Authority and the supervisory authorities in the financial industry, ING quickly postponed its plans. The very next day ABNAMRO decided to shelve its own programme for now.
Profiling and automated decisions
This secondary use of customer data in customer profiles is addressed in the General Data Protection Regulation (GDPR).
The underlying logic is simple. Many people do not know they are being profiled and do not understand how it works. As a result of profiling, somebody may be labeled and, on the basis of wrong or incomplete data, suffer serious consequences, such as the wrongful refusal of certain products and services, or even discrimination. Profiling and automated decision-making therefore involve privacy risks for the persons concerned.
Decisions taken and executed purely on the basis of automated processing of data are forbidden. Say an algorithm decides whether an institution grants someone a loan, and the decision is communicated without any human intervention. Or think of a decision with legal effect: the refusal or award of housing allowance or child benefit, or the automatic disconnection of your mobile phone because of an unpaid bill.
There are exceptions to this rule. First of all, automated decisions with a specific legal basis; think of automated checks on and the prevention of tax evasion. Also excepted are decisions taken with the explicit consent of the person concerned. But take care: consent management is not a simple ‘opt-out’, a list of people who have indicated they do not want to cooperate; it requires an explicit, recorded opt-in.
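As a minimal sketch of the logic above (the names and structure here are my own, not taken from the regulation), the rule plus its exceptions amount to a simple gate: an automated decision with significant impact may only proceed on a specific legal basis or with recorded explicit opt-in consent; otherwise it must go to a human reviewer.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DecisionContext:
    """Context for one automated decision about one person (illustrative)."""
    has_significant_impact: bool   # e.g. a loan refusal or benefit cut-off
    legal_basis: Optional[str]     # e.g. "tax-evasion-prevention", else None
    explicit_consent: bool         # a recorded opt-in, never assumed silence


def may_decide_automatically(ctx: DecisionContext) -> bool:
    """Return True only when one of the sketched GDPR exceptions applies."""
    if not ctx.has_significant_impact:
        return True               # outside the scope of the prohibition
    if ctx.legal_basis is not None:
        return True               # exception: a specific legal basis
    if ctx.explicit_consent:
        return True               # exception: explicit opt-in consent
    return False                  # otherwise: route to human review


# A loan refusal with neither a legal basis nor consent must go to a human:
print(may_decide_automatically(DecisionContext(
    has_significant_impact=True, legal_basis=None, explicit_consent=False)))
# False
```

Note that absence of an opt-out is deliberately not a branch in this gate: silence never counts as consent.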
Key to understanding the regulation of profiling and automated decisions is the ‘material nature’ or ‘significant importance’ of the impact.
Targeted advertising
Online advertising is usually based on profiling and automated decisions. The European Data Protection Supervisor (EDPS) has assessed ‘targeted advertising’ as an activity without ‘significant importance’ for the privacy of the targeted audience.
But there are cases where a targeted advertisement does meet the criterion of ‘significant importance’. It depends on the level of detail of the profile, the expectations of the audience, the way the advertisement is delivered, and specific vulnerabilities of the people profiled. You may not, for example, target people with financial debts with advertisements for gambling or payday loans.
In its guidelines of October 2017 the EDPS tried to clarify the criteria for acceptable ‘targeted advertising’. The responses of both Dutch banks make it absolutely clear that it is hard to predict how automated decisions will be judged by the Dutch Data Protection Authority. Rest assured, you have not heard the last of this!
Impact assessment
Under the GDPR, a Data Protection Impact Assessment (DPIA) is the vehicle to assess the impact of the collection, storage and processing of personal data. A DPIA is required whenever you begin a new project that is likely to involve ‘a high risk’ to other people’s personal information. The initiative has to be taken by the data controller, and the assessment contains at least: a systematic description of the processing and its purposes; an assessment of the necessity and proportionality of the processing; an assessment of the risks to the rights and freedoms of the data subjects; and the measures envisaged to address those risks.
The DPIA also assesses and documents the impact on minorities and vulnerable people.
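To make the ‘high risk’ trigger concrete, a project intake form could flag when a DPIA is needed. The criteria below are my own illustrative shorthand for indicators mentioned in this article (large-scale profiling, decisions with legal effect, vulnerable data subjects), not the full legal test; the two-or-more rule of thumb is likewise an assumption for the sketch.

```python
def dpia_required(profiling_at_scale: bool,
                  decisions_with_legal_effect: bool,
                  vulnerable_data_subjects: bool,
                  new_technology: bool) -> bool:
    """Flag a project as 'high risk', meaning a DPIA is needed before
    processing starts. Illustrative subset of criteria, not legal advice."""
    risk_indicators = [profiling_at_scale,
                       decisions_with_legal_effect,
                       vulnerable_data_subjects,
                       new_technology]
    # Rule of thumb used here: two or more indicators -> do a DPIA.
    return sum(risk_indicators) >= 2


# Profile-based product offers derived from payment data would tick at
# least the profiling and new-technology boxes:
print(dpia_required(profiling_at_scale=True,
                    decisions_with_legal_effect=False,
                    vulnerable_data_subjects=False,
                    new_technology=True))
# True
```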
Compliance is not obvious
Profiling and automated decision-making can be very useful for both organizations and the people concerned. Organizations are better equipped to segment their markets and tune products and services to individual needs, to the benefit of those individuals. The question is: how do you get through this minefield?
Data protection has many angles, and none is self-evident. GDPR is not about ticking boxes; it is about a commitment to properly manage both the trust that big-data-driven business depends on and the right of individuals to control their own lives.
Technology can help you find your way. A well-designed application will help organizations, step by step, to comply with the many aspects of the GDPR.
Simplifying GDPR compliance starts with delivering GDPR capabilities to all parts of the organization. Build a ‘single source of records’ to prevent gaps and ambiguity: one unique and essential source for the different data processing routines.
All processed data need to be documented in a Record of Processing Activities (ROPA). In our solution, each piece of personal data is classified and mapped to a business system or application via your configuration data, which ensures that any data breach is fully understood and effectively managed.
This is the foundation of a ‘single system of action’: an unambiguous and automated structure of workflows to handle subject access requests, consent management, data portability, and the registration of data breaches and other incidents. The result is a ‘single structure of control’, and with it peace of mind for everyone in charge of data protection and privacy.
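The core of such a single source of records can be sketched as a small registry that classifies each personal-data element and maps it to the system holding it, so that a breach in one system immediately yields the list of affected data. All names, classifications and systems below are illustrative, not taken from any actual product.

```python
from collections import defaultdict


class Ropa:
    """Minimal Record of Processing Activities:
    maps personal-data elements to the business systems that hold them."""

    def __init__(self) -> None:
        self._by_system = defaultdict(list)

    def register(self, element: str, classification: str, system: str) -> None:
        """Classify one piece of personal data and map it to a system."""
        self._by_system[system].append((element, classification))

    def breach_report(self, system: str) -> list:
        """On a breach in `system`, list the personal data affected."""
        return self._by_system[system]


ropa = Ropa()
ropa.register("customer_name", "ordinary personal data", "crm")
ropa.register("iban", "financial data", "payments")
ropa.register("transaction_history", "financial data", "payments")

print(ropa.breach_report("payments"))
# [('iban', 'financial data'), ('transaction_history', 'financial data')]
```

The same element-to-system map is what the subject-access and data-portability workflows would query, which is why a single, unambiguous source matters: every routine answers from the same record.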