
Unwrapping the regulatory technical standards of PSD2

In a few weeks, many of us will be taking our Christmas holidays – the season to be jolly, the office Christmas party, excess, occasional restraint and looking ahead to 2018. What’s certain is the inevitable pace of change, innovation and the unexpected. What’s also certain, though, is that the provisions of the revised Payment Services Directive (PSD2) will apply from January 2018. Service providers will have a little more time – until around September 2019 – to meet the deadline for the regulatory technical standards (RTS).

The details of PSD2 have been discussed at length in many blogs – the increased competition, the better mobile experiences for users, etc. So, being an engineer at heart, I wanted to look at what technical guidance is given in the RTS. The complete report is available here and it covers what service providers must do to protect people from fraud and cyber crime. Central to the RTS is the requirement for Strong Customer Authentication (SCA): it sets out “the requirements with which security measures have to comply in order to protect the confidentiality and the integrity of the payment service users’ personalised security credentials”.

So, what’s required? Reading through the RTS it says:
“Payment service providers shall ensure the authentication by means of generating an authentication code”


“Payment service providers shall adopt measures mitigating the risk that the authentication elements, categorised as inherence and read by access devices and software provided to the payer, are uncovered by unauthorised parties.”
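The RTS deliberately stops short of prescribing how that authentication code should be generated. One widely deployed approach is a time-based one-time password in the style of RFC 6238, derived from a shared secret with HMAC. The sketch below, using only Python’s standard library, is purely illustrative – the secret, digit count and time step are assumptions of mine, not values mandated by the RTS:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, digits=6, step=30):
    """Derive a time-based one-time authentication code (RFC 6238 style).

    `secret` is the shared key agreed at provisioning time; `at` is a
    Unix timestamp, defaulting to the current time.
    """
    counter = int(time.time() if at is None else at) // step
    # HMAC the big-endian counter with the shared secret (RFC 4226 core)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte window
    offset = digest[-1] & 0x0F
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890", T = 59 -> "94287082"
print(totp(b"12345678901234567890", at=59, digits=8))
```

Because the code is a pure function of the secret and the clock, the server can recompute it independently, and a stolen code is useless once its 30-second window has passed.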

To me, this means that we need to protect sensitive authentication data on our devices from unauthorised parties. On a mobile device, this could be malware, installed by a malicious actor wanting to steal authentication data. Failure to protect sensitive data is number three in the Open Web Application Security Project (OWASP) Top Ten security risks (complete list here). It catches people out all the time and quickly makes headlines (Uber hack link).
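One concrete way to limit that exposure is never to store a credential in its raw form at all: keep only a salted, slowly derived hash, so malware that reads the app’s storage still doesn’t see the value the user types in. This is a minimal sketch using Python’s standard library; the iteration count and function names are illustrative assumptions on my part, not requirements from the RTS:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # deliberately slow, to frustrate offline guessing

def protect_credential(credential):
    """Return (salt, key) so the raw credential never needs to be stored."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", credential.encode(), salt, ITERATIONS)
    return salt, key

def verify_credential(candidate, salt, key):
    """Re-derive from the candidate and compare in constant time."""
    derived = hashlib.pbkdf2_hmac("sha256", candidate.encode(), salt, ITERATIONS)
    return hmac.compare_digest(derived, key)
```

The constant-time comparison matters too: a naive `==` on secret material can leak information through timing differences.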

Mobile application developers need to design applications in such a way that both sensitive application data and logic are protected. They should always use strong, proven solutions that have been subject to years of evaluation and peer review. Bruce Schneier wrote: “Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can’t break. It’s not even hard. What is hard is creating an algorithm that no one else can break, even after years of analysis. And the only way to prove that is to subject the algorithm to years of analysis by the best cryptographers around.”

So, back to the RTS, which also covers user interactions:

“1. Payment service providers shall ensure the confidentiality and integrity of the personalised security credentials of the payment service user, including authentication codes, during all phases of authentication including display, transmission and storage.”

It’s interesting that the RTS mentions the display. This is a harder problem to solve: how does an application developer secure the screen against malware such as a key logger or a screen-recording application? Point-of-sale devices used by merchants have secure displays, but the mobile phone in the user’s hand may not.

Trustonic demonstrated this very problem (and a solution) back in 2016; watch the video below for more details.

Thankfully, many vendors are addressing these challenges, including Trustonic with its unique Trusted Application Protection (TAP) solution, designed to bring the best possible on-device security to bear.