The new generation of cardholder authentication: “3D Secure 2.0”
3D Secure has often been promoted (and still is) as the magic miracle cure for the misery of payment defaults on the merchant side. Initiated by VISA in the early years of this millennium and prominently positioned as exactly that, the scheme soon showed its teething troubles, first and foremost the problems with the “conversion rate” among merchants using 3D Secure. The additional authentication step caused unintended payment abandonments by cardholders and thus reduced the sales of the affected merchants. The conversion rate describes the ratio of visitors of an online shop (measured in clicks or sessions) to conversions, i.e. the share of prospective or interested buyers who actually become buyers.
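A minimal sketch of the metric in Python (the function name and the sample figures are illustrative, not taken from the article):

```python
# Illustrative only: the "conversion rate" as used here, i.e. the share of
# shop visitors (clicks/sessions) that complete a purchase.
def conversion_rate(visitors: int, completed_purchases: int) -> float:
    """Return the conversion rate as a fraction between 0 and 1."""
    if visitors <= 0:
        raise ValueError("visitors must be > 0")
    return completed_purchases / visitors

# A hypothetical merchant with 10,000 visitors and 250 completed purchases:
rate = conversion_rate(10_000, 250)
print(f"{rate:.1%}")  # 2.5%
```

Every cardholder who abandons checkout because of an authentication hurdle lowers the numerator and thus the rate, which is why merchants watched 3D Secure 1.0 with suspicion.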
The variant of the 3D method then in use (version 1.0) could not resolve the underlying conflict: minimising the risk of payment default through chargebacks while at the same time allowing participating merchants to realise their full sales potential. When the European payment supervisors, with PSD2, demanded strong customer authentication for large parts of Europe’s card payment traffic, the schemes took pity on the merchants. The major credit card organisations (Visa, MasterCard, American Express and JCB) defined a new authentication standard, “3D Secure 2.0”, within their joint venture “EMVCo”, which today is largely responsible for the EMV standards. The former miracle cure was to be turned into a remedy that would eliminate the merchants’ suffering entirely and at the same time meet the regulatory requirements.
3D Secure 2.0 is also the card organisations’ answer to the PSD2 requirement of strong customer authentication, which must be implemented by September 2019. The new specification also ensures that the international schemes offer a consistent standard for consumers, merchants, issuers and acquirers.
In October 2016 the time had come: EMVCo published the specification of the new standard. Viewed from a helicopter perspective, the operational steps of the new method show no obvious major changes compared to the old one. The devil is, as always, in the details, and it is precisely these details that give hope that the 2.0 version is the cure. The new procedure defines different process steps for new (or at least modified) roles. The classic role best known to merchants from the old procedure, that of the Merchant Plug-In (MPI) operator, is explicitly no longer used in the new specification. It therefore remains to be seen how today’s MPI operators will position themselves with a technical solution in the 3D Secure 2.0 process (for example as a technical service provider operating a “3DS Server”).
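The division of roles can be sketched as follows: the merchant-side “3DS Server” sends an authentication request through the scheme’s Directory Server (DS), which routes it to the Access Control Server (ACS) of the issuing bank. The sketch below is a simplification under stated assumptions; the field names are illustrative and not the normative EMVCo message fields:

```python
# Hypothetical sketch of the 3D Secure 2.0 request path:
# 3DS Server (merchant side) -> Directory Server (scheme) -> ACS (issuer).
# Field names are illustrative, not the normative EMVCo names.
from dataclasses import dataclass

@dataclass
class AuthenticationRequest:
    message_version: str   # e.g. "2.0"
    card_number: str       # the cardholder's PAN
    purchase_amount: int   # in minor units, e.g. cents
    purchase_currency: str # ISO 4217 numeric code, e.g. "978" for EUR
    device_channel: str    # browser- or app-based checkout
    merchant_name: str

def route_to_acs(areq: AuthenticationRequest) -> str:
    """Sketch of the Directory Server's job: resolve the issuer's ACS.

    In reality the DS resolves the issuer from the card range (BIN)."""
    bin_range = areq.card_number[:6]
    return f"ACS for issuer BIN {bin_range}"

areq = AuthenticationRequest("2.0", "4111111111111111", 1_999,
                             "978", "browser", "Example Shop")
print(route_to_acs(areq))  # ACS for issuer BIN 411111
```

In this picture, the open question from the text is who operates the 3DS Server component: the merchant itself, its payment service provider, or a former MPI operator in a new guise.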
In addition, the product managers at EMVCo have integrated a new ingredient that reduces the payment abandonments of the old 3D process, or even stops them altogether: the so-called “Frictionless Flow”, which within the new standard allows an authentication without any additional interaction with the person being authenticated.
Now that the regulations of the two largest credit card organisations (VISA and MasterCard) have been adapted to the new 3D Secure 2.0 procedure with the autumn 2017 release, it is time to drive the implementation of 3D Secure 2.0 forward in the (partly new) operational instances.
However, to be able to use the new procedure, each participating entity must implement technical changes in its systems, since the procedure differs in several respects from the old authentication.
According to MasterCard’s current plan, all authentications should be carried out exclusively via the 3D Secure 2.0 standard by 01/01/2020 at the latest. Visa, however, has already postponed its April 2018 rollout (merchant-initiated authentications only) to April 2019. The timetable seems very ambitiously planned and will have to be confirmed by reality.
Crucial to the success, however, is the future use of the procedure by the e-commerce community, that is, the transaction volume of payment transactions authenticated via 3D Secure 2.0. Assuming that the “3D Secure machinery” (consisting of Access Control Server and Directory Server) is, or has to be, implemented according to the operational specifications and deadlines of the credit card organisations, it is once again the merchant who, as in the old procedure, can make or break the success of this innovation. And it is precisely the merchants who know the teething troubles of the old “miracle cure” from their own painful experience, and who should therefore show rather moderate interest in a renovation that is, from their point of view, imposed on them.
The acquirer, as a liable entity in the four-party model, must inevitably have an immense interest in the use of the new procedure, because only in this way can he comprehensively shift the liability in a chargeback case back to the issuer by means of the liability shift. For the acquirer to be able to use the new procedure effectively with the merchants connected to him, the problem of the conversion rate must be solved. Within the new standard, this problem can by definition only be eliminated if the majority of authenticated transactions are processed via the newly defined “Frictionless Flow”, in which an additional security query to the cardholder during authentication becomes superfluous. However, this “Frictionless Flow” presupposes that the merchant passes enough information about the cardholder and the transaction to be authorised along to the issuer in the authentication process, so that the issuer, based on its own risk assessment, “favourably” agrees to the authentication without any further query to the cardholder.
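The issuer-side decision described above can be sketched as a simple risk score. This is a hypothetical illustration: the features, weights and threshold are invented, and real ACS risk engines are proprietary, which is exactly why the frictionless share is so hard to predict.

```python
# Hypothetical sketch of the ACS decision behind the "Frictionless Flow":
# score the data the merchant passed along and either authenticate silently
# (frictionless) or fall back to a cardholder challenge.
# Features, weights and threshold are invented for illustration.
def acs_decision(risk_data: dict) -> str:
    score = 0
    if risk_data.get("known_device"):
        score += 40   # device seen before for this cardholder
    if risk_data.get("matching_billing_address"):
        score += 30   # shipping and billing addresses agree
    if risk_data.get("amount_eur", 0) < 100:
        score += 30   # low-value transaction
    # A sufficiently high score lets the issuer authenticate without
    # any additional interaction with the cardholder.
    return "frictionless" if score >= 70 else "challenge"

print(acs_decision({"known_device": True,
                    "matching_billing_address": True,
                    "amount_eur": 50}))    # frictionless
print(acs_decision({"known_device": False,
                    "amount_eur": 500}))   # challenge
```

The acquirer’s dilemma follows directly from this sketch: the threshold and the weights sit entirely on the issuer’s side, outside the acquirer’s control.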
It is therefore quite unclear what percentage of authentications will, at the end of the day, be processed in the “Frictionless Flow”. And this is precisely where the credit card organisations have left their acquirers out in the cold: on the one hand they make no binding stipulations to the issuers regarding their in-house risk assessment, and on the other hand they do not provide the acquirers with any support in promoting the use of the new standard.
Operationally, 3D Secure 2.0 brings many new features and is also well equipped for regulatory purposes. The status of a “facelift” can therefore safely be attested to this tool. However, if 3D Secure 2.0 is to trigger a “quantum leap in authentication”, and the new specification does provide the potential for that, further definitions or restrictions are needed to get rid of the old teething problems of the “conversion rate” for good.