Amid the ongoing pandemic crisis, demand has surged for online delivery of services wherever possible. Unfortunately, this greater dependence on digital platforms has been accompanied by a rise in financial fraud. This may be partly attributed to Indian consumers’ limited digital literacy and their inability to identify dubious user-interface (UI) features deployed by clever service providers to entrap them. This calls for the government to introduce regulatory safeguards in the digital finance ecosystem to prevent service providers from exploiting the cognitive limitations of vulnerable consumers through in-app tricks.
Until recently, it was common for online travel booking portals to bundle travel insurance with bookings. They used pre-checked consent boxes at the payment stage to default users into buying insurance cover, even when the user had no active interest in purchasing the product. To the surprise of many, the Insurance Regulatory and Development Authority of India (IRDAI) reacted against this practice and directed insurance companies to “ensure that any portal or App providing the travel insurance coverage shall not pre-select the option of buying the travel cover as a default option”. IRDAI raised concerns that such clever in-app defaults impede consumers’ “informed choice”. This regulatory intervention was the first of its kind in India, in which a regulator objected to the use of dark patterns in UI and brought to light the several concerns such a practice entails with regard to consumer protection in the digital economy.
Pre-checked boxes in digital interfaces are a classic example of a dark pattern. The term ‘dark pattern’ was coined by Harry Brignull, who defines them as interface designs that “trick users into doing things that they might not want to do, but which benefit the business in question”. In the example above, the use of pre-checked boxes assumes that the user’s default preference is to purchase an add-on financial service, even when that is not the case. The user is expected to be attentive enough to uncheck the box and opt out. But as the UK’s Financial Conduct Authority has submitted, consumers often use digital services under time constraints and are less likely to opt out or even be aware of existing defaults. Therefore, beset by their cognitive vulnerabilities, limited rationality and constraints on time and attention, a sizeable proportion of consumers end up buying services without informed consent.
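The mechanics of this default are simple enough to sketch. The toy TypeScript model below (all names hypothetical, purely for illustration) shows how the pre-checked box shifts the burden of action onto the user: an inattentive, time-pressed user who never touches the checkbox ends up purchasing the add-on whenever the default is “checked”.

```typescript
// Hypothetical model of a checkout flow. The only variable between a fair
// design and the dark pattern is the default state of the insurance checkbox.
interface CheckoutConfig {
  insurancePreChecked: boolean; // dark pattern when true
}

interface UserAction {
  toggledInsurance: boolean; // did the user notice and flip the checkbox?
}

// The final purchase decision is the default flipped by any user toggle
// (boolean XOR, written here as inequality).
function buysInsurance(config: CheckoutConfig, user: UserAction): boolean {
  return config.insurancePreChecked !== user.toggledInsurance;
}

// A time-pressed user who never touches the checkbox:
const inattentive: UserAction = { toggledInsurance: false };

console.log(buysInsurance({ insurancePreChecked: true }, inattentive));  // true: sold by default
console.log(buysInsurance({ insurancePreChecked: false }, inattentive)); // false: no purchase
```

The asymmetry is the point: under a pre-checked default, inaction equals consent, which is precisely what IRDAI’s directive reverses by requiring the unchecked state as the default.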
More worryingly, research suggests that individuals with lower levels of education and in urgent need of money make easy targets for financial service providers. This has significant implications for India, which is characterised by low incomes, low levels of digital literacy and a sizeable proportion of first-time internet users. Consumers who subscribe to financial services unintentionally and unknowingly can be severely exploited. For instance, the Reserve Bank of India (“RBI”) recently cited reports of individuals falling prey to dubious digital lending platforms that promised easy credit but covertly charged excessive interest rates and adopted high-handed, inappropriate recovery methods. The RBI highlighted that digital platforms are ‘misusing’ in-app digital agreements to access data on the mobile phones of vulnerable consumers and sending messages to their contacts, categorising them as delinquents and using social shaming to recover credit. These tactics of digital lending apps have resulted in harassed consumers dying by suicide, which has prompted the RBI to scrutinise lending platforms closely.
Interfaces built on dark patterns are manipulative, in sharp contrast to merely persuasive marketing efforts. The distinction between personalisation and manipulation has been at the heart of the policy issues that stem from the use of personal information in the supply of financial services. It is crucial that a legal framework be put in place to identify manipulative dark patterns in the context of consumers’ choices, value systems and socio-economic backgrounds.
Regulators with direct jurisdiction over dark patterns include the data protection authority, the consumer protection authority and the sectoral regulator(s) with jurisdiction over the provider using them. Simultaneous jurisdiction can lead to duplication of regulation or regulatory arbitrage. In the US, the Federal Trade Commission (“FTC”) is the federal body in charge of consumer protection; it regulates dark patterns by using its power to punish “unfair and deceptive practices” under section 5 of the Federal Trade Commission Act. FTC v AMG Capital Management is an important ruling on dark patterns: AMG, a payday lender, deployed dark patterns in its digital loan agreement to auto-renew (instead of close) expensive payday loans as a default option. The FTC held that AMG Capital Management had engaged in unfair and deceptive practices.
In India, dark patterns such as hidden costs could fall within the remit of section 11(ii) of the Consumer Protection Act 2019 (“CPA”), empowering the Central Consumer Protection Authority (“CCPA”) to redress such issues. But the absence of broad terms like “unfair and deceptive practices” in the CPA could significantly constrain the regulator’s effectiveness against other kinds of dark patterns. Therefore, consumer protection legislation could be amended to allow the CCPA to regulate dark patterns comprehensively. Further, as significant as regulation is, it is equally important to design appropriate regulatory tools for regulators to use.
- The CCPA should issue guidelines clarifying which market practices constitute manipulation and which constitute personalisation. To achieve this, the regulator must crowd in opinions from the community, engage in inclusive public consultations, conduct primary studies to gauge users’ privacy preferences and be transparent about its decision-making processes.
- The success of regulators in regulating dark patterns will depend on the language of the existing statute and its ability to include dark patterns as a ‘cause of action’. Accordingly, the existing consumer protection framework could be amended to expressly cover dark patterns.
- An amended legislative framework should enable regulators to audit digital interfaces for designs that foster deceptive or unsuitable selling, or that nudge users into sharing excessive personal information. Regulators could also publish model interfaces or guidelines to help providers design user-friendly interfaces.
- For cases in which intervention by more than one regulator is needed, comprehensive legislation defining a mechanism for inter-regulatory coordination should be enacted.