
COVID-19: FCA reflections – Algorithmic trading and market abuse

Posted on 17 April 2020

Who is responsible if an algorithm commits market abuse?

With headlines such as 'US stocks fall 12% in worst day since 1987', and the VIX index, the market's 'fear gauge', jumping to a record high on 16 March 2020, we are seeing the severe impact of COVID-19 on stock markets. In its report, Algorithmic Trading Compliance in Wholesale Markets (the 'Report'), published in February 2018, the FCA recognised that firms operating in wholesale markets are increasingly using algorithms for a number of purposes across their trading activity. In 2019, JPMorgan estimated that only about 10% of US equity trading was done by traditional investors. Given the volatility in the market and the widespread use of algorithms, has your firm considered the implications of its AI algorithm making the 'rational' choice to engage in market manipulation to maximise profits?

The FCA's concerns about the potential market abuse that an AI algorithm could cause are apparent from its recent Business Plans and communications. In light of COVID-19, the FCA has reiterated its expectation that 'firms should continue to take all steps to prevent market abuse risks…and [the FCA] will continue to monitor for market abuse and, if necessary, take action'. It is likely that the FCA has its eyes on certain actors in this area. Given current market conditions, this article encourages firms to act now to avoid the risk of being made an example of. It presents the FCA's thinking, possible approaches to an investigation, and points for reflection on risk mitigation.

What is algorithmic trading?

The FCA defines algorithmic trading as 'trading in financial instruments which meets the following conditions: (a) where a computer algorithm automatically determines individual parameters of orders such as whether to initiate the order, the timing, price or quantity of the order or how to manage the order after its submission and (b) there is limited or no human intervention.'
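
To make the definition concrete, here is a deliberately simple sketch in Python in which the code alone determines whether to initiate an order and at what price and quantity, with no human intervention. The strategy, names and thresholds are illustrative assumptions, not drawn from the FCA's definition or the Report.

```python
# Illustrative only: a hypothetical, minimal decision rule showing how an
# algorithm can automatically determine the order parameters named in the
# FCA definition (whether to initiate, price and quantity) with no human
# intervention. All names and thresholds are assumptions for illustration.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Order:
    side: str       # "buy" or "sell"
    price: float    # limit price, set by the algorithm
    quantity: int   # order size, set by the algorithm

def decide(prices: list[float], max_quantity: int = 100) -> Order | None:
    """Return an order, or None, based purely on recent prices."""
    if len(prices) < 20:
        return None                       # not enough data: do not initiate
    avg = mean(prices[-20:])
    deviation = (prices[-1] - avg) / avg  # distance from the 20-tick mean
    if abs(deviation) < 0.01:
        return None                       # within tolerance: do not initiate
    side = "buy" if deviation < 0 else "sell"
    # Size scales with the deviation, capped at a hard limit.
    quantity = min(max_quantity, int(abs(deviation) * 1000))
    return Order(side=side, price=prices[-1], quantity=quantity)

# The algorithm alone decides whether, at what price and in what size to trade.
history = [100.0] * 19 + [97.5]
print(decide(history))  # Order(side='buy', price=97.5, quantity=23)
```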

The FCA's thinking

Data ethics, and specifically algorithmic decision-making, was a cross-sector priority for the FCA in its Business Plan 2019-2020 and remains one in the recently published Business Plan 2020-2021, in which the FCA re-emphasises its interest in the interplay between technological developments and market abuse, and the importance of ensuring the safe, appropriate and ethical use of new technologies.

Building on this, in February 2019, Julia Hoggett, the FCA's Director of Market Oversight, said: "I can see a world where seemingly 'rational' AI, unconstrained and exposed to certain markets and data, would deem it entirely rational to commit market manipulation. Now, the FCA cannot prosecute a computer, but we can seek to prosecute the people who provided the governance over that computer."

Possible approaches to the FCA's investigation

Who might be the target of an FCA investigation? Perhaps the people with algorithmic trading as their certified function? This certified function encompasses those involved in the deployment of the trading algorithm and those with significant responsibility for ensuring it complies with the firm's obligations. Have the appropriate people been certified? Should this include people from the first, second and third lines of defence? Or should responsibility 'roll up' to the top, to the designated senior manager, especially where that person's statement of responsibility spells out that they have ultimate regulatory responsibility? Could it be the firm as a whole, given the diffusion in decision-making?

One intriguing question concerns the AI aspect of the algorithm. If it is artificially intelligent and capable of learning on its own, does it not, to an extent, have a will of its own? If it makes decisions autonomously, does this affect its creators' responsibilities? Does it amplify the responsibility of those overseeing it in the market? If it is 'off the shelf', can the authorised firm that purchased it escape responsibility? What if it has been modified by the authorised firm? These are some of the questions the FCA may have to untangle when, inevitably, it deals with such an investigation.

In its FCA Mission: Approach to Enforcement document dated April 2019, the FCA states that 'if it appears that individuals may be involved in the suspected serious misconduct of a firm, the FCA will investigate those individuals at the same time as it investigates the firm'. If the FCA were to detect 'serious misconduct' in such a case, it seems likely that the relevant senior individuals would be placed under investigation to determine where, if anywhere, responsibility lies. When investigating, the FCA may begin by considering statements of responsibility and management responsibilities maps.

Points for reflection: risk mitigation

Building on the Report's findings and wider thinking, firms may wish to consider the following when reflecting on this topic:

  1. Development and testing (D&T)
  • D&T framework: Is there a clear methodology for D&T to ensure that the algorithmic trading system behaves only as intended, complies with the firm's obligations and with the rules of the relevant trading venue(s), and does not contribute to disorderly trading? (A minimal sketch of the kind of pre-trade control such a framework might test appears after this list.)
  • Sign off: Has there been appropriate challenge by a range of objective, competent and informed parties prior to sign off? What will the FCA read into this about a firm's culture?
  • Documentation and audit trail: Is the documentation and audit trail sufficient throughout the D&T process to illustrate why decisions were made and how they were tested?
  2. Governance and oversight
  • Senior management: Can senior management articulate the rationale for decisions made? Do they have suitable information, for example management information (MI), to assess the situation in an informed way? Has a senior manager been designated specific responsibilities in their statement of responsibility? If so, how are they evidencing their decision-making?
  • Role of compliance: Is compliance able to identify and reduce algorithmic trading risks? From a functional perspective, has compliance been involved at all stages? From a technical understanding perspective, does compliance have the ability to constructively challenge?
  • Other functions: Which other functions have been involved in governance and oversight? What will the FCA read into this about a firm's culture?
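
To ground the D&T framework point above, the sketch below shows the kind of pre-trade control such a framework might be expected to test: hard limits, a price band and a kill switch that together aim to ensure the system behaves only as intended and does not contribute to disorderly trading. The class names, limits and checks are illustrative assumptions, not parameters mandated by the FCA or taken from the Report.

```python
# A hedged sketch of the kind of pre-trade control a D&T framework might
# test. The class names, limits and checks are illustrative assumptions;
# they are not parameters mandated by the FCA or taken from the Report.
from dataclasses import dataclass

@dataclass
class RiskLimits:
    max_order_value: float = 1_000_000.0  # hard cap on per-order value
    max_orders_per_window: int = 50       # throttle against disorderly trading
    price_band_pct: float = 0.05          # reject prices >5% from reference

class PreTradeGate:
    def __init__(self, limits: RiskLimits):
        self.limits = limits
        self.kill_switch = False          # human override: halt all trading
        self.sent_in_window = 0

    def check(self, price: float, quantity: int,
              reference_price: float) -> tuple[bool, str]:
        """Return (allowed, reason); rejections feed the audit trail."""
        if self.kill_switch:
            return False, "kill switch engaged"
        if self.sent_in_window >= self.limits.max_orders_per_window:
            return False, "message-rate throttle breached"
        if price * quantity > self.limits.max_order_value:
            return False, "order value exceeds limit"
        if abs(price - reference_price) > self.limits.price_band_pct * reference_price:
            return False, "price outside permitted band"
        self.sent_in_window += 1
        return True, "ok"

gate = PreTradeGate(RiskLimits())
print(gate.check(price=106.0, quantity=5_000, reference_price=100.0))
# (False, 'price outside permitted band')
```

A conformance test suite would assert both that each control rejects what it should and that every rejection leaves a record, which also supports the documentation and audit-trail point above.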

Conclusion

The FCA's concerns about the potential market abuse that an AI algorithm could cause are apparent, and its expectations in this area are explicit. In its recently published Business Plan 2020-2021, the FCA has emphasised its interest in the interplay between technological developments and market abuse. It is likely that the FCA has its eyes on certain actors in this area. Given current market conditions, firms should act now to avoid the risk of being made an example of.
