The Ontario Securities Commission (OSC) announced on October 10 the publication of its new report, Artificial Intelligence in Capital Markets: Exploring use cases in Ontario.
The report's authors call it an important first step in understanding what oversight, regulation or guidance might be appropriate to address the growing use of artificial intelligence (AI) in capital markets, and say it also aims to raise awareness of both the opportunities and the associated risks.
“We undertook this research to better understand how AI is being developed, tested and used by capital market participants in Ontario,” says Grant Vingoe, CEO of the OSC. “AI has the potential to impact processes and stakeholders throughout our capital markets and raises important questions about managing risk, governance and the potential for malicious use. As industry shifts towards wider adoption, collaboration among regulators, market participants and innovators is critical to support responsible innovation.” In a statement, the OSC adds that it wants to take a more proactive approach to technological innovation.
Jointly developed by the OSC and Ernst & Young LLP, the report examines use cases in efficiency, revenue generation and risk management, along with value drivers, challenges, AI governance and the role regulators can play.
“The disruptive nature of AI systems has raised important questions about the role of regulation and governance in managing risks as well as the potential for its malicious use,” the report states. “Regulators like the OSC are considering how oversight, regulation or guidance can facilitate responsible AI innovation and adoption in Canada.”
The report reveals that market participants are currently using AI to enhance existing products and services rather than to create new ones. AI adoption is described as being at an intermediate stage, with development concentrated among larger firms that have the resources to build AI models in-house. Firms are using AI to improve operational processes, strengthen trade surveillance and detection of market manipulation, and support customer service. Less developed areas of use include asset allocation and risk management. Challenges include data constraints, a lack of skilled labour and corporate culture.
“Data constraints can be a significant hurdle, whether from a lack of available data or changes in the distribution or characteristics of data over time (data shifts). Attracting and retaining AI talent can also be challenging due to competition from technology vendors. Market participants also need to face their own internal challenges in adapting operating models and culture to benefit from AI. Issues related to privacy, bias, fairness, explainability and interpretability need to be addressed when developing or procuring an AI system,” the authors state.
“Understanding these use cases provides a foundation for us to consider how to best support responsible innovation and adoption in Ontario’s markets, including the extent to which oversight, regulation or guidance can support this objective.”