Over the past year, FINRA’s Office of Financial Innovation held meetings with more than two dozen market participants, including broker-dealers, academics, technology vendors, and service providers, in order to better understand the use of Artificial Intelligence (“AI”) in the securities industry. This past June, FINRA issued a 20-page report which it described as an “initial contribution to an ongoing dialogue” about the use of AI in the securities industry. FINRA notes early in the report that it is not intended to express any legal position and does not create any new requirements or suggest any change in existing regulatory obligations. So the report is merely food for thought on the topic of AI in the securities industry.
The paper is broken down into three sections: (i) a description of the types of AI, (ii) an overview of how firms are using AI in their business, and (iii) the regulatory considerations surrounding AI. Here are some takeaways from sections (ii) and (iii).
AI Applications
FINRA notes that firms are using AI in many different ways including, among other things, bolstering compliance and surveillance functions, assisting in suitability determinations, automating operations, executing algorithmic trading strategies, managing portfolios, and communicating with customers.
Customer Communication
Remarkably, FINRA noted that at a few firms, virtual assistants are accepting and processing trade orders within certain thresholds, although FINRA did not elaborate on what those thresholds are. Some firms noted that they are experimenting with the development of virtual assistant applications available to customers through third-party platforms, such as Amazon’s Alexa, Google’s Home Assistant, and Apple’s Siri. FINRA warned, however, that these applications may also pose certain challenges and potential risks surrounding customer authentication, data privacy, cybersecurity, and recordkeeping.
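Although the report does not describe how those thresholds are enforced, the gating logic is conceptually simple. The Python sketch below is purely illustrative: the dollar and share caps, field names, and authentication check are assumptions, not anything FINRA or the firms disclosed.

```python
from dataclasses import dataclass

@dataclass
class VoiceOrder:
    symbol: str
    quantity: int
    notional: float      # dollar value of the order
    authenticated: bool  # e.g., voice print or PIN verification passed

# Illustrative caps only; the report does not disclose actual thresholds.
MAX_NOTIONAL = 5_000.00
MAX_QUANTITY = 100

def route_voice_order(order: VoiceOrder) -> str:
    """Accept small, authenticated voice orders; escalate everything else."""
    if not order.authenticated:
        return "rejected: customer authentication failed"
    if order.notional > MAX_NOTIONAL or order.quantity > MAX_QUANTITY:
        return "escalated: above threshold, route to a registered representative"
    # Within-threshold orders would be passed to the order management system,
    # with the full interaction archived to satisfy recordkeeping obligations.
    return "accepted: forwarded to order management system and archived"
```

Even a gate this simple highlights the issues FINRA flags: every branch needs authentication and an auditable record.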
Marketing and Suitability
Some firms are creating “holistic customer profiles,” using data from a variety of sources such as a customer’s phone and e-mail interactions, social media updates, web browsing on the firm’s website, spending patterns, and customer assets (held both at the firm and outside the firm). All of this information is analyzed using AI tools that can help a registered representative better tailor suggestions for new products and better understand what is suitable for an investor. FINRA noted that the firms it spoke with were all cautious about employing AI tools that may offer investment advice and recommendations directly to retail customers, citing several legal, regulatory, and reputational concerns.
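To make the idea of a “holistic customer profile” concrete, here is a minimal Python sketch of the data-aggregation step. The source names and fields are hypothetical; the report does not describe any particular data model.

```python
from typing import Any

def build_holistic_profile(customer_id: str,
                           sources: dict[str, dict[str, Any]]) -> dict[str, Any]:
    """Flatten data pulled from several internal and external feeds into one profile."""
    profile: dict[str, Any] = {"customer_id": customer_id}
    for source, fields in sources.items():
        for name, value in fields.items():
            profile[f"{source}.{name}"] = value
    return profile

profile = build_holistic_profile(
    "C-12345",
    {
        "crm": {"recent_calls": 3, "email_topics": ["retirement", "bonds"]},
        "web": {"pages_viewed": ["muni-bond-funds", "target-date-funds"]},
        "assets": {"held_at_firm": 250_000, "held_away": 400_000},
    },
)
# The flattened profile can then feed an AI model, or simply give a registered
# representative a fuller picture when assessing suitability.
```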
Customized Research
FINRA’s report noted a growing use of AI tools to provide curated market research directly to customers, sharing relevant information on various investment opportunities. For example, as noted in the earlier section, AI-based tools may offer customers social media data and related sentiment analysis on investment products and asset classes.
While these AI tools offer the potential to customize investment suggestions for customers, FINRA warned of potential concerns and challenges related to data privacy and the use of corrupt or misleading data. FINRA also warned that firms using AI still need to “adapt[] to each customer’s unique circumstances,” which seems to indicate that reliance on AI alone will not suffice to meet a firm’s suitability obligations.
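As a rough illustration of the sentiment analysis mentioned above, the sketch below scores a handful of social media posts against a simple word list. Real tools use far more sophisticated natural language models; the word lists and scoring scheme here are assumptions for demonstration only.

```python
POSITIVE = {"beat", "upgrade", "bullish", "growth", "strong"}
NEGATIVE = {"miss", "downgrade", "bearish", "lawsuit", "weak"}

def sentiment_score(posts: list[str]) -> float:
    """Return a score in [-1, 1] based on positive vs. negative word counts."""
    pos = neg = 0
    for post in posts:
        words = set(post.lower().split())
        pos += len(words & POSITIVE)
        neg += len(words & NEGATIVE)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

posts = ["Analysts upgrade XYZ on strong growth",
         "XYZ faces lawsuit over earnings miss"]
print(sentiment_score(posts))  # 0.2: slightly positive on this tiny sample
```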
Portfolio Management and Trading
Algorithmic trading and portfolio management are probably the most well-known uses of AI in the securities industry. The significant takeaway from this portion of the paper was FINRA’s warning that some AI models might not be sophisticated enough to produce reliable predictions in the face of natural disasters or unusual market volatility.
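One way to think about that warning: a model trained on calm markets may end up operating in conditions it has never seen. The hedged Python sketch below shows a simple regime check a firm might layer on top of a model; the two-times tolerance is an arbitrary, illustrative choice.

```python
import statistics

def realized_vol(daily_returns: list[float]) -> float:
    """Sample standard deviation of daily returns as a rough volatility proxy."""
    return statistics.stdev(daily_returns)

def within_trained_regime(live_returns: list[float], training_vol: float,
                          tolerance: float = 2.0) -> bool:
    """Trust the model only while live volatility stays within `tolerance`
    times the volatility observed in its training data."""
    return realized_vol(live_returns) <= tolerance * training_vol

# If this returns False (say, during a natural disaster or a market shock),
# the model's predictions should be discounted in favor of human oversight
# or a pre-defined fallback strategy.
```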
Compliance and Risk Management
According to one survey of 100 firms, 70% were using AI in their compliance and risk management functions. Firms are using AI for surveillance and are able to aggregate communications across various channels, such as emails, social media, and text messaging. Firms noted that these tools gave them the ability to move beyond a traditional lexicon-based review to a more risk-based review, such that they could decipher tone, slang, or code words, which may be indicative of potentially risky or non-compliant behavior. Firms are using AI surveillance to detect potential money laundering, terrorist financing, bribery, tax evasion, insider trading, market manipulation, and other fraudulent or illegal activities.
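The shift from lexicon-based to risk-based review can be pictured as layering a trained model on top of the old keyword list. In the sketch below, the classifier is a toy stand-in for whatever model a firm might actually train; its interface, scoring, and the escalation threshold are assumptions, not a real product or API.

```python
RISK_LEXICON = {"guarantee", "off the books", "delete this", "keep quiet"}

class KeywordDensityModel:
    """Toy stand-in for a trained classifier; real firms would use an ML model
    capable of picking up tone, slang, and code words."""
    def score_message(self, message: str) -> float:
        risky = sum(term in message.lower() for term in RISK_LEXICON)
        return min(1.0, 0.4 * risky)

def lexicon_hits(message: str) -> list[str]:
    """Traditional keyword review: flag any term from the lexicon."""
    text = message.lower()
    return [term for term in RISK_LEXICON if term in text]

def review_message(message: str, risk_model) -> dict:
    """Combine keyword hits with a model-based risk score."""
    hits = lexicon_hits(message)
    score = risk_model.score_message(message)  # assumed to return a 0-1 risk score
    return {
        "lexicon_hits": hits,
        "model_risk_score": score,
        "escalate": bool(hits) or score > 0.8,  # illustrative escalation rule
    }

print(review_message("Let's keep quiet about that trade", KeywordDensityModel()))
```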
FINRA noted that firms using AI for credit risk management, and particularly to assess the creditworthiness of counterparties, have unique concerns regarding what FINRA termed “discriminatory credit scoring based on biases present in the underlying historical data.” This is likely a reference to certain credit risk AI applications that have proven controversial because the resulting models declined to extend credit to minority applicants at rates well out of proportion to the population.
Regulatory Concerns
FINRA devoted the third section of its report to various regulatory concerns surrounding AI.
Supervision
The report notes that FINRA Rule 3110 regarding supervision still applies despite the use of technology. Firms are required to supervise their AI applications and will be held responsible for any failures that arise from a lack of supervision. In this regard, FINRA suggests, among other things, building a layer of human review of AI inputs to ensure accuracy and completeness, and testing to ensure that outputs are in line with business goals, internal policies, procedures, and risk appetite. FINRA warns that “explainability,” i.e., understanding why a machine has made a particular determination, may be particularly important in AI applications that have autonomous decision-making features (e.g., deep learning-based AI applications that trigger automated investment decision approvals). The vast majority of firms that FINRA spoke with stated that their machine learning applications do not involve autonomous action but are instead used to aid human decision-making.
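“Explainability” is easiest to see with a model simple enough to decompose by hand. In the hypothetical sketch below, a linear model’s score is broken into per-feature contributions; deep learning applications generally need dedicated techniques (such as SHAP or LIME) to produce comparable explanations. The feature names and weights are invented for illustration.

```python
# Hypothetical linear model: weights and features are illustrative only.
WEIGHTS = {"account_age_years": 0.3, "risk_tolerance": 1.2, "recent_losses": -0.9}
BIAS = -0.5

def explain_decision(features: dict[str, float]) -> dict[str, float]:
    """Return each feature's contribution to the model's raw score."""
    return {name: WEIGHTS[name] * value for name, value in features.items()}

features = {"account_age_years": 4, "risk_tolerance": 2, "recent_losses": 1}
contributions = explain_decision(features)
score = BIAS + sum(contributions.values())
print(contributions)  # shows which inputs pushed the decision up or down
print("approve" if score > 0 else "route to human review")
```

The point of the exercise is the supervisory one FINRA makes: a reviewer should be able to say why the system reached a particular outcome, not merely that it did.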
Data Bias
Machine learning-based AI relies on large amounts of data to inform decision-making. FINRA warns that the underlying data must be “complete current and accurate.” FINRA noted that inaccurate data may lead to “inappropriate decision making,” which could be a violation of FINRA Rule 2010, which requires firms, in the conduct of their business, to observe high standards of commercial honor and just and equitable principles of trade. It is unclear what “inappropriate decision making” FINRA is referring to here, but perhaps this is a veiled reference to the reports of AI models rejecting large numbers of minority loan applicants noted above.
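Basic data-quality and bias checks of the kind this passage suggests can be automated before data ever reaches a model. The sketch below is a hedged illustration: the field names, the staleness window, and the idea of comparing outcome rates across groups are assumptions about how a firm might operationalize “complete current and accurate” data and surface disparate results.

```python
from datetime import datetime, timedelta

def data_quality_report(records: list[dict], required_fields: list[str],
                        max_age_days: int = 30) -> dict:
    """Flag incomplete or stale records before they feed a model."""
    cutoff = datetime.now() - timedelta(days=max_age_days)
    incomplete = sum(
        1 for r in records if any(r.get(f) in (None, "") for f in required_fields)
    )
    stale = sum(1 for r in records if r.get("last_updated", cutoff) < cutoff)
    return {"total": len(records), "incomplete": incomplete, "stale": stale}

def outcome_rate_by_group(records: list[dict], group_field: str,
                          outcome_field: str) -> dict:
    """Compare favorable-outcome rates across groups to surface possible bias."""
    outcomes: dict[str, list[int]] = {}
    for r in records:
        outcomes.setdefault(r[group_field], []).append(1 if r[outcome_field] else 0)
    return {group: sum(v) / len(v) for group, v in outcomes.items()}
```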
Customer Privacy
As noted above, most AI applications rely on the collection of large amounts of data including information regarding customers. While mining this data can help firms provide better service to customers and suggest more suitable investment strategies, the collection and storage of this information poses distinct risks regarding the maintenance of customer privacy. FINRA advises firms to update customer consent forms to reflect any new information that is being gathered, such as biometric data or social media activity. FINRA also advises that firms review and update all of their policies and procedures with regard to customer privacy in light of the new technology being implemented.
Trading Applications
With regard to AI applications in the trading sphere, FINRA lists various rules that firms have been contending with for years and reminds firms that any AI application for trading is going to have to comply with these rules. FINRA cited Regulatory Notice 15-09, which stated, “[i]n addition to specific requirements imposed on trading activity, firms have a fundamental obligation generally to supervise their trading activity to ensure that the activity does not violate any applicable FINRA rule, provision of the federal securities laws or any rule thereunder.” As firms adopt AI algorithms and strategies in their trading functions, they benefit from reviewing and testing their supervisory controls to ensure continued compliance with applicable rules and regulations, including but not limited to FINRA Rules 5210 (Publication of Transactions and Quotations), 6140 (Other Trading Practices), and 2010 (Standards of Commercial Honor and Principles of Trade), the SEC Market Access Rule, and SEC Regulation NMS, Regulation SHO, and Regulation ATS.
Conclusion
In sum, it seems as if FINRA sees great promise in firms using AI to operate more efficiently and to provide better service to clients. On the other hand, it is clear that FINRA has a host of regulatory concerns, and that firms may be exposing themselves to regulatory risk through the use of AI.
Herskovits PLLC has a nationwide practice representing individuals and entities in FINRA disciplinary investigations and proceedings and FINRA arbitrations. Feel free to call us for a consultation at 212-897-5410.