Getting ready for tougher SEC scrutiny
Agency girds for more technical oversight and investigation, so financial institutions must anticipate heavier data demands
- Written by Mike Brodsky & Anthony DeSantis, Deloitte Financial Advisory Services LLP
“We need C++ programmers, algorithmic high-frequency trading people, people who worked on trading desks working on models . . . people with a level of technical and market expertise that's somewhat unprecedented.”
You could easily mistake that for a Wall Street corporate IT chief reeling off the company’s talent wish list. The reality, however, is quite something else. It’s the wish list of the Securities and Exchange Commission.
The words come from Gregg E. Berman, who was named associate director of the SEC’s Office of Analytics and Research in the Division of Trading and Markets in early 2013. Established in 2012, the office has both the mandate and the tools to assemble “an unprecedented aggregation of trading information data.” Berman oversees research and analysis that will help inform SEC policies on markets and market structure.
The high-powered technical team taking shape in the Office of Analytics and Research is supporting the agency’s drive to gain better insight into modern markets, where millions of dollars may move in millionths of a second. Such insight could lead to faster identification of troublesome or illegal behavior, as well as a better understanding of events such as mini-flash crashes.
The SEC’s increased analytics and research activity could raise the likelihood of more rigorous, real-time monitoring of markets and of financial services organizations' business operations and trading activities, as well as the potential for investigations, regulatory inquiries, and overall demands for company data. Companies can be better prepared for such possibilities by gaining greater understanding of their data and their IT systems from both business operations and compliance perspectives. Additionally, continuous monitoring of their trading activities can help demonstrate to the SEC that they not only have a rigorous compliance program, but police it regularly to confirm its effectiveness. That can carry weight with the SEC should an investigation be launched.
Applying the MIDAS touch
Employing its new Market Information Data Analytics System (MIDAS), the SEC’s growing team of technical and market specialists can capture and analyze data on all orders posted on the national exchanges, as well as all modifications and cancellations of those orders, all executions of those orders, and all off-exchange executions. This analysis could help provide a more holistic view of organizations’ activities, including possible issues associated with trading or market timing.
Among transactions likely to receive scrutiny are trades in complex instruments, such as mortgage-backed securities, which may not trade openly in active markets. Large financial institutions may also be a particular focus because of their role in the 2008 financial crisis.
Regulators other than the SEC may also want data residing in enterprise systems. For example, a claimed violation of wage and hour laws might lead to requests from state and federal labor agencies for hourly wage data for thousands of employees.
Potential compliance problems
The availability of so much data in the hands of regulators who have sophisticated analytics and research capabilities could lead to a significant increase in overall disclosure requirements and, of perhaps greater concern, more short-notice requests for information about an organization's activities. For example, the SEC and other government agencies might initiate investigations into how organizations record and account for certain trades and other activities.
For some organizations, such requests could be difficult to fulfill for several possible reasons:
Data control issues. Various factors can affect the quality of data available to meet discovery demands. For example, an organization may receive a regulatory request for the entire population of a certain type of transaction or transaction series. However, a lack of adequate data controls, data standards, and integration between systems may result in certain people or divisions within the company using different data fields, or populating fields with data types and attributes that others don’t use. Or data may simply be “lost” as it flows from one system to the next. Queries of this data universe may produce robust information in some instances, and far less helpful data in others.
Here’s an example: the “free text field,” which allows inclusion of notes and commentary related to a dataset. First, certain areas of a financial institution may actively use the free text field as a helpful tool to capture pertinent information about a trade, business transaction, or the intent of a transaction. However, other areas may not use the field, or may characterize transactions differently. Second, the absence of standardized values for free text can create problems in accurately diagnosing or reporting on the situations and issues authorities ask about. This may lead authorities to question why different parts of the business can’t communicate and provide consistent data. And, third, free text fields can hold clues provided by employees who enter comments that, intentionally or unintentionally, point to trading problems before or after the fact. These fields should be monitored regularly for potential red flags.
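Monitoring free text fields for red flags can be partly automated. The sketch below is illustrative only: the record layout, the watch-list terms, and the scan_free_text helper are all hypothetical, and a real surveillance program would use a watch list tailored to the business and its products.

```python
import re

# Hypothetical red-flag terms; a real watch list would be tuned to the business.
RED_FLAGS = [r"\bhide\b", r"\bpark(ed|ing)?\b", r"\boff the books\b", r"\bdo not report\b"]
PATTERN = re.compile("|".join(RED_FLAGS), re.IGNORECASE)

def scan_free_text(records):
    """Return (record_id, comment) pairs whose free-text comment matches a red-flag term."""
    hits = []
    for record_id, comment in records:
        if comment and PATTERN.search(comment):
            hits.append((record_id, comment))
    return hits

trades = [
    ("T-1001", "Client requested early settlement"),
    ("T-1002", "Parked this position until quarter end"),
    ("T-1003", None),  # field unused by this desk -- itself a consistency finding
]
print(scan_free_text(trades))
# [('T-1002', 'Parked this position until quarter end')]
```

A scan like this is only a first pass; flagged comments still need human review, and empty fields (as in the third record) are themselves evidence of inconsistent usage across desks.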
Inability to recreate actual transactions. As part of their inquiries, regulators may request to see the full life cycles of the transactions they are interested in—from the buy to the sell and the ultimate accounting of each trade. This often means that organizations need to recreate actual transactions in ways their systems were not designed to support. While explainable, such barriers to compliance may hold little sway with authorities.
Order processing-related issues. Problems can arise simply from not understanding how a transaction is routed through the organization. Once a trade is entered on a terminal, the associated data passes from the front office where the trade took place, to the middle office for recording and trade execution, and then on to the back office for accounting.
It’s not uncommon for data inconsistencies to arise or data not to be passed from one system to the next, as the system interactions in the course of this journey may have different intended purposes than that of the regulatory request, making it difficult to explain exactly what took place. Moreover, the systems for order entry, processing, and recording transactions may be different, and the types of data from each may vary depending on business requirements.
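A basic reconciliation across those systems can surface such gaps before a regulator does. The following is a minimal sketch, assuming each office's system can export the trade IDs it holds; the office names, the reconcile helper, and the sample IDs are hypothetical.

```python
def reconcile(front_ids, middle_ids, back_ids):
    """Flag trades that fail to appear in every system along the processing chain."""
    all_ids = front_ids | middle_ids | back_ids
    gaps = {}
    for tid in sorted(all_ids):
        missing = [name for name, ids in
                   (("front", front_ids), ("middle", middle_ids), ("back", back_ids))
                   if tid not in ids]
        if missing:
            gaps[tid] = missing
    return gaps

front  = {"T-1", "T-2", "T-3"}
middle = {"T-1", "T-2"}          # T-3 never reached the trade-execution records
back   = {"T-1", "T-3"}          # T-2 was dropped before accounting
print(reconcile(front, middle, back))
# {'T-2': ['back'], 'T-3': ['middle']}
```

Running a check like this on a schedule, and keeping the results, helps an organization explain exactly where in the front-to-back journey a transaction's data diverged.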
M&A and divestiture complications. An organization that has grown through acquisitions and mergers may have difficulty responding to information requests simply because its systems can’t talk to one another. Or an organization could have spun off a division whose transactions had been coded only in the name of the parent company. Regulators could request transaction information related to the spun-off company, but the organization’s data records may not be granular enough to determine which part of the institution conducted those transactions. As a result, considerable time and effort could be required to figure out exactly what operations were involved and provide the required information.
Operational and system flaws. Some organizations may have systems that were modified repeatedly to adapt to new needs. Others may not have anticipated the regulatory purposes their systems would need to serve. For example, organizations have faced penalties for underpayment to exchanges because their systems captured all transaction participants as “customers,” rather than assigning them to categories such as market maker or broker.
Addressing the issues
Ignorance of legal and regulatory requirements is no longer a defense, and the belief that regulators go after only the big cases is invalid, as recent insider trading cases demonstrate. Financial services companies facing potentially greater scrutiny because of the work of the SEC’s Office of Analytics and Research can benefit from:
Assessing data in advance. Examine the availability, content, and depth of data. Look for instances where similar types of data in various silos can lead to inconsistencies in data treatment across the company. Even if your organization has confidence in the completeness and accuracy of the numbers themselves, it’s important that the data ultimately reported and the underlying data fields be consistent. As you assess your data, consider whether disparate data can be centralized to aid consistency and speed responses to regulator requests.
Taking inventory of and knowing systems. Understanding what systems you have, their capabilities, their potential weaknesses, how they’re utilized, and how they communicate with one another can help you respond more quickly, effectively, and completely to discovery and regulatory requirements.
Testing controls. When issues such as exchange fees or transaction accounting surface in news accounts or elsewhere, it can be a good time to get ahead of the game. Examine your data and related controls to determine whether you face similar exposures. Ask, could we respond to a request to produce such information? Then monitor these controls regularly and record the results so you can demonstrate to investigators, if necessary, that you both have controls in place and test them consistently for effectiveness on an ongoing basis.
Having the right people at the table. An institution facing a regulatory request or inquiry may bring only the business side to the table and look to the IT function merely to provide extracts and incremental information. Instead, it is important to involve not only people on the business side who understand the bigger picture and can help in strategizing beyond the immediate requirements, but also the IT personnel who can explain what data is available, what systems should be reviewed, and what the challenges may be. Making IT personnel aware of the issue or purpose behind the request may allow them to suggest alternative strategies or fulfill the request more completely, based on their understanding of the systems and their capabilities.
Being ready on Day One. If your organization engages in a merger or acquisition, you often assume liability for the accuracy and completeness of the acquired company’s data immediately upon closure of the deal. Everything the company has done in the past often becomes your responsibility, so know what you’re acquiring in terms of data and systems, test existing controls, and put an integration plan in place. Also, involving people in the transition who understand the interplay of data between systems, as well as the fundamentals of data archiving and data warehousing, can help make data retrievable and usable in the future. At the very least, assessing and inventorying the data environment is important to being able to respond to future inquiries.
Getting help when needed. Meeting regulatory demands requires a reasoned, timely, and thorough response. Organizations that lack the experience and resources to mount one can benefit from obtaining outside support for the process.
Staying ahead in a more demanding world
Regulators are assembling the resources and expertise to gain an unprecedented view of the data flow that powers today’s financial markets. Market participants, and the data they’ll be expected to provide, will be key resources in the government’s efforts.
Organizations that understand regulator expectations, carefully assess their own data profiles and needs, and implement processes, controls, and mechanisms for ongoing monitoring that equip them to meet compliance requirements can be better positioned for an era of more intense oversight and scrutiny.
About the authors
Mike Brodsky is a director at Deloitte Financial Advisory Services LLP, [email protected]. Anthony DeSantis is a principal at the firm, [email protected].