Pre-emptive Financial Markets Regulation – next step for Big Data.
By Morgan Deane, member of the Board and International Head of Legal & Compliance for the Helvea-Baader Bank Group.
Some months ago, I commented that Big Data had significant potential as a prevention mechanism in financial markets regulation, and that it should be viewed this way rather than as something that only poses regulatory challenges.
Then, in April, JPMorgan announced that it would be rolling out an algorithmic surveillance tool designed to monitor all communications of staff within certain of its higher-risk units. It explained that the system seeks to identify potential rogue employees in a predictive fashion, using information fed into the system, such as personal trading breaches, limit breaches, and failures to attend compliance classes, to help predict behaviour.
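The mechanics of such a predictive system can be illustrated with a deliberately simplified sketch. The feature names, weights, and threshold below are invented for illustration and bear no relation to JPMorgan's actual tool; the point is only that behavioural signals of the kind listed above can be combined into a score that triggers review before misconduct occurs.

```python
# Hypothetical sketch of a pre-emptive risk score: each behavioural
# signal contributes a weight, and an employee whose total exceeds a
# threshold is flagged for compliance review. All feature names,
# weights, and the threshold are invented for illustration.

WEIGHTS = {
    "personal_trading_breaches": 3.0,
    "limit_breaches": 2.5,
    "missed_compliance_classes": 1.0,
}

def risk_score(record: dict) -> float:
    """Weighted sum of the behavioural signals in an employee record."""
    return sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)

def flag_for_review(record: dict, threshold: float = 5.0) -> bool:
    """Flag the employee when the accumulated score crosses the threshold."""
    return risk_score(record) >= threshold

employee = {
    "personal_trading_breaches": 1,
    "limit_breaches": 1,
    "missed_compliance_classes": 0,
}
print(risk_score(employee))       # 5.5
print(flag_for_review(employee))  # True
```

A real system would of course learn its weights from historical cases rather than fix them by hand, but even this toy version shows where the controversy lies: the flag fires on probability, not on proven wrongdoing.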
The announcement kicked off an interesting round of discussions in the marketplace, as it highlighted the potential of utilizing data in an effort to control and regulate. Once again, though, high-volume data processing brought an air of trepidation, as people highlighted the perils of pre-emptive disciplinary measures, likening the initiative to the movie ‘Minority Report’.
Like all attempts to manipulate the huge amounts of data available to banks, there is no silver bullet. It goes without saying that there needs to be a carefully thought out approach to how tools such as this are used. But we must acknowledge, if not applaud, the efforts of banks such as JPMorgan to embrace such initiatives as being the future of regulation.
When regulatory sanction arises as the result of the actions of a small number of individuals, then it is logical to seek a way to identify such problems in advance. And there is no doubt that the activity of employees on their workstations each day can give a reasonably accurate character profile.
In my previous article, I noted that the existing methods of tracking employee electronic activity are quite naïve: they rest on the assumption that an employee engaged in insider trading or excessive risk-taking will be careless enough to use the typical keywords and phrases that a traditional system can pick up. When Bloomberg announced the story regarding JPMorgan, the article referenced the views of Tim Estes, Chief Executive Officer of Digital Reasoning Systems Inc. He put the power of the surveillance tools now available into perspective by noting that technology built for counter-terrorism surveillance can be used by financial institutions to analyse human language, because language, he believes, is where a person’s intentions are revealed.
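The naivety of the traditional approach is easy to demonstrate with a toy keyword matcher. The suspicious phrases below are hypothetical; the weakness is structural, since a careful rogue employee simply never uses the listed phrases, and the system stays silent.

```python
# Toy version of traditional keyword-based surveillance: it fires only
# when a message contains one of a fixed list of suspicious phrases.
# The phrases are invented for illustration.

SUSPICIOUS_PHRASES = [
    "inside information",
    "don't tell compliance",
    "delete this email",
]

def flag_message(text: str) -> bool:
    """Return True if any watched phrase appears in the message."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in SUSPICIOUS_PHRASES)

print(flag_message("I have inside information on the merger"))        # True
print(flag_message("Heard something useful before the announcement")) # False
```

The second message is arguably the more suspicious of the two, yet the matcher misses it entirely; that gap is exactly what language-analysis tools of the kind Estes describes are intended to close.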
This is a major step forward, in my opinion. We have stepped beyond the previous limitations of obtaining the data and figuring out what we can do with it. The boundaries of what can be done with that data are constantly being tested and it is the power of the solutions available which is unsettling for many people.
The recurring fear for many is that their electronic conduct can now be analysed as a ‘big picture’. It is unlikely that concern for data privacy is what has caused this reaction. It has long been established that employees have no expectation of privacy when working in certain functions in the financial industry. It seems employees accepted this fact because they knew the sheer volume of data they create each day, on a variety of different platforms, would never be pieced together. But now it can be.
It is hard to argue against the fact that, taken over a reasonable period of time, all of a person’s electronic activity does give a fairly honest picture of who that person really is. One only needs to look at any legal proceeding which involves the review and discussion of email traffic within a firm under investigation. Emails which appeared harmless at the time can read vastly differently in the cold light of day years later. One can only assume that an algorithm capable of deciphering the interactions of people on multiple platforms in real time will lead to some interesting findings.
So where will this lead? I expect some interesting debates to arise on whether data privacy should be sacrificed for the sake of regulation. It will be very interesting to see if, and when, employment litigation arises from an employee being dismissed on the basis of pre-emptive surveillance results. It will put much greater focus on human resources professionals, who will be required to justify decisions based on probability and possibility.
It also poses an interesting conundrum for the general public. The scandals uncovered in the financial industry have led the court of public opinion to support anything which more tightly controls and regulates those in the financial industry.
But the general public may find this technology rather unsettling too. It will cause them to think about how such technology is being used elsewhere and whether they need to consider their own conduct. If nothing else, it will make people stop and think about how much data they share on social media every day. Could the day come when potential employers scour social media channels for an interviewee’s contributions and run them through an algorithm to determine suitability?
One thing is for sure: the ground is moving in respect of what big data can do. Where formerly people concerned themselves with their privacy and what should not be made public, they may now need to consider how their ‘public’ data is perceived, and whether it could bring about very different legal consequences for them in the future.