As the SVP of Products, Solutions and Innovation at Kx Systems, James Corcoran is part of a new chapter in software development at Kx. After joining Kx parent First Derivatives as a financial engineer in 2009, James worked around the world building enterprise systems at top global investment banks before moving to the Kx product team in London. He sat down with us recently to discuss his perspective on product design and our technology strategy for the future.
Q: You have had one of the most diverse careers at Kx, working with clients to build trading and analytics systems across the equities, fixed income and foreign exchange asset classes, as well as with clients outside the investment banking industry. What trends have you seen?
A: Data volumes today are vastly larger than what we were dealing with five, six or seven years ago. This continued growth in the amount of data being generated holds true within our clients’ organizations and across industries in general.
We are also capturing increasingly diverse datasets. In the first part of my career, the applications I was helping to build were usually narrow in scope, defined to solve specific problems. Today, as our clients increasingly adopt data-driven cultures, our work is more focused on building Big Data platforms that feed real-time analytics to multiple downstream applications.
Q: How has the evolution of computer memory and data storage technology impacted application design?
A: These days we can keep more data in-memory than ever before. This means we can have bigger datasets representing real-time views of the world, and we have more choices in how we scale systems both vertically and horizontally. With more performant storage technologies, we can also run simulations and backtest our models against more and more historical data.
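To make the backtesting point concrete, here is a minimal sketch, assuming Python with pandas and NumPy rather than q (which the interview does not show): a toy moving-average crossover backtest run against an in-memory series of synthetic prices. The strategy, data and thresholds are illustrative assumptions, not anything from the Kx stack.

```python
# Illustrative only: a toy in-memory backtest of a moving-average crossover
# strategy over synthetic prices. The Series stands in for the kind of
# historical dataset described above; the strategy and parameters are
# hypothetical, not Kx internals.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)
prices = pd.Series(
    100 + np.cumsum(rng.normal(0, 1, 5_000)),            # synthetic price path
    index=pd.date_range("2017-01-01", periods=5_000, freq="min"),
    name="price",
)

fast = prices.rolling(20).mean()                          # short-term average
slow = prices.rolling(200).mean()                         # long-term average
position = (fast > slow).astype(int).shift(1).fillna(0)   # long when fast > slow

returns = prices.pct_change().fillna(0)
strategy_returns = position * returns                     # returns while in position
print("cumulative strategy return:", (1 + strategy_returns).prod() - 1)
```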
Q: Kx’s core technology was created more than two decades ago anticipating vast increases in data volumes, beyond what computers at the time could handle. How has that original assumption impacted your work?
A: That original vision has allowed us to build future-proof systems capable of handling data volumes greater than anything observable today. Our systems scale from single-node applications to globally distributed architectures, and some of our clients run workflows across tens of thousands of cores in the cloud. Knowing that kdb+ can natively take advantage of future advances in computing power allows us to design our systems for massive scale.
Q: There have been several notable events that have rocked the global financial system since you started your career. How have you experienced them as a Kx software engineer working on client sites?
A: As a Kx engineer, I’ve had the opportunity to work on some of the busiest trading desks in the world. Throughout this time, I’ve been involved in markets during major flash crashes caused by rogue trading algorithms, as well as during unforeseen geopolitical events. Whilst each event is different, we usually see a surge in volatility, which leads to huge increases in the throughput of data flowing through capital markets systems. We regularly stress test our systems so that we are confident we can continue to operate under these conditions, which in turn allows our clients to continue to make markets and provide services to their customers.
Q: How does your experience building systems for clients impact your role in product and solution design?
A: As a technology, Kx has been stress tested in the most demanding environments, and my time spent with customers in the field has taught me the importance of ensuring our mission-critical systems are resilient enough to withstand black swan events.
The expertise we’ve gained from years of engineering fast analytics systems for huge quantities of real-time data is also incredibly valuable as we deploy Kx into increasingly diverse and data-hungry industries such as Formula One racing, space exploration, retail and telco.
Q: Since the financial crash in 2008, the financial services industry has entered a period of increased regulation. How has that impacted what you do?
A: Kx has long been the time-series and analytics engine of choice in front-office electronic trading environments, helping banks to automate their pricing, trading and hedging businesses. But we’ve also been providing surveillance and regulatory solutions to the capital markets industry for a number of years. Whilst the use cases vary, ultimately these systems have similar requirements when it comes to the datasets, the need for high-speed analytics, scalability, resilience, and the integrity and security of data.
Our surveillance solution contains a library of models designed to detect attempts to manipulate markets, whilst our MiFID II solution contains a comprehensive suite of pre- and post-trade transparency reports. We are also building solutions for the forthcoming Consolidated Audit Trail regulation in the US, as well as the Securities Financing Transactions Regulation in Europe, both of which are challenging the industry with complex data lineage requirements. Kx is a natural technology choice when it comes to implementing systems for these regulatory regimes, since it handles streaming analytics alongside real-time and batch-based processing in a single platform.
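As a very rough illustration of what one such check might look like, here is a sketch in Python/pandas under assumed field names and thresholds; it is not one of the models in the Kx surveillance library, just a simplified stand-in for the idea.

```python
# Illustrative only: a highly simplified surveillance-style check that flags
# accounts cancelling an unusually high share of the orders they place -- a
# toy stand-in for one class of manipulation pattern. Field names, data and
# the threshold are hypothetical; real surveillance models are far richer.
import pandas as pd

orders = pd.DataFrame(
    {
        "time": pd.to_datetime(
            ["09:30:01", "09:30:02", "09:30:03", "09:30:04", "09:30:05", "09:30:06"]
        ),
        "account": ["A", "A", "A", "B", "A", "B"],
        "event": ["new", "cancel", "new", "new", "cancel", "fill"],
    }
)

# Count order events per account, then compute a cancel-to-new ratio.
counts = pd.crosstab(orders["account"], orders["event"])
cancel_ratio = counts.get("cancel", 0) / counts.get("new", 1)

# Flag accounts whose ratio exceeds an (arbitrary) threshold.
flagged = cancel_ratio[cancel_ratio > 0.5]
print(flagged)
```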
Q: What have been the key considerations in defining products and solutions in the Kx software suite?
A: Our core product is kdb+, the time-series database and analytics engine, and a key consideration has been keeping it general-purpose in nature. This strategy gives our clients maximum flexibility, and it also allows us to create products in new markets relatively quickly. As our software solutions have matured, we recognised the need to build tools and platforms to automate many of the tasks associated with capturing, normalising, cleansing and distributing large volumes of real-time and historical data. When coupled with our suite of data science and visualisation products, we can create powerful solutions for our clients without reinventing the wheel each time.
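For a flavour of the kind of normalising and cleansing step referred to above, here is a minimal sketch, assuming Python/pandas and hypothetical column names and rules; the actual platform tooling is not shown here.

```python
# Illustrative only: a tiny batch normalisation/cleansing step for tick data.
# The rules (UTC timestamps, dedupe, sort, forward-fill) and columns are
# assumptions for the sketch, not the Kx platform's actual pipeline.
import pandas as pd

raw = pd.DataFrame(
    {
        "time": ["2018-03-01T09:30:00Z", "2018-03-01T09:30:00Z", "2018-03-01T09:30:02Z"],
        "sym": ["ABC", "ABC", "ABC"],
        "price": [101.2, 101.2, None],
        "size": [100, 100, 250],
    }
)

clean = (
    raw.assign(time=pd.to_datetime(raw["time"], utc=True))  # normalise timestamps to UTC
       .drop_duplicates()                                    # remove exact duplicate ticks
       .sort_values(["sym", "time"])                         # order per symbol by time
)
clean["price"] = clean.groupby("sym")["price"].ffill()       # carry last known price forward
print(clean)
```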
Q: What do you think will keep Kx technology on the cutting edge in the future?
A: It is important that we stay true to our commitment to a speed and performance advantage for managing big, fast data streams, and that we continue to deliver that advantage within a very small software footprint.
We are also now investing heavily in machine learning, having recently set up a research team in London, where we partner with leading academics in this space. We plan to implement many machine learning algorithms natively in Kx, and we’re also providing tight bindings to best-of-breed libraries such as TensorFlow for deep learning. As well as open-sourcing our work under this initiative, we plan to integrate machine learning techniques into our product portfolio this year.
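As an illustration of the kind of deep-learning workload being described, the following is a minimal, generic TensorFlow sketch that fits a small network to sliding-window features of a synthetic series. It uses plain Python, does not show the Kx bindings themselves, and all shapes and hyperparameters are arbitrary assumptions.

```python
# Illustrative only: fit a small dense network to predict the next value of a
# synthetic series from a sliding window of past values. Generic TensorFlow,
# not the Kx machine learning toolkit or its TensorFlow bindings.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 50, 2_000)) + rng.normal(0, 0.1, 2_000)

window = 20
X = np.stack([series[i : i + window] for i in range(len(series) - window)])
y = series[window:]                                   # target: the next value

model = tf.keras.Sequential(
    [
        tf.keras.Input(shape=(window,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1),
    ]
)
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

print("in-sample MSE:", model.evaluate(X, y, verbose=0))
```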
When it comes to the cutting edge, we’re working with our cloud partners to ensure we take advantage of the latest innovations in cloud technologies so that we can present new architectures and deployment options to our clients. And of course we always like to stay abreast of emerging technologies such as blockchain.