
Technical Whitepaper

An Introduction to Neural Networks with kdb+

Author:

James Neill works as a kdb+ consultant for one of the world’s largest investment banks, developing a range of applications. James has also been involved in the design of training courses in data science and machine learning as part of the First Derivatives training programme.

1 INTRODUCTION

Driven by the desire to understand the brain and to mimic the way it works by creating machines that learn, researchers have studied neural networks with great interest for many decades. A simple mathematical model of the neuron was first presented by Warren McCulloch and Walter Pitts in 1943.
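As a rough illustration (not taken from the paper), the McCulloch-Pitts model reduces a neuron to a weighted sum of inputs compared against a threshold. A minimal q sketch of that idea, with hypothetical weights and threshold, might look as follows:

/ sketch of a McCulloch-Pitts-style neuron (illustrative only)
/ the neuron fires (1b) when the weighted sum of its inputs reaches the threshold t
neuron:{[w;t;x]t<=sum w*x}

neuron[1 1f;2f] each (1 1f;1 0f;0 0f)   / 100b: fires only when both inputs are active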

With modern advances in technology and the computational power now available, this field of research has had major implications for how computers can be used to ‘think’ and to learn to solve problems given an appropriate algorithm. A few interesting examples of this research in action include:

- Aiding in medical diagnostics
- Interpreting art and painting images (http://arxiv.org/pdf/1508.06576v1.pdf)
- Performing stock market predictions

A number of different algorithms have been developed in this field of research; this paper focuses on the implementation of a feedforward neural network in kdb+. A feedforward network (also known as a multi-layer perceptron) is a type of supervised machine learning algorithm in which a series of nonlinear functions is layered together to produce an output. It can be used for classification or regression and has been shown to be a universal approximator: an algorithm that can model any smooth function given enough hidden units [1].

The design of feedforward networks can be represented through operations on matrices and vectors. Array programming languages such as q/kdb+ are well suited to computational implementations in this form, thanks to their vectorised operations on lists.
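To make this concrete, the following is a minimal sketch, not the paper’s implementation, of a single forward pass through a small 3-2-1 network in q, using mmu for the matrix-vector products, a logistic activation, and hypothetical weights:

sigmoid:{1%1+exp neg x}           / logistic activation, applied itemwise to a list
layer:{[w;x]sigmoid w mmu x}      / one layer: weighted sums followed by the nonlinearity

w1:2 3#0.1 0.2 0.3 0.4 0.5 0.6    / hypothetical hidden-layer weights (2 units, 3 inputs)
w2:1 2#0.7 0.8                    / hypothetical output-layer weights (1 unit, 2 hidden units)

ffn:{[x]layer[w2]layer[w1;x]}     / compose the layers: hidden layer, then output layer
ffn 1.0 0.5 -0.5                  / forward pass on a sample input vector

Because mmu operates on whole matrices and the activation applies itemwise across lists, each layer is a single vectorised expression rather than an explicit loop over units.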

All tests were run using kdb+ version 3.2 (2015.05.07).

[1] Kurt Hornik, “Approximation Capabilities of Multilayer Feedforward Networks”, Neural Networks, Vol. 4, pp. 251-257, 1991.
