When and where will this year’s summit take place?
The In-Memory Computing Summit North America will take place in Silicon Valley at the Hyatt San Francisco Airport, November 13th-14th.
What is the theme of this year’s event?
The theme of this year’s event is the role of in-memory computing in the digital transformation of enterprises. Within that overarching theme, sessions will cover a range of in-memory computing (IMC) technology and use case topics and provide practical advice. Speakers will share the latest advancements in in-memory computing architectures and the future direction of the technology, address the latest developments in streaming data use cases and technology, and discuss lessons learned implementing successful in-memory computing projects. We’re also excited this year to have a keynote about digital transformation and in-memory computing as it relates to companies utilizing mainframe computers for financial services use cases. The conference is the perfect forum for technical decision-makers, business decision-makers, architects, CTOs, developers, and others who want to learn how to leverage in-memory computing, Big Data, Fast Data, IoT, and HPC solutions to transform their business operations.
What stood out for you from last year’s In-Memory Computing Summit?
Last year, the North America Summit provided valuable insight into how digital transformations powered by in-memory computing are radically altering enterprise business models and impacting customer experiences. IMC experts and practitioners shared their experiences and offered practical advice based on how they use in-memory computing to accelerate their own organizations’ digital transformation initiatives. Nearly 450 people representing 227 organizations from 10 countries across 4 continents and 19 U.S. states registered to listen to speakers from American Airlines, ING, Intel, Oracle, Salesforce.com, Huawei, SNIA, Wellington Management, GridGain Systems and many more. This year, we anticipate similarly enthusiastic attendees and the speaker roster is equally impressive.
What are you most looking forward to about the 2019 North America conference?
I am most looking forward to seeing how the in-memory computing landscape has evolved over the past twelve months. In-memory computing is experiencing massive growth and adoption, especially among companies outside financial services. While we will hear about continued rapid growth in financial services and even get a glimpse into emerging mainframe-centric use cases, we will hear even more success stories from other industries, such as the keynote by 24 Hour Fitness. We’ll also hear about the major change in memory technology that happened this year with the general availability release of Intel Optane. The past year has witnessed some fascinating changes in the in-memory computing landscape!
What experts are going to speak at this year’s event?
Keynote and conference breakout session speakers will include representatives from 24 Hour Fitness, AT&T, Intel, Oracle, IBM, The Storage Networking Industry Association (SNIA), and more. Some sample breakout sessions include:
- A Simple, Responsive In-Memory Architecture for Data Ingestion and Analytics, Using Apache NiFi and Apache Ignite – Ezat Karimi, AT&T
- Persistent Memory Use Cases in Modern Software Architectures – Olasoji Denloye, Intel
- Top 10 Ways to Scale Your Website with Redis – Dave Nielsen, Redis Labs
- GridGain Ultimate Edition in Action Aids Implementation of Multiple SaaS Systems and Replaces Traditional Databases – Craig Gresbrink, 24 Hour Fitness
- How In-Memory Computing Can Accelerate Your SQL RDBMS – Douglas Hood, Oracle
- Driving Efficient Mainframe Digital Transformation Leveraging GridGain/Ignite on z/OS – Mythili Venkatakrishnan, IBM and Glenn Wiebe, GridGain
- Build and Deploy Digital Twins on an IMDG for Real-Time Streaming Analytics – William Bain, ScaleOut Software
- Powering Digital Transformation with In-Memory Computing – Becky Wanta, One Degree World
- “Computational Memory” Computing, TensorFlow + Hyperdimensional Computing – Gil Russell and Alan Niebel, WebFeet Research Inc.
- Diagnosing Memory Utilization, Leaks and Stalls in Production – Marcos Albe, Principal Support Engineer, Percona
- IoT Data Integration with StreamSets for In-Memory Analytics – Pat Patterson, Technical Director, StreamSets
How has the In-Memory Computing market changed with respect to last year?
First, in-memory computing platforms have continued to mature, introducing key advances that make the technology applicable to an increasing number of use cases. For example, in-memory computing platforms that include integrations with Apache Spark and data lake stores, such as Hadoop, make it possible to perform real-time operational analytics across combined data lake and operational data sets in order to power applications such as predictive maintenance or real-time 360-degree customer views.
Second, organizations in nearly every industry, from financial services and fintech, to IoT and SaaS, to ecommerce and healthcare, increasingly recognize they can no longer be constrained by the performance limitations of disk-based databases. And they can’t keep investing in an antiquated architecture that separates their transactional database from their analytical database and requires a costly and time-consuming extract, transform and load (ETL) process to copy the transactional data into the analytical database.
Businesses are finding that in-memory computing is the only relatively simple and cost-effective solution for obtaining the performance and scalability they need to deploy modern real-time, data-intensive applications to production.
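To make the "no separate analytical database, no ETL copy" idea concrete, here is a toy sketch in Python. The `InMemoryStore` class and its data are invented for illustration and do not reflect any specific product's API; the point is simply that one in-memory dataset can serve both the transactional path (point reads and writes) and the analytical path (scans and aggregates) without a copy step:

```python
from collections import defaultdict
from statistics import mean

# Toy illustration (hypothetical, not a real product API): a single
# in-memory store serves both transactional writes and analytical
# reads over the same live dataset, with no ETL into a second system.
class InMemoryStore:
    def __init__(self):
        self._rows = {}  # primary key -> record, held entirely in RAM

    # Transactional path: point writes and reads by key.
    def put(self, key, record):
        self._rows[key] = record

    def get(self, key):
        return self._rows.get(key)

    # Analytical path: aggregate over the same live records, no copy.
    def average_by(self, group_field, value_field):
        groups = defaultdict(list)
        for record in self._rows.values():
            groups[record[group_field]].append(record[value_field])
        return {g: mean(vals) for g, vals in groups.items()}

store = InMemoryStore()
store.put(1, {"region": "west", "amount": 120.0})
store.put(2, {"region": "west", "amount": 80.0})
store.put(3, {"region": "east", "amount": 50.0})

print(store.average_by("region", "amount"))  # {'west': 100.0, 'east': 50.0}
```

In a real platform the analytical path would be distributed SQL or compute colocated with the data, but the architectural contrast with a separate, ETL-fed analytical database is the same.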
What are some of the most compelling use cases you’ve seen for in-memory computing?
There are use cases everywhere, including financial services, healthcare, software, telecommunications, pharmaceuticals, entertainment, transportation, shipping and logistics. A few of the impressive use cases we’ve seen are:
- A credit card company that updates its fraud detection model hourly instead of daily, reducing the vulnerability of banks to new fraud vectors.
- A pharmaceutical company that performs hundreds of thousands of analyses across multiple parameter sets and assumptions in just a few hours or even minutes—instead of weeks.
- A sports betting platform that instantly shares new betting opportunities and odds for multiple live events, while processing over 700 bets per second.
- An e-commerce site that improves the relevance of its recommendation engine by updating its ML-trained model more frequently.
- A large equipment manufacturer that leverages an in-memory computing-powered IoT platform and machine learning for a predictive maintenance application that reduces downtime and increases ROI.
- A healthcare institution that uses an IoT platform to improve patient monitoring and track the spread of disease.
What in-memory computing trends and use cases do you think will rise to the forefront in the coming years?
Digital transformation remains the trend that will have the most impact on businesses. I think interest in digital transformation will continue to grow over the next few years as companies begin to put theory into practice and gain an understanding of which in-memory technologies will have the most impact on their business processes.
For example, it’s common for companies to periodically move their operational data to a data lake, where data scientists access the data for analysis. But, increasingly, use cases call for analyzing data collected from devices in the field (such as the operating status of an airplane engine) in real time, together with the historical data in the data lake. With the ability to drive real-time, automated business decisions based on real-time analysis across combined data lake and operational data, businesses can react immediately to a rapidly changing environment, which is precisely the promise of digital transformation.
In-memory computing platforms that include integrations with Apache Spark and data lake stores, such as Hadoop, can make those insights possible. The in-memory computing platform pulls the needed historical data from the data lake and maintains it in memory, along with the operational data, enabling real-time analytics across the combined dataset.
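A rough sketch of that pattern, in Python: the data below (engine IDs, sensor readings, the anomaly threshold) is entirely made up, and the dictionaries stand in for a real data lake store and in-memory layer rather than any actual Spark or Hadoop API. Historical readings are pulled into memory once; each live operational reading is then scored against them without a round trip to the lake:

```python
from statistics import mean, stdev

# Hypothetical "data lake" contents: past sensor readings per engine.
# In a real deployment this would live in Hadoop or a similar store.
historical_lake = {
    "engine-42": [310.0, 305.0, 312.0, 308.0, 309.0],
}

# Step 1: pull the needed historical slice into the in-memory layer.
in_memory = {eid: readings[:] for eid, readings in historical_lake.items()}

# Step 2: score each live reading against the in-memory history,
# enabling a real-time decision with no lake round trip.
def is_anomalous(engine_id, reading, threshold=3.0):
    history = in_memory[engine_id]
    mu, sigma = mean(history), stdev(history)
    return abs(reading - mu) > threshold * sigma

print(is_anomalous("engine-42", 311.0))  # within historical range
print(is_anomalous("engine-42", 360.0))  # far outside historical range
```

The statistics here are deliberately simple; the architectural point is that the combined historical-plus-operational dataset is resident in memory, so the decision happens at operational speed.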
Terry Erisman, EVP of Marketing and Alliance, GridGain Systems
Sponsored by GridGain Systems