WEBINAR: Build Better Multimodal RAG Pipelines with FiftyOne, LlamaIndex, and Milvus
Zoom | Feb 15, 2024, 9:00 AM Pacific (12:00 PM Eastern)

About this session

Multimodal Large Language Models like GPT-4V, Gemini Pro Vision, and LLaVA are ushering in a new era of interactive applications. Adding visual data to retrieval augmented generation (RAG) pipelines introduces new axes of complexity, underscoring the importance of evaluation. In this webinar, you’ll learn how to compare and evaluate multimodal retrieval techniques so that you can build a highly performant multimodal RAG pipeline with your data. The data-centric application we will be using is entirely free and open source, leveraging FiftyOne for data management and visualization, Milvus as a vector store, and LlamaIndex for LLM orchestration.
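
As a rough illustration of how these pieces can fit together (not the exact pipeline from the session), the sketch below builds a minimal multimodal index with LlamaIndex and Milvus and inspects the underlying images with FiftyOne. It assumes recent llama-index, llama-index-vector-stores-milvus, fiftyone, and embedding integration packages, a Milvus instance running at localhost:19530, and a hypothetical ./data/ directory of mixed text and images.

```python
# Minimal multimodal RAG sketch: FiftyOne for data inspection, Milvus as the
# vector store, LlamaIndex for orchestration. Paths, collection names, and
# embedding dimensions below are illustrative assumptions.
import fiftyone as fo
from llama_index.core import SimpleDirectoryReader, StorageContext
from llama_index.core.indices import MultiModalVectorStoreIndex
from llama_index.vector_stores.milvus import MilvusVectorStore

# Visualize the raw image data in the FiftyOne App before indexing
dataset = fo.Dataset.from_dir(
    dataset_dir="./data/", dataset_type=fo.types.ImageDirectory
)
session = fo.launch_app(dataset)

# Separate Milvus collections for text and image embeddings
# (dims shown match common defaults, e.g. OpenAI ada-002 and CLIP ViT-B/32)
text_store = MilvusVectorStore(
    uri="http://localhost:19530",
    collection_name="text_collection",
    dim=1536,
    overwrite=True,
)
image_store = MilvusVectorStore(
    uri="http://localhost:19530",
    collection_name="image_collection",
    dim=512,
    overwrite=True,
)
storage_context = StorageContext.from_defaults(
    vector_store=text_store, image_store=image_store
)

# Load mixed text/image documents and build a multimodal index
documents = SimpleDirectoryReader("./data/").load_data()
index = MultiModalVectorStoreIndex.from_documents(
    documents, storage_context=storage_context
)

# Retrieve the top text and image results for a natural-language query
retriever = index.as_retriever(similarity_top_k=3, image_similarity_top_k=3)
results = retriever.retrieve("What does the product packaging look like?")
```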

Topics covered:

  • Applications of multimodal RAG
  • The challenges of working with multiple modalities
  • Advanced techniques for multimodal RAG
  • Evaluating multimodal retrieval techniques

Speaker

Jacob Marks is a Machine Learning Engineer and Developer Evangelist at Voxel51, creators of the open source FiftyOne library for curation and visualization of unstructured data. The library has been installed more than 2M times and helps everyone from solo developers to Fortune 100 companies build higher-quality datasets. At Voxel51, Jacob leads open source efforts in vector search, semantic search, and generative AI. Prior to joining Voxel51, Jacob worked at Google X, Samsung Research, and Wolfram Research. In a past life, he was a theoretical physicist: in 2022, he completed his Ph.D. at Stanford, where he investigated quantum phases of matter.
