LLMs are now capable of translating user queries into small Python programs. This makes it easier to get started on new projects, provided those projects can be decomposed into smaller, modular, and composable programs.
In this talk, we will look at how to create AI systems from text using LLMs and Hopsworks. We will decompose an AI system into modular feature, training, and inference pipelines and compose them back together with Hopsworks as a shared state layer. Each pipeline will be generated from user queries, helping Python developers build the first version of their AI system faster.
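To make the decomposition concrete, here is a minimal sketch of a feature pipeline that writes engineered features to a Hopsworks feature group, the shared state that the training and inference pipelines read from. It assumes the Hopsworks Python API; the data source, feature names, and aggregation logic are illustrative, not the example from the talk.

```python
# Minimal sketch of a feature pipeline, assuming the Hopsworks Python API.
# The data source, feature names, and aggregations are illustrative only.
import hopsworks
import pandas as pd

# Illustrative raw data; in practice this would come from your own source.
transactions = pd.DataFrame({
    "account_id": [1, 1, 2],
    "amount": [10.0, 25.0, 7.5],
    "ts": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-01"]),
})

# Feature engineering: simple per-account aggregates.
features = (
    transactions.groupby("account_id")
    .agg(total_amount=("amount", "sum"), num_txns=("amount", "count"))
    .reset_index()
)

# Write the features to Hopsworks, the shared state layer for the
# downstream training and inference pipelines.
project = hopsworks.login()
fs = project.get_feature_store()
fg = fs.get_or_create_feature_group(
    name="account_features",  # hypothetical feature group name
    version=1,
    primary_key=["account_id"],
    description="Per-account transaction aggregates",
)
fg.insert(features)
```

A training pipeline would then read from the same feature group to build training data, and the inference pipeline would read the latest feature values at serving time, which is what makes the three pipelines composable into a single AI system.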
Jim will walk through a working example to demonstrate that it is now viable to use AI to build AI systems. You will learn how to develop a minimum viable AI system using AI.
Register here