Hopsworks AI Lakehouse with Kubeflow

Kubeflow can be used to orchestrate feature, training, and batch inference pipelines as Python programs that write to and read from the Hopsworks Feature Store.
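Below is a minimal sketch of what such a pipeline could look like: a Kubeflow Pipelines (KFP v2) component that runs a feature pipeline and writes its output to the Hopsworks Feature Store. The host, project, feature group, and secret names are illustrative placeholders, and the API key is assumed to be injected into the pod as an environment variable.

```python
# Sketch: a Kubeflow Pipelines (KFP v2) component that computes features
# and writes them to the Hopsworks Feature Store. Names are placeholders.
from kfp import dsl, compiler


@dsl.component(
    base_image="python:3.10",
    packages_to_install=["hopsworks", "pandas"],
)
def feature_pipeline(hopsworks_host: str, project_name: str):
    import os
    import pandas as pd
    import hopsworks

    # The API key is assumed to come from a Kubernetes secret exposed
    # as an environment variable on the pipeline pod.
    project = hopsworks.login(
        host=hopsworks_host,
        project=project_name,
        api_key_value=os.environ["HOPSWORKS_API_KEY"],
    )
    fs = project.get_feature_store()

    # Compute features (replace with your own feature engineering logic).
    df = pd.DataFrame({"id": [1, 2], "amount": [10.0, 20.0]})

    fg = fs.get_or_create_feature_group(
        name="transactions",   # hypothetical feature group name
        version=1,
        primary_key=["id"],
        online_enabled=True,
    )
    fg.insert(df)               # write the features to Hopsworks


@dsl.pipeline(name="hopsworks-feature-pipeline")
def pipeline(hopsworks_host: str = "my.hopsworks.host",
             project_name: str = "my_project"):
    feature_pipeline(hopsworks_host=hopsworks_host, project_name=project_name)


if __name__ == "__main__":
    # Compile to a YAML spec that can be submitted to a Kubeflow Pipelines cluster.
    compiler.Compiler().compile(pipeline, "feature_pipeline.yaml")
```

Training and batch inference pipelines follow the same pattern: each step is a Python component that connects to Hopsworks, reads the features or models it needs, and writes its results back.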

Hopsworks Integrations

Kubeflow can be used as the orchestration engine for all of your machine learning pipelines (feature pipelines, training pipelines, and batch inference pipelines). These pipelines read and write their state (features, training data, and models) from and to Hopsworks. Jupyter notebooks can also connect to Hopsworks to discover existing features and use them to create training data, as sketched below.
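The following sketch shows how a notebook might discover features in Hopsworks and turn them into training data via a feature view. The feature group and feature view names are hypothetical examples.

```python
# Sketch: discovering features in Hopsworks from a notebook and creating
# training data through a feature view. Names are placeholders.
import hopsworks

project = hopsworks.login()   # reads an API key from the environment or prompts for one
fs = project.get_feature_store()

# Discover an existing feature group registered by a feature pipeline.
trans_fg = fs.get_feature_group(name="transactions", version=1)

# Select the features needed for the model and register a feature view.
query = trans_fg.select(["amount"])
fv = fs.get_or_create_feature_view(
    name="fraud_model_fv",    # hypothetical feature view name
    version=1,
    query=query,
    labels=[],                # add label columns here if the model is supervised
)

# Materialize a train/test split as in-memory DataFrames.
X_train, X_test, y_train, y_test = fv.train_test_split(test_size=0.2)
```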

Other integrations

Modal
Airflow
MongoDB