Hopsworks AI Lakehouse with Dataiku

Dataiku can be used as a Data Science platform for developing and operating feature pipelines, training pipelines, and batch inference pipelines that read from and write to the Hopsworks Feature Store. Dataiku model deployments can also be integrated with the Hopsworks Online Feature Store to provide precomputed features to models at low latency.
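As an illustration, a Dataiku Python recipe can push engineered features to Hopsworks using the hopsworks Python client. This is a minimal sketch, not Dataiku- or Hopsworks-provided code; the cluster host, API key, feature group name, and columns are all hypothetical:

```python
import hopsworks
import pandas as pd

# Log in to the Hopsworks cluster (host and API key are placeholders;
# in Dataiku they would typically come from project variables or secrets)
project = hopsworks.login(
    host="my-cluster.hopsworks.ai",
    api_key_value="MY_API_KEY",
)
fs = project.get_feature_store()

# Features produced by the Dataiku recipe (toy data for illustration)
features_df = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "avg_order_value": [42.0, 17.5, 99.9],
    "orders_last_30d": [3, 1, 7],
})

# Create (or retrieve) a feature group and insert the DataFrame;
# online_enabled=True also materializes the rows to the online store
fg = fs.get_or_create_feature_group(
    name="customer_activity",
    version=1,
    primary_key=["customer_id"],
    online_enabled=True,
)
fg.insert(features_df)
```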

Hopsworks Integrations

Dataiku can be used for the development and operation of all your machine learning pipelines (feature pipelines, training pipelines, and batch inference pipelines), with features stored in Hopsworks. Dataiku can also serve models connected to the Hopsworks Online Feature Store, providing historical and contextual features to operational models, as sketched below.
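On the serving side, a model deployment can look up precomputed features from the online store through a feature view. The sketch below assumes a feature view named customer_activity_fv already exists and that credentials are supplied by the serving environment; both are assumptions for illustration:

```python
import hopsworks

# Connect from the serving environment (API key is a placeholder)
project = hopsworks.login(api_key_value="MY_API_KEY")
fs = project.get_feature_store()

# Retrieve an existing feature view and prepare it for online reads
fv = fs.get_feature_view(name="customer_activity_fv", version=1)
fv.init_serving()

def enrich_request(customer_id: int) -> list:
    """Fetch precomputed features for one entity from the online store."""
    return fv.get_feature_vector({"customer_id": customer_id})
```

The returned feature vector can then be concatenated with request-time inputs before invoking the deployed model.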

Other Integrations

Parquet (Athena, S3, ADLS, GCS)
Snowflake
Modal