List of All Tutorial Notebooks
This page lists all tutorial Jupyter notebooks provided by C3 AI.
To see and use these notebooks, create an application with the root package tutorials. For more information on creating an environment and application, see Get Started with a Data Science Development Environment. For more information on configuring C3 AI-managed Jupyter notebooks, see Configure and Use C3 AI Managed JupyterLab Notebooks.
List of tutorial notebooks
Notebooks are grouped by C3 data science features.
Get started
C3 AI supports any Python library that is accessible by a conda channel (public or private). These notebooks help you get started on the C3 Agentic AI Platform for data science.
- Create, Update, and Delete Data from the Database (Python)
- Fetch and Filter Data From the Database - Tutorial
- Getting Started with C3 Datasets: Overview and Introduction to Data
- Work with the Python SDK on the C3 Agentic AI Platform - Tutorial
- Working with Runtimes
Data
The C3 Agentic AI Platform offers a single, unified data interface for all data operations. This data interface is called C3 AI Datasets and is available through the Data C3 Type.
You can start using C3 AI Datasets with Raw Data, or with an existing Application Model.
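To illustrate the idea behind a flatten operation on nested records (the concept behind the Data#flatten() tutorial below), here is a generic pandas sketch. This is not the C3 AI Datasets API; the records and field names are hypothetical.

```python
import pandas as pd

# Hypothetical nested records, as a tabular data interface might return them.
records = [
    {"id": 1, "sensor": {"type": "temp", "unit": "C"}, "value": 21.5},
    {"id": 2, "sensor": {"type": "pressure", "unit": "bar"}, "value": 1.2},
]

# json_normalize flattens nested fields into dotted column names
# such as "sensor.type" and "sensor.unit".
flat = pd.json_normalize(records)
print(list(flat.columns))
```

A flatten step like this turns hierarchical application data into a flat table suitable for feature engineering and model training.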
- Data Conversion Methods
- Data Exploration with C3 AI Datasets - Tutorial
- Working with the Data Interface - Tutorial
- Using Data#flatten() - Tutorial
Feature Store
A feature store is a centralized repository of materialized (pre-computed) feature data. It provides three main functions:
- Share and discover features across teams.
- Reuse named features in both training and prediction/inference contexts.
- See a point-in-time view of multiple features (for example, see the most recent data defined in each feature at a specific point in time).
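The point-in-time view described above can be sketched generically with a pandas as-of join. This is an illustration of the concept only, not the C3 AI Feature Store API; the asset names and feature values are hypothetical.

```python
import pandas as pd

# Hypothetical materialized feature data: one row per (asset, timestamp).
features = pd.DataFrame({
    "asset": ["pump1", "pump1", "pump2"],
    "ts": pd.to_datetime(["2024-01-01", "2024-01-10", "2024-01-05"]),
    "avg_temp": [70.0, 75.0, 60.0],
})

# Prediction requests: each needs the most recent feature value
# defined at or before its point in time.
requests = pd.DataFrame({
    "asset": ["pump1", "pump2"],
    "ts": pd.to_datetime(["2024-01-08", "2024-01-06"]),
})

# merge_asof picks, per request row, the latest feature row at or before ts.
view = pd.merge_asof(
    requests.sort_values("ts"),
    features.sort_values("ts"),
    on="ts",
    by="asset",
)
```

Here pump1's request at 2024-01-08 sees the 2024-01-01 value (70.0), not the later 2024-01-10 one, which is exactly the "most recent data at a specific point in time" behavior a feature store provides for training/serving consistency.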
The following notebooks are available:
- Create Features from Python Functions using Lambda Feature Sets
- Feature Materialization and Snapshots
- Metrics and Features
Model Development
C3 AI Model Development allows you to share and re-use implementations across projects, and is integrated with C3 AI Model Deployment and C3 AI Model Registry. Use Model Development to scale and optimize performance by parallelizing pipeline operations and by optimizing data transfer between nodes of the pipeline's directed acyclic graph (DAG) during execution.
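The idea of parallelizing independent pipeline operations in a DAG can be sketched generically as follows. This is a minimal illustration, not C3's MlPipeline implementation; the node names and the trivial `run_node` function are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical pipeline DAG: each node maps to the nodes it depends on.
dag = {
    "load": [],
    "clean": ["load"],
    "features_a": ["clean"],
    "features_b": ["clean"],   # independent of features_a: can run in parallel
    "train": ["features_a", "features_b"],
}

def run_node(name):
    # Stand-in for the real work a pipeline step would do.
    return f"{name}:done"

def execute(dag):
    """Run nodes level by level: all nodes whose dependencies have
    finished are submitted to the pool concurrently."""
    results, remaining = {}, dict(dag)
    with ThreadPoolExecutor() as pool:
        while remaining:
            ready = [n for n, deps in remaining.items()
                     if all(d in results for d in deps)]
            if not ready:
                raise RuntimeError("cycle detected in DAG")
            for n, out in zip(ready, pool.map(run_node, ready)):
                results[n] = out
            for n in ready:
                del remaining[n]
    return results

results = execute(dag)
```

In this sketch, `features_a` and `features_b` execute concurrently because neither depends on the other, while `train` waits for both.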
- Developer Guide - Data PushStream and Stream in MlAtomicPipe
- Hyperparameter Optimization at Scale - HPO Tutorial
- KerasPipe Tutorial
- MlDynamicPipe Tutorial
- Interpretability Deep Dive - ShapInterpreter Authoring and Execution Tutorial
- MlLambdaPipe Tutorial
- Model Training Jobs - Tutorial
- Author Machine Learning Pipelines - MlPipeline Tutorial
- MlTemplate Pipe Tutorial
Model Deployment
The Model Deployment Framework (MDF) helps data scientists and application developers train, deploy, and manage the life cycle of machine learning models. Model deployment enables the flexible configuration of both simple single-model deployments and complex multi-model deployments.
- Tutorial Model Deployment - Predictive Maintenance for Wind Turbines
- Tutorial Model Evaluation - Evaluating the Performance of MlModels
- Model Monitoring - Monitoring the Health of MlModels
- Tutorial MlSubject Process Job
Model Registry
The C3 Agentic AI Platform provides robust ML model deployment and health checks. Use the platform to ensure that you can reproduce, deploy, and scale models from training. You can also share models across applications, with capabilities to register, load, and search for registered models from other users.