Upgrade and Test the C3 AI Model Registry

C3 AI Model Registry enables sharing models across applications with capabilities to register, load, and search models through the C3 AI Model Registry Service. For example, applications can train models outside of the production application and use the C3 AI Model Registry to move models to production.

However, it is critical to follow best practices to keep the registry working correctly and the models backward compatible. The following sections provide instructions for upgrading the C3 AI Model Registry Service and for managing changes under version control so that models remain backward compatible. This topic also includes steps for testing models in an application package to verify that they are correctly registered to and loaded from the C3 AI Model Registry.

Upgrade the C3 AI Model Registry Service

To upgrade the C3 AI Model Registry Service and migrate the models, the Application Administrator follows the steps below:

  1. Upgrade the C3 AI Model Registry Service from version 8.3.3 to 8.4.

  2. Fetch all old entries from ModelRegistryService.Model and ModelRegistryService.Record.

  3. From the static console of the C3 AI Model Registry Service, run ModelRegistryService.ensureGlobalUriPrefix(). This completes the migration.

  4. Verify that entries are migrated by calling load and list APIs from a client application.

  5. Delete the old entries from Step 2 using removeBatch.

CAUTION: Only complete this step after thorough verification that model migration was successful.
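The migration flow above can be sketched as a conceptual analogue in Python. This is not the C3 API; `OLD_REGISTRY`, `migrate`, and `verify` are illustrative stand-ins showing why old entries are deleted only after verification succeeds.

```python
# Conceptual analogue of the migration steps above (not the C3 API):
# fetch old records, migrate them under a global URI prefix, verify,
# then remove the old entries only after verification succeeds.

OLD_REGISTRY = {
    "modelA": {"uri": "modelA", "payload": b"..."},
    "modelB": {"uri": "modelB", "payload": b"..."},
}

def migrate(old, prefix="global/"):
    """Copy every old record under a prefixed URI (mirrors Step 3)."""
    return {prefix + k: dict(v, uri=prefix + k) for k, v in old.items()}

def verify(old, new, prefix="global/"):
    """Check that every old entry is reachable in the new registry (Step 4)."""
    return all(prefix + k in new for k in old)

new_registry = migrate(OLD_REGISTRY)
assert verify(OLD_REGISTRY, new_registry)
# Only after verification do we delete the old entries (Step 5).
OLD_REGISTRY.clear()
```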

Test registering and loading models from C3 AI Model Registry

For each model type in the application package (MlPipe, MlPipeline, and MlModel), use the following steps to test that models are correctly registered to and loaded from the C3 AI Model Registry Service.

  1. Create and register the models with the C3 AI Model Registry using ModelRegistry.register*(). The models should register without errors.

  2. Load the models from the C3 AI Model Registry using ModelRegistry.load*(). The models should load without errors.

  3. Generate predictions using the loaded models. Predictions should generate without errors.
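The three steps above amount to a register, load, predict smoke test. The sketch below mirrors that flow with a minimal, registry-agnostic example; `ToyRegistry` and `MeanModel` are illustrative stand-ins, not C3 types.

```python
# A minimal smoke test mirroring steps 1-3 above.
# ToyRegistry and MeanModel are illustrative stand-ins, not C3 types.
import pickle

class ToyRegistry:
    """In-memory registry that stores serialized models by name."""
    def __init__(self):
        self._store = {}
    def register(self, name, model):
        self._store[name] = pickle.dumps(model)   # Step 1: register
    def load(self, name):
        return pickle.loads(self._store[name])    # Step 2: load

class MeanModel:
    """Trivial model that predicts the training mean for any input."""
    def __init__(self, values):
        self.mean = sum(values) / len(values)
    def predict(self, xs):
        return [self.mean for _ in xs]            # Step 3: predict

registry = ToyRegistry()
registry.register("mean-v1", MeanModel([1, 2, 3]))
loaded = registry.load("mean-v1")
assert loaded.predict([10, 20]) == [2.0, 2.0]
```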

Debug common errors and issues

Applications may see failures if they have extended MlModel or MlPipe to include entity references. To include the needed data when registering the MlModel or MlPipe, implement the replaceEntityReferences function for the application.

Use the replaceEntityReferences function to register and load MlPipe objects with entity references

The C3 AI Model Registry supports customizing registry behavior for an MlPipe with entity references. If an MlPipe has additional fields that hold required entity references, implement the replaceEntityReferences method so that the MlPipe objects can be successfully registered to and loaded from the C3 AI Model Registry.

As an example, consider the following code snippet, in which a new MlTemplate.Pipeline is defined that references two MlPipe instances: convNet and scalerPipe.

Type
type ConvNetEnsemblePipeline<DO, DI> extends MlTemplate.Pipeline<{keras_input: Data, rfc_input: Data}, Data, {weights: Data}, DO, DI> type key 'CNEP' {

  @ML(containsHyperparams=true)
  convNet: !MlPipe

  @ML(containsHyperparams=true)
  scalerPipe: !MlPipe

  @ML(hyperparameter=true)
  numNets: int = 1

  beforeCreate: ~ py
  generatePipeline: ~ py
  generateTypeBindings: ~ py
  replaceEntityReferences: ~ py
}

By default, when registering an instance of the pipeline in the example above, only the references to the MlPipes defined on ConvNetEnsemblePipeline are included in the C3 AI Model Registry, rather than the full MlPipe objects.

To include the full objects stored in the fields scalerPipe and convNet, implement replaceEntityReferences. See the following example code snippet.

NOTE: You must invoke the parent type's replaceEntityReferences using super at the start of the overridden method.

Python
def replaceEntityReferences(this):
    # Fetch any fields not yet loaded on this instance.
    this = this.getMissing({'include': 'this'})
    # Invoke the parent type's replaceEntityReferences first.
    this = this.super("MlTemplate.Pipeline").replaceEntityReferences()
    # Inline the full MlPipe objects, stripping their identities so the
    # complete objects (not references) are stored in the registry.
    conv_net = this.convNet.getMissing({'include': 'this'}).replaceEntityReferences().withoutIdentity()
    scaler_pipe = this.scalerPipe.getMissing({'include': 'this'}).replaceEntityReferences().withoutIdentity()
    # Replace the reference fields with the inlined objects.
    this = this.withConvNet(conv_net).withScalerPipe(scaler_pipe)
    return this
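The idea behind replaceEntityReferences can be shown in a self-contained sketch: before serializing, each reference (an id) is swapped for the full object it points to, recursing so nested references are inlined too. `ENTITY_STORE` and the field names below are illustrative, not C3 types.

```python
# A self-contained sketch of the "replace entity references" idea:
# swap each reference (an id) for the full object it points to,
# recursing so nested references are inlined as well.
# ENTITY_STORE and the field names are illustrative, not C3 types.

ENTITY_STORE = {
    "scalerPipe-1": {"kind": "MlPipe", "params": {"scale": 2}},
    "convNet-1": {"kind": "MlPipe", "params": {"layers": 3},
                  "ref": "scalerPipe-1"},
}

def replace_entity_references(obj):
    """Return a copy of obj with every 'ref' field replaced by the full object."""
    out = dict(obj)
    if "ref" in out:
        out["ref"] = replace_entity_references(ENTITY_STORE[out["ref"]])
    return out

pipeline = {"kind": "Pipeline", "ref": "convNet-1"}
inlined = replace_entity_references(pipeline)
# The nested reference chain is fully inlined.
assert inlined["ref"]["ref"]["params"] == {"scale": 2}
```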

Test the backward compatibility of models

Occasionally, changes to the MlPipelines or MlModels defined in an application make previously registered models incompatible with new versions of the applications that use them. C3 AI recommends that application developers test for and document backward incompatibility as part of their QA or CI processes to reduce the impact of such changes and help mitigate potential compatibility issues.

The following workflow is recommended for testing backward compatibility:

  1. Determine how many previous versions to test for backward compatibility. For example, if you decide to support the previous minor version, in version 8.5 you should test an application to verify whether models created on version 8.4 can be used in version 8.5. In this example, the C3 AI Model Registry service is using version 8.5.

  2. Start multiple applications for the versions to be tested against. For example, to test support of a model created in version 8.4 and whether it works in version 8.5, create an application with version 8.4 and another application in version 8.5.

  3. Register MlPipelines and MlModels from the previous version 8.4 to the C3 AI Model Registry using ModelRegistry.register*. The models should register to the version 8.5 C3 AI Model Registry without errors.

  4. From the version 8.5 application, load the models from the C3 AI Model Registry using ModelRegistry.load*. The models should load without errors.

  5. Create predictions with a loaded model using MlPipeline.forId().process(). Predictions should be generated without errors.
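The backward-compatibility check above can be sketched as follows: a model serialized by an "old" application version must load and predict in the new one. The version tags and serialization format below are illustrative, not C3's.

```python
# A minimal sketch of the backward-compatibility workflow above: an
# artifact written by an "old" version (8.4) must load and predict in
# the new one (8.5). The format and version tags are illustrative.
import json

def save_v84(model):
    """Serialize a model the way the 8.4 application would."""
    return json.dumps({"version": "8.4", "mean": model["mean"]})

def load_v85(blob):
    """The 8.5 loader must accept artifacts written by 8.4."""
    data = json.loads(blob)
    assert data["version"] in ("8.4", "8.5"), "unsupported version"
    return {"mean": data["mean"],
            "predict": lambda xs: [data["mean"]] * len(xs)}

old_blob = save_v84({"mean": 4.0})   # registered from the 8.4 app
model = load_v85(old_blob)           # loaded in the 8.5 app
assert model["predict"]([1, 2, 3]) == [4.0, 4.0, 4.0]
```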

If any errors occur during the steps above, see the "Additional considerations" section below for troubleshooting.

Additional considerations

If errors occur when testing for backward compatibility, check the following as possible causes:

  • When loading an MlPipeline from the C3 AI Model Registry, the required runtime (for example, py-data-science) must exist. If applications change runtimes between releases, this can cause backward incompatibility.

  • When loading an MlModel into an application, the Feature Sets must already exist in the application.
