InferenceInput.Params.Builder


public static final class InferenceInput.Params.Builder
extends Object

java.lang.Object
   ↳ android.adservices.ondevicepersonalization.InferenceInput.Params.Builder


A builder for Params

Summary

Public constructors

Builder(KeyValueStore keyValueStore, String modelKey)

Creates a new Builder.

Public methods

InferenceInput.Params build()

Builds the instance.

InferenceInput.Params.Builder setDelegateType(int value)

The delegate used to run model inference.

InferenceInput.Params.Builder setKeyValueStore(KeyValueStore value)

A KeyValueStore where the pre-trained model is stored.

InferenceInput.Params.Builder setModelKey(String value)

The table key whose corresponding value stores a pre-trained model.

InferenceInput.Params.Builder setModelType(int value)

The type of the pre-trained model.

InferenceInput.Params.Builder setRecommendedNumThreads(int value)

The number of threads used for intra-op parallelism on CPU; must be a positive number.

Public constructors

Builder

Added in API level 35
public Builder (KeyValueStore keyValueStore, 
                String modelKey)

Creates a new Builder.

Parameters
keyValueStore KeyValueStore: A KeyValueStore where the pre-trained model is stored. Currently only TFLite models are supported. This value cannot be null.

modelKey String: The table key whose corresponding value stores a pre-trained model. Currently only TFLite models are supported. This value cannot be null.
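
A minimal usage sketch, not taken from the official reference: it assumes a KeyValueStore is already available (for example, the remote data store provided to an isolated service) and that a pre-trained TFLite model is stored under the hypothetical key "my_model".

import android.adservices.ondevicepersonalization.InferenceInput;
import android.adservices.ondevicepersonalization.KeyValueStore;

final class ParamsExample {
    static InferenceInput.Params buildParams(KeyValueStore remoteData) {
        // "my_model" is a hypothetical key; use the key under which your
        // pre-trained TFLite model is stored in the given KeyValueStore.
        return new InferenceInput.Params.Builder(remoteData, "my_model").build();
    }
}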

Public methods

build

Added in API level 35
public InferenceInput.Params build ()

Builds the instance.

Returns
InferenceInput.Params This value cannot be null.

setDelegateType

Added in API level 35
public InferenceInput.Params.Builder setDelegateType (int value)

The delegate used to run model inference. If not set, the default value is InferenceInput.Params.DELEGATE_CPU.

Parameters
value int: Value is InferenceInput.Params.DELEGATE_CPU

Returns
InferenceInput.Params.Builder This value cannot be null.

setKeyValueStore

Added in API level 35
public InferenceInput.Params.Builder setKeyValueStore (KeyValueStore value)

A KeyValueStore where the pre-trained model is stored. Currently only TFLite models are supported.

Parameters
value KeyValueStore: This value cannot be null.

Returns
InferenceInput.Params.Builder This value cannot be null.

setModelKey

Added in API level 35
public InferenceInput.Params.Builder setModelKey (String value)

The table key whose corresponding value stores a pre-trained model. Currently only TFLite models are supported.

Parameters
value String: This value cannot be null.

Returns
InferenceInput.Params.Builder This value cannot be null.

setModelType

Added in API level 35
public InferenceInput.Params.Builder setModelType (int value)

The type of the pre-trained model. If not set, the default value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE. Currently only InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE is supported.

Parameters
value int: Value is InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE

Returns
InferenceInput.Params.Builder This value cannot be null.

setRecommendedNumThreads

Added in API level 35
public InferenceInput.Params.Builder setRecommendedNumThreads (int value)

The number of threads used for intra-op parallelism on CPU; it must be a positive number. Adopters can set this field based on the model architecture. The actual number of threads used depends on system resources and other constraints.

Parameters
value int: Value is 1 or greater

Returns
InferenceInput.Params.Builder This value cannot be null.
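
A slightly fuller sketch under the same assumptions as above, with a hypothetical modelKey argument: every setter shown is optional, and apart from the thread-count hint each call simply restates the documented default.

import android.adservices.ondevicepersonalization.InferenceInput;
import android.adservices.ondevicepersonalization.KeyValueStore;

final class TunedParamsExample {
    static InferenceInput.Params build(KeyValueStore remoteData, String modelKey) {
        return new InferenceInput.Params.Builder(remoteData, modelKey)
                // DELEGATE_CPU is also the default delegate.
                .setDelegateType(InferenceInput.Params.DELEGATE_CPU)
                // TFLite is the only supported (and default) model type.
                .setModelType(InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE)
                // A hint for intra-op CPU parallelism; the platform may use
                // fewer threads depending on system resources.
                .setRecommendedNumThreads(4)
                .build();
    }
}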