Model Config

class ModelConfig(model_json, base_dir=None, labels=None, colors=None)

The model configuration parameters.

Parameters:
  • model_json (dict) – The parsed alwaysai.model.json.

  • base_dir (Optional[str]) – The base directory used to resolve the model's relative file paths.

  • labels (Optional[List[str]]) – The label list for the model.

  • colors (Optional[ndarray]) – The color list for the model.
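
A ModelConfig is typically built from the parsed contents of alwaysai.model.json. A minimal sketch of what that parsed dict might look like; the field names and the model ID below are illustrative assumptions, not the file's exact schema:

```python
import json

# Hypothetical alwaysai.model.json contents (field names and values are
# assumptions for illustration; consult a real model's JSON for its schema).
model_json_text = """
{
    "id": "alwaysai/mobilenet_ssd",
    "model_parameters": {
        "purpose": "ObjectDetection",
        "framework_type": "caffe",
        "size": [300, 300],
        "mean": [127.5, 127.5, 127.5],
        "scalefactor": 0.007843,
        "swaprb": false,
        "crop": false
    }
}
"""

# The parsed dict is what ModelConfig wraps, e.g.
# config = edgeiq.ModelConfig(model_json)  # assuming edgeiq is available
model_json = json.loads(model_json_text)
params = model_json["model_parameters"]
print(params["size"])  # the model's input size
```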

classmethod from_model_id(model_id)

Create a ModelConfig for the given model ID.

Return type:

ModelConfig
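
from_model_id builds the config by looking a model up by its ID rather than from a pre-parsed dict. A minimal sketch of that alternate-constructor pattern; the in-memory registry and class below are assumptions standing in for the real lookup:

```python
# Hypothetical registry standing in for the model store that
# from_model_id would consult; the real lookup is an assumption here.
_MODEL_REGISTRY = {
    "alwaysai/mobilenet_ssd": {
        "id": "alwaysai/mobilenet_ssd",
        "model_parameters": {"size": [300, 300]},
    },
}

class ModelConfigSketch:
    def __init__(self, model_json, base_dir=None, labels=None, colors=None):
        self._config = model_json

    @classmethod
    def from_model_id(cls, model_id):
        # Look the parsed JSON up by ID and delegate to the main constructor.
        return cls(_MODEL_REGISTRY[model_id])

    @property
    def id(self):
        return self._config["id"]

config = ModelConfigSketch.from_model_id("alwaysai/mobilenet_ssd")
print(config.id)  # prints alwaysai/mobilenet_ssd
```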

property config: dict

The config loaded from the model JSON file

property model_parameters: dict

The model parameters in the config

property id: str

The model ID

property label_file: str | None

Path to the label file

property colors_file: str | None

Path to the colors file

property model_file: str

Path to the model weights file

property config_file: str | None

Relative path to the model framework config file

property mean: Tuple[float, float, float]

The RGB/BGR mean values for the model

property scalefactor: float

The scale factor for the model input

property size: Tuple[int, int]

The input image size of the model
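
mean, scalefactor, size, swaprb, and crop together describe the model's input preprocessing, in the style of OpenCV's cv2.dnn.blobFromImage: scalefactor * (image - mean), with an optional red/blue channel swap. A minimal NumPy sketch, assuming the image has already been resized to size (resizing and cropping are omitted to keep it dependency-free):

```python
import numpy as np

def preprocess(image, mean, scalefactor, swaprb):
    """Apply blobFromImage-style preprocessing to an HxWx3 uint8 image.

    The image is assumed to already match the model's input `size`.
    """
    blob = image.astype(np.float32)
    if swaprb:
        blob = blob[..., ::-1]  # swap red and blue channels
    blob = (blob - np.asarray(mean, dtype=np.float32)) * scalefactor
    # NCHW layout with a batch dimension of 1, as DNN engines typically expect
    return blob.transpose(2, 0, 1)[np.newaxis, ...]

img = np.full((300, 300, 3), 128, dtype=np.uint8)
blob = preprocess(img, mean=(127.5, 127.5, 127.5),
                  scalefactor=0.007843, swaprb=True)
print(blob.shape)  # (1, 3, 300, 300)
```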

property purpose: SupportedPurposes

The purpose of the model

property framework_type: str

The framework type of the model

property crop: bool

Whether to crop the image prior to inference

property colors_dtype: str

The data type of the color values

property labels: List[str] | None

The labels of the model

property colors: ndarray | None

The colors for each label of the model.

Each array element is a 3-element array of 8-bit integers representing red, green, and blue
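
When no colors file is supplied, a per-label color table of this shape can be generated; a minimal sketch of building an (n_labels, 3) uint8 array (the seeded-random scheme is an assumption for illustration, not the library's actual fallback):

```python
import numpy as np

def make_label_colors(labels, seed=42):
    """Build one RGB color (three 8-bit integers) per label, deterministically."""
    rng = np.random.default_rng(seed)
    return rng.integers(0, 256, size=(len(labels), 3), dtype=np.uint8)

labels = ["person", "bicycle", "car"]
colors = make_label_colors(labels)
print(colors.shape, colors.dtype)  # (3, 3) uint8
```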

property swaprb: bool

Whether to swap the red and blue channels of the image prior to inference

property architecture: str | None

The architecture of the model

property softmax: bool

Whether to apply softmax after inference

property device: SupportedDevices | None

The device the model was built for

property output_layer_names: List[str] | None

The output layer names of the model

property hailo_quantize_input: bool | None

Whether to quantize the input of the Hailo model

property hailo_quantize_output: bool | None

Whether to quantize the output of the Hailo model

property hailo_input_format: str | None

The input format of the Hailo model

property hailo_output_format: str | None

The output format of the Hailo model

property dnn_support: bool

Whether DNN Engine supports the model

property dnn_cuda_support: bool

Whether DNN CUDA Engine supports the model

property tensor_rt_support: bool

Whether TensorRT Engine supports the model

property hailo_support: bool

Whether Hailo RT Engine supports the model

property qaic_support: bool

Whether QAIC RT Engine supports the model

property onnx_rt_support: bool

Whether ONNX RT Engine supports the model

property pytorch_support: bool

Whether PyTorch Engine supports the model
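
The *_support flags can drive engine selection at runtime; a minimal sketch, where the preference order and the plain flag dict are assumptions for illustration (a real application would rank engines by the hardware actually present):

```python
# Preference order is an assumption, roughly "accelerators before CPU".
ENGINE_PREFERENCE = [
    "tensor_rt_support", "hailo_support", "qaic_support",
    "dnn_cuda_support", "onnx_rt_support", "pytorch_support", "dnn_support",
]

def pick_engine(support_flags):
    """Return the first supported engine flag, or None if none match."""
    for flag in ENGINE_PREFERENCE:
        if support_flags.get(flag):
            return flag
    return None

flags = {"dnn_support": True, "dnn_cuda_support": False,
         "onnx_rt_support": True}
print(pick_engine(flags))  # prints onnx_rt_support
```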

property batch_size: int

The inference batch size of the model

property hub_repo: str | None

The Torch Hub repository

property hub_model: str | None

The Torch Hub model

property hub_pretrained: bool | None

Whether to load the pretrained Torch Hub model

property hub_force_reload: bool | None

Whether to force Torch Hub to reload the model