Classification
- class ClassificationPrediction(confidence, label)
A single prediction from Classification.
- Parameters:
confidence (float) – The confidence of the prediction.
label (str) – The label of the prediction.
- class ClassificationResults(predictions, duration, image)
All the results of classification from Classification. Predictions are stored in descending order of confidence.
- Parameters:
predictions (List[ClassificationPrediction]) – The list of predictions ordered by confidence descending.
duration (float) – The duration of the inference.
image (ndarray) – The image that the inference was performed on.
- property predictions: List[ClassificationPrediction]
The list of predictions.
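For reference, a minimal sketch of reading a ClassificationResults object. It assumes an image loaded with OpenCV; cv2 and the image path are placeholders and not part of edgeiq:

import cv2
import edgeiq

classifier = edgeiq.Classification('alwaysai/googlenet')
classifier.load(engine=edgeiq.Engine.DNN)

# 'image.jpg' is a placeholder path; classify_image() takes an image ndarray.
image = cv2.imread('image.jpg')
results = classifier.classify_image(image)

# Predictions are already sorted by descending confidence,
# so the first entry is the top prediction.
if results.predictions:
    top = results.predictions[0]
    print('Top label: {}, confidence: {:.2f}'.format(top.label, top.confidence))
print('Inference duration: {}'.format(results.duration))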
- class Classification(model_id, model_config=None)
Identify the most prominent object in an image.
Typical usage:
classifier = edgeiq.Classification('alwaysai/googlenet')
classifier.load(engine=edgeiq.Engine.DNN)
<get image>
results = classifier.classify_image(image)
for prediction in results.predictions:
    print('Label: {}, confidence: {}'.format(
        prediction.label, prediction.confidence))
- Parameters:
model_id (str) – The ID of the model you want to use for image classification.
- classify_image(image, confidence_level=0.3)
Identify the most prominent object in the specified image.
- Parameters:
image (ndarray) – The image on which to perform inference.
confidence_level (float) – The minimum confidence required for a prediction to be included in the results. Defaults to 0.3.
- Return type:
ClassificationResults
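A brief sketch of raising the confidence threshold; classifier is assumed to be a loaded Classification instance and image a previously obtained ndarray:

# Raise the threshold from the 0.3 default so only stronger
# predictions are returned; weaker ones are filtered out.
results = classifier.classify_image(image, confidence_level=0.6)
for prediction in results.predictions:
    print('Label: {}, confidence: {:.2f}'.format(
        prediction.label, prediction.confidence))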
- property accelerator: Accelerator | None
The accelerator being used.
- property colors: ndarray | None
The auto-generated colors for the loaded model.
Note: Initialized to None when the model doesn’t have any labels.
Note: To update, the new colors list must be the same length as the label list.
- property labels: List[str] | None
The labels for the loaded model.
Note: Initialized to None when the model doesn’t have any labels.
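A small sketch of inspecting labels and colors after load(). Assigning to colors is an assumption based on the note above that colors can be updated; numpy is used only to build the replacement array:

import numpy as np

if classifier.labels is not None:
    print('Model exposes {} labels'.format(len(classifier.labels)))

    # Hypothetical recoloring: the replacement must have one entry per label.
    new_colors = np.random.randint(0, 255,
                                   size=(len(classifier.labels), 3),
                                   dtype=np.uint8)
    classifier.colors = new_colors
else:
    # Models without labels leave labels (and colors) as None.
    print('No labels available; colors is {}'.format(classifier.colors))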
- load(engine=Engine.DNN, accelerator=Accelerator.DEFAULT)
Load the model to an engine and accelerator.
- Parameters:
engine (Engine) – The engine to load the model to.
accelerator (Accelerator) – The accelerator to load the model to.
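A short sketch that spells out the defaults from the signature above and then reads back the selected accelerator and model purpose; which other engine/accelerator combinations are available depends on the device and installed runtimes:

import edgeiq

classifier = edgeiq.Classification('alwaysai/googlenet')

# Explicitly pass the defaults shown in the load() signature above.
classifier.load(engine=edgeiq.Engine.DNN,
                accelerator=edgeiq.Accelerator.DEFAULT)

# After load() the instance reports what was actually selected.
print('Accelerator: {}'.format(classifier.accelerator))
print('Model purpose: {}'.format(classifier.model_purpose))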
- property model_config: ModelConfig
The configuration of the model that was loaded.
- property model_purpose: SupportedPurposes
The purpose of the model being used.
- publish_analytics(results, tag=None, **kwargs)
Publish results to the alwaysAI Analytics Service.
Example usage:
try:
    inference.publish_analytics(results, tag='custom_tag')
except edgeiq.PublishError as e:
    # Retry publish
    pass
except edgeiq.ConnectionError as e:
    # Save state and exit app to reconnect
    pass
- Parameters:
results – The results to publish to the Analytics Service.
tag – Optional data to attach to the published results.
- Raises:
ConnectionBlockedError – when using a connection to the alwaysAI Device Agent and resources are at capacity.
PacketRateError – when the publish rate exceeds the current limit.
PacketSizeError – when the packet size exceeds the current limit. Packet publish size and rate limits will be provided in the error message.
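A hedged sketch of handling the specific errors listed above. It assumes these exception classes are exposed on the edgeiq namespace (their exact import path is an assumption) and that classifier and results come from an earlier classify_image() call; the back-off strategies are placeholders:

import time

try:
    classifier.publish_analytics(results, tag='demo')
except edgeiq.PacketRateError as e:
    # Publishing too fast: the error message includes the current rate limit.
    print('Rate limited: {}'.format(e))
    time.sleep(1.0)
except edgeiq.PacketSizeError as e:
    # Packet too large: the error message includes the current size limit.
    print('Packet too large: {}'.format(e))
except edgeiq.ConnectionBlockedError as e:
    # Device Agent connection is at capacity; back off before retrying.
    print('Connection blocked: {}'.format(e))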