Model Plugins

The logger function is not yet available in v3 for Model plugins, and using it may cause issues. We will remove this notice once it is available again.

With a custom Model plugin, you can give users a quick way to run your model on their project assets.

When a user runs your plugin, you receive one of their assets, feed it to your model, and return the model's results as pre-annotations on that asset.

Creating a Model Plugin

Following the steps outlined in this section, create a new plugin from the UI, choosing "Model" as the plugin type.

In the "Class Names" section, enter the names of the classes used by your model.

For example, assume your model returns three classes: "person," "car," and "light." At a later stage, the user will be able to "link" (map) these classes to their own label categories representing "person," "car," and so on.

Then, create and run a Python script using the ModelPlugin class, found in our imerit-ango Python package under imerit_ango.plugins.

You will need to add the imerit-ango package to your Python environment by running:

pip install imerit-ango

Here is the class's documentation and an example:

ModelPlugin

Parameters:

  • id: string

    • The plugin's ID. You may obtain this ID from the plugin's information box in the Development section of the Plugin page.

  • secret: string

    • The plugin's secret. You can think of this as a private key you'll need to be able to connect your script to the plugin. You may obtain this secret from the plugin's information box in the Development section of the Plugin page.

  • callback: Callable

    • The callback function. This function will be run whenever a user asks for your model to be run through this plugin. It receives the parameters and returns the annotation JSON documented in the Callback Function section below.
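
For instance, here is a minimal sketch of wiring these three parameters together; the ID and secret values are placeholders you would copy from the plugin's information box:

from imerit_ango.plugins import ModelPlugin

def callback(**data):
    # Described in the "Callback Function" section below
    ...

plugin = ModelPlugin(id="YOUR_PLUGIN_ID", secret="YOUR_PLUGIN_SECRET", callback=callback)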

Callback Function

Parameters:

  • **data: dict

    • projectId: string

      • The ID of the project for which the plugin was run.

    • categorySchema: List[dict]

      • The label categories that were mapped to your model's classes when the plugin was run.

      • For example, if you had entered the classes "Vehicle" and "Person" in the "Class Names" section when creating the plugin, and both were mapped to existing labeling tools when running it, you would receive the following as input here (see the sketch after this parameter list for one way to use it):

      [{'schemaId': '797ea755f5693c0dc902558', 'modelClass': 'Vehicle'}, {'schemaId': '797ea755f5693c0dc902558', 'modelClass': 'Person'}]
    • assetId: str

      • The external ID of the asset to be sent to the model.

      • Example: my_external_id_1

    • apiKey: str

      • The API Key of the user running the plugin.

    • orgId: str

      • The Organization ID of the organization where the plugin is run.

    • runBy: str

      • The user ID of the user running the plugin.

    • session: str

    • logger: PluginLogger

      • Not yet available in v3 for Model plugins (see the notice at the top of this page).

    • batches: List[str]

    • configJSON: str

      • The config JSON your users will pass to you through the Config JSON text field when running the plugin. Warning: the JSON will be passed as a string, so you will have to parse it back into an object. Example code to obtain the original JSON as a Python object:

    import json

    def sample_callback(**data):
        # configJSON arrives as a string; parse it into a Python object
        config_str = data.get('configJSON')
        config = json.loads(config_str)
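
For categorySchema specifically, as referenced above, here is a minimal sketch of turning the list of mappings into a lookup table inside the callback; the class name "Vehicle" is taken from the example earlier in this list:

def sample_callback(**data):
    # Build a modelClass -> schemaId lookup from the mappings the user made
    category_schema = data.get('categorySchema', [])
    schema_by_class = {c['modelClass']: c['schemaId'] for c in category_schema}

    # schema_by_class.get('Vehicle') now holds the schemaId the user
    # mapped to your model's "Vehicle" class
    vehicle_schema_id = schema_by_class.get('Vehicle')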

Returns:

  • annotation_json: dict

    • The annotation JSON to apply to the asset as pre-annotations once the plugin finishes running.

    • Example:

{
  "data": "https://angohub-public-assets.s3.eu-central-1.amazonaws.com/bb26ccc1-3a9f-4e41-bb01-df285b0c5bd5.jpg",
  "answer": {
    "objects": [...],
    "classifications": [...],
    "relations": [...]
  }
}

Sample Python Script

In this script, for the sake of brevity, instead of running the asset through a model, we add a bounding box to it, "simulating" the model's results. We then upload these annotations to the user's asset as pre-labels with a "To-Do" status, using the SDK.

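Below is a minimal sketch of such a script, under a few assumptions: it uses the ModelPlugin parameters documented above, it assumes imerit_ango.plugins exposes a run helper to start the plugin (as in other plugin types), and the bounding-box keys in the annotation JSON are illustrative placeholders rather than the authoritative annotation format.

from imerit_ango.plugins import ModelPlugin, run

PLUGIN_ID = "YOUR_PLUGIN_ID"          # from the plugin's information box
PLUGIN_SECRET = "YOUR_PLUGIN_SECRET"  # from the plugin's information box

def callback(**data):
    asset_id = data.get("assetId")
    category_schema = data.get("categorySchema", [])

    # Map each model class to the schemaId the user linked it to
    schema_by_class = {c["modelClass"]: c["schemaId"] for c in category_schema}

    # Instead of running a real model, "simulate" its output with a single
    # bounding box (illustrative keys; consult the annotation format docs)
    objects = []
    if "Vehicle" in schema_by_class:
        objects.append({
            "schemaId": schema_by_class["Vehicle"],
            "bounding-box": {"x": 10, "y": 10, "width": 100, "height": 100}
        })

    # Return the annotation JSON in the shape of the Returns example above;
    # that example shows an asset URL under "data", so the asset ID here
    # is a stand-in
    return {
        "data": asset_id,
        "answer": {
            "objects": objects,
            "classifications": [],
            "relations": []
        }
    }

if __name__ == "__main__":
    plugin = ModelPlugin(id=PLUGIN_ID, secret=PLUGIN_SECRET, callback=callback)
    run(plugin)

As noted in the next section, the script must stay running (run blocks and listens for plugin runs) for users to be able to trigger the plugin.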

Running a Model using the Plugin

Your Python script needs to be running for users to be able to run your plugin.

From Settings

  1. Ensure the plugin is activated in your organization.

  2. From your project, go to Settings -> Plugins, then click Open on the plugin you wish to run. The Run Plugin dialog will pop up.

  3. Perform your class mappings, and provide the external ID of the asset you wish to annotate.

  4. Click on Run. Your code will be run on the asset you provided.

From the Labeling Editor

  1. Perform steps 1 and 2 from the instructions above, until you reach the Run Plugin dialog.

  2. Perform the class mappings for your project.

  3. From your project's Asset or Task lists, open a labeling task.

If you wish to run the plugin one time with different mappings, click the three dots to the right of the plugin's name, set your temporary mappings, and click Run.
