# OpenAPI Support

MLServer follows the Open Inference Protocol (previously known as the "V2 Protocol"). You can find the full OpenAPI spec for the Open Inference Protocol in the links below:

| Name                       | Description                                                                                          | OpenAPI Spec                                                            |
| -------------------------- | ---------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------- |
| Open Inference Protocol    | Main dataplane for inference, health and metadata                                                    | {download}`dataplane.json <../../openapi/dataplane.json>`               |
| Model Repository Extension | Extension to the protocol to provide a control plane which lets you load / unload models dynamically | {download}`model_repository.json <../../openapi/model_repository.json>` |

## Swagger UI

On top of the OpenAPI spec above, MLServer also autogenerates a Swagger UI which can be used to interact dynamically with the Open Inference Protocol.

The autogenerated Swagger UI can be accessed under the `/v2/docs` endpoint.

```{note}
Besides the Swagger UI, you can also access the _raw_ OpenAPI spec through the `/v2/docs/dataplane.json` endpoint.
```

![](../assets/swagger-ui.png)

## Model Swagger UI

Alongside the [general API documentation](#Swagger-UI), MLServer will also autogenerate a Swagger UI tailored to individual models, showing the endpoints available for each one.

The model-specific autogenerated Swagger UI can be accessed under the following endpoints:

- `/v2/models/{model_name}/docs`
- `/v2/models/{model_name}/versions/{model_version}/docs`

```{note}
Besides the Swagger UI, you can also access the model-specific _raw_ OpenAPI spec through the following endpoints:

- `/v2/models/{model_name}/docs/dataplane.json`
- `/v2/models/{model_name}/versions/{model_version}/docs/dataplane.json`
```

![](../assets/swagger-ui-model.png)
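
As a quick reference, the endpoint patterns above can be assembled programmatically before pointing a browser or HTTP client at them. The snippet below is a minimal sketch: the `http://localhost:8080` base URL reflects MLServer's default HTTP port, the `docs_url` helper is not part of MLServer itself, and the `mnist` model name and `v0.1.0` version are hypothetical examples.

```python
# Sketch: build the Swagger UI / raw OpenAPI spec URLs documented above.
# The base URL, model name, and version below are illustrative assumptions.
from typing import Optional
from urllib.parse import quote


def docs_url(
    base: str,
    model_name: Optional[str] = None,
    model_version: Optional[str] = None,
    raw: bool = False,
) -> str:
    """Return the Swagger UI URL, or the raw dataplane.json spec URL if raw=True."""
    path = "/v2"
    if model_name is not None:
        path += f"/models/{quote(model_name)}"
        if model_version is not None:
            path += f"/versions/{quote(model_version)}"
    path += "/docs"
    if raw:
        path += "/dataplane.json"
    return base.rstrip("/") + path


# Server-wide Swagger UI and raw spec:
print(docs_url("http://localhost:8080"))            # http://localhost:8080/v2/docs
print(docs_url("http://localhost:8080", raw=True))  # http://localhost:8080/v2/docs/dataplane.json

# Model-specific docs ("mnist" / "v0.1.0" are placeholder values):
print(docs_url("http://localhost:8080", "mnist"))
print(docs_url("http://localhost:8080", "mnist", "v0.1.0", raw=True))
```

The same URLs can then be fetched with any HTTP client (for example `curl` or `requests`) to retrieve the model-specific spec as JSON.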