update docs
julienkay committed Jan 21, 2024
1 parent b95d5c7 commit fde34a1
Showing 2 changed files with 11 additions and 2 deletions.
Binary file added docs/images/model_samples.webp
13 changes: 11 additions & 2 deletions docs/manual/models.md
# MiDaS Models

There are a variety of different MiDaS models available. To be able to use them with Unity Sentis, the official models were converted to ONNX using [this colab notebook][colab]. You'll find links to the pretrained models in ONNX format below or on the [GitHub Release page][gh_release].


## Overview
| [dpt_large_384][12] | 1.27 GB | 3.0 |


## Get The Models

To keep the package size reasonable, only the @Doji.AI.Depth.ModelType.midas_v21_small_256 model is included with the package when downloading from the Asset Store. To use other models, you have to download them first.

Otherwise you can always manually download the ONNX models from the links above.
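The manual download can also be scripted. The sketch below assumes the `v1.0.0` release URL pattern shown in the model links in this document; `model_url` and `download_model` are illustrative helper names, not part of the package:

```python
from pathlib import Path
from urllib.request import urlretrieve

# Base URL taken from the model links in this document (v1.0.0 release).
RELEASE_URL = "https://github.com/julienkay/com.doji.midas/releases/download/v1.0.0"

def model_url(name: str) -> str:
    """Build the download URL for a model name, e.g. 'midas_v21_small_256'."""
    return f"{RELEASE_URL}/{name}.onnx"

def download_model(name: str, target_dir: str = ".") -> Path:
    """Fetch the ONNX file into target_dir (requires network access)."""
    target = Path(target_dir) / f"{name}.onnx"
    urlretrieve(model_url(name), target)
    return target

if __name__ == "__main__":
    print(download_model("midas_v21_small_256"))
```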

## Which Model To Use

### Overview

You should choose the appropriate model type based on your requirements, considering factors such as accuracy, model size and performance.

The available models trade off memory usage and inference speed against the accuracy and quality of the depth estimation.
The [official MiDaS documentation][docs_official] does a fairly good job of showing how the models compare.

Generally, if you have a hard realtime requirement, e.g. when you want to do depth estimation at runtime or on mobile devices, you may want to use smaller models like midas_v21_small_256 or dpt_swin2_tiny_256. If you need the best quality (e.g. for editor tools), or depth estimation only needs to happen once, you might be able to use models like dpt_swin2_large_384 or dpt_beit_large_384. Keep in mind the larger memory requirements for these models, though.
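That rule of thumb can be written down as a tiny decision helper. This is purely illustrative — the function name and the fallback choice are assumptions based on the guidance above, not package API:

```python
def choose_model(realtime: bool, prefer_quality: bool = False) -> str:
    """Pick a MiDaS model name following the rule of thumb above:
    small models for realtime/mobile use, large models for one-off,
    quality-critical estimation."""
    if realtime:
        # Small models keep memory usage and inference time low.
        return "midas_v21_small_256"
    if prefer_quality:
        # Large models give the best quality at a much higher memory cost.
        return "dpt_beit_large_384"
    # An assumed middle-ground default; swap in any model from the table above.
    return "dpt_swin2_tiny_256"
```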

### Examples

The comparison below is only meant to give a rough overview of the capabilities of each model.
Judging the quality of a depth map from just a 2D image is hard. Consider using the WebcamSample that is included with the package to see how the different models perform when projecting the estimated depth back into 3D as a point cloud.
![samples](../images/model_samples.webp)
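The point-cloud projection idea can be sketched in a few lines. This is a generic pinhole-camera back-projection with assumed intrinsics (`fx`, `fy`, `cx`, `cy`), not the package's actual WebcamSample implementation; note also that MiDaS outputs relative inverse depth, so real code would have to rescale the prediction to metric depth first:

```python
import numpy as np

def backproject(depth: np.ndarray, fx: float, fy: float,
                cx: float, cy: float) -> np.ndarray:
    """Turn an HxW depth map into an (H*W, 3) point cloud via the pinhole model."""
    h, w = depth.shape
    # Pixel coordinate grids: u runs along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Example: a constant depth map back-projects to a flat plane at z = 1.
cloud = backproject(np.ones((4, 4)), fx=2.0, fy=2.0, cx=1.5, cy=1.5)
```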

[colab]: https://github.com/julienkay/com.doji.midas/blob/master/tools/MiDaS_ONNX_Export.ipynb
[gh_release]: https://github.com/julienkay/com.doji.midas/releases/tag/v1.0.0
[docs_official]: https://github.com/isl-org/MiDaS
[1]: https://github.com/julienkay/com.doji.midas/releases/download/v1.0.0/midas_v21_small_256.onnx
[2]: https://github.com/julienkay/com.doji.midas/releases/download/v1.0.0/midas_v21_384.onnx
