From 1db4d8215146affb8112863b3910e361881d5516 Mon Sep 17 00:00:00 2001
From: Lihe Yang
Date: Fri, 26 Jan 2024 16:12:20 +0800
Subject: [PATCH] Update README.md

---
 README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index a6de0ec..165c8f0 100644
--- a/README.md
+++ b/README.md
@@ -19,7 +19,7 @@ This work presents Depth Anything, a highly practical solution for robust monocu
 
 ## News
 
-* **2024-01-25:** Support [video depth visualization](./run_video.py).
+* **2024-01-25:** Support [video depth visualization](./run_video.py). Our [online demo](https://huggingface.co/spaces/LiheYoung/Depth-Anything) also supports video input.
 * **2024-01-23:** The new ControlNet based on Depth Anything is integrated into [ControlNet WebUI](https://github.com/Mikubill/sd-webui-controlnet) and [ComfyUI's ControlNet](https://github.com/Fannovel16/comfyui_controlnet_aux).
 * **2024-01-23:** Depth Anything [ONNX](https://github.com/fabio-sim/Depth-Anything-ONNX) and [TensorRT](https://github.com/spacewalk01/depth-anything-tensorrt) versions are supported.
 * **2024-01-22:** Paper, project page, code, models, and demo ([HuggingFace](https://huggingface.co/spaces/LiheYoung/Depth-Anything), [OpenXLab](https://openxlab.org.cn/apps/detail/yyfan/depth_anything)) are released.
@@ -38,7 +38,7 @@ This work presents Depth Anything, a highly practical solution for robust monocu
 
 - **Better depth-conditioned ControlNet**
 
-  We re-train **a better depth-conditioned ControlNet** based on Depth Anything. It offers more precise synthesis than the previous MiDaS-based ControlNet. Please refer [here](./controlnet/) for details. You can also use our new ControlNet based on Depth Anything in [ControlNet WebUI](https://github.com/Mikubill/sd-webui-controlnet).
+  We re-train **a better depth-conditioned ControlNet** based on Depth Anything. It offers more precise synthesis than the previous MiDaS-based ControlNet. Please refer [here](./controlnet/) for details. You can also use our new ControlNet based on Depth Anything in [ControlNet WebUI](https://github.com/Mikubill/sd-webui-controlnet) or [ComfyUI's ControlNet](https://github.com/Fannovel16/comfyui_controlnet_aux).
 
 - **Downstream high-level scene understanding**
 
@@ -176,7 +176,7 @@ depth = depth_anything(image)
 ```
 
-### Do not want to define image pre-processing and download our model definition files?
+### Do not want to define image pre-processing or download model definition files?
 
 Easily use Depth Anything through ``transformers``! Please refer to [these instructions](https://huggingface.co/LiheYoung/depth-anything-small-hf) (credit to [@niels](https://huggingface.co/nielsr)).
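
For convenience, here is a minimal sketch of the `transformers` route referenced in the last hunk. It assumes the Hugging Face `depth-estimation` pipeline and the `LiheYoung/depth-anything-small-hf` checkpoint linked above; `example.jpg` is a placeholder input path.

```python
from transformers import pipeline
from PIL import Image

# Load Depth Anything (small) through the Hugging Face depth-estimation pipeline;
# image pre-processing and the model definition are handled by transformers.
pipe = pipeline(task="depth-estimation", model="LiheYoung/depth-anything-small-hf")

# "example.jpg" is a placeholder path; any RGB image works.
image = Image.open("example.jpg")

result = pipe(image)
depth = result["depth"]  # PIL.Image holding the predicted depth map
depth.save("example_depth.png")
```

The pipeline output also includes a `predicted_depth` tensor for downstream processing, while the `depth` image is ready to save or display without manual resizing or normalization.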