How to use sign/sd-controlnet-mediapipe with Diffusers:
```shell
pip install -U diffusers transformers accelerate
```

```python
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# Load the ControlNet weights and attach them to a Stable Diffusion v1.5 pipeline.
controlnet = ControlNetModel.from_pretrained("sign/sd-controlnet-mediapipe")
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet
)
```
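A fuller inference sketch, assuming the conditioning image is a rendered MediaPipe landmark frame. `make_pose_canvas` is a hypothetical placeholder that only produces a blank canvas of the right size; in practice you would draw the detected landmarks onto it. The prompt mirrors the examples below.

```python
from PIL import Image


def make_pose_canvas(width=512, height=512):
    """Hypothetical placeholder for a MediaPipe landmark rendering.

    In real use, draw the detected hand/pose landmarks onto this canvas
    and pass the result to the pipeline as the conditioning image.
    """
    return Image.new("RGB", (width, height), color="black")


if __name__ == "__main__":
    # Heavy imports are kept here so the helper above stays lightweight.
    import torch
    from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

    controlnet = ControlNetModel.from_pretrained("sign/sd-controlnet-mediapipe")
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", controlnet=controlnet
    )
    pipe = pipe.to("cuda" if torch.cuda.is_available() else "cpu")

    # The `image` argument is the ControlNet conditioning input.
    result = pipe(
        "a person performing sign language in front of a green screen",
        image=make_pose_canvas(),
        num_inference_steps=20,
    )
    result.images[0].save("output.png")
```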
controlnet-sign/sd-controlnet-mediapipe
These are ControlNet weights trained on runwayml/stable-diffusion-v1-5 with a new type of conditioning (MediaPipe landmarks).
You can find some example images below.
- prompt: Maayan Gazuli performing sign language in front of a green screen.
- prompt: Maayan Gazuli performing sign language in front of a green screen.
- prompt: Barack Obama performing sign language in front of a green screen.

Base model: runwayml/stable-diffusion-v1-5