🤗 Diffusers provides a Stable Diffusion pipeline compatible with ONNX Runtime. This allows you to run Stable Diffusion on any hardware that supports ONNX, including CPUs and machines where an accelerated version of PyTorch is not available.
The snippet below demonstrates how to use the ONNX runtime. You need to use OnnxStableDiffusionPipeline instead of StableDiffusionPipeline. You also need to download the weights from the onnx branch of the repository, and indicate the runtime provider you want to use.
# make sure you're logged in with `huggingface-cli login`
from diffusers import OnnxStableDiffusionPipeline

pipe = OnnxStableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    revision="onnx",
    provider="CUDAExecutionProvider",
)

prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]
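If no GPU is available, you can run the same pipeline on the CPU by selecting ONNX Runtime's CPU execution provider. This is a minimal sketch of that variant; it only swaps the provider argument, everything else is unchanged from the snippet above:

from diffusers import OnnxStableDiffusionPipeline

# Use ONNX Runtime's CPU execution provider instead of CUDA.
pipe = OnnxStableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    revision="onnx",
    provider="CPUExecutionProvider",
)

prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt).images[0]

Expect noticeably longer generation times on CPU; the trade-off is that no accelerated PyTorch or CUDA installation is required.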
The snippet below demonstrates how to use the ONNX runtime with the Stable Diffusion upscaling pipeline.
import torch

from diffusers import OnnxStableDiffusionPipeline, OnnxStableDiffusionUpscalePipeline

prompt = "a photo of an astronaut riding a horse on mars"
steps = 50

# Generate a low-resolution image with the text-to-image pipeline.
txt2img = OnnxStableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    revision="onnx",
    provider="CUDAExecutionProvider",
)
small_image = txt2img(
    prompt,
    num_inference_steps=steps,
).images[0]

# Upscale the result with the x4 upscaling pipeline.
generator = torch.manual_seed(0)
upscale = OnnxStableDiffusionUpscalePipeline.from_pretrained(
    "ssube/stable-diffusion-x4-upscaler-onnx",
    provider="CUDAExecutionProvider",
)
large_image = upscale(
    prompt,
    small_image,
    generator=generator,
    num_inference_steps=steps,
).images[0]
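Both pipelines return standard PIL images in their .images output, so the intermediate and upscaled results can be saved directly. A short follow-up sketch, with arbitrary filenames:

# Save both results to disk for comparison (filenames are illustrative).
small_image.save("astronaut_small.png")
large_image.save("astronaut_upscaled.png")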