Example of the idea... we start with a picture of a 'mountain', then we 'outpaint' by 126 pixels around the edge and resize back to the original image size. We then outpaint again by another 126 pixels... and repeat. You can see this for 8 frames using the stable diffusion 1.0 model.
Zooming with Outpainting
In this little article, we'll have fun with the 'stable diffusion' (sd) library, using it to create an infinite zoom effect - from nothing more than a single starting image (and a description)!
If you've not heard of 'stable diffusion' - it's one of those AI libraries that allow you to replace or modify images using a trained neural network (when I say 'trained'... I mean trained on millions of images). In this article, we're going to use this ability to 'extend' an image repeatedly.
The feature of 'extending' (or uncropping) a picture is known as 'outpainting' - it's termed this because the model is 'painting' (or repainting) parts of the image. It's a really cool tool - and very customizable - you can tell the model which areas to fill in and which areas to leave unchanged - for our zooming example... we want the model to paint the edges of the image (not the middle)... we define this using an image mask.
Take a look at the following example: we start with a small picture of a 'head' - we then create a mask for the outer edge (the area to be painted) - this is what we want our model to fill in - and we get the final image on the right. To prevent the generated images from being crazy - we also add in a 'prompt' to help guide the model - in the example image below... we specify an office-like scene.
Outpainting an image - taking an image and a mask - the mask helps the stable diffusion model know what to replace and what to leave alone.
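If you're wondering what such a mask actually looks like in code - it's just a black and white image. Here's a minimal sketch of one way to build a border mask with Pillow (the 512x512 size and 126-pixel border are just placeholder values I've picked for illustration):

```python
from PIL import Image, ImageDraw

# Border mask for outpainting: white = region the model repaints,
# black = region left untouched. (Sizes here are just example values.)
size, border = 512, 126

mask = Image.new("L", (size, size), 255)            # start fully white
draw = ImageDraw.Draw(mask)
draw.rectangle(
    [border, border, size - border - 1, size - border - 1],
    fill=0,                                         # black centre = keep as-is
)
mask.save("border_mask.png")
```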
You might be thinking - this is old news - stable diffusion has been around for ages... what does this have to do with zooming?
Well, we can take this concept further - and keep applying the 'outpainting' effect repeatedly... outpainting more and more and more - to create the illusion that we're moving outwards - away from the target.
Setting up (Sandpit)
This isn't a theory article - it's a hands-on... 'how-to' approach. First thing, let's set up a sandpit so we can install any Python libraries or tools we need. I'd recommend using Anaconda - so you can have multiple projects and library versions without them interfering with one another.
For this implementation, we're going to generate the outpainted images repeatedly in a loop - the frames will then be used to create a 'video' - so you can see the zoom effect over multiple frames.
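Before we get to the setup, here's a rough sketch of what that loop might look like using Hugging Face's diffusers library (the model id, prompt, 126-pixel border and 8-frame count are all assumptions on my part - swap in whichever inpainting checkpoint and settings you prefer):

```python
from PIL import Image, ImageDraw
from diffusers import StableDiffusionInpaintPipeline

# Assumed values: working size, border width and frame count - tweak to taste.
SIZE, BORDER, FRAMES = 512, 126, 8
prompt = "a vast mountain landscape, highly detailed"

pipe = StableDiffusionInpaintPipeline.from_pretrained("runwayml/stable-diffusion-inpainting")
pipe = pipe.to("cpu")  # or "cuda" if you have a supported GPU

# Border mask: white = repaint, black = keep.
mask = Image.new("L", (SIZE, SIZE), 255)
ImageDraw.Draw(mask).rectangle(
    [BORDER, BORDER, SIZE - BORDER - 1, SIZE - BORDER - 1], fill=0
)

image = Image.open("mountain.png").convert("RGB").resize((SIZE, SIZE))
for frame in range(FRAMES):
    # Shrink the current image and paste it into the centre of a fresh canvas,
    # leaving a BORDER-wide ring for the model to fill in.
    inner = image.resize((SIZE - 2 * BORDER, SIZE - 2 * BORDER))
    canvas = Image.new("RGB", (SIZE, SIZE))
    canvas.paste(inner, (BORDER, BORDER))

    image = pipe(prompt=prompt, image=canvas, mask_image=mask).images[0]
    image.save(f"frame_{frame:02d}.png")
```

Stitch the saved frames together (with something like ffmpeg) and you get the zooming-out effect shown at the top of the article.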
The different outpainting (or inpainting) models perform differently as you'll see - the quality and 'smoothness' of the generated images vary.
To set up a working sandpit in Anaconda with all the libraries, we'll define an environment.yml below. I've done a 'cpu' version by default - and commented out at the bottom are a few extra lines... so you can try this out even if you don't have a high-spec GPU.
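Something along these lines should do the trick (the environment name, package versions and channels are just what I'd reach for - adjust them to whatever is current):

```yaml
name: sd-zoom
channels:
  - pytorch
  - conda-forge
dependencies:
  - python=3.10
  - pytorch
  - torchvision
  - cpuonly              # cpu-only pytorch build (default)
  - pip
  - pip:
      - diffusers
      - transformers
      - accelerate
      - pillow
  # --- gpu support: remove 'cpuonly' above and uncomment below ---
  # - pytorch-cuda=11.8   # (also add the 'nvidia' channel)
```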
You can test the zooming out effect on your local computer - even if you don't have CUDA support.
Just to give you an idea of how easy it is to set up and run an outpainting algorithm - here's a minimal working example - it can be used as a test to make sure everything is set up and working. It just 'outpaints' a single image and saves it to a file.
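Again, this is just a sketch using the diffusers inpainting pipeline - the input file name ('head.png'), prompt and model id are placeholders, so substitute your own:

```python
from PIL import Image, ImageDraw
from diffusers import StableDiffusionInpaintPipeline

# Load an inpainting-capable checkpoint (this model id is just one commonly
# used option - swap in whichever inpainting model you prefer).
pipe = StableDiffusionInpaintPipeline.from_pretrained("runwayml/stable-diffusion-inpainting")
pipe = pipe.to("cpu")  # or "cuda" if available

# Input image scaled down and centred on a 512x512 canvas, leaving a border to fill.
image = Image.open("head.png").convert("RGB").resize((260, 260))
canvas = Image.new("RGB", (512, 512))
canvas.paste(image, (126, 126))

# Mask: white = repaint (the border), black = keep (the centre).
mask = Image.new("L", (512, 512), 255)
ImageDraw.Draw(mask).rectangle([126, 126, 385, 385], fill=0)

result = pipe(
    prompt="a cluttered office, desks, monitors, warm lighting",
    image=canvas,
    mask_image=mask,
).images[0]
result.save("outpainted.png")
```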