Are you tired of spending countless hours creating UI icons for your games using traditional methods? Are you looking for a faster, more efficient way to generate stunning icons? Look no further than AI!

Creating a set of icons from scratch can be time-consuming. That's why I turned to Stable Diffusion, which allowed me to generate a complete icon pack in just one hour. The best part is that I didn't use the traditional pipeline of vector art or sketch + 3D. In fact, I didn't even use Photoshop for these icons.

Brief Overview of Stable Diffusion

As an example of Stable Diffusion's capabilities, I used the network to generate cute cat images. The results are impressive, as you can see from the interface screenshot.

Stable Diffusion is an open-source neural network that has gained popularity for its ability to generate images from nothing but a prompt (a text description). One of its main advantages over other networks is the AUTOMATIC1111 web UI, which offers advanced image-to-image tools such as ControlNet and background removal.
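
If you prefer to script generations instead of using the web UI, the snippet below is a minimal prompt-to-image sketch with the Hugging Face diffusers library. The model id, prompt, and output file name are illustrative assumptions, not the exact settings used for the icons in this article.

```python
# Minimal prompt-to-image sketch with Hugging Face diffusers
# (an alternative to the AUTOMATIC1111 web UI).
# Model id, prompt, and output path are illustrative assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe("a cute cat, digital illustration", num_inference_steps=25).images[0]
image.save("cute_cat.png")
```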

Another advantage of Stable Diffusion is that it can be trained to generate images that match a specific style. While this is not covered in this article, it's worth noting that this feature is handy for artists who want to maintain consistency throughout their work.

In this article, I used DreamShaper, a fine-tuned version of Stable Diffusion 1.5 that works well as a baseline for many games.
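
If you script your generations, swapping in a community fine-tune like DreamShaper is usually just a matter of pointing the pipeline at a different checkpoint. The sketch below assumes the model is published on the Hugging Face Hub under an id such as Lykon/DreamShaper; verify the actual id before relying on it.

```python
# Loading a fine-tuned checkpoint instead of base Stable Diffusion 1.5.
# "Lykon/DreamShaper" is an assumed Hub id; check where the model is hosted.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "Lykon/DreamShaper", torch_dtype=torch.float16
).to("cuda")
```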

Generation Process

This section will explore the whole pipeline by generating an envelope icon.

Depth mask

To start, I used a stock photo as a reference for the position of the envelope in the scene and generated a depth mask from it with ControlNet. In the web UI, this means uploading the image, selecting "depth" as both the preprocessor and the model, and clicking "Preview annotator result" to generate the depth mask. The first run takes about a minute because the model has to load; after that, it finishes in seconds.
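
For those working outside the web UI, the sketch below reproduces roughly the same step with Hugging Face diffusers and transformers: estimate a depth map from the reference photo, then condition generation on it with a depth ControlNet. The file names, prompt, and step count are assumptions for illustration; the checkpoint ids are common public ones, not necessarily the exact models behind the web UI's "depth" preset.

```python
# A rough programmatic equivalent of the ControlNet "depth" step:
# 1) estimate a depth map from a reference photo, 2) use it as conditioning.
# File names, prompt, and step count are illustrative assumptions.
import torch
from PIL import Image
from transformers import pipeline
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

reference = Image.open("envelope_reference.jpg").resize((512, 512))  # hypothetical stock photo

# Depth estimation (the web UI's "Preview annotator result" produces a similar map).
depth_estimator = pipeline("depth-estimation")
depth_map = depth_estimator(reference)["depth"].convert("RGB")

# Generation conditioned on the depth map via a depth ControlNet.
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # or a DreamShaper checkpoint, as above
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

icon = pipe(
    "game UI icon of a paper envelope, clean render, soft lighting",
    image=depth_map,
    num_inference_steps=25,
).images[0]
icon.save("envelope_icon.png")
```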
