Bloomhell Burning Shot Process
Aug 21, 2025

Pedro Paulo
VFX/Tech Artist at Dardo Studios
For the FPS section of the Bloomhell project, two of the VFX needed very stylized, somewhat 2D-looking explosions that still worked in three-dimensional space. I had to make sure I didn't end up with an explosion that looked too much like a billboard, nor one that read as an obvious spherical mesh. For this brief breakdown, I will focus on the Burning Shot's explosion.

The first step for this effect was the explosion itself. The central piece of the VFX needed to keep the 2D-like shapes of the concept within a sphere mesh, so that it worked properly in 3D space. It also had to break the silhouette of the sphere in a way that looked seamless, as if it were all a single element. After testing some possibilities, I landed on a solution: map the visuals of the sphere and its related FX into screen space. This would allow different VFX to blend together perfectly, and would help maintain the more 2D look of the textures themselves.
I applied a very simple screen space UV to test out the idea, and although it had potential, the flaws and issues with it were obvious:
- The UV was centered on the screen, while I wanted it always centered on the object while still mapped in screen space
- The UV stayed at a fixed scale regardless of proximity to the mesh, so it broke visually if you got too close or too far away
- The UV could not be scaled easily
- The UV had strange scaling issues depending on the current game/editor window size
Video showing how the UV behaved when I first started; the VFX is at the end.
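As a rough sketch of that first naive attempt, the mapping amounted to nothing more than dividing pixel position by view size (Python here as pseudocode for the node graph; the function name is mine, not Unreal's):

```python
def naive_screen_uv(pixel_xy, view_size):
    # plain ScreenPosition-style UV: 0..1 from the top-left corner,
    # with no knowledge of the object, its distance, or the aspect ratio
    return (pixel_xy[0] / view_size[0], pixel_xy[1] / view_size[1])
```

Every problem in the list above falls out of this: the UV is anchored to the screen corner, never to the object, and carries the window's proportions with it.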
To fix these issues, a couple of additions to the UV calculation had to be made. I also moved the work into a Material Function so I could reuse it elsewhere.
The first step was to apply a couple of adjustments to our ScreenPosition node. First, using a ConstantBiasScale node (bias -0.5, scale 2), I offset the UV so that it was centered on the screen instead of starting at the top left.
I then generated an 'aspect ratio' value from the ViewSize node and used it to adjust the generated UVs, removing any issues caused by different screen or window sizes.
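The two adjustments can be sketched in Python standing in for the node graph (a ConstantBiasScale with bias -0.5 and scale 2, then an aspect correction from ViewSize; which axis gets the aspect factor is my assumption):

```python
def centered_aspect_uv(screen_uv, view_size):
    # ConstantBiasScale(bias=-0.5, scale=2): remaps 0..1 screen UVs
    # to -1..1, so (0, 0) sits at the center of the screen
    u = (screen_uv[0] - 0.5) * 2.0
    v = (screen_uv[1] - 0.5) * 2.0
    # aspect ratio from the ViewSize node; multiplying one axis by it
    # keeps the mapping square regardless of window shape
    aspect = view_size[0] / view_size[1]
    return (u * aspect, v)
```

With this, a circle drawn in these UVs stays a circle whether the window is 16:9, ultrawide, or a squashed editor viewport.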

In parallel, I computed a separate value to be combined with the UVs later: a scaling value. First, the distance from the object to the camera is used to make sure the textures are not distorted when you get too close to or too far from them. This might look different from the usual distance calculation, but since we are converting things to view space, the B/Z channel of the position acts as a 'forward' or a 'distance to origin' by itself.
I also scaled this value by the radius of the object, so the material can be applied to objects whose scale changes without breaking. I used an input scalar so that I can fine-tune the scaling parameters as needed.
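Those factors combine into a single multiplier, roughly like this (Python pseudocode; the exact arrangement of the divides is my assumption):

```python
def uv_scale(view_space_z, object_radius, scale_param=1.0):
    # In view space, the B/Z channel of the object's position already
    # measures how far along the camera's forward axis it sits, so it
    # can stand in for "distance to camera" with no length() needed.
    distance = view_space_z
    # distance / radius ties the texture's on-screen size to the
    # object's apparent size: halve the distance or double the radius
    # and the UVs scale to match, so nothing distorts as you move.
    return (distance / object_radius) * scale_param
```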

And finally, I took the object's position and converted it to clip space, using it to 'offset' the center of the UVs. This ensures the mapping is always centered on the object's position within screen space.
All of this is joined, scaling and offsetting the UVs into the desired output UVs that can be used in any materials which require this sort of mapping.
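Put together, the whole material function does something along these lines (a sketch only; whether the aspect factor also applies to the object's clip-space offset is my assumption):

```python
def object_locked_screen_uv(screen_uv, view_size, object_clip_xy, scale):
    # center the screen UVs and correct for aspect ratio
    aspect = view_size[0] / view_size[1]
    u = (screen_uv[0] - 0.5) * 2.0 * aspect
    v = (screen_uv[1] - 0.5) * 2.0
    # the object's clip-space XY is already in -1..1, so subtracting it
    # slides the UV origin onto the object's screen position
    u -= object_clip_xy[0] * aspect
    v -= object_clip_xy[1]
    # apply the distance/radius scaling so the texture tracks the
    # object's apparent size
    return (u * scale, v * scale)
```

The output is a UV set that is flat like a billboard but pinned to the object, which is exactly the mix the effect needed.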
After this, with the UVs in order, I could move on to the visuals themselves, which would be relatively simple now that the screen space issue was out of the way. However, they still posed some interesting challenges to overcome.

I knew that I wanted polar-coordinate UVs for the Burning Shot, but I ran into an issue when combining them with the screen space UVs: a mysterious vertical line appeared, some sort of seam in the tiling.
After considerable time searching, I found that the issue was related to Unreal's mipmapping system, which has a tendency to cause problems when using polar coordinates; the screen space UVs seemed to make it worse. The fix, rather esoterically, was to manually compute the DDX and DDY of the UVs and plug them into the texture sample with its mip mode set to explicit derivatives. This allowed the mipmaps to use the right UV gradients and not create these seams.
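The cause is easy to demonstrate outside the engine: the angle channel of polar coordinates wraps from 1 back to 0 across one line of pixels, so the hardware's screen-space derivatives explode there, the sampler drops to its lowest mip, and a seam is drawn. A small Python sketch of the jump (pseudocode for the node math):

```python
import math

def polar_uv(u, v):
    # centered UVs -> (angle in 0..1, radius)
    angle = math.atan2(v, u) / (2.0 * math.pi) + 0.5
    radius = math.hypot(u, v)
    return (angle, radius)

# two samples one "pixel" apart, straddling the wrap line
a0, _ = polar_uv(-1.0,  0.001)   # just above the seam, angle ~1.0
a1, _ = polar_uv(-1.0, -0.001)   # just below the seam, angle ~0.0
# (a1 - a0) is what the hardware derivative sees: a jump of ~1.0
# instead of ~0.0003, which is why feeding DDX/DDY computed from the
# smooth pre-polar UVs into the texture sample removes the seam
```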

With that out of the way, I had the base 0-1 values for my texture; now I needed to map them to colors and dissolve them. For the colors, it was a simple matter of plugging the values into the UV input of a color ramp texture, mapping the texture's left-to-right positions to the 0-1 values, like a gradient map filter in Photoshop.
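The ramp lookup behaves like this (Python, with a small list standing in for the ramp texture; point sampling is assumed):

```python
def gradient_map(value, ramp):
    # use the grayscale 0-1 value as a U coordinate into a 1D color
    # ramp, exactly like Photoshop's gradient map filter: dark input
    # picks colors from the left of the ramp, bright from the right
    index = min(int(value * len(ramp)), len(ramp) - 1)
    return ramp[index]

# a three-entry stand-in for the ramp texture: black -> orange -> white
ramp = [(0, 0, 0), (255, 96, 0), (255, 255, 255)]
```

Swapping the ramp texture recolors the whole effect without touching the grayscale source, which is what makes the method so reusable.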

At the same time, I took the 0-1 values of the texture and two different values based on the particle alpha to create two 'moments' of dissolving. The first is how the object 'appears', and the second is how it 'disappears', as they are not the same. This was done with two parallel sequences of smoothstep-based dissolves that are joined at the end.
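A minimal sketch of the two branches (Python; the thresholds, softness, and the multiply used to join them are my assumptions, not the shipped values):

```python
def smoothstep(edge0, edge1, x):
    # standard HLSL-style smoothstep
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def two_stage_dissolve(tex, appear_edge, erode_edge, soft=0.05):
    # 'appear' branch: a rising threshold reveals pixels whose 0-1
    # texture value it has passed, so the shape grows in
    appear = smoothstep(tex - soft, tex + soft, appear_edge)
    # 'disappear' branch: a second, differently-timed threshold eats
    # the shape away again, giving the exit its own character
    erode = smoothstep(tex - soft, tex + soft, erode_edge)
    # join the branches: visible only where appeared and not yet eroded
    return appear * (1.0 - erode)
```

Driving `appear_edge` and `erode_edge` from two different curves on the particle alpha is what gives the entrance and exit their distinct looks.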

With that, the base material was complete. I used this not only for the main core of the explosion, which was composed of two parts, but also for additional shapes around the outside of the core, allowing the effect to have smooth and seamless breaks in the silhouette. An alternate version of this material was also used for decals projected onto objects the VFX touches.
Now that I had the core, I could build the effect around it and add the relevant details, with the base material acting as a strong foundation for it all. This specific material was made for the "burning shot", but the screen-locked UV function was reused for the "punk grenade" as well, and the gradient map method was used for many other things.
Overall, in any sort of project there are always several challenges that connect to each other and end up being useful for other effects. Even just understanding these few solutions is inevitably useful in the future, especially with things like the weird mipmap bug; that's the sort of issue we VFX artists deal with on a regular basis.
See the effect in action: Bloomhell, a VFX reel by Dardo Studios.