How to use Path-Tracing in 360/ 180 Degree Renders in Unreal Engine

How to get the best quality 360 and 180 Degree renders from Unreal Engine using Path-Tracing


Overview

The Path Tracer continues to be improved with each new Unreal Engine version, so for the best results we recommend using the latest Engine release.

Path-tracing is recommended for more advanced users: it requires much more manual configuration than Lumen, is harder to previsualise, and so takes much longer to achieve a perfect final image.

Unreal Engine’s Path Tracer generates images by calculating the actual physics of light rays bouncing through your scene, rather than approximating pixels with intelligent algorithms as Lumen does:

  • It is useful for rendering images that match physical light in a scene, particularly scenes that require complex refraction, translucency, shadows and light stability.

  • It can also be used to double-check your Lumen output (which approximates the rendered content algorithmically) so you can adjust Lumen settings to be closer to natural light if necessary.

Path-traced renders take much longer than Lumen (often more than 10x) because of the time taken to calculate the light rays, so path tracing is generally reserved for content like Architectural Visualisation, where its specific capabilities are required.

Getting the best image results requires combining settings in a Post Process Volume, Anti-Aliasing and Denoiser. These work differently than in Lumen renders:

  • Your Post Process Volume is used to set up which elements of your scene will be path-traced and your settings for how your light-rays will behave.

  • Anti-aliasing determines the actual number of light rays you will use for each pixel.

  • The Denoiser is a predictive algorithm used to clean up the final image created by your light rays, removing excessively bright spots (fireflies) and smoothing over pixels for which you don’t have enough light rays to create precise color information.
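The role of samples can be sketched with a toy Monte Carlo estimator (a simplified stand-in for illustration only, not Unreal’s actual renderer): averaging more noisy per-ray results converges on the true pixel value, with noise shrinking roughly as 1/√N.

```python
import random

def estimate_pixel(samples, true_value=0.5, seed=42):
    """Estimate a pixel's brightness by averaging noisy light-ray samples."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        # each 'ray' returns the true value plus random noise
        total += true_value + rng.uniform(-0.5, 0.5)
    return total / samples

for n in (16, 256, 4096):
    error = abs(estimate_pixel(n) - 0.5)
    print(f"{n:>5} samples -> error {error:.4f}")
```

This is why a denoiser pays off: it reconstructs the missing information instead of forcing you to keep doubling the sample count.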

Editor Set-Up

Required Settings

  1. In Project Settings

    1. Search RHI and select DirectX12.

    2. Search Ray Tracing and enable Path Tracing and Support Hardware Ray Tracing.

    3. You will need to restart Unreal if these are not already selected.

  2. Go to Edit > Plugins, search Denoise and ensure these plugins are ticked. Each is used in different content scenarios, which are explained below:

  3. In Editor, go to View Modes and select Path Tracing:

  4. You will see a progress bar in your Viewport as the light samples calculate:

  5. When the progress bar finishes, you will see a final path-traced image in your Viewport. Each time you move your camera, this will have to recalculate:
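For reference, the Project Settings from step 1 are stored in Config/DefaultEngine.ini. If you prefer editing the file directly, the entries look roughly like the following; the exact key names can vary between engine versions, so treat this as an assumption to verify against a project where you have set them through the GUI:

```
[/Script/WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12

[/Script/Engine.RendererSettings]
r.RayTracing=True
r.PathTracing=True
```

As in the GUI flow, the Editor needs a restart after these change.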

Skylight, Fog, Atmosphere

There are different options for how to control these elements, depending on your needs/ preferences, which you can read about in the Unreal documentation here.

Post Process Volume

If you are only using the OWL 360 Camera/ Component (and not the Viewport) for previz then you can change the Path-Tracing settings below in the 360 Camera Details panel.

We recommend using a Post Process Volume because it means the Viewport and the 360 Camera look the same, which makes content creation easier:

  1. Either add a new Post Process Volume from Place Actors, or search for your existing one in Outliner.

  2. In its Details panel search infinite and tick Infinite Extent (Unbound)

  3. Search Exposure, change Metering Mode to Manual and change the Exposure Compensation to the same value as your 360 Camera/ Component (by default this is set to 10):

    1. This will make your Viewport preview look the same as your 360 camera.

    2. This is also important for your final render because if you have auto-exposure enabled the Viewport can look different to Movie Render Queue.

  4. Now search for path-tracing in the Details panel to change your settings which will let you preview your output in Viewport and the OWL 360 Camera/ Component.

  5. All these settings will pass through to Movie Render Queue apart from Samples Per Pixel, which is equivalent to spatial samples and will be overridden by your anti-aliasing settings in MRQ:

    1. Max Bounces: This determines how many bounces each light ray makes before ending its journey.

      1. 32 is already a high value.

      2. You can increase this number if you need more light in dimly lit areas.

    2. Samples Per Pixel: This determines how many light rays are fired into each pixel; the more samples you add, the more detailed (less grainy) your image will be.

      1. This value is only used in your Editor preview; in Movie Render Queue you set samples in your anti-aliasing settings and this value is ignored.

      2. This value is equivalent to the spatial sample value you will set in Movie Render Queue and so is a very effective preview for that.

      3. However, it includes no temporal samples (because you are only looking at a single frame in your preview).

      4. 2048 is already a high value.

      5. Each doubling of samples will roughly double your render time.

      6. You can use the denoiser to smooth over images instead of adding more samples to reduce your render times.

    3. Maximum Path Intensity: This sets the maximum brightness value of each light-ray which prevents a ‘firefly’ effect where some pixels become bright white dots.

      1. Ensure that your Exposure is set to your desired final value in the 360 Camera/ Component before adjusting this intensity setting because it will affect the visibility of any ‘fireflies’.

      2. Also ensure that you have the Denoiser you will use selected, because this will remove ‘fireflies’ automatically.

      3. Once you have those two selected, you can modify this value depending on your visual output (use the 360 Camera/ Component Render Target to preview).

        1. A value of 24 to 64 is generally recommended.

        2. If you reduce this value too low then it will dull your image (because no pixel is allowed to be too bright).

        3. If the value is set too high, you will get a white ‘static’ effect in your output.

        4. The maximum value should be just below your Max EV100.

      4. If you are using the NFOR denoiser, you need to adjust this setting more proactively because it doesn’t remove fireflies automatically.

    4. Emissive Materials: When ticked, this makes any emissive materials in your scene emit light rays that will be calculated by the path tracer.

      1. In principle, this sounds great because it will create accurate light around screens, panels, glowing objects etc.

      2. However, it is very prone to create ‘fireflies’ due to the difficulty of calculating such intricate light rays and so needs to be managed with Samples Per Pixel, Maximum Path Intensity and your denoiser to balance render times and a high quality visual output.

      3. It’s often better to ‘fake’ the emission by adding a Rect Light and keeping the emissive material at low intensity, rather than using a high-intensity emissive material, which can require very high samples per pixel.

    5. Reference Depth of Field: This makes the path tracer use a simulated lens with a virtual diameter instead of a single point of light emission.

      1. In a 360 camera, the depth of field is uniform from all points, but the lens simulation here results in a higher quality output.

      2. However, it generally requires more samples and so will increase render times.

    6. Reference Atmosphere: This tells the Path Tracer to physically calculate the sun and sky rather than use your Sky Light.

      1. Any Sky Light component is ignored when this setting is ticked. 

      2. This setting makes your path-tracing calculations very slow so it’s to be used with care but it may be necessary for sunset/ sunrise shots.

    7. Denoiser: This is a predictive algorithm that cleans up the image that has been created from your light rays, removing fireflies (bright spots) and smoothing over pixels for which you don’t have enough color information.

      1. Unless you have an extremely high number of Samples Per Pixel this should be enabled.

      2. You can select between different denoisers using console variables (explained below).
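The effect of Maximum Path Intensity can be seen in a small toy example (hypothetical numbers; the real clamp operates on per-ray radiance inside the renderer): one extreme ray can blow out an otherwise well-behaved pixel, a moderate clamp tames it, and an over-aggressive clamp dulls everything.

```python
def clamp_and_average(ray_values, max_intensity):
    """Clamp each per-ray brightness (as Max Path Intensity does),
    then average the rays into the final pixel value."""
    return sum(min(v, max_intensity) for v in ray_values) / len(ray_values)

# 63 ordinary rays plus one extreme outlier -- a would-be 'firefly'
rays = [1.0] * 63 + [5000.0]

print(clamp_and_average(rays, float("inf")))  # 79.109375: one ray blows out the pixel
print(clamp_and_average(rays, 32))            # 1.484375: outlier tamed, detail kept
print(clamp_and_average(rays, 0.5))           # 0.5: clamp too low, whole pixel dulled
```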

Denoiser

Denoisers are predictive algorithms used in path-tracing to reduce render times and increase image quality:

  • In an ideal world you could use a near-infinite number of light rays to create your final image, but this is impractical because your render time might be months for a single frame, so you are forced to use a lower sample count (fewer light rays).

  • Denoisers reduce the number of light rays needed (and therefore your render times) to get a high quality image by using a variety of predictive algorithmic methods to reconstruct pixels for which there isn’t enough initial light information.

  • Different denoisers have different specialisations and so should be used in Unreal for different purposes (fast previsualisation, static rendering, moving image rendering etc.)

Unreal Denoiser Options

Unreal has the option of four different denoisers, each of which has different pros and cons:

  • NVIDIA OptiX: This is an NVIDIA accelerated denoiser which runs on the GPU (NVIDIA only):

    • It can be useful for pre-visualisation because it can be faster than the other options.

    • It can smudge textures and so normally isn’t as good as NNE for final renders.

    • It is normally less effective than NFOR for moving renders.

    • It is less capable of predicting pixels in 360/ 180 projection formats than NFOR.

  • Intel OIDN: This is an Intel created denoiser which runs on the CPU (Intel and AMD compatible):

    • It is implemented in Unreal as NNE (below) to run on the GPU, which is recommended in most cases because it renders much faster.

    • However, if you lack VRAM capacity and don’t mind much longer rendering times then it can be a useful fallback.

    • It normally produces a better visual output than Optix.

    • It is normally less effective than NFOR for moving renders.

    • It is less capable of predicting pixels in 360/ 180 projection formats than NFOR.

  • NNE (Neural Network Engine): This is an implementation of OIDN in Unreal which runs on the GPU (NVIDIA and AMD compatible):

    • It is much faster than OIDN and so is recommended unless you have VRAM constraints.

    • It is normally the best denoiser to use for static images.

    • It is normally less effective than NFOR for moving renders.

    • It is less capable of predicting pixels in 360/ 180 projection formats than NFOR.

    • It can be augmented/ modified in your Plugins settings with custom training models.

  • NFOR (Nonlinearly Weighted First-Order Regression): This is Epic’s own denoiser (developed from Disney research) intended for moving images:

    • It looks at past and future frames to create temporal data on pixels.

    • It is optimised for 360 degree content.

    • It is VRAM intensive and relatively slow because it requires a high number of temporal samples.

    • It can’t remove fireflies, so they need to be controlled via Max Path Intensity.
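The principle the temporal denoisers exploit can be illustrated crudely (this is an illustration of frame blending only, not NFOR’s actual regression): for a static region, neighbouring frames carry independent noise, so blending a window of frames, such as 2 behind and 2 ahead, cancels much of it.

```python
import random

def noisy_frame(true_value, rng):
    """One single-frame path-traced estimate of a static pixel."""
    return true_value + rng.uniform(-0.2, 0.2)

rng = random.Random(7)
single_errors, blended_errors = [], []
for _ in range(200):
    window = [noisy_frame(0.5, rng) for _ in range(5)]  # 2 behind, current, 2 ahead
    single_errors.append(abs(window[2] - 0.5))
    blended_errors.append(abs(sum(window) / 5 - 0.5))

mean_single = sum(single_errors) / len(single_errors)
mean_blended = sum(blended_errors) / len(blended_errors)
print(f"mean error, single frame:  {mean_single:.3f}")
print(f"mean error, 5-frame blend: {mean_blended:.3f}")
```

Real temporal denoisers must also account for motion between frames, which is part of why they are VRAM intensive and need many temporal samples.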

Selecting Denoisers in Unreal

If you have all the denoisers above ticked in your Plugins then you can use Console Variables to select between them in Editor and Movie Render Queue:

  • To switch between temporal (moving shots) and spatial (static shots) denoisers:

    • Spatial denoising: Input r.PathTracing.SpatialDenoiser.Type 0 (this is the default).

      • If this is selected, the default denoiser will be NNEDenoiser.

    • Temporal denoising: Input r.PathTracing.SpatialDenoiser.Type 1.

      • If this is selected, the default denoiser will be NFOR.

  • Then, to switch between the individual denoisers use the command

    • r.PathTracing.Denoiser.Name (spatial) or

    • r.PathTracing.TemporalDenoiser.Name (temporal)

  • Followed by:

    • NNEDenoiser

    • OptiX

    • OIDN

    • NFOR

  • Example: r.PathTracing.TemporalDenoiser.Name NFOR

  • More Console Variables can be added to refine the function of the denoisers as explained below.

360/180 Degree Preview

The OWL 360 Camera Actor/ Component is integrated with Unreal’s Path Tracer pipeline allowing you to see your Projection Type output in its Preview and Render Target:

  1. In the Details panel of the OWL 360 Camera/ Component check that Path Tracing is ticked:

  2. In the Render Target (unpaused) you will see the path-tracer loading across the cube faces:

  3. Eventually you will see your final image load:

  4. You can use the Snapshot function in the details panel to:

    1. Capture and save shots at different angles; or

    2. Capture the same shot with different settings to manage denoising or light rays.

Movie Render Queue Set-Up

OWL 180/ 360 Settings

  1. To enable Path-Tracing in the OWL 180/360 Render Pass please tick the box in settings: 

  2. Reference Motion Blur: This determines whether temporal samples are taken before or after the denoiser is run. For moving shots it’s highly recommended to have this ticked:

    1. When ticked, any post-process motion blur is disabled and the path-tracer runs all spatial and temporal samples and then sends them to the denoiser.

      1. The benefit of this is that you get real cinematic motion blur, which emerges from averaging all the images taken through temporal samples.

      2. The downside is that any lack of detail and/ or excess brightness in the spatial samples can multiply into the temporal samples creating errors in the denoiser.

      3. To manage this, you need to set a large number of temporal samples, because this naturally results in cleaner pixels (so you depend less on the denoiser).

      4. The generally recommended setting is 2048 temporal and 1 spatial samples (you will take a spatial sample at each temporal point so you don’t need more than 1).

      5. This can be very VRAM heavy because all samples have to be held in memory before being sent to the denoiser.

    2. When unticked, spatial samples are taken and then denoised and then temporal samples are taken.

      1. This means that the pixel data from the spatial samples gets cleaned up by the denoiser before it’s used for the temporal samples which removes the natural motion blur.

      2. You should only use this if ticking Reference Motion Blur gives you artefacts (such as problems with particles) that can’t be resolved by adding more temporal samples.

Anti-Aliasing

Anti-aliasing settings in the path tracer are simpler than in Lumen renders because they just determine the sampling of light rays for each pixel.

The algorithmic work that is done by the anti-aliasing methods in Lumen instead happens in the denoisers in path-tracing.

  • You need to add the anti-aliasing section in Movie Render Queue settings where you can control individual values (these settings will not be taken from your post process volume):

  • Spatial samples determine the number of light rays that are fired at each pixel.

    • The more light that reaches each part of a pixel, the more information there is and therefore the more precise the detail.

  • Temporal samples create multiple versions of each pixel at sub-frame time periods into which the number of spatial sample light rays you have selected are then fired.

    • The more of these different sub-frame versions you have the more accurately your image will reflect the movement of light rays through your scene over time.

  • Override Anti Aliasing: This should be set to None (you are using the path-tracer render pipeline).

  • Use Camera Cut for Warm Up: This should be ticked and you need to use your Camera Cut to add frames before your render, but you need far fewer frames (single digits) than in Lumen.

  • Render Warm Up Frames: This should be ticked (so you actually render and discard the warm up frames) to ensure image stability.

  • Advanced: These settings should both be used. In path-tracer they are very important because they allow simulation of important visual elements without having to render every warm up frame fully, which would take far too long:

    • Render Warm Up Count: This sets the number of GPU cycles which your warm-up will run through before you start your render and so would be used to get the denoiser ready.

    • Engine Warm Up Count: This sets the number of CPU cycles which your warm-up will run through before you start your render and so would be used to get Niagara or physics moving before your render begins.
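The difference between the two sample types can be sketched with a toy one-dimensional frame (an illustration only, using a hypothetical pixel row): a bright dot crosses the frame during the shutter interval, and averaging temporal sub-frame snapshots is what turns that motion into blur.

```python
def render_moving_dot(num_pixels, temporal_samples):
    """Average sub-frame snapshots of a dot crossing the frame
    during the shutter interval."""
    frame = [0.0] * num_pixels
    for t in range(temporal_samples):
        # where the dot sits at this sub-frame moment
        position = (t * num_pixels) // temporal_samples
        frame[position] += 1.0 / temporal_samples
    return frame

print(render_moving_dot(8, temporal_samples=1))
# [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] -- frozen dot, no blur
print(render_moving_dot(8, temporal_samples=8))
# 0.125 in every pixel -- the dot smears into a smooth motion-blur streak
```

Spatial samples are then fired within each of these sub-frames, so the total ray budget per pixel is roughly spatial samples x temporal samples.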

Warm-Ups

Warm-ups are essential for getting a high quality render, especially if you have moving images, complex lighting or physics based effects.

In path-tracing you need to use warm-ups differently than in Lumen because it takes so long to render a single frame:

  • You should still tick Use Camera Cut for Warm Up but only increase your camera cut in front of your first rendered frame by a single digit number of frames.

    • This ensures that textures and assets specific to the camera frustum are loaded before the render begins.

    • If you are rendering a static shot then only 1 frame should be fine.

    • If you are rendering a moving shot, then you need at least the number of frames set in r.NFOR.FrameCount and may need more for a perfect first rendered frame.

  • You should tick Render Warm Up Frames so that the pre-render frames in your camera cut actually render, which is required to give the denoiser sufficient data.

    • You need to combine this with a console variable to ensure that not all warm-up frames are fully rendered otherwise your warm-up period could be hours.

      • MovieRenderPipeline.NumWarmUpFramesWithTemporalSampling sets the number of frames for which you will actually take temporal samples using the path-tracer.

        • You can set this as low as 1 and then increase if moving shots require it.

      • MovieRenderPipeline.NumWarmUpFramesWithSpatialSampling sets the number of frames for which you will actually take spatial samples using the path-tracer.

        • 1 will almost always be enough for static shots.

  • Render Warm Up Count is an important measure for static shots but is very time consuming and so should normally be set to a low value, often 1.

    • It triggers a set of processes on the GPU which primes lighting, textures and geometry before the first rendered frame begins.

    • This is a very expensive measure because it has to run the path-tracer for each frame.

    • It is only used for static renders and shouldn’t be used with the NFOR denoiser.

  • Engine Warm Up Count is an essential measure that should be set to a relatively high value (90 - 120).

    • It ensures that particles and physics based objects have a history of movement before your first rendered frame begins without requiring you to physically render all the previous frames.

    • It should be used for both static and moving shots.
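To see why capping the warm-up console variables matters, here is a hypothetical back-of-envelope cost estimate (the function name and all numbers are illustrative assumptions, not engine measurements): only the warm-up frames covered by the MovieRenderPipeline.NumWarmUpFrames* variables pay the full path-tracing cost, while the rest are cheap passes.

```python
def warmup_cost_seconds(camera_cut_frames, fully_sampled_frames,
                        full_frame_seconds, cheap_frame_seconds=1.0):
    """Rough estimate: fully sampled warm-up frames pay the full
    path-tracing cost; the remaining camera-cut frames are cheap."""
    cheap_frames = max(camera_cut_frames - fully_sampled_frames, 0)
    return (fully_sampled_frames * full_frame_seconds
            + cheap_frames * cheap_frame_seconds)

# 4-frame camera cut, only 1 fully sampled frame at 300 s/frame:
print(warmup_cost_seconds(4, 1, 300.0))  # 303.0 s
# versus fully rendering every warm-up frame:
print(warmup_cost_seconds(4, 4, 300.0))  # 1200.0 s
```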

Console Variables

You need to use console variables when using path tracing to refine your settings because they are not all shown in the Movie Render Queue or Editor GUI.

You add these in the console variables settings in Movie Render Queue using the + button for each new variable:

General Recommendations

  • r.PathTracing.MaxBounces 8

    • Setting it to 8-16 speeds up renders significantly with almost no visual loss. If you aren’t getting the lighting you need in darker corners then increase this number.

  • r.PathTracing.FireflySuppression 1

    • Eliminates single white pixels.

  • r.PathTracing.FilterWidth 1.5 to 2.0

    • 1.5 is sharper; 2.0 is smoother.

  • r.Streaming.FullyLoadUsedTextures 1

    • Forces the engine to wait until every 16K texture is in VRAM before firing rays. Very important for high resolution renders.

  • r.PathTracing.RayBias 0.01

    • Fixes weird black dots on curved geometry.

  • r.PathTracing.VisibleLights 1

    • Ensures that lights actually cast light and are visible in the path-traced reflections.

  • r.D3D12.GPUTimeout 0

    • Stops the GPU timing out if a frame takes too long. This might lead to the machine being frozen for a while but normally the frame will get rendered. Very important for high resolution renders.

  • r.PathTracing.GPUStackSize 8000

    • If you have very complex materials (glass inside glass), increasing this prevents "black spots" where the light ray gives up.
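Collected in one place for convenience, the general recommendations above, as you would enter them one per line with the + button in the MRQ Console Variables section (FilterWidth shown at the sharper end of its 1.5 to 2.0 range):

```
r.PathTracing.MaxBounces 8
r.PathTracing.FireflySuppression 1
r.PathTracing.FilterWidth 1.5
r.Streaming.FullyLoadUsedTextures 1
r.PathTracing.RayBias 0.01
r.PathTracing.VisibleLights 1
r.D3D12.GPUTimeout 0
r.PathTracing.GPUStackSize 8000
```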

Suggested Settings

In most renders the following combination of anti-aliasing samples and denoiser will be the most effective.

These settings are in addition to the general console variables recommended above.

Moving Shots:

Moving shots have more straightforward settings recommended by Epic:

  • Use NFOR as denoiser and in console variables set:

    • r.PathTracing.SpatialDenoiser.Type 1 (temporal denoiser is selected).

    • r.PathTracing.TemporalDenoiser.Name NFOR (NFOR is selected).

    • r.NFOR.FrameCount 2 (it will reference 2 frames behind, 2 in front and the current frame to create samples).

    • r.NFOR.Temporal.NumSamples 8: This determines the number of samples that the denoiser takes for each pixel from surrounding pixels/ frames.

      • For high speed or high resolution content you may need to increase the value to 16.

  • Spatial samples = 1, Temporal samples = 2048.

  • Reference Motion Blur is ticked.

  • Use Camera Cut for Warm Up is ticked with 2 pre-render frames in the camera cut.

    • The number of frames has to be at least the value set in r.NFOR.FrameCount.

    • If your first frames don’t capture the motion as you need, make your pre-render camera cut longer.

  • Render Warm Up Frames unticked.

  • Render Warm Up Count: 0

  • Engine Warm Up Count: 120

  • MovieRenderPipeline.NumWarmUpFramesWithTemporalSampling 2

  • MovieRenderPipeline.NumWarmUpFramesWithSpatialSampling 2

Static Shots:

Static shots can be approached in different ways and can often still benefit from using temporal samples/ denoiser and camera cuts so that the position of moving elements like particles, foliage and lights looks more natural in the shot.

Approach 1: All Static:

This approach uses the static denoiser with only spatial samples, but still has warm-up/ camera cut. This will be very VRAM heavy.

  • Use NNE as denoiser and in console variables set:

    • r.PathTracing.SpatialDenoiser.Type 0 (spatial denoiser is selected).

    • r.PathTracing.Denoiser.Name NNEDenoiser (NNE is selected).

  • Try spatial samples of 1024 and see if the result is what you need; if not, increase to 2048.

  • Temporal samples = 1

  • Reference Motion Blur is unticked.

  • Use Camera Cut for Warm Up is ticked with 1 pre-render frame in the camera cut.

  • Render Warm Up Frames unticked.

  • Render Warm Up Count: 1

  • Engine Warm Up Count: 200

  • MovieRenderPipeline.NumWarmUpFramesWithSpatialSampling 1

Approach 2: Static and Temporal:

This approach uses a combination of spatial and temporal samples with the temporal denoiser to better capture the position of moving elements:

  • Use NFOR as denoiser and in console variables set:

    • r.PathTracing.SpatialDenoiser.Type 1 (temporal denoiser is selected).

    • r.PathTracing.TemporalDenoiser.Name NFOR (NFOR is selected).

    • r.NFOR.FrameCount 1 (it will reference 1 frame behind, 1 in front and the current frame to create samples).

  • Spatial samples = 32, Temporal samples = 32

  • Reference Motion Blur is ticked.

  • Use Camera Cut for Warm Up is ticked with 1 pre-render frame in the camera cut.

  • Render Warm Up Frames unticked.

  • Render Warm Up Count: 0

  • Engine Warm Up Count: 120

  • MovieRenderPipeline.NumWarmUpFramesWithTemporalSampling 1

  • MovieRenderPipeline.NumWarmUpFramesWithSpatialSampling 1

Trouble-Shooting

Console Variables

Epic says "A lot of people don’t realize that everything in the “Game Overrides” section in MRQ is having an effect on your render even if you don’t have it active in the GUI."

Epic recommends:

  1. Remove all Console Variable overrides unless you have a specific reason you are including them.

  2. If you see any large low-frequency noise, that's from Denoisers. You have to find the right Denoiser (Ambient Occlusion, Global Illumination, Reflections, etc.) and shut it off.

  3. If you disable all the options in the Game Overrides section, aside from Flush Grass Streaming, that will get you as close as possible to what you are seeing in the Editor Viewport (assuming you don’t have any CVar overrides in MRQ).

  4. The geometry LOD settings here will NOT override anything Nanite. They are only for non-nanite actors.