OWL 360/ 180 for Movie Render Queue

You can now render 360- and 180-degree video directly from Unreal Engine Movie Render Queue at ultra-high resolutions (32K+), with precise color accuracy and professional compositing capabilities.


Overview

The OWL 180/ 360 Render Pass for Unreal Engine Movie Render Queue is a custom render pass that lets you render a wide range of different projections at ultra-high resolutions, using the extensive features and capabilities of Movie Render Queue.

It is primarily used for rendering content such as:

  • Ultra high resolution domes and planetaria.

  • Immersive experiences for venues and attractions.

  • Entertainment and educational content for VR Headsets (Oculus, Vision Pro etc.)

  • Previsualisation for architecture and digital twins.

  • Backplates and driving plates for virtual production.

  • 180 and 360 degree film making.

Basic Set-Up

  1. Ensure that Movie Render Queue is enabled in your Unreal project by going to Edit > Plugins and searching for Movie Render:

  2. You need an OWL 360 Camera or Capture Component in your scene, added to a sequence in Unreal Sequencer and selected in Camera Cuts:

  3. Click on the three dots next to the Clapperboard Icon and make sure Movie Render Queue is set as the default rendering option:

  4. With the Movie Render Queue window open, click Unsaved Config:

  5. Press +Setting and select OWL 180/ 360 Rendering:

  6. Add the settings for your render (Deferred Rendering can be switched off):

  7. You should set the Output Directory, Frame Rate and other details as normal in the Output section.

  8. You can add other +Settings such as Anti-Aliasing, Color Output, Export Formats etc. as you would for a normal render.

  9. Once you have added your Settings, click Accept and you can initiate your Render:

Advanced Rendering

Rendering from the Command Line

More advanced users can run an instance of Unreal Engine from the command line, which lets the render run remotely without opening the Unreal Editor. This can save (V)RAM and speed up renders:

  • You need a render job already saved in Movie Render Queue.

  • The Output Directory of the render is whatever directory you have saved in Movie Render Queue.

  • After running the command in the Windows command shell, an instance of Unreal Engine will open, and the render queue will be executed.

How to Initiate the Render

Use a command in Windows PowerShell to execute a render job:

"C:\Program Files\Epic Games\UE_5.6\Engine\Binaries\Win64\UnrealEditor-Cmd.exe" D:\UEProjects\MRQCommandLine\MRQCommandLine.uproject Minimal_Default1 -game -MoviePipelineConfig="/Game/Cinematics/myRenderQueue" -renderoffscreen -Log -StdOut -allowStdOutLogVerbosity -Unattended

Here is a breakdown of what each part of this command means and how to customize it for a render:

  • \path\to\your\UE5\installation\Engine\Binaries\Win64\UnrealEditor-Cmd.exe:

    • The first entry in the command line defines where the installation of the Unreal Engine is on the rendering computer.

    • Note: The reason this is in quotes (") is that Windows does not handle the space in the Program Files path elegantly. Any path with a space will need to be encapsulated in quotes so Windows does not interpret it as a parameter we are passing in.

  • \path\to\your\project\project.uproject:

    • Next, we pass the path to the .uproject so the engine knows what project to load. Notice the backslashes ("\") because this example is on Windows.

    • Other operating systems may need a forward slash ("/").

  • Map_To_Load:

    • Defines the Map to load when loading the project.

Here is a list of the parameters:

  • -renderoffscreen:

    • This tells the engine to initialize the GPU and render the frames as requested, but it suppresses the creation of any desktop windows.

    • This is the standard "headless" mode for rendering and is significantly more resource-efficient than running with a preview window.

    • The alternatives to this would be:

      • -windowed which sets the game to run in windowed mode.

      • -fullscreen which will take over the entire display, not normally desired for remote renders.

  • -game:

    • Launches the editor using uncooked content, allowing us to generate the render with the Editor's content.

  • -log:

    • When used as a switch (-log), opens a separate window to display the contents of the log in real-time.

    • When used as a setting (LOG=filename.log), tells the engine to use the log filename of the string that immediately follows.

    • If you are working in an environment where the log must be captured to a file, this is where that path is defined.

  • -StdOut:

    • Passing this parameter after -Log tells the engine to output the log details to StdOut, a buffer that many rendering systems consume to capture details of the job.

    • This helps customize job validation in the render manager of your choice.

  • -allowStdOutLogVerbosity:

    • This parameter enables the StdOut log to be verbose and give more details.

  • -Unattended:

    • This informs Unreal Engine that it is being run from the command line and there is no active user.

    • It is possible to hook into this mode with your blueprints using the isUnattended node.

  • -MoviePipelineConfig="/Game/path/to/the/myRenderQueue":

    • The MoviePipelineConfig parameter informs the rendering system to operate on the specific Movie Pipeline Queue, saved from the Movie Render Queue.

    • This path points to the name of the Movie Render Queue; in the example above, it is /Game/Cinematics/myRenderQueue.

Deadline and Remote Rendering

If you are using our tiling solution for ultra-high-resolution renders, we recommend allocating the rendering of the tiles and the stitch job for a given range of frames to a single machine, to limit issues that can arise from reading and writing very high-resolution files across different disks.

  • The OWL 180/ 360 Render Pass for Movie Render Queue generates standard render jobs in Unreal.

  • These can be allocated to different machines using remote rendering solutions like Deadline.

Rendering Hooks

  • This is a PRO feature which allows you to use Hooks to gain low-level access to sections of the OWL rendering pipeline.

  • You can also use this feature to add a burn-in to your renders (see below)

C++ Header File

See the full C++ header file OWLMRQ360Hooks.h in the Source/OWLMRQPipeline/Public/OWL360Camera folder

Blueprints

While this class is blueprintable, only a subset of the methods are available in Blueprints because of Unreal's restriction that Blueprints execute only on the game thread:

  • Setup_GameThread - called while the pipeline is being initialised.

    • This is useful for setting up any objects or variables required in your class later.

  • TearDown_GameThread - called when the movie pipeline is being destroyed.

    • For cleaning up variables / data.

  • PreRenderCamera_GameThread - Called on the game thread for each camera found.

    • Be aware that cameras are sometimes not rendered (e.g. when Fade is 0), so it's important to check the bSkipRendering flags.

    • Also note that when using OWL Compositing outputs, this hook will be called for every camera; however, the PostComposite_TaskThread hook will only be called once for the composited output.

    • You can check the setting OWLPass->Settings.Output.bCompositeCameras for this.

  • ModifyTilePostProcess_GameThread - called for each tile of each face on each camera.

    • This exposes a mutable Post Process Settings struct which can be modified, for example by adding a custom Blend Material with face- or tile-specific material parameters.

C++

In C++ you can override all of the above methods using the _Implementation suffix. Additionally, the following methods are available:

  • PostStitch_TaskThread - this is run concurrently on different task threads and enables modification of the raw pixel data.

  • PostComposite_TaskThread - run concurrently on different task threads after stitching and any compositing or adjustment due to Fade values.

  • PostTileRender_RenderThread - this is executed on the render thread after each tile / face is rendered with an FTextureRHIRef Tile that can be used in custom C++ shaders etc.

    • This happens before the surface is read

Adding passes / Submitting extra layers to output

It is trivial to add extra passes to the render. These passes are pure C++ and can perform calculations or append layers.

  • To submit a pass call one of the three AddRenderPass_... methods below from inside your class:

    • AddRenderPass_8Bit

    • AddRenderPass_16Bit

    • AddRenderPass_32Bit

  • All three methods take a PassIdentifier (an Unreal struct used for EXR/output file labelling) and a callback with the following signature:

bool YourCallback32Bit(const FMoviePipelineRenderPassMetrics& SampleState, const FIntPoint& OutputResolution, TArray64<FLinearColor>& OutData) { return true; }

Rendering from Packaged Projects

You need a 360 Camera Machine license on the machine running the app for this to render without a watermark.

Once you have set up a 360 Render in Unreal Engine, you can trigger the render from inside packaged projects/ games.

  • You need to call the Movie Pipeline Queue Engine Subsystem with a trigger such as a key press.

  • The Allocate Job node is where you select the Sequence you would like to render (this can be hard-coded or you can create an option to choose).

  • The Set Configuration node is where you select the Settings you have saved in Movie Render Queue.

  • The Render Job node will initiate your render.

Previsualisation

  • You can use the OWL 360 Camera/ Component to pre-vis your renders because it color-matches your Movie Render Queue output, while also rendering in real-time.

  • So instead of waiting for hours/ days to see what your render will look like, you can preview it in real-time on a screen or headset to get your shots perfectly as you want them before initiating your final pixel high-resolution output.

  • This is particularly useful for thinking about audience details like the proximity of objects to the camera, scale of your scene to the viewer and the pace of the camera through your levels.

  • Once you have previewed your sequences, you can then render short frame ranges at your target resolution to:

    • Gauge how the final encoding settings, color pipeline settings and frame rates feel in the final output format.

    • Get an idea of how long the full render will take by scaling the custom range's render time up to the full frame range.

    • Establish your pipeline with any post-processing software like Premiere Pro or DaVinci Resolve, considering Bitrate, Frame Rate and Color Conversion, so you know all your required settings before initiating your full render in Unreal.

Previewing in a VR Headset

  • You can live-stream the Render Target of the camera through our Spout/ NDI Senders to either a headset or venue previsualisation software.

  • If you are outputting VR content then you can use the VR.NDIUntethered app in Oculus to receive a live NDI feed direct to the headset (with stereoscopic support).

    • With DLSS enabled, it should be possible to have 4K resolution live output at 20-30FPS which is very effective for previs.

Venue Digital Twin

  • If you are making content for a Dome/ immersive venue, you can use the DomeViewer software in either Oculus or Windows to preview your content in a digital twin of your venue (either bird's-eye or as an audience member):

  • You can also build a digital twin of your venue in Unreal and use the Media Input Wizard to set up an Unreal Media Plate with the correct Mesh of your venue’s screen.

    • You can then stream the 360 Camera/ Component output into your digital twin via the OWL Spout/ NDI Senders and it will play with perfect color accuracy in your virtual venue.

Performance

Resolution

  • Unreal has a ceiling of about 8K×8K, depending on your GPU, before VRAM crashes become likely.

  • For higher resolutions (12K, 16K, 32K etc.), you will need to use our High-Resolution Renders tiling solution (see the section below).

  • If you are using Post Process Materials or Stencil Clip Layers, each additional output counts as another render that needs to be held in memory so can have a large VRAM impact.

  • For maximum capacity, close all resources that are using VRAM (Viewport, Render Targets) and then close and re-open Unreal.

    • This will flush your VRAM and significantly increase your render speed and resolution capacity.

  • Disable Parallel Rendering (PRO license only, see Features below) renders each frame sequentially, reducing the amount of data that needs to be held in memory; this can increase the maximum resolution possible on a single machine (at the cost of slower rendering).

  • If you are using Lumen, you can use upsampling algorithms (TSR/ DLSS) to render at higher resolutions, but these can cause visual artefacts.

  • You can also optimise your level using the guide here.

Speed

  • The OWL 180/ 360 Rendering pass is very fast and should be at least as fast as, if not faster than, Unreal’s panoramic renderer, with much higher-quality visual results.

  • Render speed is closely tied to your VRAM capacity, so you should follow the tips in the section above to flush your VRAM by closing pixel-rendering assets and then reopening Unreal before initiating your renders.

  • You can achieve similar performance using headless rendering via the command line as explained in the Advanced Rendering section above.

  • High resolution rendering using tiling isn’t necessarily slower than rendering to a single texture due to the lower VRAM usage in rendering each tile.

  • If you have a more powerful CPU than GPU then tiling may be faster due to the CPU efficiency in stitching the tiles and the GPU headroom from each tile requiring less VRAM to render.

  • In Lumen renders, optimising speed means getting the right balance of level optimisation (to reduce VRAM usage), anti-aliasing samples and upscaling/ downscaling.

  • Adding unnecessary anti-aliasing samples can massively bloat render times, see our guide on how to avoid this.

  • In path-traced renders, optimising speed is about balancing your light samples and your denoiser, trying to get the lowest number of samples that the denoiser can clear up to get the final image quality you need. See our guide on how to manage this.

  • In addition, certain path-traced lighting systems like Reference Atmosphere can have a strong visual impact but also a massive impact on render times, so they need to be chosen carefully.

  • Having more VRAM can give you more capacity but won’t necessarily increase your speed. For example, a 5090 will normally render faster than an RTX 6000 due to other GPU optimisations for games content.

  • A suggested local machine would be: NVIDIA 5090 GPU, AMD 9950X3D CPU, 96GB DDR5 AMD EXPO-compatible RAM >5600MHz (2 DIMM sticks), plus other components with comparable performance.

Image Quality

  • You can achieve an exceptional image quality with the OWL 360 Camera and Movie Render Queue without needing much configuration.

  • By default your colors should match your Viewport, although you will need to check your Bloom setting as by default this is set to 0 on the 360 Camera to avoid seams.

  • Otherwise, the methods to achieve cinematic-quality imagery are similar to normal Unreal renders.

  • For the best visual output, you will likely need to modify anti-aliasing settings and follow our guidance on post-process effects (see Trouble-Shooting below or our full guides).

  • You can try DLAA in place of TSR, particularly for optimising motion blur.

  • Supersampling (TSR plus r.screenpercentage console variable) is a very effective way of increasing detail and clarity in your renders.

  • Lumen renders can sometimes have seams, due to differences in light between the cube-faces of the camera, but in almost all cases these can be managed using the tips in Trouble-Shooting below.

Lumen vs. Path-Tracing

  • The OWL 360 Degree Camera supports both Lumen and Path-Tracing render pipelines.

  • Lumen is the default Global Illumination method for Unreal Engine and is highly recommended.

  • It renders significantly faster (>10× per frame) than Path-Tracing and can be combined with Upsampling to increase this even further.

  • Path-Tracing doesn’t offer any advantages over other CPU-based renderers like Arnold, V-Ray or Corona, but can be very useful in renders where lighting, reflections, refractions or shadows are very important (particularly architectural stills).

Output Formats

  • For professional content creation and color grading, we always recommend the .exr file format with at least 16-bit color depth and Disable Tone Curve ticked in the Color Output Movie Render Queue settings.

  • For quick renders, or if you are very limited on file size, JPG, MP4 or PNG (if you need an alpha channel) can be fine, but these are limited to 8-bit color.

  • ProRes can offer a better video preview format: it has 10-bit color and alpha channel support, offers instant playback without needing to be encoded into video, and can be encoded at higher resolutions than H.264, which is limited to 4096×4096.

Features

Resolution

  • Select the Resolution you need for your render here and always use this box to make any changes.

  • Do not use the main Movie Render Queue Output section resolution setting; it will be updated automatically.

Override Internal Face Dimensions

  • This setting lets you increase the pixel density of the individual cube faces that make up your render so that you can render at a fixed resolution but get a level of detail that would otherwise only come with a higher resolution.

  • For moving images, it’s generally not recommended to increase the default value much because you will create an ‘over-sampling’ of pixels per frame that lack temporal information and so will create jitter.

  • In general, it’s much better to use supersampling via TSR and the r.screenpercentage console variable than to change this value (because that method uses temporal samples).

Alpha Channel Output

N.B. With the Show Only method on the 360 Camera/ Component, Unreal Engine will not render any pixels (bloom, particles etc) around the selected Mesh.


If you want correct colors then you should use Stencil Layers (below) with .exr output rather than Alpha Settings.

  • Ensure you have Alpha Settings configured in your OWL 360 Camera/ Component (using the Show Only list). Both alpha and inverted alpha can be rendered.

  • Tick the Include Alpha box in the OWL 180/ 360 Rendering settings:

  • We recommend using PNG output from Movie Render Queue with Write Alpha ticked:

Mask

  • If you have set up a custom projection to an output mask in the 360 Camera/ Capture Component, you need to add the .png file again here to render it.

  • There is an option for Absolute and Relative file paths; if, for example, you are sharing the file across a team, you may need to set a relative file path so that the Mask file is picked up correctly.

Compositing

N.B. All cameras must have the same projection type for Compositing to work.

It is not possible to render multiple cameras to EXR layers while compositing is active; all camera outputs are merged.

There is no cap on the number of 360 Cameras/ Components you can composite in Movie Render Queue, but adding more will have an impact on (V)RAM.

  • Compositing lets you composite between different OWL 360 Cameras or Components in Unreal for effects like cross-fading.

  • Please see the section guide in the OWL 360 Camera/ Component docs for how to use this capability.

  • If you are using the OWL Compositing Track in Sequencer then you need to tick Composite Cameras here to render it correctly.

Stereo

If you tick this option you must ensure that all dynamic elements, such as particles or Ultra Dynamic Sky, are sequenced rather than random; otherwise they will be different in each of the two eyes.

  • If you want to Render Only One Eye of your Stereoscopic projection at a time, you can tick this option.

  • This is if you will use an external compositing/ stitching solution to create your final pixel output:

Error Handling

  • This allows you to choose to exit from the render if one of the frames is corrupt.

Path-Tracing

  • Tick this option if you have selected Path-Tracing in your Movie Render Queue +Settings

  • You will need to change the Anti-Aliasing settings in Movie Render Queue as well.

  • See the guide here for how to set up Path-Tracing successfully in your renders.

Intermediate Bit Depth

  • This is the Color Format of your render.

  • Basic licenses are limited to 8bit but Pro licenses can use 16/ 32 bit. See guide here

  • A higher bit depth will create a significantly richer color range in your renders.

Disable Parallel Rendering

  • This is a PRO feature which lets you render frames in sequence rather than in parallel.

  • This is useful in heavy renders where parallel rendering can lead to a VRAM crash.

Interpolation Type

  • To form each projection, a cubemap of different faces is projected onto another shape (sphere, hemisphere etc). This requires sampling from the cubemap.

  • Bilinear Sampling is recommended to get a higher quality image with less jaggedness although it can result in slightly longer render times.

  • This setting can be used in combination with the Override Internal Face Dimensions setting if you are trying to increase the detail or sharpness of an image without increasing your output Resolution.

Burn-Ins

You can use the Rendering Hooks section to add a blueprint to make a Burn-In/ HUD on your 360 renders:

  • This only works when using the Stitch Pass in High Resolution Rendering.

  • You should use this instead of the UI Render in Movie Render Queue, which gets distorted by the 360 output.

  • You can select the size of the burn-in (max 8192x8192) and its position on your render using the tools below.

  • By default the burn-in isn’t full screen, but if your render is less than 8192×8192 then you can position it full-screen to make a 180/360 Degree HUD.

Adding the Burn-In to your Render

  1. In Rendering Hooks add an Array element and select a BurninHook blueprint (an example is there by default):

    1. You can add more than one if you need.

Modifying the OWL Burn-In Blueprint

  1. Click the File Explorer icon to find the OWL_MRQ_Example_BurninHook Blueprint Class in your Content Browser and then double click it:

  2. Click on Class Defaults to open the Details panel on the right:

  3. In the Details panel that emerges you can configure:

    1. Alignment: lets you choose where on the rendered frame the burn-in is rendered.

    2. Width and Height: The dimensions of the burn-in in pixels (max 8192x8192).

    3. Export as Separate Pass: Tick this if you want the burn-in as a separate EXR layer (uncomposited with transparency) instead of composited on top of the rendered frame.

    4. Widget Type: Use this to select the Widget Blueprint that you want rendered as the Burn-In. Normally we recommend just modifying the OWL_360_BurnIn_Base, which is selected here (see below).

Updating Widget Data

  • In order to update the data on the widget, your Blueprint Class must have a public method on it with a parameter for the data.

  • In our example Widget Blueprint, this method is called Set Data:

  • At runtime the Widget Type selected above is instantiated to the Widget To Render.

  • In your blueprint, ensure that this is valid, then cast it to your Widget Blueprint Class and call any relevant public methods on your blueprint class.

  • You can copy and customise the OWL_360_BurnIn_Base Widget Blueprint as you need:

    • Each element of data is a public variable.

    • Explore the Graph section of the Widget Blueprint, navigating to the Set Data public method.

      • There you will see the list of available data and how it has been connected to the variables on the UI.

Creating your Own Burn-In Blueprint

You can either modify the existing OWL_360_BurnIn_Base Widget Blueprint or create a copy to modify separately.

  1. Select the OWL_360_BurnIn_Base Widget Blueprint in Content Browser and double click to open:

  2. In the Widget Blueprint Designer interface make any changes you need, then save and compile:

  3. If you have only modified the existing OWL_360_BurnIn_Base Widget Blueprint then it will work automatically.

  4. If you have created a new Widget Blueprint (by copying the original), you need to open the OWL_MRQ_Example_BurninHook Blueprint Class and, in the Details, change the Widget Type to the name of your new one:

Maintain Frame Order

N.B. In order to correctly sequence frames, completed frame data is stored in memory until previous frames have been rendered. This will increase memory usage.

  • This is a PRO feature relating to Tiling/ High Resolution Rendering.

  • If you are using custom outputs like, for example, the FFmpeg output and require frame order synchronisation, then check this box.

  • Otherwise, by default the OWL Stitch Pass runs in parallel and exports frames to disk as soon as they are ready (so it is possible that some frames arrive out of sequence).

High-Resolution Renders

If you are using Tiling, then all dynamic content (particles etc) must be cached or sequenced and share the same random seed. Otherwise when you stitch the tiles together there will be differences between them.

This is a PRO feature which allows you to split your render into smaller Tiles and then use a Stitch pass to stitch them back together.

This permits rendering at huge resolutions: 12K, 16K, 24K, 32K etc.

GPU vs CPU Usage

  • Tiling will typically use less VRAM and GPU capacity because you are rendering smaller resolution jobs.

  • However, the stitching of tiles is all done on the CPU so benefits from the most Logical Processors/ Cores possible.

  • The Stitch Pass can be initiated on a CPU-only machine with the -nullRHI flag via the command line.

  • So if you have a less powerful GPU than CPU, tiling may well render more quickly.

  • In general, tiling doesn’t take much longer than normal rendering.

Important

  • Ensure that all your Output Settings (Frame Rate, File Directory, all 360 Render settings) are set before you click Generate Jobs in this section, because each Tile will be generated as a separate render job which inherits these settings.

Select High-Resolution Render Settings

  • This section lets you define the settings for the Tiles as well as the final Stitch Pass.

  • Intermediate Output Directory: This is the file location where you will store all the tiles you render.

  • Intermediate File Prefix: This is automatically generated but can be modified. This is important because, if a file renders incorrectly, you can restart your render from a specific tile, so you need to be able to pinpoint the exact tile to render efficiently.

  • Job Name Prefix: This is combined with the Intermediate File Prefix above to give a final name to each of the tiles that you render.

  • Tiles: Here you select the number of tiles you want to create.

    • Your output will be formed by a certain number of cube-faces depending on your Projection Type (you can use Face Control or Debug Face Colors in the OWL 360 Camera/ Component to see this).

    • The value you set in Tiles will split each of these cube faces horizontally and vertically.

      • If you select 1-1 and you have a standard CubeMap or Equirectangular 360 projection, then you will output six Tiles, one for each of the cube faces.

      • If you select 2-2, then you will split each of the cube faces into two horizontal and two vertical Tiles, so you will output four tiles for each of the six cube faces: 24 tile jobs in total (plus the stitching job to join them together).

    • Depending on the complexity of your scene, you will need a different number of tiles for different resolutions.

      • The best way to estimate this is to look at Override Internal Face Dimension above to see the resolution you have set for each of the cube faces.

      • Your Tiles will split that resolution.

      • A resolution of around 3K×3K per tile should be very comfortable (unless you have a very intense scene).

  • Tile Overlap Percent: This creates a blend between each of the tiles to ensure that post-process effects don’t create seams.

  • Minimum Tile Dimension: In certain projections, such as Domemaster, you will have a large central cube-face and smaller faces on the edges.

    • If you tile those smaller faces, then it can create tiles that are too small in dimension to include all your post-process effects.

    • Increase this value if you see this issue.

  • Delete Intermediates After Stitch: If you are rendering at a high resolution, the Tiles will consume a lot of disk space.

    • Unless you need the Tiles, you should tick this option as it will delete them after you have stitched your final frame so that they don’t clog up your disk.

    • The Tiles will only be deleted if your frame has been stitched successfully.

  • Max Concurrent Stitches: Stitching is done on the CPU so this value can be set as high as the total number of Logical Processors your CPU has.

    • For example, if you have 16 cores and 32 logical processors, you can set this value to 32, to stitch 32 frames simultaneously.

  • Break Jobs by Frame Range: This lets you set the number of Tiles that are rendered before you stitch them together.

    • It is better to set a lower value, such as 200 frames, so that you stitch your final frames and delete the intermediate tile files as you go; otherwise you can run out of disk space.

    • It is also best practice in remote rendering to distribute sets of around 200 frames of tiles plus a Stitch Pass to each rendering machine, as this is the optimum way to render your final frames.

  • Write Intermediate Debug Files: If you want to check the individual Tiles for debugging purposes, then you can tick this option to output the tiles as .exrs.

  • Source Job: This selects the Movie Render Queue job that references the Sequence you want to render.

    • It should pick that up automatically but if it doesn’t you can select it here.

Generate your Render Jobs

  • Once you have input all the settings you need in this section (and in the other Movie Render Queue sections), click Generate Jobs and the individual tile renders and stitch jobs will be generated in your Movie Render Queue jobs list:

    • The original render is disabled (because that would be too high a resolution to render as a single pass).

    • Each of the jobs represents the rendering of an individual tile.

    • The FinalStitch job is the stitch phase, which joins the tiles together into your final frames.

    • If you have set Break Jobs by Frame Range then you will see multiple sets of tile renders and stitch passes in your render job list.

Save your Render Jobs

  • You should save the Render Queue as an Asset in your Content Browser like this, so you can redo the render without having to Generate Jobs again:

Re-Rendering Tiles

  • You can initiate a render from any specific frame and tile in a Render Queue that you have saved, and the final stitch will automatically pick up the re-rendered files.

  • For example, if you have 200 frames with tiles that you are rendering before a stitch pass and you have an issue at frame 150, then you can just re-render from frame/tile 150-199 and the stitch pass will be able to stitch all 200 frames.

  • Tiles will not be deleted unless the stitch pass for that range of frames/ tiles has been fully completed.

Re-Starting a Render from a Specific Tile

To re-render frames/ tiles:

  1. Go to your intermediate file folder (this is the folder you selected to store your tiles).

  2. The file name for each tile is formatted as follows: LS_02_SU_020_E_c_000.fm_000002.sv_00.e_0.f_0.t_0-0

    1. LS (Level Sequence): This is the sequence you are rendering.

    2. c (camera): This is the 360 Camera/ Component being rendered.

    3. fm (frame): This is the frame number.

    4. sv (Scene view): This relates to any post-process passes or stencil layers selected.

    5. e (eye): There will be two for stereoscopic renders.

    6. f (Face): There will normally be 4-6 cube faces depending on your Projection Type.

    7. t (Tile): The tiles are numbered sequentially for each cube face.

  3. Use the format details above to find the tile that you want to re-render. Open the tile in your saved Render Queue and set the frames you want to render for that tile in Custom Start/ End Frame. Click Accept and return to your Render Queue:

  4. In your saved Render Queue, you can deselect any render jobs for tiles that are not needed using the yellow switch to the left of each job.
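The file name format above can also be parsed in bulk with a short script, which is handy when hunting for the tiles of a specific frame, face or eye. This is a hypothetical helper whose regular expression is inferred from the single example name shown in step 2, so verify it against your own tile names before relying on it:

```python
import re

# Parses OWL intermediate tile names such as:
#   LS_02_SU_020_E_c_000.fm_000002.sv_00.e_0.f_0.t_0-0
# The pattern is inferred from that one example and may need adjusting.
TILE_NAME = re.compile(
    r"^(?P<sequence>.+)_c_(?P<camera>\d+)"
    r"\.fm_(?P<frame>\d+)\.sv_(?P<scene_view>\d+)"
    r"\.e_(?P<eye>\d+)\.f_(?P<face>\d+)"
    r"\.t_(?P<tile_x>\d+)-(?P<tile_y>\d+)$"
)

def parse_tile_name(name):
    match = TILE_NAME.match(name)
    if match is None:
        raise ValueError(f"Unrecognised tile name: {name}")
    parts = match.groupdict()
    # Convert the fields you are likely to filter on into integers.
    for key in ("frame", "eye", "face", "tile_x", "tile_y"):
        parts[key] = int(parts[key])
    return parts

info = parse_tile_name("LS_02_SU_020_E_c_000.fm_000002.sv_00.e_0.f_0.t_0-0")
print(info["sequence"], info["frame"], info["face"])  # LS_02_SU_020_E 2 0
```

Combined with `pathlib.Path.glob` over your intermediate folder, this makes it easy to list exactly which frames and tiles need re-rendering.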

Re-Rendering a Set of Tiles/ Frames for an Existing Stitch Job

  1. If you want to render all your tiles again from a certain frame number then you will need to:

    1. Go back to your original render job (before you generated the tiles) and reset the frame-range you need in your Output settings.

    2. Click Generate Jobs to create a new set of render jobs with the new frame range.

    3. If you have existing tiles you have already rendered (for example for frames 0-149 of a 200 frame job) and you just want to re-render frames 150-199, then you can keep your existing Stitch Job but move it to the bottom of all your render jobs (so it will happen after your new tiles have been rendered).

Disable Multisample Effects

  • Stitched images can cause visible seams when using post-processing effects that blend pixels together.

  • These include effects like Depth of Field, Temporal Anti-Aliasing, Motion Blur and Chromatic Aberration.

  • When these Post Processing effects are used, each final output pixel is composed of the influence of many other pixels, so seams can occur in a stitched image.

  • This can be mitigated with a Face Blend Percentage, but these effects can also be turned off per render instead of having to disable each setting in-editor.

Post-Process Materials

N.B. There is a performance impact to adding multiple Post Process Materials, so you may need to create multiple renders, each with different passes.


If you do so, all dynamic content (particles etc) must be cached or sequenced and share the same random seed.

You need to convert some Material settings of Post Process Materials to remove seams and ensure responsiveness to depth-of-field (see Trouble-Shooting).

With our integrated EXR output you can add Post Process Materials as layers within a single EXR file of your OWL 360 Degree render to be used with color grading workflows.

This makes it possible to do advanced compositing in post-production software like DaVinci Resolve, After Effects, Natron or Nuke.

  1. In your Movie Render Queue Settings add an EXR Sequence, select a Compression Type and make sure Multilayer is ticked:

  2. In OWL 180/ 360 Rendering, scroll down to Deferred Renderer Data: Additional Post Process Materials:

  3. Add each Additional Post Process Material as an Array element.

    1. The defaults are World Depth and Motion Vectors:

  4. To add another Post Process Material, in the Material section, search in the Browse box (see options below):

  5. You can also add the following options:

  6. Once you have rendered your EXR, in a viewer such as DJV you can navigate to File>Layers to preview these Post Process Materials:

Cryptomatte

Unreal’s native Cryptomatte pass isn’t currently supported for 360 Rendering but you can achieve the same functionality using a custom Post Process Material as follows:

  1. In Project Settings ensure (you will need to save and restart your project for these to take effect):

    1. Alpha is ticked:

    2. Custom Depth-Stencil Pass is set to Enabled with Stencil:

  2. Open the Content Browser and search for Custom Stencil in the Engine folder, Duplicate it (so you don’t modify the original material) and name it as you need:

  3. Open it to edit its Material Graph:

  4. Remove the section that applies a number to the Custom Stencil selection by deleting all nodes relating to the Lerp:

  5. Connect the Multiply node into both inputs of the Brighten node:

  6. In the Material Details panel, in the Post Process Material section, set Blendable Location to Scene Colour Before DOF.

  7. Now, in your OWL 360 Camera/ Component/s or Post Process Volume, go to Details> Post Process Materials and add the modified Custom Stencil Material as an Array element. This shows the material in your Viewport or Render Target:

  8. In the Outliner, select all the meshes that you need in the Cryptomatte:

  9. Go to Details, search for Depth and:

    1. Tick Render Custom Depth Pass.

    2. Use the Custom Depth Stencil Write Mask option instead of the Value option, because it ignores depth and so will capture meshes even if they are behind objects in certain shots.

      1. Select All bits (255), ignore depth if you want all your meshes to be assigned to the same color channel.

      2. Assign different meshes to different bits using the first to eighth bit options (there are 8 possible channels because the mask is 8-bit).

  10. Now, in Movie Render Queue> OWL 180/ 360 Rendering> Additional Post Process Materials, you can add the modified Custom Stencil material as a new Array element to include it as an .exr layer or its own .png output.
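As a numeric footnote to the write-mask options in step 9: because the mask is an 8-bit value, the editor's first-bit to eighth-bit choices correspond to powers of two, and All bits is 255. A quick sketch of the arithmetic:

```python
# The Custom Depth Stencil Write Mask is 8-bit, giving eight independent
# channels. "First bit" through "Eighth bit" are the powers of two below;
# "All bits (255)" is their sum.

bit_values = [1 << n for n in range(8)]  # [1, 2, 4, 8, 16, 32, 64, 128]
all_bits = sum(bit_values)               # 255

# Meshes on different bits can be isolated independently in post, or
# combined with a bitwise OR, e.g. third bit (4) plus fifth bit (16):
combined = (1 << 2) | (1 << 4)
print(bit_values, all_bits, combined)  # [1, 2, 4, 8, 16, 32, 64, 128] 255 20
```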

Stencil Clip Layers

N.B. There is a performance impact to adding multiple Stencil Layers, so you may need to create multiple renders, each with different passes.


If you do so, all dynamic content (particles etc) must be cached or sequenced and share the same random seed.

We recommend using Stencil Layers rather than Alpha Settings because they include the correct colors and full post-process effects.

With our integrated EXR output you can add different render passes as Stencil Clip Layers within a single EXR file of your OWL 360 Degree render to be used with color grading workflows.

  1. In your Movie Render Queue Settings add an EXR Sequence, select a Compression Type and make sure Multilayer is ticked:

  2. In OWL 180/ 360 Rendering, scroll down to Stencil Clip Layers:

    • Render Main Pass: Renders the Main Pass as well as the Stencil Clip Layers.

      • This can be turned off if you’re only doing a Stencil Layer based render and don't need the main non-stencil image.

    • Add Default Layer: Renders an additional Stencil Layer containing all objects that do not belong to the selected Actor or Data Layers.

      • This is useful when you want to isolate one or two layers but still have everything else to composite them over, without having to remember to add all other objects to a default layer.

    • Actor Layers: Select the Layers to render as part of the composite output.

    • Data Layers: If you are using a World Partition Map use Data Layers to specify the composite Layers.

  3. If you have already set up your Layers in Unreal (see below), click the + button to create an Array element for each one and they will render out separately in your EXR.

  4. If you need to create an Actor Layer:

    1. Go to Window and ensure that Layers is ticked. Click it to view the Layers Panel:

    2. Select one or more Actor/s in your scene then right click in the Layers Panel and select Assign Actor to New Layer. If you have an existing Layer then you can also add the Actor to it:

    3. Now you can select the Layer you have created in Movie Render Queue as a Stencil Layer and whatever Actors are inside will render separately.

  5. If you want to create a Data Layer follow the documentation here (it’s a more sophisticated process).

Trouble-Shooting

Exposure

  • When you drop the 360 Camera into your scene, it can appear over- or under-exposed compared to your Viewport.

  • Exposure is set to Manual by default to avoid seams and artefacts caused by Auto Exposure.

  • The 360CameraActor automatically sets an Exposure Compensation of 10 when brought in to a level.

  • This can be too high or low for some scenes.

  • Adjust this setting in the Post Process Settings using the Exposure Compensation slider:

VRAM Management

  • For maximum capacity, close everything that is using VRAM or rendering, then close and re-open Unreal before starting your render:

    1. Close your Unreal Viewport (don’t pause, fully close it).

    2. Pause the Render Target of your OWL 360 Camera/ Component.

    3. Save your project, close and re-open Unreal (this will clean your VRAM usage).

    4. Open Movie Render Queue and initiate your render.

Color Accuracy

  • If your Exposure settings are correct, your Viewport, OWL 360 Camera/ Component and Movie Render Queue renders should all have identical colors.

  • For the most reliable color accuracy we recommend using OCIO, which works with the 360 Camera for previs as well as with your final renders.

  • You can see a set up guide for color grading with the 360 Camera here.

Seams (Lumen)

  • When you are using Lumen for Global Illumination, seams can appear at the edges of the cubemap that captures the raw pixels of the 360 Camera. These steps should resolve them:

    • Face Blend Percent: Try a value of 2.5% first and increase as required; this will eliminate most seams.

    • Hardware Ray-Tracing: Change this setting in your Project Settings. This can reduce seams because Lumen relies less on screen space effects to generate lighting:

    • Cube Face Rotation: Using the Cube Face Rotation settings can prevent seams by ensuring that elements that affect screen-space effects are always positioned inside a face (see above).

    • Local Exposure: Add the Console Variable r.LocalExposure to Movie Render Queue with a value of 0.

    • Turn off Screen Traces: This fixes a lot of interior seams caused by bounce lighting.

      • Find the Post Process Volume in your scene, search for Screen Traces, then untick the checkbox.

      • You may notice that some Global Illumination is lost by doing this and your image will have starker contrast.

    • Remove Meshes (that cause seams due to Indirect Lighting) from Lumen using the Mesh Details panel to disable one of the following:

      • For Software Ray Tracing, uncheck Affect Distance Field Lighting.

      • For Hardware Ray Tracing, uncheck Visible in Ray Tracing.

      • You can also separate complex objects in your modeling software to ensure that Unreal Engine can shade them more accurately, which can improve Lumen reflections and reduce seams.

    • Add Temporal Anti-Aliasing Samples to fast-moving imagery to remove seams and improve shadows and reflections.

Seams (Post-Processing)

  • Post processing and VFX effects like Bloom, Particles, Volumetric Fog, God-Rays, Water, Dirt Masks can cause seams for a variety of reasons and may need workarounds to get perfect content.

  • You can see a comprehensive guide for how to use these effects here.

Post Process Materials

  • Post Process Materials that use Scene Depth as an input often depend on Screen Space Calculations that can cause seams, so these need to be changed to World Space calculations.

    • See our 360 VFX guide (Depth Based Post Process Effects) for a step-by-step guide, or just replace the Scene Depth node with the nodes in this material:

  • Visualisation Buffers such as Custom Stencil, Scene Depth, Opacity and Ambient Occlusion need a material modification to work in 360.

    • Select Scene Colour Before DOF in the Blendable Location in the Post Process Material section in the Material Details panel:

    • This ensures they are affected by Anti Aliasing, Depth of Field and other parts of the render pipeline.

Anti-Aliasing (Lumen)

  • Anti-aliasing is massively important for creating high quality renders and is controlled via its own section in the Movie Render Queue settings.

  • In moving shots, you need to use your Camera Cut to add excess frames in your Sequencer before your render starts and select Use Camera Cut for Warm Up in your anti-aliasing settings. This ensures that you have temporal samples from your camera for movement, particles and physics effects.

  • You should start with Temporal Super Resolution (TSR) either in your Project Settings or in Movie Render Queue because that is the default setting for Lumen.

  • Alternatively, if you have installed DLSS you can use DLAA, which can be very effective (NVIDIA GPUs only).

  • For more information see our full guide.

High Resolution Renders

  • It’s worth implementing all the Performance recommendations above to minimise your render times:

    • You may find you can double your render speed (or more) if you manage VRAM properly.

    • Implementing these steps can also reduce the total number of tiles you need which can reduce render times.

  • Use the Face Control and Horizontal/ Vertical FOV settings to render only the exact pixels you need, as this can reduce render times by more than 20%.

  • Ideally run a stitch pass every 200 frames or so, to reduce any risk of disk space overload from storing too many intermediate files.

  • Due to the large size of each frame, read-write disk capacity can be a bottleneck, especially if you are using a shared disk to write final renders to.

  • The stitch pass doesn’t need the GPU, so it can be run on a high-powered CPU machine with the -nullrhi command, which can be a useful hardware optimisation in your studio.

    • The more logical processors you have on your CPU, the more stitches you can run simultaneously.

  • You may encounter issues with Volumetric Fog Pixels, particularly in light shafts.

    • To reduce their visibility change the r.VolumetricFog.GridPixelSize and r.VolumetricFog.GridSize console variables.

    • This will have an effect on GPU overhead.

    • See High Resolution Fog in the 360 VFX doc for a step by step guide.
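To put the nullRHI point above into practice, a stitch-only machine can launch Unreal headlessly along these lines. This is a sketch, not a definitive recipe: the executable, project and queue asset paths are placeholders, and while -game, -nullrhi and -MoviePipelineConfig are standard Unreal flags, confirm the exact arguments against your engine version and saved Render Queue:

```python
import subprocess

# Placeholder paths -- replace with your own install, project and queue asset.
UNREAL = r"C:\Program Files\Epic Games\UE_5.3\Engine\Binaries\Win64\UnrealEditor-Cmd.exe"
PROJECT = r"D:\Projects\MyProject\MyProject.uproject"
QUEUE_ASSET = "/Game/Cinematics/SavedStitchQueue.SavedStitchQueue"

args = [
    UNREAL,
    PROJECT,
    "-game",
    f"-MoviePipelineConfig={QUEUE_ASSET}",
    "-nullrhi",  # no GPU required: the stitch pass runs entirely on the CPU
    "-log",
]

print(" ".join(args))
# subprocess.run(args, check=True)  # uncomment to actually launch the stitch
```

Because the stitch pass is CPU-bound, several of these processes can run in parallel on a machine with many logical processors.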

Path-Tracing

  • Path-tracing requires a lot of configuration and iteration to get a perfect output.

  • It’s only recommended if you want the realistic translucency and refraction it can offer over Lumen, because it takes so much longer to render and is much harder to get a perfect final image.

  • Your settings will be very different depending on whether you are rendering static or moving shots, and also on exactly what you are shooting (glass, sunrise/ set etc.)

  • Check our full guide for comprehensive tips.

Console Variables

  • Remove all Console Variable overrides unless you have a specific reason you are including them.

  • If you see any large low-frequency noise, it is coming from a Denoiser. Find the right Denoiser (Ambient Occlusion, Global Illumination, Reflections, etc.) and turn it off.

  • If you disable all the options in the Game Overrides section, aside from Flush Grass Streaming, that will get you as close as possible to what you are seeing in the Editor Viewport (assuming you don’t have any CVar overrides in Movie Render Queue).

  • The geometry LOD settings here will not override anything rendered with Nanite. They only apply to non-Nanite actors.