Make sure you have FFmpeg installed.

Load a checkpoint, render a trajectory, and save the result to a video file. The following trajectory options are available:

filename: load a trajectory created with the viewer or the Blender VFX plugin.
interpolate: create a trajectory by interpolating between eval dataset images.
spiral: create a spiral trajectory (can be hit or miss).

usage: ns-render [-h] --load-config PATH
                 [--rendered-output-names STR [STR ...]]
                 [--traj {spiral,filename,interpolate}]
                 [--downscale-factor INT] [--camera-path-filename PATH]
                 [--output-path PATH] [--seconds FLOAT]
                 [--output-format {images,video}] [--interpolation-steps INT]
                 [--eval-num-rays-per-chunk {None}|INT]
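As a minimal sketch of a typical invocation (the config path below is a placeholder; substitute the path printed at the end of your own training run), rendering the default spiral trajectory to a video might look like:

```shell
# Render a spiral-trajectory video from a trained checkpoint.
# "outputs/experiment/nerfacto/config.yml" is a placeholder path.
ns-render --load-config outputs/experiment/nerfacto/config.yml \
          --traj spiral \
          --output-path renders/spiral.mp4
```

Since --traj defaults to spiral and --output-path defaults to renders/output.mp4, both flags can be omitted for the same effect.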



--load-config PATH

Path to config YAML file. (required)


--rendered-output-names STR [STR ...]

Name of the renderer outputs to use (rgb, depth, etc.). Multiple outputs are concatenated along the y axis. (default: rgb)


--traj {spiral,filename,interpolate}

Possible choices: spiral, filename, interpolate

Trajectory type to render. Select between a spiral-shaped trajectory, a trajectory loaded from a viewer-generated file, and a camera path interpolated from the eval dataset. (default: spiral)


--downscale-factor INT

Scaling factor to apply to the camera image resolution. (default: 1)


--camera-path-filename PATH

Filename of the camera path to render. (default: camera_path.json)


--output-path PATH

Name of the output file. (default: renders/output.mp4)


--seconds FLOAT

How long the video should be, in seconds. (default: 5.0)


--output-format {images,video}

Possible choices: images, video

How to save output data. (default: video)


--interpolation-steps INT

Number of interpolation steps between eval dataset cameras. (default: 10)


--eval-num-rays-per-chunk {None}|INT

Specifies the number of rays per chunk during eval. (default: None)
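For instance, combining several of the options above (again, the config path is a placeholder for your own training output), rendering rgb and depth along an interpolated trajectory as an image sequence might look like:

```shell
# Render rgb and depth (stacked along the y axis) as individual frames
# instead of a video. "outputs/experiment/nerfacto/config.yml" is a
# placeholder path; use the config from your own training run.
ns-render --load-config outputs/experiment/nerfacto/config.yml \
          --traj interpolate \
          --interpolation-steps 20 \
          --rendered-output-names rgb depth \
          --output-format images \
          --output-path renders/frames
```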