ns-render

Note

Make sure to have FFmpeg installed.

Load a checkpoint, render a trajectory, and save it to a video file. The following trajectory options are available:

filename: Load a trajectory created with the viewer or the Blender VFX plugin.
interpolate: Create a trajectory by interpolating between eval dataset images.
spiral: Create a spiral trajectory (can be hit or miss).

usage: ns-render [-h] --load-config PATH
                 [--rendered-output-names STR [STR ...]]
                 [--traj {spiral,filename,interpolate}]
                 [--downscale-factor INT] [--camera-path-filename PATH]
                 [--output-path PATH] [--seconds FLOAT]
                 [--output-format {images,video}] [--interpolation-steps INT]
                 [--eval-num-rays-per-chunk {None}|INT]
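
For example, a typical invocation renders the default spiral trajectory and writes a five-second video. The config path below is hypothetical; point --load-config at the config.yml produced by your training run:

ns-render --load-config outputs/my-scene/nerfacto/2023-01-01_120000/config.yml \
          --output-path renders/spiral.mp4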

arguments

--load-config

Path to config YAML file. (required)

--rendered-output-names

Name of the renderer outputs to use: rgb, depth, etc. Specifying multiple outputs concatenates them along the y axis. (default: rgb)
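
As a sketch, assuming the loaded model exposes both rgb and depth outputs, the following renders them concatenated along the y axis (the config path placeholder is illustrative):

ns-render --load-config <path/to/config.yml> --rendered-output-names rgb depth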

--traj

Possible choices: spiral, filename, interpolate

Trajectory type to render. Select between a spiral-shaped trajectory, a trajectory loaded from a viewer-generated file, and a camera path interpolated from the eval dataset. (default: spiral)
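
For example, to render a camera path exported from the viewer (the JSON filename is an assumption; pass whatever file the viewer saved):

ns-render --load-config <path/to/config.yml> --traj filename --camera-path-filename camera_path.json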

--downscale-factor

Scaling factor to apply to the camera image resolution. (default: 1)

--camera-path-filename

Filename of the camera path to render. (default: camera_path.json)

--output-path

Name of the output file. (default: renders/output.mp4)

--seconds

How long the output video should be, in seconds. (default: 5.0)

--output-format

Possible choices: images, video

How to save output data. (default: video)
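
For instance, to save individual frames rather than an encoded video (how --output-path is interpreted for image output is an assumption here; a directory-style path is used for illustration):

ns-render --load-config <path/to/config.yml> --output-format images --output-path renders/frames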

--interpolation-steps

Number of interpolation steps between eval dataset cameras. (default: 10)
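
For example, a smoother interpolated path through the eval cameras might use more steps and a longer video; the values below are illustrative only:

ns-render --load-config <path/to/config.yml> --traj interpolate --interpolation-steps 30 --seconds 10.0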

--eval-num-rays-per-chunk

Specifies the number of rays per chunk during eval. (default: None)
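
If rendering exhausts GPU memory, a smaller chunk size can help at the cost of slower rendering (this speed/memory tradeoff is an assumption about how chunked evaluation behaves); the value below is illustrative:

ns-render --load-config <path/to/config.yml> --eval-num-rays-per-chunk 4096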