Renderers#

Collection of renderers

Example:

from nerfstudio.field_components.field_heads import FieldHeadNames
from nerfstudio.model_components.renderers import RGBRenderer

# Query the field at the sampled ray locations and convert densities to weights
field_outputs = field(ray_sampler)
weights = ray_sampler.get_weights(field_outputs[FieldHeadNames.DENSITY])

rgb_renderer = RGBRenderer()
rgb = rgb_renderer(rgb=field_outputs[FieldHeadNames.RGB], weights=weights)
class nerfstudio.model_components.renderers.AccumulationRenderer(*args, **kwargs)[source]#

Bases: Module

Accumulated value along a ray.

classmethod forward(weights: Float[Tensor, '*bs num_samples 1'], ray_indices: Optional[Int[Tensor, 'num_samples']] = None, num_rays: Optional[int] = None) Float[Tensor, '*bs 1'][source]#

Composite samples along the ray and calculate the accumulation.

Parameters
  • weights – Weights for each sample

  • ray_indices – Ray index for each sample, used when samples are packed.

  • num_rays – Number of rays, used when samples are packed.

Returns

Outputs of accumulated values.
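
A minimal usage sketch; the tensor shapes and values below are illustrative assumptions, not part of the API:

import torch
from nerfstudio.model_components.renderers import AccumulationRenderer

weights = torch.rand(4096, 48, 1)  # hypothetical [num_rays, num_samples, 1] weights
accumulation = AccumulationRenderer()(weights=weights)  # [num_rays, 1]
# For unpacked samples this should match a plain sum over the sample dimension,
# i.e. weights.sum(dim=-2).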

class nerfstudio.model_components.renderers.DepthRenderer(method: Literal['median', 'expected'] = 'median')[source]#

Bases: Module

Calculate depth along ray.

Depth Method:
  • median: Depth is set to the distance where the accumulated weight reaches 0.5.

  • expected: Expected depth along the ray, computed with the same weighted compositing used for RGB rendering but applied to the sample distances.

Parameters

method – Depth calculation method.

forward(weights: Float[Tensor, '*batch num_samples 1'], ray_samples: RaySamples, ray_indices: Optional[Int[Tensor, 'num_samples']] = None, num_rays: Optional[int] = None) Float[Tensor, '*batch 1'][source]#

Composite samples along the ray and calculate depths.

Parameters
  • weights – Weights for each sample.

  • ray_samples – Set of ray samples.

  • ray_indices – Ray index for each sample, used when samples are packed.

  • num_rays – Number of rays, used when samples are packed.

Returns

Outputs of depth values.
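
A sketch of the two depth methods described above, written out for unpacked samples. Here steps stands in for the per-sample distances along the ray (an illustrative assumption; in practice these come from ray_samples), and the code mirrors the documented behavior rather than the exact implementation:

import torch

weights = torch.rand(4096, 48, 1)                      # hypothetical sample weights
steps = torch.linspace(0.1, 10.0, 48).reshape(1, 48, 1).expand(4096, 48, 1)

# "expected": weighted mean of sample distances, normalized by the accumulation
eps = 1e-10
expected_depth = (weights * steps).sum(dim=-2) / (weights.sum(dim=-2) + eps)

# "median": distance where the cumulative weight first reaches 0.5
cumulative = torch.cumsum(weights[..., 0], dim=-1)     # [num_rays, num_samples]
split = torch.full_like(cumulative[..., :1], 0.5)
idx = torch.searchsorted(cumulative, split, right=True)
idx = idx.clamp(max=weights.shape[-2] - 1)
median_depth = torch.gather(steps[..., 0], dim=-1, index=idx)  # [num_rays, 1]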

class nerfstudio.model_components.renderers.NormalsRenderer(*args, **kwargs)[source]#

Bases: Module

Calculate normals along the ray.

classmethod forward(normals: Float[Tensor, '*bs num_samples 3'], weights: Float[Tensor, '*bs num_samples 1'], normalize: bool = True) Float[Tensor, '*bs 3'][source]#

Calculate normals along the ray.

Parameters
  • normals – Normals for each sample.

  • weights – Weights of each sample.

  • normalize – Normalize normals.
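
A minimal usage sketch; the rendered normal is the weight-blended sum of the per-sample normals, re-normalized to unit length when normalize=True. Shapes are illustrative assumptions:

import torch
from nerfstudio.model_components.renderers import NormalsRenderer

normals = torch.nn.functional.normalize(torch.randn(4096, 48, 3), dim=-1)
weights = torch.rand(4096, 48, 1)

rendered = NormalsRenderer()(normals=normals, weights=weights)  # [num_rays, 3]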

class nerfstudio.model_components.renderers.RGBRenderer(background_color: Union[Literal['random', 'last_sample', 'black', 'white'], Float[Tensor, '3'], Float[Tensor, '*bs 3']] = 'random')[source]#

Bases: Module

Standard volumetric rendering.

Parameters

background_color – Background color as RGB. Defaults to "random".

blend_background(image: Tensor, background_color: Optional[Union[Literal['random', 'last_sample', 'black', 'white'], Float[Tensor, '3'], Float[Tensor, '*bs 3']]] = None) Float[Tensor, '*bs 3'][source]#

Blends the background color into the image if the image is RGBA. Otherwise no blending is performed (an opacity of 1 is assumed).

Parameters
  • image – RGB/RGBA per pixel. If RGBA, the alpha channel is used as the per-pixel opacity.

  • background_color – Background color.

Returns

Blended RGB.
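
A small usage sketch, assuming a renderer configured with a "white" background; the pixel counts are illustrative:

import torch
from nerfstudio.model_components.renderers import RGBRenderer

renderer = RGBRenderer(background_color="white")
rgba = torch.rand(4096, 4)                    # hypothetical per-pixel RGBA
blended = renderer.blend_background(rgba)     # [4096, 3], alpha-composited over white
rgb = torch.rand(4096, 3)
unchanged = renderer.blend_background(rgb)    # plain RGB: returned unchanged (opacity 1 assumed)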

blend_background_for_loss_computation(pred_image: Tensor, pred_accumulation: Tensor, gt_image: Tensor) Tuple[Tensor, Tensor][source]#

Blends a background color into the ground truth and predicted image for loss computation.

Parameters
  • gt_image – The ground truth image.

  • pred_image – The predicted RGB values (without background blending).

  • pred_accumulation – The predicted opacity/accumulation.

Returns

A tuple of the predicted and ground truth RGB values.
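
A sketch of the typical use inside a model's loss computation; the variable names, shapes, and the "black" background are illustrative assumptions:

import torch
from nerfstudio.model_components.renderers import RGBRenderer

renderer = RGBRenderer(background_color="black")
pred_rgb = torch.rand(4096, 3)    # rendered color, no background blended in
pred_acc = torch.rand(4096, 1)    # rendered accumulation/opacity
gt_rgba = torch.rand(4096, 4)     # ground-truth image with an alpha channel

pred_blended, gt_blended = renderer.blend_background_for_loss_computation(
    pred_image=pred_rgb, pred_accumulation=pred_acc, gt_image=gt_rgba
)
loss = torch.nn.functional.mse_loss(pred_blended, gt_blended)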

classmethod combine_rgb(rgb: Float[Tensor, '*bs num_samples 3'], weights: Float[Tensor, '*bs num_samples 1'], background_color: Union[Literal['random', 'last_sample', 'black', 'white'], Float[Tensor, '3'], Float[Tensor, '*bs 3']] = 'random', ray_indices: Optional[Int[Tensor, 'num_samples']] = None, num_rays: Optional[int] = None) Float[Tensor, '*bs 3'][source]#

Composite samples along the ray and render a color image. If the background color is "random", no background color is added, as if the background were black.

Parameters
  • rgb – RGB for each sample

  • weights – Weights for each sample

  • background_color – Background color as RGB.

  • ray_indices – Ray index for each sample, used when samples are packed.

  • num_rays – Number of rays, used when samples are packed.

Returns

Outputs rgb values.
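
For unpacked samples and a fixed background color, the composite reduces to a weighted sum of the sample colors plus the background scaled by the leftover transmittance. A sketch of that math (not the packed code path); shapes and the white background are illustrative:

import torch

rgb = torch.rand(4096, 48, 3)
weights = torch.rand(4096, 48, 1)
background = torch.tensor([1.0, 1.0, 1.0])    # hypothetical white background

composite = (weights * rgb).sum(dim=-2)       # [num_rays, 3]
accumulation = weights.sum(dim=-2)            # [num_rays, 1]
composite = composite + background * (1.0 - accumulation)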

forward(rgb: Float[Tensor, '*bs num_samples 3'], weights: Float[Tensor, '*bs num_samples 1'], ray_indices: Optional[Int[Tensor, 'num_samples']] = None, num_rays: Optional[int] = None, background_color: Optional[Union[Literal['random', 'last_sample', 'black', 'white'], Float[Tensor, '3'], Float[Tensor, '*bs 3']]] = None) Float[Tensor, '*bs 3'][source]#

Composite samples along the ray and render a color image.

Parameters
  • rgb – RGB for each sample

  • weights – Weights for each sample

  • ray_indices – Ray index for each sample, used when samples are packed.

  • num_rays – Number of rays, used when samples are packed.

  • background_color – The background color to use for rendering.

Returns

Outputs of rgb values.

classmethod get_background_color(background_color: Union[Literal['random', 'last_sample', 'black', 'white'], Float[Tensor, '3'], Float[Tensor, '*bs 3']], shape: Tuple[int, ...], device: device) Union[Float[Tensor, '3'], Float[Tensor, '*bs 3']][source]#

Returns the RGB background color for a given background color specification. Note: this function cannot be called when background_color is "last_sample" or "random".

Parameters
  • background_color – The background color specification. If a string is provided, it must be a valid color name.

  • shape – Shape of the output tensor.

  • device – Device on which to create the tensor.

Returns

Background color as RGB.
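
A short usage sketch; per the note above, "last_sample" and "random" are not valid here. The shape and device are illustrative:

import torch
from nerfstudio.model_components.renderers import RGBRenderer

bg = RGBRenderer.get_background_color(
    "white", shape=(4096, 3), device=torch.device("cpu")
)
# bg is expected to be a white RGB tensor broadcast to the requested shape.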

class nerfstudio.model_components.renderers.SHRenderer(background_color: Union[Literal['random', 'last_sample', 'black', 'white'], Float[Tensor, '3'], Float[Tensor, '*bs 3']] = 'random', activation: Optional[Module] = Sigmoid())[source]#

Bases: Module

Render RGB value from spherical harmonics.

Parameters
  • background_color – Background color as RGB. Defaults to "random".

  • activation – Output activation.

forward(sh: Float[Tensor, '*batch num_samples coeffs'], directions: Float[Tensor, '*batch num_samples 3'], weights: Float[Tensor, '*batch num_samples 1']) Float[Tensor, '*batch 3'][source]#

Composite samples along the ray and render a color image.

Parameters
  • sh – Spherical harmonics coefficients for each sample

  • directions – Direction for each sample.

  • weights – Weights for each sample

Returns

Outputs of rgb values.
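
A minimal usage sketch, assuming the coefficient dimension packs 3 color channels times levels**2 spherical-harmonics basis functions (stated here as an assumption; the levels, shapes, and values are hypothetical):

import torch
from nerfstudio.model_components.renderers import SHRenderer

levels = 3                                     # hypothetical SH degree, so coeffs = 3 * levels**2
sh = torch.rand(4096, 48, 3 * levels**2)       # per-sample SH coefficients
directions = torch.nn.functional.normalize(torch.randn(4096, 48, 3), dim=-1)
weights = torch.rand(4096, 48, 1)

rgb = SHRenderer()(sh=sh, directions=directions, weights=weights)  # [num_rays, 3]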

class nerfstudio.model_components.renderers.SemanticRenderer(*args, **kwargs)[source]#

Bases: Module

Calculate semantics along the ray.

classmethod forward(semantics: Float[Tensor, '*bs num_samples num_classes'], weights: Float[Tensor, '*bs num_samples 1'], ray_indices: Optional[Int[Tensor, 'num_samples']] = None, num_rays: Optional[int] = None) Float[Tensor, '*bs num_classes'][source]#

Calculate semantics along the ray.
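
A minimal usage sketch; the number of classes and the shapes are illustrative assumptions. The output is the weight-blended sum of the per-sample class scores:

import torch
from nerfstudio.model_components.renderers import SemanticRenderer

num_classes = 20                               # hypothetical number of semantic classes
semantics = torch.rand(4096, 48, num_classes)  # per-sample class scores/logits
weights = torch.rand(4096, 48, 1)

sem = SemanticRenderer()(semantics=semantics, weights=weights)  # [num_rays, num_classes]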

class nerfstudio.model_components.renderers.UncertaintyRenderer(*args, **kwargs)[source]#

Bases: Module

Calculate uncertainty along the ray.

classmethod forward(betas: Float[Tensor, '*bs num_samples 1'], weights: Float[Tensor, '*bs num_samples 1']) Float[Tensor, '*bs 1'][source]#

Calculate uncertainty along the ray.

Parameters
  • betas – Uncertainty betas for each sample.

  • weights – Weights of each sample.

Returns

Rendering of uncertainty.
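
A minimal usage sketch; the shapes are illustrative assumptions:

import torch
from nerfstudio.model_components.renderers import UncertaintyRenderer

betas = torch.rand(4096, 48, 1)                # per-sample uncertainty betas
weights = torch.rand(4096, 48, 1)

uncertainty = UncertaintyRenderer()(betas=betas, weights=weights)  # [num_rays, 1]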

nerfstudio.model_components.renderers.background_color_override_context(mode: Float[Tensor, '3']) Generator[None, None, None][source]#

Context manager for temporarily overriding the background color used by renderers.
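
A usage sketch: inside the with-block, renderers composite against the override color instead of their configured background. The black override and the tensor shapes are illustrative assumptions:

import torch
from nerfstudio.model_components.renderers import RGBRenderer, background_color_override_context

renderer = RGBRenderer(background_color="random")
rgb = torch.rand(4096, 48, 3)
weights = torch.rand(4096, 48, 1)

with background_color_override_context(torch.tensor([0.0, 0.0, 0.0])):
    out = renderer(rgb=rgb, weights=weights)   # rendered against the black override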