nerfstudio

Demo GIFs: https://user-images.githubusercontent.com/3310961/194017985-ade69503-9d68-46a2-b518-2db1a012f090.gif and https://user-images.githubusercontent.com/3310961/194020648-7e5f380c-15ca-461d-8c1c-20beb586defe.gif


Nerfstudio provides a simple API for an end-to-end process of creating, training, and testing NeRFs. The library supports a more interpretable implementation of NeRFs by modularizing each component. With more modular NeRFs, we hope to create a more user-friendly experience in exploring the technology.
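For a concrete sense of that end-to-end flow, here is a minimal sketch that drives the standard command-line entry points (ns-process-data and ns-train) from Python. The dataset paths are placeholders, and exact flags may vary with your nerfstudio version:

import subprocess

# Placeholder paths; assumes nerfstudio and its CLI entry points are installed.
raw_images = "data/my_capture"               # folder of input photos
processed_dir = "data/my_capture_processed"  # nerfstudio-format dataset output

# 1. Convert raw images into a nerfstudio dataset (camera poses estimated via COLMAP).
subprocess.run(
    ["ns-process-data", "images", "--data", raw_images, "--output-dir", processed_dir],
    check=True,
)

# 2. Train the recommended Nerfacto model; the web viewer is served while training.
subprocess.run(["ns-train", "nerfacto", "--data", processed_dir], check=True)

The same two commands can of course be run directly from a shell; the Python wrapper here is only for illustration.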

This is a contributor-friendly repo with the goal of building a community where users can more easily build upon each other’s contributions. Nerfstudio initially launched as an open-source project by Berkeley students in the KAIR lab at Berkeley AI Research (BAIR) in October 2022 as part of a research project (paper). It is currently developed by Berkeley students and community contributors.

We are committed to providing learning resources to help you understand the basics of all things NeRF (if you’re just getting started) and keep up to date (if you’re a seasoned veteran). As researchers, we know how hard it can be to get onboarded with this next-gen technology, so we’re here to help with tutorials, documentation, and more!

Have feature requests? Want to add your brand-spankin’-new NeRF model? Have a new dataset? We welcome contributions! Please do not hesitate to reach out to the nerfstudio team with any questions via Discord.

Have feedback? We’d love for you to fill out our Nerfstudio Feedback Form if you want to let us know who you are, why you are interested in Nerfstudio, or provide any feedback!

We hope nerfstudio enables you to build faster 🔨 learn together 📚 and contribute to our NeRF community 💖.

Contents

This documentation is organized into four parts:

  • 🏃‍♀️ Getting Started: a great place to start if you are new to nerfstudio. Contains a quick tour, installation, and an overview of the core structures that will allow you to get up and running with nerfstudio.

  • 🧪 Nerfology: want to learn more about the tech itself? We’re here to help with our educational guides. We’ve provided some interactive notebooks that walk you through what each component is all about.

  • 🤓 Developer Guides: describe all of the components and additional support we provide to help you construct, train, and debug your NeRFs. Learn how to set up a model pipeline, use the viewer, create a custom config, and more.

  • 📚 Reference: describes each class and function. Develop a better understanding of the core of our technology and terminology. This section includes descriptions of each module and component in the codebase.

Supported Methods

Included Methods

  • Nerfacto: Recommended method, integrates multiple methods into one.

  • Instant-NGP: Instant Neural Graphics Primitives with a Multiresolution Hash Encoding

  • NeRF: OG Neural Radiance Fields

  • Mip-NeRF: A Multiscale Representation for Anti-Aliasing Neural Radiance Fields

  • TensoRF: Tensorial Radiance Fields

  • Splatfacto: Nerfstudio’s Gaussian Splatting implementation

Third-party Methods

  • BioNeRF: Biologically Plausible Neural Radiance Fields for View Synthesis

  • Instruct-NeRF2NeRF: Editing 3D Scenes with Instructions

  • Instruct-GS2GS: Editing 3DGS Scenes with Instructions

  • SIGNeRF: Controlled Generative Editing of NeRF Scenes

  • K-Planes: Unified 3D and 4D Radiance Fields

  • LERF: Language Embedded Radiance Fields

  • Nerfbusters: Removing Ghostly Artifacts from Casually Captured NeRFs

  • NeRFPlayer: 4D Radiance Fields by Streaming Feature Channels

  • Tetra-NeRF: Representing Neural Radiance Fields Using Tetrahedra

  • PyNeRF: Pyramidal Neural Radiance Fields

  • SeaThru-NeRF: Neural Radiance Field for subsea scenes

  • Zip-NeRF: Anti-Aliased Grid-Based Neural Radiance Fields

  • NeRFtoGSandBack: Converting back and forth between NeRF and GS to get the best of both approaches.

  • OpenNeRF: OpenSet 3D Neural Scene Segmentation

Eager to contribute a method? We’d love to see you use nerfstudio in implementing new (or even existing) methods! Please view our guide for more details about how to add to this list!

Sponsors

Sponsors of this work include Luma AI and the BAIR Commons.

Luma AI

BAIR

Built On

tyro
  • Easy-to-use config system; a short usage sketch follows this list


  • Developed by Brent Yi

nerfacc
  • Library for accelerating NeRF renders

  • Developed by Ruilong Li
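
As a small illustration of the config system, the sketch below defines a dataclass and exposes it as a command-line interface with tyro.cli. The field names and defaults are hypothetical and only mirror the general dataclass-driven pattern nerfstudio’s configs follow:

from dataclasses import dataclass

import tyro


@dataclass
class TrainArgs:
    """Hypothetical training options in a dataclass-driven style."""

    data: str = "data/my_capture"     # path to a processed dataset (placeholder)
    max_num_iterations: int = 30000   # number of training steps


if __name__ == "__main__":
    # tyro turns the dataclass into a CLI, e.g.:
    #   python train.py --data data/other_capture --max-num-iterations 5000
    args = tyro.cli(TrainArgs)
    print(args)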

Citation

You can find a paper writeup of the framework on arXiv.

If you use this library or find the documentation useful for your research, please consider citing:

@inproceedings{nerfstudio,
	title        = {Nerfstudio: A Modular Framework for Neural Radiance Field Development},
	author       = {
		Tancik, Matthew and Weber, Ethan and Ng, Evonne and Li, Ruilong and Yi, Brent
		and Kerr, Justin and Wang, Terrance and Kristoffersen, Alexander and Austin,
		Jake and Salahi, Kamyar and Ahuja, Abhik and McAllister, David and Kanazawa,
		Angjoo
	},
	year         = 2023,
	booktitle    = {ACM SIGGRAPH 2023 Conference Proceedings},
	series       = {SIGGRAPH '23}
}

Contributors

Maintainers

Name | Nerfstudio Discord | Affiliation
Justin Kerr | justin.kerr | UC Berkeley
Jonáš Kulhánek | jkulhanek | Czech Technical University in Prague
Matt Tancik | tancik | Luma AI
Matias Turkulainen | maturk | ETH Zurich
Ethan Weber | ethanweber | UC Berkeley
Brent Yi | brent | UC Berkeley