

Nerfstudio provides a simple API that enables an end-to-end process of creating, training, and visualizing NeRFs. The library keeps NeRF implementations interpretable by modularizing each component. With modular NeRF components, we hope to create a user-friendly experience for exploring the technology. Nerfstudio is a contributor-friendly repo with the goal of building a community where users can easily build upon each other’s contributions.
It’s as simple as plug and play with nerfstudio!
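For example, an end-to-end run can be as short as a few commands. The snippet below is a minimal sketch using the nerfstudio CLI (ns-process-data, ns-train, ns-viewer); the data and output paths are placeholders for your own capture.

# Convert a raw capture (a folder of images) into a nerfstudio dataset.
ns-process-data images --data data/my_capture --output-dir data/my_capture_processed
# Train the recommended Nerfacto model on the processed data.
ns-train nerfacto --data data/my_capture_processed
# Load a trained model in the web viewer (the config path shown is an example output location).
ns-viewer --load-config outputs/my_capture_processed/nerfacto/<timestamp>/config.yml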
On top of our API, we are committed to providing learning resources to help you understand the basics of all things NeRF if you’re just getting started, and to keep you up to date if you’re a seasoned veteran. As researchers, we know just how hard it is to get onboarded with this next-gen technology, so we’re here to help with tutorials, documentation, and more!
Finally, have feature requests? Want to add your brand-spankin’-new NeRF model? Have a new dataset? We welcome contributions! Please do not hesitate to reach out to the nerfstudio team with any questions via Discord.
We hope nerfstudio enables you to build faster 🔨 learn together 📚 and contribute to our NeRF community 💖.
Contents
This documentation is organized into four parts:
🏃‍♀️ Getting Started: a great place to start if you are new to nerfstudio. Contains a quick tour, installation, and an overview of the core structures that will allow you to get up and running with nerfstudio.
🧪 Nerfology: want to learn more about the tech itself? We’re here to help with our educational guides. We’ve provided some interactive notebooks that walk you through what each component is all about.
🤓 Developer Guides: describe all of the components and additional support we provide to help you construct, train, and debug your NeRFs. Learn how to set up a model pipeline, use the viewer, create a custom config, and more.
📚 Reference: describes each class and function. Develop a better understanding of the core of our technology and terminology. This section includes descriptions of each module and component in the codebase.
Supported Methods
Included Methods
Nerfacto: Recommended method; integrates multiple methods into one.
Instant-NGP: Instant Neural Graphics Primitives with a Multiresolution Hash Encoding
NeRF: OG Neural Radiance Fields
Mip-NeRF: A Multiscale Representation for Anti-Aliasing Neural Radiance Fields
TensoRF: Tensorial Radiance Fields
Third-party Methods
Instruct-NeRF2NeRF: Editing 3D Scenes with Instructions
K-Planes: Unified 3D and 4D Radiance Fields
LERF: Language Embedded Radiance Fields
Tetra-NeRF: Representing Neural Radiance Fields Using Tetrahedra
Eager to contribute a method? We’d love to see you use nerfstudio in implementing new (or even existing) methods! Please view our guide for more details about how to add to this list!
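To try the methods above, each one is launched by passing its registered name to ns-train. The sketch below assumes the typical registered identifiers (nerfacto, instant-ngp, vanilla-nerf), which may vary by version; DATA_DIR is a placeholder for your processed dataset, and third-party methods appear in the same list once their packages are installed.

ns-train --help                        # lists every method registered in your install
ns-train nerfacto --data DATA_DIR      # recommended default
ns-train instant-ngp --data DATA_DIR   # Instant-NGP
ns-train vanilla-nerf --data DATA_DIR  # original NeRF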
Quicklinks
How-to Videos
Demo video on how to run nerfstudio and use the viewer.
Demo video on how to run nerfstudio without using COLMAP.
Built On
tyro: Easy-to-use config system. Developed by Brent Yi.
nerfacc: Library for accelerating NeRF renders. Developed by Ruilong Li.
Citation
You can find a paper writeup of the framework on arXiv.
If you use this library or find the documentation useful for your research, please consider citing:
@inproceedings{nerfstudio,
    title = {Nerfstudio: A Modular Framework for Neural Radiance Field Development},
    author = {Tancik, Matthew and Weber, Ethan and Ng, Evonne and Li, Ruilong and Yi, Brent
              and Kerr, Justin and Wang, Terrance and Kristoffersen, Alexander and Austin,
              Jake and Salahi, Kamyar and Ahuja, Abhik and McAllister, David and Kanazawa,
              Angjoo},
    year = 2023,
    booktitle = {ACM SIGGRAPH 2023 Conference Proceedings},
    series = {SIGGRAPH '23}
}
Contributors
Maintainers
Nerfstudio Discord | Affiliation
---|---
justin.kerr | UC Berkeley
jkulhanek | Czech Technical University in Prague
tancik | UC Berkeley
ethanweber | UC Berkeley
brent | UC Berkeley