To get started with NVIDIA Instant NeRF, check out the [blog post](https://developer.nvidia.com/blog/getting-started-with-nvidia-instant-nerfs/) and [SIGGRAPH tutorial](https://www.nvidia.com/en-us/on-demand/session/siggraph2022-sigg22-s-16/).
For business inquiries, please submit the [NVIDIA research licensing form](https://www.nvidia.com/en-us/research/inquiries/).
## Installation
If you have Windows, download one of the following releases corresponding to your graphics card and extract it. Then, start `instant-ngp.exe`.
- [**RTX 3000 & 4000 series, RTX A4000–A6000**, and other Ampere & Ada cards](https://github.com/NVlabs/instant-ngp/releases/download/continuous/Instant-NGP-for-RTX-3000-and-4000.zip)
- [**RTX 2000 series, Titan RTX, Quadro RTX 4000–8000**, and other Turing cards](https://github.com/NVlabs/instant-ngp/releases/download/continuous/Instant-NGP-for-RTX-2000.zip)
- [**GTX 1000 series, Titan Xp, Quadro P1000–P6000**, and other Pascal cards](https://github.com/NVlabs/instant-ngp/releases/download/continuous/Instant-NGP-for-GTX-1000.zip)
Keep reading for a guided tour of the application or, if you are interested in creating your own NeRF, watch [the video tutorial](https://www.youtube.com/watch?v=3TWxO1PftMc) or read the [written instructions for creating your own NeRF](docs/nerf_dataset_tips.md).
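If you want a rough preview of what that workflow involves before diving into the tutorial, the sketch below shows the kind of invocation described in the linked guide. It assumes a checkout where `scripts/colmap2nerf.py` and these flags are available, that COLMAP is installed, and that `data/<your-scene>/images` is a placeholder for your own photo folder; the linked instructions remain the authoritative reference.

```sh
# Run COLMAP over your photos and write a transforms.json that instant-ngp can load.
python scripts/colmap2nerf.py --colmap_matcher exhaustive --run_colmap \
    --aabb_scale 16 --images data/<your-scene>/images
```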
If you use Linux, or want the [developer Python bindings](https://github.com/NVlabs/instant-ngp#python-bindings), or if your GPU is not listed above (e.g. Hopper, Volta, or Maxwell generations), you need to [build __instant-ngp__ yourself](https://github.com/NVlabs/instant-ngp#building-instant-ngp-windows--linux).
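As a quick orientation, the build boils down to a recursive clone followed by a CMake configure and compile. This is a minimal sketch, assuming CUDA, CMake, and a recent C++ compiler are already installed; see the linked build instructions for the full prerequisites and troubleshooting.

```sh
# Clone including submodules, which carry the third-party dependencies.
git clone --recursive https://github.com/NVlabs/instant-ngp
cd instant-ngp

# Configure and compile with CMake.
cmake . -B build -DCMAKE_BUILD_TYPE=RelWithDebInfo
cmake --build build --config RelWithDebInfo -j
```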
## Usage
...
Download the [nanovdb volume for the Disney cloud](https://drive.google.com/drive/folders/1SuycSAOSG64k2KLV7oWgyNWyCvZAkafK?usp=sharing), which is derived [from here](https://disneyanimation.com/data-sets/?drawer=/resources/clouds/) ([CC BY-SA 3.0](https://media.disneyanimation.com/uploads/production/data_set_asset/6/asset/License_Cloud.pdf)).
Then drag `wdas_cloud_quarter.nvdb` into the window or use the command:
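The command itself is elided in this excerpt; as a minimal sketch, assuming the `instant-ngp` executable from the steps above and the downloaded volume sitting in your working directory, it would look something like this.

```sh
# Open the NanoVDB cloud volume in the viewer.
instant-ngp wdas_cloud_quarter.nvdb
```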
...
There is a bit more information about the GUI [in this post](https://developer.nvidia.com/blog/getting-started-with-nvidia-instant-nerfs/) and [in this video guide to creating your own video](https://www.youtube.com/watch?v=3TWxO1PftMc).
### VR controls
To view the neural graphics primitive in VR, first start your VR runtime. This will most likely be either
- __OculusVR__ if you have an Oculus Rift or Meta Quest (with link cable) headset, and
- __SteamVR__ if you have another headset.
- Any OpenXR-compatible runtime will work.
Then, press the __View in VR/AR headset__ button in the __instant-ngp__ GUI and put on your headset.
In VR, you have the following controls.
| Control | Meaning |
| :--------------------: | ------------- |
| Left stick / trackpad | Move |
| Right stick / trackpad | Turn camera |
| Press stick / trackpad | Erase NeRF around the hand |