## Preparing new NeRF datasets
To train on self-captured data, one has to process the data into an existing format supported by `instant-ngp`. We provide scripts to support three approaches:
- [COLMAP](#COLMAP) to create a dataset from a set of photos or a video you took
- [Record3D](#Record3D) to create a dataset with an iPhone 12 Pro or newer (based on ARKit)
- [NeRFCapture](#NeRFCapture) to create a dataset or stream posed images directly to `instant-ngp` with an iOS device.
All require [Python](https://www.python.org/) 3.7 or higher to be installed and available in your PATH.
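Whichever approach you use, the end result is a folder of images plus a `transforms.json` file describing camera intrinsics and per-image poses. As a rough sketch of that layout (field names follow the original NeRF dataset convention; all values below are placeholders):

```
import json

# Placeholder values for illustration only; the scripts described below
# produce the real file.
transforms = {
    "camera_angle_x": 0.69,  # horizontal field of view, in radians
    "frames": [
        {
            "file_path": "images/0001.png",  # path relative to transforms.json
            "transform_matrix": [            # 4x4 camera-to-world matrix
                [1.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 2.0],
                [0.0, 0.0, 0.0, 1.0],
            ],
        },
    ],
}

with open("transforms.json", "w") as f:
    json.dump(transforms, f, indent=2)
```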
...
### Record3D
With an iPhone 12 Pro or newer, one can use [Record3D](https://record3d.app/) to collect data and avoid COLMAP. Record3D is an iOS app that relies on ARKit to estimate each image's camera pose; it is more robust than COLMAP for scenes that lack textures or contain repetitive patterns. To train `instant-ngp` with Record3D data, follow these steps:
1. Record a video and export with the "Shareable/Internal format (.r3d)".
2. Send the exported data to your computer.
...
If you captured the scene in landscape orientation, add `--rotate` to the conversion command.
5. Launch `instant-ngp` training:
```
instant-ngp$ ./instant-ngp path/to/data
```
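Before launching training, it can be worth sanity-checking the conversion output. A minimal sketch, assuming the previous step wrote a standard `transforms.json` next to the images (the path below is a placeholder):

```
import json
from pathlib import Path

data = Path("path/to/data")  # placeholder: the folder the conversion step wrote
with open(data / "transforms.json") as f:
    transforms = json.load(f)

frames = transforms["frames"]
print(f"{len(frames)} posed frames")

# Every referenced image should resolve relative to the dataset folder.
missing = [fr["file_path"] for fr in frames if not (data / fr["file_path"]).exists()]
print("missing images:", missing if missing else "none")
```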
### NeRFCapture
[NeRFCapture](https://github.com/jc211/NeRFCapture) is an iOS app that runs on any ARKit device. It allows you to stream images directly from your phone to `instant-ngp`, thus enabling a more interactive experience. It can also collect an offline dataset for later use.
The following dependencies are needed to run the NeRFCapture script:
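```
# Assumed from the upstream NeRFCapture instructions: CycloneDDS handles
# the streaming between the app and instant-ngp.
pip install cyclonedds
```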
parser.add_argument("--stream",action="store_true",help="Stream images directly to InstantNGP.")
parser.add_argument("--n_frames",default=10,type=int,help="Number of frames before saving the dataset. Also used as the number of cameras to remember when streaming.")
parser.add_argument("--save_path",required='--stream'notinsys.argv,type=str,help="Path to save the dataset.")
parser.add_argument("--depth_scale",default=4.0,type=float,help="Depth scale used when saving depth. Only used when saving dataset.")
parser.add_argument("--depth_scale",default=10.0,type=float,help="Depth scale used when saving depth. Only used when saving dataset.")
parser.add_argument("--overwrite",action="store_true",help="Rewrite over dataset if it exists.")