Update README.md
In our environment, we use pytorch=1.13.1+cu116.

The dataset provided in [D-NeRF](https://github.com/albertpumarola/D-NeRF) is used. You can download the dataset from [dropbox](https://www.dropbox.com/s/0bf6fl0ye2vz3vr/data.zip?dl=0).

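If you prefer the command line, the download can be scripted. This is only a sketch: `?dl=1` is Dropbox's direct-download variant of the shared link above, and the target folder is an assumption you may need to adjust.

```bash
# Fetch the D-NeRF archive (dl=1 makes Dropbox serve the file directly
# instead of showing the preview page).
wget -O data.zip "https://www.dropbox.com/s/0bf6fl0ye2vz3vr/data.zip?dl=1"

# Unpack into ./data; drop the -d flag if the archive already
# contains a top-level data/ folder.
unzip -q data.zip -d data
rm data.zip
```
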
**For real dynamic scenes:**

The dataset provided in [HyperNeRF](https://github.com/google/hypernerf) is used. You can download scenes from the [HyperNeRF Dataset](https://github.com/google/hypernerf/releases/tag/v0.1) and organize them as in [Nerfies](https://github.com/google/nerfies#datasets).

Meanwhile, the [Plenoptic Dataset](https://github.com/facebookresearch/Neural_3D_Video) can be downloaded from its official website. To save memory, extract the frames of each video and then organize your dataset as follows.

```
├── data
...
```
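The per-video frame extraction mentioned above can be sketched with `ffmpeg`. The scene and camera file names below are illustrative assumptions, not the repository's required layout:

```bash
# Dump every camera video of a scene to numbered PNG frames,
# one folder per camera (cam00.mp4 -> cam00/0001.png, ...).
for v in data/dynerf/cut_roasted_beef/cam*.mp4; do
  d="${v%.mp4}"            # strip the extension to name the frame folder
  mkdir -p "$d"
  ffmpeg -loglevel error -i "$v" "$d/%04d.png"
done
```
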

```bash
python scripts/downsample_point.py data/dynerf/cut_roasted_beef/colmap/dense/wor
# Finally, train.
python train.py -s data/dynerf/cut_roasted_beef --port 6017 --expname "dynerf/cut_roasted_beef" --configs arguments/dynerf/cut_roasted_beef.py
```

For training HyperNeRF scenes such as `virg/broom`: pregenerated COLMAP point clouds are provided [here](https://drive.google.com/file/d/1fUHiSgimVjVQZ2OOzTFtz02E9EqCoWr5/view). Download them into the corresponding folder and you can skip the first two steps below. Alternatively, run the commands directly.

```bash
# First, compute dense point clouds with COLMAP.
bash colmap.sh data/hypernerf/virg/broom2 hypernerf
# Second, downsample the point clouds generated in the first step.
python scripts/downsample_point.py data/hypernerf/virg/broom2/colmap/dense/workspace/fused.ply data/hypernerf/virg/broom2/points3D_downsample2.ply
# Finally, train.
python train.py -s data/hypernerf/virg/broom2/ --port 6017 --expname "hypernerf/broom2" --configs arguments/hypernerf/broom2.py
```

For your custom datasets, install nerfstudio and follow their [COLMAP](https://colmap.github.io/) pipeline. Install COLMAP first, then:

```bash
pip install nerfstudio
```
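A typical next step in nerfstudio's pipeline is `ns-process-data`, which runs COLMAP on a folder of images. This is a sketch: the input and output paths are placeholders, not names from this repository.

```bash
# Placeholder paths; adjust to your capture. ns-process-data runs COLMAP
# and converts the result into nerfstudio's camera-pose format.
src="data/your-images"
out="${src}-ns"                     # e.g. data/your-images-ns
ns-process-data images --data "$src" --output-dir "$out"
```
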