Hi~ I believe nerfstudio uses nerfacc to train instant-ngp models on the nerf-synthetic dataset. When training on the ficus scene, I notice floaters inside the scene cube, which hurt the PSNR drastically, whereas no such artifacts appear in scenes like lego.
Here is a depth map rendered from the ficus scene:
Any idea on how to fix this? Thanks!
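One thing I have been wondering about is whether a small opacity/entropy penalty on the accumulated ray weights would suppress these floaters. The snippet below is only a rough sketch of what I mean; the `accumulation` output name and the loss weight are assumptions on my side, not something the current model necessarily exposes, and I have not tried it yet.

import torch

def opacity_entropy_loss(accumulation: torch.Tensor, scale: float = 1e-3) -> torch.Tensor:
    # `accumulation` is assumed to be the per-ray accumulated alpha in [0, 1],
    # shape (num_rays, 1). Pushing it away from intermediate values should
    # discourage the semi-transparent blobs that show up as floaters.
    eps = 1e-6
    acc = accumulation.clamp(eps, 1.0 - eps)
    entropy = -(acc * torch.log(acc) + (1.0 - acc) * torch.log(1.0 - acc))
    return scale * entropy.mean()

# hypothetical usage inside the model's get_loss_dict():
# loss_dict["opacity_entropy"] = opacity_entropy_loss(outputs["accumulation"])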
Here is a copy of my config:
# Imports are my best guess at the nerfstudio module paths I am using;
# SegNGPModelConfig is my own custom model config (import omitted).
from nerfstudio.data.datamanagers.base_datamanager import VanillaDataManagerConfig
from nerfstudio.data.dataparsers.blender_dataparser import BlenderDataParserConfig
from nerfstudio.engine.optimizers import AdamOptimizerConfig
from nerfstudio.engine.schedulers import MultiStepSchedulerConfig
from nerfstudio.engine.trainer import TrainerConfig
from nerfstudio.pipelines.dynamic_batch import DynamicBatchPipelineConfig
from nerfstudio.plugins.types import MethodSpecification

ngp_config = MethodSpecification(
    config=TrainerConfig(
        method_name="ngp",
        steps_per_eval_batch=200,
        steps_per_save=2000,
        steps_per_eval_all_images=2000,
        max_num_iterations=20000,
        mixed_precision=False,
        use_grad_scaler=False,
        pipeline=DynamicBatchPipelineConfig(
            datamanager=VanillaDataManagerConfig(
                dataparser=BlenderDataParserConfig(),
            ),
            target_num_samples=1 << 18,
            max_num_samples_per_ray=2**8,
            model=SegNGPModelConfig(eval_num_rays_per_chunk=8192),
        ),
        optimizers={
            "fields": {
                "optimizer": AdamOptimizerConfig(lr=1e-2, eps=1e-15, weight_decay=1e-5),  # 1e-5 if scene in ["materials", "ficus", "drums"] else 1e-6 (see helper sketch below)
                "scheduler": MultiStepSchedulerConfig(
                    max_steps=20000,
                    gamma=0.33,
                    milestones=(10000, 15000, 18000),  # need LinearLR warmup? (see scheduler sketch below)
                ),
            },
        },
        vis="wandb",
    ),
    description="Segment NGP config",
)
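About the weight-decay comment in the optimizer block: I currently change the value by hand per scene. A tiny helper like the one below, with the scene names hard-coded purely for illustration and mirroring that comment, is what I had in mind:

def weight_decay_for_scene(scene: str) -> float:
    # Heavier regularization for the scenes that seem more prone to floaters;
    # the split below just mirrors the comment in the config and is not tuned.
    return 1e-5 if scene in ("materials", "ficus", "drums") else 1e-6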
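As for the "need LinearLR warmup?" question in the scheduler block: what I have been considering is chaining a short linear warmup in front of the milestone decay. The sketch below is plain PyTorch rather than a nerfstudio scheduler config, and the warmup length and start factor are guesses on my part:

import torch
from torch.optim.lr_scheduler import ChainedScheduler, LinearLR, MultiStepLR

params = [torch.nn.Parameter(torch.zeros(1))]  # stand-in for the field parameters
optimizer = torch.optim.Adam(params, lr=1e-2, eps=1e-15, weight_decay=1e-5)

# Warm up over the first 100 steps (guessed length), then apply the same
# milestone decay as in the config above.
scheduler = ChainedScheduler([
    LinearLR(optimizer, start_factor=0.01, total_iters=100),
    MultiStepLR(optimizer, milestones=[10000, 15000, 18000], gamma=0.33),
])

for step in range(20000):
    optimizer.step()  # the actual training step would go here
    scheduler.step()

I would still need to wrap something like this in a nerfstudio scheduler config to use it with the optimizer setup above.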