if custom dataset, have trouble, then ... #43

Open
yuedajiong opened this issue Oct 17, 2023 · 5 comments

Comments

@yuedajiong commented Oct 17, 2023

I have generated my dataset from video and reconstructed a mesh successfully.
If you have any questions, feel free to ask me; I will try my best to answer you, if our 19reborn is busy. :-)

Surface quality optimization is still in progress ...

[screenshots of the reconstructed mesh]

@FeiiYin commented Oct 18, 2023

Hi, I want to ask about the dataset preprocessing. I use a synthetic dataset from SAPIEN, which runs fine on threestudio. It cannot run here and raises "Nerf training generated 0 samples, Aborting training."

I followed #26 (comment) to preprocess the dataset, and it raises the error below. It seems that my input data (camera or mask) is not right.
My input camera is proj_mtx_new, and the mask is extracted from the alpha channel of a PNG.
Thanks in advance!

c2w = transform[:3, :4]           # transform is a synthesized 4x4 camera-to-world matrix
mv = torch.linalg.inv(transform)  # world-to-camera (model-view) matrix
fovy = np.random.uniform(self.fovy_range_min, self.fovy_range_max)
self.resolution = [512, 512]
K = np.array([
    [fovy, 0, self.resolution[0] / 2, 0],
    [0, fovy, self.resolution[1] / 2, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
])
K = torch.from_numpy(K).float()
proj_mtx_new = K @ mv             # projection matrix passed to the preprocessing script

Number of points:0
/mnt/petrelfs/wangtengfei/code/NeuS2/tools/preprocess_cameras.py:220: RuntimeWarning: Mean of empty slice.
  centroid = np.array(all_Xs).mean(axis=0)
/opt/conda/lib/python3.9/site-packages/numpy/core/_methods.py:192: RuntimeWarning: invalid value encountered in scalar divide
  ret = ret.dtype.type(ret / rcount)
/opt/conda/lib/python3.9/site-packages/numpy/core/_methods.py:269: RuntimeWarning: Degrees of freedom <= 0 for slice
  ret = _var(a, axis=axis, dtype=dtype, out=out, ddof=ddof,
/opt/conda/lib/python3.9/site-packages/numpy/core/_methods.py:226: RuntimeWarning: invalid value encountered in divide
  arrmean = um.true_divide(arrmean, div, out=arrmean,
/opt/conda/lib/python3.9/site-packages/numpy/core/_methods.py:261: RuntimeWarning: invalid value encountered in scalar divide
  ret = ret.dtype.type(ret / rcount)
Traceback (most recent call last):
  File "/mnt/petrelfs/wangtengfei/code/NeuS2/tools/preprocess_cameras.py", line 312, in <module>
    get_normalization(opt.source_dir, opt.use_linear_init, dataset)
  File "/mnt/petrelfs/wangtengfei/code/NeuS2/tools/preprocess_cameras.py", line 266, in get_normalization
    normalization, all_Xs = get_normalization_function(Ps, mask_points_all, number_of_normalization_points, number_of_cameras,masks_all)
  File "/mnt/petrelfs/wangtengfei/code/NeuS2/tools/preprocess_cameras.py", line 225, in get_normalization_function
    centroid,scale,all_Xs = refine_visual_hull(masks_all, Ps, scale, centroid)
  File "/mnt/petrelfs/wangtengfei/code/NeuS2/tools/preprocess_cameras.py", line 151, in refine_visual_hull
    points = points + center[:, np.newaxis]
IndexError: invalid index to scalar variable.
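
One thing worth double-checking (my guess as a reader, not something from the NeuS2 code): fovy sampled above is an angle, but the top-left entries of an intrinsic matrix are normally a focal length in pixels. A minimal sketch of the usual pinhole construction, assuming fovy is in radians and a 512x512 image:

import numpy as np

# Sketch only (my assumed pinhole convention, not NeuS2's own code): convert a
# vertical field of view in radians into a focal length in pixels before
# filling the 4x4 intrinsic matrix K.
def intrinsics_from_fovy(fovy_rad, width=512, height=512):
    focal = 0.5 * height / np.tan(0.5 * fovy_rad)  # focal length in pixels
    return np.array([
        [focal, 0.0, width / 2.0, 0.0],
        [0.0, focal, height / 2.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ])

If fovy is sampled in degrees, convert it with np.deg2rad first. If the projection matrices fed to preprocess_cameras.py use an angle as the focal entry, the visual hull step can easily end up with zero points, which would match the empty-slice warnings above.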

@zkx11919369933

I shot the video with my phone and got:
ValueError: operands could not be broadcast together with shapes (1920,1080,4) (1920,1080,3)
What should I do?

@MikeJPelton

Something similar happened to me. I think it's expecting your input images and/or masks to be 4 bytes per pixel, not 3 (i.e. RGB + alpha, not just RGB). You can probably convert them with ImageMagick, or in code using OpenCV, as sketched below. Hope it helps!
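
A minimal sketch of the OpenCV route mentioned above (the file names are placeholders, and it assumes the mask is a single-channel 0/255 image that should become the alpha channel):

import cv2

# Sketch: add an alpha channel to an RGB image and fill it from a mask.
img = cv2.imread("frame_0001.png")                        # H x W x 3, BGR
mask = cv2.imread("mask_0001.png", cv2.IMREAD_GRAYSCALE)  # H x W, values 0/255

rgba = cv2.cvtColor(img, cv2.COLOR_BGR2BGRA)  # H x W x 4, alpha initialized to 255
rgba[:, :, 3] = mask                          # use the mask as the alpha channel
cv2.imwrite("frame_0001_rgba.png", rgba)

If you have no separate mask, the cvtColor call alone gives a fully opaque alpha channel, which at least fixes the shape mismatch.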

@txyugood commented Apr 9, 2024

How can I improve the quality of the reconstruction result? It looks a bit damaged.
[screenshot of the reconstruction result]

@Start1er commented Aug 8, 2024

Hi:
I use a custom dataset, run https://github.com/NVlabs/instant-ngp/blob/master/scripts/colmap2nerf.py to get a transforms.json, and then run NeuS2. But no matter what I set aabb_scale to, "fog" exists in the cube, which prevents the object mesh from exporting cleanly. So I want to ask: does the custom data need some pre-processing so that the SDF concentrates only on the object and not on the "fog"? I find it hard to get a clean object like your results without the "fog" in the cube.

[screenshot of the reconstruction with "fog" inside the bounding cube]
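
Not an official answer, but one workaround I would try is post-processing the exported mesh to keep only its largest connected component, which usually drops the floating "fog" fragments; a sketch using trimesh (the paths are placeholders):

import trimesh

# Workaround sketch (my own post-processing idea, not part of NeuS2): split the
# exported mesh into connected components and keep only the largest one.
mesh = trimesh.load("neus2_mesh.obj", force="mesh")  # placeholder path to the exported mesh
parts = mesh.split(only_watertight=False)            # list of connected components
largest = max(parts, key=lambda m: len(m.faces))
largest.export("neus2_mesh_clean.obj")

Masking out the background in the input images (as discussed earlier in this thread) also helps, since the SDF then gets no supervision pushing it to reconstruct the surrounding space.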
