
Bug in classify_components_ep #543

Closed
haesemeyer opened this issue May 27, 2019 · 6 comments

@haesemeyer

  • Tell us about your operating system (Linux/macOS/Windows), Python version (3.x), working environment (Spyder/Jupyter Notebook/other), and (if applicable) which of the demo scripts you're using for your analysis
    Python 3.6 on Ubuntu Linux, running CaImAn from Jupyter QtConsole with single-threaded=True.

  • Describe the issue that you are experiencing
    During component validation of an imaging stack, classify_components_ep attempts to correlate two scalar values, raising a ValueError on line 193 (rval[i] = scipy.stats.pearsonr(mY, atemp[px])[0]). This is caused by px being a scalar. There should probably be a check in the function for this (extremely rare) occurrence: if only one point passes threshold (np.where(atemp > 0)[0]), an r-value of 0 should be appended instead of calling pearsonr (as is currently the case when LOC[i] is None). Something like the following (see also the sketch after the error log below):

        if px.size < 2:
            rval[i] = 0
            significant_samples.append(0)
            continue

  • Copy error log below
    ValueError Traceback (most recent call last)
    ~/Documents/Python/imaging_pipeline/cai_test.py in
    193 all_c = []
    194 for fn in fnames:
    --> 195 fit_cnm2 = main2(zoom, frame_duration, fn)
    196 all_c.append(fit_cnm2.estimates.C)
    197 # all_c = np.vstack(all_c)

~/Documents/Python/imaging_pipeline/cai_test.py in main2(zoom_level, t_per_frame, filename)
182 cai = CaImAn(4.0, 500/zoom_level, t_per_frame)
183 im = cai.motion_correct(filename)[0]
--> 184 return cai.extract_components(im, filename)[1]
185
186

~/Documents/Python/imaging_pipeline/cai_wrapper.py in extract_components(self, images, fname)
196 'cnn_lowest': cnn_lowest}
197 cnm2.params.set('quality', val_dict)
--> 198 cnm2.estimates.evaluate_components(images, cnm2.params, dview=dview)
199 # update object with selected components
200 cnm2.estimates.select_components(use_object=True)

~/anaconda3/envs/caiman/lib/python3.6/site-packages/caiman/source_extraction/cnmf/estimates.py in evaluate_components(self, imgs, params, dview)
841 thresh_cnn_lowest=opts['cnn_lowest'],
842 r_values_lowest=opts['rval_lowest'],
--> 843 min_SNR_reject=opts['SNR_lowest'])
844 self.idx_components = idx_components.astype(int)
845 self.idx_components_bad = idx_components_bad.astype(int)

~/anaconda3/envs/caiman/lib/python3.6/site-packages/caiman/components_evaluation.py in estimate_components_quality_auto(Y, A, C, b, f, YrA, frate, decay_time, gSig, dims, dview, min_SNR, r_values_min, r_values_lowest, Npeaks, use_cnn, thresh_cnn_min, thresh_cnn_lowest, thresh_fitness_delta, min_SNR_reject, gSig_range)
470 _, _, fitness_raw, _, r_values = estimate_components_quality( # type: ignore # mypy cannot reason about return_all
471 traces, Y, A, C, b, f, final_frate=frate, Npeaks=Npeaks, r_values_min=r_values_min, fitness_min=fitness_min,
--> 472 fitness_delta_min=thresh_fitness_delta, return_all=True, dview=dview, num_traces_per_group=50, N=N_samples)
473
474 comp_SNR = -norm.ppf(np.exp(fitness_raw / N_samples))

~/anaconda3/envs/caiman/lib/python3.6/site-packages/caiman/components_evaluation.py in estimate_components_quality(traces, Y, A, C, b, f, final_frate, Npeaks, r_values_min, fitness_min, fitness_delta_min, return_all, N, remove_baseline, dview, robust_std, Athresh, thresh_C, num_traces_per_group)
628 res = dview.map_sync(evaluate_components_placeholder, params)
629
--> 630 for r_ in res:
631 fitness_raw__, fitness_delta__, erfc_raw__, erfc_delta__, r_values__, _ = r_
632 fitness_raw = np.concatenate([fitness_raw, fitness_raw__])

~/anaconda3/envs/caiman/lib/python3.6/site-packages/caiman/components_evaluation.py in evaluate_components_placeholder(params)
386 fitness_raw, fitness_delta, _, _, r_values, significant_samples =
387 evaluate_components(Y, traces, A, C, b, f, final_frate, remove_baseline=remove_baseline,
--> 388 N=N, robust_std=robust_std, Athresh=Athresh, Npeaks=Npeaks, thresh_C=thresh_C)
389
390 return fitness_raw, fitness_delta, [], [], r_values, significant_samples

~/anaconda3/envs/caiman/lib/python3.6/site-packages/caiman/components_evaluation.py in evaluate_components(Y, traces, A, C, b, f, final_frate, remove_baseline, N, robust_std, Athresh, Npeaks, thresh_C, sigma_factor)
370 # compute the overlap between spatial and movie average across samples with significant events
371 r_values, significant_samples = classify_components_ep(Yr, A, C, b, f, Athresh=Athresh, Npeaks=Npeaks, tB=tB,
--> 372 tA=tA, thres=thresh_C)
373
374 return fitness_raw, fitness_delta, erfc_raw, erfc_delta, r_values, significant_samples

~/anaconda3/envs/caiman/lib/python3.6/site-packages/caiman/components_evaluation.py in classify_components_ep(Y, A, C, b, f, Athresh, Npeaks, tB, tA, thres)
191 mY = np.mean(ysqr[:, indexes], axis=-1)
192 significant_samples.append(indexes)
--> 193 rval[i] = scipy.stats.pearsonr(mY, atemp[px])[0]
194
195 else:

~/anaconda3/envs/caiman/lib/python3.6/site-packages/scipy/stats/stats.py in pearsonr(x, y)
3390
3391 if n < 2:
-> 3392 raise ValueError('x and y must have length at least 2.')
3393
3394 x = np.asarray(x)

ValueError: x and y must have length at least 2.
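For illustration, a minimal, self-contained sketch of the guard suggested above. The helper name safe_space_correlation and the toy arrays are hypothetical; only the px.size check and the fallback r-value of 0 come from the report, and this is not the actual CaImAn source:

    import numpy as np
    import scipy.stats

    def safe_space_correlation(mY, atemp, px):
        """Space correlation for one component, or 0.0 when too few pixels
        pass threshold for pearsonr (which needs vectors of length >= 2)."""
        if px.size < 2:
            return 0.0
        return scipy.stats.pearsonr(mY, atemp[px])[0]

    # Component with two active pixels: pearsonr is applied as usual.
    atemp = np.array([0.0, 0.7, 0.0, 0.3])
    px = np.where(atemp > 0)[0]
    print(safe_space_correlation(np.array([1.2, 0.4]), atemp, px))   # 1.0

    # Component with a single active pixel: the guard returns 0.0 instead of raising.
    atemp1 = np.array([0.0, 0.0, 0.9, 0.0])
    px1 = np.where(atemp1 > 0)[0]
    print(safe_space_correlation(np.array([0.5]), atemp1, px1))      # 0.0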

@epnev
Contributor

epnev commented May 28, 2019

@haesemeyer Thanks, I had never encountered this before. I'm wondering whether the size threshold should be 2 or 3 here, given that the Pearson correlation between two vectors of length 2 is always either +1 or -1.
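(For illustration, not from the thread: a quick check of that claim with scipy. Any two length-2 vectors with non-constant entries are perfectly correlated or anti-correlated, because a line fits two points exactly.)

    import scipy.stats

    # Two points always lie exactly on a line, so |r| == 1 for length-2 vectors
    # with non-constant entries.
    print(scipy.stats.pearsonr([1.0, 2.0], [5.0, 9.0])[0])   # 1.0
    print(scipy.stats.pearsonr([1.0, 2.0], [9.0, 5.0])[0])   # -1.0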

@haesemeyer
Author

@epnev Yes, to be fair I have so far encountered only one stack where this is a problem ;-). However, since there is no way to catch this exception, it halts the whole analysis of that particular stack, which otherwise contains useful data.
That's a good point. It is doubtful whether validating this case makes sense anyway, so setting the size threshold to 3 is likely more sensible.

@epnev
Contributor

epnev commented May 28, 2019

Fixed with ffac9c5
Thanks!

@epnev epnev closed this as completed May 28, 2019
@haesemeyer
Author

@epnev
Thanks for the fix, but unfortunately commit ffac9c5 has two errors. The .format calls are misplaced in the following (placed after the logging.warning call instead of at the end of the string):

logging.warning('Component {0} is only active jointly with ' +
                'neighboring components. Space correlation ' +
                'calculation might be unreliable.').format(i)

logging.warning('Component {0} is almost empty. Space ' +
                'correlation is set to 0.').format(i)
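For reference, a hedged guess at the intended corrected form, with .format applied to the string rather than to the return value of logging.warning (i stands in for the component index from the surrounding loop):

    import logging

    i = 3   # stand-in for the component index in the CaImAn loop
    logging.warning(('Component {0} is only active jointly with ' +
                     'neighboring components. Space correlation ' +
                     'calculation might be unreliable.').format(i))
    logging.warning(('Component {0} is almost empty. Space ' +
                     'correlation is set to 0.').format(i))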

epnev added a commit that referenced this issue May 28, 2019
@epnev
Contributor

epnev commented May 28, 2019

@haesemeyer sorry about that and thanks for noticing. Should be fine now 1db919b

@haesemeyer
Author

@epnev Perfect thank you very much!

tvajtay pushed a commit to tvajtay/CaImAn that referenced this issue Jun 11, 2019