
Test coverage drop when upgrading to 0.11.2 from 0.10.0 #467

Closed
David-Sharpe opened this issue Mar 1, 2016 · 4 comments

Comments

@David-Sharpe

When upgrading from simplecov 0.10.0 to 0.11.2, our code coverage drops significantly: from 100% to 87.67%.

This is the result set from the 100% coverage run on simplecov 0.10.0:
resultset_0_10.txt

This is the result set from the 87.67% coverage run on simplecov 0.11.2:
resultset_0_11.txt

The only difference between these two runs is the upgrade to simplecov 0.11.2 and running bundle.
I have noticed that some files appear in more sections with the newer version. For example, a file may show up in sections 1 and 2 of 4 in the 0_11 result set instead of just section 1 of 4 in the 0_10 result set. The additional entries seem to have all lines as 0s. I'm guessing these extra all-zero entries are driving down the test coverage during the merge.
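
A quick illustrative sketch of why that would matter, assuming merge semantics in which per-line hit counts from the parallel result sets are summed (this is not SimpleCov's actual code): an extra all-zero entry for a file that other processes do exercise is harmless, but a file that only ever appears with zeros counts as completely uncovered.

```ruby
# Hypothetical merge of two per-line hit-count arrays (nil = line not relevant to coverage).
def merge_line_counts(a, b)
  a.zip(b).map do |x, y|
    next nil if x.nil? && y.nil?
    (x || 0) + (y || 0)
  end
end

merge_line_counts([1, 0, nil, 2], [0, 0, nil, 0]) # => [1, 0, nil, 2]
merge_line_counts([0, 0, nil, 0], [0, 0, nil, 0]) # => [0, 0, nil, 0]  drags the total down
```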

ruby 2.0.0p481 (2014-05-08 revision 45883) [x86_64-darwin14.3.0]
Rails 4.1
RSpec 3.4.1
We are using parallel_tests 1.5.1 with 4 threads.
Tests are invoked with rake parallel:spec[4]

We are not using a .simplecov file.

@David-Sharpe (Author)

The problem comes from track_files. Setting track_files to false in spec_helper, inside the SimpleCov.start 'rails' block, fixes this.
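
A minimal sketch of that workaround, assuming a standard RSpec setup (the file path and surrounding configuration are illustrative):

```ruby
# spec/spec_helper.rb
require 'simplecov'

SimpleCov.start 'rails' do
  # Override the track_files glob that the 'rails' profile adds as of 0.11.x,
  # so files never loaded by this test process are not reported as 0% covered.
  track_files false
end
```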

@rdunlop commented Mar 11, 2016

@David-Sharpe Could this be caused by #441?

@wkirby commented Jun 14, 2016

@David-Sharpe Setting track_files to false excludes all files with 0% coverage from the report, arbitrarily inflating our coverage metrics. Not setting track_files to false shows no-ops as not covered, deflating our coverage metrics.

What is the option for getting accurate coverage metrics?
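
For reference, a rough sketch of the two configurations being contrasted; the glob shown is an assumption based on what the 'rails' profile is understood to track, so adjust it for your project:

```ruby
# Option A: keep tracking files the suite never loads, so genuinely untested
# code still counts against coverage (can deflate the number, as noted above).
SimpleCov.start 'rails' do
  track_files "{app,lib}/**/*.rb"
end

# Option B: the workaround above -- stop tracking unloaded files, which
# excludes them from the report and can inflate the number.
SimpleCov.start 'rails' do
  track_files false
end
```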

@PragTob (Collaborator) commented Feb 4, 2017

This should be fixed with the recent merge of the track_files fix; if not, please let us know.

@PragTob closed this as completed on Feb 4, 2017.