Implement the various status badges #786
For coverage and CI status, do we want a package maintainer to be able to explicitly specify links to travis/circleci/coveralls/etc., and use their individual APIs to determine status, or do we want to auto-magically parse these from the badges in the package description?
I'm not sure, to be honest. I think this is probably something we'll want to punt on until after launch (other than removing the non-functional badges for now), unless someone feels motivated to think it through and figure out what the best form of it is. Off the top of my head, I think that might be an API method that would allow people to just say "hey, we have X% coverage and here's a link to more details" or "our latest build status was X and here's a link to more details" (think something similar to the GitHub Status API, except more specialized for particular types of statuses). That absolves Warehouse of any responsibility to attempt to support every service under the sun, and other people can create plugins or integrations for whatever CI or other services they have going on. That solution would probably be post-launch too, though, because it raises a larger question of what the next-gen API looks like as far as structure, authentication, etc.
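A status submission along these lines might be a small JSON payload. This is only a sketch of the idea: every field name, the `example-project` name, and the URL are invented for illustration; no such API was specified in this thread.

```python
import json

# Hypothetical payload a CI service might submit to a status API of the
# kind sketched above; all field names here are invented for illustration.
status = {
    "project": "example-project",
    "version": "1.2.0",
    "kind": "coverage",                  # e.g. "coverage" or "test-status"
    "state": "success",                  # success / failure / warning
    "description": "87% line coverage",
    "details_url": "https://example.invalid/coverage/1.2.0",
}

# Serialize for submission, then read it back as a receiving server might.
payload = json.dumps(status)
print(json.loads(payload)["description"])  # → 87% line coverage
```

The point of the `state`/`description`/`details_url` split is the same as in the GitHub Status API: Warehouse would only need to render a generic status, while the linked service holds the details.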
That sounds like a great approach. Seems like this is dependent on #994 then.
Thinking about this in terms of the discussion going on in #991, I want to break this down by the four things we currently have placeholder badges for.

Version Check

This one is sort of unlike the others in that it isn't really measuring anything about the project at all, but is rather about the version of the page you are currently looking at in your browser. I think providing that information is important, but I'm not sure that grouping it in with the other three is the right way to do that (though this is largely a UX question, so I will defer to the experts!). In that same vein, I'm also unsure if a green/red indicator is the right paradigm for this, since it's not wrong to be looking at the old data (perhaps you're using that version and you want to read the README for that version, not the latest version). In the end, I think that this information should stay and is a net benefit, helping to keep people from accidentally looking at older versions of the project because someone or something happened to link them to an older page (think of this sort of like the "development documentation" warning that projects like Django have).

Dependency "Status"

This is intended to be a sort of check on whether or not this project depends on an old version of any of its dependencies. Essentially it will be green if the latest version of a dependency is acceptable given the specifiers in the dependency information, and red if some specifier, such as an upper bound, excludes it. This check I think is more in line with the "spirit" of the rest of these badges, given that it's actually about the project in question, and not just what page the user happens to be on. However, I worry that it might end up being sort of false positive-y.

One instance that immediately jumps to mind is that you have projects like Django which offer support for many versions at once, and it's very different to be using the latest version in a supported Django release versus using a version that is no longer being supported at all. However, we don't currently have any way to determine whether the version that would be installed is considered "supported" by the project authors or not. It also strikes me that this may end up punishing projects that depend on "foo", if "foo" brownbags a release that breaks them, forcing them to release a version with a

Test Status / Test Coverage

I'm lumping these together because they are very similar. I don't see any problem with providing status badges for these things, and I think it could be beneficial, not just from a UX standpoint, but also as a way to provide more information about the test coverage and status of the releases projects make to services like requires.io (for instance, maybe they'd warn that a newer version has failing tests when they notify about a new version). In my rough outline above, the data would be entirely provided by the projects themselves, so they are responsible for ensuring its accuracy, and there's little room for false positives or negatives here. The big concern I have here is what we do if a project hasn't provided this information. Currently we display a white badge that simply states that the information is unknown. I wonder if that would be considered a negative signal, punishing projects for not having test coverage or test status available.

I think of these two statuses similarly to how I think about commit statuses on GitHub (except instead of being on commits, they are on releases of a project). Looking at that, they let the thing providing the status specify whether it should be treated as a failure/success/warning/etc., in addition to not showing anything unless something has registered a status for that project, making the entire concept opt-in for each project. I think this could be a good example to follow here: somehow hide these two badges (or however they get displayed) unless a project turns them on, and give control over how the results are interpreted to the thing setting the status.
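The dependency check described above (green if the latest release of a dependency satisfies the declared specifiers) can be sketched with the `packaging` library, which implements PEP 440 specifier matching. The badge rule itself is my paraphrase of the comment, not anything Warehouse implements:

```python
# Minimal sketch of the "Dependency Status" check: green if the newest
# release of a dependency is acceptable under the project's specifier,
# red if a specifier (e.g. an upper bound) excludes it.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

def dependency_badge(specifier: str, latest_release: str) -> str:
    """Return the hypothetical badge color for one dependency."""
    ok = Version(latest_release) in SpecifierSet(specifier)
    return "green" if ok else "red"

print(dependency_badge(">=1.0", "2.5"))       # no upper bound  → green
print(dependency_badge(">=1.0,<2.0", "2.5"))  # bound excludes  → red
```

The second case illustrates the false-positive worry from the comment: a project that adds an upper bound to protect itself from a broken dependency release would show red even though its pinning is deliberate.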
(commenting based on @nlhkabu's request in #991) I agree the "Am I looking at the latest version?" notification is qualitatively different from the others - it seems to me that's most important if you're about to click on the manual download button, and otherwise mainly of interest in understanding whether the project description is likely to be up to date. For test & coverage status, the three things I'd add to @dstufft's comment are:
For dependency status:
Re: the Becoming PyPI milestone, that's mostly just there because we need to do something with it prior to launching, most likely removing the current hardcoded "unavailable" badges.
Now that there is PEP 518 and

Also, I'm not sure what a dynamic CI / coverage status could be used for, as these statuses often point to the coverage/testing of the development branch, so they might be misleading. It seems to me that a static way to give this info at upload time should be sufficient.

Also, instead of just saying "latest", it would be nice to know when the first major release of this particular version was done. The "Uploaded by" line only tells you that 7.3.6 was released 3 days ago (I agree it's useful to know if a project is "active"), but I can see that many people would be interested to know at a glance when the 7.x branch was released.

Personal opinion: "latest" conveys the meaning of "development version" to me, and I guess where there are alpha/beta/rc releases you don't want those pages to have the green indicator. I think that "stable" might be better suited.
So I can see a few ways to implement this. One is just a plain old static value in the package metadata. Another option is to provide an entry point inside of the package.

However, there may be a more fundamental problem with the "bundle the status with the project" approach that matters here. In an ideal world we'll get the status reports from the actual artifacts that exist on PyPI (and these test results would be specific to an artifact). This ensures that no packaging mixup or some other issue caused a delta between what their VCS has tagged as some version and what was actually released as that version, causing the tests to fail. If this is something we want to support (and I think it is), then we cannot bake the coverage/test results into the package, because we cannot know what they are until after the package has been created.

Which leaves us with what I think is likely going to be the final answer: providing an API to allow people to submit this data. This has also led me to believe we should support submitting this data alongside the upload in the upload API. That allows mostly the best of both worlds: people who have their coverage figured out prior to upload can simply include it with their upload data, and people who want to run tests based on what was actually released to PyPI can submit theirs after the fact. And if anyone actually wants to continuously run tests on their old versions, they can do that too!
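The "submit alongside the upload" path could look like a few extra fields tacked onto the existing upload form data. This is a hypothetical sketch only: the `status.*` field names are invented, and no such fields exist in the real upload API (the `:action=file_upload` field is the one the legacy upload API actually uses).

```python
# Sketch of the "include status with the upload" idea. The "status.*"
# field names are hypothetical; only ":action", "name", and "version"
# resemble the real legacy upload form.
def build_upload_form(name, version, coverage=None, test_state=None):
    """Classic upload form data, optionally carrying status fields."""
    form = {":action": "file_upload", "name": name, "version": version}
    if coverage is not None:
        form["status.coverage"] = str(coverage)  # hypothetical field
    if test_state is not None:
        form["status.tests"] = test_state        # hypothetical field
    return form

# Path 1: status known before upload, included with the upload data.
form = build_upload_form("example-project", "1.2.0",
                         coverage=87, test_state="success")
print(sorted(form))

# Path 2 (after-the-fact submission against the released artifact) would
# instead be a separate POST to a hypothetical status endpoint.
```

Keeping both paths means the status always describes a concrete released artifact, which addresses the VCS-tag-versus-released-files mismatch raised above.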
Thanks for the detailed answer. I think you are right and that's reasonable.
Update on this: For the launch we will:
See #1308 for details. I am not particularly happy with the current design here, so when we do get to implementing these, I will need to rethink the layout. In the short term, I am moving this out of the launch milestone. |
@Carreau I thought you might want to know: The folks working on Warehouse have gotten funding to concentrate on improving and deploying it, and have kicked off work towards our development roadmap. The most urgent task is to improve Warehouse to the point where we can redirect pypi.python.org to pypi.org so the site is more sustainable and reliable. Since this feature isn't something that the legacy site has, I've moved it to a future milestone. Thanks for sharing your thoughts and telling us what works for you!
I see you want to integrate badges. There are badges here: http://shields.io/ |
The new design has status badges for things like "Dependencies Up to Date", "Test Coverage", and "Test Status". We'll want to figure out how we can implement these and if we can't do it immediately we'll want to figure out the best way to remove them for the time being until we can.