More robust ZipImporter cache invalidation #202
Originally reported by: jaraco (Bitbucket: jaraco, GitHub: jaraco)
In #168 and Pull Request 48, a sub-optimal workaround was implemented, but a more robust solution was proposed. Setuptools should consider implementing those suggestions.
Comments
Original comment by pje (Bitbucket: pje, GitHub: pje): (copying the notes from the old ticket)
A Way To Fix zipimport's Internal Caches
Suppose that, when we remove the entries from the zip cache, we actually set them aside for a moment, then create a new zipimporter for the file. Then we clear the old entry and update it with the new entry. Because the cache entries are shared between zipimporters, updating the old entry would automatically fix every existing zipimporter for that zipfile. Something like:
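A minimal sketch of that idea, assuming CPython's zipimport._zip_directory_cache and a placeholder path_to_zip for the replaced archive; this is an illustration of the description above, not the code from the original ticket:

```python
import zipimport

path_to_zip = '/path/to/some.egg'  # hypothetical: archive just replaced on disk, already cached

cache = zipimport._zip_directory_cache

# Set the stale entry aside and drop it from the cache, so that a new
# zipimporter re-reads the file instead of reusing the cached directory.
old_entry = cache.pop(path_to_zip)

# Creating a fresh zipimporter re-populates the cache from the new file.
zipimport.zipimporter(path_to_zip)
new_entry = cache[path_to_zip]

# Rewrite the old, shared dict in place: every existing zipimporter that
# still holds a reference to it now sees the new zip directory data.
old_entry.clear()
old_entry.update(new_entry)
cache[path_to_zip] = old_entry
```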
(But done for each path in the cache that is the "same" path as path_to_zip, not just the one entry.) This approach should fix all the use cases without needing to track down and replace individual distribution objects in every environment and working set, because of the way the cache works: the dictionary representing a given zipfile is shared between all existing zipimporter objects, so rewriting the old entry should fix the old zipimporters. It would require a somewhat more elaborate change to the uncaching code, as it would need to first pull out any matching entries for the zipfile, saving them in a list, then create a zipimporter instance for the path, and finally pull out the new entry and update the old one(s). All in all, this is probably the "right" way to do it, as it should take care of all updating-an-egg-in-place scenarios properly.
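As an illustration of that more elaborate uncaching change, here is a hedged sketch; refresh_zipimport_cache and _normalize are hypothetical names, not functions setuptools actually defines:

```python
import os
import zipimport

def _normalize(path):
    # Hypothetical notion of "same" path; case-insensitive filesystems
    # are handled here via normcase.
    return os.path.normpath(os.path.normcase(path))

def refresh_zipimport_cache(path_to_zip):
    """Hypothetical helper: rewrite every cached zip directory entry that
    refers to path_to_zip so existing zipimporters see the new contents."""
    cache = zipimport._zip_directory_cache
    normalized = _normalize(path_to_zip)

    # 1. Pull out every matching entry, keeping the shared dicts aside.
    stale = {}
    for key in list(cache):
        if _normalize(key) == normalized:
            stale[key] = cache.pop(key)

    # 2. Re-read the (already replaced) zip file; this re-caches it.
    zipimport.zipimporter(path_to_zip)
    fresh = cache[path_to_zip]

    # 3. Rewrite the old shared dicts in place and restore them, so that
    #    every pre-existing zipimporter for this archive is fixed too.
    for key, entry in stale.items():
        entry.clear()
        entry.update(fresh)
        cache[key] = entry
```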
Original comment by pje (Bitbucket: pje, GitHub: pje): By the way, since Pull Request 48 already changed the … Note, however, that this uncaching approach will only work if the uncaching happens after the new file is moved/copied into place and is in fact a zip file; otherwise, creating the zipimporter will either raise an error (if there's no file or it's a directory) or re-cache the old file (if the new one hasn't overwritten it yet). (Since the last discussion of this issue, I've already forgotten whether the uncaching code is called at the right point for this, so it needs to be rechecked. ;-) )
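To illustrate the ordering constraint, using the hypothetical refresh_zipimport_cache helper from the previous sketch (new_egg_tmp_path and path_to_zip are likewise made-up names):

```python
import shutil

# Hypothetical paths: new_egg_tmp_path is where the replacement archive was
# built, path_to_zip is the installed zipped egg being replaced.
shutil.move(new_egg_tmp_path, path_to_zip)  # 1. put the new zipfile in place
refresh_zipimport_cache(path_to_zip)        # 2. only then rewrite the cache entries
```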
Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco):
fix issue202 - update existing zipimporters when replacing a zipped egg
When replacing a zipped egg distribution with a different zipped egg, we make …
Related to the following project issues: …
Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco): Failing tests in Travis suggest this implementation fails on PyPy.
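The thread does not say why PyPy fails. Assuming the in-place rewrite of shared cache entries is what breaks there, one conservative fallback is to remove and clear the matching entries instead of refreshing them; a sketch, with all names hypothetical:

```python
import os
import sys
import zipimport

def _drop_zipimport_cache_entries(path_to_zip):
    """Hypothetical fallback: remove and clear matching cache entries
    instead of refreshing them in place with new data."""
    cache = zipimport._zip_directory_cache
    normalized = os.path.normpath(os.path.normcase(path_to_zip))
    for key in list(cache):
        if os.path.normpath(os.path.normcase(key)) == normalized:
            stale = cache.pop(key)
            # Clearing the shared dict at least stops existing zipimporters
            # from serving the old archive's data.
            stale.clear()

# Assumed strategy: on PyPy, fall back to removal rather than in-place update.
if '__pypy__' in sys.builtin_module_names:
    refresh_zipimport_cache = _drop_zipimport_cache_entries
```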
Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco):
fix issue202 - update existing zipimporters when replacing a zipped egg
When replacing a zipped egg distribution with a different zipped egg, we make …
Related to the following project issues: …
Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco): The tests are still failing in Travis.
Original comment by jurko (Bitbucket: jurko, GitHub: jurko): See pull request #65, which contains my suggested fix. Note that Bitbucket might get confused by multiple default branch heads; if that happens (I believe it can happen when you do additional commits to pypa/setuptools), you might need to merge its one commit manually. Best regards,
Original comment by jaraco (Bitbucket: jaraco, GitHub: jaraco): Excellent. Because it was in the default branch (and particularly because it was against the issue202 head), I was able to merge it manually and directly, and Bitbucket detected it. Furthermore, tests are passing. I believe this can go into the next release. Thanks for all the hard work on this one.