
Avoid refetching dependencies when updating code in docker image #217

Closed
Res260 opened this issue Apr 17, 2020 · 7 comments · Fixed by #219
Res260 commented Apr 17, 2020

Currently, when changing code in pyrdp and rebuilding the image, the Dockerfile fetches the dependencies again, because the whole pyrdp folder is copied before the dependencies are installed: any code change invalidates Docker's layer cache from the COPY step onward, increasing build time considerably.

There should be a way to fetch the dependencies before copying the pyrdp code into the Docker image.
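
A minimal sketch of the usual layer-ordering pattern, assuming a hypothetical `requirements.txt` that lists the pip dependencies (PyRDP may declare them in `setup.py` instead, in which case they would first need to be split out into such a manifest):

```dockerfile
# Copy only the dependency manifest first: this layer is cached and
# only invalidated when the dependency list itself changes.
COPY requirements.txt /pyrdp/
RUN pip3 install --no-cache-dir -r /pyrdp/requirements.txt

# Copy the rest of the code afterwards; editing pyrdp source now only
# rebuilds the layers from this point on.
COPY . /pyrdp/
RUN pip3 install --no-cache-dir --no-deps /pyrdp
```

With this ordering, a code-only change reuses the cached dependency layer and re-runs only the final install step.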

obilodeau commented

Which dependencies? The OS or pip?

Res260 commented Apr 17, 2020

Yeah, it was not clear: I'm talking about pip dependencies.

alxbl commented Apr 17, 2020

How sure are we that the GitHub CI will keep the Docker images in cache? This might be pointless if GitHub doesn't keep the images around.

In any case, if it's possible, it would save us a lot of CI minutes and really speed up the PR check process.

alxbl added the enhancement label on Apr 17, 2020
Res260 commented Apr 17, 2020

I'm not sure about Docker caching for GitHub workflows...

I opened the PR because I was fixing a bug that was Docker-dependent, and to debug it I needed to change the codebase a few times. Every time I did, the build pulled all the dependencies again, filling up my storage space because each rebuild created a new (big) cache layer. It's mostly for this use case that it's problematic, I think.

alxbl commented Apr 17, 2020

I mean, it would be clean to have the dependencies as a separate layer, I agree. If it boosts CI times at the same time, that would be nice.

obilodeau commented

I can try to address this. However, I recall having to explicitly copy the whole source tree for the pip install step.

Also note that no matter what changes in the code, the runtime image will be rebuilt, so even if we can skip some of the pip install time, the runtime apt-get will still take place. I was aware of this when I transitioned to the multi-stage build. The tradeoff is space savings (for all of our users) vs. dev wait time (devs, CI clock time). The solution I've seen for that is a traditional fat, full-build Dockerfile for development: a Dockerfile.dev that doesn't use multi-stage builds, but that developers use for testing.
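
A minimal sketch of such a single-stage Dockerfile.dev, with an illustrative base image and package names rather than PyRDP's actual build dependencies (`requirements.txt` is the same hypothetical manifest as above):

```dockerfile
# Dockerfile.dev: single-stage "fat" build for development only.
# Everything stays in one image, so every layer remains in the local
# cache and a code change only rebuilds the final COPY/install layers.
FROM ubuntu:20.04

# Build-time and runtime packages installed together (illustrative list).
RUN apt-get update && \
    DEBIAN_FRONTEND=noninteractive apt-get install -y \
        python3 python3-pip build-essential && \
    rm -rf /var/lib/apt/lists/*

WORKDIR /pyrdp

# Dependencies first, code last, to maximize cache reuse.
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt

COPY . .
RUN pip3 install --no-cache-dir --no-deps .
```

Unlike the multi-stage production build, nothing is discarded between stages, so the image is larger, but rebuilds after a code change stay fast.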

obilodeau commented

I have to admit that building that image takes quite a long time now, with pyside2, pyav, and dbus-python all being compiled for an unrelated code change.

I should have something that improves things a little shortly.
