
generate reports for all supported architectures #629

Merged · 1 commit · Aug 5, 2023
Conversation

myk002 (Member) commented Aug 5, 2023:

Both Windows and Linux; 64-bit only.

@myk002 myk002 merged commit 01e2b41 into master Aug 5, 2023
16 of 18 checks passed
@myk002 myk002 deleted the myk_report branch August 5, 2023 01:56
Comment on lines +17 to +18:

    install(TARGETS xml-dump-type-sizes
        DESTINATION .)
Member commented:

Could we avoid install() here? This is installing the executable into the DF folder if the CMake option to build it is enabled, which isn't necessary. (It's not on by default, but I ran into it on my end.)

myk002 (Member Author) replied:

Since the actual tests happen in a different runner from the build runner, we need to upload the built binaries and other files as an artifact. These are read from the install directory for all other invocations of the build, so it keeps things clean and consistent if we install this built binary too. We could add some custom workflow steps to manually install the file instead of having cmake do it. Alternately, we could install to a directory that's more out of the way, but still in the install tree.
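For reference, the "install to a directory that's more out of the way" option could look something like this (a minimal sketch only; the `hack/tools` destination is an assumption, not the project's actual layout):

```cmake
# Sketch: keep the helper binary inside the install tree, but out of the
# top-level DF folder. The destination path here is hypothetical.
install(TARGETS xml-dump-type-sizes
        DESTINATION hack/tools)
```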

Member replied:

Would it work to upload just the xml-dump-type-sizes executable? As far as I'm aware, that's all that the native test runner needs to run in this workflow. We don't need all of DFHack.

My concern is not just that the xml-dump-type-sizes executable lands in the top-level DF folder, although that is the more noticeable annoyance - it's also that the executable is getting installed at all. It's not a large amount of disk space, but it is disk space we don't need to consume, and I have a lot of DF folders.

myk002 (Member Author) commented Aug 10, 2023:

To be clear, we do build just xml-dump-type-sizes when we run that build. This is the entire build log from https://github.com/DFHack/df-structures/actions/runs/5824369109/job/15793271743:

    Run ninja -C build install
    ninja: Entering directory `build'
    [1/5] Generating codegen.out.xml and df/headers
    [2/5] Generating dump-type-sizes.cpp
    [3/5] Building CXX object library/xml/tools/CMakeFiles/xml-dump-type-sizes.dir/dump-type-sizes.cpp.o
    [4/5] Linking CXX executable library/xml/tools/xml-dump-type-sizes
    [5/5] Install the project...
    -- Install configuration: "Release"
    -- Installing: /home/runner/work/df-structures/df-structures/build/image/./xml-dump-type-sizes
    -- Set runtime path of "/home/runner/work/df-structures/df-structures/build/image/./xml-dump-type-sizes" to "hack"

The change to make it install the binary that it builds was done so this step would work without modification (all builds install to build/image):

      - name: Upload artifact
        if: inputs.artifact-name
        uses: actions/upload-artifact@v3
        with:
          name: ${{ inputs.append-date-and-hash && steps.artifactname.outputs.name || inputs.artifact-name }}
          path: build/image/*

We could work around this by changing the above to look like this for the build-linux and build-windows (and future build-macos) reusable workflows:

      - name: Upload artifact (common)
        if: inputs.artifact-name && !inputs.xml-dump-type-sizes
        uses: actions/upload-artifact@v3
        with:
          name: ${{ inputs.append-date-and-hash && steps.artifactname.outputs.name || inputs.artifact-name }}
          path: build/image/*
      - name: Upload artifact (xml-dump-type-sizes)
        if: inputs.artifact-name && inputs.xml-dump-type-sizes
        uses: actions/upload-artifact@v3
        with:
          name: ${{ inputs.append-date-and-hash && steps.artifactname.outputs.name || inputs.artifact-name }}
          path: build/library/xml/tools/CMakeFiles/xml-dump-type-sizes.dir/*

which adds complexity, but isn't horrible. Another option is to add a build option controlling whether to install the binary, off by default but turned on by CI. That would mean adding -DINSTALL_XMLDUMP:BOOL=1 to the configure steps in build-linux and build-windows -- it can always be on, so there would be less added mess in the yaml: no additional input parameters or extra workflow steps.
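The build-option approach could be sketched roughly like this in the tool's CMakeLists.txt (a hedged sketch; the option name follows the -DINSTALL_XMLDUMP flag mentioned above, and the help text is assumed):

```cmake
# Sketch: install the binary only when CI asks for it. Off by default so
# local builds don't copy the tool into the DF folder.
option(INSTALL_XMLDUMP "Install the xml-dump-type-sizes binary (used by CI)" OFF)
if(INSTALL_XMLDUMP)
    install(TARGETS xml-dump-type-sizes
            DESTINATION .)
endif()
```

The configure steps in CI would then pass `-DINSTALL_XMLDUMP:BOOL=1`.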

Member replied:

Yeah, the reuse of the normal build+install workflow steps for the size-check pipeline does make things more complicated. I guess an INSTALL_XMLDUMP flag could work, but it just doesn't feel as clean to me.

I had the old workflow build and run xml-dump-type-sizes in a single workflow, rather than splitting up the build+run steps. I would really like to recombine them - the added time of packaging an artifact in one step and unpackaging it in the next isn't really necessary - but I'm not sure this is possible.

myk002 (Member Author) commented Aug 16, 2023:

The reusable workflows might make this particular workflow take more individual steps, but it also makes it possible to get combined type size reports for windows and linux (and has a pluggable structure where we can easily add mac as well if we want to), which I think is kind of important. Also, consolidating the build logic in a reusable workflow has given us the ability to run builds and tests in the structures and scripts repos without having to maintain N copies of the build logic. Can you really say that's not worth it?

In this case, the latency appears to be almost entirely due to the download of the docker image for the windows build of xml-dump-type-sizes (>1 minute). Uploading/downloading of artifacts is on the order of a few seconds across the entire workflow (reference workflow). If we could slim down the docker image, we would see speedups across all our workflows.

Another thing we could do here is split the linux report from the windows report, allowing the linux one to appear much sooner (<1m) and give quicker initial feedback. The windows report would show up about 1m30s after the linux report. Of course, if we could get the cross compiling docker image size down so that it didn't take so long to download, splitting the reports would be unnecessary.
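Splitting the reports might look roughly like this at the job level (a sketch only; the job names, workflow paths, and inputs here are assumptions, not the repo's actual files):

```yaml
# Sketch: independent jobs so the linux report isn't blocked on the
# windows cross-compile docker image download.
jobs:
  report-linux:
    uses: ./.github/workflows/build-linux.yml
    with:
      artifact-name: type-sizes-linux
  report-windows:
    uses: ./.github/workflows/build-windows.yml
    with:
      artifact-name: type-sizes-windows
```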

Member replied:

That's not what I'm saying... I'm only saying that the upload/download steps are not strictly necessary, and they somewhat complicated the logic for installing only what we need. My mistake on the timing - I'm used to the upload step in particular being slow for large numbers of files, but that is not the case here.

Thanks for the followup changes!
