
Setting maxClients for parallelism in yaml #7481

Merged: 9 commits into dotnet:main, Jun 8, 2021

Conversation

@epananth (Member) commented Jun 8, 2021

@epananth requested a review from @MattGal on Jun 8, 2021 at 05:15
@MattGal (Member) left a comment:

  • I don't think we should move publishing to hosted machines until the changes are already merged, given the nature of dependency-flow automation.
  • If we do want to switch to hosted machines, we need to at least start with lower numbers.

eng/publishing/v3/publish-assets.yml
@@ -196,6 +196,10 @@ public string BuildQuality
/// </summary>
public bool UseStreamingPublishing { get; set; } = false;

public int StreamingPublishingMaxClients { get; set; }
@MattGal (Member) commented:

Perhaps a short comment describing why these values are settable is in order.

@epananth (Member, Author) replied:

Added a comment with the streaming and non-streaming values.

@MattGal (Member) commented Jun 8, 2021:

The comments should be /// <summary> style and go here.
(argh, GitHub formatting)

@epananth (Member, Author) replied:

Sure, will update it.
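
For illustration, a sketch of what the requested /// <summary>-style documentation might look like. The NonStreamingPublishingMaxClients property name and the summary wording are assumptions based on this thread, not the merged code:

/// <summary>
/// Maximum number of parallel HTTP clients used when streaming publishing
/// is enabled; settable so the value can be tuned from the publishing YAML
/// without shipping a new build.
/// </summary>
public int StreamingPublishingMaxClients { get; set; }

/// <summary>
/// Hypothetical non-streaming counterpart; the thread mentions separate
/// streaming and non-streaming values but does not show the property.
/// </summary>
public int NonStreamingPublishingMaxClients { get; set; }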

@@ -29,8 +29,7 @@ jobs:
   # See https://github.com/dotnet/core-eng/issues/13098 for context
   ${{ if eq(variables['System.TeamFoundationCollectionUri'], 'https://dev.azure.com/dnceng/') }}:
     pool:
-      name: NetCoreInternal-Pool
-      queue: BuildPool.Server.Amd64.VS2019
+      vmImage: 'windows-2019'
@MattGal (Member) commented:

Unless I'm mistaken (please correct me if so), these values as set below won't do anything until:

As such, I think we should hold off on using hosted machines until we can actually lower the numbers from the default.

@epananth (Member, Author) replied:

Yes, I changed it back to NetCoreInternal-Pool and kept the numbers at 16 and 12 for now. I am fine with changing the numbers once we hit any errors.
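
For reference, a sketch of how the reverted pool block and the tuned values could look. The parameter names carrying 16 and 12 are hypothetical; the thread gives only the values, not the names used in the merged YAML:

# Pool reverted to the internal build pool:
${{ if eq(variables['System.TeamFoundationCollectionUri'], 'https://dev.azure.com/dnceng/') }}:
  pool:
    name: NetCoreInternal-Pool
    queue: BuildPool.Server.Amd64.VS2019

# Hypothetical parameter names for the max-client counts; which value maps
# to streaming vs. non-streaming is an assumption:
parameters:
  streamingPublishingMaxClients: 16
  nonStreamingPublishingMaxClients: 12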

@MattGal (Member) left a comment:

Seems OK as-is; I'd suggest making the comments look a little different, but NBD.

@epananth added the auto-merge label (Automatically merge PR once CI passes), Jun 8, 2021
@ghost commented Jun 8, 2021:

Hello @epananth!

Because this pull request has the auto-merge label, I will be glad to help merge it once all check-in policies pass.

p.s. you can customize the way I help with merging this pull request, such as holding this pull request until a specific person approves. Simply @mention me (@msftbot) and give me an instruction to get started! Learn more here.

@ghost merged commit 3edd05c into dotnet:main, Jun 8, 2021