Configure dhstore key sharding #1759

Closed
gammazero wants to merge 13 commits into main from dh-key-sharding

Conversation

gammazero
Collaborator

Enables DH key sharding in core.

@codecov-commenter

codecov-commenter commented May 18, 2023

Codecov Report

Patch coverage: 9.37% and project coverage change: +0.04% 🎉

Comparison is base (17fea18) 51.25% compared to head (4689c1c) 51.30%.


Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1759      +/-   ##
==========================================
+ Coverage   51.25%   51.30%   +0.04%     
==========================================
  Files         101      101              
  Lines       11253    11263      +10     
==========================================
+ Hits         5768     5778      +10     
- Misses       4901     4903       +2     
+ Partials      584      582       -2     
Impacted Files                          Coverage Δ
command/daemon.go                       0.00% <0.00%> (ø)
config/indexer.go                       39.06% <ø> (ø)
internal/ingest/linksystem.go           67.40% <100.00%> (-0.17%) ⬇️
server/find/handler/cached_stats.go     89.58% <100.00%> (ø)

... and 1 file with indirect coverage changes

☔ View full report in Codecov by Sentry.

masih added a commit that referenced this pull request May 18, 2023
Make a deployable container from `dh-key-sharding` branch for testing
the sharding mechanism in dev.

See: #1759
@masih masih force-pushed the dh-key-sharding branch from 69ea0df to 9c524d2 on May 18, 2023 at 14:37
masih added a commit that referenced this pull request May 18, 2023
Deploy a new node called `dana` in `dev` with the head of #1759 to test
out sharding mechanism across dhstore cluster.

The indexer node is configured to read ads from the S3 bucket populated
by `ago`.
gammazero and others added 11 commits May 20, 2023 02:27
Node group is used by indexer nodes and has a max of 3
* Change ExtendedProviders behaviour to 'replace' instead of 'add'

Removed the ExtendedProviders additive behaviour. Each ExtendedProviders advertisement must now carry the full set of ExtendedProviders (no deltas). Additive behaviour makes updates and removals harder and is generally counterintuitive.
* Bump up dhstore CPU cores to 7
* Reduce ago ingest worker count to 5
* Temporarily disable ago writes to let dhstore do compaction
@gammazero
Collaborator Author

Something got messed up in the last commit. Replacing this PR with #1778

@gammazero gammazero closed this May 20, 2023
@gammazero gammazero deleted the dh-key-sharding branch May 20, 2023 09:41
4 participants