
RUST-1712 Support User Configuration for max_connecting #923

Merged · 12 commits · Aug 18, 2023

Conversation

LuisOsta (Contributor)

Adds max_connecting as a configurable option in ClientOptions, which is then piped down to ConnectionPoolWorker, where it determines the maximum number of pending connections that can be created.
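For readers unfamiliar with the mechanism: max_connecting caps how many connections may be in the establishment (handshake) phase at once. Below is a minimal, std-only sketch of that kind of cap. It is not the driver's actual implementation; PendingLimiter and its fields are made-up names for illustration.

```rust
use std::sync::{Arc, Condvar, Mutex};
use std::thread;
use std::time::Duration;

/// Counting limiter for concurrent "pending" (in-establishment) connections.
/// Illustrative only; the driver's real pool worker is structured differently.
struct PendingLimiter {
    max_connecting: u32,
    pending: Mutex<u32>, // connections currently being established
    cv: Condvar,
}

impl PendingLimiter {
    fn new(max_connecting: u32) -> Self {
        PendingLimiter { max_connecting, pending: Mutex::new(0), cv: Condvar::new() }
    }

    /// Block until a pending slot is free, then claim it.
    fn acquire(&self) {
        let mut pending = self.pending.lock().unwrap();
        while *pending >= self.max_connecting {
            pending = self.cv.wait(pending).unwrap();
        }
        *pending += 1;
    }

    /// Give the slot back and wake one waiter.
    fn release(&self) {
        let mut pending = self.pending.lock().unwrap();
        *pending -= 1;
        self.cv.notify_one();
    }
}

fn main() {
    let limiter = Arc::new(PendingLimiter::new(2));
    let peak = Arc::new(Mutex::new(0u32));
    let current = Arc::new(Mutex::new(0u32));

    let handles: Vec<_> = (0..8)
        .map(|_| {
            let (l, p, c) = (limiter.clone(), peak.clone(), current.clone());
            thread::spawn(move || {
                l.acquire();
                {
                    let mut cur = c.lock().unwrap();
                    *cur += 1;
                    let mut pk = p.lock().unwrap();
                    *pk = (*pk).max(*cur);
                }
                thread::sleep(Duration::from_millis(10)); // simulate a handshake
                *c.lock().unwrap() -= 1;
                l.release();
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    // With max_connecting = 2, no more than 2 handshakes ever overlap.
    let peak = *peak.lock().unwrap();
    assert!(peak <= 2, "max_connecting exceeded: {}", peak);
    println!("peak concurrent establishments: {}", peak);
}
```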

@LuisOsta (Contributor Author) commented Jul 27, 2023

This aims to help with #913 (comment), though I would love feedback on the comments I left and on whether I piped the variable through correctly. One specific thing I wanted to call out: in from_connection_string I set it to None, since max_connecting doesn't appear to be supported by the standard URI format, based on the documentation.

@abr-egn (Contributor) commented Jul 27, 2023

This looks exactly right to me, thank you! I've authorized a CI run and if there are no unexpected failures (there are a few expected ones, test server backend issues) I'll merge this in.

@LuisOsta (Contributor Author)

Thanks! Do you know of a better comment/documentation to describe max_connecting? I wasn't sure that was the clearest way to describe it, but a better one didn't come to mind.

@abr-egn (Contributor) commented Jul 28, 2023

> Do you know of a better comment/documentation to describe max_connecting? I wasn't sure if that was the clearest way to describe it but a better one didn't come top of mind

Hmm, not really. @isabelatkinson, do you have thoughts here?

@abr-egn abr-egn requested a review from isabelatkinson July 28, 2023 14:24
@isabelatkinson (Contributor) left a comment

Code changes look mostly good! There are a few JSON tests for this option we'll need to sync as well. The files are:

Once those are synced they should hopefully be able to run without any extra test runner code changes, but let us know if you run into anything. I can also just push these up to your branch if you'd prefer!

LuisOsta and others added 3 commits July 31, 2023 10:24
@LuisOsta LuisOsta requested a review from isabelatkinson July 31, 2023 16:16
@LuisOsta (Contributor Author)

@isabelatkinson I think I've addressed all of the changes you mentioned; let me know if that's not the case.

@LuisOsta (Contributor Author) commented Aug 2, 2023

@isabelatkinson Do you have a sense as to when you'd be able to take a look at this PR again?

@isabelatkinson (Contributor)

Hi @LuisOsta, I am taking a look now and will get back to you today.

@isabelatkinson (Contributor) left a comment

I authorized a CI run for the most recent commit and it looks like the run_uri_options_spec_tests test is failing with the following error:

 [2023/08/02 19:14:33.200] thread 'client::options::test::run_uri_options_spec_tests' panicked at 'Valid connection pool options are parsed correctly: Error { kind: InvalidArgument { message: "maxconnecting is an invalid option. An option with a similar name exists: maxconnecting" }, labels: {}, wire_version: None, source: None }', src/client/options/test.rs:129:22

I believe adding "maxconnecting" and corresponding parsing logic to this match statement will resolve this issue.
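For illustration, here is a standalone sketch of what a lowercased-key match arm for "maxconnecting" might look like. parse_pool_option is a hypothetical helper, not the driver's actual parsing code, and the error wording here is invented.

```rust
/// Hypothetical sketch of URI-option parsing: keys are matched in
/// lowercased form, and the value string is converted to the option's
/// numeric type. Not the driver's real implementation.
fn parse_pool_option(key: &str, value: &str) -> Result<Option<u32>, String> {
    match key.to_lowercase().as_str() {
        "maxconnecting" => value
            .parse::<u32>()
            .map(Some)
            .map_err(|_| format!("invalid maxConnecting value: {}", value)),
        other => Err(format!("{} is an invalid option", other)),
    }
}

fn main() {
    // Keys are case-insensitive in the URI format, hence the lowercasing.
    assert_eq!(parse_pool_option("maxConnecting", "5"), Ok(Some(5)));
    assert!(parse_pool_option("maxConnecting", "abc").is_err());
    assert!(parse_pool_option("bogus", "1").is_err());
    println!("all parse checks passed");
}
```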

Thank you for syncing the connection-pool-options.json test! It looks like the other test wasn't added; maybe it's currently an untracked file on your branch?

@LuisOsta (Contributor Author) commented Aug 2, 2023

@isabelatkinson Thank you for the quick review and the comments! Yes, I forgot to include the older test; I've added it now, and hopefully it provides the needed coverage.

@LuisOsta LuisOsta requested a review from isabelatkinson August 2, 2023 19:53
@isabelatkinson (Contributor)

Hi @LuisOsta, I authorized another CI run and I'm seeing the following failures:

[2023/08/03 16:29:00.583] FAILURE: failed client::options::test::run_uri_options_spec_tests (cargo test)
[2023/08/03 16:29:00.583] thread 'client::options::test::run_uri_options_spec_tests' panicked at 'assertion failed: `(left == right)`: Valid connection pool options are parsed correctly
[2023/08/03 16:29:00.583] Diff < left / right > :
[2023/08/03 16:29:00.583]  Document({
[2023/08/03 16:29:00.583]      "maxidletimems": Int32(
[2023/08/03 16:29:00.583]          50000,
[2023/08/03 16:29:00.583]      ),
[2023/08/03 16:29:00.583]      "maxpoolsize": Int64(
[2023/08/03 16:29:00.583]          5,
[2023/08/03 16:29:00.583]      ),
[2023/08/03 16:29:00.583]      "minpoolsize": Int64(
[2023/08/03 16:29:00.583]          3,
[2023/08/03 16:29:00.583]      ),
[2023/08/03 16:29:00.583] >    "maxconnecting": Int32(
[2023/08/03 16:29:00.583] >        5,
[2023/08/03 16:29:00.583] >    ),
[2023/08/03 16:29:00.583]  })

and

[2023/08/03 16:29:00.633] Executing maxConnecting is enforced
[2023/08/03 16:29:00.633] thread 'cmap::test::cmap_spec_tests' panicked at '[maxConnecting is enforced] actual
[2023/08/03 16:29:00.633] ConnectionCreated(
[2023/08/03 16:29:00.633]     ConnectionCreatedEvent {
[2023/08/03 16:29:00.633]         address: Tcp {
[2023/08/03 16:29:00.633]             host: "localhost",
[2023/08/03 16:29:00.633]             port: Some(
[2023/08/03 16:29:00.633]                 27017,
[2023/08/03 16:29:00.633]             ),
[2023/08/03 16:29:00.633]         },
[2023/08/03 16:29:00.633]         connection_id: 3,
[2023/08/03 16:29:00.633]     },
[2023/08/03 16:29:00.633] )
[2023/08/03 16:29:00.633]  did not MATCH expected
[2023/08/03 16:29:00.633] ConnectionReady(
[2023/08/03 16:29:00.633]     ConnectionReadyEvent {
[2023/08/03 16:29:00.633]         address: Tcp {
[2023/08/03 16:29:00.633]             host: "",
[2023/08/03 16:29:00.633]             port: None,
[2023/08/03 16:29:00.633]         },
[2023/08/03 16:29:00.633]         connection_id: 1,
[2023/08/03 16:29:00.633]     },
[2023/08/03 16:29:00.633] )
[2023/08/03 16:29:00.633]  MATCH failure: expected event ConnectionCreated(ConnectionCreatedEvent { address: Tcp { host: "localhost", port: Some(27017) }, connection_id: 3 }), got ConnectionReady(ConnectionReadyEvent { address: Tcp { host: "", port: None }, connection_id: 1 })', src/test/util/matchable.rs:273:5

If you set the MONGODB_URI environment variable to your connection string and run cargo test you should be able to reproduce these failures locally. I am also happy to help you with debugging/push fixes if you'd like -- let me know!

@LuisOsta (Contributor Author) commented Aug 7, 2023

@isabelatkinson Thanks for the info! I'll see if I can debug it locally and push a fix.

@LuisOsta (Contributor Author) commented Aug 8, 2023

@isabelatkinson @abr-egn Why are there duplicate versions of each test in JSON and YAML? I haven't been able to figure that out. I'll put some work into getting the tests to pass, but if y'all have the bandwidth, I would appreciate any guidance or a patch to fix the tests.

Also, if y'all know how to run a specific test, let me know.

@isabelatkinson (Contributor)

@LuisOsta You can ignore the YAML files. We sync these tests from the MongoDB drivers-wide specifications repo which has the tests in both JSON and YAML format, but they're identical and we only parse/run the JSON.

We are currently focused on other high-priority work but we do have this ticket planned for this quarter. We should be able to get to it in the coming month if you're not able to finish it up!

> Also if y'all know how to run a specific test let me know

You can pass the name of the test you want to run to cargo test to run it individually, e.g.

cargo test run_uri_options_spec_tests

@LuisOsta (Contributor Author) commented Aug 9, 2023

@isabelatkinson @abr-egn I think I've fixed the tests and resolved the bugs now! I've run the test suite locally and they're all passing.

@abr-egn abr-egn self-requested a review August 10, 2023 14:49
@LuisOsta (Contributor Author) commented Aug 11, 2023

@abr-egn @isabelatkinson Interesting: there seem to be linting issues, but I can't run the linter .evergreen/check-all.sh, as I seem to be missing a file:

./.evergreen/env.sh: line 10: /.cargo/env: No such file or directory

Any guidance on how to get that file, or on what the linting issues are, would be great. Also, when I run cargo fmt a myriad of files get changed, so I assume it's incorrectly configured for this repo.

@LuisOsta (Contributor Author)

@isabelatkinson @abr-egn Can y'all let me know what failed in this EVG run, and whether there are problems? I can't see the actual log lines and would love to get this PR finished.

@abr-egn (Contributor) commented Aug 15, 2023

The scripts in .evergreen don't work in a local client, unfortunately. You can run them locally by using the same invocation as in the script directly, i.e.

rustfmt +nightly --unstable-features --check src/**/*.rs
rustfmt +nightly --unstable-features --check src/*.rs

It looks like the lint failure is just an import order diff:

[2023/08/10 16:28:39.736] Diff in /data/mci/146ceff7535bc455b9b32f7ac7a0b512/src/src/cmap/worker.rs at line 4:
[2023/08/10 16:28:39.736]      conn::PendingConnection,
[2023/08/10 16:28:39.736]      connection_requester,
[2023/08/10 16:28:39.736]      connection_requester::{
[2023/08/10 16:28:39.736] -        ConnectionRequest, ConnectionRequestReceiver, ConnectionRequestResult, ConnectionRequester,
[2023/08/10 16:28:39.736] +        ConnectionRequest,
[2023/08/10 16:28:39.736] +        ConnectionRequestReceiver,
[2023/08/10 16:28:39.736] +        ConnectionRequestResult,
[2023/08/10 16:28:39.736] +        ConnectionRequester,
[2023/08/10 16:28:39.736]      },
[2023/08/10 16:28:39.736]      establish::ConnectionEstablisher,
[2023/08/10 16:28:39.736]      manager,
[2023/08/10 16:28:39.736] Diff in /data/mci/146ceff7535bc455b9b32f7ac7a0b512/src/src/cmap/worker.rs at line 12:
[2023/08/10 16:28:39.736]      options::ConnectionPoolOptions,
[2023/08/10 16:28:39.736]      status,
[2023/08/10 16:28:39.736]      status::{PoolGenerationPublisher, PoolGenerationSubscriber},
[2023/08/10 16:28:39.736] -    Connection, DEFAULT_MAX_POOL_SIZE,
[2023/08/10 16:28:39.736] +    Connection,
[2023/08/10 16:28:39.736] +    DEFAULT_MAX_POOL_SIZE,
[2023/08/10 16:28:39.736]  };
[2023/08/10 16:28:39.736]  use crate::{
[2023/08/10 16:28:39.736]      bson::oid::ObjectId,
[2023/08/10 16:28:39.736] Diff in /data/mci/146ceff7535bc455b9b32f7ac7a0b512/src/src/cmap/worker.rs at line 19:
[2023/08/10 16:28:39.736]      client::auth::Credential,
[2023/08/10 16:28:39.736]      error::{load_balanced_mode_mismatch, Error, ErrorKind, Result},
[2023/08/10 16:28:39.736]      event::cmap::{
[2023/08/10 16:28:39.736] -        CmapEventEmitter, ConnectionClosedEvent, ConnectionClosedReason, PoolClearedEvent,
[2023/08/10 16:28:39.736] -        PoolClosedEvent, PoolReadyEvent,
[2023/08/10 16:28:39.736] +        CmapEventEmitter,
[2023/08/10 16:28:39.736] +        ConnectionClosedEvent,
[2023/08/10 16:28:39.736] +        ConnectionClosedReason,
[2023/08/10 16:28:39.736] +        PoolClearedEvent,
[2023/08/10 16:28:39.736] +        PoolClosedEvent,
[2023/08/10 16:28:39.736] +        PoolReadyEvent,
[2023/08/10 16:28:39.736]      },
[2023/08/10 16:28:39.736]      options::ServerAddress,
[2023/08/10 16:28:39.736]      runtime::{self, WorkerHandleListener},
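The vertical, one-item-per-line import layout that rustfmt is enforcing in this diff is characteristic of nightly-only rustfmt options set in a repository rustfmt.toml, which is also why the invocation above needs `+nightly --unstable-features`. A sketch of settings that would produce this style follows; the exact contents of the driver's config file are an assumption here, not taken from the repo.

```toml
# Hypothetical rustfmt.toml sketch (unstable options, nightly-only).
imports_layout = "Vertical"    # one imported item per line, as in the diff
imports_granularity = "Crate"  # merge imports from the same crate into one `use`
```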

The rest of the failures are in a CMAP spec test:

[2023/08/10 16:52:09.153] thread 'cmap::test::cmap_spec_tests' panicked at '[custom maxConnecting is enforced] actual
[2023/08/10 16:52:09.153] ConnectionReady(
[2023/08/10 16:52:09.153]     ConnectionReadyEvent {
[2023/08/10 16:52:09.153]         address: Tcp {
[2023/08/10 16:52:09.153]             host: "localhost",
[2023/08/10 16:52:09.153]             port: Some(
[2023/08/10 16:52:09.153]                 27017,
[2023/08/10 16:52:09.153]             ),
[2023/08/10 16:52:09.153]         },
[2023/08/10 16:52:09.153]         connection_id: 3,
[2023/08/10 16:52:09.153]     },
[2023/08/10 16:52:09.153] )
[2023/08/10 16:52:09.153]  did not MATCH expected
[2023/08/10 16:52:09.153] ConnectionReady(
[2023/08/10 16:52:09.153]     ConnectionReadyEvent {
[2023/08/10 16:52:09.153]         address: Tcp {
[2023/08/10 16:52:09.153]             host: "",
[2023/08/10 16:52:09.153]             port: None,
[2023/08/10 16:52:09.153]         },
[2023/08/10 16:52:09.153]         connection_id: 1,
[2023/08/10 16:52:09.153]     },
[2023/08/10 16:52:09.153] )
[2023/08/10 16:52:09.153]  MATCH failure: expected u32 1, got 3', src/test/util/matchable.rs:273:5

Let me know if you can reproduce that - if not I'll take a look.

@LuisOsta (Contributor Author)

@abr-egn Okay, I think I've fixed all of the issues. The issue with the CMAP test was due to a race condition that made it only sometimes pass locally. Thanks for the help!

@abr-egn (Contributor) commented Aug 16, 2023

Can you give more details about the race condition? The tests are run across all drivers, so it's rare that something like that is actually a bug in the test.

@LuisOsta (Contributor Author)

@abr-egn Oh, that was fixed and completely on me: it was with how I designed the custom max enforcement test, essentially a misuse of waitForEvent with regard to ConnectionCreated and ConnectionReady. It should all be passing now, I think.
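The general pitfall behind failures like this is worth spelling out: connection establishment runs concurrently, so Ready events can arrive in any order relative to the ids assigned at creation time. A small std-only sketch (illustrative only, not driver or test-runner code) of why asserting a fixed arrival order races, while asserting on membership does not:

```rust
use std::sync::{Arc, Mutex};
use std::thread;
use std::time::Duration;

/// Spawn three simulated "connections"; each becomes ready after a variable
/// delay. Returns the connection ids in the order their Ready events were
/// observed. Purely illustrative.
fn establish_concurrently() -> Vec<u32> {
    let ready_order: Arc<Mutex<Vec<u32>>> = Arc::new(Mutex::new(Vec::new()));
    let handles: Vec<_> = (1..=3u32)
        .map(|id| {
            let log = ready_order.clone();
            thread::spawn(move || {
                // Lower ids get longer "handshakes", so ids created first
                // tend to become ready last.
                thread::sleep(Duration::from_millis((4 - id as u64) * 10));
                log.lock().unwrap().push(id);
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    Arc::try_unwrap(ready_order).unwrap().into_inner().unwrap()
}

fn main() {
    let order = establish_concurrently();
    // Fragile: asserting order == [1, 2, 3] would depend on thread timing.
    // Robust: assert on which events occurred, not on their arrival order.
    let mut sorted = order.clone();
    sorted.sort();
    assert_eq!(sorted, vec![1, 2, 3]);
    println!("observed ready order: {:?}", order);
}
```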

@LuisOsta (Contributor Author)

It seems some linting issues remained, though. For me, running:

rustfmt +nightly --unstable-features --check src/**/*.rs

and

rustfmt +nightly --unstable-features --check src/*.rs

locally both worked, so I'm unsure what remains.

@abr-egn (Contributor) commented Aug 16, 2023

Sorry! I should have been more specific. The contents of pool-checkout-custom-maxConnecting-is-enforced.json have diverged quite a bit from the file in the spec repo. We need to stay in sync with the specs; if there's genuinely a bug in the test we'll need to update that so other drivers can get the fix. Does it not pass with the original contents?

@LuisOsta (Contributor Author)

@abr-egn Ah, I think I misunderstood the testing situation; I've made it align with the spec tests!

@abr-egn (Contributor) left a comment

LGTM! The only remaining Evergreen failures are known issues unrelated to this PR.

@LuisOsta (Contributor Author)

@abr-egn Thanks! Once this gets merged, when should I expect this (and #932) to be included in a release? (Also, when's the next release?)

@isabelatkinson (Contributor) left a comment

LGTM, thank you for your contribution! We are planning a 2.7.0-beta release within the next week or so, which will include both this PR and #932.

@LuisOsta (Contributor Author)

@isabelatkinson Awesome, thanks! And happy to contribute. That's great to hear about the beta release. When would the stable release be cut? I'd love to plan for that release accordingly.

@abr-egn abr-egn merged commit c5b3f83 into mongodb:main Aug 18, 2023