channeldb+routing+gossiper: add local updates to graph immediately #4964

Merged
4 commits merged into lightningnetwork:master from halseth:gossip-local-no-batch on Feb 11, 2021

Conversation

@halseth (Contributor) commented Jan 27, 2021

Since the batch interval can potentially be long, adding local updates
to the graph could be slow. This would slow down operations like adding
our own channel update and announcements during the funding process, and
updating edge policies for local channels.

Now we instead check whether the update is remote or not, and use the
SchedulerOption to immediately add local updates to the graph.

Companion PR to #4958
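
For context, here is a minimal sketch of the scheduler-option pattern the description refers to. Only the lazy flag and the SchedulerOption comment appear in the diff fragments later in this thread; the Request shape and the LazyAdd helper are assumptions for illustration, not the PR's exact code.

package batch

// Request represents a database update scheduled against the graph. Only
// the batching flag is sketched here; the real request would also carry
// the update closure itself.
type Request struct {
	// lazy indicates the request may wait for the next batch interval
	// instead of forcing an immediate database transaction.
	lazy bool
}

// SchedulerOption is a type that can supply options to a scheduled
// request.
type SchedulerOption func(r *Request)

// LazyAdd marks a request as batchable. Remote gossip updates pass this
// option, while local updates omit it and are committed immediately
// instead of waiting out the batch interval.
func LazyAdd() SchedulerOption {
	return func(r *Request) {
		r.lazy = true
	}
}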

@halseth requested a review from cfromknecht January 27, 2021 12:52
@halseth added this to the 0.13.0 milestone Jan 27, 2021
@halseth changed the title from "hanneldb+routing+gossiper: add local updates to graph immediately" to "channeldb+routing+gossiper: add local updates to graph immediately" Jan 27, 2021
@Roasbeef added the P1 (MUST be fixed or reviewed), bug fix, database (Related to the database/storage of LND), and graph labels Jan 28, 2021
batch/interface.go: review thread (outdated, resolved)
@halseth force-pushed the gossip-local-no-batch branch 2 times, most recently from fedac28 to 1b182bf on February 1, 2021 12:52
@cfromknecht (Contributor) left a comment

LGTM 🎉

@Crypt-iQ (Collaborator) left a comment

Did not test, but the changes seem straightforward. One nit:

lazy bool
}

// SchedulerOption is a type that can be can supply options to a scheduled
@Crypt-iQ (Collaborator) commented:

can be can -> can
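
With the nit applied, the declaration would read as follows; the func(r *Request) signature and the continuation of the comment are assumptions carried over from the sketch above, since only the first comment line appears in the diff.

// SchedulerOption is a type that can supply options to a scheduled
// request.
type SchedulerOption func(r *Request)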

@Roasbeef (Member) left a comment

LGTM 🤽‍♂️

@Roasbeef (Member) commented Feb 4, 2021

There appears to be an itest failure related to private channels.

@halseth force-pushed the gossip-local-no-batch branch 2 times, most recently from f3d8d99 to 23f7958 on February 4, 2021 09:34
@halseth (Contributor, Author) commented Feb 4, 2021

The flakes seem to be the same as those seen in master; it's not clear whether this PR makes them worse.

I think we should wait until we have #4953 or similar merged.

@halseth force-pushed the gossip-local-no-batch branch from 23f7958 to 7993d19 on February 4, 2021 09:40
@halseth added the blocked label Feb 4, 2021
@Roasbeef (Member) commented Feb 9, 2021

> The flakes seem to be the same as those seen in master; it's not clear whether this PR makes them worse.

@halseth are they? I'm seeing blanket failures across all the itests for this PR.

@halseth force-pushed the gossip-local-no-batch branch from 7993d19 to cea1b1c on February 9, 2021 10:24
@halseth (Contributor, Author) commented Feb 9, 2021

> > The flakes seem to be the same as those seen in master; it's not clear whether this PR makes them worse.
>
> @halseth are they? I'm seeing blanket failures across all the itests for this PR.

Yeah, looks like this indeed makes it worse. I still think the root cause is fixed by #4953, though; probably some timing interaction in this PR makes it more likely to flake.

@cfromknecht (Contributor) commented

@halseth @Roasbeef let's merge #5003 and rebase over that to see if those flakes go away. If so, I think this is ready to land.

@halseth force-pushed the gossip-local-no-batch branch from cea1b1c to 5a3e195 on February 10, 2021 20:19
@halseth (Contributor, Author) commented Feb 10, 2021

Rebased on #5003 to check Travis.

We make the default non-lazy, and will make the incoming gossip requests
lazy.
Since the batch interval can potentially be long, adding local updates
to the graph could be slow. This would slow down operations like adding
our own channel update and announcements during the funding process, and
updating edge policies for local channels.

Now we instead check whether the update is remote or not, and only for
remote updates use the SchedulerOption to lazily add them to the graph.
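
To make the lazy/non-lazy split described in these commit messages concrete, here is a small, self-contained sketch of a scheduler loop that runs non-lazy requests immediately and flushes lazy ones once per batch interval. It illustrates the idea only; it is not lnd's actual batch scheduler, and all names in it are hypothetical.

package main

import (
	"fmt"
	"time"
)

type request struct {
	lazy bool
	run  func()
}

// schedule runs non-lazy requests as soon as they arrive and coalesces
// lazy ones, flushing the pending batch once per interval.
func schedule(reqs <-chan request, interval time.Duration) {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()

	var pending []request
	for {
		select {
		case r := <-reqs:
			if !r.lazy {
				// Local update: commit immediately.
				r.run()
				continue
			}
			// Remote update: defer to the next batch.
			pending = append(pending, r)

		case <-ticker.C:
			for _, r := range pending {
				r.run()
			}
			pending = nil
		}
	}
}

func main() {
	reqs := make(chan request)
	go schedule(reqs, 500*time.Millisecond)

	reqs <- request{run: func() { fmt.Println("local update: applied immediately") }}
	reqs <- request{lazy: true, run: func() { fmt.Println("remote update: applied at batch tick") }}

	time.Sleep(time.Second)
}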
@halseth force-pushed the gossip-local-no-batch branch from 5a3e195 to c9afc93 on February 10, 2021 22:54
@halseth (Contributor, Author) commented Feb 10, 2021

Rebased on master.

@cfromknecht (Contributor) left a comment

LGTMv2

@Roasbeef (Member) left a comment

LGTM 🧃

// If this is a remote update, we set the scheduler option to lazily
// add it to the graph.
var schedulerOp []batch.SchedulerOption
if nMsg.isRemote {
	schedulerOp = append(schedulerOp, batch.LazyAdd())
}
@Roasbeef (Member) commented:

👍
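
A note on the pattern above: because schedulerOp stays a nil slice for local updates, it can be expanded variadically in the downstream call unconditionally; a nil slice supplies zero options, so local updates fall through to the default immediate path. A toy demonstration of that Go behavior (all names hypothetical):

package main

import "fmt"

// option stands in for batch.SchedulerOption in this toy example.
type option func()

// apply reports how many options were supplied.
func apply(opts ...option) int { return len(opts) }

func main() {
	var opts []option // nil for local updates
	fmt.Println(apply(opts...)) // 0: default, immediate path

	opts = append(opts, func() {}) // remote: add the lazy option
	fmt.Println(apply(opts...)) // 1: lazy path
}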

@Roasbeef merged commit 355b6a2 into lightningnetwork:master Feb 11, 2021
@cfromknecht mentioned this pull request Feb 11, 2021
Labels
bug fix, database (Related to the database/storage of LND), graph, P1 (MUST be fixed or reviewed), v0.13
Projects
None yet
4 participants