Add some minimal support for increasing the soft limit on open file handles, default to at least 4096 #4893

Merged
4 commits merged on Oct 21, 2019

Conversation

jasongrout (Member)

This hopefully fixes jupyterlab/jupyterlab#6727

@jasongrout (Member Author)

WIP - let's test this with some of the people that have been having problems on jupyterlab/jupyterlab#6727 before merging.

@kevin-bates (Member) left a comment

This is great - thank you.

I believe this is an excellent example of a traitlet that would be useful for dynamic configuration - where the server auto-detects config changes and reloads. In this case, we'd probably want the code that performs the update in an @observe decorator - but that's future stuff (see ipython/traitlets#521 and ipython/traitlets#522). This would allow admins encountering 'Too Many Open Files' to at least extend the server's life while addressing the (probable) underlying issue.
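As an illustration of that idea (not code from this PR; the class and handler names below are hypothetical), an @observe handler could re-apply the limit whenever the trait changes at runtime:

import resource  # Unix-only; the resource module is unavailable on Windows

from traitlets import Integer, observe
from traitlets.config import Configurable

class OpenFilesConfig(Configurable):
    min_open_files_limit = Integer(4096, config=True,
        help="Lower bound to apply to the soft limit on open file handles.")

    @observe('min_open_files_limit')
    def _min_open_files_limit_changed(self, change):
        # Re-apply the limit whenever the trait changes at runtime.
        soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        new_soft = change['new']
        if soft < new_soft:
            # Only raise limits; never lower an existing one.
            resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, max(hard, new_soft)))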

self.log.debug(
    'Raised open file limit: soft {}->{}; hard {}->{}'.format(old_soft, soft, old_hard, hard)
)

Member

This strikes me as more of an informational message than debug. I would also suggest taking an optimistic approach and logging 'Raising open file...' prior to the actual call. This way, the log will contain the values that were presumably bad and could prove helpful in troubleshooting.

jasongrout (Member Author)

Very good point!

jasongrout (Member Author)

I went ahead and left it a debug message, but moved it before the action. My thinking is that for most people, this should never be a concern, so don't clutter up their terminal. However, if you are trying to track down an issue, likely you are running with --debug.

@jasongrout (Member Author)

> I believe this is an excellent example of a traitlet that would be useful for dynamic configuration - where the server auto-detects config changes and reloads. In this case, we'd probably want the code that performs the update in an @observe decorator - but that's future stuff (see ipython/traitlets#521 and ipython/traitlets#522). This would allow admins encountering 'Too Many Open Files' to at least extend the server's life while addressing the (probable) underlying issue.

Good point too!

@jasongrout (Member Author)

Feedback at jupyterlab/jupyterlab#6727 (comment) was that this fixed the issue we were seeing. I'll clean things up like you mention, @kevin-bates.

This makes it clearer in the logs what is happening if there is an error.
@jasongrout (Member Author)

@kevin-bates, @lresende, and whoever else - I think this is ready for review.

@kevin-bates (Member) left a comment

LGTM

@Zsailer (Member)

Zsailer commented Sep 30, 2019

LGTM, too.

@Zsailer (Member)

Zsailer commented Sep 30, 2019

I'll port this to jupyter_server, too.

@@ -802,6 +808,14 @@ def _token_default(self):
"""
)

min_open_files_limit = Integer(4096, config=True,
@mpacer (Member) commented Oct 7, 2019

Feel free to ignore this suggestion, but I think the traitlet name would be clearer if it were soft_open_files_limit. The reason I suggest this is that this is a soft bound on the maximum number of files that can be created, rather than a minimum number of files that must be opened.

@mpacer (Member) commented Oct 7, 2019

To be fair, you could call it simply open_files_limit as well, since in practice we also increase the hard limit if the soft limit being set exceeds the existing hard limit.

jasongrout (Member Author)

I called it min_open_files_limit because I only set it if the current open files limit is lower than this number (i.e., I will never decrease an open files limit to this number, only raise lower open file limits to at least this.)

https://github.com/jupyter/notebook/pull/4893/files#diff-bbd307804bb6b8836da9f06b8d8a06e9R1405

That said, I also think the name is confusing, though descriptive, and would love an easier-to-understand name.
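To make those semantics concrete, here is a minimal standalone sketch of the raise-only behaviour described above (roughly what the change does; the function name is mine, not from the PR):

import logging
import resource  # Unix-only; not available on Windows

log = logging.getLogger(__name__)

def maybe_raise_open_files_limit(min_open_files_limit=4096):
    """Raise the soft limit on open file handles to at least the given value; never lower it."""
    old_soft, old_hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    soft, hard = min_open_files_limit, old_hard
    # Only act when the current soft limit is below the configured minimum.
    if old_soft < soft:
        if hard < soft:
            # Also raise the hard limit when the new soft limit would exceed it.
            hard = soft
        log.debug('Raising open file limit: soft %s->%s; hard %s->%s',
                  old_soft, soft, old_hard, hard)
        resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))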

@lresende lresende added this to the 6.0.2 milestone Oct 21, 2019
@lresende lresende merged commit 5acbc15 into jupyter:master Oct 21, 2019
self.log.debug(
    'Raising open file limit: soft {}->{}; hard {}->{}'.format(old_soft, soft, old_hard, hard)
)
resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))


According to the documentation (https://docs.python.org/3/library/resource.html):

Raises ValueError if an invalid resource is specified, if the new soft limit exceeds the hard limit, or if a process tries to raise its hard limit. ... A process with the effective UID of super-user can request any valid limit value.

So it seems this will fail for non-root.
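One way to guard against that failure mode (a defensive sketch based on the documented behaviour above, not something this PR implements) is to catch the ValueError and fall back to raising only the soft limit, capped at the existing hard limit:

import logging
import resource  # Unix-only

log = logging.getLogger(__name__)

def set_open_files_limit_safely(soft, hard):
    try:
        resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))
    except ValueError:
        # Raising the hard limit requires privileges; fall back to raising
        # only the soft limit, capped at the current hard limit.
        old_soft, old_hard = resource.getrlimit(resource.RLIMIT_NOFILE)
        capped = min(soft, old_hard)
        if capped > old_soft:
            log.warning('Could not raise hard open-files limit; '
                        'raising soft limit to %d instead', capped)
            resource.setrlimit(resource.RLIMIT_NOFILE, (capped, old_hard))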

Successfully merging this pull request may close these issues.

Getting OSError: [Errno 24] Too many open files