Commit
Pop max concurrency when recursing (langchain-ai#12281)
- **Description:** Reset `max_concurrency` to `None` in the per-call config before `batch`/`abatch` recurse on the split chunks, so each recursive call processes its chunk directly instead of re-splitting it. Also forward `return_exceptions` through the recursive `abatch` call.
nfcampos authored Oct 25, 2023
1 parent 69f4e40 commit b5b2d07
Showing 1 changed file with 5 additions and 1 deletion.
6 changes: 5 additions & 1 deletion libs/langchain/langchain/llms/base.py
@@ -294,6 +294,7 @@ def batch(
                 inputs[i : i + max_concurrency]
                 for i in range(0, len(inputs), max_concurrency)
             ]
+            config = [{**c, "max_concurrency": None} for c in config]  # type: ignore[misc]
             return [
                 output
                 for batch in batches
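The `batch` change above can be illustrated with a minimal, self-contained sketch (this is not the actual LangChain implementation; the `"out:"` payload is purely illustrative). Clearing `max_concurrency` before recursing matters because a chunk whose length equals `max_concurrency` would otherwise be split into a single chunk of the same size, recursing forever:

```python
from typing import Any, Dict, List, Optional

def batch(inputs: List[Any], config: Dict[str, Any]) -> List[Any]:
    """Toy version of the recursive chunking pattern shown in the diff."""
    max_concurrency: Optional[int] = config.get("max_concurrency")
    if max_concurrency is None:
        # Base case: process the whole batch at once.
        return [f"out:{x}" for x in inputs]
    # Split the inputs into chunks of at most max_concurrency items.
    batches = [
        inputs[i : i + max_concurrency]
        for i in range(0, len(inputs), max_concurrency)
    ]
    # Pop max_concurrency before recursing; otherwise a chunk whose size
    # equals max_concurrency would be re-split into itself and never
    # reach the base case.
    config = {**config, "max_concurrency": None}
    return [output for chunk in batches for output in batch(chunk, config)]

print(batch([1, 2, 3, 4, 5], {"max_concurrency": 2}))
# → ['out:1', 'out:2', 'out:3', 'out:4', 'out:5']
```

Each recursive call now takes the base-case path, so the chunking happens exactly once.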
@@ -336,10 +337,13 @@ async def abatch(
                 inputs[i : i + max_concurrency]
                 for i in range(0, len(inputs), max_concurrency)
             ]
+            config = [{**c, "max_concurrency": None} for c in config]  # type: ignore[misc]
             return [
                 output
                 for batch in batches
-                for output in await self.abatch(batch, config=config, **kwargs)
+                for output in await self.abatch(
+                    batch, config=config, return_exceptions=return_exceptions, **kwargs
+                )
             ]

     def stream(
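The `abatch` hunk makes two fixes: it clears `max_concurrency` before recursing (same as `batch`) and forwards `return_exceptions` to the recursive call. A minimal async sketch (again not the real LangChain code; `process` is a hypothetical worker) shows why the second fix matters — without it, a chunked run would raise on the first failure even when the caller asked for exceptions to be collected:

```python
import asyncio
from typing import Any, Dict, List

async def process(x: int) -> int:
    # Hypothetical worker used only for illustration.
    if x < 0:
        raise ValueError(f"bad input: {x}")
    return x * 2

async def abatch(
    inputs: List[int],
    config: Dict[str, Any],
    *,
    return_exceptions: bool = False,
) -> List[Any]:
    max_concurrency = config.get("max_concurrency")
    if max_concurrency is None:
        # Base case: gather everything, optionally collecting exceptions
        # in the result list instead of raising the first one.
        return await asyncio.gather(
            *(process(x) for x in inputs), return_exceptions=return_exceptions
        )
    batches = [
        inputs[i : i + max_concurrency]
        for i in range(0, len(inputs), max_concurrency)
    ]
    # Clear max_concurrency and forward return_exceptions, mirroring
    # the two changes in the abatch hunk above.
    config = {**config, "max_concurrency": None}
    return [
        out
        for chunk in batches
        for out in await abatch(chunk, config, return_exceptions=return_exceptions)
    ]

results = asyncio.run(
    abatch([1, -2, 3], {"max_concurrency": 2}, return_exceptions=True)
)
print(results)
# → [2, ValueError('bad input: -2'), 6]
```

Because the inner `asyncio.gather` receives the forwarded flag, the failing item comes back as an exception object in position, and the remaining items still complete.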
