[Storage] Added connection pool note to max_concurrency kwarg for upload/download APIs (#38254)
weirongw23-msft authored Nov 4, 2024
1 parent 9c40f03 commit 957903c
Showing 6 changed files with 38 additions and 12 deletions.
@@ -540,8 +540,9 @@ def upload_blob(
     value specified in this header, the request will fail with
     MaxBlobSizeConditionNotMet error (HTTP status code 412 - Precondition Failed).
 :keyword int max_concurrency:
-    Maximum number of parallel connections to use when the blob size exceeds
-    64MB.
+    Maximum number of parallel connections to use when transferring the blob in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword ~azure.storage.blob.CustomerProvidedEncryptionKey cpk:
     Encrypts the data on the service-side with the given key.
     Use of customer-provided keys must be done over HTTPS.
@@ -695,7 +696,9 @@ def download_blob(
     As the encryption key itself is provided in the request,
     a secure connection must be established to transfer the key.
 :keyword int max_concurrency:
-    The number of parallel connections with which to download.
+    Maximum number of parallel connections to use when transferring the blob in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword Optional[str] encoding:
     Encoding to decode the downloaded bytes. Default is None, i.e. no decoding.
 :keyword progress_hook:
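Note on the new wording for the sync BlobClient: max_concurrency only controls how many chunks the SDK uploads or downloads in parallel; the HTTP connection pool is sized elsewhere. A minimal sketch of one way to line the two up, assuming azure-core's RequestsTransport with a caller-supplied requests.Session (all names and values below are placeholders, not part of this commit):

# Sketch (assumption, not from this commit): size the connection pool to match
# max_concurrency. All names and values are placeholders.
import requests
from requests.adapters import HTTPAdapter

from azure.core.pipeline.transport import RequestsTransport
from azure.storage.blob import BlobClient

POOL_SIZE = 16  # hypothetical; keep >= max_concurrency

# Build a requests session whose pool can serve all parallel chunk transfers.
session = requests.Session()
adapter = HTTPAdapter(pool_connections=POOL_SIZE, pool_maxsize=POOL_SIZE)
session.mount("https://", adapter)

blob = BlobClient.from_connection_string(
    "<connection-string>",          # placeholder
    container_name="my-container",  # placeholder
    blob_name="large-file.bin",     # placeholder
    transport=RequestsTransport(session=session),
)

with open("large-file.bin", "rb") as data:
    # max_concurrency parallelizes the chunked upload; the pool above is
    # configured separately, exactly as the updated docstring warns.
    blob.upload_blob(data, overwrite=True, max_concurrency=POOL_SIZE)

Sharing one such transport across several clients keeps the pool sizing in one place.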
@@ -530,8 +530,9 @@ async def upload_blob(
     value specified in this header, the request will fail with
     MaxBlobSizeConditionNotMet error (HTTP status code 412 - Precondition Failed).
 :keyword int max_concurrency:
-    Maximum number of parallel connections to use when the blob size exceeds
-    64MB.
+    Maximum number of parallel connections to use when transferring the blob in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword ~azure.storage.blob.CustomerProvidedEncryptionKey cpk:
     Encrypts the data on the service-side with the given key.
     Use of customer-provided keys must be done over HTTPS.
@@ -687,7 +688,9 @@ async def download_blob(
     As the encryption key itself is provided in the request,
     a secure connection must be established to transfer the key.
 :keyword int max_concurrency:
-    The number of parallel connections with which to download.
+    Maximum number of parallel connections to use when transferring the blob in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword str encoding:
     Encoding to decode the downloaded bytes. Default is None, i.e. no decoding.
 :keyword progress_hook:
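The async clients carry the same caveat, except the pool there is the aiohttp connector limit. A hedged sketch, again with placeholder names, assuming AioHttpTransport wrapping a caller-owned aiohttp session:

# Sketch (assumption): pair max_concurrency with an aiohttp connector limit.
import asyncio

import aiohttp
from azure.core.pipeline.transport import AioHttpTransport
from azure.storage.blob.aio import BlobClient

POOL_SIZE = 16  # hypothetical; keep >= max_concurrency

async def main():
    # Caller-owned session, so the connector limit (the "connection pool") is explicit.
    session = aiohttp.ClientSession(connector=aiohttp.TCPConnector(limit=POOL_SIZE))
    transport = AioHttpTransport(session=session, session_owner=False)
    try:
        async with BlobClient.from_connection_string(
            "<connection-string>",          # placeholder
            container_name="my-container",  # placeholder
            blob_name="large-file.bin",     # placeholder
            transport=transport,
        ) as blob:
            # max_concurrency controls parallel chunk downloads only.
            downloader = await blob.download_blob(max_concurrency=POOL_SIZE)
            data = await downloader.readall()
    finally:
        await session.close()

asyncio.run(main())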
@@ -493,6 +493,10 @@ def upload_data(
     see `here <https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/storage/azure-storage-file-datalake
     #other-client--per-operation-configuration>`_. This method may make multiple calls to the service and
     the timeout will apply to each call individually.
+:keyword int max_concurrency:
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword int chunk_size:
     The maximum chunk size for uploading a file in chunks.
     Defaults to 100*1024*1024, or 100MB.
@@ -775,7 +779,9 @@ def download_file(self, offset=None, length=None, **kwargs):
     Use of customer-provided keys must be done over HTTPS.
     Required if the file was created with a Customer-Provided Key.
 :keyword int max_concurrency:
-    The number of parallel connections with which to download.
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword int timeout:
     Sets the server-side timeout for the operation in seconds. For more details see
     https://learn.microsoft.com/rest/api/storageservices/setting-timeouts-for-blob-service-operations.
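For the Data Lake client the change also documents max_concurrency on upload_data for the first time; usage mirrors the blob client, and the connection pool still comes from the transport as in the sketches above. A short assumed example with placeholder names:

# Sketch (assumption): DataLakeFileClient honours the same kwarg; the pool
# still comes from the transport, as in the blob example above.
from azure.storage.filedatalake import DataLakeFileClient

file_client = DataLakeFileClient.from_connection_string(
    "<connection-string>",             # placeholder
    file_system_name="my-filesystem",  # placeholder
    file_path="dir/large-file.bin",    # placeholder
)

with open("large-file.bin", "rb") as data:
    # Parallel chunked upload; chunk_size and max_concurrency tune the transfer,
    # not the HTTP connection pool.
    file_client.upload_data(data, overwrite=True, max_concurrency=8)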
@@ -406,6 +406,10 @@ async def upload_data(
     see `here <https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/storage/azure-storage-file-datalake
     #other-client--per-operation-configuration>`_. This method may make multiple calls to the service and
     the timeout will apply to each call individually.
+:keyword int max_concurrency:
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword int chunk_size:
     The maximum chunk size for uploading a file in chunks.
     Defaults to 100*1024*1024, or 100MB.
@@ -623,7 +627,9 @@ async def download_file(self, offset=None, length=None, **kwargs):
     Use of customer-provided keys must be done over HTTPS.
     Required if the file was created with a Customer-Provided Key.
 :keyword int max_concurrency:
-    The number of parallel connections with which to download.
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword int timeout:
     Sets the server-side timeout for the operation in seconds. For more details see
     https://learn.microsoft.com/rest/api/storageservices/setting-timeouts-for-blob-service-operations.
@@ -535,7 +535,9 @@ def upload_file(
     already validate. Note that this MD5 hash is not stored with the
     file.
 :keyword int max_concurrency:
-    Maximum number of parallel connections to use.
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword lease:
     Required if the file has an active lease. Value can be a ShareLeaseClient object
     or the lease ID as a string.
@@ -805,7 +807,9 @@ def download_file(
     Number of bytes to read from the stream. This is optional, but
     should be supplied for optimal performance.
 :keyword int max_concurrency:
-    Maximum number of parallel connections to use.
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword bool validate_content:
     If true, calculates an MD5 hash for each chunk of the file. The storage
     service checks the hash of the content that has arrived with the hash
@@ -532,7 +532,9 @@ async def upload_file(
     already validate. Note that this MD5 hash is not stored with the
     file.
 :keyword int max_concurrency:
-    Maximum number of parallel connections to use.
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword str encoding:
     Defaults to UTF-8.
 :keyword lease:
@@ -804,7 +806,9 @@ async def download_file(
     Number of bytes to read from the stream. This is optional, but
     should be supplied for optimal performance.
 :keyword int max_concurrency:
-    Maximum number of parallel connections to use.
+    Maximum number of parallel connections to use when transferring the file in chunks.
+    This option does not affect the underlying connection pool, and may
+    require a separate configuration of the connection pool.
 :keyword bool validate_content:
     If true, calculates an MD5 hash for each chunk of the file. The storage
     service checks the hash of the content that has arrived with the hash
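The share client hunks use the identical wording; one last assumed sketch for the download path (placeholders as before), where the downloader streams the chunks fetched in parallel:

# Sketch (assumption): ShareFileClient download with parallel chunks; configure
# the transport's pool separately if you raise max_concurrency significantly.
from azure.storage.fileshare import ShareFileClient

file_client = ShareFileClient.from_connection_string(
    "<connection-string>",           # placeholder
    share_name="my-share",           # placeholder
    file_path="dir/large-file.bin",  # placeholder
)

with open("large-file.bin", "wb") as target:
    # download_file returns a streaming downloader; readinto writes the chunks
    # fetched by up to max_concurrency parallel connections.
    stream = file_client.download_file(max_concurrency=8)
    stream.readinto(target)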
