fix retries and refactor httpsession (#877)
thehesiod authored Aug 20, 2021
1 parent d3db4f3 commit ef96004
Showing 21 changed files with 296 additions and 208 deletions.
24 changes: 15 additions & 9 deletions CHANGES.rst
@@ -1,38 +1,44 @@
Changes
-------
1.4.0 (2021-08-20)
^^^^^^^^^^^^^^^^^^
* fix retries via config `#877 <https://github.com/aio-libs/aiobotocore/pull/877>`_
* remove AioSession and get_session top level names to match botocore_
* change exceptions raised to match those of botocore_, see `mappings <https://github.com/aio-libs/aiobotocore/pull/877/files#diff-b1675e1eb4276bfae81107cda919ba446e4ce1b1e228a9e878d65dd1f474bf8cR162-R181>`_
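The exception-mapping idea behind the 1.4.0 entry above can be sketched generically. This is an illustrative stand-in, not aiobotocore's real types: transport-level errors get translated at one boundary into the library's own exception hierarchy, so callers only ever catch botocore-style exceptions regardless of which HTTP transport raised.

```python
# Generic sketch of the exception-mapping approach (all names are
# illustrative stand-ins, not aiobotocore's actual classes).

class TransportError(Exception):
    """Stand-in for a low-level aiohttp transport error."""

class LibraryConnectionError(Exception):
    """Stand-in for a botocore-style exception."""

# One explicit mapping table from transport to library exceptions.
_EXC_MAP = {TransportError: LibraryConnectionError}

def translate(exc: Exception) -> Exception:
    """Return the library-level equivalent of a transport error."""
    for src, dst in _EXC_MAP.items():
        if isinstance(exc, src):
            return dst(str(exc))
    return exc

def send_request() -> None:
    try:
        raise TransportError("connection reset")  # simulated failure
    except Exception as exc:
        raise translate(exc) from exc
```

Callers can then write `except LibraryConnectionError:` and stay insulated from the underlying transport.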

1.3.3 (2021-07-12)
^^^^^^^^^^^^^^^^^^
* fix AioJSONParser #872
* fix AioJSONParser `#872 <https://github.com/aio-libs/aiobotocore/issues/872>`_

1.3.2 (2021-07-07)
^^^^^^^^^^^^^^^^^^
* Bump to botocore 1.20.106
* Bump botocore_ to `1.20.106 <https://github.com/boto/botocore/tree/1.20.106>`_

1.3.1 (2021-06-11)
^^^^^^^^^^^^^^^^^^
* TCPConnector: change deprecated ssl_context to ssl
* fix non awaited generate presigned url calls #868
* fix non awaited generate presigned url calls `#868 <https://github.com/aio-libs/aiobotocore/issues/868>`_

1.3.0 (2021-04-09)
^^^^^^^^^^^^^^^^^^
* Bump to botocore 1.20.49 #856
* Bump botocore_ to `1.20.49 <https://github.com/boto/botocore/tree/1.20.49>`_ `#856 <https://github.com/aio-libs/aiobotocore/pull/856>`_

1.2.2 (2021-03-11)
^^^^^^^^^^^^^^^^^^
* Await call to async method _load_creds_via_assume_role #851 (thanks @puzza007)
* Await call to async method _load_creds_via_assume_role `#858 <https://github.com/aio-libs/aiobotocore/pull/858>`_ (thanks `@puzza007 <https://github.com/puzza007>`_)

1.2.1 (2021-02-10)
^^^^^^^^^^^^^^^^^^
* verify strings are now correctly passed to aiohttp.TCPConnector #851 (thanks @FHTMitchell)
* verify strings are now correctly passed to aiohttp.TCPConnector `#851 <https://github.com/aio-libs/aiobotocore/pull/851>`_ (thanks `@FHTMitchell <https://github.com/FHTMitchell>`_)

1.2.0 (2021-01-11)
^^^^^^^^^^^^^^^^^^
* bump botocore to 1.19.52
* use passed in http_session_cls param to create_client (#797)
* bump botocore to `1.19.52 <https://github.com/boto/botocore/tree/1.19.52>`_
* use passed in http_session_cls param to create_client `#797 <https://github.com/aio-libs/aiobotocore/issues/797>`_

1.1.2 (2020-10-07)
^^^^^^^^^^^^^^^^^^
* fix AioPageIterator search method #831 (thanks @joseph-jones)
* fix AioPageIterator search method #831 (thanks `@joseph-jones <https://github.com/joseph-jones>`_)

1.1.1 (2020-08-31)
^^^^^^^^^^^^^^^^^^
32 changes: 14 additions & 18 deletions README.rst
@@ -17,10 +17,7 @@ aiobotocore

Async client for amazon services using botocore_ and aiohttp_/asyncio_.

Main purpose of this library to support amazon s3 api, but other services
should work (may be with minor fixes). For now we have tested
only upload/download api for s3, other users report that SQS and Dynamo
services work also. More tests coming soon.
This library is a mostly full-featured asynchronous version of botocore.


Install
@@ -36,7 +33,7 @@ Basic Example
.. code:: python
import asyncio
import aiobotocore
from aiobotocore.session import get_session
AWS_ACCESS_KEY_ID = "xxx"
AWS_SECRET_ACCESS_KEY = "xxx"
Expand All @@ -48,7 +45,7 @@ Basic Example
folder = 'aiobotocore'
key = '{}/{}'.format(folder, filename)
session = aiobotocore.get_session()
session = get_session()
async with session.create_client('s3', region_name='us-west-2',
aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
aws_access_key_id=AWS_ACCESS_KEY_ID) as client:
@@ -90,36 +87,36 @@ Context Manager Examples
.. code:: python
from contextlib import AsyncExitStack
from aiobotocore.session import AioSession
# How to use in existing context manager
class Manager:
def __init__(self):
self._exit_stack = AsyncExitStack()
self._s3_client = None
async def __aenter__(self):
session = AioSession()
self._s3_client = await self._exit_stack.enter_async_context(session.create_client('s3'))
async def __aexit__(self, exc_type, exc_val, exc_tb):
await self._exit_stack.__aexit__(exc_type, exc_val, exc_tb)
# How to use with an external exit_stack
async def create_s3_client(session: AioSession, exit_stack: AsyncExitStack):
# Create client and add cleanup
client = await exit_stack.enter_async_context(session.create_client('s3'))
return client
async def non_manager_example():
session = AioSession()
async with AsyncExitStack() as exit_stack:
s3_client = await create_s3_client(session, exit_stack)
# do work with s3_client
@@ -186,8 +183,7 @@ Requirements
.. _Python: https://www.python.org
.. _asyncio: https://docs.python.org/3/library/asyncio.html
.. _botocore: https://github.com/boto/botocore
.. _aiohttp: https://github.com/KeepSafe/aiohttp

.. _aiohttp: https://github.com/aio-libs/aiohttp

awscli
------
5 changes: 1 addition & 4 deletions aiobotocore/__init__.py
@@ -1,4 +1 @@
from .session import get_session, AioSession

__all__ = ['get_session', 'AioSession']
__version__ = '1.3.3'
__version__ = '1.4.0'
128 changes: 16 additions & 112 deletions aiobotocore/endpoint.py
@@ -1,23 +1,17 @@
import aiohttp
import asyncio
import io
import pathlib
import os
import ssl
import sys

import aiohttp.http_exceptions
from aiohttp.client import URL
from botocore.endpoint import EndpointCreator, Endpoint, DEFAULT_TIMEOUT, \
MAX_POOL_CONNECTIONS, logger, history_recorder, create_request_object
from botocore.exceptions import ConnectionClosedError
from botocore.hooks import first_non_none_response
from botocore.utils import is_valid_endpoint_url
from multidict import MultiDict
from urllib.parse import urlparse
from urllib3.response import HTTPHeaderDict

from aiobotocore.httpsession import AIOHTTPSession
from aiobotocore.response import StreamingBody
from aiobotocore._endpoint_helpers import _text, _IOBaseWrapper, \
ClientResponseProxy
from aiobotocore._endpoint_helpers import ClientResponseProxy # noqa: F401, E501 lgtm [py/unused-import]


async def convert_to_response_dict(http_response, operation_model):
@@ -62,10 +56,6 @@ async def convert_to_response_dict(http_response, operation_model):


class AioEndpoint(Endpoint):
def __init__(self, *args, proxies=None, **kwargs):
super().__init__(*args, **kwargs)
self.proxies = proxies or {}

async def create_request(self, params, operation_model=None):
request = create_request_object(params)
if operation_model:
@@ -237,53 +227,15 @@ async def _needs_retry(self, attempts, operation_model, request_dict,
return True

async def _send(self, request):
# Note: When using aiobotocore with dynamodb, requests fail on crc32
# checksum computation as soon as the response data reaches ~5KB.
# When AWS response is gzip compressed:
# 1. aiohttp is automatically decompressing the data
# (http://aiohttp.readthedocs.io/en/stable/client.html#binary-response-content)
# 2. botocore computes crc32 on the uncompressed data bytes and fails
# cause crc32 has been computed on the compressed data
# The following line forces aws not to use gzip compression,
# if there is a way to configure aiohttp not to perform decompression,
# we can remove the following line and take advantage of
# aws gzip compression.
# https://github.com/boto/botocore/issues/1255
url = request.url
headers = request.headers
data = request.body

headers['Accept-Encoding'] = 'identity'
headers_ = MultiDict(
(z[0], _text(z[1], encoding='utf-8')) for z in headers.items())

# botocore does this during the request so we do this here as well
# TODO: this should be part of the ClientSession, perhaps make wrapper
proxy = self.proxies.get(urlparse(url.lower()).scheme)

if isinstance(data, io.IOBase):
data = _IOBaseWrapper(data)

url = URL(url, encoded=True)
resp = await self.http_session.request(
request.method, url=url, headers=headers_, data=data, proxy=proxy)

# If we're not streaming, read the content so we can retry any timeout
# errors, see:
# https://github.com/boto/botocore/blob/develop/botocore/vendored/requests/sessions.py#L604
if not request.stream_output:
await resp.read()

return resp
return await self.http_session.send(request)


class AioEndpointCreator(EndpointCreator):
# TODO: handle socket_options
def create_endpoint(self, service_model, region_name, endpoint_url,
verify=None, response_parser_factory=None,
timeout=DEFAULT_TIMEOUT,
max_pool_connections=MAX_POOL_CONNECTIONS,
http_session_cls=aiohttp.ClientSession,
http_session_cls=AIOHTTPSession,
proxies=None,
socket_options=None,
client_cert=None,
@@ -297,68 +249,20 @@ def create_endpoint(self, service_model, region_name, endpoint_url,
endpoint_prefix = service_model.endpoint_prefix

logger.debug('Setting %s timeout as %s', endpoint_prefix, timeout)

if isinstance(timeout, (list, tuple)):
conn_timeout, read_timeout = timeout
else:
conn_timeout = read_timeout = timeout

if connector_args is None:
# AWS has a 20 second idle timeout:
# https://forums.aws.amazon.com/message.jspa?messageID=215367
# aiohttp default timeout is 30s so set something reasonable here
connector_args = dict(keepalive_timeout=12)

timeout = aiohttp.ClientTimeout(
sock_connect=conn_timeout,
sock_read=read_timeout
)

verify = self._get_verify_value(verify)
ssl_context = None
if client_cert:
if isinstance(client_cert, str):
key_file = None
cert_file = client_cert
elif isinstance(client_cert, tuple):
cert_file, key_file = client_cert
else:
raise TypeError("client_cert must be str or tuple, not %s" %
client_cert.__class__.__name__)

ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ssl_context.load_cert_chain(cert_file, key_file)
elif isinstance(verify, (str, pathlib.Path)):
ssl_context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH,
cafile=str(verify))

if ssl_context:
# Enable logging of TLS session keys via defacto standard environment variable # noqa: E501
# 'SSLKEYLOGFILE', if the feature is available (Python 3.8+). Skip empty values. # noqa: E501
if hasattr(ssl_context, 'keylog_filename'):
keylogfile = os.environ.get('SSLKEYLOGFILE')
if keylogfile and not sys.flags.ignore_environment:
ssl_context.keylog_filename = keylogfile

# TODO: add support for proxies_config

connector = aiohttp.TCPConnector(
limit=max_pool_connections,
verify_ssl=bool(verify),
ssl=ssl_context,
**connector_args)

aio_session = http_session_cls(
connector=connector,
http_session = http_session_cls(
timeout=timeout,
skip_auto_headers={'CONTENT-TYPE'},
response_class=ClientResponseProxy,
auto_decompress=False)
proxies=proxies,
verify=self._get_verify_value(verify),
max_pool_connections=max_pool_connections,
socket_options=socket_options,
client_cert=client_cert,
proxies_config=proxies_config,
connector_args=connector_args
)

return AioEndpoint(
endpoint_url,
endpoint_prefix=endpoint_prefix,
event_emitter=self._event_emitter,
response_parser_factory=response_parser_factory,
http_session=aio_session,
proxies=proxies)
http_session=http_session)
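The `_send` refactor above reduces the endpoint to a single delegation: proxy selection, header encoding, and body buffering for retries now live in the session object instead of the endpoint. A stdlib-only sketch of that shape (all names are illustrative, not aiobotocore's real API):

```python
import asyncio


class FakeResponse:
    """Stand-in for an HTTP response object."""
    def __init__(self, status: int, body: bytes):
        self.status = status
        self.body = body


class HTTPSession:
    """Owns transport concerns: proxies, header normalization, buffering."""

    def __init__(self, proxies=None):
        self.proxies = proxies or {}

    async def send(self, request) -> FakeResponse:
        # In the real refactor, proxy lookup, header encoding, and
        # read-for-retry buffering happen here, not in the endpoint.
        await asyncio.sleep(0)  # stand-in for the network round trip
        return FakeResponse(200, b"ok")


class Endpoint:
    """After the refactor, the endpoint only delegates."""

    def __init__(self, http_session: HTTPSession):
        self.http_session = http_session

    async def _send(self, request) -> FakeResponse:
        # The whole method body after the refactor: one delegation.
        return await self.http_session.send(request)


async def demo() -> int:
    resp = await Endpoint(HTTPSession())._send(object())
    return resp.status
```

Keeping the endpoint transport-agnostic is what lets a caller-supplied `http_session_cls` replace the default session wholesale, as the `create_endpoint` signature above allows.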
