
[QUESTION] aiohttp.ClientSession.request('GET') issue #1403

Closed
ja8zyjits opened this issue Nov 15, 2016 · 5 comments

@ja8zyjits

ja8zyjits commented Nov 15, 2016

Long story short

I am trying to build a proxy server, but I am unable to proxy a GET request properly.

from aiohttp import web
import aiohttp
import asyncio
import async_timeout


loop = asyncio.get_event_loop()

async def handle(request):
    url = request.rel_url
    kwargs = {"headers": request.headers}
    if request.has_body:
        kwargs['data'] = await request.read()  # read() is a coroutine and must be awaited
    print(kwargs)
    async with aiohttp.ClientSession(loop=loop) as session:
        with async_timeout.timeout(10):
            async with session.request(
                request.method, url, **kwargs) as response:
                html = await response.content.read()
                headers = dict(response.headers)
                status = response.status
            #if 'Content-Encoding' in headers:
            #    del headers['Content-Encoding']
        return web.Response(body=html, headers=headers, status=status)

app = web.Application(loop=loop)
app.router.add_route('GET', '/{url:.*}', handle)
app.router.add_route('POST', '/{url:.*}', handle)
web.run_app(app, port=5000)

Expected behaviour

When we request a URL through the proxy, we should get the complete, correct response after all redirects, etc.

>>> import requests
>>> p = {'http': 'http://127.0.0.1:5000', 'https': 'http://127.0.0.1:5000'}
>>> response = requests.get("http://www.facebook.com", proxies=p)
>>> response
<Response [200]>

Actual behaviour

Instead, we are getting an error like this:

In [59]: response = requests.get("http://www.facebook.com", proxies=p)
---------------------------------------------------------------------------
ChunkedEncodingError                      Traceback (most recent call last)
<ipython-input-59-404a34deb489> in <module>()
----> 1 response = requests.get("http://www.facebook.com", proxies=p)

/usr/local/lib/python2.7/dist-packages/requests/api.pyc in get(url, params, **kwargs)
     69 
     70     kwargs.setdefault('allow_redirects', True)
---> 71     return request('get', url, params=params, **kwargs)
     72 
     73 

/usr/local/lib/python2.7/dist-packages/requests/api.pyc in request(method, url, **kwargs)
     55     # cases, and look like a memory leak in others.
     56     with sessions.Session() as session:
---> 57         return session.request(method=method, url=url, **kwargs)
     58 
     59 

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    473         }
    474         send_kwargs.update(settings)
--> 475         resp = self.send(prep, **send_kwargs)
    476 
    477         return resp

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in send(self, request, **kwargs)
    615 
    616         if not stream:
--> 617             r.content
    618 
    619         return r

/usr/local/lib/python2.7/dist-packages/requests/models.pyc in content(self)
    739                     self._content = None
    740                 else:
--> 741                     self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
    742 
    743             except AttributeError:

/usr/local/lib/python2.7/dist-packages/requests/models.pyc in generate()
    665                         yield chunk
    666                 except ProtocolError as e:
--> 667                     raise ChunkedEncodingError(e)
    668                 except DecodeError as e:
    669                     raise ContentDecodingError(e)

ChunkedEncodingError: ('Connection broken: IncompleteRead(16 bytes read)', IncompleteRead(16 bytes read))

For some sites, e.g. nginx, I am getting a slightly different error:

In [61]: response = requests.get("http://www.nginx.org", proxies=p)
---------------------------------------------------------------------------
ContentDecodingError                      Traceback (most recent call last)
<ipython-input-61-d57994e03924> in <module>()
----> 1 response = requests.get("http://www.nginx.org", proxies=p)

/usr/local/lib/python2.7/dist-packages/requests/api.pyc in get(url, params, **kwargs)
     69 
     70     kwargs.setdefault('allow_redirects', True)
---> 71     return request('get', url, params=params, **kwargs)
     72 
     73 

/usr/local/lib/python2.7/dist-packages/requests/api.pyc in request(method, url, **kwargs)
     55     # cases, and look like a memory leak in others.
     56     with sessions.Session() as session:
---> 57         return session.request(method=method, url=url, **kwargs)
     58 
     59 

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    473         }
    474         send_kwargs.update(settings)
--> 475         resp = self.send(prep, **send_kwargs)
    476 
    477         return resp

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in send(self, request, **kwargs)
    604 
    605         # Resolve redirects if allowed.
--> 606         history = [resp for resp in gen] if allow_redirects else []
    607 
    608         # Shuffle things around if there's history.

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in resolve_redirects(self, resp, req, stream, timeout, verify, cert, proxies, **adapter_kwargs)
    177                 proxies=proxies,
    178                 allow_redirects=False,
--> 179                 **adapter_kwargs
    180             )
    181 

/usr/local/lib/python2.7/dist-packages/requests/sessions.pyc in send(self, request, **kwargs)
    615 
    616         if not stream:
--> 617             r.content
    618 
    619         return r

/usr/local/lib/python2.7/dist-packages/requests/models.pyc in content(self)
    739                     self._content = None
    740                 else:
--> 741                     self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
    742 
    743             except AttributeError:

/usr/local/lib/python2.7/dist-packages/requests/models.pyc in generate()
    667                     raise ChunkedEncodingError(e)
    668                 except DecodeError as e:
--> 669                     raise ContentDecodingError(e)
    670                 except ReadTimeoutError as e:
    671                     raise ConnectionError(e)

ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing: incorrect header check',))

Steps to reproduce

Copy the server code above and run it in one terminal; test it from another terminal with IPython.

Your environment

Python 3.5 virtualenv on Ubuntu 16.04; the IPython client is running on Python 2.7.

@popravich popravich added the question StackOverflow label Nov 15, 2016
@ja8zyjits
Author

I made a breakthrough: the problem is with the headers.
I tried

if 'Content-Encoding' in headers:
    del headers['Content-Encoding']
if 'Transfer-Encoding' in headers:
    del headers['Transfer-Encoding']

Is there any way to convert a ClientResponse object into an aiohttp.web.Response object so that I can forward the headers properly?
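The header deletions above can be generalized. A minimal sketch of the idea, assuming the usual explanation for these errors: aiohttp's client has already decompressed and de-chunked the upstream body, so the stored Content-Encoding, Transfer-Encoding, and Content-Length values no longer describe the bytes being re-sent. The `DROP_HEADERS` set and `forwardable_headers` helper below are illustrative names, not aiohttp API:

```python
# Headers a naive proxy should not blindly forward.  The connection-level
# ones are hop-by-hop (RFC 7230 section 6.1); Content-Encoding and
# Content-Length are dropped here because the client has already
# decompressed the body, so the stored values no longer match it.
DROP_HEADERS = {
    "connection", "keep-alive", "proxy-authenticate", "proxy-authorization",
    "te", "trailers", "transfer-encoding", "upgrade",
    "content-encoding", "content-length",
}

def forwardable_headers(upstream_headers):
    """Return only the headers that are safe to copy onto the outgoing response."""
    return {k: v for k, v in upstream_headers.items()
            if k.lower() not in DROP_HEADERS}

safe = forwardable_headers({
    "Content-Encoding": "gzip",
    "Transfer-Encoding": "chunked",
    "Content-Type": "text/html",
})
print(safe)  # only Content-Type survives the filter
```

The filtered dict can then be passed as the `headers=` argument when building the outgoing `web.Response`.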

@asvetlov
Member

asvetlov commented Nov 15, 2016

There is no such function in aiohttp, sorry.
A full-fledged proxy server usually does more work than just passing data from the source socket to the destination and back:
it serves WebSockets, adds Via/Forwarded headers, etc.
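To illustrate the point about Via/Forwarded: a hypothetical helper (the `add_proxy_headers` name and the `myproxy` pseudonym are invented for this sketch, not aiohttp API) that appends the entries a conforming intermediary would add per RFC 7230 section 5.7.1 and RFC 7239:

```python
def add_proxy_headers(headers, client_ip, pseudonym="myproxy"):
    """Append Via (RFC 7230 sec. 5.7.1) and Forwarded (RFC 7239) entries.

    `pseudonym` and `client_ip` are illustrative; a real proxy would use
    its own hostname and the peer address of the incoming connection.
    """
    via = "1.1 " + pseudonym
    # Via is additive: each intermediary appends itself to the chain.
    headers["Via"] = (headers["Via"] + ", " + via) if "Via" in headers else via
    headers["Forwarded"] = "for=%s" % client_ip
    return headers

h = add_proxy_headers({"Host": "example.com"}, "203.0.113.7")
print(h["Via"], "|", h["Forwarded"])
```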

@ja8zyjits
Author

@asvetlov oh I see, thanks.
But it is still a great library 👍

@asvetlov
Member

Thank you

@lock

lock bot commented Oct 29, 2019

This thread has been automatically locked since there has not been
any recent activity after it was closed. Please open a new issue for
related bugs.

If you feel there are important points made in this discussion,
please include those excerpts in the new issue.

@lock lock bot added the outdated label Oct 29, 2019
@lock lock bot locked as resolved and limited conversation to collaborators Oct 29, 2019