
Aiohttp failing to get response on awaited urls #3432

Closed
ChakshuGautam opened this issue Dec 6, 2018 · 6 comments

@ChakshuGautam

Long story short

Aiohttp is failing to get a response on awaited URLs:

    allUrls = await getAllUrls(a, b, c)
    urls = ['url1', 'url3']
    result = await fetchAll(session, tasks, urls)
    return json(result)

When using the awaited allUrls, I get the following error:
aiohttp.client_exceptions.ContentTypeError: 0, message='Attempt to decode JSON with unexpected mimetype: '

When I hardcode the URLs (captured by printing them earlier) into the urls variable, I get the expected output.

Expected behaviour

It should be the same in both cases. I am dynamically generating the URLs and might be missing something really basic.

@aio-libs-bot

GitMate.io thinks the contributor most likely able to help you is @asvetlov.

Possibly related issues are #660 (aiohttp.request hangs on some URLs), #2920 (AIOHttp failing after some requests), #2540 (Drop await aiohttp.request(...)), #1403 ([QUESTION] aiohttp.ClientSession.request('GET') issue), and #3427 (Readd aiohttp.{get,post,...} shortcuts).

@asvetlov
Member

asvetlov commented Dec 6, 2018

I'm afraid we cannot help you: none of the three functions you use (getAllUrls(), fetchAll(), json()) is part of the aiohttp API.
There is no way for us to run your code and reproduce the problem.

Hint: the exception is raised by resp.json() when the returned response is not JSON (technically, a well-formed JSON response should carry a Content-Type: application/json header). Passing the content_type=None argument disables the check.
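As a minimal sketch of that hint (the URL and the fetch_json helper are placeholders, not part of the issue's actual code), disabling the Content-Type check looks like:

```python
import asyncio
import aiohttp

async def fetch_json(session: aiohttp.ClientSession, url: str):
    # resp.json() raises ContentTypeError unless the server sends
    # Content-Type: application/json; content_type=None skips that check.
    async with session.get(url) as resp:
        return await resp.json(content_type=None)

async def main():
    async with aiohttp.ClientSession() as session:
        # Placeholder URL for illustration only.
        data = await fetch_json(session, "https://example.com/api")
        print(data)

# To run: asyncio.run(main())
```

The response body must still be valid JSON; content_type=None only suppresses the header check, not the JSON parsing itself.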

@ChakshuGautam
Author

Hi,

Thanks for the prompt reply.

Sorry for not providing enough information.

async def fetchAll(session, tasks, urls):
    # (The original signature, fetchAll(session, tasks=[], urls), is a
    # SyntaxError: a non-default parameter cannot follow one with a
    # default, so the tasks=[] default has been dropped.)
    for url in urls:
        print(url)
        task = asyncio.ensure_future(fetchNew(session, url))
        tasks.append(task)
    result = await asyncio.gather(*tasks)
    return result

getAllUrls() is a simple async function which reads the DB and returns an array of URLs. I will provide a standalone file to replicate the issue.

Note: I am using this with Sanic
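A standalone sketch along these lines (fetchNew, the URL list, and the content_type=None tolerance are hypothetical stand-ins, not the issue's actual code) might look like:

```python
import asyncio
import aiohttp

async def fetchNew(session, url):
    # Hypothetical per-URL fetcher; content_type=None tolerates servers
    # that serve JSON without a Content-Type: application/json header.
    async with session.get(url) as resp:
        return await resp.json(content_type=None)

async def fetchAll(session, tasks, urls):
    # Schedule one task per URL, then await them all concurrently.
    for url in urls:
        tasks.append(asyncio.ensure_future(fetchNew(session, url)))
    return await asyncio.gather(*tasks)

async def main():
    urls = ["https://example.com/a", "https://example.com/b"]  # placeholders
    async with aiohttp.ClientSession() as session:
        results = await fetchAll(session, [], urls)
        print(results)

# To run: asyncio.run(main())
```

Passing a fresh [] for tasks on each call avoids sharing one list across calls, which the original default argument would have done.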

@webknjaz
Member

webknjaz commented Dec 6, 2018

fetchNew() is not an aiohttp function either. And aiohttp is not Sanic.

@ChakshuGautam
Author

Sorry guys, I found a small bug in my code that was causing this. Thanks for the prompt reply.

@lock

lock bot commented Dec 7, 2019

This thread has been automatically locked since there has not been
any recent activity after it was closed. Please open a new issue for
related bugs.

If you feel like there are important points made in this discussion,
please include those excerpts in that new issue.

lock bot added the outdated label Dec 7, 2019
lock bot locked as resolved and limited conversation to collaborators Dec 7, 2019