
Add stream_response option to hackney adapter #498

Open
wants to merge 7 commits into master
Conversation

tim-smart

Playing around with streaming responses (#271).

Only implemented for hackney here.

@tim-smart tim-smart force-pushed the streaming branch 3 times, most recently from 1283159 to a4b3802 Compare October 29, 2021 03:34
Hackney only allows the controller process to access the ref
url: "#{@http}/ip"
}

assert {:ok, %Env{} = response} = call(request, stream_to_pid: self())
Member

When passing a pid explicitly, I'd expect to handle the incoming messages manually.
Is there anything stopping us from using stream_response: true instead?

Does the implementation handle multiple concurrent requests originating from the same process?

Author

Hackney requires you to transfer ownership of the response to the desired PID, otherwise it will get GCed.

A workaround would be to capture self() during the request phase and assume that is the PID that will consume the stream.
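
For readers not familiar with hackney, here is a minimal sketch of the ownership issue described above. It uses hackney's documented request/5, controlling_process/2 and stream_body/1 functions; the URL, message shape and surrounding flow are illustrative, not code from this PR:

# Issue the request from the calling process; with default options the
# body is not read yet and `ref` is owned by the caller.
{:ok, _status, _headers, ref} =
  :hackney.request(:get, "https://example.com/large-file", [], "", [])

# Only the owning process may read the stream. To let another process
# consume it, ownership must be transferred explicitly; otherwise
# hackney eventually closes the connection and the body is lost.
consumer =
  spawn(fn ->
    receive do
      {:hackney_ref, ref} ->
        {:ok, chunk} = :hackney.stream_body(ref)
        IO.inspect(byte_size(chunk), label: "first chunk size")
    end
  end)

:ok = :hackney.controlling_process(ref, consumer)
send(consumer, {:hackney_ref, ref})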

Member

And what happens when the stream is consumed from a different process?

Author

From what I recall, Hackney shuts down the request after a short amount of time.

So for long-running responses you get a partial reply.

@teamon
Member

teamon commented Dec 17, 2021

Amazing! Let's make response streaming a thing 👍

@maxmarcon

👍 for this!

@teamon teamon linked an issue May 8, 2022 that may be closed by this pull request
@teamon teamon added the feature ✨ and hackney (Issues related to hackney adapter) labels and removed the awaiting-feedback label May 8, 2022
@cgarvis
Contributor

cgarvis commented Aug 23, 2022

@tim-smart @teamon anything I can do to get this over the finish line? This would allow me to remove a dependency on Downstream from my app.

@Hajto

Hajto commented Sep 18, 2022

Do you need any help with this? I'd also like to help get this over the finish line.

@byronalley

I'll add my voice to the "I'd love to see this feature merged, can I help?" crowd!

Tim Smart added 3 commits October 13, 2022 12:41
* Rename `stream_to_pid` to `stream`
* Add `__pid__` to `Env`, for capturing the process that initiated the
  request
* hackney: add `stream_owner` option, which defaults to `env.__pid__`
* Defaults to `:default`
* Has `:stream` option, which attempts to stream the response if the
  adapter supports it
@tim-smart
Author

Added some changes to improve the API surface.

  • Added response to the Env struct. Defaults to :default, but also has a :stream option
  • Added __pid__ to Env, for capturing the pid of the requestor process
  • Added stream_owner to the hackney adapter options, which defaults to __pid__ from the Env

New API example:

{:ok, %Env{body: stream}} = Tesla.get("https://google.com", response: :stream)

# Use the response stream
data = Enum.join(stream)
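
As a follow-up to the example above, the lazy body could also be consumed chunk by chunk instead of being joined in memory, e.g. piped straight to disk. A sketch assuming the proposed response: :stream option; the URL and output path are illustrative:

{:ok, %Env{body: stream}} =
  Tesla.get("https://example.com/big.json", response: :stream)

# Write each chunk to a file without buffering the whole body in memory.
stream
|> Stream.into(File.stream!("/tmp/big.json"))
|> Stream.run()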

@tim-smart
Author

Maybe leaving this as an adapter option is better, as it will not be implemented for every adapter.

@connorjacobsen

Hey @teamon, I used this as a base for a version that is adapter-option driven: https://github.com/connorjacobsen/tesla/tree/hackney-response-streaming

I would love to be able to use hackney to stream responses like you and Tim have detailed here. Is there anything I can do to help make that happen? Happy to take a stab at some more code as needed, would just want some direction from you so it fits with where you want to guide the library.
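
For context on the adapter-option-driven shape mentioned above: Tesla already lets callers pass per-request adapter options via opts: [adapter: [...]], so such a variant could look roughly like the sketch below. The :stream_response key itself is the proposal, not an existing hackney adapter option:

client = Tesla.client([], {Tesla.Adapter.Hackney, recv_timeout: 30_000})

# Hypothetical per-request opt-in to response streaming via adapter options.
{:ok, env} =
  Tesla.get(client, "https://example.com/data",
    opts: [adapter: [stream_response: true]]
  )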

@connorjacobsen

@tim-smart @yordis is there anything I can do to help with this PR? Or the fork I linked above? tesla is a fantastic library and I would love to be able to contribute a little something back to it if it's helpful!

@yordis
Member

yordis commented May 9, 2023

@connorjacobsen, by all means, please take the lead.

In the worst case, @tim-smart already solved it differently, we commit to one of the solutions, and you get to learn about Tesla internals; from my perspective, that's still a huge win.
In the best case, we fix the issue.

I am unfamiliar with hackney, but I can help to some extent.

@connorjacobsen

@yordis I am equally happy with either solution. I basically just took @tim-smart's comment and riffed on it in a way that kept almost everything isolated to the Hackney adapter itself rather than adding the response field to the core Tesla.Env struct.

So I guess my question for you and others is: do you want to isolate this to the Hackney adapter, or should we go down a more generalized path and add streaming for all adapters (all seem to support streaming responses)? If the latter, do we want to pursue that one-by-one or all at once? Do you have a preference?

As a meta question: is this the right place to discuss this? Is there somewhere better?

@yordis
Member

yordis commented May 10, 2023

Ideally, we add streaming to all adapters.

About one-by-one vs. all at once: if you are giving it a try, I'd rather you do one big PR where you experiment and find a path forward that works for the official adapters, and if it turns out to be too big, we split it into smaller PRs.

I worry about merging something broken or error-prone that would prevent us from releasing a version.

@teamon
Member

teamon commented May 10, 2023

There is also this PR #540 that implements streaming for Finch.

I’d say we go with the response: :stream opt and implement each adapter separately.
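
For anyone picking this up per adapter: on the hackney side, one way to expose a lazy body is to wrap stream_body/1 in Stream.resource/3. A rough sketch based on hackney's documented API, not code from this PR; the module name is hypothetical:

defmodule StreamingSketch do
  # Turn a hackney client ref into a lazy stream of body chunks,
  # closing the connection once the stream is done or halted.
  def body_stream(ref) do
    Stream.resource(
      fn -> ref end,
      fn ref ->
        case :hackney.stream_body(ref) do
          {:ok, chunk} -> {[chunk], ref}
          :done -> {:halt, ref}
          {:error, reason} -> raise "hackney stream error: #{inspect(reason)}"
        end
      end,
      fn ref -> :hackney.close(ref) end
    )
  end
end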

@yordis
Member

yordis commented May 10, 2023

Either way is fine by me

@noozo

noozo commented Jun 19, 2023

Any progress on this?

@feld

feld commented Aug 29, 2024

Is there a branch that best embodies an approach likely to be merged that we can collaborate on, and/or is there a way we can put a bounty on this?

@yordis
Member

yordis commented Aug 29, 2024

@feld I haven't used Hackney in years, so I wouldn't consider myself qualified to have strong professional judgment here.

I would appreciate it if folks tested it and shared their code reviews.

@yordis yordis force-pushed the master branch 6 times, most recently from 2bca420 to fe7207c Compare October 24, 2024 16:11
Labels
feature ✨, hackney (Issues related to hackney adapter)
Development

Successfully merging this pull request may close these issues.

Streaming Response
10 participants