Added text streams to AIChat UI #18376
Conversation
A Storybook has been deployed to preview UI for the latest push |
Sorry for the delayed response.
components/ai_chat/ai_chat_api.cc
Outdated
```cpp
// new instance of the process for each call, which may result in unordered
// chunks. Thus, we create a single instance of the parser and reuse it for
// consecutive calls
data_decoder_ = std::make_unique<data_decoder::DataDecoder>();
```
Nice find! 👍 I didn't know about data_decoder::DataDecoder
components/ai_chat/ai_chat_api.cc
Outdated
```cpp
std::string response = result.body();
const base::Value::Dict* dict = result.value_body().GetIfDict();
is_request_in_progress_ = false;
current_request_->reset(nullptr);
```
Resetting the SimpleURLLoader doesn't seem to be enough, as the entity in APIRequestHelper::url_loaders_ will remain. I think we should do what APIRequestHelper::Cancel() does.
Done. I just had to run the cancel after running the callback.
Then, could you remove this line?
current_request_ is a ::Ticket type, which is not a unique_ptr itself but an iterator. When we Cancel(), doesn't it just remove the entry from the list, and isn't the caller supposed to explicitly reset or reassign it to nullptr?
When a list entry is erased, the entry's destructor is called. In this case, the entry is a unique_ptr, so it would free the memory for us, wouldn't it?
Sorry for the delayed response.
LGTM with a nit
DEPS
lgtm with a nit.
components/ai_chat/DEPS
Outdated
```
@@ -1,4 +1,5 @@
include_rules = [
  "+services/data_decoder/public",
```
pls sort
done
Nice work getting it working! Few comments...
```cpp
@@ -111,6 +110,18 @@ class APIRequestHelper {
      const APIRequestOptions& request_options = {},
      ResponseConversionCallback conversion_callback = base::NullCallback());

  Ticket Request(const std::string& method,
```
This public function declaration could do with a comment.
components/ai_chat/ai_chat_api.cc
Outdated
```cpp
current_request_ = api_request_helper_.Request(
    "POST", api_url, CreateJSONRequestBody(dict), "application/json", this,
    headers, {},
    base::BindOnce(&AIChatAPI::OnResponseStarted,
                   weak_ptr_factory_.GetWeakPtr()),
    base::BindRepeating(&AIChatAPI::OnDownloadProgress,
                        weak_ptr_factory_.GetWeakPtr()));
}
```
Most of the benefit of APIRequestHelper seems to be the boilerplate for handling the response, where it parses JSON and handles various situations. It doesn't seem worth using it anymore if we're just bypassing it. Wouldn't it be almost as much code to make the request here via SimpleURLLoader? Or, perhaps preferably, to put the event-stream handling inside APIRequestHelper? Since it's part of the regular HTTP spec, it seems like it should go there, where any consumers can benefit; it already has JSON parsing. What do you think? We could just detect the response header `content-type: text/event-stream` and do what you have here instead.
components/ai_chat/ai_chat_api.h
Outdated
```cpp
ResponseCallback response_callback_;
CompletionCallback completion_callback_;
```
Why store these callbacks here instead of passing them through as arguments bound to the event handlers? It seems like we're deepening the current architectural issue where request and callback state is not cleared when the conversation changes as a result of a navigation event.
components/ai_chat/ai_chat_api.cc
Outdated
```cpp
std::move(callback).Run(response, success);
DCHECK(response_callback_);

if (const std::string* completion = result->FindStringKey("completion")) {
```
What if an error occurs and there is no completion key? Shouldn't the callback get notified?
The safe thing to do here, I'd say, is to drop the callback and debug why there's no completion key.
The callback this function reports to is critical because, on the other end, it consumes the message that updates the UI.
components/ai_chat/ai_chat_api.cc
Outdated
```cpp
void AIChatAPI::OnResponseStarted(
    const GURL& final_url,
    const network::mojom::URLResponseHead& response_head) {
  is_request_in_progress_ = true;
}

void AIChatAPI::OnDownloadProgress(uint64_t current) {
  is_request_in_progress_ = true;
}
```
Why not set `is_request_in_progress_` to `true` inside the `QueryPrompt` function when creating the request?
This has been handled in a new way in the tab helper itself.
```cpp
@@ -279,8 +298,8 @@ void AIChatTabHelper::DistillViaAlgorithm(const ui::AXTree& tree) {
void AIChatTabHelper::CleanUp() {
  chat_history_.clear();
  article_summary_.clear();
  SetRequestInProgress(false);
```
Removing this line from cleanup deepens the issue where navigating away does not clean up some state. Right now, that problem manifests as any in-progress requests showing their result on another page. Now that problem will also have the UI show as in-progress. If we're going to have `AIChatAPI` store state and be single-request-only, then perhaps it's time in this PR to add a `Cancel()` method to it.
components/ai_chat/ai_chat_api.cc
Outdated
```cpp
DVLOG(1) << __func__ << " Using model: " << model_name;

void AIChatAPI::OnDataReceived(base::StringPiece string_piece,
                               base::OnceClosure resume) {
  std::vector<std::string> stream_data = base::SplitString(
```
We should check for a success/error HTTP response code like `APIRequestResult` can (or re-use that code). Additionally, a request could still come back with any Content-Type response header, e.g. `application/json` if it's an error. We should check for `text/event-stream` before assuming.
```ts
  }
}, [props.list.length, props.isLoading])
}, [props.list.length, props.isLoading, lastConversationEntryElementRef.current?.clientHeight])
```
What's the reasoning for depending on a ref's child attribute? Changing that value won't trigger a React render.
The effect logic here aims to interact with the browser's scroll behavior rather than trigger a re-render. React will re-render to update the DOM, which can result in changes to the height of the `<div>`. When such changes occur, the effect should respond by triggering a scroll.
Perhaps this could be done better with a ResizeObserver.
one-shot requests
```cpp
@@ -253,32 +397,45 @@ void APIRequestHelper::OnResponse(
    raw_body = converted_body.value();
  }

  data_decoder::DataDecoder::ParseJsonIsolated(
```
@nullhook `APIRequestHelper` is used in multiple components. That change (`ParseJsonIsolated` => `ParseJson`) results in reusing `DataDecoder` instances in some cases (not only for `ai_chat_*`) and reduces the sanitizing security. Has that change passed through a sec-review?
we did not review this AFAIK
Sorry, didn't create a sec-review for this.
Resolves brave/brave-browser#29607
This PR adds support for consuming chunks from the Anthropic completion API, which improves the user experience by allowing messages to be sent to the UI as they come in, rather than sending the entire message as a single unit.
The PR follows the rule of two from the security docs. The SimpleURLLoader relies on the network process, while the data decoder relies on the utility process. We only read the values in the browser process.
Submitter Checklist:
- Added the appropriate labels (`QA/Yes` or `QA/No`; `release-notes/include` or `release-notes/exclude`; `OS/...`) to the associated issue
- Ran `npm run test -- brave_browser_tests`, `npm run test -- brave_unit_tests` (wiki)
- Ran `npm run lint`, `npm run presubmit` (wiki), `npm run gn_check`, `npm run tslint`
- Ran `git rebase master` (if needed)

Reviewer Checklist:
- Changes in component dependencies are properly reflected in `gn`

After-merge Checklist:
- The associated issue milestone is set to the smallest version that the changes have landed on
Test Plan:
This PR enables streaming responses by default. However, it's important to test both cases.
Prereqs:
Steps:
By default, you should observe the messages coming in real-time in chunks. Additionally, you will see a caret similar to a terminal-like interface.
Let's proceed to test the one-shot response by disabling the `ai_chat_sse` feature param: `--enable-features=AIChat:ai_chat_sse/false`
Repeat the test with the same article, and this time the message should be received as a whole, which may take some time.