
C-API output tensors reset #2683

Merged
merged 14 commits into main from atobisze_output_tensors_reset on Oct 22, 2024

Conversation

atobiszei (Collaborator) commented Sep 13, 2024

🛠 Summary

This PR adds a safeguard to the C-API: when the user sets output buffers, the original output buffers are restored after inference completes. Without this, once output tensors were set for a given model they had to be supplied on every subsequent inference, because ov::InferRequest would keep the previously set output buffer and the next inference would overwrite it.

Ticket: CVS-141788, CVS-152665
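The safeguard described above can be sketched with a minimal, self-contained mock (all type and function names here are illustrative, not the actual OVMS API; the real code operates on ov::InferRequest): remember the original output tensors, install the user-provided buffers, run inference, then restore the originals so the next caller is not forced to supply output buffers too.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <utility>
#include <vector>

// Illustrative stand-in for an inference request holding per-output tensors.
struct MockInferRequest {
    std::map<std::string, std::vector<float>> outputTensors;

    std::vector<float>& getTensor(const std::string& name) { return outputTensors[name]; }
    void setTensor(const std::string& name, std::vector<float> t) { outputTensors[name] = std::move(t); }
    void infer() {
        for (auto& [name, t] : outputTensors)
            for (auto& v : t) v += 1.0f;  // pretend the model writes results
    }
};

// The safeguard pattern from this PR: save the originally set output tensors,
// point the request at user buffers, infer, then restore the originals.
void inferWithUserOutputs(MockInferRequest& req,
                          const std::map<std::string, std::vector<float>>& userBuffers) {
    std::map<std::string, std::vector<float>> originals;
    for (const auto& [name, buf] : userBuffers) {
        originals[name] = req.getTensor(name);  // remember what was set before
        req.setTensor(name, buf);               // install the user buffer
    }
    req.infer();
    for (auto& [name, original] : originals)
        req.setTensor(name, std::move(original));  // restore after inference
}
```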

🧪 Checklist

  • Unit tests added.
  • Documentation updated.
  • Change follows security best practices.

@atobiszei atobiszei added WIP Do not merge until resolved POC DO NOT MERGE labels Sep 13, 2024
@atobiszei atobiszei removed WIP Do not merge until resolved POC DO NOT MERGE labels Sep 25, 2024
@atobiszei atobiszei force-pushed the atobisze_output_tensors_reset branch from eaf5cd4 to 1ab3f0b Compare October 14, 2024 12:20
@atobiszei atobiszei force-pushed the atobisze_output_tensors_reset branch from bc84084 to 9ea8028 Compare October 17, 2024 09:20
src/modelinstance.cpp
src/BUILD Outdated
@@ -1578,7 +1578,7 @@ cc_library(
srcs = ["test/openvino_remote_tensors_tests.cpp"],
data = [
"test/c_api/config_standard_dummy.json",
Collaborator

How about standard dummy config?

src/BUILD Outdated
@@ -1627,7 +1627,7 @@ cc_library(
srcs = ["test/openvino_tests.cpp"],
data = [
"test/c_api/config_standard_dummy.json",
Collaborator

How about standard dummy config?

if (!status.ok()) {
return reinterpret_cast<OVMS_Status*>(new Status(std::move(status)));
}
return nullptr;
Collaborator

Is nullptr interpreted as OK status?

Collaborator Author

yes
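The exchange above can be made concrete with a minimal sketch of the convention (names here are hypothetical stand-ins, not the real OVMS declarations): the C-API returns a heap-allocated status object on failure and nullptr on success, so callers treat nullptr as OK.

```cpp
#include <cassert>
#include <string>
#include <utility>

// Internal status type; code 0 means success.
struct Status {
    int code;
    std::string message;
    bool ok() const { return code == 0; }
};

using OVMS_Status = void;  // opaque handle exposed to C callers

// On failure, hand ownership of a Status to the caller; on success, nullptr.
OVMS_Status* doWork(bool fail) {
    Status status = fail ? Status{1, "something went wrong"} : Status{0, ""};
    if (!status.ok()) {
        return reinterpret_cast<OVMS_Status*>(new Status(std::move(status)));
    }
    return nullptr;  // nullptr is interpreted as OK
}
```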

@@ -60,6 +60,13 @@ Status InferenceRequest::setOutputBuffer(const char* name, const void* addr, siz
}
return it->second.setBuffer(addr, byteSize, bufferType, deviceId);
}
Status InferenceRequest::removeOutputBuffer(const char* name) {
auto it = outputs.find(name);
Collaborator

Does it make sense to check name pointer validity first?

Collaborator Author

It's checked at capi.cpp
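The layering the author refers to can be sketched as follows (all names hypothetical, and the entry-point name is invented for illustration, not the real capi.cpp symbol): null checks happen once at the C-API boundary, so internal methods like removeOutputBuffer may assume a valid name pointer.

```cpp
#include <cassert>
#include <map>
#include <string>

enum class StatusCode { OK, NONEXISTENT_BUFFER, INVALID_POINTER };

struct InferenceRequest {
    std::map<std::string, std::string> outputs;  // name -> buffer (simplified)

    // Internal method: `name` is guaranteed non-null by the C-API layer.
    StatusCode removeOutputBuffer(const char* name) {
        auto it = outputs.find(name);
        if (it == outputs.end())
            return StatusCode::NONEXISTENT_BUFFER;
        outputs.erase(it);
        return StatusCode::OK;
    }
};

// Hypothetical C-API boundary function: pointer validity is checked here, once.
StatusCode removeOutputBufferCapi(InferenceRequest* req, const char* name) {
    if (req == nullptr || name == nullptr)
        return StatusCode::INVALID_POINTER;
    return req->removeOutputBuffer(name);
}
```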

@@ -99,6 +106,13 @@ Status InferenceRequest::removeInput(const char* name) {
}
return StatusCode::NONEXISTENT_TENSOR_FOR_REMOVAL;
}
Status InferenceRequest::removeOutput(const char* name) {
Collaborator

same as above

Collaborator Author

It's checked at capi.cpp

bool allOutputsSupported = true;
for (auto [name, _] : getOutputsInfo()) {
try {
ov::Tensor tensor = request.get_tensor(name);
Collaborator

Is it possible that we capture input name here? If model has an input called name?

Collaborator Author

Output name is logged in L1442/1439
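The probing loop under review can be illustrated with a self-contained sketch (mock types; the real code calls ov::InferRequest::get_tensor, which throws if the name does not match a tensor): iterate the model's output names, and if fetching any output tensor throws, mark the outputs as unsupported so a fallback path can be taken.

```cpp
#include <cassert>
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Mock request: get_tensor throws for unknown names, like ov::InferRequest.
struct MockRequest {
    std::map<std::string, std::vector<float>> tensors;

    std::vector<float> get_tensor(const std::string& name) const {
        auto it = tensors.find(name);
        if (it == tensors.end())
            throw std::runtime_error("no tensor named: " + name);
        return it->second;
    }
};

// Probe every declared output; any failure means the fast path is unusable.
bool checkAllOutputsSupported(const MockRequest& request,
                              const std::vector<std::string>& outputNames) {
    bool allOutputsSupported = true;
    for (const auto& name : outputNames) {
        try {
            (void)request.get_tensor(name);
        } catch (const std::exception&) {
            allOutputsSupported = false;  // the offending output name gets logged here
            break;
        }
    }
    return allOutputsSupported;
}
```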

@atobiszei atobiszei merged commit 0cef8c9 into main Oct 22, 2024
8 checks passed
@atobiszei atobiszei deleted the atobisze_output_tensors_reset branch October 22, 2024 07:04