fine tune connector process function #1954
Conversation
Signed-off-by: Yaliang Wu <ylwu@amazon.com>
Codecov Report

Attention: additional details and impacted files below.

@@             Coverage Diff              @@
##               main    #1954      +/-   ##
============================================
+ Coverage     82.83%   82.93%    +0.10%
- Complexity     5447     5506       +59
============================================
  Files           522      533       +11
  Lines         21912    22085      +173
  Branches       2228     2244       +16
============================================
+ Hits          18150    18316      +166
- Misses         2851     2853        +2
- Partials        911      916        +5

Flags with carried forward coverage won't be shown. View full report in Codecov by Sentry.
LGTM. Some thoughts here:
It's likely that we will need to add more and more pre/post process functions to support different remote models (Triton, etc.). We should pay attention to common patterns and see whether some models can share these functions to avoid duplication. Ideally, for the next models we support, we would only need to configure existing pre/post functions without adding code.
Good point. We can continue to fine-tune the code as we add more models. Creating some common process functions will save effort.
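As a sketch of the reuse discussed above, a connector action can reference a built-in process function by name instead of shipping new code. The fragment below is illustrative only: the field names follow the ml-commons connector blueprint, and the `connector.pre_process.openai.embedding` / `connector.post_process.openai.embedding` identifiers are examples of built-in functions; the exact names available depend on the ml-commons version.

```json
{
  "name": "OpenAI embedding connector (example)",
  "protocol": "http",
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "https://api.openai.com/v1/embeddings",
      "pre_process_function": "connector.pre_process.openai.embedding",
      "post_process_function": "connector.post_process.openai.embedding",
      "request_body": "{ \"input\": ${parameters.input}, \"model\": \"${parameters.model}\" }"
    }
  ]
}
```

With this shape, supporting a new model that uses the same request/response format becomes a configuration change (swapping the function names and URL) rather than a code change.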
Cohere rerank connector functions look good to me.
Does this let us use pre-process functions on the generic remote inference dataset now? If so, awesome!
Yes, with this PR, users can use pre-process functions on remote inference data.
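To illustrate what that means in practice, a predict call against a remote model can pass plain parameters and let the connector's configured pre-process function build the provider-specific request body. The endpoint shape below follows the ml-commons predict API; the model ID and parameter names are placeholders.

```json
POST /_plugins/_ml/models/<model_id>/_predict
{
  "parameters": {
    "input": ["hello world", "how are you"]
  }
}
```

The pre-process function translates this generic input into the remote service's payload, and the post-process function maps the response back into the standard ml-commons output format.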
* fine tune connector process function
* add unit test for process function
* add license header

Signed-off-by: Yaliang Wu <ylwu@amazon.com>
(cherry picked from commit c5225de)
Co-authored-by: Yaliang Wu <ylwu@amazon.com>
Description
Issues Resolved
Check List
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.