
Trim leading whitespace when processing #900

Merged: 1 commit merged into jupyterlab:main on Jul 17, 2024

Conversation

krassowski (Member) commented:
Related to #896

Partially fixes an issue where the prefix is not stripped from suggestions generated by Ollama due to a spurious leading space.


I think this does not fix the general issue of the model returning `print(1)` when prompted with `pr`, which results in `pr print(1)` (if I recall correctly, there is some prefix trimming on the lab side, but it would fail here).
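For context, here is a minimal sketch of the kind of prefix handling this change enables. The `post_process_suggestion` helper name and its exact logic are illustrative assumptions, not the actual jupyter-ai implementation:

```python
def post_process_suggestion(suggestion: str, prefix: str) -> str:
    """Strip a duplicated prompt prefix from a model suggestion.

    Illustrative only: Ollama sometimes returns the completion with a
    spurious leading space, so leading whitespace must be trimmed before
    the prefix comparison, otherwise the check never matches.
    """
    # Trim the spurious leading whitespace before comparing against the prefix.
    trimmed = suggestion.lstrip()
    if trimmed.startswith(prefix):
        # Drop the duplicated prefix so only the new text is inserted.
        return trimmed[len(prefix):]
    return suggestion


# With the prompt prefix "pr" and a raw suggestion " print(1)":
# without lstrip() the prefix check fails and the editor would show "pr print(1)";
# with trimming, the duplicated "pr" is stripped and only "int(1)" is inserted.
print(post_process_suggestion(" print(1)", "pr"))  # -> "int(1)"
```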

@krassowski added the "bug" label on Jul 16, 2024
@krassowski marked this pull request as ready for review on July 16, 2024 17:44
@dlqqq force-pushed the ollama-inline-completions branch from 35d42dd to 4846249 on July 17, 2024 17:29
@dlqqq (Member) left a comment:
👍 Thanks!

@dlqqq enabled auto-merge (squash) on July 17, 2024 17:29
@dlqqq merged commit 12d069e into jupyterlab:main on Jul 17, 2024
8 checks passed
Marchlak pushed a commit to Marchlak/jupyter-ai that referenced this pull request Oct 28, 2024
Labels: bug (Something isn't working)
Projects: None yet
2 participants