This repository has been archived by the owner on Jun 24, 2024. It is now read-only.

Update to latest llama.cpp #62

Closed
Tracked by #81
philpax opened this issue Mar 23, 2023 · 2 comments
Assignees
Labels
issue:enhancement New feature or request

Comments

@philpax
Collaborator

philpax commented Mar 23, 2023

There have been quite a few changes since our last major sync: https://github.com/ggerganov/llama.cpp/compare/904d2a8d6acd667c9633138d45a361d40fbf76d0..HEAD

(There may be others we haven't accounted for in the inference code.)

Need to do a more precise breakdown, but

@philpax philpax added the issue:enhancement New feature or request label Mar 24, 2023
This was referenced Mar 26, 2023
@philpax philpax self-assigned this Mar 27, 2023
@iacore
Contributor

iacore commented Apr 6, 2023

I ran into the InvalidMagic error again, after #61. Is there another breaking change in the model format?

@philpax
Collaborator Author

philpax commented Apr 6, 2023

Yep, #93.
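For context on the exchange above: an InvalidMagic error is typically raised when the leading bytes of a model file don't match the magic value the loader expects, which is exactly what happens after an upstream format bump. A minimal sketch of such a check in Rust (the constant, error type, and function name here are illustrative assumptions, not llama-rs's actual API):

```rust
use std::io::{Cursor, Read};

// Illustrative magic value: the original ggml magic, 0x67676d6c
// ("ggml" read as a little-endian u32). Format revisions change this
// value, so older loaders reject newer files with InvalidMagic.
const GGML_MAGIC: u32 = 0x6767_6d6c;

#[derive(Debug, PartialEq)]
enum LoadError {
    // Carries the unexpected value actually found in the file.
    InvalidMagic(u32),
    Io,
}

// Read the leading u32 of a model file and verify it matches the
// expected magic before parsing anything else.
fn check_magic(reader: &mut impl Read) -> Result<(), LoadError> {
    let mut buf = [0u8; 4];
    reader.read_exact(&mut buf).map_err(|_| LoadError::Io)?;
    let magic = u32::from_le_bytes(buf);
    if magic != GGML_MAGIC {
        return Err(LoadError::InvalidMagic(magic));
    }
    Ok(())
}

fn main() {
    // A file starting with the expected magic passes the check.
    let mut ok = Cursor::new(GGML_MAGIC.to_le_bytes().to_vec());
    assert_eq!(check_magic(&mut ok), Ok(()));

    // Any other leading bytes are reported as InvalidMagic.
    let mut bad = Cursor::new(vec![0x00, 0x01, 0x02, 0x03]);
    assert!(matches!(check_magic(&mut bad), Err(LoadError::InvalidMagic(_))));
}
```

Because the magic is the very first field, this check is cheap to run up front and gives a clear signal that the file was produced by a newer (or older) converter than the loader supports.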

@iacore iacore mentioned this issue Apr 6, 2023
@philpax philpax closed this as completed Apr 10, 2023