Merge changes from randaller/llama-chat #4
Honigmelone wants to merge 1 commit into randaller:main
Conversation
…for the interactive chat to work
@Honigmelone this will break all other examples; llama-chat is now the primary repo, and this repo is deprecated.
I see. Is it somehow possible to run llama-chat in CPU-only mode, or did you drop that functionality?
I haven't a clue what I'm doing and am just quickly messing around, but regarding llama-chat/llama/model.py: I made a change there. Hopefully proper CPU support will come to the main repo some day. For now I guess I'll just base my own personal experiments on this deprecated repo, or Frankenstein together some hybrid of the two.
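For context, the exact edit referenced above was lost from this page, but the usual pattern for making a CUDA-oriented PyTorch codebase like this run CPU-only is to select the device once at startup instead of hard-coding `.cuda()` calls. A minimal sketch (`pick_device` is a hypothetical helper, not code from either repo):

```python
def pick_device(force_cpu: bool = False) -> str:
    """Return "cuda" when a GPU is available and not overridden, else "cpu".

    Sketch of the common pattern: every model/tensor placement then goes
    through model.to(pick_device()) rather than a hard-coded model.cuda().
    """
    try:
        import torch  # assumed dependency of llama-chat
        if not force_cpu and torch.cuda.is_available():
            return "cuda"
    except ImportError:
        # No torch installed in this environment; CPU is the only option.
        pass
    return "cpu"
```

With a helper like this, a CPU-only run is just `model.to(pick_device(force_cpu=True))`, and the same code still uses the GPU when one is present.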
Hey,
I noticed the default prompt in example-chat.py was quite different between your two repos. I have merged some more recent changes from https://github.com/randaller/llama-chat to get the interactive chat working in the CPU-only version. I have not merged the model and the tokenizer yet. You might want to consider building on this and merging them as well, to keep the two repositories consistent.