Steps To Reproduce:
1. services.ollama = { enable = true; acceleration = "rocm"; };
Build log: https://gist.github.com/codebam/be066269c54cd2522b1f7c9b2c3f0d7d
A...
Closed · Build failure: ollama when specifying acceleration as rocm · #297081
AxiteYT opened this issue Mar 19, 2024 · 7 comments
AxiteYT commented Mar 19, 2024: Steps to reproduce the behavior: set services.ollama.acceleration = "rocm"; OR set acceleration = "...
Bug Description: I'm using Ollama together with LlamaIndex. I followed the tutorial and the docs, and everything works fine until I try to edit generation parameters such as max_new_tokens. This is the code I'm using: from llama_index.llms.ollama import...
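The parameter-name mismatch is worth spelling out: Ollama's own option for limiting generated tokens is `num_predict`, not the Hugging Face-style `max_new_tokens`, and Ollama silently ignores option keys it doesn't recognize. A minimal sketch (the truncated import and the alias-forwarding helper here are assumptions, not the issue author's code) of translating the name before building an `/api/generate` request body:

```python
import json

def build_generate_payload(model: str, prompt: str, **params) -> dict:
    """Build a request body for Ollama's /api/generate endpoint.

    An unknown option key such as "max_new_tokens" would be dropped
    silently by the server, so it is renamed to "num_predict" first.
    """
    aliases = {"max_new_tokens": "num_predict"}
    options = {aliases.get(k, k): v for k, v in params.items()}
    return {"model": model, "prompt": prompt, "options": options, "stream": False}

payload = build_generate_payload("llama2", "Hello", max_new_tokens=64, temperature=0.2)
print(json.dumps(payload, indent=2))
```

If a wrapper library forwards extra kwargs to Ollama unchanged, a misspelled or unsupported parameter name produces no error at all, which matches the "works fine until I edit the parameters" symptom.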
GPU: Apple · CPU: Apple · Ollama version: 0.1.38, installed with Homebrew
ccreutzi added the bug label Jun 3, 2024
jmorganca self-assigned this Jun 3, 2024
jmorganca changed the title from "Weird behaviour of stop tokens" to "Stop token behavior changes when specifying list of stop tokens" Jun 3, 2024
Si...
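For context on what "specifying a list of stop tokens" means at the API level: Ollama's `/api/generate` endpoint takes stop sequences as a list of strings under `options["stop"]`. A small sketch of the request body involved (the model name and helper function are illustrative assumptions, not taken from the issue):

```python
import json

def with_stop_tokens(payload: dict, stops: list[str]) -> dict:
    """Return a copy of an Ollama request body with stop sequences set.

    Ollama accepts a list of strings under options["stop"]; generation
    halts when any one of them is produced.
    """
    updated = dict(payload)
    updated["options"] = {**payload.get("options", {}), "stop": stops}
    return updated

base = {"model": "llama2", "prompt": "Count: 1, 2, 3,", "stream": False}
req = with_stop_tokens(base, ["\n", "4"])
print(json.dumps(req))
```

The reported bug is that generation behaves differently when stop tokens are given as a list rather than a single value, so the shape of this `options["stop"]` field is exactly what the issue is about.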