Launch VSCode and connect to the corresponding WSL distro. Click the Extensions icon, type "ollama" into the search box, select "twinny - AI Code Completion and Chat", and click Install. Note: compared with Ollama Autocoder, the Twinny extension adds a chat feature and has more downloads and better ratings; most of the configuration steps here also apply to Ollama Autocoder. After installation, some settings still need to be configured: click the gear icon, choose Extension Settings, then select User...
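Before configuring the extension, it helps to confirm that the Ollama server is actually reachable from inside the WSL distro (Ollama's default endpoint is http://localhost:11434). A minimal standard-library sketch:

```python
import json
import urllib.request
import urllib.error

def check_ollama(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url, else False."""
    try:
        # /api/tags lists the models available locally.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            models = json.load(resp).get("models", [])
            print("Ollama is up; models:", [m["name"] for m in models])
            return True
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False, fix the server address (or start `ollama serve`) before blaming the extension settings.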
rjmacarthy commented on Jan 12, 2024 (edited): Hey, thanks for the awesome work you've been doing with Ollama. I was hoping that you would consider adding my extension for VSCode to the list of extensions and plugins for Ollama. It's basically a Copilot alternative with FIM and chat features...
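FIM (fill-in-the-middle) means the model is given the text before and after the cursor and asked to generate what goes between them. The sentinel tokens below are the CodeLlama spelling; other model families use different sentinels, so treat this as an illustrative sketch rather than Twinny's actual implementation:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle (infill) prompt, CodeLlama-style.

    Note: <PRE>/<SUF>/<MID> are CodeLlama's sentinels; other model
    families (StarCoder, DeepSeek-Coder, ...) spell theirs differently,
    so check your model's documentation before reusing this format.
    """
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

# The model is asked to produce the text that belongs between the
# prefix and the suffix, i.e. the body of `add` here.
print(build_fim_prompt("def add(a, b):\n    return ", "\n\nprint(add(1, 2))"))
```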
Claude Dev - VSCode extension for multi-file/whole-repo coding
Cherry Studio (Desktop client with Ollama support)
ConfiChat (Lightweight, standalone, multi-platform, and privacy focused LLM chat interface with optional encryption)
Archyve (RAG-enabling document library)
crewAI with Mesop (Mesop Web Int...
[http://localhost https://localhost http://localhost:* https://localhost:*
 http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:*
 http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*
 app://* file://* tauri://* vscode-webview://*] O...
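The bracketed list above appears to be the set of CORS origins Ollama allows by default (it is logged at server startup); `vscode-webview://*` is the entry that lets a VSCode extension's webview talk to the server. If an origin you need is missing, it can be added through the `OLLAMA_ORIGINS` environment variable. A sketch, assuming you launch the server from a shell; for a systemd-managed install, set the variable in an override via `systemctl edit ollama` instead:

```shell
# Extend the allowed origins before starting the server.
export OLLAMA_ORIGINS="vscode-webview://*,http://localhost:*"
ollama serve
```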
```json
"main": "./out/extension.js",
...
"scripts": {
  "vscode:prepublish": "npm run compile",
  "compile": "tsc -p ./",
  "watch": "tsc -watch -p ./",
  "pretest": "npm run compile && npm run lint",
  "lint": "eslint src",
  "test": "vscode-test"
},
...
```
Also, notice how the...
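With these scripts in place, the usual development loop for a tsc-based extension scaffold (assuming the standard `yo code` layout, where `tsc -p ./` emits to `./out`) looks like:

```shell
npm install          # pull in typescript, eslint, and the test runner
npm run compile      # one-shot build; output lands in ./out/extension.js
npm run watch        # incremental rebuilds while editing (pair with F5)
npm run pretest && npm test   # lint, then run integration tests via vscode-test
```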
```diff
 "main": "./dist/extension.js",
 "activationEvents": [],
 "contributes": {
   "commands": [
@@ -126,8 +126,9 @@
 },
 "scripts": {
   "vscode:prepublish": "pnpm run compile",
   "compile": "tsc -p ./",
   "watch": "tsc -watch -p ./",
   "esbuild-base": "esbuild ./src/extension.ts...
```
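The diff switches the entry point from `./out` to `./dist` and introduces an esbuild bundling step. The fragment is truncated, so this project's exact flags are unknown; a typical `esbuild-base` command for a VSCode extension, using the commonly documented options, is sketched below. `--external:vscode` matters: the `vscode` module is provided by the editor at runtime and must not be bundled.

```shell
esbuild ./src/extension.ts \
  --bundle \
  --outfile=dist/extension.js \
  --external:vscode \
  --format=cjs \
  --platform=node
```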
Thank you for your reply. When I run the model on my own machine I set keep_alive, but if the model's API is then called remotely without keep_alive set, the keep_alive I set earlier no longer takes effect. Is this behavior normal?
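Yes, this is expected: keep_alive is a per-request parameter, so every /api/generate or /api/chat call resets the model's unload timer, and a request that omits it falls back to the server default (5 minutes, unless OLLAMA_KEEP_ALIVE is set). The way to make the setting stick is to pin it on every request. A sketch, using an assumed model name:

```python
import json
import urllib.request

def build_generate_payload(prompt: str, model: str = "codellama",
                           keep_alive: str = "30m") -> dict:
    """Build an /api/generate body that pins keep_alive for this request."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        # How long the model stays loaded after this request:
        # a duration like "30m", "-1" for indefinitely, "0" to unload now.
        "keep_alive": keep_alive,
    }

def generate(prompt: str, base_url: str = "http://localhost:11434") -> dict:
    """POST the payload to a running Ollama server and return its reply."""
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(build_generate_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```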