# Ollama Chat Model node common issues

Here are some common errors and issues with the Ollama Chat Model node, and steps to resolve or troubleshoot them.
## Processing parameters

The Ollama Chat Model node is a sub-node. Sub-nodes behave differently than other nodes when processing multiple items using expressions.

Most nodes, including root nodes, take any number of items as input, process these items, and output the results. You can use expressions to refer to input items, and the node resolves the expression for each item in turn. For example, given an input of five name values, the expression {{ $json.name }} resolves to each name in turn.

In sub-nodes, the expression always resolves to the first item. For example, given an input of five name values, the expression {{ $json.name }} always resolves to the first name.
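To make the difference concrete, here is an illustrative sketch (the input items are invented for the example):

```
Input items: [ {"name": "Alice"}, {"name": "Bob"}, {"name": "Carol"} ]

Regular node:  {{ $json.name }}  resolves per item:          "Alice", "Bob", "Carol"
Sub-node:      {{ $json.name }}  resolves to the first item: "Alice", "Alice", "Alice"
```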
## Can't connect to a remote Ollama instance

The Ollama Chat Model node supports Bearer token authentication for connecting to remote Ollama instances behind authenticated proxies (such as Open WebUI).

For remote authenticated connections, configure both the remote URL and the API key in your Ollama credentials.

Follow the Ollama credentials instructions for more information.
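Before configuring n8n, you can check that the remote instance accepts your token by calling the Ollama API directly. A minimal sketch, assuming placeholder values for the proxy URL and API key:

```bash
# Both the URL and the token below are placeholders; substitute your own values.
curl -H "Authorization: Bearer YOUR_API_KEY" \
  https://ollama.example.com/api/tags
# A JSON list of installed models means the URL and token are working.
```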
## Can't connect to a local Ollama instance when using Docker

The Ollama Chat Model node connects to a locally hosted Ollama instance using the base URL defined in the Ollama credentials. When you run either n8n or Ollama in Docker, you need to configure the network so that n8n can connect to Ollama.

Ollama typically listens for connections on localhost, the local network address. In Docker, by default, each container has its own localhost, which is only accessible from within that container. If either n8n or Ollama is running in a container, they won't be able to connect over localhost.

The solution depends on how you're hosting the two components.
### If only Ollama is in Docker

If only Ollama is running in Docker, configure Ollama to listen on all interfaces by binding to 0.0.0.0 inside the container (the official images are already configured this way).

When running the container, publish the ports with the -p flag. By default, Ollama runs on port 11434, so your Docker command should look like this:
```bash
# Publish Ollama's default port 11434 to the host; the volume keeps models between runs.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```
When configuring the Ollama credentials, the localhost address should work without a problem (set the base URL to http://localhost:11434).
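You can confirm the published port is reachable from the host before wiring it into n8n; for example:

```bash
# Lists the models available on the Ollama instance; a JSON response confirms connectivity.
curl http://localhost:11434/api/tags
```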
### If only n8n is in Docker

If only n8n is running in Docker, configure Ollama to listen on all interfaces by binding to 0.0.0.0 on the host.
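Ollama reads its bind address from the OLLAMA_HOST environment variable. A minimal sketch of binding to all interfaces when starting the server manually:

```bash
# By default Ollama binds to 127.0.0.1 only; 0.0.0.0 makes it reachable from containers.
OLLAMA_HOST=0.0.0.0 ollama serve
```

If Ollama runs as a systemd service, set the same variable in the service's environment instead (for example, via systemctl edit ollama.service).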
If you are running n8n in Docker on Linux, use the --add-host flag to map host.docker.internal to host-gateway when you start the container. For example:
```bash
# Standard n8n container start, plus --add-host so host.docker.internal resolves to the host.
docker run -it --rm \
  --add-host host.docker.internal:host-gateway \
  --name n8n -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```
If you are using Docker Desktop, this is automatically configured for you.

When configuring the Ollama credentials, use host.docker.internal as the host address instead of localhost. For example, to connect on the default port 11434, set the base URL to http://host.docker.internal:11434.
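To verify the mapping from inside the container, you can query the Ollama API through it. A sketch, assuming the container is named n8n (the n8n image is Alpine-based and ships BusyBox wget rather than curl):

```bash
# A JSON model list confirms the container can reach Ollama on the host.
docker exec n8n wget -qO- http://host.docker.internal:11434/api/tags
```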
### If Ollama and n8n are running in separate Docker containers

If both n8n and Ollama are running in Docker in separate containers, you can use Docker networking to connect them.

Configure Ollama to listen on all interfaces by binding to 0.0.0.0 inside the container (the official images are already configured this way).

When configuring the Ollama credentials, use the Ollama container's name as the host address instead of localhost. For example, if you name the Ollama container my-ollama and it listens on the default port 11434, you would set the base URL to http://my-ollama:11434.
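For example, a minimal sketch using a user-defined Docker network (the network, container, and volume names are illustrative):

```bash
# Create a shared network and start both containers on it.
docker network create ollama-net

docker run -d --network ollama-net --name my-ollama \
  -v ollama:/root/.ollama ollama/ollama

docker run -it --rm --network ollama-net --name n8n \
  -p 5678:5678 -v n8n_data:/home/node/.n8n \
  docker.n8n.io/n8nio/n8n
```

On a user-defined network, Docker's embedded DNS resolves container names, so http://my-ollama:11434 works without publishing Ollama's port to the host.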
### If Ollama and n8n are running in the same Docker container

If Ollama and n8n are running in the same Docker container, the localhost address doesn't need any special configuration. You can configure Ollama to listen on localhost and set the base URL in the Ollama credentials in n8n to use localhost: http://localhost:11434.
## Error: connect ECONNREFUSED ::1:11434

This error occurs when your computer has IPv6 enabled, but Ollama is listening on an IPv4 address.

To fix this, change the base URL in your Ollama credentials to connect to 127.0.0.1, the IPv4-specific local address, instead of the localhost alias, which can resolve to either IPv4 or IPv6: http://127.0.0.1:11434.
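You can reproduce the mismatch from a shell; a quick diagnostic sketch:

```bash
# May fail with ECONNREFUSED if localhost resolves to the IPv6 address ::1:
curl http://localhost:11434/api/tags

# Targets IPv4 explicitly; should succeed when Ollama is listening on IPv4:
curl http://127.0.0.1:11434/api/tags
```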
## Ollama and HTTP/HTTPS proxies

Ollama doesn't support custom HTTP agents in its configuration. This makes it difficult to use Ollama behind custom HTTP/HTTPS proxies. Depending on your proxy configuration, it might not work at all, despite setting the HTTP_PROXY or HTTPS_PROXY environment variables.

Refer to Ollama's FAQ for more information.