Ollama Chat Model node common issues#

Here are some common errors and issues with the Ollama Chat Model node and steps to resolve or troubleshoot them.

Processing parameters#

The Ollama Chat Model node is a sub-node. Sub-nodes behave differently than other nodes when processing multiple items using expressions.

Most nodes, including root nodes, take any number of items as input, process these items, and output the results. You can use expressions to refer to input items, and the node resolves the expression for each item in turn. For example, given an input of five name values, the expression {{ $json.name }} resolves to each name in turn.

In sub-nodes, the expression always resolves to the first item. For example, given an input of five name values, the expression {{ $json.name }} always resolves to the first name.

Can't connect to a remote Ollama instance#

The Ollama Chat Model node supports Bearer token authentication for connecting to remote Ollama instances behind authenticated proxies (such as Open WebUI).

For remote authenticated connections, configure both the remote URL and API key in your Ollama credentials.

Follow the Ollama credentials instructions for more information.
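For a quick connectivity test from the command line, you can call the Ollama API through the proxy with the token in an Authorization header. This is a minimal sketch: ollama.example.com and YOUR_API_KEY are placeholders for your own values, and it assumes your proxy forwards requests to the standard Ollama API paths.

# List the models available on the remote instance (replace the placeholders)
curl -H "Authorization: Bearer YOUR_API_KEY" https://ollama.example.com/api/tags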

Can't connect to a local Ollama instance when using Docker#

The Ollama Chat Model node connects to a locally hosted Ollama instance using the base URL defined by the Ollama credentials. When you run either n8n or Ollama in Docker, you need to configure the network so that n8n can connect to Ollama.

Ollama typically listens for connections on localhost, the local network address. In Docker, by default, each container has its own localhost, which is only accessible from within the container. If either n8n or Ollama is running in a container, they won't be able to connect over localhost.

The solution depends on how you're hosting the two components.

If only Ollama is in Docker#

If only Ollama is running in Docker, configure Ollama to listen on all interfaces by binding to 0.0.0.0 inside the container (the official images are already configured this way).

When running the container, publish the ports with the -p flag. By default, Ollama runs on port 11434, so your Docker command should look like this:

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

When configuring the Ollama credentials, the localhost address should work without a problem (set the base URL to http://localhost:11434).
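Before configuring the credentials, you can verify that the published port is reachable from the host. This assumes the default port mapping from the command above:

# Should return a JSON list of the models installed in the container
curl http://localhost:11434/api/tags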

If only n8n is in Docker#

If only n8n is running in Docker, configure Ollama to listen on all interfaces by binding to 0.0.0.0 on the host.

If you're running n8n in Docker on Linux, use the --add-host flag to map host.docker.internal to host-gateway when you start the container. For example:

docker run -it --rm --add-host host.docker.internal:host-gateway --name n8n -p 5678:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n

If you're using Docker Desktop, this is automatically configured for you.

When configuring the Ollama credentials, use host.docker.internal as the host address instead of localhost. For example, to bind to the default port 11434, you could set the base URL to http://host.docker.internal:11434.
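To confirm that the container can reach Ollama on the host, you can run a request from inside it. This sketch assumes the container is named n8n as in the command above and that the image provides BusyBox wget (the official n8n image is Alpine-based):

# Query the Ollama API on the host from inside the n8n container
docker exec n8n wget -qO- http://host.docker.internal:11434/api/tags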

If Ollama and n8n are running in separate Docker containers#

If both n8n and Ollama are running in Docker in separate containers, you can use Docker networking to connect them.

Configure Ollama to listen on all interfaces by binding to 0.0.0.0 inside the container (the official images are already configured this way).
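One way to connect the two containers is a user-defined Docker network. This is a minimal sketch: the network name ollama-net is arbitrary, and my-ollama is an example container name (referenced again below):

# Create a shared network and attach both containers to it
docker network create ollama-net
docker run -d --network ollama-net --name my-ollama -v ollama:/root/.ollama ollama/ollama
docker run -it --rm --network ollama-net --name n8n -p 5678:5678 -v n8n_data:/home/node/.n8n docker.n8n.io/n8nio/n8n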

When configuring the Ollama credentials, use the Ollama container's name as the host address instead of localhost. For example, if you call the Ollama container my-ollama and it listens on the default port 11434, you would set the base URL to http://my-ollama:11434.

If Ollama and n8n are running in the same Docker container#

If Ollama and n8n are running in the same Docker container, the localhost address doesn't need any special configuration. You can configure Ollama to listen on localhost and configure the base URL in the Ollama credentials in n8n to use localhost: http://localhost:11434.

Error: connect ECONNREFUSED ::1:11434#

This error occurs when your computer has IPv6 enabled, but Ollama is listening on an IPv4 address.

To fix this, change the base URL in your Ollama credentials to connect to 127.0.0.1, the IPv4-specific local address, instead of the localhost alias, which can resolve to either IPv4 or IPv6: http://127.0.0.1:11434.
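To confirm the diagnosis, you can query both addresses from the machine running n8n. These commands assume curl is installed and that Ollama uses the default port:

# Succeeds if Ollama is listening on the IPv4 loopback address
curl http://127.0.0.1:11434/api/tags
# May fail with ECONNREFUSED if localhost resolves to the IPv6 address ::1
curl http://localhost:11434/api/tags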

Ollama and HTTP/HTTPS proxies#

Ollama doesn't support custom HTTP agents in its configuration. This makes it difficult to use Ollama behind custom HTTP/HTTPS proxies. Depending on your proxy configuration, it might not work at all, despite setting the HTTP_PROXY or HTTPS_PROXY environment variables.

Refer to Ollama's FAQ for more information.