Ollama credentials#
You can use these credentials to authenticate the following nodes:
Prerequisites#
Create and run an Ollama instance with one user. Refer to the Ollama Quick Start for more information.
Supported authentication methods#
- Instance URL
Related resources#
Refer to Ollama's API documentation for more information about the service.
View n8n's Advanced AI documentation.
Using instance URL#
To configure this credential, you'll need:
- The Base URL of your local Ollama instance or a remote, authenticated Ollama instance.
- (Optional) The API Key for Bearer token authentication, if you're connecting to a remote, authenticated proxy.
The default Base URL is http://localhost:11434, but if you've set the OLLAMA_HOST environment variable, enter that value instead. If you have issues connecting to a local n8n server, try 127.0.0.1 instead of localhost.
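If you want to sanity-check the Base URL value before saving the credential, a quick probe like the sketch below can help. It assumes Node.js 18+ (built-in fetch) and calls Ollama's /api/tags endpoint, which lists local models and needs no authentication; the OLLAMA_HOST fallback and scheme handling are illustrative assumptions, not something n8n requires.

```typescript
// A quick sanity check for the Base URL, assuming Node.js 18+ (built-in fetch).
// OLLAMA_HOST here mirrors the value you would enter in the Base URL field;
// the fallback to 127.0.0.1:11434 follows the troubleshooting note above.
const host = process.env.OLLAMA_HOST ?? "127.0.0.1:11434";
const baseUrl = host.startsWith("http") ? host : `http://${host}`;

async function checkOllama(): Promise<void> {
  // GET /api/tags lists locally available models and needs no authentication.
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  const data = await res.json();
  console.log("Reachable. Models:", data.models?.map((m: { name: string }) => m.name));
}

checkOllama().catch((err) => console.error("Could not reach Ollama:", err));
```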
If you're connecting to Ollama through an authenticated proxy service (such as Open WebUI), you must include an API key. If you don't need authentication, leave this field empty. When provided, the API key is sent as a Bearer token in the Authorization header of requests to the Ollama API.
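The sketch below illustrates that request shape, assuming Node.js 18+; the proxy URL, model name, and OLLAMA_API_KEY variable are placeholders for illustration, and your proxy may expose the Ollama endpoints under a different path.

```typescript
// Minimal sketch of how the API key is used: when set, it is sent as a
// Bearer token in the Authorization header. The /api/generate endpoint is
// part of Ollama's API; the proxy URL and model name are placeholders.
const baseUrl = "https://ollama.example.com"; // your authenticated proxy (placeholder)
const apiKey = process.env.OLLAMA_API_KEY;    // leave unset for a local, unauthenticated instance

async function generate(prompt: string): Promise<string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (apiKey) {
    // Only add the header when a key is configured, mirroring the credential's optional field.
    headers.Authorization = `Bearer ${apiKey}`;
  }
  const res = await fetch(`${baseUrl}/api/generate`, {
    method: "POST",
    headers,
    body: JSON.stringify({ model: "llama3.2", prompt, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Request failed: HTTP ${res.status}`);
  }
  const data = await res.json();
  return data.response;
}

generate("Why is the sky blue?").then(console.log).catch(console.error);
```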
Refer to How do I configure Ollama server? for more information.
Ollama and self-hosted n8n#
If you're self-hosting n8n on the same machine as Ollama, you may run into issues if they're running in different containers.
For this setup, open a specific port for n8n to communicate with Ollama by setting the OLLAMA_ORIGINS variable or adjusting OLLAMA_HOST to an address the other container can access.
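If you're unsure which address the n8n container can actually reach, a small probe like the following sketch (assuming Node.js 18+ inside the n8n container) can help narrow it down; the candidate addresses are illustrative and depend on your Docker networking setup.

```typescript
// Diagnostic sketch: probe a few candidate addresses from inside the n8n
// container to find one that reaches Ollama. host.docker.internal is a
// Docker Desktop convention; on Linux you may need the Docker bridge or
// host IP instead. The "ollama" service name assumes a shared Docker network.
const candidates = [
  "http://127.0.0.1:11434",
  "http://host.docker.internal:11434",
  "http://ollama:11434", // container/service name if both run in one Docker network
];

async function probe(): Promise<void> {
  for (const url of candidates) {
    try {
      const res = await fetch(`${url}/api/tags`, { signal: AbortSignal.timeout(2000) });
      console.log(`${url} -> HTTP ${res.status}`);
    } catch {
      console.log(`${url} -> unreachable`);
    }
  }
}

probe();
```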
Refer to Ollama's How can I allow additional web origins to access Ollama? documentation for more information.