Building a personal LLM RAG stack on Linux (ollama + deepseek + AnythingLLM)

Published: 2025-03-05

This article installs ollama + deepseek on a remote server, and AnythingLLM on the local laptop.

1. Install ollama

Installation is very simple: one command does it all. (It does not matter whether the machine has a GPU; the script automatically downloads a suitable build.)

cd into a suitable working directory.

Download the one-click install script:

curl -fsSL https://ollama.com/install.sh -o ollama_install.sh

Make it executable:

chmod +x ollama_install.sh

Run it to download and install:

sh ollama_install.sh

PS: the download from ollama.com can be painfully slow, so some people suggest replacing it with a GitHub download:

Replace the download URL inside the install script (note: change the version in the command below, v0.5.7, to the latest release):

sed -i 's|https://ollama.com/download/|https://github.com/ollama/ollama/releases/download/v0.5.7/|' ollama_install.sh

In my case, though, the ollama.com download was actually faster.
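If you do take the GitHub route, the release tag in the sed command above goes stale. A sketch of a small helper (the name `latest_tag` is my own) that looks up the newest tag via the GitHub API, assuming GNU grep with `-P` support:

```shell
# Hypothetical helper: query the GitHub API for ollama's newest release tag.
latest_tag() {
  curl -s https://api.github.com/repos/ollama/ollama/releases/latest \
    | grep -oP '"tag_name":\s*"\K[^"]+'   # extracts e.g. v0.5.7
}
# usage (rewrites the install script to pull from GitHub):
#   sed -i "s|https://ollama.com/download/|https://github.com/ollama/ollama/releases/download/$(latest_tag)/|" ollama_install.sh
```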

After the install finishes, view the available ollama commands:

ollama --help

Available Commands:

  serve       Start ollama

  create      Create a model from a Modelfile

  show        Show information for a model

  run         Run a model

  stop        Stop a running model

  pull        Pull a model from a registry

  push        Push a model to a registry

  list        List models

  ps          List running models

  cp          Copy a model

  rm          Remove a model

  help        Help about any command

Flags:

  -h, --help      help for ollama

  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.

Configure the ollama service

Stop the service:

systemctl stop ollama

Edit the ollama.service file to enable remote access.

Change to the directory containing ollama.service:

cd /etc/systemd/system

vi ollama.service 

Add these two lines:

Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"

Reload systemd and restart the service:

systemctl daemon-reload

systemctl restart ollama
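To confirm the new binding took effect, hit the API root: Ollama answers its root endpoint (default port 11434) with the plain-text banner "Ollama is running". A sketch of a check (the function name and the example IP are placeholders):

```shell
# Hypothetical helper: check whether an Ollama server answers on port 11434.
check_ollama() {
  resp=$(curl -s --max-time 5 "http://${1:-localhost}:11434")
  if [ "$resp" = "Ollama is running" ]; then
    echo OK
  else
    echo FAIL
  fi
}
# usage, from the laptop (replace with your server's address):
#   check_ollama 192.0.2.10
```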

The ollama.service file after the edit:

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=root
Group=root

Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/data/tools/zookeeper/current/bin:/data/tools/scala/current/bin:/data/tools/kafka/current/bin:/data/tools/hadoop/current/bin:/data/tools/hadoop/current/sbin:/data/tools/hive/current/bin:/data/tools/spark/current/bin:/data/tools/spark/current/sbin:/data/tools/eagle/current/bin:/data/tools/flink/current/bin:/data/tools/maven/current/bin:/opt/jdk-11.0.15/bin:/root/bin"
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"


[Install]
WantedBy=default.target

PS: do not append these two lines at the end of the file; they must go inside the [Service] section.
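To verify the two Environment lines really landed inside the [Service] section, you can print just that section of the unit file; this awk keeps only the lines between [Service] and the next section header (the helper name is my own):

```shell
# Hypothetical helper: print the OLLAMA_* lines that sit inside [Service].
service_env() {
  awk '/^\[Service\]/{s=1; next} /^\[/{s=0} s' "$1" | grep 'OLLAMA_'
}
# usage:
#   service_env /etc/systemd/system/ollama.service
```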

2. Install deepseek

Note: the ollama run command below downloads the model on first use; if the download fails with a connection timeout, just rerun it.

List the installed models:

ollama list

Start the model:

ollama run deepseek-r1:1.5b

Chat results:

The 1.5b model is reasonably fast, but its answers are honestly pretty brain-dead..

Stop it:

ollama stop deepseek-r1:1.5b
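Besides the interactive prompt, the model can also be called over Ollama's HTTP API (the /api/generate endpoint on port 11434), which is handy for scripting. A sketch (the `ask` name is mine; the crude answer extraction assumes GNU grep):

```shell
# Hypothetical helper: one-shot question via Ollama's /api/generate endpoint.
ask() {
  curl -s http://localhost:11434/api/generate \
    -d "{\"model\": \"deepseek-r1:1.5b\", \"prompt\": \"$1\", \"stream\": false}" \
    | grep -oP '"response":\s*"\K[^"]*'   # crude extraction of the answer text
}
# usage:
#   ask "Why is the sky blue?"
```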

3. Install AnythingLLM

Download the install script:

curl -fsSL https://cdn.anythingllm.com/latest/installer.sh -o installer.sh

Switch to a non-root user:

su <regular-user>

Otherwise it fails with: >> This script should not be run as root. Please run it as a regular user.

To see the existing users: ls /home

Run the installer:

sh installer.sh