Spring Boot 3: Integrating DeepSeek with Spring AI via Ollama

Published: 2025-04-03

Project repository

Version information

| Component   | Version      |
|-------------|--------------|
| Spring Boot | 3.4.4        |
| JDK         | 21           |
| spring-ai   | 1.0.0-M6     |
| Ollama      | 0.6.3        |
| LLM         | deepseek:14b |

Integration steps

  1. Add the dependency

Ollama exposes an OpenAI-compatible HTTP API, so the integration uses Spring AI's OpenAI starter pointed at the local Ollama server:
<dependency>
	<groupId>org.springframework.ai</groupId>
	<artifactId>spring-ai-openai-spring-boot-starter</artifactId>
</dependency>
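The starter above declares no version, which assumes the Spring AI BOM is imported in `dependencyManagement`. A minimal sketch (adjust the version to match your build):

```xml
<dependencyManagement>
	<dependencies>
		<dependency>
			<groupId>org.springframework.ai</groupId>
			<artifactId>spring-ai-bom</artifactId>
			<version>1.0.0-M6</version>
			<type>pom</type>
			<scope>import</scope>
		</dependency>
	</dependencies>
</dependencyManagement>
```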
  2. Configure application.yml
spring:
  application:
    name: AiDoctor
  profiles:
    active: dev
  ai:
    openai:
      api-key: sk-xxx   # Ollama ignores the key, but the starter requires a non-empty value
      base-url: http://127.0.0.1:11434
      chat:
        options:
          model: yuan-doctor:1.0
          temperature: 0.8
          top-p: 0.8
  3. Non-streaming call
public Map<String, String> aiChat(String message) {
    // ChatModel#call(String) sends a single user message and returns the full reply text
    return Map.of("generation", this.chatModel.call(message));
}
  4. Streaming call
public Flux<ChatResponse> aiChatStream(String message) {
    Prompt prompt = new Prompt(new UserMessage(message));
    log.info(prompt.toString());
    // Returns a reactive stream; each ChatResponse carries one chunk of the reply
    return this.chatModel.stream(prompt);
}
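To let a client consume the stream incrementally, the Flux can be exposed as server-sent events. A minimal sketch, assuming a hypothetical `ChatService` wrapper around the methods above (the class name and request mapping are illustrative, not from the original):

```java
@RestController
public class ChatStreamController {

    private final ChatService chatService; // hypothetical wrapper holding aiChatStream

    public ChatStreamController(ChatService chatService) {
        this.chatService = chatService;
    }

    // Each ChatResponse chunk is written to the client as one SSE event
    @GetMapping(value = "/ai/chat/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<String> stream(@RequestParam String message) {
        return chatService.aiChatStream(message)
                .map(r -> r.getResult().getOutput().getText());
    }
}
```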

public List<String> aiChatStream2(String message) {
    Prompt prompt = new Prompt(new UserMessage(message));
    log.info(prompt.toString());
    Flux<ChatResponse> responseFlux = this.chatModel.stream(prompt);
    // toStream() blocks until the flux completes, so this variant collects the
    // whole reply before returning -- the caller does not see partial output
    List<String> list = responseFlux.toStream().map(chatResponse -> {
        String text = chatResponse.getResult().getOutput().getText();
        log.info(text);
        return text;
    }).toList();
    log.info("list:{}", list);
    return list;
}
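The collection step in aiChatStream2 is ordinary stream-to-list draining. A minimal plain-Java sketch of the same pattern, with hypothetical hard-coded chunks standing in for the model's ChatResponse fragments:

```java
import java.util.List;
import java.util.stream.Stream;

public class ChunkCollectDemo {
    // Drains the whole stream into a list, like aiChatStream2 does with
    // responseFlux.toStream(): nothing is returned until the stream ends.
    static List<String> collect(Stream<String> chunks) {
        return chunks.toList();
    }

    public static void main(String[] args) {
        // Hypothetical chunks; in the real code these come from the model
        List<String> parts = collect(Stream.of("Hel", "lo", ", wor", "ld"));
        System.out.println(String.join("", parts)); // prints the reassembled reply
    }
}
```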