Spring Boot: Integrating LangChain4j for Chat Interaction

Published: 2025-06-18

Preface

Although the Spring ecosystem has introduced Spring AI, it never felt quite convenient to use. LangChain4j already wraps many large language models and supports both plain (blocking) output and streaming output.

This post focuses on how to integrate LangChain4j into a Spring Boot project.

Environment

  • JDK 17
  • Maven 3.6.3
  • Spring Boot 3.4.0
  • LangChain4j 1.0.0-beta1

Dependencies

Core dependencies and version management.

<properties>
    <maven.compiler.source>17</maven.compiler.source>
    <maven.compiler.target>17</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <langchain4j.version>1.0.0-beta1</langchain4j.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>

    <!-- LangChain4j core dependency -->
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j</artifactId>
        <version>${langchain4j.version}</version>
    </dependency>

    <!-- LangChain4j DashScope starter (version managed by the BOM below) -->
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-community-dashscope-spring-boot-starter</artifactId>
    </dependency>

    <!-- WebFlux, required for streaming output -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-webflux</artifactId>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <version>1.18.24</version> <!-- use the latest version -->
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-api</artifactId>
        <version>2.14.1</version> <!-- use the latest stable version -->
    </dependency>
    <dependency>
        <groupId>org.apache.logging.log4j</groupId>
        <artifactId>log4j-core</artifactId>
        <version>2.14.1</version> <!-- keep in sync with log4j-api -->
    </dependency>

    <!-- ArrayListMultimap -->
    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>33.3.1-jre</version>
        <scope>compile</scope>
    </dependency>

</dependencies>

<!-- version management -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-community-bom</artifactId>
            <version>${langchain4j.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

Plain output and streaming output

Configuring the model API key and model name

The langchain4j-community-dashscope-spring-boot-starter dependency ships a predefined auto-configuration bean, dev.langchain4j.community.dashscope.spring.AutoConfig.
When the configuration property langchain4j.community.dashscope.chat-model.api-key is present, it automatically builds a QwenChatModel instance.
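Conceptually, the conditional bean creation can be sketched in plain Java (this is an illustration, not the starter's actual code; the class `AutoConfigSketch`, the `buildIfConfigured` helper, and the `ChatModel` record are all hypothetical stand-ins, while the property keys mirror the real ones):

```java
import java.util.Map;

public class AutoConfigSketch {

    // Stand-in for the real QwenChatModel, for illustration only.
    record ChatModel(String apiKey, String modelName) {}

    // Build the model only when the api-key property exists,
    // mirroring the starter's conditional bean registration.
    static ChatModel buildIfConfigured(Map<String, String> props) {
        String apiKey = props.get("langchain4j.community.dashscope.chat-model.api-key");
        if (apiKey == null) {
            return null; // property absent -> no bean is registered
        }
        String modelName = props.get("langchain4j.community.dashscope.chat-model.model-name");
        return new ChatModel(apiKey, modelName);
    }

    public static void main(String[] args) {
        ChatModel model = buildIfConfigured(Map.of(
                "langchain4j.community.dashscope.chat-model.api-key", "sk-demo",
                "langchain4j.community.dashscope.chat-model.model-name", "qwen-max"));
        System.out.println(model.modelName()); // qwen-max

        // With no api-key property, no model is created.
        System.out.println(buildIfConfigured(Map.of()) == null); // true
    }
}
```

In the real starter, Spring's property binding plays the role of the `props` map here, and the bean is exposed to the application context so it can be injected with `@Autowired`, as shown in the controller below.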

Simple output

Per the bean's configuration requirements, this example uses the qwen-max model; the key is an Alibaba Bailian (DashScope) API key:

langchain4j.community.dashscope.chat-model.api-key=${ALI_AI_KEY}
langchain4j.community.dashscope.chat-model.model-name=qwen-max

Building a simple test endpoint

import dev.langchain4j.community.model.dashscope.QwenChatModel;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

/**
 * Plain chat mode
 */
@RestController
@RequestMapping("/ai")
public class ChatController {

    @Autowired
    private QwenChatModel qwenChatModel;

    @GetMapping("/chat")
    public String chat(@RequestParam(defaultValue = "你是谁?") String message) {
        return qwenChatModel.chat(message);
    }
}

Access and result

http://localhost:8080/ai/chat


Streaming output

Streaming output requires the spring-boot-starter-webflux dependency, and the model must be switched to one that supports streaming output, such as qwq-32b.

Configuration

langchain4j.community.dashscope.streaming-chat-model.api-key=${ALI_AI_KEY}
langchain4j.community.dashscope.streaming-chat-model.model-name=qwq-32b

Writing the streaming endpoint

import dev.langchain4j.community.model.dashscope.QwenStreamingChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

/**
 * Streaming output
 */
@RestController
@RequestMapping("/ai/stream")
public class ChatStreamController {

    @Autowired
    private QwenStreamingChatModel qwenStreamingChatModel;

    // equivalently: produces = MediaType.TEXT_EVENT_STREAM_VALUE (without an explicit charset)
    @RequestMapping(value = "/chat", produces = "text/event-stream;charset=UTF-8")
    public Flux<String> chat(@RequestParam(defaultValue = "你是谁?") String message) {
        return Flux.create(emitter-> {
            qwenStreamingChatModel.chat(message, new StreamingChatResponseHandler() {

                /**
                 * Called for each partial streamed response;
                 * the model emits one token at a time.
                 * @param partialResponse The partial response (usually a single token), which is a part of the complete response.
                 */
                @Override
                public void onPartialResponse(String partialResponse) {
                    System.out.println("onPartialResponse---"+partialResponse);
                    // emit each partial response downstream
                    emitter.next(partialResponse);
                }

                /**
                 * Called once the complete response has been generated.
                 *
                 * @param completeResponse The complete response generated by the model.
                 *                         For textual responses, it contains all tokens from {@link #onPartialResponse} concatenated.
                 */
                @Override
                public void onCompleteResponse(ChatResponse completeResponse) {
                    System.out.println("onCompleteResponse---"+completeResponse);
                    // complete the response stream
                    emitter.complete();
                }

                /**
                 * Called when an error occurs during streaming.
                 *
                 * @param error The error that occurred
                 */
                @Override
                public void onError(Throwable error) {
                    System.out.println("onError---"+error);
                    emitter.error(error);
                }
            });
        });
    }
}
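The StreamingChatResponseHandler contract used above can be illustrated with a plain-JDK sketch (no LangChain4j or Reactor on the classpath; the `StreamingSketch` class, its fake `chat` method, and the whitespace tokenization are purely illustrative): the tokens delivered to onPartialResponse, concatenated in order, make up the text delivered to onCompleteResponse.

```java
import java.util.ArrayList;
import java.util.List;

public class StreamingSketch {

    // Mirrors the shape of StreamingChatResponseHandler, for illustration only.
    public interface Handler {
        void onPartialResponse(String token);
        void onCompleteResponse(String full);
        void onError(Throwable t);
    }

    // Fake streaming model: emits one "token" at a time, then the complete text.
    public static void chat(String answer, Handler h) {
        try {
            StringBuilder sb = new StringBuilder();
            for (String token : answer.split("(?<=\\s)")) { // crude whitespace tokenization
                h.onPartialResponse(token);
                sb.append(token);
            }
            h.onCompleteResponse(sb.toString());
        } catch (RuntimeException e) {
            h.onError(e);
        }
    }

    public static void main(String[] args) {
        List<String> parts = new ArrayList<>();
        chat("Hello from a streaming model", new Handler() {
            public void onPartialResponse(String t) { parts.add(t); }
            public void onCompleteResponse(String full) {
                System.out.println("complete=" + full); // complete=Hello from a streaming model
            }
            public void onError(Throwable t) { t.printStackTrace(); }
        });
        System.out.println("tokens=" + parts.size()); // tokens=5
    }
}
```

In the controller above, Flux.create plays the role of the collecting list: each onPartialResponse call becomes an emitter.next, onCompleteResponse maps to emitter.complete, and onError to emitter.error, which is how the callback-style model API is bridged to a reactive stream.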

Result

http://localhost:8080/ai/stream/chat
