1. Overview

Modern web applications increasingly integrate with Large Language Models (LLMs) to build solutions that go beyond general knowledge-based question answering.

To enhance an AI model’s response and make it more context-aware, we can connect it to external sources like search engines, databases, and file systems. However, integrating and managing multiple data sources with different formats and protocols is a challenge.

The Model Context Protocol (MCP), introduced by Anthropic, addresses this integration challenge and provides a standardized way to connect AI-powered applications with external data sources. Through MCP, we can build complex agents and workflows on top of a native LLM.

In this tutorial, we’ll understand the concept of MCP by practically implementing its client-server architecture using Spring AI. We’ll create a simple chatbot and extend its capabilities through MCP servers to perform web searches, execute filesystem operations, and access custom business logic.

2. Model Context Protocol 101

Before we dive into the implementation, let’s take a closer look at MCP and its various components:

Architecture diagram of the model context protocol (MCP) demonstrating the relationship between host, clients, servers, and external sources.

MCP follows a client-server architecture that revolves around several key components:

  • MCP Host: the main application that integrates with an LLM and needs to connect to external data sources
  • MCP Clients: components that establish and maintain 1:1 connections with MCP servers
  • MCP Servers: components that integrate with external data sources and expose functionality to interact with them
  • Tools: the executable functions/methods that MCP servers expose for clients to invoke

Additionally, to handle communication between clients and servers, MCP provides two transport channels.

To enable communication through standard input and output streams with local processes and command-line tools, it provides the Standard Input/Output (stdio) transport type. Alternatively, for HTTP-based communication between clients and servers, it provides the Server-Sent Events (SSE) transport type.
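In Spring AI's client configuration, the two transport types map to separate sections under spring.ai.mcp.client. Here's a minimal sketch of the shape of each (the connection names and values are purely illustrative):

```yaml
spring:
  ai:
    mcp:
      client:
        stdio:
          connections:
            local-server:        # spawns a local process, talks over stdin/stdout
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-everything"
        sse:
          connections:
            remote-server:       # connects to a running HTTP server over SSE
              url: http://localhost:8081
```

We'll use both of these transport types in the sections that follow.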

MCP is a complex and vast topic; refer to the official documentation to learn more.

3. Creating an MCP Host

Now that we’ve got a high-level understanding of MCP, let’s start implementing the MCP architecture practically.

We’ll be building a chatbot using Anthropic’s Claude model, which will act as our MCP host. Alternatively, we can use a local LLM via Hugging Face or Ollama, as the specific AI model is irrelevant for this demonstration.

3.1. Dependencies

Let’s start by adding the necessary dependencies to our project’s pom.xml file:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-anthropic-spring-boot-starter</artifactId>
    <version>1.0.0-M6</version>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-mcp-client-spring-boot-starter</artifactId>
    <version>1.0.0-M6</version>
</dependency>

The Anthropic starter dependency is a wrapper around the Anthropic Message API, and we’ll use it to interact with the Claude model in our application.

Additionally, we import the MCP client starter dependency, which will allow us to configure clients inside our Spring Boot application that maintain 1:1 connections with the MCP servers.

Since the current version, 1.0.0-M6, is a milestone release, we’ll also need to add the Spring Milestones repository to our pom.xml:

<repositories>
    <repository>
        <id>spring-milestones</id>
        <name>Spring Milestones</name>
        <url>https://repo.spring.io/milestone</url>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>
</repositories>

This repository is where milestone versions are published, as opposed to the standard Maven Central repository.

Given that we’re using multiple Spring AI starters in our project, let’s also include the Spring AI Bill of Materials (BOM) in our pom.xml:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>1.0.0-M6</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

With this addition, we can now remove the version tag from both of our starter dependencies. The BOM eliminates the risk of version conflicts and ensures our Spring AI dependencies are compatible with each other.

Next, let’s configure our Anthropic API key and chat model in the application.yaml file:

spring:
  ai:
    anthropic:
      api-key: ${ANTHROPIC_API_KEY}
      chat:
        options:
          model: claude-3-7-sonnet-20250219

We use the ${} property placeholder to load the value of our API Key from an environment variable.

Additionally, we specify Claude 3.7 Sonnet, Anthropic's most capable model at the time of writing, using the claude-3-7-sonnet-20250219 model ID. Feel free to explore and use a different model based on your requirements.

Once we configure the above properties, Spring AI automatically creates a bean of type ChatModel, allowing us to interact with the specified model.

3.2. Configuring MCP Clients for Brave Search and Filesystem Servers

Now, let’s configure MCP clients for two pre-built MCP server implementations: Brave Search and Filesystem. These servers will enable our chatbot to perform web searches and filesystem operations.

Let’s start by registering an MCP client for the Brave search MCP server in the application.yaml file:

spring:
  ai:
    mcp:
      client:
        stdio:
          connections:
            brave-search:
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-brave-search"
              env:
                BRAVE_API_KEY: ${BRAVE_API_KEY}

Here, we configure a client with stdio transport. We specify the npx command to download and run the TypeScript-based @modelcontextprotocol/server-brave-search package and use the -y flag to confirm all the installation prompts.

Additionally, we provide the BRAVE_API_KEY as an environment variable.

Next, let’s configure an MCP client for the Filesystem MCP server:

spring:
  ai:
    mcp:
      client:
        stdio:
          connections:
            filesystem:
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-filesystem"
                - "./"

Similar to the previous configuration, we specify the command and arguments required to run the Filesystem MCP server package. This setup allows our chatbot to perform operations like creating, reading, and writing files in the specified directory.

Here, we only configure the current directory (./) for filesystem operations; however, we can allow access to multiple directories by adding them to the args list.
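For instance, to additionally allow access to a second directory, we can append its path after the first (the extra path here is purely illustrative):

```yaml
args:
  - "-y"
  - "@modelcontextprotocol/server-filesystem"
  - "./"
  - "/tmp/mcp-workspace"
```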

During application startup, Spring AI will scan our configurations, create the MCP clients, and establish connections with their corresponding MCP servers. It also creates a bean of type SyncMcpToolCallbackProvider, which provides a list of all the tools exposed by the configured MCP servers.

3.3. Building a Basic Chatbot

With our AI model and MCP clients configured, let’s build a simple chatbot:

@Bean
ChatClient chatClient(ChatModel chatModel, SyncMcpToolCallbackProvider toolCallbackProvider) {
    return ChatClient
      .builder(chatModel)
      .defaultTools(toolCallbackProvider.getToolCallbacks())
      .build();
}

We start by creating a bean of type ChatClient using the ChatModel and SyncMcpToolCallbackProvider beans. The ChatClient class will act as our main entry point for interacting with our chat completion model, i.e., Claude 3.7 Sonnet.

Next, let’s inject the ChatClient bean to create a new ChatbotService class:

@Service
class ChatbotService {
    private final ChatClient chatClient;

    ChatbotService(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    String chat(String question) {
        return chatClient.prompt()
          .user(question)
          .call()
          .content();
    }
}

We create a chat() method where we pass the user’s question to the chat client bean and simply return the AI model’s response.

Now that we’ve implemented our service layer, let’s expose a REST API on top of it:

@RestController
class ChatbotController {
    private final ChatbotService chatbotService;

    ChatbotController(ChatbotService chatbotService) {
        this.chatbotService = chatbotService;
    }

    @PostMapping("/chat")
    ResponseEntity<ChatResponse> chat(@RequestBody ChatRequest chatRequest) {
        String answer = chatbotService.chat(chatRequest.question());
        return ResponseEntity.ok(new ChatResponse(answer));
    }
}

record ChatRequest(String question) {}

record ChatResponse(String answer) {}

We’ll use the above API endpoint to interact with our chatbot later in the tutorial.

4. Creating a Custom MCP Server

In addition to using pre-built MCP servers, we can create our own MCP servers to extend the capabilities of our chatbot with our business logic.

Let’s explore how to create a custom MCP server using Spring AI.

We’ll be creating a new Spring Boot application in this section.

4.1. Dependencies

First, let’s include the necessary dependency in our pom.xml file:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-mcp-server-webmvc-spring-boot-starter</artifactId>
    <version>1.0.0-M6</version>
</dependency>

We import Spring AI’s MCP server dependency, which provides the necessary classes for creating a custom MCP server that supports the HTTP-based SSE transport.
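Since our chatbot application already occupies the default port 8080, the server application needs its own port. A minimal server-side application.yaml sketch (the server name is illustrative, and the property names assume the MCP server starter's defaults):

```yaml
spring:
  ai:
    mcp:
      server:
        name: author-tools-server
        version: 1.0.0
server:
  port: 8081
```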

4.2. Defining and Exposing Custom Tools

Next, let’s define a few custom tools that our MCP server will expose.

We’ll create an AuthorRepository class that provides methods to fetch author details:

class AuthorRepository {
    @Tool(description = "Get Baeldung author details using an article title")
    Author getAuthorByArticleTitle(String articleTitle) {
        return new Author("John Doe", "john.doe@example.com");
    }

    @Tool(description = "Get highest rated Baeldung authors")
    List<Author> getTopAuthors() {
        return List.of(
          new Author("John Doe", "john.doe@example.com"),
          new Author("Jane Doe", "jane.doe@example.com")
        );
    }

    record Author(String name, String email) {
    }
}

For our demonstration, we’re returning hardcoded author details, but in a real application, the tools would typically interact with a database or an external API.

We annotate both methods with the @Tool annotation and provide a brief description for each. The description helps the AI model decide if and when to call a tool based on the user input, and to incorporate the result in its response.

Next, let’s register our author tools with the MCP server:

@Bean
ToolCallbackProvider authorTools() {
    return MethodToolCallbackProvider
      .builder()
      .toolObjects(new AuthorRepository())
      .build();
}

We use the MethodToolCallbackProvider to create a ToolCallbackProvider bean from the tools defined in our AuthorRepository class. The methods annotated with @Tool will get exposed as MCP tools when the application starts up.

4.3. Configuring an MCP Client for Our Custom MCP Server

Finally, to use our custom MCP server in our chatbot application, we need to configure an MCP client against it:

spring:
  ai:
    mcp:
      client:
        sse:
          connections:
            author-tools-server:
              url: http://localhost:8081

In the application.yaml file, we configure a new client against our custom MCP server. Note that we're using the sse transport type here.

This configuration assumes that the MCP server is running at http://localhost:8081; make sure to update the url if it's running on a different host or port.

With this configuration, our MCP client can now invoke the tools exposed by our custom server, in addition to the tools provided by the Brave Search and Filesystem MCP servers.
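Putting the client-side configuration from the earlier sections together, the chatbot application's application.yaml now declares the stdio and SSE connections side by side:

```yaml
spring:
  ai:
    mcp:
      client:
        stdio:
          connections:
            brave-search:
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-brave-search"
              env:
                BRAVE_API_KEY: ${BRAVE_API_KEY}
            filesystem:
              command: npx
              args:
                - "-y"
                - "@modelcontextprotocol/server-filesystem"
                - "./"
        sse:
          connections:
            author-tools-server:
              url: http://localhost:8081
```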

5. Interacting With Our Chatbot

Now that we’ve built our chatbot and integrated it with various MCP servers, let’s interact with it and test it out.

We’ll use the HTTPie CLI to invoke the chatbot’s API endpoint:

http POST :8080/chat question="How much was Elon Musk's initial offer to buy OpenAI in 2025?"

Here, we send a simple question to the chatbot about an event that occurred after the LLM’s knowledge cut-off date. Let’s see what we get as a response:

{
    "answer": "Elon Musk's initial offer to buy OpenAI was $97.4 billion. [Source](https://www.reuters.com/technology/openai-board-rejects-musks-974-billion-offer-2025-02-14/)."
}

As we can see, the chatbot was able to perform a web search using the configured Brave Search MCP server and provide an accurate answer along with a source.

Next, let’s verify that the chatbot can perform filesystem operations using the Filesystem MCP server:

http POST :8080/chat question="Create a text file named 'mcp-demo.txt' with content 'This is awesome!'."

We instruct the chatbot to create an mcp-demo.txt file with certain content. Let’s see if it’s able to fulfill the request:

{
    "answer": "The text file named 'mcp-demo.txt' has been successfully created with the content you specified."
}

The chatbot confirms the operation, and we can check that the file was indeed created in the directory we specified in the application.yaml file.

Finally, let’s verify if the chatbot can call one of the tools exposed by our custom MCP server. We’ll enquire about the author details by mentioning an article title:

http POST :8080/chat question="Who wrote the article 'Testing CORS in Spring Boot' on Baeldung, and how can I contact them?"

Let’s invoke the API and see if the chatbot response contains the hardcoded author details:

{
    "answer": "The article 'Testing CORS in Spring Boot' on Baeldung was written by John Doe. You can contact him via email at [john.doe@example.com](mailto:john.doe@example.com)."
}

The above response verifies that the chatbot fetches the author details using the getAuthorByArticleTitle() tool that our custom MCP server exposes.

We highly recommend setting up the codebase locally and playing around with the chatbot using different prompts.

6. Conclusion

In this article, we’ve explored the Model Context Protocol and implemented its client-server architecture using Spring AI.

First, we built a simple chatbot using Anthropic’s Claude 3.7 Sonnet model to act as our MCP host.

Then, to provide our chatbot with web search capability and enable it to execute filesystem operations, we configured MCP clients against pre-built MCP server implementations of Brave Search API and Filesystem.

Finally, we created a custom MCP server and configured its corresponding MCP client inside our MCP host application.

The code backing this article is available on GitHub. Once you're logged in as a Baeldung Pro Member, start learning and coding on the project.