1. Introduction

Protocol Buffers (protobuf) offer a fast and efficient way to serialize structured data. They’re a compact, high-performance alternative to JSON.

Unlike JSON, which is text-based, protobuf uses a compact binary format and generates optimized serialization code for multiple languages. This makes it easier to exchange structured data between different systems.

With protobuf, we define the data structure once in a .proto file. Then, we use the generated code to handle data transmission across streams and platforms. It's ideal for typed, structured data, especially when individual payloads stay within a few megabytes.

Protobuf supports common scalar types like strings, integers, booleans, and floats. It also works well with lists and maps, making complex data easy to manage. In this tutorial, we'll learn how to use maps in protobuf.

2. Understanding Maps in Protobuf

Let’s explore how we can define and use maps as part of a protobuf message.

2.1. What Are Maps?

A map is a key-value data structure, similar to a dictionary.

Each key links to a specific value, which makes lookups fast and efficient. We can think of a DNS system as an analogy: each domain name points to an IP address. Maps work in a similar way.

2.2. Syntax for Defining Maps

Protobuf 3 supports maps out of the box.

Here’s a simple example:

message Dictionary {
    map<string, string> pairs = 1;
}

map<key_type, value_type> defines the field. The key must be an integral or string type, such as string, int32, or bool; floating-point types, bytes, and enums aren't allowed as keys. The value can be any protobuf type except another map: a scalar, an enum, or even another message.
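
On the Java side, the compiler turns a map field into a set of dedicated accessors rather than exposing a mutable Map. As a rough sketch, assuming the Dictionary message above is compiled with the Java plugin, working with the pairs field looks like this (the values are purely illustrative):

// Build a message with two map entries, DNS-style
Dictionary dictionary = Dictionary.newBuilder()
  .putPairs("baeldung.com", "172.66.40.248")
  .putPairs("example.org", "93.184.216.34")
  .build();

// Read access goes through the generated map view and lookup helpers
String ip = dictionary.getPairsOrDefault("baeldung.com", "unknown"); // lookup with a fallback
boolean known = dictionary.containsPairs("example.org");             // key presence check
int entries = dictionary.getPairsCount();                            // number of entries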

3. Implementing Maps in Protobuf

Now that we’ve explored the benefits of using protobuf, let’s put theory into practice by building a food delivery system where each restaurant has its own menu.

3.1. Setting up Protobuf in Our Codebase

Before defining the message structure, we need to integrate the protoc compiler into the build lifecycle. We can do this by configuring the protobuf-maven-plugin in the project's pom.xml file. This way, the protocol buffer definitions are compiled into Java classes automatically during the Maven build.

Let’s add the plugin configuration to the build section of the pom.xml file:

<build>
    <plugins>
        <plugin>
            <groupId>org.xolstice.maven.plugins</groupId>
            <artifactId>protobuf-maven-plugin</artifactId>
            <version>0.6.1</version>
            <configuration>
                <protoSourceRoot>${project.basedir}/src/main/proto</protoSourceRoot>
                <protocArtifact>com.google.protobuf:protoc:4.30.2:exe:${os.detected.classifier}</protocArtifact>
            </configuration>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>test-compile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

In addition to the compiler, the protocol buffers runtime is also required. Let’s add its dependency to the Maven POM file:

<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>4.30.2</version>
</dependency>

We may use another version of the runtime, provided that it’s the same as the compiler’s version.
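
One more note on the setup: the ${os.detected.classifier} property used in the protocArtifact coordinates isn't defined by Maven itself; it's typically supplied by the os-maven-plugin build extension. If the build can't resolve the property, registering that extension is one way to fix it (the version below is only an example):

<build>
    <extensions>
        <extension>
            <groupId>kr.motd.maven</groupId>
            <artifactId>os-maven-plugin</artifactId>
            <version>1.7.1</version>
        </extension>
    </extensions>
    <!-- plugins section from above goes here -->
</build>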

3.2. Defining a Message With Map Field

Let’s start by defining a simple protobuf message that includes a map. Here, we create a protobuf schema where the restaurants map stores a restaurant name as a key and its menu as a value. The menu itself is another map, mapping food items to their price:

syntax = "proto3";

message Menu {
    map<string, float> items = 1;
}

message FoodDelivery {
    map<string, Menu> restaurants = 1;
}
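
A quick note on naming: the Java snippets below refer to the generated types as Food.Menu and Food.FoodDelivery. By default, protoc wraps the generated messages in an outer class derived from the .proto file name, so this assumes the schema lives in a file such as food.proto. Alternatively, we can pin the outer class name explicitly in the schema:

// Optional: fix the generated outer class name instead of relying on the file name
option java_outer_classname = "Food";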

3.3. Populating the Map

Now that we’ve defined a map in our schema, we need to populate it with data in our code.

The map<k, v> structure in protobuf behaves like a Java HashMap, allowing us to store key-value pairs efficiently.

In our case, we use a map<string, Menu> to store restaurant names as keys and their corresponding menu items as values:

Food.Menu pizzaMenu = Food.Menu.newBuilder()
  .putItems("Margherita", 12.99f)
  .putItems("Pepperoni", 14.99f)
  .build();

Food.Menu sushiMenu = Food.Menu.newBuilder()
  .putItems("Salmon Roll", 10.50f)
  .putItems("Tuna Roll", 12.33f)
  .build();

Initially, we define a menu for each restaurant. Then, we create a builder for the FoodDelivery message that will hold our restaurants map:

Food.FoodDelivery.Builder foodData = Food.FoodDelivery.newBuilder();

With the builder ready, we simply put each restaurant into the map and build the message:

foodData.putRestaurants("Pizza Place", pizzaMenu);
foodData.putRestaurants("Sushi Place", sushiMenu);

return foodData.build();

4. Storing and Retrieving Data From a Binary File

Next, we'll write our protobuf map data to a binary file, a process known as serialization. This ensures efficient storage and easy transmission. And, of course, we'll also read it back by deserializing the file.

4.1. Serializing the Protobuf Map to a Binary File

Serialization converts our structured data into a compact binary format, making it lightweight and fast to store or send over a network. Let’s see how we can implement this.

We start by defining the path of the file we'll write the data to:

private final String FILE_PATH = "src/main/resources/foodfile.bin";

Then, we’ll write the logic to serialize the file:

public void serializeToFile(Food.FoodDelivery delivery) {
    try (FileOutputStream fos = new FileOutputStream(FILE_PATH)) {
        delivery.writeTo(fos);
        logger.info("Successfully wrote to the file.");
    } catch (IOException ioe) {
        logger.warning("Error serializing the Map or writing the file");
    }
}

The generated classes let us write the message directly to an output stream using writeTo().
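
Writing to a stream isn't the only option: the generated classes can also serialize to and from an in-memory byte array, which comes in handy for tests or messaging APIs. A minimal sketch:

// In-memory round trip; parseFrom() throws InvalidProtocolBufferException on malformed input
byte[] bytes = delivery.toByteArray();
Food.FoodDelivery copy = Food.FoodDelivery.parseFrom(bytes);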

4.2. Deserializing the Binary File to a Protobuf Map

Now, let’s deserialize the binary file back into a Protobuf map. We’ll start by opening an input stream and use the methods generated by the Protobuf compiler to parse the stored data:

public Food.FoodDelivery deserializeFromFile(Food.FoodDelivery delivery) {
    try (FileInputStream fis = new FileInputStream(FILE_PATH)) {
        return Food.FoodDelivery.parseFrom(fis);
    } catch (FileNotFoundException e) {
        logger.severe(String.format("File not found: %s location", FILE_PATH));
        return Food.FoodDelivery.newBuilder().build();
    } catch (IOException e) {
        logger.warning(String.format("Error reading file: %s location", FILE_PATH));
        return Food.FoodDelivery.newBuilder().build();
    }
}

We open a file input stream and pass it to the parseFrom() method, which reconstructs the protobuf object. If the file is missing or can't be read, we log the issue and return an empty FoodDelivery message.

4.3. Displaying the Results After Deserialization

Now that we've deserialized the data, let's display the results:

public void displayRestaurants(Food.FoodDelivery delivery) {
    Map<String, Food.Menu> restaurants = delivery.getRestaurantsMap();
    for (Map.Entry<String, Food.Menu> restaurant : restaurants.entrySet()) {
        logger.info(String.format("Restaurant: %s", restaurant.getKey()));
        restaurant.getValue()
          .getItemsMap()
          .forEach((menuItem, price) -> logger.info(String.format(" - %s costs $ %.2f", menuItem, price)));
    }
}

Here, we display the stored data. Since our map holds all the restaurant names as keys and their respective menus as values, we can simply iterate through the data and log each restaurant along with its menu items and prices.
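
Iterating isn't the only way to read the map back. The compiler also generates per-key lookup helpers, so we can fetch a single restaurant directly; here's a small sketch using the same generated classes:

// Direct lookups instead of iterating the whole map
if (delivery.containsRestaurants("Pizza Place")) {
    Food.Menu menu = delivery.getRestaurantsOrThrow("Pizza Place");
    float price = menu.getItemsOrDefault("Margherita", 0.0f);
    logger.info(String.format("Margherita costs $ %.2f", price));
}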

5. Testing Our Implementation

To ensure our Protobuf map operations work as expected, we verify that serialization writes the data to the file and that deserialization restores the original data.

We also need to capture the log output and check that the expected messages are logged. Let's begin by verifying serialization:

@Test
void givenProtobufObject_whenSerializeToFile_thenFileShouldExist() {
    foodDelivery.serializeToFile(testData);
    File file = new File(FILE_PATH);
    assertTrue(file.exists(), "Serialized file should exist");
}
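
Both tests rely on a foodDelivery service instance, the FILE_PATH constant, and a pre-built testData message that aren't shown here; a minimal fixture, with the setup details assumed, could look like this:

private static final String FILE_PATH = "src/main/resources/foodfile.bin";

private FoodDelivery foodDelivery;
private Food.FoodDelivery testData;

@BeforeEach
void setUp() {
    foodDelivery = new FoodDelivery();
    // Reuse the builder logic from section 3.3 to create sample data
    Food.Menu pizzaMenu = Food.Menu.newBuilder()
      .putItems("Margherita", 12.99f)
      .putItems("Pepperoni", 14.99f)
      .build();
    testData = Food.FoodDelivery.newBuilder()
      .putRestaurants("Pizza Place", pizzaMenu)
      .build();
}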

Once we've verified serialization, let's test that deserialization restores the data correctly:

@Test
void givenSerializedFile_whenDeserialize_thenShouldMatchOriginalData() {
    foodDelivery.serializeToFile(testData);
    Food.FoodDelivery deserializedData = foodDelivery.deserializeFromFile(testData);
    assertEquals(testData.getRestaurantsMap(), deserializedData.getRestaurantsMap(), "Deserialized data should match the original data");
}

Here, we first serialize the data and then check that the testData map equals the deserializedData map. Finally, let's verify that the data is logged correctly:

@Test
void givenDeserializedObject_whenDisplayRestaurants_thenShouldLogCorrectOutput() {
    foodDelivery.serializeToFile(testData);
    Food.FoodDelivery deserializedData = foodDelivery.deserializeFromFile(testData);
    Logger logger = Logger.getLogger(FoodDelivery.class.getName());
    TestLogHandler testHandler = new TestLogHandler();
    logger.addHandler(testHandler);
    logger.setUseParentHandlers(false);
    foodDelivery.displayRestaurants(deserializedData);
    List<String> logs = testHandler.getLogs();
    assertTrue(logs.stream().anyMatch(log -> log.contains("Restaurant: Pizza Place")),
      "Log should contain 'Restaurant: Pizza Place'");
    assertTrue(logs.stream().anyMatch(log -> log.contains("Margherita costs $ 12.99")),
      "Log should contain 'Margherita costs $ 12.99'");
}

To verify whether our application logs expected messages during execution, we need a way to capture and inspect log output programmatically. The TestLogHandler helps us do exactly that by extending Java’s Handler:

static class TestLogHandler extends Handler {
    private final List<String> logMessages = new ArrayList<>();

    @Override
    public void publish(LogRecord record) {
        if (record.getLevel().intValue() >= Level.INFO.intValue()) {
            logMessages.add(record.getMessage());
        }
    }

    @Override
    public void flush() {
    }

    @Override
    public void close() throws SecurityException {
    }

    public List<String> getLogs() {
        return logMessages;
    }
}

It keeps a list of log messages, to which we add every LogRecord whose level is INFO or higher. Storing the messages in a list preserves the order in which the logs would appear in the console.

6. Conclusion

Using maps in Protobuf provides a structured and efficient way to manage key-value relationships in our data models. In this article, we explored how to define, serialize, and deserialize Protobuf maps in Java, ensuring that our data remains compact, readable, and easily transferable. By implementing robust unit tests, we verified that our serialization and deserialization processes function correctly, maintaining data integrity.

With the right Maven setup and best practices in place, we can now confidently integrate Protobuf maps into our applications, leveraging their performance benefits while keeping our codebase clean and maintainable.

The code associated with this article is available over on GitHub.
