
1. Overview

NoSQL databases have become a popular choice for building an application’s persistence layer.

Amazon DynamoDB is one such serverless and fully managed NoSQL database provided by Amazon Web Services (AWS). For nearly a decade, DynamoDB has established itself as one of the most popular and widely used NoSQL databases in the cloud due to its scalability, flexibility, and performance.

The core components we interact with when working with DynamoDB are tables, items, and attributes. A table is a collection of items, and each item is a collection of attributes.

In this tutorial, we’ll explore integrating Amazon DynamoDB into a Spring Boot application.

2. Setting up the Project

Before we can start interacting with the Amazon DynamoDB service, we’ll need to include the necessary dependency and configure our application correctly.

2.1. Dependencies

We’ll be using Spring Cloud AWS to establish a connection and interact with the DynamoDB service, rather than using the DynamoDB SDK provided by AWS directly. Spring Cloud AWS is a wrapper around the official AWS SDKs, which significantly simplifies configuration and provides simple methods to interact with AWS services.

Let’s start by adding the DynamoDB starter dependency from Spring Cloud AWS to our project’s pom.xml file:

<dependency>
    <groupId>io.awspring.cloud</groupId>
    <artifactId>spring-cloud-aws-starter-dynamodb</artifactId>
    <version>3.3.0</version>
</dependency>

Next, let’s also include the Spring Cloud AWS BOM (Bill of Materials) to manage the version of the DynamoDB starter in our pom.xml:

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>io.awspring.cloud</groupId>
            <artifactId>spring-cloud-aws</artifactId>
            <version>3.3.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

With this addition, we can now remove the version tag from our starter dependency.

The BOM ensures version compatibility between the declared dependencies, avoids conflicts, and makes it easier to update dependency versions in the future.
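With the BOM in place, the starter declaration from earlier simplifies to:

```xml
<dependency>
    <groupId>io.awspring.cloud</groupId>
    <artifactId>spring-cloud-aws-starter-dynamodb</artifactId>
</dependency>
```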

2.2. Defining AWS Configuration Properties

Now, to interact with the DynamoDB service, we need to configure our AWS credentials for authentication and the AWS region where we’ve provisioned our table.

We’ll configure these properties in our application.yaml file:

spring:
  cloud:
    aws:
      dynamodb:
        region: ${AWS_REGION}
      credentials:
        access-key: ${AWS_ACCESS_KEY}
        secret-key: ${AWS_SECRET_KEY}

We use the ${} property placeholder to load the values of our properties from environment variables.
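Before starting the application, we need to make sure these environment variables are set. For example, on a Unix-like shell (the values below are placeholders for illustration only):

```shell
# Placeholder values for illustration; substitute your own region and credentials
export AWS_REGION="us-east-1"
export AWS_ACCESS_KEY="test-access-key"
export AWS_SECRET_KEY="test-secret-key"
```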

2.3. Domain Entity

Now, let’s define a simple User entity class that represents the data model for our DynamoDB table:

@DynamoDbBean
public class User {

    private UUID id;
    private String name;
    private String email;

    @DynamoDbPartitionKey
    public UUID getId() {
        return id;
    }

    // standard setters and getters
}

Here, we annotate our User class with the @DynamoDbBean annotation, marking it as an entity that can be mapped to a DynamoDB table. It’s important to note that Spring Cloud AWS cannot map the table attributes when the class is package-private, so the User entity and its corresponding getter and setter methods must be declared public.

Additionally, we configure the id field as the partition key, i.e., the primary key for our table, by annotating the getId() method with the @DynamoDbPartitionKey annotation. We should remember to place this annotation on the getter method, not on the field itself.

DynamoDB also supports composite primary keys through the use of optional sort keys. We can define a composite primary key by annotating the corresponding getter methods of the additional fields with the @DynamoDbSortKey annotation. We won’t be using sort keys in this tutorial, but it’s good to be aware of their existence.
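As an illustration, a hypothetical OrderItem entity with a composite primary key might look like the following sketch (the entity and its fields are examples of our own, not something we use in this tutorial):

```java
@DynamoDbBean
public class OrderItem {

    private String orderId;
    private String productId;

    @DynamoDbPartitionKey
    public String getOrderId() {
        return orderId;
    }

    @DynamoDbSortKey
    public String getProductId() {
        return productId;
    }

    // standard setters and remaining getters
}
```

Here, items sharing the same orderId partition key are distinguished and sorted by their productId sort key.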

3. Defining a Custom DynamoDbTableNameResolver Bean

By default, Spring Cloud AWS converts the entity class name into its snake case representation to determine the corresponding DynamoDB table name. However, this convention might not always align with our naming conventions or requirements.
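Conceptually, the default mapping behaves like a simple camel-case-to-snake-case conversion. The following is our own sketch of that assumed convention, not Spring Cloud AWS's actual implementation:

```java
public class TableNameConventionDemo {

    // Rough sketch of the assumed convention: walk the simple class name,
    // lower-casing each character and inserting an underscore before each
    // upper-case boundary
    static String toSnakeCase(String simpleClassName) {
        StringBuilder result = new StringBuilder();
        for (char c : simpleClassName.toCharArray()) {
            if (Character.isUpperCase(c) && result.length() > 0) {
                result.append('_');
            }
            result.append(Character.toLowerCase(c));
        }
        return result.toString();
    }

    public static void main(String[] args) {
        System.out.println(toSnakeCase("User"));          // user
        System.out.println(toSnakeCase("PurchaseOrder")); // purchase_order
    }
}
```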

We can define a custom bean that implements the DynamoDbTableNameResolver interface to override this default behavior.

Let’s start by creating a custom annotation to specify the table name directly on our entity class:

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@interface TableName {
    String name();
}

We create a simple @TableName annotation, which takes a single name attribute where we can specify the desired DynamoDB table name.

Next, let’s create our custom DynamoDbTableNameResolver implementation:

@Component
class CustomTableNameResolver implements DynamoDbTableNameResolver {

    @Override
    public <T> String resolve(Class<T> clazz) {
        TableName tableName = clazz.getAnnotation(TableName.class);
        if (tableName == null) {
            throw new IllegalStateException("Missing @TableName annotation on class: " + clazz.getSimpleName());
        }
        return tableName.name();
    }
}

Here, we override the resolve() method in our CustomTableNameResolver class. We retrieve the name attribute of the @TableName annotation applied to the entity class and use it as the table name. We also fail fast with a descriptive exception if an entity is missing the annotation, rather than letting a NullPointerException surface later.

Finally, let’s annotate our User class with the new @TableName annotation:

@DynamoDbBean
@TableName(name = "users")
public class User {
    // ...
}

With this configuration, Spring Cloud AWS will use our CustomTableNameResolver class to determine that the User entity maps to the users table in DynamoDB.

4. Setting up Local Test Environment With LocalStack

During development, it’s often convenient to test our application locally. LocalStack is a popular tool that allows us to run an emulated AWS environment locally on our machine. We’ll use Testcontainers to set up the LocalStack service in our application.

The prerequisite for running the LocalStack service via Testcontainers is an active Docker instance. We need to ensure this prerequisite is met, whether we run the test suite locally or in a CI/CD pipeline.

4.1. Test Dependencies

First, let’s add the necessary test dependencies to our pom.xml:

<dependency>
    <groupId>io.awspring.cloud</groupId>
    <artifactId>spring-cloud-aws-testcontainers</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>localstack</artifactId>
    <scope>test</scope>
</dependency>

We import the Spring Cloud AWS Testcontainers dependency and the LocalStack module of Testcontainers. These dependencies provide us with the necessary classes to spin up ephemeral Docker containers for the LocalStack service.

4.2. Provisioning DynamoDB Table Using Init Hooks

Next, we’ll need to provision a DynamoDB table that our application can interact with. LocalStack provides the ability to create the required AWS resources when the container starts via Initialization Hooks.

Let’s create an init-dynamodb-table.sh bash script for this purpose inside our src/test/resources folder:

#!/bin/bash
table_name="users"
partition_key="id"

awslocal dynamodb create-table \
  --table-name "$table_name" \
  --key-schema AttributeName="$partition_key",KeyType=HASH \
  --attribute-definitions AttributeName="$partition_key",AttributeType=S \
  --billing-mode PAY_PER_REQUEST

echo "DynamoDB table '$table_name' created successfully with partition key '$partition_key'"
echo "Executed init-dynamodb-table.sh"

The above script creates a DynamoDB table with the name users. We use the awslocal command inside the shell script, which is a wrapper around the AWS CLI that points to the LocalStack service. We end the script by writing a few echo statements to confirm the script’s successful execution.

We’ll copy this script to the /etc/localstack/init/ready.d path inside the LocalStack container for execution in the upcoming section.

4.3. Defining LocalStackContainer Bean

Next, let’s create a @TestConfiguration class that defines our Testcontainers bean:

@TestConfiguration(proxyBeanMethods = false)
class TestcontainersConfiguration {

    @Bean
    @ServiceConnection
    LocalStackContainer localStackContainer() {
        return new LocalStackContainer(DockerImageName.parse("localstack/localstack:4.3.0"))
          .withServices(LocalStackContainer.Service.DYNAMODB)
          .withCopyFileToContainer(
            MountableFile.forClasspathResource("init-dynamodb-table.sh", 0744),
            "/etc/localstack/init/ready.d/init-dynamodb-table.sh"
          )
          .waitingFor(Wait.forLogMessage(".*Executed init-dynamodb-table.sh.*", 1));
    }
}

We pin a specific version (4.3.0) of the LocalStack Docker image when creating our LocalStackContainer bean, rather than relying on the latest tag, which keeps our test environment reproducible.

Then, we enable the DynamoDB service and copy our bash script into the container so the table gets created on startup. Additionally, we configure a wait strategy that blocks until the Executed init-dynamodb-table.sh statement, defined in our init script, is printed.

We also annotate our bean method with the @ServiceConnection annotation, which dynamically registers all the properties required to set up a connection with the started LocalStack container. Now, we can use this configuration by annotating our test classes with the @Import(TestcontainersConfiguration.class) annotation.
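For instance, an integration test class wired up this way might look like the following skeleton (the class name is our own):

```java
@SpringBootTest
@Import(TestcontainersConfiguration.class)
class UserTableIntegrationTest {

    @Autowired
    private DynamoDbTemplate dynamoDbTemplate;

    // test methods exercising the users table go here
}
```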

5. Interacting With Our DynamoDB Table

Now that we have our local environment set up, let’s use the DynamoDbTemplate bean that Spring Cloud AWS automatically creates for us to interact with the users table we’ve provisioned.

5.1. Performing Basic CRUD Operations

Let’s start by creating a new User item in the provisioned DynamoDB table:

User user = Instancio.create(User.class);

dynamoDbTemplate.save(user);

Key partitionKey = Key.builder().partitionValue(user.getId().toString()).build();
User retrievedUser = dynamoDbTemplate.load(partitionKey, User.class);
assertThat(retrievedUser)
  .isNotNull()
  .usingRecursiveComparison()
  .isEqualTo(user);

We use Instancio to create a new User object with random test data. Then, we persist it to our provisioned table using the save() method of the DynamoDbTemplate bean.

To verify this operation, we create a Key object using the user’s partition key value and pass it to the load() method. Then, we assert that the retrievedUser matches the original user we persisted.

Next, let’s update an existing User item:

String updatedName = RandomString.make();
String updatedEmail = RandomString.make();
user.setName(updatedName);
user.setEmail(updatedEmail);
dynamoDbTemplate.update(user);

Key partitionKey = Key.builder().partitionValue(user.getId().toString()).build();
User updatedUser = dynamoDbTemplate.load(partitionKey, User.class);
assertThat(updatedUser.getName())
  .isEqualTo(updatedName);
assertThat(updatedUser.getEmail())
  .isEqualTo(updatedEmail);

Here, we update the name and email attributes of an already persisted User object. We call the update() method of the DynamoDbTemplate bean to persist the changes. Finally, we retrieve the updated user and assert that both attributes have been modified correctly.

Finally, let’s delete a User item from our table:

dynamoDbTemplate.delete(user);

Key partitionKey = Key.builder().partitionValue(user.getId().toString()).build();
User deletedUser = dynamoDbTemplate.load(partitionKey, User.class);
assertThat(deletedUser)
  .isNull();

We call the delete() method of the DynamoDbTemplate bean, passing the User object to be deleted. Then, we attempt to retrieve the deleted object using the same partition key and assert that it no longer exists in the table.

5.2. Performing Scan Operations

We’ve already seen that we can use the load() method of the DynamoDbTemplate bean to fetch an item from the table using its partition key.

However, we may also need to retrieve multiple items or filter them based on non-key attributes. DynamoDB provides the scan operation to achieve this.

First, let’s see how to retrieve all User items from the provisioned table:

int numberOfUsers = 10;
for (int i = 0; i < numberOfUsers; i++) {
    User user = Instancio.create(User.class);
    dynamoDbTemplate.save(user);
}

List<User> retrievedUsers = dynamoDbTemplate
  .scanAll(User.class)
  .items()
  .stream()
  .toList();

assertThat(retrievedUsers.size())
  .isEqualTo(numberOfUsers);

We save multiple User objects to our table and then use the scanAll() method to retrieve all items. We assert that the number of retrieved users matches the number of users we initially saved.

By default, if a scan result exceeds 1 MB in size, DynamoDB paginates the response and returns a token (the last evaluated key) to fetch the next page. However, Spring Cloud AWS handles this pagination for us behind the scenes and returns all the items saved in the table.
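To visualize what happens behind the scenes, here's a simplified, self-contained sketch of that pagination loop in plain Java, with an in-memory list standing in for the table and an integer offset playing the role of the last evaluated key:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class PaginationDemo {

    // Simulates one scan page: returns up to pageSize items starting at the
    // given offset, plus the next offset (-1 stands in for a null last
    // evaluated key, i.e., no more pages)
    static Map.Entry<List<Integer>, Integer> scanPage(List<Integer> table, int offset, int pageSize) {
        int end = Math.min(offset + pageSize, table.size());
        List<Integer> page = table.subList(offset, end);
        int next = end < table.size() ? end : -1;
        return Map.entry(page, next);
    }

    public static void main(String[] args) {
        List<Integer> table = new ArrayList<>();
        for (int i = 0; i < 25; i++) {
            table.add(i);
        }

        // Keep fetching pages until no "last evaluated key" is returned
        List<Integer> all = new ArrayList<>();
        int lastEvaluatedKey = 0;
        while (lastEvaluatedKey != -1) {
            Map.Entry<List<Integer>, Integer> result = scanPage(table, lastEvaluatedKey, 10);
            all.addAll(result.getKey());
            lastEvaluatedKey = result.getValue();
        }
        System.out.println(all.size()); // 25
    }
}
```

This is the loop that scanAll() effectively runs for us, so our application code never has to track the pagination token itself.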

Additionally, we can perform a scan operation with a filter expression to retrieve specific items:

Expression expression = Expression.builder()
  .expression("#email = :email")
  .putExpressionName("#email", "email")
  .putExpressionValue(":email", AttributeValue.builder().s(user.getEmail()).build())
  .build();
ScanEnhancedRequest scanRequest = ScanEnhancedRequest
  .builder()
  .filterExpression(expression)
  .build();
User retrievedUser = dynamoDbTemplate.scan(scanRequest, User.class)
  .items()
  .stream()
  .findFirst()
  .orElseThrow();

assertThat(retrievedUser)
  .isNotNull()
  .usingRecursiveComparison()
  .isEqualTo(user);

Here, we create an Expression object with a filter condition that matches the email attribute of a specific User. We wrap this expression inside a ScanEnhancedRequest object and pass it to the scan() method. Finally, we assert that the retrievedUser matches the user we’d initially saved.

6. IAM Permissions

We’ve used the LocalStack emulator for our demonstration. However, when working against the real DynamoDB service, we’ll need to assign the following IAM policy to the IAM user we’ve configured in our application:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:PutItem",
                "dynamodb:GetItem",
                "dynamodb:UpdateItem",
                "dynamodb:DeleteItem",
                "dynamodb:Scan"
            ],
            "Resource": "arn:aws:dynamodb:REGION:ACCOUNT_ID:table/users"
        }
    ]
}

Here, we grant permissions for the specific actions our application performs on the users table. Our IAM policy conforms to the least privilege principle, granting only the necessary permissions required by our application to function correctly.

We should remember to replace the REGION and ACCOUNT_ID placeholders with the actual values in the Resource ARN.
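If we manage IAM from the command line, we can attach this document as an inline policy; the user name, policy name, and file name below are hypothetical examples:

```shell
# Attach the policy document (saved as dynamodb-users-policy.json) as an
# inline policy for the application's IAM user
aws iam put-user-policy \
  --user-name spring-app-user \
  --policy-name dynamodb-users-table-access \
  --policy-document file://dynamodb-users-policy.json
```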

7. Conclusion

In this article, we’ve explored integrating Amazon DynamoDB into a Spring Boot application using Spring Cloud AWS.

We walked through the necessary configuration and defined our data model along with a custom table name resolver. Then, using Testcontainers, we started an ephemeral Docker container for the LocalStack service to create a local test environment.

Finally, we used the DynamoDbTemplate to perform basic CRUD operations and scan operations on our provisioned DynamoDB table and discussed the required IAM permissions.

As always, all the code examples used in this article are available over on GitHub.
