Free AWS SAA-C03 Practice Questions – Latest Exam Material to Pass Fast

Want to pass the AWS SAA-C03 exam fast? Download free SAA-C03 practice questions and start studying now.

Sep 8, 2025 - 16:08
Sep 8, 2025 - 16:13

NEW QUESTION 1

- (Exam Topic 1)

A solutions architect is designing the cloud architecture for a new application being deployed on AWS. The process should run in parallel while adding and removing application nodes as needed based on the number of jobs to be processed. The processor application is stateless. The solutions architect must ensure that the application is loosely coupled and the job items are durably stored.

Which design should the solutions architect use?

 

A.  Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on CPU usage.

B.  Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on network usage.

C.  Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue.

D.  Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of messages published to the SNS topic.

 

Answer: C

 

Explanation:

"Create an Amazon SQS queue to hold the jobs that needs to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue"

In this case we need a durable, loosely coupled way to store the jobs. Amazon SQS is ideal for this use case, and the Auto Scaling group can be scaled dynamically based on the number of jobs waiting in the queue. To configure this scaling, use the backlog-per-instance metric, with the target value being the acceptable backlog per instance to maintain. To calculate your backlog per instance, start with the ApproximateNumberOfMessages queue attribute to determine the length of the SQS queue, then divide by the number of instances in service.
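The backlog-per-instance arithmetic can be sketched in a few lines of Python. The queue length, instance count, and the acceptable backlog of 100 messages per instance are hypothetical values chosen for illustration:

```python
import math

def backlog_per_instance(approx_messages: int, in_service_instances: int) -> float:
    """ApproximateNumberOfMessages divided by the instances in service."""
    return approx_messages / in_service_instances

def instances_needed(approx_messages: int, acceptable_backlog: int) -> int:
    """Capacity required to bring backlog per instance down to the target."""
    return math.ceil(approx_messages / acceptable_backlog)

print(backlog_per_instance(1000, 5))   # 200.0 -> above a target of 100
print(instances_needed(1000, 100))     # 10
```

With 1,000 queued jobs and 5 instances, the backlog per instance (200) exceeds the target (100), so target tracking would scale the group out to 10 instances.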

 

 

NEW QUESTION 2

- (Exam Topic 1)

An application development team is designing a microservice that will convert large images to smaller, compressed images. When a user uploads an image through the web interface, the microservice should store the image in an Amazon S3 bucket, process and compress the image with an AWS Lambda function, and store the image in its compressed form in a different S3 bucket.

A solutions architect needs to design a solution that uses durable, stateless components to process the images automatically. Which combination of actions will meet these requirements? (Choose two.)

 

A.  Create an Amazon Simple Queue Service (Amazon SQS) queue. Configure the S3 bucket to send a notification to the SQS queue when an image is uploaded to the S3 bucket.

B.  Configure the Lambda function to use the Amazon Simple Queue Service (Amazon SQS) queue as the invocation source. When the SQS message is successfully processed, delete the message in the queue.

C.  Configure the Lambda function to monitor the S3 bucket for new uploads. When an uploaded image is detected, write the file name to a text file in memory and use the text file to keep track of the images that were processed.

D.  Launch an Amazon EC2 instance to monitor an Amazon Simple Queue Service (Amazon SQS) queue. When items are added to the queue, log the file name in a text file on the EC2 instance and invoke the Lambda function.

E.  Configure an Amazon EventBridge (Amazon CloudWatch Events) event to monitor the S3 bucket. When an image is uploaded, send an alert to an Amazon Simple Notification Service (Amazon SNS) topic with the application owner's email address for further processing.

 

Answer: AB

 

Explanation:

 Creating an Amazon Simple Queue Service (SQS) queue and configuring the S3 bucket to send a notification to the SQS queue when an image is uploaded to the S3 bucket will ensure that the Lambda function is triggered in a stateless and durable manner.

 Configuring the Lambda function to use the SQS queue as the invocation source, and deleting the message in the queue after it is successfully processed will ensure that the Lambda function processes the image in a stateless and durable manner.

Amazon SQS is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. SQS eliminates the complexity and overhead associated with managing and operating message-oriented middleware, and empowers developers to focus on differentiating work. When new images are uploaded to the S3 bucket, SQS will trigger the Lambda function to process the image and compress it. Once the image is processed, the SQS message is deleted, ensuring that the processing pipeline stays stateless and durable.
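A minimal sketch of the Lambda handler side of this pipeline (bucket and key names are illustrative assumptions, and the actual download/compress/upload steps are elided). Each SQS record's body carries an S3 event notification from which the handler extracts the uploaded object's location:

```python
import json

def handler(event, context=None):
    """Sketch of the compression Lambda's entry point (names are assumptions)."""
    processed = []
    for sqs_record in event["Records"]:          # records delivered by SQS
        s3_event = json.loads(sqs_record["body"])  # S3 notification payload
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            # ...download from `bucket`, compress, upload to the compressed bucket...
            processed.append((bucket, key))
    return processed

# A sample SQS event wrapping one S3 object-created notification:
sample_event = {"Records": [{"body": json.dumps(
    {"Records": [{"s3": {"bucket": {"name": "uploads"},
                         "object": {"key": "photo.jpg"}}}]})}]}
print(handler(sample_event))  # [('uploads', 'photo.jpg')]
```

With an SQS event source mapping, Lambda removes messages from the queue automatically when the handler returns without error, which matches option B's requirement that successfully processed messages be deleted.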

 

 

NEW QUESTION 3

- (Exam Topic 1)

A company is designing an application. The application uses an AWS Lambda function to receive information through Amazon API Gateway and to store the information in an Amazon Aurora PostgreSQL database.

During the proof-of-concept stage, the company has to increase the Lambda quotas significantly to handle the high volumes of data that the company needs to load into the database. A solutions architect must recommend a new design to improve scalability and minimize the configuration effort.

Which solution will meet these requirements?

 

A.  Refactor the Lambda function code to Apache Tomcat code that runs on Amazon EC2 instances. Connect to the database by using native Java Database Connectivity (JDBC) drivers.

B.  Change the platform from Aurora to Amazon DynamoDB. Provision a DynamoDB Accelerator (DAX) cluster. Use the DAX client SDK to point the existing DynamoDB API calls at the DAX cluster.

C.  Set up two Lambda functions. Configure one function to receive the information. Configure the other function to load the information into the database. Integrate the Lambda functions by using Amazon Simple Notification Service (Amazon SNS).

D.  Set up two Lambda functions. Configure one function to receive the information. Configure the other function to load the information into the database. Integrate the Lambda functions by using an Amazon Simple Queue Service (Amazon SQS) queue.

 

Answer: D

 

Explanation:

Bottlenecks can be avoided with queues (SQS). The queue decouples the function that receives the information from the function that loads it into the database, and it durably buffers bursts of incoming data so the loader can work at a rate the database can sustain, which improves scalability with minimal configuration effort.
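As an illustration of the queue-based integration, the receiving function would hand records to SQS via SendMessageBatch, which accepts at most 10 entries per call. A minimal sketch of the batching logic (payload shape and Id scheme are illustrative assumptions):

```python
def to_send_message_batches(payloads, batch_size=10):
    """Chunk payloads into SendMessageBatch entry lists.

    SQS's SendMessageBatch call accepts at most 10 entries, so the receiving
    function enqueues incoming records in groups like these.
    """
    batches = []
    for start in range(0, len(payloads), batch_size):
        chunk = payloads[start:start + batch_size]
        batches.append([{"Id": str(start + i), "MessageBody": body}
                        for i, body in enumerate(chunk)])
    return batches

batches = to_send_message_batches([f"record-{n}" for n in range(25)])
print([len(b) for b in batches])  # [10, 10, 5]
```

The loader function then consumes the queue at its own pace, so a spike in incoming data never overwhelms the database connection.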

 

 

NEW QUESTION 4

- (Exam Topic 1)

A solutions architect is designing a new hybrid architecture to extend a company s on-premises infrastructure to AWS The company requires a highly available connection with consistent low latency to an AWS Region. The company needs to minimize costs and is willing to accept slower traffic if the primary connection fails.

What should the solutions architect do to meet these requirements?

 

A.  Provision an AWS Direct Connect connection to a Region. Provision a VPN connection as a backup if the primary Direct Connect connection fails.

B.  Provision a VPN tunnel connection to a Region for private connectivity. Provision a second VPN tunnel for private connectivity and as a backup if the primary VPN connection fails.

C.  Provision an AWS Direct Connect connection to a Region. Provision a second Direct Connect connection to the same Region as a backup if the primary Direct Connect connection fails.

D.  Provision an AWS Direct Connect connection to a Region. Use the Direct Connect failover attribute from the AWS CLI to automatically create a backup connection if the primary Direct Connect connection fails.

 

Answer: A

 

Explanation:

"In some cases, this connection alone is not enough. It is always better to guarantee a fallback connection as the backup of DX. There are several options, but implementing it with an AWS Site-To-Site VPN is a real cost-effective solution that can be exploited to reduce costs or, in the meantime, wait for the setup of a second DX."

https://www.proud2becloud.com/hybrid-cloud-networking-backup-aws-direct-connect-network-connection-with

 

 

NEW QUESTION 5

- (Exam Topic 1)

A company is hosting a web application on AWS using a single Amazon EC2 instance that stores user-uploaded documents in an Amazon EBS volume. For better scalability and availability, the company duplicated the architecture and created a second EC2 instance and EBS volume in another Availability Zone, placing both behind an Application Load Balancer. After completing this change, users reported that, each time they refreshed the website, they could see one subset of their documents or the other, but never all of the documents at the same time.

What should a solutions architect propose to ensure users see all of their documents at once?

 

A.  Copy the data so both EBS volumes contain all the documents.

B.  Configure the Application Load Balancer to direct a user to the server with the documents

C.  Copy the data from both EBS volumes to Amazon EFS Modify the application to save new documents to Amazon EFS

D.  Configure the Application Load Balancer to send the request to both servers Return each document from the correct server.

 

Answer: C

 

Explanation:

Amazon EFS provides file storage in the AWS Cloud. With Amazon EFS, you can create a file system, mount the file system on an Amazon EC2 instance, and then read and write data to and from your file system. You can mount an Amazon EFS file system in your VPC through the Network File System versions 4.0 and 4.1 (NFSv4) protocol. We recommend using a current generation Linux NFSv4.1 client, such as those found in the latest Amazon Linux, Red Hat, and Ubuntu AMIs, in conjunction with the Amazon EFS Mount Helper. For instructions, see Using the amazon-efs-utils Tools.

For a list of Amazon EC2 Linux Amazon Machine Images (AMIs) that support this protocol, see NFS Support. For some AMIs, you'll need to install an NFS client to mount your file system on your Amazon EC2 instance. For instructions, see Installing the NFS Client.

You can access your Amazon EFS file system concurrently from multiple NFS clients, so applications that scale beyond a single connection can access a file system. Amazon EC2 instances running in multiple Availability Zones within the same AWS Region can access the file system, so that many users can access and share a common data source.

https://docs.aws.amazon.com/efs/latest/ug/how-it-works.html#how-it-works-ec2

 

 

NEW QUESTION 6

-  (Exam Topic 1)

A company is planning to use an Amazon DynamoDB table for data storage. The company is concerned about cost optimization. The table will not be used on most mornings. In the evenings, the read and write traffic will often be unpredictable. When traffic spikes occur, they will happen very quickly.

What should a solutions architect recommend?

 

A.  Create a DynamoDB table in on-demand capacity mode.

B.  Create a DynamoDB table with a global secondary index.

C.  Create a DynamoDB table with provisioned capacity and auto scaling.

D.  Create a DynamoDB table in provisioned capacity mode, and configure it as a global table.

 

Answer: A

 

 

NEW QUESTION 7

-  (Exam Topic 1)

An application allows users at a company's headquarters to access product data. The product data is stored in an Amazon RDS MySQL DB instance. The operations team has isolated an application performance slowdown and wants to separate read traffic from write traffic. A solutions architect needs to optimize the application's performance quickly.

What should the solutions architect recommend?

 

A.  Change the existing database to a Multi-AZ deployment. Serve the read requests from the primary Availability Zone.

B.  Change the existing database to a Multi-AZ deployment. Serve the read requests from the secondary Availability Zone.

C.  Create read replicas for the database. Configure the read replicas with half of the compute and storage resources as the source database.

D.  Create read replicas for the database. Configure the read replicas with the same compute and storage resources as the source database.

 

Answer: D

 

Explanation:

https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_MySQL.Replication.ReadReplicas.html

 

 

NEW QUESTION 8

-  (Exam Topic 1)

A company recently launched a variety of new workloads on Amazon EC2 instances in its AWS account. The company needs to create a strategy to access and administer the instances remotely and securely. The company needs to implement a repeatable process that works with native AWS services and follows the AWS Well-Architected Framework.

Which solution will meet these requirements with the LEAST operational overhead?

 

A.  Use the EC2 serial console to directly access the terminal interface of each instance for administration.

B.  Attach the appropriate IAM role to each existing instance and new instance. Use AWS Systems Manager Session Manager to establish a remote SSH session.

C.  Create an administrative SSH key pair. Load the public key into each EC2 instance. Deploy a bastion host in a public subnet to provide a tunnel for administration of each instance.

D.  Establish an AWS Site-to-Site VPN connection. Instruct administrators to use their local on-premises machines to connect directly to the instances by using SSH keys across the VPN tunnel.

 

Answer: B

 

Explanation:

https://docs.aws.amazon.com/systems-manager/latest/userguide/setup-launch-managed-instance.html

 

 

NEW QUESTION 9

-  (Exam Topic 1)

A company recently launched Linux-based application instances on Amazon EC2 in a private subnet and launched a Linux-based bastion host on an Amazon EC2 instance in a public subnet of a VPC. A solutions architect needs to connect from the on-premises network, through the company's internet connection, to the bastion host and to the application servers. The solutions architect must make sure that the security groups of all the EC2 instances will allow that access.

Which combination of steps should the solutions architect take to meet these requirements? (Select TWO)

 

A.  Replace the current security group of the bastion host with one that only allows inbound access from the application instances

B.  Replace the current security group of the bastion host with one that only allows inbound access from the internal IP range for the company

C.  Replace the current security group of the bastion host with one that only allows inbound access from the external IP range for the company

D.  Replace the current security group of the application instances with one that allows inbound SSH access from only the private IP address of the bastion host

E.  Replace the current security group of the application instances with one that allows inbound SSH access from only the public IP address of the bastion host

 

Answer: CD

 

Explanation:

https://digitalcloud.training/ssh-into-ec2-in-private-subnet/
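The two allow rules from the correct options can be expressed as the IpPermissions structures that EC2's AuthorizeSecurityGroupIngress API accepts. A minimal sketch (the CIDR range and private IP are placeholders, and SSH on port 22 is an assumption about the access protocol):

```python
def bastion_ssh_rule(company_external_cidr):
    """Inbound SSH to the bastion host, only from the company's external IP range."""
    return {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
            "IpRanges": [{"CidrIp": company_external_cidr}]}

def app_ssh_rule(bastion_private_ip):
    """Inbound SSH to the application instances, only from the bastion's private IP."""
    return {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
            "IpRanges": [{"CidrIp": f"{bastion_private_ip}/32"}]}

print(bastion_ssh_rule("203.0.113.0/24"))  # traffic arrives over the internet
print(app_ssh_rule("10.0.1.10"))           # traffic arrives inside the VPC
```

The bastion sees the company's external (public) address because the traffic crosses the internet, while the application instances see the bastion's private address because that hop stays inside the VPC.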

 

 

NEW QUESTION 10

-  (Exam Topic 1)

A company needs to store data in Amazon S3 and must prevent the data from being changed. The company wants new objects that are uploaded to Amazon S3 to remain unchangeable for a nonspecific amount of time until the company decides to modify the objects. Only specific users in the company’s AWS account can have the ability to delete the objects. What should a solutions architect do to meet these requirements?

 

A.  Create an S3 Glacier vault. Apply a write-once, read-many (WORM) vault lock policy to the objects.

B.  Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Set a retention period of 100 years. Use governance mode as the S3 bucket's default retention mode for new objects.

C.  Create an S3 bucket. Use AWS CloudTrail to track any S3 API events that modify the objects. Upon notification, restore the modified objects from any backup versions that the company has.

D.  Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Add a legal hold to the objects. Add the s3:PutObjectLegalHold permission to the IAM policies of users who need to delete the objects.

 

Answer: D

 

Explanation:

"The Object Lock legal hold operation enables you to place a legal hold on an object version. Like setting a retention period, a legal hold prevents an object version from being overwritten or deleted. However, a legal hold doesn't have an associated retention period and remains in effect until removed." https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-legal-hold.html
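A sketch of how a permitted user could place or remove the legal hold. The helper builds the keyword arguments for S3's PutObjectLegalHold API (the bucket and key are placeholders):

```python
def legal_hold_request(bucket, key, status="ON"):
    """Build kwargs for S3's PutObjectLegalHold API (bucket/key are placeholders)."""
    if status not in ("ON", "OFF"):
        raise ValueError("Legal hold status must be 'ON' or 'OFF'")
    return {"Bucket": bucket, "Key": key, "LegalHold": {"Status": status}}

# A principal granted s3:PutObjectLegalHold could pass this to boto3:
#   s3.put_object_legal_hold(**legal_hold_request("records-bucket", "report.pdf", "OFF"))
print(legal_hold_request("records-bucket", "report.pdf"))
```

Setting the status to OFF removes the hold, after which the object version can be deleted; only principals with the s3:PutObjectLegalHold permission can make either change.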

 

 

NEW QUESTION 10

-  (Exam Topic 1)

A company that hosts its web application on AWS wants to ensure all Amazon EC2 instances, Amazon RDS DB instances, and Amazon Redshift clusters are configured with tags. The company wants to minimize the effort of configuring and operating this check.

What should a solutions architect do to accomplish this?

 

A.  Use AWS Config rules to define and detect resources that are not properly tagged.

B.  Use Cost Explorer to display resources that are not properly tagged. Tag those resources manually.

C.  Write API calls to check all resources for proper tag allocation. Periodically run the code on an EC2 instance.

D.  Write API calls to check all resources for proper tag allocation. Schedule an AWS Lambda function through Amazon CloudWatch to periodically run the code.

 

Answer: A

 

 

NEW QUESTION 15

-  (Exam Topic 1)

A company has an on-premises application that generates a large amount of time-sensitive data that is backed up to Amazon S3. The application has grown, and there are user complaints about internet bandwidth limitations. A solutions architect needs to design a long-term solution that allows for timely backups to Amazon S3 with minimal impact on internet connectivity for internal users.

Which solution meets these requirements?

 

A.  Establish AWS VPN connections and proxy all traffic through a VPC gateway endpoint

B.  Establish a new AWS Direct Connect connection and direct backup traffic through this new connection.

C.  Order daily AWS Snowball devices Load the data onto the Snowball devices and return the devices to AWS each day.

D.  Submit a support ticket through the AWS Management Console Request the removal of S3 service limits from the account.

 

Answer: B

 

 

NEW QUESTION 16

-  (Exam Topic 1)

A company wants to reduce the cost of its existing three-tier web architecture. The web, application, and database servers are running on Amazon EC2 instances for the development, test, and production environments. The EC2 instances average 30% CPU utilization during peak hours and 10% CPU utilization during non- peak hours.

The production EC2 instances run 24 hours a day. The development and test EC2 instances run for at least 8 hours each day. The company plans to implement automation to stop the development and test EC2 instances when they are not in use.

Which EC2 instance purchasing solution will meet the company's requirements MOST cost-effectively?

 

A.  Use Spot Instances for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.

B.  Use Reserved Instances for the production EC2 instances. Use On-Demand Instances for the development and test EC2 instances.

C.  Use Spot blocks for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.

D.  Use On-Demand Instances for the production EC2 instances. Use Spot blocks for the development and test EC2 instances.

 

Answer: B

 

 

NEW QUESTION 21

-  (Exam Topic 1)

A company's HTTP application is behind a Network Load Balancer (NLB). The NLB's target group is configured to use an Amazon EC2 Auto Scaling group with multiple EC2 instances that run the web service.

The company notices that the NLB is not detecting HTTP errors for the application. These errors require a manual restart of the EC2 instances that run the web service. The company needs to improve the application's availability without writing custom scripts or code.

What should a solutions architect do to meet these requirements?

 

A.  Enable HTTP health checks on the NLB, supplying the URL of the company's application.

B.  Add a cron job to the EC2 instances to check the local application's logs once each minute. If HTTP errors are detected, the application will restart.

C.  Replace the NLB with an Application Load Balancer. Enable HTTP health checks by supplying the URL of the company's application. Configure an Auto Scaling action to replace unhealthy instances.

D.  Create an Amazon CloudWatch alarm that monitors the UnhealthyHostCount metric for the NLB. Configure an Auto Scaling action to replace unhealthy instances when the alarm is in the ALARM state.

 

Answer: C
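The HTTP health-check settings an Application Load Balancer target group relies on can be sketched as the parameters accepted by EC2's CreateTargetGroup/ModifyTargetGroup APIs (the path and threshold values are illustrative assumptions):

```python
def http_health_check(path="/"):
    """Target-group health-check settings for an ALB (thresholds are illustrative).

    An HTTP check inspects the response status code, so application-level
    errors mark a target unhealthy and the Auto Scaling group can replace it
    without any custom scripts.
    """
    return {"HealthCheckProtocol": "HTTP",
            "HealthCheckPath": path,
            "HealthCheckIntervalSeconds": 30,
            "HealthyThresholdCount": 5,
            "UnhealthyThresholdCount": 2,
            "Matcher": {"HttpCode": "200"}}

print(http_health_check("/health"))
```

Any response outside the Matcher codes (for example a 500) counts against the unhealthy threshold, which is exactly the application-level failure detection the NLB's TCP checks were missing.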

 

 

NEW QUESTION 25

-  (Exam Topic 1)

A company is launching a new application and will display application metrics on an Amazon CloudWatch dashboard. The company’s product manager needs to access this dashboard periodically. The product manager does not have an AWS account. A solution architect must provide access to the product manager by following the principle of least privilege.

Which solution will meet these requirements?

 

A.  Share the dashboard from the CloudWatch console. Enter the product manager's email address, and complete the sharing steps. Provide a shareable link for the dashboard to the product manager.

B.  Create an IAM user specifically for the product manager. Attach the CloudWatchReadOnlyAccess managed policy to the user. Share the new login credentials with the product manager. Share the browser URL of the correct dashboard with the product manager.

C.  Create an IAM user for the company's employees. Attach the ViewOnlyAccess AWS managed policy to the IAM user. Share the new login credentials with the product manager. Ask the product manager to navigate to the CloudWatch console and locate the dashboard by name in the Dashboards section.

D.  Deploy a bastion server in a public subnet. When the product manager requires access to the dashboard, start the server and share the RDP credentials. On the bastion server, ensure that the browser is configured to open the dashboard URL with cached AWS credentials that have appropriate permissions to view the dashboard.

 

Answer: A

 

 

NEW QUESTION 28

-  (Exam Topic 1)

A company has a large Microsoft SharePoint deployment running on-premises that requires Microsoft Windows shared file storage. The company wants to migrate this workload to the AWS Cloud and is considering various storage options. The storage solution must be highly available and integrated with Active Directory for access control.

Which solution will satisfy these requirements?

 

A.  Configure Amazon EFS storage and set the Active Directory domain for authentication.

B.  Create an SMB file share on an AWS Storage Gateway file gateway in two Availability Zones.

C.  Create an Amazon S3 bucket and configure Microsoft Windows Server to mount it as a volume.

D.  Create an Amazon FSx for Windows File Server file system on AWS and set the Active Directory domain for authentication.

 

Answer: D

 

 

NEW QUESTION 30

-  (Exam Topic 1)

A solutions architect is designing a two-tier web application. The application consists of a public-facing web tier hosted on Amazon EC2 in public subnets. The database tier consists of Microsoft SQL Server running on Amazon EC2 in a private subnet. Security is a high priority for the company.

How should security groups be configured in this situation? (Select TWO )

 

A.  Configure the security group for the web tier to allow inbound traffic on port 443 from 0.0.0.0/0.

B.  Configure the security group for the web tier to allow outbound traffic on port 443 from 0.0.0.0/0.

C.  Configure the security group for the database tier to allow inbound traffic on port 1433 from the security group for the web tier.

D.  Configure the security group for the database tier to allow outbound traffic on ports 443 and 1433 to the security group for the web tier.

E.  Configure the security group for the database tier to allow inbound traffic on ports 443 and 1433 from the security group for the web tier.

 

Answer: AC

 

Explanation:

"Security groups create an outbound rule for every inbound rule" is not completely right. Stateful does NOT mean that creating an inbound (or outbound) rule creates a matching outbound (or inbound) rule. It means that if you create an inbound rule on port 443 for IP X, then when a request enters on port 443 from IP X, the response traffic for that request is allowed back out on port 443. However, if you look at the outbound rules, there will not be an outbound rule on port 443 unless you explicitly create it. With network ACLs, which are stateless, you would have to create an inbound rule to allow incoming requests and an outbound rule to allow your application to respond to those incoming requests.

https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html#SecurityGroupRules
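The two allow rules can be written as the IpPermissions structures that EC2's AuthorizeSecurityGroupIngress API accepts. A minimal sketch (the security group ID is a placeholder):

```python
def web_tier_https_rule():
    """Inbound HTTPS (port 443) to the web tier from anywhere (option A)."""
    return {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
            "IpRanges": [{"CidrIp": "0.0.0.0/0"}]}

def db_tier_sql_rule(web_tier_sg_id):
    """Inbound SQL Server (port 1433) to the database tier, referencing the
    web tier's security group instead of an IP range (option C)."""
    return {"IpProtocol": "tcp", "FromPort": 1433, "ToPort": 1433,
            "UserIdGroupPairs": [{"GroupId": web_tier_sg_id}]}

print(web_tier_https_rule())
print(db_tier_sql_rule("sg-0123456789abcdef0"))  # placeholder group ID
```

Referencing the web tier's security group in UserIdGroupPairs, rather than a CIDR range, means the database rule keeps working as web instances are added or replaced.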

 

 

NEW QUESTION 34

-  (Exam Topic 1)

A company needs guaranteed Amazon EC2 capacity in three specific Availability Zones in a specific AWS Region for an upcoming event that will last 1 week. What should the company do to guarantee the EC2 capacity?

 

A.  Purchase Reserved instances that specify the Region needed

B.  Create an On-Demand Capacity Reservation that specifies the Region needed

C.  Purchase Reserved instances that specify the Region and three Availability Zones needed

D.  Create an On-Demand Capacity Reservation that specifies the Region and three Availability Zones needed

 

Answer: D

 

Explanation:

https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-capacity-reservations.html

Reserved Instances: you would have to pay for the whole term (1 year or 3 years), which is not cost-effective for a 1-week event.

 

 

NEW QUESTION 39

-  (Exam Topic 1)

A hospital recently deployed a RESTful API with Amazon API Gateway and AWS Lambda. The hospital uses API Gateway and Lambda to upload reports that are in PDF format and JPEG format. The hospital needs to modify the Lambda code to identify protected health information (PHI) in the reports.

Which solution will meet these requirements with the LEAST operational overhead?

 

A.  Use existing Python libraries to extract the text from the reports and to identify the PHI from the extracted text.

B.  Use Amazon Textract to extract the text from the reports. Use Amazon SageMaker to identify the PHI from the extracted text.

C.  Use Amazon Textract to extract the text from the reports. Use Amazon Comprehend Medical to identify the PHI from the extracted text.

D.  Use Amazon Rekognition to extract the text from the reports. Use Amazon Comprehend Medical to identify the PHI from the extracted text.

 

Answer: C

 

 

NEW QUESTION 41

-  (Exam Topic 1)

A company needs to store its accounting records in Amazon S3. The records must be immediately accessible for 1 year and then must be archived for an additional 9 years. No one at the company, including administrative users and root users, should be able to delete the records during the entire 10-year period. The records must be stored with maximum resiliency. Which solution will meet these requirements?

 

A.  Store the records in S3 Glacier for the entire 10-year period. Use an access control policy to deny deletion of the records for a period of 10 years.

B.  Store the records by using S3 Intelligent-Tiering. Use an IAM policy to deny deletion of the records. After 10 years, change the IAM policy to allow deletion.

C.  Use an S3 Lifecycle policy to transition the records from S3 Standard to S3 Glacier Deep Archive after 1 year. Use S3 Object Lock in compliance mode for a period of 10 years.

D.  Use an S3 Lifecycle policy to transition the records from S3 Standard to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 1 year. Use S3 Object Lock in governance mode for a period of 10 years.

 

Answer: C

 

 

NEW QUESTION 44

-  (Exam Topic 1)

A solutions architect is developing a multiple-subnet VPC architecture. The solution will consist of six subnets in two Availability Zones. The subnets are defined as public, private and dedicated for databases. Only the Amazon EC2 instances running in the private subnets should be able to access a database.

Which solution meets these requirements?

 

A.  Create a new route table that excludes the route to the public subnets' CIDR blocks. Associate the route table to the database subnets.

B.  Create a security group that denies ingress from the security group used by instances in the public subnets. Attach the security group to an Amazon RDS DB instance.

C.  Create a security group that allows ingress from the security group used by instances in the private subnets. Attach the security group to an Amazon RDS DB instance.

D.  Create a new peering connection between the public subnets and the private subnets. Create a different peering connection between the private subnets and the database subnets.

 

Answer: C

 

Explanation:

Security groups are stateful. All inbound traffic is blocked by default. If you create an inbound rule allowing traffic in, that traffic is automatically allowed back out again. You cannot block specific IP addresses using security groups; instead, use network access control lists.

"You can specify allow rules, but not deny rules." "When you first create a security group, it has no inbound rules. Therefore, no inbound traffic originating from another host to your instance is allowed until you add inbound rules to the security group." Source: https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html#VPCSecurityGroups

 

 

NEW QUESTION 46

-  (Exam Topic 1)

A company needs the ability to analyze the log files of its proprietary application. The logs are stored in JSON format in an Amazon S3 bucket. Queries will be simple and will run on-demand. A solutions architect needs to perform the analysis with minimal changes to the existing architecture.

What should the solutions architect do to meet these requirements with the LEAST amount of operational overhead?

 

A.  Use Amazon Redshift to load all the content into one place and run the SQL queries as needed

B.  Use Amazon CloudWatch Logs to store the logs Run SQL queries as needed from the Amazon CloudWatch console

C.  Use Amazon Athena directly with Amazon S3 to run the queries as needed

D.  Use AWS Glue to catalog the logs Use a transient Apache Spark cluster on Amazon EMR to run the SQL queries as needed

 

Answer: C

 

Explanation:

Amazon Athena can query JSON data in Amazon S3 directly, with no servers to manage, so no changes to the existing architecture are needed.
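A sketch of what the Athena setup could look like. The database, table, column names, and bucket prefix are assumptions for illustration; the OpenX JsonSerDe is the SerDe the Athena documentation commonly uses for JSON data:

```python
def athena_json_table_ddl(database, table, s3_prefix):
    """Build DDL for an external table over JSON logs in S3 (names are assumptions)."""
    return (f"CREATE EXTERNAL TABLE IF NOT EXISTS {database}.{table} ("
            "request_id string, level string, message string) "
            "ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe' "
            f"LOCATION 's3://{s3_prefix}/'")

ddl = athena_json_table_ddl("logs", "app_logs", "my-log-bucket/app")
print(ddl)

# Once the table exists, an on-demand query is plain SQL, for example:
example_query = "SELECT level, count(*) AS n FROM logs.app_logs GROUP BY level"
```

The data stays in S3 untouched; Athena only stores the table definition, which is why this is the least-overhead fit for simple, on-demand queries.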

 

 

NEW QUESTION 48

-  (Exam Topic 1)

A development team runs monthly resource-intensive tests on its general purpose Amazon RDS for MySQL DB instance with Performance Insights enabled. The testing lasts for 48 hours once a month and is the only process that uses the database. The team wants to reduce the cost of running the tests without reducing the compute and memory attributes of the DB instance.

Which solution meets these requirements MOST cost-effectively?

 

A.  Stop the DB instance when tests are complete. Restart the DB instance when required.

B.  Use an Auto Scaling policy with the DB instance to automatically scale when tests are completed.

C.  Create a snapshot when tests are complete. Terminate the DB instance and restore the snapshot when required.

D.  Modify the DB instance to a low-capacity instance when tests are complete. Modify the DB instance again when required.

 

Answer: A

 

 

NEW QUESTION 49

-  (Exam Topic 1)

A company's application integrates with multiple software-as-a-service (SaaS) sources for data collection. The company runs Amazon EC2 instances to receive the data and to upload the data to an Amazon S3 bucket for analysis. The same EC2 instance that receives and uploads the data also sends a notification to the user when an upload is complete. The company has noticed slow application performance and wants to improve the performance as much as possible.

Which solution will meet these requirements with the LEAST operational overhead?

 

A.  Create an Auto Scaling group so that EC2 instances can scale out. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.

B.  Create an Amazon AppFlow flow to transfer data between each SaaS source and the S3 bucket. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.

C.  Create an Amazon EventBridge (Amazon CloudWatch Events) rule for each SaaS source to send output data. Configure the S3 bucket as the rule's target. Create a second EventBridge (CloudWatch Events) rule to send events when the upload to the S3 bucket is complete. Configure an Amazon Simple Notification Service (Amazon SNS) topic as the second rule's target.

D.  Create a Docker container to use instead of an EC2 instance. Host the containerized application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon CloudWatch Container Insights to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.

 

Answer: B

 

Explanation:

Amazon AppFlow is a fully managed integration service that enables you to securely transfer data between Software-as-a-Service (SaaS) applications like Salesforce, SAP, Zendesk, Slack, and ServiceNow, and AWS services like Amazon S3 and Amazon Redshift, in just a few clicks. https://aws.amazon.com/appflow/
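The other half of the answer, handing the completion notification off to SNS, is a one-time bucket configuration rather than code on the EC2 instance. A minimal sketch of that configuration (the topic ARN and bucket name are placeholders):

```python
# Notification configuration for s3.put_bucket_notification_configuration:
# publish to an SNS topic whenever an object upload completes, removing
# the notification work from the application's upload path.
def s3_upload_notification(topic_arn: str) -> dict:
    return {
        "TopicConfigurations": [
            {"TopicArn": topic_arn, "Events": ["s3:ObjectCreated:*"]}
        ]
    }

config = s3_upload_notification("arn:aws:sns:us-east-1:111122223333:uploads")
# s3.put_bucket_notification_configuration(
#     Bucket="analysis-bucket", NotificationConfiguration=config)
```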

 

 

NEW QUESTION 52

-  (Exam Topic 1)

A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible.

The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud.

Which solution will meet these requirements with the LEAST operational overhead?

 

A.  Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.

B.  Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.

C.  Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.

D.  Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.

 

Answer: C

 

 

NEW QUESTION 54

-  (Exam Topic 1)

A company is implementing a new business application. The application runs on two Amazon EC2 instances and uses an Amazon S3 bucket for document storage. A solutions architect needs to ensure that the EC2 instances can access the S3 bucket.

What should the solutions architect do to meet this requirement?

 

A.  Create an IAM role that grants access to the S3 bucket. Attach the role to the EC2 instances.

B.  Create an IAM policy that grants access to the S3 bucket. Attach the policy to the EC2 instances.

C.  Create an IAM group that grants access to the S3 bucket. Attach the group to the EC2 instances.

D.  Create an IAM user that grants access to the S3 bucket. Attach the user account to the EC2 instances.

 

Answer: A

 

Explanation:

https://aws.amazon.com/premiumsupport/knowledge-center/ec2-instance-access-s3-bucket/
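As a sketch of what answer A involves under the hood (the bucket name is a placeholder, and the role would still need an instance profile attached to the instances), the role rests on two policy documents:

```python
import json

# Trust policy: lets the EC2 service assume the role on behalf of the
# instances. This is what makes the role attachable to EC2.
TRUST_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy: grants S3 access scoped to the document bucket.
def s3_access_policy(bucket: str) -> dict:
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        }],
    }

# iam.create_role(RoleName="app-role",
#                 AssumeRolePolicyDocument=json.dumps(TRUST_POLICY))
# The role is then delivered to the instances through an instance profile.
```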

 

 

NEW QUESTION 56

-  (Exam Topic 1)

A solutions architect is designing a VPC with public and private subnets. The VPC and subnets use IPv4 CIDR blocks. There is one public subnet and one private subnet in each of three Availability Zones (AZs) for high availability. An internet gateway is used to provide internet access for the public subnets. The private subnets require access to the internet to allow Amazon EC2 instances to download software updates.

What should the solutions architect do to enable Internet access for the private subnets?

 

A.  Create three NAT gateways, one for each public subnet in each AZ. Create a private route table for each AZ that forwards non-VPC traffic to the NAT gateway in its AZ.

B.  Create three NAT instances, one for each private subnet in each AZ. Create a private route table for each AZ that forwards non-VPC traffic to the NAT instance in its AZ.

C.  Create a second internet gateway on one of the private subnets. Update the route table for the private subnets that forwards non-VPC traffic to the private internet gateway.

D.  Create an egress-only internet gateway on one of the public subnets. Update the route table for the private subnets that forwards non-VPC traffic to the egress-only internet gateway.

 

Answer: A

 

Explanation:

https://aws.amazon.com/about-aws/whats-new/2018/03/introducing-amazon-vpc-nat-gateway-in-the-aws-govclo https://docs.aws.amazon.com/vpc/latest/userguide/vpc-nat-comparison.html
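A rough sketch of the routing half of answer A (the NAT gateway and route table IDs are placeholders): each AZ's private route table gets one default route pointing at that AZ's NAT gateway.

```python
# One default route per AZ-local private route table, sending non-VPC
# traffic to that AZ's NAT gateway (the payload for ec2.create_route).
def private_default_route(nat_gateway_id: str) -> dict:
    return {"DestinationCidrBlock": "0.0.0.0/0", "NatGatewayId": nat_gateway_id}

routes = [private_default_route(ngw)
          for ngw in ("nat-az1", "nat-az2", "nat-az3")]
# for table_id, route in zip(private_route_table_ids, routes):
#     ec2.create_route(RouteTableId=table_id, **route)
```

Keeping one NAT gateway per AZ means an AZ failure does not take down internet egress for the surviving private subnets.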

 

NEW QUESTION 59

-  (Exam Topic 1)

A company has an Amazon S3 bucket that contains critical data. The company must protect the data from accidental deletion. Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)

 

A.  Enable versioning on the S3 bucket.

B.  Enable MFA Delete on the S3 bucket.

C.  Create a bucket policy on the S3 bucket.

D.  Enable default encryption on the S3 bucket.

E.  Create a lifecycle policy for the objects in the S3 bucket.

 

Answer: AB
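Both protections land in a single S3 API call. A minimal sketch (bucket name and MFA serial are placeholders; MFA Delete can only be enabled by the bucket owner's root credentials):

```python
# Request shape for s3.put_bucket_versioning covering both answers:
# versioning preserves prior object versions, and MFA Delete requires the
# owner's MFA device to permanently delete a version or change the
# versioning state. The MFA string combines the device ARN and a current
# one-time code.
def versioning_request(bucket: str, mfa: str) -> dict:
    return {
        "Bucket": bucket,
        "MFA": mfa,  # e.g. "arn:aws:iam::111122223333:mfa/root 123456"
        "VersioningConfiguration": {"Status": "Enabled", "MFADelete": "Enabled"},
    }
```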

 

 

NEW QUESTION 60

-  (Exam Topic 1)

A company hosts a data lake on AWS. The data lake consists of data in Amazon S3 and Amazon RDS for PostgreSQL. The company needs a reporting solution that provides data visualization and includes all the data sources within the data lake. Only the company's management team should have full access to all the visualizations. The rest of the company should have only limited access.

Which solution will meet these requirements?

 

A.  Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate IAM roles.

B.  Create an analysis in Amazon QuickSight. Connect all the data sources and create new datasets. Publish dashboards to visualize the data. Share the dashboards with the appropriate users and groups.

C.  Create an AWS Glue table and crawler for the data in Amazon S3. Create an AWS Glue extract, transform, and load (ETL) job to produce reports. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.

D.  Create an AWS Glue table and crawler for the data in Amazon S3. Use Amazon Athena Federated Query to access data within Amazon RDS for PostgreSQL. Generate reports by using Amazon Athena. Publish the reports to Amazon S3. Use S3 bucket policies to limit access to the reports.

 

Answer: A

 

 

NEW QUESTION 64

-  (Exam Topic 1)

A social media company allows users to upload images to its website. The website runs on Amazon EC2 instances. During upload requests, the website resizes the images to a standard size and stores the resized images in Amazon S3. Users are experiencing slow upload requests to the website.

The company needs to reduce coupling within the application and improve website performance. A solutions architect must design the most operationally efficient process for image uploads.

Which combination of actions should the solutions architect take to meet these requirements? (Choose two.)

 

A.  Configure the application to upload images to S3 Glacier.

B.  Configure the web server to upload the original images to Amazon S3.

C.  Configure the application to upload images directly from each user's browser to Amazon S3 through the use of a presigned URL.

D.  Configure S3 Event Notifications to invoke an AWS Lambda function when an image is uploaded. Use the function to resize the image.

E.  Create an Amazon EventBridge (Amazon CloudWatch Events) rule that invokes an AWS Lambda function on a schedule to resize uploaded images.

 

Answer: BD

 

 

NEW QUESTION 67

-  (Exam Topic 1)

A bicycle sharing company is developing a multi-tier architecture to track the location of its bicycles during peak operating hours. The company wants to use these data points in its existing analytics platform. A solutions architect must determine the most viable multi-tier option to support this architecture. The data points must be accessible from the REST API.

Which action meets these requirements for storing and retrieving location data?

 

A.  Use Amazon Athena with Amazon S3

B.  Use Amazon API Gateway with AWS Lambda

C.  Use Amazon QuickSight with Amazon Redshift.

D.  Use Amazon API Gateway with Amazon Kinesis Data Analytics

 

Answer: D

 

Explanation:

https://aws.amazon.com/solutions/implementations/aws-streaming-data-solution-for-amazon-kinesis/

 

 

NEW QUESTION 72

-  (Exam Topic 1)

A company has registered its domain name with Amazon Route 53. The company uses Amazon API Gateway in the ca-central-1 Region as a public interface for its backend microservice APIs. Third-party services consume the APIs securely. The company wants to design its API Gateway URL with the company's domain name and corresponding certificate so that the third-party services can use HTTPS.

Which solution will meet these requirements?

 

A.  Create stage variables in API Gateway with Name="Endpoint-URL" and Value="Company Domain Name" to overwrite the default URL. Import the public certificate associated with the company's domain name into AWS Certificate Manager (ACM).

B.  Create Route 53 DNS records with the company's domain name. Point the alias record to the Regional API Gateway stage endpoint. Import the public certificate associated with the company's domain name into AWS Certificate Manager (ACM) in the us-east-1 Region.

C.  Create a Regional API Gateway endpoint. Associate the API Gateway endpoint with the company's domain name. Import the public certificate associated with the company's domain name into AWS Certificate Manager (ACM) in the same Region. Attach the certificate to the API Gateway endpoint. Configure Route 53 to route traffic to the API Gateway endpoint.

D.  Create a Regional API Gateway endpoint. Associate the API Gateway endpoint with the company's domain name. Import the public certificate associated with the company's domain name into AWS Certificate Manager (ACM) in the us-east-1 Region. Attach the certificate to the API Gateway APIs. Create Route 53 DNS records with the company's domain name. Point an A record to the company's domain name.

 

Answer: D

 

 

NEW QUESTION 75

-  (Exam Topic 1)

A company has a website hosted on AWS. The website is behind an Application Load Balancer (ALB) that is configured to handle HTTP and HTTPS separately. The company wants to forward all requests to the website so that the requests will use HTTPS.

What should a solutions architect do to meet this requirement?

 

A.  Update the ALB's network ACL to accept only HTTPS traffic

B.  Create a rule that replaces the HTTP in the URL with HTTPS.

C.  Create a listener rule on the ALB to redirect HTTP traffic to HTTPS.

D.  Replace the ALB with a Network Load Balancer configured to use Server Name Indication (SNI).

 

Answer: C

 

Explanation:

https://aws.amazon.com/premiumsupport/knowledge-center/elb-redirect-http-to-https-using-alb/

You can redirect HTTP requests to HTTPS by adding a listener rule on the Application Load Balancer; no changes to the targets or DNS are needed.
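A sketch of what that listener rule looks like through the API (the load balancer ARN is a placeholder): the HTTP:80 listener's default action becomes a permanent redirect to HTTPS:443.

```python
# Parameters for elbv2.create_listener: an HTTP:80 listener whose default
# action returns an HTTP 301 redirect to HTTPS on port 443, which is what
# the listener rule in answer C configures.
def http_to_https_listener(alb_arn: str) -> dict:
    return {
        "LoadBalancerArn": alb_arn,
        "Protocol": "HTTP",
        "Port": 80,
        "DefaultActions": [{
            "Type": "redirect",
            "RedirectConfig": {
                "Protocol": "HTTPS",
                "Port": "443",
                "StatusCode": "HTTP_301",
            },
        }],
    }
```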

 

 

NEW QUESTION 80

-  (Exam Topic 1)

A company runs an on-premises application that is powered by a MySQL database. The company is migrating the application to AWS to increase the application's elasticity and availability.

The current architecture shows heavy read activity on the database during times of normal operation. Every 4 hours, the company's development team pulls a full export of the production database to populate a database in the staging environment. During this period, users experience unacceptable application latency. The development team is unable to use the staging environment until the procedure completes.

A solutions architect must recommend a replacement architecture that alleviates the application latency issue. The replacement architecture also must give the development team the ability to continue using the staging environment without delay.

Which solution meets these requirements?

 

A.  Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.

B.  Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Use database cloning to create the staging database on-demand.

C.  Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Use the standby instance for the staging database.

D.  Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.

 

Answer: B

 

Explanation:

https://aws.amazon.com/blogs/aws/amazon-aurora-fast-database-cloning/
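Aurora fast cloning is exposed through the point-in-time restore API with the copy-on-write restore type. A minimal sketch (cluster identifiers are placeholders):

```python
# Parameters for rds.restore_db_cluster_to_point_in_time. The
# "copy-on-write" restore type creates an Aurora clone that shares the
# source's storage volume, so the staging copy appears almost instantly
# and adds no export load on production.
def clone_cluster_params(source_cluster: str, clone_id: str) -> dict:
    return {
        "SourceDBClusterIdentifier": source_cluster,
        "DBClusterIdentifier": clone_id,
        "RestoreType": "copy-on-write",
        "UseLatestRestorableTime": True,
    }

# rds.restore_db_cluster_to_point_in_time(
#     **clone_cluster_params("prod-aurora", "staging-clone"))
```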

 

 

NEW QUESTION 84

-  (Exam Topic 1)

A company is developing an application that provides order shipping statistics for retrieval by a REST API. The company wants to extract the shipping statistics, organize the data into an easy-to-read HTML format, and send the report to several email addresses at the same time every morning.

Which combination of steps should a solutions architect take to meet these requirements? (Choose two.)

 

A.  Configure the application to send the data to Amazon Kinesis Data Firehose.

B.  Use Amazon Simple Email Service (Amazon SES) to format the data and to send the report by email.

C.  Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Glue job to query the application's API for the data.

D.  Create an Amazon EventBridge (Amazon CloudWatch Events) scheduled event that invokes an AWS Lambda function to query the application's API for the data.

E.  Store the application data in Amazon S3. Create an Amazon Simple Notification Service (Amazon SNS) topic as an S3 event destination to send the report by email.

 

Answer: DE

 

 

NEW QUESTION 85

-  (Exam Topic 1)

A company has a data ingestion workflow that consists of the following:

An Amazon Simple Notification Service (Amazon SNS) topic for notifications about new data deliveries
An AWS Lambda function to process the data and record metadata

 

The company observes that the ingestion workflow fails occasionally because of network connectivity issues. When such a failure occurs, the Lambda function does not ingest the corresponding data unless the company manually reruns the job.

Which combination of actions should a solutions architect take to ensure that the Lambda function ingests all data in the future? (Select TWO.)

 

A.  Configure the Lambda function in multiple Availability Zones.

B.  Create an Amazon Simple Queue Service (Amazon SQS) queue, and subscribe it to the SNS topic.

C.  Increase the CPU and memory that are allocated to the Lambda function.

D.  Increase provisioned throughput for the Lambda function.

E.  Modify the Lambda function to read from an Amazon Simple Queue Service (Amazon SQS) queue.

 

Answer: BE
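The durable link between B and E is the SNS-to-SQS subscription. A sketch of the subscribe call's parameters (the ARNs are placeholders; the queue's access policy must also allow the topic to send messages):

```python
# Parameters for sns.subscribe: buffering notifications in an SQS queue
# makes delivery durable, so the Lambda function can retry ingestion after
# a transient network failure instead of losing the message outright.
def sqs_subscription(topic_arn: str, queue_arn: str) -> dict:
    return {
        "TopicArn": topic_arn,
        "Protocol": "sqs",
        "Endpoint": queue_arn,
        # Deliver the message body as-is rather than wrapped in SNS JSON.
        "Attributes": {"RawMessageDelivery": "true"},
    }

# sns.subscribe(**sqs_subscription(
#     "arn:aws:sns:us-east-1:111122223333:new-data",
#     "arn:aws:sqs:us-east-1:111122223333:ingest-queue"))
```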

 

 

NEW QUESTION 90

-  (Exam Topic 1)

A company has an application that generates a large number of files, each approximately 5 MB in size. The files are stored in Amazon S3. Company policy requires the files to be stored for 4 years before they can be deleted. Immediate accessibility is always required, as the files contain critical business data that is not easy to reproduce. The files are frequently accessed in the first 30 days of object creation but are rarely accessed after the first 30 days.

Which storage solution is MOST cost-effective?

 

A.  Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 Glacier 30 days from object creation. Delete the files 4 years after object creation.

B.  Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 One Zone-Infrequent Access (S3 One Zone-IA) 30 days from object creation. Delete the files 4 years after object creation.

C.  Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days from object creation. Delete the files 4 years after object creation.

D.  Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days from object creation. Move the files to S3 Glacier 4 years after object creation.

 

Answer: B

 

Explanation:

https://aws.amazon.com/s3/storage-classes/?trk=66264cd8-3b73-416c-9693-ea7cf4fe846a&sc_channel=ps&s_k
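A sketch of the lifecycle configuration that matches the chosen answer (rule ID is a placeholder; 4 years is approximated as 1,460 days):

```python
# Lifecycle configuration for s3.put_bucket_lifecycle_configuration:
# transition objects to S3 One Zone-IA 30 days after creation and expire
# (delete) them after roughly 4 years.
def reporting_lifecycle() -> dict:
    return {
        "Rules": [{
            "ID": "report-files",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply to every object in the bucket
            "Transitions": [{"Days": 30, "StorageClass": "ONEZONE_IA"}],
            "Expiration": {"Days": 1460},
        }]
    }

# s3.put_bucket_lifecycle_configuration(
#     Bucket="report-bucket", LifecycleConfiguration=reporting_lifecycle())
```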

 

 

NEW QUESTION 93

-  (Exam Topic 1)

A company is deploying a new public web application to AWS. The application will run behind an Application Load Balancer (ALB). The application needs to be encrypted at the edge with an SSL/TLS certificate that is issued by an external certificate authority (CA). The certificate must be rotated each year before the certificate expires.

What should a solutions architect do to meet these requirements?

 

A.  Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.

B.  Use AWS Certificate Manager (ACM) to issue an SSL/TLS certificate. Import the key material from the certificate. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.

C.  Use AWS Certificate Manager (ACM) Private Certificate Authority to issue an SSL/TLS certificate from the root CA. Apply the certificate to the ALB. Use the managed renewal feature to automatically rotate the certificate.

D.  Use AWS Certificate Manager (ACM) to import an SSL/TLS certificate. Apply the certificate to the ALB. Use Amazon EventBridge (Amazon CloudWatch Events) to send a notification when the certificate is nearing expiration. Rotate the certificate manually.

 

Answer: D

 

 

NEW QUESTION 94

-  (Exam Topic 1)

A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of terabytes. The application data must be stored in a standard file system structure. The company wants a solution that scales automatically, is highly available, and requires minimum operational overhead.

Which solution will meet these requirements?

 

A.  Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.

B.  Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.

C.  Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.

D.  Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.

 

Answer: C

 

Explanation:

Amazon EFS provides a standard file system interface, scales automatically, and is highly available.

 

 

NEW QUESTION 96

-  (Exam Topic 1)

A company has thousands of edge devices that collectively generate 1 TB of status alerts each day. Each alert is approximately 2 KB in size. A solutions architect needs to implement a solution to ingest and store the alerts for future analysis.

The company wants a highly available solution. However, the company needs to minimize costs and does not want to manage additional infrastructure. Additionally, the company wants to keep 14 days of data available for immediate analysis and archive any data older than 14 days.

What is the MOST operationally efficient solution that meets these requirements?

 

A.  Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.

B.  Launch Amazon EC2 instances across two Availability Zones and place them behind an Elastic Load Balancer to ingest the alerts. Create a script on the EC2 instances that will store the alerts in an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.

C.  Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon Elasticsearch Service (Amazon ES) cluster. Set up the Amazon ES cluster to take manual snapshots every day and delete data from the cluster that is older than 14 days.

D.  Create an Amazon Simple Queue Service (Amazon SQS) standard queue to ingest the alerts and set the message retention period to 14 days. Configure consumers to poll the SQS queue, check the age of the message, and analyze the message data as needed. If the message is 14 days old, the consumer should copy the message to an Amazon S3 bucket and delete the message from the SQS queue.

 

Answer: A

 

Explanation:

https://aws.amazon.com/kinesis/data-firehose/features/?nc=sn&loc=2#:~:text=into%20Amazon%20S3%2C%20
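For illustration, one alert on the producer side of answer A could look like the following `put_record` payload (stream and field names are placeholders); Firehose batches these records into S3 objects without any servers to manage:

```python
import json

# A single 2 KB-class alert as a firehose.put_record payload. The trailing
# newline keeps the batched S3 objects line-delimited, which downstream
# analysis tools such as Athena can read directly.
def alert_record(device_id: str, status: str) -> dict:
    body = json.dumps({"device": device_id, "status": status}) + "\n"
    return {"Data": body.encode("utf-8")}

# firehose.put_record(DeliveryStreamName="alerts",
#                     Record=alert_record("edge-0001", "OK"))
```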

 

 

NEW QUESTION 100

-  (Exam Topic 1)

A company hosts more than 300 global websites and applications. The company requires a platform to analyze more than 30 TB of clickstream data each day. What should a solutions architect do to transmit and process the clickstream data?

 

A.  Design an AWS Data Pipeline to archive the data to an Amazon S3 bucket and run an Amazon EMR duster with the data to generate analytics

B.  Create an Auto Scaling group of Amazon EC2 instances to process the data and send it to an Amazon S3 data lake for Amazon Redshift to use tor analysis

C.  Cache the data to Amazon CloudFron: Store the data in an Amazon S3 bucket When an object is added to the S3 bucket, run an AWS Lambda function to process the data tor analysis.

D.  Collect the data from Amazon Kinesis Data Stream

E.  Use Amazon Kinesis Data Firehose to transmit the data to an Amazon S3 data lake Load the data in Amazon Redshift for analysis

 

Answer: D

 

Explanation:

https://aws.amazon.com/es/blogs/big-data/real-time-analytics-with-amazon-redshift-streaming-ingestion/

 

 

NEW QUESTION 104

-  (Exam Topic 2)

A large media company hosts a web application on AWS. The company wants to start caching confidential media files so that users around the world will have reliable access to the files. The content is stored in Amazon S3 buckets. The company must deliver the content quickly, regardless of where the requests originate geographically.

Which solution will meet these requirements?

 

A.  Use AWS DataSync to connect the S3 buckets to the web application.

B.  Deploy AWS Global Accelerator to connect the S3 buckets to the web application.

C.  Deploy Amazon CloudFront to connect the S3 buckets to CloudFront edge servers.

D.  Use Amazon Simple Queue Service (Amazon SQS) to connect the S3 buckets to the web application.

 

Answer: C

 

Explanation:

CloudFront uses a local edge cache to serve the response, whereas AWS Global Accelerator proxies requests and connects to the application every time for the response. https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3

 

 

NEW QUESTION 106

-  (Exam Topic 2)

A company uses a three-tier web application to provide training to new employees. The application is accessed for only 12 hours every day. The company is using an Amazon RDS for MySQL DB instance to store information and wants to minimize costs.

What should a solutions architect do to meet these requirements?

 

A.  Configure an IAM policy for AWS Systems Manager Session Manager. Create an IAM role for the policy. Update the trust relationship of the role. Set up automatic start and stop for the DB instance.

B.  Create an Amazon ElastiCache for Redis cache cluster that gives users the ability to access the data from the cache when the DB instance is stopped. Invalidate the cache after the DB instance is started.

C.  Launch an Amazon EC2 instance. Create an IAM role that grants access to Amazon RDS. Attach the role to the EC2 instance. Configure a cron job to start and stop the EC2 instance on the desired schedule.

D.  Create AWS Lambda functions to start and stop the DB instance. Create Amazon EventBridge (Amazon CloudWatch Events) scheduled rules to invoke the Lambda functions. Configure the Lambda functions as event targets for the rules.

 

Answer: C

 

 

NEW QUESTION 108

 

-  (Exam Topic 2)

A company stores its application logs in an Amazon CloudWatch Logs log group. A new policy requires the company to store all application logs in Amazon OpenSearch Service (Amazon Elasticsearch Service) in near-real time.

Which solution will meet this requirement with the LEAST operational overhead?

 

A.  Configure a CloudWatch Logs subscription to stream the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).

B.  Create an AWS Lambda function. Use the log group to invoke the function to write the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).

C.  Create an Amazon Kinesis Data Firehose delivery stream. Configure the log group as the delivery stream's source. Configure Amazon OpenSearch Service (Amazon Elasticsearch Service) as the delivery stream's destination.

D.  Install and configure Amazon Kinesis Agent on each application server to deliver the logs to Amazon Kinesis Data Streams. Configure Kinesis Data Streams to deliver the logs to Amazon OpenSearch Service (Amazon Elasticsearch Service).

 

Answer: B

 

Explanation:

https://computingforgeeks.com/stream-logs-in-aws-from-cloudwatch-to-elasticsearch/
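A CloudWatch Logs subscription comes down to one `put_subscription_filter` call. A sketch of its parameters (the log group, filter name, and destination ARN are placeholders; for OpenSearch delivery the destination is typically a Lambda function that the CloudWatch console creates on your behalf):

```python
# Parameters for logs.put_subscription_filter, the API behind a CloudWatch
# Logs subscription that streams log events to a delivery target in near
# real time.
def subscription_filter(log_group: str, destination_arn: str) -> dict:
    return {
        "logGroupName": log_group,
        "filterName": "to-opensearch",
        "filterPattern": "",  # an empty pattern forwards every log event
        "destinationArn": destination_arn,
    }

# logs.put_subscription_filter(**subscription_filter(
#     "/app/prod", "arn:aws:lambda:us-east-1:111122223333:function:LogsToES"))
```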

 

 

NEW QUESTION 109

-  (Exam Topic 2)

A gaming company hosts a browser-based application on AWS. The users of the application consume a large number of videos and images that are stored in Amazon S3. This content is the same for all users.

The application has increased in popularity, and millions of users worldwide are accessing these media files. The company wants to provide the files to the users while reducing the load on the origin.

Which solution meets these requirements MOST cost-effectively?

 

A.  Deploy an AWS Global Accelerator accelerator in front of the web servers.

B.  Deploy an Amazon CloudFront web distribution in front of the S3 bucket.

C.  Deploy an Amazon ElastiCache for Redis instance in front of the web servers.

D.  Deploy an Amazon ElastiCache for Memcached instance in front of the web servers.

 

Answer: B

 

 

NEW QUESTION 111

-  (Exam Topic 2)

A company is migrating its on-premises PostgreSQL database to Amazon Aurora PostgreSQL. The on-premises database must remain online and accessible during the migration. The Aurora database must remain synchronized with the on-premises database.

Which combination of actions must a solutions architect take to meet these requirements? (Choose two.)

 

A.  Create an ongoing replication task.

B.  Create a database backup of the on-premises database

C.  Create an AWS Database Migration Service (AWS DMS) replication server

D.  Convert the database schema by using the AWS Schema Conversion Tool (AWS SCT).

E.  Create an Amazon EventBridge (Amazon CloudWatch Events) rule to monitor the database synchronization

 

Answer: CD

 

 

NEW QUESTION 115

-  (Exam Topic 2)

A company wants to direct its users to a backup static error page if the company's primary website is unavailable. The primary website's DNS records are hosted in Amazon Route 53. The domain is pointing to an Application Load Balancer (ALB). The company needs a solution that minimizes changes and infrastructure overhead.

Which solution will meet these requirements?

 

A.  Update the Route 53 records to use a latency routing policy. Add a static error page that is hosted in an Amazon S3 bucket to the records so that the traffic is sent to the most responsive endpoints.

B.  Set up a Route 53 active-passive failover configuration. Direct traffic to a static error page that is hosted in an Amazon S3 bucket when Route 53 health checks determine that the ALB endpoint is unhealthy.

C.  Set up a Route 53 active-active configuration with the ALB and an Amazon EC2 instance that hosts a static error page as endpoints. Configure Route 53 to send requests to the instance only if the health checks fail for the ALB.

D.  Update the Route 53 records to use a multivalue answer routing policy. Create a health check. Direct traffic to the website if the health check passes. Direct traffic to a static error page that is hosted in Amazon S3 if the health check does not pass.

 

Answer: B

 

 

NEW QUESTION 117

-  (Exam Topic 2)

A company has an AWS account used for software engineering. The AWS account has access to the company's on-premises data center through a pair of AWS Direct Connect connections. All non-VPC traffic routes to the virtual private gateway.

A development team recently created an AWS Lambda function through the console. The development team needs to allow the function to access a database that runs in a private subnet in the company's data center.

Which solution will meet these requirements?

 

A.  Configure the Lambda function to run in the VPC with the appropriate security group.

B.  Set up a VPN connection from AWS to the data center. Route the traffic from the Lambda function through the VPN.

C.  Update the route tables in the VPC to allow the Lambda function to access the on-premises data center through Direct Connect.

D.  Create an Elastic IP address. Configure the Lambda function to send traffic through the Elastic IP address without an elastic network interface.

 

Answer: A

 

Explanation:

https://docs.aws.amazon.com/lambda/latest/dg/configuration-vpc.html#vpc-managing-eni
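A sketch of answer A as an API call (function name, subnet IDs, and security group ID are placeholders): attaching the function to VPC subnets makes its traffic follow the VPC route tables, including the existing virtual private gateway route to the data center.

```python
# VpcConfig payload for lambda.update_function_configuration. Once the
# function runs in the VPC, its elastic network interfaces send traffic
# through the subnets' route tables, so on-premises destinations are
# reachable over Direct Connect with no extra routing changes.
def lambda_vpc_config(subnet_ids: list[str], sg_ids: list[str]) -> dict:
    return {"VpcConfig": {"SubnetIds": subnet_ids, "SecurityGroupIds": sg_ids}}

cfg = lambda_vpc_config(["subnet-aaa", "subnet-bbb"], ["sg-0123"])
# lam.update_function_configuration(FunctionName="db-writer", **cfg)
```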

 

 

NEW QUESTION 122

-  (Exam Topic 2)

A solutions architect must design a solution that uses Amazon CloudFront with an Amazon S3 origin to store a static website. The company's security policy requires that all website traffic be inspected by AWS WAF.

How should the solutions architect comply with these requirements?

 

A.  Configure an S3 bucket policy to accept requests coming from the AWS WAF Amazon Resource Name (ARN) only.

B.  Configure Amazon CloudFront to forward all incoming requests to AWS WAF before requesting content from the S3 origin.

C.  Configure a security group that allows Amazon CloudFront IP addresses to access Amazon S3 only.Associate AWS WAF to CloudFront.

D.  Configure Amazon CloudFront and Amazon S3 to use an origin access identity (OAI) to restrict access to the S3 bucket. Enable AWS WAF on the distribution.

 

Answer: D

 

Explanation:

https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3 https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/distribution-web-awswaf.html

 

 

NEW QUESTION 123

-  (Exam Topic 2)

A company wants to run applications in containers in the AWS Cloud. These applications are stateless and can tolerate disruptions within the underlying infrastructure. The company needs a solution that minimizes cost and operational overhead.

What should a solutions architect do to meet these requirements?

 

A.  Use Spot Instances in an Amazon EC2 Auto Scaling group to run the application containers.

B.  Use Spot Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.

C.  Use On-Demand Instances in an Amazon EC2 Auto Scaling group to run the application containers.

D.  Use On-Demand Instances in an Amazon Elastic Kubernetes Service (Amazon EKS) managed node group.

 

Answer: A

 

Explanation:

https://aws.amazon.com/cn/blogs/compute/cost-optimization-and-resilience-eks-with-spot-instances/

 

 

NEW QUESTION 128

-  (Exam Topic 2)

A company owns an asynchronous API that is used to ingest user requests and, based on the request type, dispatch requests to the appropriate microservice for processing. The company is using Amazon API Gateway to deploy the API front end, and an AWS Lambda function that invokes Amazon DynamoDB to store user requests before dispatching them to the processing microservices.

The company provisioned as much DynamoDB throughput as its budget allows, but the company is still experiencing availability issues and is losing user requests. What should a solutions architect do to address this issue without impacting existing users?

 

A.  Add throttling on the API Gateway with server-side throttling limits.

B.  Use DynamoDB Accelerator (DAX) and Lambda to buffer writes to DynamoDB.

C.  Create a secondary index in DynamoDB for the table with the user requests.

D.  Use the Amazon Simple Queue Service (Amazon SQS) queue and Lambda to buffer writes to DynamoDB.

 

Answer: D

 

Explanation:

By using an SQS queue and Lambda, the solutions architect can decouple the API front end from the processing microservices and improve the overall scalability and availability of the system. The SQS queue acts as a buffer, allowing the API front end to continue accepting user requests even if the processing microservices are experiencing high workloads or are temporarily unavailable. The Lambda function can then retrieve requests from the SQS queue and write them to DynamoDB, ensuring that all user requests are stored and processed. This approach allows the company to scale the processing microservices independently from the API front end, ensuring that the API remains available to users even during periods of high demand.
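The decoupling described above can be seen in a toy, local simulation: a queue buffers requests so the front end never drops them, while a consumer drains the queue into a table at its own pace. The names here are illustrative stand-ins, not real AWS calls.

```python
from collections import deque

# Toy simulation of option D: an SQS-style queue buffers user requests so the
# API front end never drops them, while a consumer (the Lambda function) drains
# the queue into a DynamoDB-style table at whatever rate the table can absorb.
queue = deque()   # stands in for the SQS queue
table = {}        # stands in for the DynamoDB table

def api_front_end(request_id, payload):
    """Accept the request immediately; durability comes from the queue."""
    queue.append((request_id, payload))

def consumer(batch_size=10):
    """Drain up to batch_size messages into the table, like the Lambda poller."""
    for _ in range(min(batch_size, len(queue))):
        request_id, payload = queue.popleft()
        table[request_id] = payload

for i in range(25):
    api_front_end(i, {"type": "order"})
while queue:
    consumer()
```

Even if the consumer falls behind, no request is lost; it simply waits in the queue, which is exactly why the API stays available under load.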

 

 

NEW QUESTION 130

-  (Exam Topic 2)

A company is concerned about the security of its public web application due to recent web attacks. The application uses an Application Load Balancer (ALB). A solutions architect must reduce the risk of DDoS attacks against the application.

What should the solutions architect do to meet this requirement?

 

A.  Add an Amazon Inspector agent to the ALB.

B.  Configure Amazon Macie to prevent attacks.

C.  Enable AWS Shield Advanced to prevent attacks.

D.  Configure Amazon GuardDuty to monitor the ALB.

 

Answer: C

 

NEW QUESTION 133

-  (Exam Topic 2)

A company has a highly dynamic batch processing job that uses many Amazon EC2 instances to complete it. The job is stateless in nature, can be started and stopped at any given time with no negative impact, and typically takes upwards of 60 minutes total to complete. The company has asked a solutions architect to design a scalable and cost-effective solution that meets the requirements of the job.

What should the solutions architect recommend?

 

A.  Implement EC2 Spot Instances

B.  Purchase EC2 Reserved Instances

C.  Implement EC2 On-Demand Instances

D.  Implement the processing on AWS Lambda

 

Answer: A

 

 

NEW QUESTION 138

-  (Exam Topic 2)

A company wants to build a scalable key management Infrastructure to support developers who need to encrypt data in their applications. What should a solutions architect do to reduce the operational burden?

 

A.  Use multifactor authentication (MFA) to protect the encryption keys.

B.  Use AWS Key Management Service (AWS KMS) to protect the encryption keys

C.  Use AWS Certificate Manager (ACM) to create, store, and assign the encryption keys

D.  Use an IAM policy to limit the scope of users who have access permissions to protect the encryption keys

 

Answer: B

 

Explanation:

https://aws.amazon.com/kms/faqs/#:~:text=If%20you%20are%20a%20developer%20who%20needs%20to%20d

 

 

NEW QUESTION 142

-  (Exam Topic 2)

A company has two applications: a sender application that sends messages with payloads to be processed and a processing application intended to receive the messages with payloads. The company wants to implement an AWS service to handle messages between the two applications. The sender application can send about 1,000 messages each hour. The messages may take up to 2 days to be processed. If the messages fail to process, they must be retained so that they do not impact the processing of any remaining messages.

Which solution meets these requirements and is the MOST operationally efficient?

 

A.  Set up an Amazon EC2 instance running a Redis database. Configure both applications to use the instance. Store, process, and delete the messages, respectively.

B.  Use an Amazon Kinesis data stream to receive the messages from the sender application. Integrate the processing application with the Kinesis Client Library (KCL).

C.  Integrate the sender and processor applications with an Amazon Simple Queue Service (Amazon SQS) queue. Configure a dead-letter queue to collect the messages that failed to process.

D.  Subscribe the processing application to an Amazon Simple Notification Service (Amazon SNS) topic to receive notifications to process. Integrate the sender application to write to the SNS topic.

 

Answer: C

 

Explanation:

https://aws.amazon.com/blogs/compute/building-loosely-coupled-scalable-c-applications-with-amazon-sqs-and- https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-dead-letter-queues.htm
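The dead-letter-queue behavior the answer relies on can be sketched locally: after a message fails a configured number of receives, it moves aside so it stops blocking the rest. This is a simplified model of SQS redrive, with hypothetical payloads.

```python
from collections import deque

# Illustrative model of SQS redrive: after maxReceiveCount failed receives,
# a message is moved to the dead-letter queue instead of blocking the rest.
MAX_RECEIVE_COUNT = 3

main_queue = deque([{"body": "bad-payload", "receives": 0},
                    {"body": "good-payload", "receives": 0}])
dead_letter_queue = []
processed = []

def process(message):
    if message["body"] == "bad-payload":
        raise ValueError("cannot process")
    processed.append(message["body"])

while main_queue:
    message = main_queue.popleft()
    message["receives"] += 1
    try:
        process(message)
    except ValueError:
        if message["receives"] >= MAX_RECEIVE_COUNT:
            dead_letter_queue.append(message)   # retained for inspection
        else:
            main_queue.append(message)          # retried later
```

In real SQS this threshold is the maxReceiveCount of the queue's redrive policy; the failed message is retained in the DLQ, satisfying the requirement that failures not impact remaining messages.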

 

 

NEW QUESTION 144

-  (Exam Topic 2)

A solutions architect needs to implement a solution to reduce a company's storage costs. All the company's data is in the Amazon S3 Standard storage class. The company must keep all data for at least 25 years. Data from the most recent 2 years must be highly available and immediately retrievable.

Which solution will meet these requirements?

 

A.  Set up an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive immediately.

B.  Set up an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive after 2 years.

C.  Use S3 Intelligent-Tiering. Activate the archiving option to ensure that data is archived in S3 Glacier Deep Archive.

D.  Set up an S3 Lifecycle policy to transition objects to S3 One Zone-Infrequent Access (S3 One Zone-IA) immediately and to S3 Glacier Deep Archive after 2 years.

 

Answer: B

 

 

NEW QUESTION 146

-  (Exam Topic 2)

A solutions architect needs to help a company optimize the cost of running an application on AWS. The application will use Amazon EC2 instances, AWS Fargate, and AWS Lambda for compute within the architecture.

The EC2 instances will run the data ingestion layer of the application. EC2 usage will be sporadic and unpredictable. Workloads that run on EC2 instances can be interrupted at any time. The application front end will run on Fargate, and Lambda will serve the API layer. The front-end utilization and API layer utilization will be predictable over the course of the next year.

Which combination of purchasing options will provide the MOST cost-effective solution for hosting this application? (Choose two.)

 

A.  Use Spot Instances for the data ingestion layer

B.  Use On-Demand Instances for the data ingestion layer

 

C.  Purchase a 1-year Compute Savings Plan for the front end and API layer.

D.  Purchase 1-year All Upfront Reserved instances for the data ingestion layer.

E.  Purchase a 1-year EC2 instance Savings Plan for the front end and API layer.

 

Answer: AC

 

 

NEW QUESTION 147

-  (Exam Topic 2)

A company is developing a file-sharing application that will use an Amazon S3 bucket for storage. The company wants to serve all the files through an Amazon CloudFront distribution. The company does not want the files to be accessible through direct navigation to the S3 URL.

What should a solutions architect do to meet these requirements?

 

A.  Write individual policies for each S3 bucket to grant read permission for only CloudFront access.

B.  Create an IAM user. Grant the user read permission to objects in the S3 bucket. Assign the user to CloudFront.

C.  Write an S3 bucket policy that assigns the CloudFront distribution ID as the Principal and assigns the target S3 bucket as the Amazon Resource Name (ARN).

D.  Create an origin access identity (OAI). Assign the OAI to the CloudFront distribution. Configure the S3 bucket permissions so that only the OAI has read permission.

 

Answer: D

 

Explanation:

https://aws.amazon.com/premiumsupport/knowledge-center/cloudfront-access-to-amazon-s3/ https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3
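The bucket permissions the OAI answer describes typically look like the policy below: only the OAI principal may read objects, so direct S3 URLs fail. The OAI ID and bucket name are hypothetical placeholders.

```python
# Sketch of the S3 bucket policy behind the OAI answer: only the CloudFront
# origin access identity may read objects. IDs and names are hypothetical.
OAI_ID = "E2EXAMPLE1OAI"
BUCKET = "example-file-share"

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {
            "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {OAI_ID}"
        },
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}
```

Because no other principal is granted s3:GetObject, navigating to the S3 URL directly returns Access Denied while CloudFront requests succeed.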

 

 

NEW QUESTION 148

-  (Exam Topic 2)

A company is planning to build a high performance computing (HPC) workload as a service solution that is hosted on AWS. A group of 16 Amazon EC2 Linux instances requires the lowest possible latency for

node-to-node communication. The instances also need a shared block device volume for high-performing storage.

Which solution will meet these requirements?

 

A.  Use a cluster placement group. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.

B.  Use a cluster placement group. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).

C.  Use a partition placement group. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).

D.  Use a spread placement group. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.

 

Answer: A

 

 

NEW QUESTION 150

-  (Exam Topic 2)

A company wants to migrate its on-premises data center to AWS. According to the company's compliance requirements, the company can use only the ap-northeast-3 Region. Company administrators are not permitted to connect VPCs to the internet.

Which solutions will meet these requirements? (Choose two.)

 

A.  Use AWS Control Tower to implement data residency guardrails to deny internet access and deny access to all AWS Regions except ap-northeast-3.

B.  Use rules in AWS WAF to prevent internet access. Deny access to all AWS Regions except ap-northeast-3 in the AWS account settings.

C.  Use AWS Organizations to configure service control policies (SCPs) that prevent VPCs from gaining internet access. Deny access to all AWS Regions except ap-northeast-3.

D.  Create an outbound rule for the network ACL in each VPC to deny all traffic from 0.0.0.0/0. Create an IAM policy for each user to prevent the use of any AWS Region other than ap-northeast-3.

E.  Use AWS Config to activate managed rules to detect and alert for internet gateways and to detect and alert for new resources deployed outside of ap-northeast-3.

 

Answer: AC
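A Region-restriction SCP of the kind options A and C imply is commonly written with the aws:RequestedRegion condition key. This is a hedged sketch, not the exact guardrail Control Tower generates; real policies usually exempt global services such as IAM.

```python
# Sketch of a service control policy that denies any action outside
# ap-northeast-3. Real-world versions add a NotAction exemption list for
# global services (IAM, Route 53, etc.) that do not resolve to a Region.
region_scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsideApNortheast3",
        "Effect": "Deny",
        "Action": "*",
        "Resource": "*",
        "Condition": {
            "StringNotEquals": {"aws:RequestedRegion": "ap-northeast-3"}
        },
    }],
}
```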

 

 

NEW QUESTION 153

-  (Exam Topic 2)

A solutions architect is optimizing a website for an upcoming musical event. Videos of the performances will be streamed in real time and then will be available on demand. The event is expected to attract a global online audience.

Which service will improve the performance of both the real-time and on-demand streaming?

 

A.  Amazon CloudFront

B.  AWS Global Accelerator

C.  Amazon Route 53

D.  Amazon S3 Transfer Acceleration

 

Answer: A

 

Explanation:

You can use CloudFront to deliver video on demand (VOD) or live streaming video using any HTTP origin. One way you can set up video workflows in the cloud is by using CloudFront together with AWS Media Services.

 

https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/on-demand-streaming-video.html

 

 

NEW QUESTION 157

-  (Exam Topic 2)

A company’s website provides users with downloadable historical performance reports. The website needs a solution that will scale to meet the company’s website demands globally. The solution should be

cost-effective, limit the provisioning of infrastructure resources, and provide the fastest possible response time. Which combination should a solutions architect recommend to meet these requirements?

 

A.  Amazon CloudFront and Amazon S3

B.  AWS Lambda and Amazon DynamoDB

C.  Application Load Balancer with Amazon EC2 Auto Scaling

D.  Amazon Route 53 with internal Application Load Balancers

 

Answer: A

 

Explanation:

CloudFront for rapid response and S3 to minimize infrastructure.

 

 

NEW QUESTION 161

-  (Exam Topic 2)

A company wants to migrate its existing on-premises monolithic application to AWS.

The company wants to keep as much of the front- end code and the backend code as possible. However, the company wants to break the application into smaller applications. A different team will manage each application. The company needs a highly scalable solution that minimizes operational overhead.

Which solution will meet these requirements?

 

A.  Host the application on AWS Lambda. Integrate the application with Amazon API Gateway.

B.  Host the application with AWS Amplify. Connect the application to an Amazon API Gateway API that is integrated with AWS Lambda.

C.  Host the application on Amazon EC2 instances. Set up an Application Load Balancer with EC2 instances in an Auto Scaling group as targets.

D.  Host the application on Amazon Elastic Container Service (Amazon ECS). Set up an Application Load Balancer with Amazon ECS as the target.

 

Answer: D

 

Explanation:

https://aws.amazon.com/blogs/compute/microservice-delivery-with-amazon-ecs-and-application-load-balancers/

 

 

NEW QUESTION 164

-  (Exam Topic 2)

A company has a dynamic web application hosted on two Amazon EC2 instances. The company has its own SSL certificate, which is on each instance to perform SSL termination.

There has been an increase in traffic recently, and the operations team determined that SSL encryption and decryption is causing the compute capacity of the web servers to reach their maximum limit.

What should a solutions architect do to increase the application's performance?

 

A.  Create a new SSL certificate using AWS Certificate Manager (ACM). Install the ACM certificate on each instance.

B.  Create an Amazon S3 bucket. Migrate the SSL certificate to the S3 bucket. Configure the EC2 instances to reference the bucket for SSL termination.

C.  Create another EC2 instance as a proxy server. Migrate the SSL certificate to the new instance and configure it to direct connections to the existing EC2 instances.

D.  Import the SSL certificate into AWS Certificate Manager (ACM). Create an Application Load Balancer with an HTTPS listener that uses the SSL certificate from ACM.

 

Answer: D

 

Explanation:

https://aws.amazon.com/certificate-manager/:

"With AWS Certificate Manager, you can quickly request a certificate, deploy it on ACM-integrated AWS resources, such as Elastic Load Balancers, Amazon CloudFront distributions, and APIs on API Gateway, and let AWS Certificate Manager handle certificate renewals. It also enables you to create private certificates for

your internal resources and manage the certificate lifecycle centrally."
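The offload in answer D comes down to one HTTPS listener on the ALB referencing the ACM certificate, so the EC2 instances stop spending CPU on TLS. The ARNs below are hypothetical placeholders.

```python
# Sketch of the ALB HTTPS listener that terminates TLS with the ACM
# certificate. Both ARNs are hypothetical placeholders.
https_listener = {
    "Protocol": "HTTPS",
    "Port": 443,
    "Certificates": [{
        "CertificateArn": "arn:aws:acm:us-east-1:123456789012:certificate/abcd-example"
    }],
    "DefaultActions": [{
        "Type": "forward",
        "TargetGroupArn": "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/web/abc123example",
    }],
}

# With boto3, this dict matches the keyword arguments of
# elbv2_client.create_listener(LoadBalancerArn=..., **https_listener).
```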

 

 

NEW QUESTION 169

-  (Exam Topic 2)

A security team wants to limit access to specific services or actions in all of the team's AWS accounts. All accounts belong to a large organization in AWS Organizations. The solution must be scalable and there must be a single point where permissions can be maintained.

What should a solutions architect do to accomplish this?

 

A.  Create an ACL to provide access to the services or actions.

B.  Create a security group to allow accounts and attach it to user groups.

C.  Create cross-account roles in each account to deny access to the services or actions.

D.  Create a service control policy in the root organizational unit to deny access to the services or actions.

 

Answer: D

 

Explanation:

Service control policies (SCPs) are one type of policy that you can use to manage your organization. SCPs offer central control over the maximum available permissions for all accounts in your organization, allowing you to ensure your accounts stay within your organization's access control guidelines. See https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scp.html.
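Attached once at the root OU, an SCP like the sketch below restricts every account in the organization, which is the "single point where permissions can be maintained." The denied services here are hypothetical examples, not from the question.

```python
# Minimal sketch of an SCP attached at the root organizational unit. The
# denied actions are hypothetical examples of "specific services or actions."
deny_services_scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnapprovedServices",
        "Effect": "Deny",
        "Action": ["redshift:*", "sagemaker:CreateNotebookInstance"],
        "Resource": "*",
    }],
}
```

An explicit Deny in an SCP overrides any Allow granted inside member accounts, which is why this scales without touching per-account IAM.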

 

 

NEW QUESTION 174

-  (Exam Topic 2)

A company runs a production application on a fleet of Amazon EC2 instances. The application reads the data from an Amazon SQS queue and processes the messages in parallel. The message volume is unpredictable and often has intermittent traffic. This application should continually process messages without any downtime.

Which solution meets these requirements MOST cost-effectively?

 

A.  Use Spot Instances exclusively to handle the maximum capacity required.

B.  Use Reserved Instances exclusively to handle the maximum capacity required.

C.  Use Reserved Instances for the baseline capacity and use Spot Instances to handle additional capacity.

D.  Use Reserved Instances for the baseline capacity and use On-Demand Instances to handle additional capacity.

 

Answer: D

 

Explanation:

We recommend that you use On-Demand Instances for applications with short-term, irregular workloads that cannot be interrupted. https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-on-demand-instances.html
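Back-of-the-envelope arithmetic shows why the baseline-plus-burst split in answer D is cheaper than running everything On-Demand. The rates and usage figures below are hypothetical, chosen only to illustrate the comparison.

```python
# Hypothetical prices: Reserved Instances cover the steady baseline,
# On-Demand covers the unpredictable bursts (which must not be interrupted,
# ruling out Spot for this workload).
HOURS_PER_MONTH = 730
on_demand_rate = 0.10   # $/hour, hypothetical
reserved_rate = 0.06    # $/hour effective, hypothetical

baseline_instances = 4        # always running
burst_instance_hours = 300    # unpredictable extra instance-hours per month

blended = (baseline_instances * HOURS_PER_MONTH * reserved_rate
           + burst_instance_hours * on_demand_rate)
all_on_demand = (baseline_instances * HOURS_PER_MONTH
                 + burst_instance_hours) * on_demand_rate
# blended -> 205.20, all_on_demand -> 322.00 with these example numbers
```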

 

 

NEW QUESTION 179

-  (Exam Topic 2)

A reporting team receives files each day in an Amazon S3 bucket. The reporting team manually reviews and copies the files from this initial S3 bucket to an analysis S3 bucket each day at the same time to use with Amazon QuickSight. Additional teams are starting to send more files in larger sizes to the initial S3 bucket.

The reporting team wants to move the files automatically to the analysis S3 bucket as the files enter the initial S3 bucket. The reporting team also wants to use AWS Lambda functions to run pattern-matching code on the copied data. In addition, the reporting team wants to send the data files to a pipeline in Amazon SageMaker Pipelines.

What should a solutions architect do to meet these requirements with the LEAST operational overhead?

 

A.  Create a Lambda function to copy the files to the analysis S3 bucket. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.

B.  Create a Lambda function to copy the files to the analysis S3 bucket. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.

C.  Configure S3 replication between the S3 buckets. Create an S3 event notification for the analysis S3 bucket. Configure Lambda and SageMaker Pipelines as destinations of the event notification. Configure s3:ObjectCreated:Put as the event type.

D.  Configure S3 replication between the S3 buckets. Configure the analysis S3 bucket to send event notifications to Amazon EventBridge (Amazon CloudWatch Events). Configure an ObjectCreated rule in EventBridge (CloudWatch Events). Configure Lambda and SageMaker Pipelines as targets for the rule.

 

Answer: A

 

 

NEW QUESTION 180

-  (Exam Topic 2)

A company runs a high performance computing (HPC) workload on AWS. The workload requires low-latency network performance and high network throughput with tightly coupled node-to-node communication. The Amazon EC2 instances are properly sized for compute and storage capacity, and are launched using default options.

What should a solutions architect propose to improve the performance of the workload?

 

A.  Choose a cluster placement group while launching Amazon EC2 instances.

B.  Choose dedicated instance tenancy while launching Amazon EC2 instances.

C.  Choose an Elastic Inference accelerator while launching Amazon EC2 instances.

D.  Choose the required capacity reservation while launching Amazon EC2 instances.

 

Answer: A

 

Explanation:

https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-ec2-placementgroup.html "A cluster placement group is a logical grouping of instances within a single Availability Zone that benefit from low network latency, high network throughput"

 

 

NEW QUESTION 183

-  (Exam Topic 2)

A company recently started using Amazon Aurora as the data store for its global ecommerce application. When large reports are run, developers report that the ecommerce application is performing poorly. After reviewing metrics in Amazon CloudWatch, a solutions architect finds that the ReadIOPS and CPUUtilization metrics are spiking when monthly reports run.

What is the MOST cost-effective solution?

 

A.  Migrate the monthly reporting to Amazon Redshift.

B.  Migrate the monthly reporting to an Aurora Replica

C.  Migrate the Aurora database to a larger instance class

D.  Increase the Provisioned IOPS on the Aurora instance

 

Answer: B

 

Explanation:

https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/Aurora.Replication.html#Aurora.Replication.Replicas

Aurora Replicas have two main purposes. You can issue queries to them to scale the read operations for your application. You typically do so by connecting to the reader endpoint of the cluster. That way, Aurora can spread the load for read-only connections across as many Aurora Replicas as you have in the cluster. Aurora Replicas also help to increase availability. If the writer instance in a cluster becomes unavailable, Aurora automatically promotes one of the reader instances to take its place as the new writer. https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/Aurora.Overview.html
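In practice, moving the reports to a replica means pointing read-only queries at the cluster's reader endpoint. The sketch below is an illustrative router, with hypothetical endpoint hostnames, not a production dispatcher.

```python
# Illustrative read/write routing: reporting (read-only) queries go to the
# Aurora reader endpoint, so they stop spiking ReadIOPS and CPU on the writer.
# The endpoint hostnames are hypothetical placeholders.
WRITER_ENDPOINT = "mycluster.cluster-abc123.us-east-1.rds.amazonaws.com"
READER_ENDPOINT = "mycluster.cluster-ro-abc123.us-east-1.rds.amazonaws.com"

def endpoint_for(sql: str) -> str:
    """Send SELECT-only statements to the replicas, everything else to the writer."""
    first_word = sql.lstrip().split(None, 1)[0].upper()
    return READER_ENDPOINT if first_word == "SELECT" else WRITER_ENDPOINT
```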

 

 

NEW QUESTION 185

-  (Exam Topic 2)

A corporation has recruited a new cloud engineer who should not have access to the CompanyConfidential Amazon S3 bucket. The cloud engineer must have read and write permissions on an S3 bucket named AdminTools.

Which IAM policy will satisfy these criteria?

 

A.  (IAM policy document shown as an image in the original; not reproduced)

B.  (IAM policy document shown as an image in the original; not reproduced)

C.  (IAM policy document shown as an image in the original; not reproduced)

D.  (IAM policy document shown as an image in the original; not reproduced)

Answer: A

 

Explanation:

https://docs.amazonaws.cn/en_us/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html
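Following the pattern in the linked AWS example (read/write access to a single bucket), the correct policy looks roughly like the sketch below: list access on the AdminTools bucket, object read/write on its contents, and no statement at all for CompanyConfidential, which IAM denies by default. This is a reconstruction of the documented pattern, not the exact image from the question.

```python
# Sketch of an IAM policy granting read/write on AdminTools only. Access to
# CompanyConfidential is implicitly denied because nothing grants it.
admin_tools_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::AdminTools",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::AdminTools/*",
        },
    ],
}
```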

 

 

NEW QUESTION 187

-  (Exam Topic 2)

A company produces batch data that comes from different databases. The company also produces live stream data from network sensors and application APIs. The company needs to consolidate all the data into one place for business analytics. The company needs to process the incoming data and then stage the data in different Amazon S3 buckets. Teams will later run one-time queries and import the data into a business intelligence tool to show key performance indicators (KPIs).

Which combination of steps will meet these requirements with the LEAST operational overhead? (Choose two.)

 

A.  Use Amazon Athena for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.

B.  Use Amazon Kinesis Data Analytics for one-time queries. Use Amazon QuickSight to create dashboards for KPIs.

C.  Create custom AWS Lambda functions to move the individual records from the databases to an Amazon Redshift cluster.

D.  Use an AWS Glue extract, transform, and load (ETL) job to convert the data into JSON format. Load the data into multiple Amazon OpenSearch Service (Amazon Elasticsearch Service) clusters.

E.  Use blueprints in AWS Lake Formation to identify the data that can be ingested into a data lake. Use AWS Glue to crawl the source, extract the data, and load the data into Amazon S3 in Apache Parquet format.

 

Answer: CE

 

 

NEW QUESTION 192

-  (Exam Topic 2)

A company sells ringtones created from clips of popular songs. The files containing the ringtones are stored in Amazon S3 Standard and are at least 128 KB in size. The company has millions of files, but downloads are infrequent for ringtones older than 90 days. The company needs to save money on storage while keeping the most accessed files readily available for its users.

Which action should the company take to meet these requirements MOST cost-effectively?

 

A.  Configure S3 Standard-Infrequent Access (S3 Standard-IA) storage for the initial storage tier of the objects.

B.  Move the files to S3 Intelligent-Tiering and configure it to move objects to a less expensive storage tier after 90 days.

C.  Configure S3 inventory to manage objects and move them to S3 Standard-Infrequent Access (S3 Standard-IA) after 90 days.

D.  Implement an S3 Lifecycle policy that moves the objects from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) after 90 days.

 

Answer: D
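The lifecycle rule behind answer D might look like the sketch below. The rule ID is hypothetical; the 90-day transition and target storage class come from the question, and Standard-IA suits these objects because they exceed its 128 KB minimum billable size.

```python
# Sketch of an S3 Lifecycle rule: objects move from S3 Standard to
# S3 Standard-IA once they are 90 days old. The rule ID is a placeholder.
lifecycle_rule = {
    "ID": "ringtones-to-standard-ia",
    "Status": "Enabled",
    "Filter": {"Prefix": ""},   # apply to every object in the bucket
    "Transitions": [
        {"Days": 90, "StorageClass": "STANDARD_IA"}
    ],
}
```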

 

 

NEW QUESTION 196

-  (Exam Topic 2)

A company has a multi-tier application that runs six front-end web servers in an Amazon EC2 Auto Scaling group in a single Availability Zone behind an Application Load Balancer (ALB). A solutions architect needs to modify the infrastructure to be highly available without modifying the application.

Which architecture should the solutions architect choose that provides high availability?

 

A.  Create an Auto Scaling group that uses three instances across each of two Regions.

B.  Modify the Auto Scaling group to use three instances across each of two Availability Zones.

C.  Create an Auto Scaling template that can be used to quickly create more instances in another Region.

D.  Change the ALB in front of the Amazon EC2 instances in a round-robin configuration to balance traffic to the web tier.

 

Answer: B

 

Explanation:

High availability can be enabled for this architecture quite simply by modifying the existing Auto Scaling group to use multiple availability zones. The ASG will automatically balance the load so you don't actually need to specify the instances per AZ.
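The balancing the ASG performs can be pictured as round-robin placement: the same six instances, now spread over two zones so losing one zone leaves half the fleet serving traffic. The zone names and instance IDs below are hypothetical.

```python
# Round-robin sketch of answer B: six web servers balanced across two
# Availability Zones. Zone names and instance IDs are hypothetical.
AVAILABILITY_ZONES = ["us-east-1a", "us-east-1b"]

placement = {}
for i in range(6):  # six front-end web servers
    az = AVAILABILITY_ZONES[i % len(AVAILABILITY_ZONES)]
    placement.setdefault(az, []).append(f"i-{i:017d}")

per_az_counts = {az: len(instances) for az, instances in placement.items()}
```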

 

NEW QUESTION 197

-  (Exam Topic 2)

A company runs a stateless web application in production on a group of Amazon EC2 On-Demand Instances behind an Application Load Balancer. The application experiences heavy usage during an 8-hour period each business day. Application usage is moderate and steady overnight Application usage is low during weekends.

The company wants to minimize its EC2 costs without affecting the availability of the application. Which solution will meet these requirements?

 

A.  Use Spot Instances for the entire workload.

B.  Use Reserved instances for the baseline level of usage Use Spot Instances for any additional capacity that the application needs.

C.  Use On-Demand Instances for the baseline level of usage. Use Spot Instances for any additional capacity that the application needs.

D.  Use Dedicated Instances for the baseline level of usage. Use On-Demand Instances for any additional capacity that the application needs.

 

Answer: B

 

 

NEW QUESTION 202

-  (Exam Topic 3)

A company is migrating a Linux-based web server group to AWS. The web servers must access files in a shared file store for some content. The company must not make any changes to the application.

What should a solutions architect do to meet these requirements?

 

A.  Create an Amazon S3 Standard bucket with access to the web servers.

B.  Configure an Amazon CloudFront distribution with an Amazon S3 bucket as the origin.

C.  Create an Amazon Elastic File System (Amazon EFS) file system. Mount the EFS file system on all web servers.

D.  Configure a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume to all web servers.

 

Answer: C

 

 

NEW QUESTION 205

-  (Exam Topic 3)

An ecommerce company needs to run a scheduled daily job to aggregate and filter sales records for analytics. The company stores the sales records in an Amazon S3 bucket. Each object can be up to 10 GB in size. Based on the number of sales events, the job can take up to an hour to complete. The CPU and memory usage of the job are constant and are known in advance.

A solutions architect needs to minimize the amount of operational effort that is needed for the job to run. Which solution meets these requirements?

 

A.  Create an AWS Lambda function that has an Amazon EventBridge notification. Schedule the EventBridge event to run once a day.

B.  Create an AWS Lambda function. Create an Amazon API Gateway HTTP API, and integrate the API with the function. Create an Amazon EventBridge scheduled event that calls the API and invokes the function.

C.  Create an Amazon Elastic Container Service (Amazon ECS) cluster with an AWS Fargate launch type. Create an Amazon EventBridge scheduled event that launches an ECS task on the cluster to run the job.

D.  Create an Amazon Elastic Container Service (Amazon ECS) cluster with an Amazon EC2 launch type and an Auto Scaling group with at least one EC2 instance. Create an Amazon EventBridge scheduled event that launches an ECS task on the cluster to run the job.

 

Answer: C

 

 

NEW QUESTION 208

-  (Exam Topic 3)

A company wants to deploy a new public web application on AWS. The application includes a web server tier that uses Amazon EC2 instances. The application also includes a database tier that uses an Amazon RDS for MySQL DB instance.

The application must be secure and accessible for global customers that have dynamic IP addresses. How should a solutions architect configure the security groups to meet these requirements?

 

A.  Configure the security group for the web servers to allow inbound traffic on port 443 from 0.0.0.0/0. Configure the security group for the DB instance to allow inbound traffic on port 3306 from the security group of the web servers.

B.  Configure the security group for the web servers to allow inbound traffic on port 443 from the IP addresses of the customers. Configure the security group for the DB instance to allow inbound traffic on port 3306 from the security group of the web servers.

C.  Configure the security group for the web servers to allow inbound traffic on port 443 from the IP addresses of the customers. Configure the security group for the DB instance to allow inbound traffic on port 3306 from the IP addresses of the customers.

D.  Configure the security group for the web servers to allow inbound traffic on port 443 from 0.0.0.0/0. Configure the security group for the DB instance to allow inbound traffic on port 3306 from 0.0.0.0/0.

 

Answer: A

 

 

NEW QUESTION 210

-  (Exam Topic 3)

A company is planning to migrate a commercial off-the-shelf application from its on-premises data center to AWS. The software has a licensing model based on sockets and cores, with predictable capacity and uptime requirements. The company wants to use its existing licenses, which were purchased earlier this year.

Which Amazon EC2 pricing option is the MOST cost-effective?

 

A.  Dedicated Reserved Hosts

B.  Dedicated On-Demand Hosts

C.  Dedicated Reserved Instances

D.  Dedicated On-Demand Instances

 

Answer: A

 

 

NEW QUESTION 212

-  (Exam Topic 3)

A company needs to create an Amazon Elastic Kubernetes Service (Amazon EKS) cluster to host a digital media streaming application. The EKS cluster will use a managed node group that is backed by Amazon Elastic Block Store (Amazon EBS) volumes for storage. The company must encrypt all data at rest by using a customer managed key that is stored in AWS Key Management Service (AWS KMS).

Which combination of actions will meet this requirement with the LEAST operational overhead? (Select TWO.)

 

A.  Use a Kubernetes plugin that uses the customer managed key to perform data encryption.

B.  After creation of the EKS cluster, locate the EBS volumes. Enable encryption by using the customer managed key.

C.  Enable EBS encryption by default in the AWS Region where the EKS cluster will be created. Select the customer managed key as the default key.

D.  Create the EKS cluster. Create an IAM role that has a policy that grants permission to the customer managed key. Associate the role with the EKS cluster.

E.  Store the customer managed key as a Kubernetes secret in the EKS cluster. Use the customer managed key to encrypt the EBS volumes.

 

Answer: AD

 

 

NEW QUESTION 214

-  (Exam Topic 3)

An image-hosting company stores its objects in Amazon S3 buckets. The company wants to avoid accidental exposure of the objects in the S3 buckets to the public. All S3 objects in the entire AWS account need to remain private.

Which solution will meet these requirements?

 

A.  Use Amazon GuardDuty to monitor S3 bucket policies. Create an automatic remediation action rule that uses an AWS Lambda function to remediate any change that makes the objects public.

B.  Use AWS Trusted Advisor to find publicly accessible S3 buckets. Configure email notifications in Trusted Advisor when a change is detected. Manually change the S3 bucket policy if it allows public access.

C.  Use AWS Resource Access Manager to find publicly accessible S3 buckets. Use Amazon Simple Notification Service (Amazon SNS) to invoke an AWS Lambda function when a change is detected. Deploy a Lambda function that programmatically remediates the change.

D.  Use the S3 Block Public Access feature at the account level. Use AWS Organizations to create a service control policy (SCP) that prevents IAM users from changing the setting. Apply the SCP to the account.

 

Answer: D
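A minimal sketch of what such an SCP might look like, expressed as a Python dictionary. The statement Sid and the exact action list are assumptions for illustration, not taken from the question:

```python
import json

# Hypothetical SCP that denies changes to the S3 Block Public Access
# configuration. The Sid and action list are illustrative assumptions.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyPublicAccessBlockChanges",  # illustrative name
            "Effect": "Deny",
            "Action": [
                "s3:PutAccountPublicAccessBlock",
                "s3:PutBucketPublicAccessBlock",
            ],
            "Resource": "*",
        }
    ],
}

print(json.dumps(scp, indent=2))
```

Attached to the account through AWS Organizations, a policy of this shape prevents IAM users from undoing the Block Public Access setting.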

 

 

NEW QUESTION 215

-  (Exam Topic 3)

A company has deployed a Java Spring Boot application as a pod that runs on Amazon Elastic Kubernetes Service (Amazon EKS) in private subnets. The application needs to write data to an Amazon DynamoDB table. A solutions architect must ensure that the application can interact with the DynamoDB table without exposing traffic to the internet.

Which combination of steps should the solutions architect take to accomplish this goal? (Choose two.)

 

A.  Attach an IAM role that has sufficient privileges to the EKS pod.

B.  Attach an IAM user that has sufficient privileges to the EKS pod.

C.  Allow outbound connectivity to the DynamoDB table through the private subnets’ network ACLs.

D.  Create a VPC endpoint for DynamoDB.

E.  Embed the access keys in the Java Spring Boot code.

 

Answer: AD

 

Explanation:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/vpc-endpoints-dynamodb.html https://aws.amazon.com/about-aws/whats- new/2019/09/amazon-eks-adds-support-to-assign-iam-permissions-to-

 

 

NEW QUESTION 220

-  (Exam Topic 3)

An ecommerce company is building a distributed application that involves several serverless functions and AWS services to complete order-processing tasks. These tasks require manual approvals as part of the workflow. A solutions architect needs to design an architecture for the order-processing application. The solution must be able to combine multiple AWS Lambda functions into responsive serverless applications. The solution also must orchestrate data and services that run on Amazon EC2 instances, containers, or on-premises servers.

Which solution will meet these requirements with the LEAST operational overhead?

 

A.  Use AWS Step Functions to build the application.

B.  Integrate all the application components in an AWS Glue job

C.  Use Amazon Simple Queue Service (Amazon SQS) to build the application

D.  Use AWS Lambda functions and Amazon EventBridge (Amazon CloudWatch Events) events to build the application

 

Answer: A

 

Explanation:

AWS Step Functions is designed to combine multiple AWS Lambda functions into responsive serverless applications and to orchestrate data and services that run on Amazon EC2 instances, containers, or on-premises servers. It also supports human-approval steps natively, making it the option with the least operational overhead.
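Step Functions handles manual approvals with the task-token callback pattern (`.waitForTaskToken`), which pauses the workflow until an approver triggers `SendTaskSuccess`. Below is a minimal Amazon States Language sketch expressed as a Python dictionary; the state names and Lambda targets are hypothetical:

```python
# Sketch of an Amazon States Language (ASL) state machine with a
# manual-approval step. State names are hypothetical examples.
state_machine = {
    "StartAt": "ProcessOrder",
    "States": {
        "ProcessOrder": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Next": "WaitForApproval",
        },
        "WaitForApproval": {
            "Type": "Task",
            # .waitForTaskToken pauses the execution until an approver
            # calls SendTaskSuccess with the issued task token.
            "Resource": "arn:aws:states:::lambda:invoke.waitForTaskToken",
            "Next": "FulfillOrder",
        },
        "FulfillOrder": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "End": True,
        },
    },
}
```

The same pattern extends to activities running on EC2, containers, or on-premises workers polling for tasks.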

 

 

NEW QUESTION 224

-  (Exam Topic 3)

A company has a Microsoft .NET application that runs on an on-premises Windows Server. The application stores data by using an Oracle Database Standard Edition server. The company is planning a migration to AWS and wants to minimize development changes while moving the application. The AWS application environment should be highly available.

Which combination of actions should the company take to meet these requirements? (Select TWO.)

 

A.  Refactor the application as serverless with AWS Lambda functions running .NET Core.

 

B.  Rehost the application in AWS Elastic Beanstalk with the .NET platform in a Multi-AZ deployment.

C.  Replatform the application to run on Amazon EC2 with the Amazon Linux Amazon Machine Image (AMI)

D.  Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Amazon DynamoDB in a Multi-AZ deployment

E.  Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Oracle on Amazon RDS in a Multi-AZ deployment

 

Answer: BE

 

 

NEW QUESTION 228

-  (Exam Topic 3)

A company wants to migrate its 1 PB on-premises image repository to AWS. The images will be used by a serverless web application. Images stored in the repository are rarely accessed, but they must be immediately available. Additionally, the images must be encrypted at rest and protected from accidental deletion. Which solution meets these requirements?

 

A.  Implement client-side encryption and store the images in an Amazon S3 Glacier vault. Set a vault lock to prevent accidental deletion.

B.  Store the images in an Amazon S3 bucket in the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Enable versioning, default encryption, and MFA Delete on the S3 bucket.

C.  Store the images in an Amazon FSx for Windows File Server file share. Configure the Amazon FSx file share to use an AWS Key Management Service (AWS KMS) customer master key (CMK) to encrypt the images in the file share. Use NTFS permission sets on the images to prevent accidental deletion.

D.  Store the images in an Amazon Elastic File System (Amazon EFS) file share in the Infrequent Access storage class. Configure the EFS file share to use an AWS Key Management Service (AWS KMS) customer master key (CMK) to encrypt the images in the file share. Use NFS permission sets on the images to prevent accidental deletion.

 

Answer: B

 

 

NEW QUESTION 232

-  (Exam Topic 3)

A research laboratory needs to process approximately 8 TB of data. The laboratory requires sub-millisecond latencies and a minimum throughput of 6 GBps for the storage subsystem. Hundreds of Amazon EC2 instances that run Amazon Linux will distribute and process the data.

Which solution will meet the performance requirements?

 

A.  Create an Amazon FSx for NetApp ONTAP file system. Set each volume's tiering policy to ALL. Import the raw data into the file system. Mount the file system on the EC2 instances.

B.  Create an Amazon S3 bucket to store the raw data. Create an Amazon FSx for Lustre file system that uses persistent SSD storage. Select the option to import data from and export data to Amazon S3. Mount the file system on the EC2 instances.

C.  Create an Amazon S3 bucket to store the raw data. Create an Amazon FSx for Lustre file system that uses persistent HDD storage. Select the option to import data from and export data to Amazon S3. Mount the file system on the EC2 instances.

D.  Create an Amazon FSx for NetApp ONTAP file system. Set each volume's tiering policy to NONE. Import the raw data into the file system. Mount the file system on the EC2 instances.

 

Answer: B

 

Explanation:

Amazon FSx for Lustre with persistent SSD storage provides sub-millisecond latencies and can be provisioned for an aggregate throughput of 6 GBps or more. It also integrates natively with Amazon S3, so the file system can import the raw data from, and export results to, the S3 bucket. Choosing persistent (rather than scratch) SSD storage ensures the data is stored durably and is not lost if the file system is interrupted.
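The throughput sizing can be sanity-checked with quick arithmetic. The per-TiB throughput tiers below are assumptions based on commonly documented FSx for Lustre persistent-SSD options, not values stated in this question:

```python
# Back-of-the-envelope check that an FSx for Lustre persistent-SSD
# file system sized at ~8 TiB can reach 6 GBps aggregate throughput.
# The per-TiB tiers are illustrative assumptions.
CAPACITY_TIB = 8
REQUIRED_MBPS = 6000  # 6 GBps aggregate

per_tib_tiers = [125, 250, 500, 1000]  # MB/s per TiB of storage

required_per_tib = REQUIRED_MBPS / CAPACITY_TIB  # 750 MB/s per TiB
suitable = [t for t in per_tib_tiers if t >= required_per_tib]

print(required_per_tib)  # 750.0
print(min(suitable))     # 1000 -> the tier to provision
```

With 8 TiB of storage, only the highest per-TiB tier clears the 750 MB/s-per-TiB requirement, which is why persistent SSD (option B) fits and persistent HDD (option C) does not.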

 

 

NEW QUESTION 234

-  (Exam Topic 3)

A company is designing a cloud communications platform that is driven by APIs. The application is hosted on Amazon EC2 instances behind a Network Load Balancer (NLB). The company uses Amazon API Gateway to provide external users with access to the application through APIs. The company wants to protect the platform against web exploits like SQL injection and also wants to detect and mitigate large, sophisticated DDoS attacks.

Which combination of solutions provides the MOST protection? (Select TWO.)

 

A.  Use AWS WAF to protect the NLB.

B.  Use AWS Shield Advanced with the NLB.

C.  Use AWS WAF to protect Amazon API Gateway.

D.  Use Amazon GuardDuty with AWS Shield Standard.

E.  Use AWS Shield Standard with Amazon API Gateway.

 

Answer: BC

 

Explanation:

AWS Shield Advanced provides expanded DDoS attack protection for your Amazon EC2 instances, Elastic Load Balancing load balancers, CloudFront distributions, Route 53 hosted zones, and AWS Global Accelerator standard accelerators.

AWS WAF is a web application firewall that lets you monitor the HTTP and HTTPS requests that are forwarded to your protected web application resources. You can protect the following resource types:

Amazon CloudFront distribution Amazon API Gateway REST API Application Load Balancer AWS AppSync GraphQL API Amazon Cognito user pool https://docs.aws.amazon.com/waf/latest/developerguide/what-is-aws-waf.html

 

 

NEW QUESTION 238

-  (Exam Topic 3)

A company has an Amazon S3 data lake that is governed by AWS Lake Formation. The company wants to create a visualization in Amazon QuickSight by joining the data in the data lake with operational data that is stored in an Amazon Aurora MySQL database. The company wants to enforce column-level authorization so that the company's marketing team can access only a subset of columns in the database.

Which solution will meet these requirements with the LEAST operational overhead?

 

A.  Use Amazon EMR to ingest the data directly from the database to the QuickSight SPICE engine. Include only the required columns.

 

B.  Use AWS Glue Studio to ingest the data from the database to the S3 data lake. Attach an IAM policy to the QuickSight users to enforce column-level access control. Use Amazon S3 as the data source in QuickSight.

C.  Use AWS Glue Elastic Views to create a materialized view for the database in Amazon S3. Create an S3 bucket policy to enforce column-level access control for the QuickSight users. Use Amazon S3 as the data source in QuickSight.

D.  Use a Lake Formation blueprint to ingest the data from the database to the S3 data lake. Use Lake Formation to enforce column-level access control for the QuickSight users. Use Amazon Athena as the data source in QuickSight.

 

Answer: D
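Lake Formation's column-level permissions mean the marketing team's queries only ever return the granted columns. The following minimal sketch, with hypothetical column names and plain Python standing in for the Lake Formation/Athena layer, illustrates that effect:

```python
# Illustrates the *effect* of column-level authorization: callers only
# see the columns they were granted. Column names are hypothetical.
def apply_column_grant(rows, granted_columns):
    """Project each row onto the set of columns the caller may see."""
    return [{k: v for k, v in row.items() if k in granted_columns}
            for row in rows]

rows = [
    {"customer_id": 1, "email": "a@example.com", "campaign": "spring"},
    {"customer_id": 2, "email": "b@example.com", "campaign": "summer"},
]

# Marketing is granted only customer_id and campaign, not email.
marketing_view = apply_column_grant(rows, {"customer_id", "campaign"})
print(marketing_view)
```

In the actual architecture this projection is enforced centrally by Lake Formation before results reach Athena and QuickSight, which is what keeps the operational overhead low.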

 

 

NEW QUESTION 243

-  (Exam Topic 3)

A company has an on-premises volume backup solution that has reached its end of life. The company wants to use AWS as part of a new backup solution and wants to maintain local access to all the data while it is backed up on AWS. The company wants to ensure that the data backed up on AWS is automatically and securely transferred.

Which solution meets these requirements?

 

A.  Use AWS Snowball to migrate data out of the on-premises solution to Amazon S3. Configure on-premises systems to mount the Snowball S3 endpoint to provide local access to the data.

B.  Use AWS Snowball Edge to migrate data out of the on-premises solution to Amazon S3.Use the Snowball Edge file interface to provide on-premises systems with local access to the data.

C.  Use AWS Storage Gateway and configure a cached volume gateway. Run the Storage Gateway software application on premises and configure a percentage of data to cache locally. Mount the gateway storage volumes to provide local access to the data.

D.  Use AWS Storage Gateway and configure a stored volume gateway. Run the Storage Gateway software application on premises and map the gateway storage volumes to on-premises storage. Mount the gateway storage volumes to provide local access to the data.

 

Answer: D

 

Explanation:

A stored volume gateway keeps the complete dataset on premises for low-latency access to all the data and asynchronously backs it up to AWS as Amazon EBS snapshots. A cached volume gateway keeps only frequently accessed data locally, which does not satisfy the requirement for local access to all the data.

 

 

NEW QUESTION 246

-  (Exam Topic 3)

A company runs a public three-tier web application in a VPC. The application runs on Amazon EC2 instances across multiple Availability Zones. The EC2 instances that run in private subnets need to communicate with a license server over the internet. The company needs a managed solution that minimizes operational maintenance.

Which solution meets these requirements?

 

A.  Provision a NAT instance in a public subnet. Modify each private subnet's route table with a default route that points to the NAT instance.

B.  Provision a NAT instance in a private subnet. Modify each private subnet's route table with a default route that points to the NAT instance.

C.  Provision a NAT gateway in a public subnet. Modify each private subnet's route table with a default route that points to the NAT gateway.

D.  Provision a NAT gateway in a private subnet. Modify each private subnet's route table with a default route that points to the NAT gateway.

 

Answer: C

 

 

NEW QUESTION 250

-  (Exam Topic 3)

A company has a three-tier application for image sharing. The application uses an Amazon EC2 instance for the front-end layer, another EC2 instance for the application layer, and a third EC2 instance for a MySQL database. A solutions architect must design a scalable and highly available solution that requires the least amount of change to the application.

Which solution meets these requirements?

 

A.  Use Amazon S3 to host the front-end layer. Use AWS Lambda functions for the application layer. Move the database to an Amazon DynamoDB table. Use Amazon S3 to store and serve users' images.

B.  Use load-balanced Multi-AZ AWS Elastic Beanstalk environments for the front-end layer and the application layer. Move the database to an Amazon RDS DB instance with multiple read replicas to serve users' images.

C.  Use Amazon S3 to host the front-end layer. Use a fleet of EC2 instances in an Auto Scaling group for the application layer. Move the database to a memory optimized instance type to store and serve users' images.

D.  Use load-balanced Multi-AZ AWS Elastic Beanstalk environments for the front-end layer and the application layer. Move the database to an Amazon RDS Multi-AZ DB instance. Use Amazon S3 to store and serve users' images.

 

Answer: D

 

Explanation:

For "highly available": Multi-AZ. For "least amount of change to the application": AWS Elastic Beanstalk automatically handles the deployment, from capacity provisioning and load balancing to auto scaling and application health monitoring.

 

 

NEW QUESTION 254

-  (Exam Topic 3)

A company wants to migrate an Oracle database to AWS. The database consists of a single table that contains millions of geographic information systems (GIS) images that are high resolution and are identified by a geographic code.

When a natural disaster occurs, tens of thousands of images get updated every few minutes. Each geographic code has a single image or row that is associated with it. The company wants a solution that is highly available and scalable during such events.

Which solution meets these requirements MOST cost-effectively?

 

A.  Store the images and geographic codes in a database table. Use Oracle running on an Amazon RDS Multi-AZ DB instance.

 

B.  Store the images in Amazon S3 buckets. Use Amazon DynamoDB with the geographic code as the key and the image S3 URL as the value.

C.  Store the images and geographic codes in an Amazon DynamoDB table. Configure DynamoDB Accelerator (DAX) during times of high load.

D.  Store the images in Amazon S3 buckets. Store geographic codes and image S3 URLs in a database table. Use Oracle running on an Amazon RDS Multi-AZ DB instance.

 

Answer: B

 

Explanation:

Storing the images in Amazon S3 and keeping only the geographic code and the image's S3 URL in Amazon DynamoDB is highly available and scales automatically to absorb tens of thousands of updates every few minutes, without paying for Oracle licensing or provisioned database capacity.

 

 

NEW QUESTION 259

-  (Exam Topic 3)

A company hosts a three-tier ecommerce application on a fleet of Amazon EC2 instances. The instances run in an Auto Scaling group behind an Application Load Balancer (ALB). All ecommerce data is stored in an Amazon RDS for MariaDB Multi-AZ DB instance.

The company wants to optimize customer session management during transactions. The application must store session data durably. Which solutions will meet these requirements? (Select TWO.)

 

A.  Turn on the sticky sessions feature (session affinity) on the ALB

B.  Use an Amazon DynamoDB table to store customer session information

C.  Deploy an Amazon Cognito user pool to manage user session information

D.  Deploy an Amazon ElastiCache for Redis cluster to store customer session information

E.  Use AWS Systems Manager Application Manager in the application to manage user session information

 

Answer: AB
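A DynamoDB-backed session store keeps session state durable and shared across the Auto Scaling group. The sketch below uses an in-memory dict as a stand-in for the DynamoDB table; the attribute names and TTL convention are illustrative, and a real implementation would use boto3 `put_item`/`get_item` with a TTL attribute:

```python
import time

# Sketch of durable session storage keyed by session_id. The dict is a
# stand-in for a DynamoDB table; attribute names are hypothetical.
table = {}

def put_session(session_id, data, ttl_seconds=1800):
    table[session_id] = {
        "session_id": session_id,
        "data": data,
        # DynamoDB-style TTL attribute: epoch seconds at expiry
        "expires_at": int(time.time()) + ttl_seconds,
    }

def get_session(session_id):
    item = table.get(session_id)
    if item is None or item["expires_at"] < time.time():
        return None  # expired or missing sessions are treated as absent
    return item["data"]

put_session("abc123", {"cart": ["sku-1", "sku-2"]})
print(get_session("abc123"))
```

Pairing this with ALB sticky sessions keeps each customer on a warm node while DynamoDB guarantees the session survives instance scale-in.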

 

 

NEW QUESTION 261

-  (Exam Topic 3)

A company has a web application that is based on Java and PHP. The company plans to move the application from on premises to AWS. The company needs the ability to test new site features frequently. The company also needs a highly available and managed solution that requires minimum operational overhead.

Which solution will meet these requirements?

 

A.  Create an Amazon S3 bucket. Enable static web hosting on the S3 bucket. Upload the static content to the S3 bucket. Use AWS Lambda to process all dynamic content.

B.  Deploy the web application to an AWS Elastic Beanstalk environment. Use URL swapping to switch between multiple Elastic Beanstalk environments for feature testing.

C.  Deploy the web application to Amazon EC2 instances that are configured with Java and PHP. Use Auto Scaling groups and an Application Load Balancer to manage the website's availability.

D.  Containerize the web application. Deploy the web application to Amazon EC2 instances. Use the AWS Load Balancer Controller to dynamically route traffic between containers that contain the new site features for testing.

 

Answer: B

 

 

NEW QUESTION 265

-  (Exam Topic 3)

A company is using Amazon CloudFront with its website. The company has enabled logging on the CloudFront distribution, and logs are saved in one of the company's Amazon S3 buckets. The company needs to perform advanced analyses on the logs and build visualizations.

What should a solutions architect do to meet these requirements?

 

A.  Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.

B.  Use standard SQL queries in Amazon Athena to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.

C.  Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with AWS Glue.

D.  Use standard SQL queries in Amazon DynamoDB to analyze the CloudFront logs in the S3 bucket. Visualize the results with Amazon QuickSight.

 

Answer: B

 

Explanation:

Amazon Athena runs standard SQL queries directly against the CloudFront logs in the S3 bucket, and Amazon QuickSight is the visualization service. Amazon DynamoDB does not support SQL queries over data in Amazon S3, and AWS Glue is an ETL service, not a visualization tool.

 

 

NEW QUESTION 270

-  (Exam Topic 3)

A company wants to migrate a Windows-based application from on premises to the AWS Cloud. The application has three tiers: an application tier, a business tier, and a database tier with Microsoft SQL Server. The company wants to use specific features of SQL Server such as native backups and Data Quality Services. The company also needs to share files between the tiers for processing.

How should a solutions architect design the architecture to meet these requirements?

 

A.  Host all three tiers on Amazon EC2 instances. Use Amazon FSx File Gateway for file sharing between the tiers.

B.  Host all three tiers on Amazon EC2 instances. Use Amazon FSx for Windows File Server for file sharing between the tiers.

C.  Host the application tier and the business tier on Amazon EC2 instances. Host the database tier on Amazon RDS. Use Amazon Elastic File System (Amazon EFS) for file sharing between the tiers.

D.  Host the application tier and the business tier on Amazon EC2 instances. Host the database tier on Amazon RDS. Use a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume for file sharing between the tiers.

 

Answer: B

 

 

NEW QUESTION 272

-  (Exam Topic 3)

A company needs to provide its employees with secure access to confidential and sensitive files. The company wants to ensure that the files can be accessed only by authorized users. The files must be downloaded securely to the employees' devices.

The files are stored in an on-premises Windows file server. However, due to an increase in remote usage, the file server is running out of capacity. Which solution will meet these requirements?

 

A.  Migrate the file server to an Amazon EC2 instance in a public subnet. Configure the security group to limit inbound traffic to the employees' IP addresses.

B.  Migrate the files to an Amazon FSx for Windows File Server file system. Integrate the Amazon FSx file system with the on-premises Active Directory. Configure AWS Client VPN.

C.  Migrate the files to Amazon S3, and create a private VPC endpoint. Create a signed URL to allow download.

D.  Migrate the files to Amazon S3, and create a public VPC endpoint. Allow employees to sign on with AWS IAM Identity Center (AWS Single Sign-On).

 

Answer: B

 

Explanation:

Amazon FSx for Windows File Server replaces the capacity-constrained on-premises file server, integration with the existing Active Directory preserves per-user authorization, and AWS Client VPN gives remote employees a secure path to download the files.

 

 

NEW QUESTION 275

-  (Exam Topic 3)

A data analytics company wants to migrate its batch processing system to AWS. The company receives thousands of small data files periodically during the day through FTP. An on-premises batch job processes the data files overnight. However, the batch job takes hours to finish running.

The company wants the AWS solution to process incoming data files as soon as possible with minimal changes to the FTP clients that send the files. The solution must delete the incoming data files after the files have been processed successfully. Processing for each file needs to take 3-8 minutes.

Which solution will meet these requirements in the MOST operationally efficient way?

 

A.  Use an Amazon EC2 instance that runs an FTP server to store incoming files as objects in Amazon S3 Glacier Flexible Retrieval. Configure a job queue in AWS Batch. Use Amazon EventBridge rules to invoke the job to process the objects nightly from S3 Glacier Flexible Retrieval. Delete the objects after the job has processed the objects.

B.  Use an Amazon EC2 instance that runs an FTP server to store incoming files on an Amazon Elastic Block Store (Amazon EBS) volume. Configure a job queue in AWS Batch. Use Amazon EventBridge rules to invoke the job to process the files nightly from the EBS volume. Delete the files after the job has processed the files.

C.  Use AWS Transfer Family to create an FTP server to store incoming files on an Amazon Elastic Block Store (Amazon EBS) volume. Configure a job queue in AWS Batch. Use an Amazon S3 event notification when each file arrives to invoke the job in AWS Batch. Delete the files after the job has processed the files.

D.  Use AWS Transfer Family to create an FTP server to store incoming files in Amazon S3 Standard. Create an AWS Lambda function to process the files and to delete the files after they are processed. Use an S3 event notification to invoke the Lambda function when the files arrive.

 

Answer: D

 

Explanation:

AWS Transfer Family provides a managed FTP endpoint with minimal changes for the FTP clients and stores the files in Amazon S3. An S3 event notification invokes the AWS Lambda function as soon as each file arrives, the 3-8 minute processing time fits within Lambda's maximum execution duration, and the function deletes each file after successful processing.
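The Transfer Family-to-S3 flow hinges on a Lambda function that processes each object when its S3 event notification arrives and deletes it only after success. Below is a minimal sketch; the S3 client is injected for testability, and `process_object` is a hypothetical placeholder for the real work:

```python
# Sketch of the S3-event-driven Lambda: process each new object, then
# delete it. The S3 client is injected so the handler can be exercised
# without AWS; process_object is a hypothetical placeholder.
def process_object(body) -> None:
    pass  # the 3-8 minutes of real processing would happen here

def handler(event, s3_client):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        obj = s3_client.get_object(Bucket=bucket, Key=key)
        process_object(obj["Body"])
        # Delete only after the object was processed successfully; an
        # exception above leaves the file in place for retry.
        s3_client.delete_object(Bucket=bucket, Key=key)
```

In a deployed function the client would come from `boto3.client("s3")`, and Lambda's built-in retry on failed invocations covers transient errors.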

 

 

NEW QUESTION 277

-  (Exam Topic 3)

A company wants to configure its Amazon CloudFront distribution to use SSL/TLS certificates. The company does not want to use the default domain name for the distribution. Instead, the company wants to use a different domain name for the distribution.

Which solution will deploy the certificate without incurring any additional costs?

 

A.  Request an Amazon issued private certificate from AWS Certificate Manager (ACM) in the us-east-1 Region

B.  Request an Amazon issued private certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.

C.  Request an Amazon issued public certificate from AWS Certificate Manager (ACM) in the us-east-1 Region.

D.  Request an Amazon issued public certificate from AWS Certificate Manager (ACM) in the us-west-1 Region.

 

Answer: C

 

Explanation:

Public certificates issued by AWS Certificate Manager are free, and a certificate used with CloudFront must be requested in the us-east-1 Region. Private certificates require AWS Private CA, which incurs additional cost.

 

 

NEW QUESTION 279

-  (Exam Topic 3)

An application that is hosted on Amazon EC2 instances needs to access an Amazon S3 bucket. Traffic must not traverse the internet. How should a solutions architect configure access to meet these requirements?

 

A.  Create a private hosted zone by using Amazon Route 53

B.  Set up a gateway VPC endpoint for Amazon S3 in the VPC

C.  Configure the EC2 instances to use a NAT gateway to access the S3 bucket

D.  Establish an AWS Site-to-Site VPN connection between the VPC and the S3 bucket

 

Answer: B

 

 

NEW QUESTION 283

-  (Exam Topic 3)

A gaming company is moving its public scoreboard from a data center to the AWS Cloud. The company uses Amazon EC2 Windows Server instances behind an Application Load Balancer to host its dynamic application. The company needs a highly available storage solution for the application. The application consists of static files and dynamic server-side code.

Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

 

A.  Store the static files on Amazon S3. Use Amazon CloudFront to cache objects at the edge.

B.  Store the static files on Amazon S3. Use Amazon ElastiCache to cache objects at the edge.

C.  Store the server-side code on Amazon Elastic File System (Amazon EFS). Mount the EFS volume on each EC2 instance to share the files.

D.  Store the server-side code on Amazon FSx for Windows File Server. Mount the FSx for Windows File Server volume on each EC2 instance to share the files.

E.  Store the server-side code on a General Purpose SSD (gp2) Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on each EC2 instance to share the files.

 

Answer: AD

 

Explanation:

Amazon S3 with CloudFront serves the static files with high availability, and Amazon FSx for Windows File Server provides a shared, highly available Windows-native file system for the server-side code across the EC2 instances. A gp2 EBS volume cannot be shared across multiple instances.

 

 

NEW QUESTION 285

 

-  (Exam Topic 3)

A company has deployed a database in Amazon RDS for MySQL. Due to increased transactions, the database support team is reporting slow reads against the DB instance and recommends adding a read replica.

Which combination of actions should a solutions architect take before implementing this change? (Choose two.)

 

A.  Enable binlog replication on the RDS primary node.

B.  Choose a failover priority for the source DB instance.

C.  Allow long-running transactions to complete on the source DB instance.

D.  Create a global table and specify the AWS Regions where the table will be available.

E.  Enable automatic backups on the source instance by setting the backup retention period to a value other than 0.

 

Answer: CE

 

Explanation:

"An active, long-running transaction can slow the process of creating the read replica. We recommend that you wait for long-running transactions to complete before creating a read replica. If you create multiple read replicas in parallel from the same source DB instance, Amazon RDS takes only one snapshot at the start of the first create action. When creating a read replica, there are a few things to consider. First, you must enable automatic backups on the source DB instance by setting the backup retention period to a value other than 0. This requirement also applies to a read replica that is the source DB instance for another read replica" https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html
