Amazon Exam, SAA-C02 Exam Dumps, SAA-C02 Exam Questions, SAA-C02 PDF Dumps, SAA-C02 VCE Dumps

[August-2022]New Braindump2go SAA-C02 PDF and SAA-C02 VCE Dumps[Q1047-Q1067]

August/2022 Latest Braindump2go SAA-C02 Exam Dumps with PDF and VCE Free Updated Today! Following are some new SAA-C02 Real Exam Questions!

QUESTION 1047
A company wants to analyze and troubleshoot Access Denied errors and unauthorized errors that are related to IAM permissions. The company has AWS CloudTrail turned on. Which solution will meet these requirements with the LEAST effort?

A. Use AWS Glue and write custom scripts to query CloudTrail logs for the errors.
B. Use AWS Batch and write custom scripts to query CloudTrail logs for the errors.
C. Search CloudTrail logs with Amazon Athena queries to identify the errors.
D. Search CloudTrail logs with Amazon QuickSight. Create a dashboard to identify the errors.

Answer: C
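
As a quick illustration of answer C, the Athena query might look like the sketch below. The table name `cloudtrail_logs` is an assumption; the real table name comes from however the CloudTrail logs were registered in Athena.

```python
# Sketch of an Athena SQL query that surfaces Access Denied and
# unauthorized-operation errors from CloudTrail logs.
# "cloudtrail_logs" is a placeholder table name (assumption), but the
# columns (useridentity.arn, eventsource, errorcode, ...) are the
# standard CloudTrail table schema.
ATHENA_QUERY = """
SELECT useridentity.arn AS principal,
       eventsource,
       eventname,
       errorcode,
       errormessage,
       eventtime
FROM cloudtrail_logs
WHERE errorcode IN ('AccessDenied', 'UnauthorizedOperation')
ORDER BY eventtime DESC
LIMIT 100
"""

if __name__ == "__main__":
    print(ATHENA_QUERY.strip())
```

Because Athena queries the logs in place, no ETL scripts or batch jobs are needed, which is what makes this the least-effort option.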

QUESTION 1048
A company is running several business applications in three separate VPCs within the us-east-1 Region. The applications must be able to communicate between VPCs. The applications also must be able to consistently send hundreds of gigabytes of data each day to a latency-sensitive application that runs in a single on-premises data center.
A solutions architect needs to design a network connectivity solution that maximizes cost-effectiveness.
Which solution meets these requirements?

A. Configure three AWS Site-to-Site VPN connections from the data center to AWS.
Establish connectivity by configuring one VPN connection for each VPC.
B. Launch a third-party virtual network appliance in each VPC.
Establish an IPsec VPN tunnel between the data center and each virtual appliance.
C. Set up three AWS Direct Connect connections from the data center to a Direct Connect gateway in us-east-1.
Establish connectivity by configuring each VPC to use one of the Direct Connect connections.
D. Set up one AWS Direct Connect connection from the data center to AWS.
Create a transit gateway, and attach each VPC to the transit gateway.
Establish connectivity between the Direct Connect connection and the transit gateway.

Answer: D

QUESTION 1049
A company wants to measure the effectiveness of its recent marketing campaigns. The company performs batch processing on .csv files of sales data and stores the results in an Amazon S3 bucket once every hour. The S3 bucket contains petabytes of objects. The company runs one-time queries in Amazon Athena to determine which products are most popular on a particular date for a particular region. Queries sometimes fail or take longer than expected to finish.
Which actions should a solutions architect take to improve the query performance and reliability? (Select TWO.)

A. Reduce the S3 object sizes to less than 128 MB.
B. Partition the data by date and region in Amazon S3.
C. Store the files as large, single objects in Amazon S3.
D. Use Amazon Kinesis Data Analytics to run the queries as part of the batch processing operation.
E. Use an AWS Glue extract, transform, and load (ETL) process to convert the .csv files into Apache Parquet format.

Answer: BE
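
Option B's partitioning relies on Hive-style key prefixes, which let Athena prune partitions instead of scanning the whole bucket. A minimal sketch of that key layout (the `sales-results` prefix and file name are illustrative, not from the question):

```python
# Sketch: build a Hive-style partitioned S3 key (date and region become
# partition columns Athena can filter on without scanning other prefixes).
def partitioned_key(date: str, region: str, filename: str) -> str:
    return f"sales-results/date={date}/region={region}/{filename}"

key = partitioned_key("2022-08-01", "us-east-1", "batch-0001.parquet")
print(key)  # sales-results/date=2022-08-01/region=us-east-1/batch-0001.parquet
```

Combined with option E (columnar Parquet instead of row-oriented .csv), a query for one date and region reads only a small fraction of the data, which improves both speed and reliability.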

QUESTION 1050
A company wants to establish connectivity between its on-premises data center and AWS for an existing workload. The workload runs on Amazon EC2 instances in two VPCs in different AWS Regions. The VPCs need to communicate with each other. The company needs to provide connectivity from its data center to both VPCs. The solution must support a bandwidth of 600 Mbps to the data center.
Which solution will meet these requirements?

A. Set up an AWS Site-to-Site VPN connection between the data center and one VPC.
Create a VPC peering connection between the VPCs.
B. Set up an AWS Site-to-Site VPN connection between the data center and each VPC.
Create a VPC peering connection between the VPCs.
C. Set up an AWS Direct Connect connection between the data center and one VPC.
Create a VPC peering connection between the VPCs.
D. Create a transit gateway. Attach both VPCs to the transit gateway.
Create an AWS Site-to-Site VPN tunnel to the transit gateway.

Answer: B

QUESTION 1051
A company uses NFS to store large video files in on-premises network attached storage. Each video file ranges in size from 1 MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth.
Which solution will meet these requirements?

A. Create an S3 bucket Create an IAM role that has permissions to write to the S3 bucket.
Use the AWS CLI to copy all files locally to the S3 bucket.
B. Create an AWS Snowball Edge job.
Receive a Snowball Edge device on premises.
Use the Snowball Edge client to transfer data to the device.
Return the device so that AWS can import the data into Amazon S3.
C. Deploy an S3 File Gateway on premises.
Create a public service endpoint to connect to the S3 File Gateway.
Create an S3 bucket.
Create a new NFS file share on the S3 File Gateway.
Point the new file share to the S3 bucket.
Transfer the data from the existing NFS file share to the S3 File Gateway.
D. Set up an AWS Direct Connect connection between the on-premises network and AWS.
Deploy an S3 File Gateway on premises.
Create a public virtual interface (VIF) to connect to the S3 File Gateway.
Create an S3 bucket.
Create a new NFS file share on the S3 File Gateway.
Point the new file share to the S3 bucket.
Transfer the data from the existing NFS file share to the S3 File Gateway.

Answer: B
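
A back-of-the-envelope calculation shows why the Snowball Edge option avoids the network entirely: even a dedicated link needs on the order of a week of saturated transfer for 70 TB. The 1 Gbps link speed below is an assumption for illustration:

```python
# Rough transfer-time estimate for moving 70 TB over the network,
# to contrast with shipping a Snowball Edge device.
TOTAL_BYTES = 70 * 10**12   # 70 TB of video files
LINK_BPS = 1 * 10**9        # assumed dedicated 1 Gbps link

seconds = TOTAL_BYTES * 8 / LINK_BPS
days = seconds / 86400
print(f"{days:.1f} days at a fully saturated 1 Gbps")  # ~6.5 days
```

In practice shared bandwidth and protocol overhead make the network copy even slower, while a Snowball Edge job consumes essentially no on-premises bandwidth.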

QUESTION 1052
A company needs to move data from an Amazon EC2 instance to an Amazon S3 bucket. The company must ensure that no API calls and no data are routed through public internet routes. Only the EC2 instance can have access to upload data to the S3 bucket.
Which solution will meet these requirements?

A. Create an interface VPC endpoint for Amazon S3 in the subnet where the EC2 instance is located.
Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
B. Create a gateway VPC endpoint for Amazon S3 in the Availability Zone where the EC2 instance is located.
Attach appropriate security groups to the endpoint.
Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
C. Run the nslookup tool from inside the EC2 instance to obtain the private IP address of the S3 bucket's service API endpoint.
Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket.
Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
D. Use the AWS-provided, publicly available ip-ranges.json file to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket.
Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.

Answer: B
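
The bucket-policy half of answer B can be sketched as below: deny every request that does not arrive through the gateway VPC endpoint. The endpoint ID, bucket name, and ARNs are placeholders, and a companion statement (not shown) would further restrict access to the instance's IAM role:

```python
import json

# Sketch of an S3 bucket resource policy that blocks any request not
# made through a specific gateway VPC endpoint. All identifiers below
# (bucket name, vpce ID) are illustrative placeholders.
BUCKET_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowOnlyViaVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::example-upload-bucket",
                "arn:aws:s3:::example-upload-bucket/*",
            ],
            # Requests that bypass the endpoint (e.g., public internet
            # paths) fail this condition and are denied.
            "Condition": {
                "StringNotEquals": {"aws:SourceVpce": "vpce-0123456789abcdef0"}
            },
        }
    ],
}

print(json.dumps(BUCKET_POLICY, indent=2))
```

Because a gateway endpoint routes S3 traffic over the AWS network via the VPC route table, no data or API calls traverse the public internet.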

QUESTION 1053
A company is building a new dynamic ordering website. The company wants to minimize server maintenance and patching. The website must be highly available and must scale read and write capacity as quickly as possible to meet changes in user demand.
Which solution will meet these requirements?

A. Host static content in Amazon S3.
Host dynamic content by using Amazon API Gateway and AWS Lambda.
Use Amazon DynamoDB with on-demand capacity for the database.
Configure Amazon CloudFront to deliver the website content.
B. Host static content in Amazon S3.
Host dynamic content by using Amazon API Gateway and AWS Lambda.
Use Amazon Aurora with Aurora Auto Scaling for the database.
Configure Amazon CloudFront to deliver the website content.
C. Host all the website content on Amazon EC2 instances.
Create an Auto Scaling group to scale the EC2 instances.
Use an Application Load Balancer to distribute traffic.
Use Amazon DynamoDB with provisioned write capacity for the database.
D. Host all the website content on Amazon EC2 instances.
Create an Auto Scaling group to scale the EC2 instances.
Use an Application Load Balancer to distribute traffic.
Use Amazon Aurora with Aurora Auto Scaling for the database.

Answer: A

QUESTION 1054
A solutions architect needs to implement a solution to reduce a company's storage costs. All the company's data is in the Amazon S3 Standard storage class. The company must keep all data for at least 25 years. Data from the most recent 2 years must be highly available and immediately retrievable. Which solution will meet these requirements?

A. Set up an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive immediately
B. Set up an S3 Lifecycle policy to transition objects to S3 Glacier Deep Archive after 2 years.
C. Use S3 Intelligent-Tiering. Activate the archiving option to ensure that data is archived in S3 Glacier Deep Archive.
D. Set up an S3 Lifecycle policy to transition objects to S3 One Zone-Infrequent Access (S3 One Zone-IA) immediately and to S3 Glacier Deep Archive after 2 years.

Answer: B
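
The lifecycle rule described in option B maps directly to the configuration dict that boto3's `put_bucket_lifecycle_configuration` expects. A minimal sketch (the rule ID is arbitrary):

```python
# Sketch of an S3 Lifecycle configuration: keep objects in S3 Standard
# (highly available, immediately retrievable) for their first 2 years,
# then transition them to S3 Glacier Deep Archive for long-term retention.
LIFECYCLE_CONFIG = {
    "Rules": [
        {
            "ID": "deep-archive-after-2-years",  # arbitrary rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},            # empty prefix = all objects
            "Transitions": [
                {"Days": 730, "StorageClass": "DEEP_ARCHIVE"}  # ~2 years
            ],
        }
    ]
}

print(LIFECYCLE_CONFIG["Rules"][0]["Transitions"][0])
```

With boto3 this dict would be passed as `s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=LIFECYCLE_CONFIG)`; Deep Archive then covers the remaining 23+ years of retention at the lowest storage cost.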

QUESTION 1055
A company needs to store data in Amazon S3 and must prevent the data from being changed. The company wants new objects that are uploaded to Amazon S3 to remain unchangeable for a nonspecific amount of time until the company decides to modify the objects.
Only specific users in the company’s AWS account can have the ability to delete the objects.
What should a solutions architect do to meet these requirements?

A. Create an S3 Glacier vault.
Apply a write-once, read-many (WORM) vault lock policy to the objects.
B. Create an S3 bucket with S3 Object Lock enabled. Enable versioning.
Set a retention period of 100 years.
Use governance mode as the S3 bucket's default retention mode for new objects.
C. Create an S3 bucket.
Use AWS CloudTrail to track any S3 API events that modify the objects.
Upon notification, restore the modified objects from any backup versions that the company has.
D. Create an S3 bucket with S3 Object Lock enabled. Enable versioning.
Add a legal hold to the objects.
Add the s3:PutObjectLegalHold permission to the IAM policies of users who need to delete the objects.

Answer: D
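
Answer D hinges on the legal-hold API: a hold has no expiry date and stays in place until someone with the `s3:PutObjectLegalHold` permission turns it off. A sketch of the boto3 parameters (bucket and key names are placeholders):

```python
# Sketch of the parameters for s3.put_object_legal_hold(**legal_hold_params)
# in an Object Lock-enabled, versioned bucket. Placing the hold makes the
# object version immutable for an indefinite period.
legal_hold_params = {
    "Bucket": "example-locked-bucket",   # placeholder bucket name
    "Key": "reports/2022-08.csv",        # placeholder object key
    # "ON" applies the hold; calling again with "OFF" releases it, which
    # requires the s3:PutObjectLegalHold permission.
    "LegalHold": {"Status": "ON"},
}

print(legal_hold_params["LegalHold"])
```

This matches the requirement of a "nonspecific amount of time" better than a fixed retention period, and limiting `s3:PutObjectLegalHold` to specific users controls who can ever release the hold and delete the objects.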

QUESTION 1056
A company is using a SQL database to store movie data that is publicly accessible. The database runs on an Amazon RDS Single-AZ DB instance. A script runs queries at random intervals each day to record the number of new movies that have been added to the database. The script must report a final total during business hours. The company's development team notices that the database performance is inadequate for development tasks when the script is running. A solutions architect must recommend a solution to resolve this issue.
Which solution will meet this requirement with the LEAST operational overhead?

A. Modify the DB instance to be a Multi-AZ deployment
B. Create a read replica of the database.
Configure the script to query only the read replica
C. Instruct the development team to manually export the entries in the database at the end of each day
D. Use Amazon ElastiCache to cache the common queries that the script runs against the database

Answer: B

QUESTION 1057
A company runs a global web application on Amazon EC2 instances behind an Application Load Balancer.
The application stores data in Amazon Aurora. The company needs to create a disaster recovery solution and can tolerate up to 30 minutes of downtime and potential data loss. The solution does not need to handle the load when the primary infrastructure is healthy.
What should a solutions architect do to meet these requirements?

A. Deploy the application with the required infrastructure elements in place.
Use Amazon Route 53 to configure active-passive failover.
Create an Aurora Replica in a second AWS Region.
B. Host a scaled-down deployment of the application in a second AWS Region.
Use Amazon Route 53 to configure active-active failover.
Create an Aurora Replica in the second Region.
C. Replicate the primary infrastructure in a second AWS Region.
Use Amazon Route 53 to configure active-active failover.
Create an Aurora database that is restored from the latest snapshot.
D. Back up data with AWS Backup.
Use the backup to create the required infrastructure in a second AWS Region.
Use Amazon Route 53 to configure active-passive failover.
Create an Aurora second primary instance in the second Region.

Answer: A

QUESTION 1058
An ecommerce company has an order-processing application that uses Amazon API Gateway and an AWS Lambda function. The application stores data in an Amazon Aurora PostgreSQL database. During a recent sales event, a sudden surge in customer orders occurred. Some customers experienced timeouts and the application did not process the orders of those customers. A solutions architect determined that the CPU utilization and memory utilization were high on the database because of a large number of open connections. The solutions architect needs to prevent the timeout errors while making the least possible changes to the application.
Which solution will meet these requirements?

A. Configure provisioned concurrency for the Lambda function.
Modify the database to be a global database in multiple AWS Regions
B. Use Amazon RDS Proxy to create a proxy for the database.
Modify the Lambda function to use the RDS Proxy endpoint instead of the database endpoint
C. Create a read replica for the database in a different AWS Region.
Use query string parameters in API Gateway to route traffic to the read replica.
D. Migrate the data from Aurora PostgreSQL to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS).
Modify the Lambda function to use the DynamoDB table.

Answer: B

QUESTION 1059
A company uses a popular content management system (CMS) for its corporate website. However, the required patching and maintenance are burdensome. The company is redesigning its website and wants a new solution. The website will be updated four times a year and does not need to have any dynamic content available. The solution must provide high scalability and enhanced security.
Which combination of changes will meet these requirements with the LEAST operational overhead? (Select TWO.)

A. Deploy an AWS WAF web ACL in front of the website to provide HTTPS functionality
B. Create and deploy an AWS Lambda function to manage and serve the website content
C. Create the new website and an Amazon S3 bucket.
Deploy the website on the S3 bucket with static website hosting enabled
D. Create the new website.
Deploy the website by using an Auto Scaling group of Amazon EC2 instances behind an Application Load Balancer.

Answer: AC

QUESTION 1060
A company is building a solution that will report Amazon EC2 Auto Scaling events across all the applications in an AWS account. The company needs to use a serverless solution to store the EC2 Auto Scaling status data in Amazon S3. The company then will use the data in Amazon S3 to provide near-real-time updates in a dashboard. The solution must not affect the speed of EC2 instance launches.
How should the company move the data to Amazon S3 to meet these requirements?

A. Use an Amazon CloudWatch metric stream to send the EC2 Auto Scaling status data to Amazon Kinesis Data Firehose.
Store the data in Amazon S3.
B. Launch an Amazon EMR cluster to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose.
Store the data in Amazon S3.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function on a schedule.
Configure the Lambda function to send the EC2 Auto Scaling status data directly to Amazon S3.
D. Use a bootstrap script during the launch of an EC2 instance to install Amazon Kinesis Agent.
Configure Kinesis Agent to collect the EC2 Auto Scaling status data and send the data to Amazon Kinesis Data Firehose.
Store the data in Amazon S3.

Answer: A
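
The metric-stream approach in option A is fully managed, so nothing runs in the instance launch path. A sketch of the boto3 parameters for `cloudwatch.put_metric_stream(**metric_stream_params)` (stream name and ARNs are placeholders):

```python
# Sketch of the parameters for a CloudWatch metric stream that filters to
# the EC2 Auto Scaling namespace and delivers to a Kinesis Data Firehose
# delivery stream, which in turn writes to S3. All names/ARNs below are
# illustrative placeholders.
metric_stream_params = {
    "Name": "autoscaling-status-stream",
    # Stream only Auto Scaling metrics, not every namespace in the account.
    "IncludeFilters": [{"Namespace": "AWS/AutoScaling"}],
    "FirehoseArn": "arn:aws:firehose:us-east-1:123456789012:deliverystream/asg-to-s3",
    "RoleArn": "arn:aws:iam::123456789012:role/metric-stream-role",
    "OutputFormat": "json",
}

print(metric_stream_params["IncludeFilters"])
```

Because CloudWatch pushes the data continuously, the dashboard gets near-real-time updates without any scheduled Lambda polling or EMR cluster to operate.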

QUESTION 1061
A company has a three-tier web application that is deployed on AWS. The web servers are deployed in a public subnet in a VPC. The application servers and database servers are deployed in private subnets in the same VPC. The company has deployed a third-party virtual firewall appliance from AWS Marketplace in an inspection VPC. The appliance is configured with an IP interface that can accept IP packets.
A solutions architect needs to integrate the web application with the appliance to inspect all traffic to the application before the traffic reaches the web server. Which solution will meet these requirements with the LEAST operational overhead?

A. Create a Network Load Balancer in the public subnet of the application's VPC to route the traffic to the appliance for packet inspection.
B. Create an Application Load Balancer in the public subnet of the application’s VPC to route the traffic to the appliance for packet inspection
C. Deploy a transit gateway in the inspection VPC.
Configure route tables to route the incoming packets through the transit gateway.
D. Deploy a Gateway Load Balancer in the inspection VPC.
Create a Gateway Load Balancer endpoint to receive the incoming packets and forward the packets to the appliance.

Answer: D

QUESTION 1062
A company's application is having performance issues. The application is stateful and needs to complete in-memory tasks on Amazon EC2 instances. The company used AWS CloudFormation to deploy infrastructure and used the M5 EC2 instance family. As traffic increased, the application performance degraded. Users are reporting delays when the users attempt to access the application.
Which solution will resolve these issues in the MOST operationally efficient way?

A. Replace the EC2 Instances with T3 EC2 instances that run in an Auto Scaling group.
Make the changes by using the AWS Management Console.
B. Modify the CloudFormation templates to run the EC2 instances in an Auto Scaling group.
Increase the desired capacity and the maximum capacity of the Auto Scaling group manually when an increase is necessary
C. Modify the CloudFormation templates.
Replace the EC2 instances with R5 EC2 instances.
Use Amazon CloudWatch built-in EC2 memory metrics to track the application performance for future capacity planning.
D. Modify the CloudFormation templates.
Replace the EC2 instances with R5 EC2 instances.
Deploy the Amazon CloudWatch agent on the EC2 instances to generate custom application latency metrics for future capacity planning.

Answer: D

QUESTION 1063
A company is building a containerized application on premises and decides to move the application to AWS. The application will have thousands of users soon after it is deployed. The company is unsure how to manage the deployment of containers at scale. The company needs to deploy the containerized application in a highly available architecture that minimizes operational overhead.
Which solution will meet these requirements?

A. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository.
Use an Amazon Elastic Container Service (Amazon ECS) cluster with the AWS Fargate launch type to run the containers.
Use target tracking to scale automatically based on demand.
B. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository.
Use an Amazon Elastic Container Service (Amazon ECS) cluster with the Amazon EC2 launch type to run the containers.
Use target tracking to scale automatically based on demand.
C. Store container images in a repository that runs on an Amazon EC2 instance.
Run the containers on EC2 instances that are spread across multiple Availability Zones.
Monitor the average CPU utilization in Amazon CloudWatch. Launch new EC2 instances as needed.
D. Create an Amazon EC2 Amazon Machine Image (AMI) that contains the container image. Launch EC2 instances in an Auto Scaling group across multiple Availability Zones.
Use an Amazon CloudWatch alarm to scale out EC2 instances when the average CPU utilization threshold is breached.

Answer: A

QUESTION 1064
A company has an event-driven application that invokes AWS Lambda functions up to 800 times each minute with varying runtimes. The Lambda functions access data that is stored in an Amazon Aurora MySQL DB cluster. The company is noticing connection timeouts as user activity increases. The database shows no signs of being overloaded. CPU, memory, and disk access metrics are all low.
Which solution will resolve this issue with the LEAST operational overhead?

A. Adjust the size of the Aurora MySQL nodes to handle more connections.
Configure retry logic in the Lambda functions for attempts to connect to the database.
B. Set up Amazon ElastiCache for Redis to cache commonly read items from the database.
Configure the Lambda functions to connect to ElastiCache for reads.
C. Add an Aurora Replica as a reader node.
Configure the Lambda functions to connect to the reader endpoint of the DB cluster rather than to the writer endpoint.
D. Use Amazon RDS Proxy to create a proxy.
Set the DB cluster as the target database.
Configure the Lambda functions to connect to the proxy rather than to the DB cluster.

Answer: D

QUESTION 1065
A company is planning to build a high performance computing (HPC) workload-as-a-service solution that is hosted on AWS.
A group of 16 Amazon EC2 Linux instances requires the lowest possible latency for node-to-node communication.
The instances also need a shared block device volume for high-performing storage.
Which solution will meet these requirements?

A. Use a cluster placement group.
Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.
B. Use a cluster placement group.
Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
C. Use a partition placement group.
Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
D. Use a spread placement group.
Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using Amazon EBS Multi-Attach.

Answer: A

QUESTION 1066
A company has an ecommerce checkout workflow that writes an order to a database and calls a service to process the payment. Users are experiencing timeouts during the checkout process.
When users resubmit the checkout form, multiple unique orders are created for the same desired transaction.
How should a solutions architect refactor this workflow to prevent the creation of multiple orders?

A. Configure the web application to send an order message to Amazon Kinesis Data Firehose.
Set the payment service to retrieve the message from Kinesis Data Firehose and process the order.
B. Create a rule in AWS CloudTrail to invoke an AWS Lambda function based on the logged application path request.
Use Lambda to query the database, call the payment service, and pass in the order information.
C. Store the order in the database.
Send a message that includes the order number to Amazon Simple Notification Service (Amazon SNS).
Set the payment service to poll Amazon SNS, retrieve the message, and process the order.
D. Store the order in the database.
Send a message that includes the order number to an Amazon Simple Queue Service (Amazon SQS) FIFO queue.
Set the payment service to retrieve the message and process the order.
Delete the message from the queue.

Answer: D
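
The key property behind answer D is FIFO deduplication: messages sent with the same `MessageDeduplicationId` (here, the order number) within the deduplication window are accepted only once, so a resubmitted checkout form cannot create a second order. The toy simulation below mimics that behavior locally; it is not the SQS API itself:

```python
# Toy in-memory simulation of SQS FIFO deduplication. With the real
# service, the payment service would call sqs.send_message(..., 
# MessageDeduplicationId=order_number) and SQS would drop duplicates.
def enqueue(queue, seen_ids, order_number):
    """Accept a message only if its dedup id has not been seen."""
    if order_number in seen_ids:   # duplicate within the dedup window
        return False               # dropped, no second order created
    seen_ids.add(order_number)
    queue.append(order_number)
    return True

queue, seen = [], set()
# The user times out and resubmits order-1001, then places order-1002.
for submitted in ["order-1001", "order-1001", "order-1002"]:
    enqueue(queue, seen, submitted)

print(queue)  # ['order-1001', 'order-1002'] - the resubmission was deduplicated
```

Deleting the message only after successful payment processing (as the option states) also means a failed attempt leaves the message on the queue to be retried, without ever duplicating it.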

QUESTION 1067
A company hosts a two-tier application on Amazon EC2 instances and Amazon RDS. The application’s demand varies based on the time of day. The load is minimal after work hours and on weekends. The EC2 instances run in an EC2 Auto Scaling group that is configured with a minimum of two instances and a maximum of five instances. The application must be available at all times, but the company is concerned about overall cost.
Which solution meets the availability requirement MOST cost-effectively?

A. Use all EC2 Spot Instances. Stop the RDS database when it is not in use.
B. Purchase EC2 Instance Savings Plans to cover five EC2 instances.
Purchase an RDS Reserved DB Instance
C. Purchase two EC2 Reserved Instances. Use up to three additional EC2 Spot Instances as needed.
Stop the RDS database when it is not in use.
D. Purchase EC2 Instance Savings Plans to cover two EC2 instances.
Use up to three additional EC2 On-Demand Instances as needed.
Purchase an RDS Reserved DB Instance.

Answer: D


Resources From:

1.2022 Latest Braindump2go SAA-C02 Exam Dumps (PDF & VCE) Free Share:
https://www.braindump2go.com/saa-c02.html

2.2022 Latest Braindump2go SAA-C02 PDF and SAA-C02 VCE Dumps Free Share:
https://drive.google.com/drive/folders/1_5IK3H_eM74C6AKwU7sKaLn1rrn8xTfm?usp=sharing

3.2021 Free Braindump2go SAA-C02 Exam Questions Download:
https://www.braindump2go.com/free-online-pdf/SAA-C02-PDF-Dumps(1047-1067).pdf

Free resources from Braindump2go. We are devoted to helping you 100% pass all exams!