AZ-204 Developing Solutions for Microsoft Azure Dumps
If you are looking for free AZ-204 dumps, then here we have some sample questions and answers available. You can prepare from our Microsoft AZ-204 exam question notes and practice with this test. Check our updated AZ-204 exam dumps below.
DumpsGroup is a top-class study material provider, and our inclusive range of AZ-204 real exam questions will be your key to passing the Microsoft Certified: Azure Developer Associate certification exam on the first attempt. We have excellent material covering almost all topics of the Microsoft AZ-204 exam. You can get this material in Microsoft AZ-204 PDF and AZ-204 practice test engine formats, designed to resemble the real exam questions. Free AZ-204 questions and answers and free Microsoft AZ-204 study material are available here to give you an idea of the quality and accuracy of our study material.
Sample Question 4
You are developing a Java application that uses Cassandra to store key and value data.
You plan to use a new Azure Cosmos DB resource and the Cassandra API in the
application. You create an Azure Active Directory (Azure AD) group named Cosmos DB
Creators to enable provisioning of Azure Cosmos accounts, databases, and containers.
The Azure AD group must not be able to access the keys that are required to access the
data.
You need to restrict access to the Azure AD group.
Which role-based access control should you use?
A. DocumentDB Accounts Contributor B. Cosmos Backup Operator C. Cosmos DB Operator D. Cosmos DB Account Reader
Answer: C
Explanation:
Azure Cosmos DB now provides a new RBAC role, Cosmos DB Operator. This new role
lets you provision Azure Cosmos accounts, databases, and containers, but can’t access
the keys that are required to access the data. This role is intended for use in scenarios
where the ability to grant access to Azure Active Directory service principals to manage
deployment operations for Cosmos DB is needed, including the account, database, and
container level.
Sample Question 5
You develop a solution that uses Azure Virtual Machines (VMs).
The VMs contain code that must access resources in an Azure resource group. You grant
the VM access to the resource group in Resource Manager.
You need to obtain an access token that uses the VMs system-assigned managed identity.
Which two actions should you perform? Each correct answer presents part of the solution.
A. Use PowerShell on a remote machine to make a request to the local managed identity for Azure resources endpoint. B. Use PowerShell on the VM to make a request to the local managed identity for Azure resources endpoint. C. From the code on the VM, call Azure Resource Manager using an access token. D. From the code on the VM, call Azure Resource Manager using a SAS token. E. From the code on the VM, generate a user delegation SAS token.
Answer: B,C
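The local endpoint referenced in options A and B is the Azure Instance Metadata Service (IMDS), which is reachable only from inside the VM. As a rough Python sketch (the question itself uses PowerShell), the token request can be assembled as below; the resource URI and API version shown are typical values and are assumptions, not taken from the question:

```python
import urllib.parse
import urllib.request

# The IMDS endpoint only answers from inside an Azure VM; this sketch
# builds the request without sending it.
IMDS_TOKEN_URL = "http://169.254.169.254/metadata/identity/oauth2/token"

def build_token_request(resource="https://management.azure.com/",
                        api_version="2018-02-01"):
    query = urllib.parse.urlencode({"api-version": api_version,
                                    "resource": resource})
    # The Metadata header is mandatory; IMDS rejects requests without it.
    return urllib.request.Request(f"{IMDS_TOKEN_URL}?{query}",
                                  headers={"Metadata": "true"})
```

On a VM, sending this request returns a JSON body whose access_token field can then be used as a bearer token against Azure Resource Manager (option C).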
Sample Question 6
You develop and add several functions to an Azure Function app that uses the latest
runtime host. The functions contain several REST API endpoints secured by using SSL.
The Azure Function app runs in a Consumption plan. You must send an alert when any of
the function endpoints are unavailable or responding too slowly.
You need to monitor the availability and responsiveness of the functions.
What should you do?
A. Create a URL ping test. B. Create a timer triggered function that calls TrackAvailability() and send the results to Application Insights. C. Create a timer triggered function that calls GetMetric("Request Size") and send the results to Application Insights. D. Add a new diagnostic setting to the Azure Function app. Enable the FunctionAppLogs and Send to Log Analytics options.
Answer: B
Explanation:
You can create an Azure Function with TrackAvailability() that will run periodically
according to the configuration given in the TimerTrigger function with your own business logic.
The results of this test will be sent to your Application Insights resource, where you will be
able to query for and alert on the availability results data. This allows you to create
customized tests similar to what you can do via Availability Monitoring in the portal.
Customized tests will allow you to write more complex availability tests than is possible
using the portal UI, monitor an app inside of your Azure VNET, change the endpoint
address, or create an availability test even if this feature is not available in your region.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/availability-azure-functions
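As a sketch of the logic described above (the real implementation would be a timer-triggered Azure Function calling TrackAvailability() from the Application Insights SDK), a minimal availability probe might look like this; the record name and fields are illustrative assumptions:

```python
import time
import urllib.request

def run_availability_test(url, timeout=30):
    """Call an endpoint and build an availability result, mirroring the
    kind of record TrackAvailability() sends to Application Insights."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            success = True
    except Exception:
        # Unreachable, slow, or erroring endpoints count as unavailable.
        success = False
    duration_ms = (time.monotonic() - start) * 1000
    return {"name": "endpoint-availability", "success": success,
            "duration_ms": duration_ms}
```

In the Azure Function, each such result would be passed to the telemetry client so that alerts can fire on the availability data.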
Sample Question 7
You are developing a web application by using the Azure SDK. The web application
accesses data in a zone-redundant BlockBlobStorage storage account.
The application must determine whether the data has changed since the application last
read the data. Update operations must use the latest data changes when writing data to the
storage.
You need to implement the update operations. Which values should you use? To answer,
select the appropriate option in the answer area.
NOTE: Each correct selection is worth one point.
Answer: See the explanation below.
Explanation:
See the answer in the image below.
Sample Question 8
You develop and deploy an Azure App Service web app named App1. You create a new
Azure Key Vault named Vault1. You import several API keys, passwords, certificates, and
cryptographic keys into Vault1. You need to grant App1 access to Vault1 and automatically
rotate credentials. Credentials must not be stored in code.
What should you do?
A. Enable App Service authentication for App1. Assign a custom RBAC role to Vault1. B. Add a TLS/SSL binding to App1. C. Assign a managed identity to App1. D. Upload a self-signed client certificate to Vault1. Update App1 to use the client certificate.
Answer: C
Explanation:
Assigning a managed identity to App1 allows the app to authenticate to Vault1 through
Azure AD without storing any credentials in code, and Azure manages and rotates the
identity's credentials automatically. A TLS/SSL binding does not grant access to Key Vault.
Sample Question 9
You are developing a web application that runs as an Azure Web App. The web application
stores data in Azure SQL Database and stores files in an Azure Storage account. The web
application makes HTTP requests to external services as part of normal operations.
The web application is instrumented with Application Insights. The external services are
OpenTelemetry compliant.
You need to ensure that the customer ID of the signed in user is associated with all
operations throughout the overall system.
What should you do?
A. Create a new SpanContext with the TraceFlags value set to the customer ID for the
signed in user. B. On the current SpanContext, set the TraceId to the customer ID for the signed in user. C. Add the customer ID for the signed in user to the CorrelationContext in the web application. D. Set the header Ocp-Apim-Trace to the customer ID for the signed in user.
Answer: C
Sample Question 10
An organization hosts web apps in Azure. The organization uses Azure Monitor. You
discover that configuration changes were made to some of the web apps. You need to
identify the configuration changes. Which Azure Monitor log should you review?
A. AppServiceEnvironmentPlatformLogs B. AppServiceAppLogs C. AppServiceAuditLogs D. AppServiceConsoleLogs
Answer: C
Sample Question 11
You develop Azure solutions. You must connect to a No-SQL globally-distributed database
by using the .NET API. You need to create an object to configure and execute requests in
the database.
Which code segment should you use?
A. new Container(EndpointUri, PrimaryKey); B. new Database(Endpoint, PrimaryKey); C. new CosmosClient(EndpointUri, PrimaryKey);
Answer: C
Sample Question 12
You have an existing Azure storage account that stores large volumes of data across
multiple containers. You need to copy all data from the existing storage account to a new
storage account. The copy process must meet the following requirements:
Automate data movement.
Minimize user input required to perform the operation.
Ensure that the data movement process is recoverable.
What should you use?
A. AzCopy B. Azure Storage Explorer C. Azure portal D. .NET Storage Client Library
Answer: A
Sample Question 13
You develop and deploy a web app to Azure App Service. The Azure App Service uses a
Basic plan in a single region.
You need to capture the telemetry.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Upgrade the Azure App Service plan to Premium. B. Enable remote debugging. C. Enable Profiler. D. Restart all apps in the App Service plan. E. Enable Snapshot Debugger. F. Enable Application Insights site extensions. G. Enable the Always On setting for the app service.
Answer: C,D,F
Sample Question 14
A development team is creating a new REST API. The API will store data in Azure Blob
storage. You plan to deploy the API to Azure App Service.
Developers must access the Azure Blob storage account to develop the API for the next
two months. The Azure Blob storage account must not be accessible by the developers
after the two-month time period.
You need to grant developers access to the Azure Blob storage account.
What should you do?
A. Generate a shared access signature (SAS) for the Azure Blob storage account and
provide the SAS to all developers. B. Create and apply a new lifecycle management policy to include a last accessed date
value. Apply the policy to the Azure Blob storage account. C. Provide all developers with the access key for the Azure Blob storage account. Update
the API to include the Coordinated Universal Time (UTC) timestamp for the request
header. D. Grant all developers access to the Azure Blob storage account by assigning role-based
access control (RBAC) roles.
Answer: A
Sample Question 15
You are developing an Azure App Service REST API. The API must be called by an Azure
App Service web app. The API must retrieve and update user profile information stored in
Azure Active Directory (Azure AD). You need to configure the API to make the updates.
Which two tools should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Microsoft Graph API B. Microsoft Authentication Library (MSAL) C. Azure API Management D. Microsoft Azure Security Center E. Microsoft Azure Key Vault SDK
Answer: A,B
Explanation:
The Microsoft Graph API reads and updates user profile data in Azure AD, and the
Microsoft Authentication Library (MSAL) acquires the access token required to call it.
Sample Question 16
You are developing an Azure function that connects to an Azure SQL Database instance.
The function is triggered by an Azure Storage queue. You receive reports of numerous
System.InvalidOperationExceptions with the following
message: “Timeout expired. The timeout period elapsed prior to obtaining a connection
from the pool. This may have occurred because all pooled connections were in use and
max pool size was reached.”
You need to prevent the exception.
What should you do?
A. In the host.json file, decrease the value of the batchSize option B. Convert the trigger to Azure Event Hub C. Convert the Azure Function to the Premium plan D. In the function.json file, change the value of the type option to queueScaling
Answer: A
Explanation:
Decreasing the batchSize value in the host.json file reduces the number of queue
messages that are processed concurrently, which lowers the number of simultaneous
database connections and prevents the connection pool from being exhausted.
Sample Question 17
Note: This question is part of a series of questions that present the same scenario.
Each question in the series contains a unique solution that might meet the stated
goals. Some question sets might have more than one correct solution, while others
might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a
result, these questions will not appear in the review screen.
You develop a software as a service (SaaS) offering to manage photographs. Users upload
photos to a web service which then stores the photos in Azure Storage Blob storage. The
storage account type is General-purpose V2. When photos are uploaded, they must be
processed to produce and save a mobile-friendly version of the image. The process to
produce a mobile-friendly version of the image must start in less than one minute.
You need to design the process that starts the photo processing.
Solution: Use the Azure Blob Storage change feed to trigger photo processing.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
The change feed is a log of changes that are organized into hourly segments but appended
to and updated every few minutes. These segments are created only when there are blob
change events that occur in that hour.
Instead, catch the upload event directly: move the photo processing to an Azure Function
triggered by the blob upload.
Sample Question 18
You develop and deploy an Azure App Service web app. The app is deployed to multiple
regions and uses Azure Traffic Manager. Application Insights is enabled for the app.
You need to analyze app uptime for each month.
Which two solutions will achieve the goal? Each correct answer presents a complete
solution. NOTE: Each correct selection is worth one point.
A. Application Insights alerts B. Application Insights web tests C. Azure Monitor logs D. Azure Monitor metrics
Answer: A,B
Sample Question 19
You manage a data processing application that receives requests from an Azure Storage
queue. You need to manage access to the queue. You have the following requirements:
Provide other applications access to the Azure queue.
Ensure that you can revoke access to the queue without having to regenerate the storage
account keys.
Specify access at the queue level and not at the storage account level.
Which type of shared access signature (SAS) should you use?
A. Service SAS with a stored access policy B. Account SAS C. User Delegation SAS D. Service SAS with ad hoc SAS
Answer: A Explanation:
A service SAS is secured with the storage account key. A service SAS delegates access to
a resource in only one of the Azure Storage services: Blob storage, Queue storage, Table
storage, or Azure Files.
Stored access policies give you the option to revoke permissions for a service SAS without
having to regenerate the storage account keys.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview
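The signature itself is an HMAC-SHA256 over a canonical string-to-sign, keyed with the decoded account key. A minimal Python sketch of that signing step (the exact string-to-sign format is defined by the Azure Storage SAS documentation; the input shown here is only a placeholder):

```python
import base64
import hashlib
import hmac

def sign_sas(string_to_sign: str, account_key_base64: str) -> str:
    """Compute the sig parameter of a service SAS: HMAC-SHA256 over the
    string-to-sign, keyed with the base64-decoded storage account key."""
    key = base64.b64decode(account_key_base64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()
```

Because the signature depends on the account key, regenerating the key invalidates every service SAS, which is why a stored access policy is the lighter-weight revocation mechanism.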
Sample Question 20
Note: This question is part of a series of questions that present the same scenario.
Each question in the series contains a unique solution that might meet the stated
goals. Some question sets might have more than one correct solution, while others
might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a
result, these questions will not appear in the review screen.
You are developing a website that will run as an Azure Web App. Users will authenticate by
using their Azure Active Directory (Azure AD) credentials. You plan to assign users one of
the following permission levels for the website: admin, normal, and reader. A user's Azure
AD group membership must be used to determine the permission level.
You need to configure authorization.
Solution:
Configure and use Integrated Windows Authentication in the website.
In the website, query Microsoft Graph API to load the groups to which the user is a
member.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Microsoft Graph is a RESTful web API that enables you to access Microsoft Cloud service
resources.
Instead, in the Azure AD application's manifest, set the value of the
groupMembershipClaims option to All. In the website, use the value of the groups claim
from the JWT for the user to determine permissions.
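A rough Python sketch of the recommended approach: decode the groups claim from the JWT and map group IDs to permission levels. The group IDs and role names are placeholders, and a real app must validate the token signature before trusting any claims:

```python
import base64
import json

# Hypothetical Azure AD group object IDs -> permission levels for this site.
GROUP_TO_ROLE = {"admin-group-id": "admin",
                 "normal-group-id": "normal",
                 "reader-group-id": "reader"}

def roles_from_jwt(token: str):
    """Decode the (unverified) JWT payload and map the groups claim to roles.
    Signature validation is deliberately omitted from this sketch."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return [GROUP_TO_ROLE[g] for g in claims.get("groups", [])
            if g in GROUP_TO_ROLE]
```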
Sample Question 21
You are developing a web application that uses Azure Cache for Redis. You anticipate that
the cache will frequently fill and you will need to evict keys. You must configure Azure
Cache for Redis based on the following predicted usage pattern: a small subset of
elements will be accessed much more often than the rest.
You need to configure the Azure Cache for Redis to optimize performance for the predicted
usage pattern.
Which two eviction policies will achieve the goal?
NOTE: Each correct selection is worth one point.
A. noeviction B. allkeys-lru C. volatile-lru D. allkeys-random E. volatile-ttl F. volatile-random
Answer: B,C
Explanation:
B: The allkeys-lru policy evict keys by trying to remove the less recently used (LRU) keys
first, in order to make space for the new data added. Use the allkeys-lru policy when you
expect a power-law distribution in the popularity of your requests, that is, you expect that a
subset of elements will be accessed far more often than the rest.
C: volatile-lru: evict keys by trying to remove the less recently used (LRU) keys first, but
only among keys that have an expire set, in order to make space for the new data added.
Note: The allkeys-lru policy is more memory efficient since there is no need to set an expire
for every key.
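The eviction behavior can be illustrated with a tiny in-process model (a conceptual sketch only, not how Azure Cache for Redis is implemented): the least recently used key is dropped when the cache is full, so a frequently read "hot" subset survives:

```python
from collections import OrderedDict

class LruCache:
    """Tiny model of allkeys-lru eviction: when the cache is over capacity,
    drop the least recently used key."""
    def __init__(self, capacity):
        self.capacity, self.data = capacity, OrderedDict()

    def get(self, key):
        self.data.move_to_end(key)   # mark as recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the LRU key
```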
Sample Question 22
You are creating an app that will use Cosmos DB for data storage. The app will process
batches of relational data.
You need to select an API for the app.
Which API should you use?
A. MongoDBAPI B. Table API C. SQL API D. Cassandra API
Answer: C
Sample Question 23
You are developing a solution that will use a multi-partitioned Azure Cosmos DB database.
You plan to use the latest Azure Cosmos DB SDK for development.
The solution must meet the following requirements:
Send insert and update operations to an Azure Blob storage account.
Process changes to all partitions immediately.
Allow parallelization of change processing.
You need to process the Azure Cosmos DB operations.
What are two possible ways to achieve this goal? Each correct answer presents a
complete solution.
NOTE: Each correct selection is worth one point.
A. Create an Azure App Service API and implement the change feed estimator of the SDK.
Scale the API by using multiple Azure App Service instances. B. Create a background job in an Azure Kubernetes Service and implement the change
feed feature of the SDK. C. Create an Azure Function to use a trigger for Azure Cosmos DB. Configure the trigger to
connect to the container. D. Create an Azure Function that uses a FeedIterator object that
processes the change feed by using the pull model on the container. Use a FeedRange
object to parallelize the processing of the change feed across multiple functions.
Answer: C,D
Explanation:
Azure Functions is the simplest option if you are just getting started using the change feed.
Due to its simplicity, it is also the recommended option for most change feed use cases.
When you create an Azure Functions trigger for Azure Cosmos DB, you select the
container to connect, and the Azure Function gets triggered whenever there is a change in
the container. Because Azure Functions uses the change feed processor behind the
scenes, it automatically parallelizes change processing across your container's partitions.
Note: You can work with the change feed using the following options: Azure Functions, the
change feed processor library, or the change feed pull model.
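Option D's FeedRange-based parallelism can be sketched conceptually: each worker consumes changes for one slice of the partition key space. The range names and processing logic below are illustrative, not the actual Cosmos DB SDK API:

```python
from concurrent.futures import ThreadPoolExecutor

# Each "feed range" stands in for a slice of the container's partition key
# space; separate workers can consume the change feed for different ranges
# in parallel, mirroring the FeedRange pull-model pattern.
def process_range(range_name, docs):
    return [f"{range_name}:{doc}" for doc in docs]

def process_change_feed(changes_by_range):
    with ThreadPoolExecutor(max_workers=len(changes_by_range)) as pool:
        futures = [pool.submit(process_range, name, docs)
                   for name, docs in changes_by_range.items()]
        return [item for f in futures for item in f.result()]
```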
Sample Question 24
You develop and deploy a web application to Azure App Service. The application accesses
data stored in an Azure Storage account. The account contains several containers with
several blobs with large amounts of data. You deploy all Azure resources to a single
region.
You need to move the Azure Storage account to the new region. You must copy all data to
the new region.
What should you do first?
A. Export the Azure Storage account Azure Resource Manager template B. Initiate a storage account failover C. Configure object replication for all blobs D. Use the AzCopy command line tool E. Create a new Azure Storage account in the current region F. Create a new subscription in the current region
Answer: A
Explanation:
To move a storage account, create a copy of your storage account in another region. Then,
move your data to that account by using AzCopy, or another tool of your choice, and finally,
delete the resources in the source region. To get started, export, and then modify a Resource Manager template.
Sample Question 25
You deploy an Azure App Service web app. You create an app registration for the app in
Azure Active Directory (Azure AD) and Twitter. The app must authenticate users and must
use SSL for all communications. The app must use Twitter as the identity provider. You
need to validate the Azure AD request in the app code. What should you validate?
A. HTTP response code B. ID token header C. ID token signature D. Tenant ID
Answer: C
Explanation:
To trust the claims in an ID token, the app code must validate the token's signature.
Sample Question 26
You develop and deploy an Azure Logic app that calls an Azure Function app. The Azure
Function app includes an OpenAPI (Swagger) definition and uses an Azure Blob storage
account. All resources are secured by using Azure Active Directory (Azure AD).
The Azure Logic app must securely access the Azure Blob storage account. Azure AD
resources must remain if the Azure Logic app is deleted.
You need to secure the Azure Logic app. What should you do?
A. Create an Azure AD custom role and assign role-based access controls. B. Create an Azure AD custom role and assign the role to the Azure Blob storage account. C. Create an Azure Key Vault and issue a client certificate. D. Create a user-assigned managed identity and assign role-based access controls. E. Create a system-assigned managed identity and issue a client certificate.
Answer: D Explanation:
To give a managed identity access to an Azure resource, you need to add a role to the
target resource for that identity.
Note: To easily authenticate access to other resources that are protected by Azure Active
Directory (Azure AD) without having to sign in and provide credentials or secrets, your logic
app can use a managed identity (formerly known as Managed Service Identity or MSI).
Azure manages this identity for you and helps secure your credentials because you don't
have to provide or rotate secrets.
If you set up your logic app to use the system-assigned identity or a manually created,
user-assigned identity, the function in your logic app can also use that same identity for
authentication.
Reference:
https://docs.microsoft.com/en-us/azure/logic-apps/create-managed-service-identity
https://docs.microsoft.com/en-us/azure/api-management/api-management-howto-mutualcertificates-for-cl...
Sample Question 27
Note: This question is part of a series of questions that present the same scenario.
Each question in the series contains a unique solution that might meet the stated
goals. Some question sets might have more than one correct solution, while others
might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a
result, these questions will not appear in the review screen.
You develop and deploy an Azure App Service API app to a Windows-hosted deployment
slot named Development. You create additional deployment slots named Testing and
Production. You enable auto swap on the Production deployment slot.
You need to ensure that scripts run and resources are available before a swap operation
occurs.
Solution: Update the app with a method named status check to run the scripts. Update the
app settings for the app. Set the WEBSITE_SWAP_WARMUP_PING_PATH and
WEBSITE_SWAP_WARMUP_PING_STATUSES with a path to the new method and
appropriate response codes.
Does the solution meet the goal?
A. Yes B. No
Answer: A
Explanation:
The WEBSITE_SWAP_WARMUP_PING_PATH and WEBSITE_SWAP_WARMUP_PING_STATUSES
app settings define a custom warm-up endpoint and the HTTP response codes that indicate
a successful warm-up, so the scripts run and resources are available before the swap
operation completes.
Sample Question 28
You develop Azure solutions.
A .NET application needs to receive a message each time an Azure virtual machine
finishes processing data. The messages must NOT persist after being processed by the
receiving application.
You need to implement the .NET object that will receive the messages.
Which object should you use?
A. QueueClient B. SubscriptionClient C. TopicClient D. CloudQueueClient
Answer: A
Sample Question 29
You develop a REST API. You implement a user delegation SAS token to communicate
with Azure Blob storage. The token is compromised. You need to revoke the token.
What are two possible ways to achieve this goal? Each correct answer presents a
complete solution. NOTE: Each correct selection is worth one point.
A. Revoke the delegation keys. B. Delete the stored access policy. C. Regenerate the account key. D. Remove the role assignment for the security principal.
Answer: A,D
Explanation:
A: To revoke a user delegation SAS from the Azure CLI, call the az storage account revoke-delegation-keys command. This command revokes all of the user delegation keys
associated with the specified storage account. Any shared access signatures associated
with those keys are invalidated.
D: A user delegation SAS is secured with Azure AD credentials rather than the account
key. Removing the RBAC role assignment for the security principal that requested the user
delegation key invalidates shared access signatures that rely on the permissions granted
by that assignment.
Note: Stored access policies apply only to a service SAS, so they cannot be used to revoke
a user delegation SAS, and regenerating the account key does not affect it.
Sample Question 30
You are developing an Azure messaging solution.
You need to ensure that the solution meets the following requirements:
• Provide transactional support.
• Provide duplicate detection.
• Store the messages for an unlimited period of time.
Which two technologies will meet the requirements? Each correct answer presents a
complete solution. NOTE: Each correct selection is worth one point.
A. Azure Service Bus Queue B. Azure Storage Queue C. Azure Service Bus Topic D. Azure Event Hub
Answer: A,C
Explanation:
Azure Service Bus queues and topics both support transactions and duplicate detection,
and their message time to live can be configured so that messages are retained for an
effectively unlimited period. Azure Storage queues and Azure Event Hubs do not meet all
three requirements.
Sample Question 31
You are developing a web application that uses the Microsoft identity platform to
authenticate users and resources. The web application calls several REST APIs. The APIs
require an access token from the Microsoft identity platform. You need to request a token.
Which three properties should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Application secret B. Redirect URI/URL C. Application name D. Supported account type E. Application ID
Answer: A,B,E
Sample Question 32
Note: This question is part of a series of questions that present the same scenario. Each
question in the series contains a unique solution that might meet the stated goals. Some
question sets might have more than one correct solution, while others might not have a
correct solution. After you answer a question in this section, you will NOT be able to return
to it. As a result, these questions will not appear in the review screen.
You are developing a website that will run as an Azure Web App. Users will authenticate by
using their Azure Active Directory (Azure AD) credentials. You plan to assign users one of
the following permission levels for the website: admin, normal, and reader. A user's Azure
AD group membership must be used to determine the permission level.
You need to configure authorization.
Solution:
• In the Azure AD application's manifest, set the value of the groupMembershipClaims
option to All.
• In the website, use the value of the groups claim from the JWT for the user to determine
permissions.
Does the solution meet the goal?
A. Yes B. No
Answer: A
Explanation:
To configure Manifest to include Group Claims in Auth Token
1. Go to Azure Active Directory to configure the Manifest. Click on Azure Active Directory,
and go to App registrations to find your application:
2. Click on your application (or search for it if you have a lot of apps) and edit the Manifest
by clicking on it.
3. Locate the “groupMembershipClaims” setting. Set its value to either “SecurityGroup” or
“All”. To help you decide which:
“SecurityGroup” - groups claim will contain the identifiers of all security groups of which the
user is a member.
“All” - groups claim will contain the identifiers of all security groups and all distribution lists
of which the user is a member.
Now your application will include group claims in your manifest and you can use this fact in
your code.
References:
https://blogs.msdn.microsoft.com/waws/2017/03/13/azure-app-service-authentication-aadgroups/
Sample Question 33
You need to audit the retail store sales transactions. What are two possible ways to
achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. Update the retail store location data upload process to include blob index tags. Create an Azure Function to process the blob index tags and filter by store location. B. Enable blob versioning for the storage account. Use an Azure Function to process a list of the blob versions per day. C. Process an Azure Storage blob inventory report by using an Azure Function. Create rule filters on the blob inventory report. D. Subscribe to blob storage events by using an Azure Function and Azure Event Grid. Filter the events by store location. E. Process the change feed logs of the Azure Blob storage account by using an Azure Function. Specify a time range for the change feed data.
Answer: D,E
Explanation:
Scenario: Audit store sale transaction information nightly to validate data, process sales
financials, and reconcile inventory.
"Process the change feed logs of the Azure Blob storage account by using an Azure
Function. Specify a time range for the change feed data": Change feed support is well
suited for scenarios that process data based on objects that have changed. For example,
applications can:
Store, audit, and analyze changes to your objects, over any period of time, for security,
compliance or intelligence for enterprise data management.
"Subscribe to blob storage events by using an Azure Function and Azure Event Grid. Filter
the events by store location": Azure Storage events allow applications to react to events,
such as the creation and deletion of blobs. It does so without the need for complicated
code or expensive and inefficient polling services. The best part is you only pay for what
you use.
Blob storage events are pushed using Azure Event Grid to subscribers such as Azure
Functions, Azure Logic Apps, or even to your own http listener. Event Grid provides reliable
event delivery to your applications through rich retry policies and dead-lettering.
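A minimal Python sketch of the Event Grid filtering idea: blob events carry an eventType and a subject that encodes the container and blob path, so a per-store folder convention makes filtering by store location straightforward. The container and folder names below are illustrative:

```python
# Filter Event Grid blob events down to one store's uploads; the subject
# of a blob event has the form
# /blobServices/default/containers/<container>/blobs/<path>.
def events_for_store(events, store_prefix):
    return [e for e in events
            if e.get("eventType") == "Microsoft.Storage.BlobCreated"
            and e.get("subject", "").startswith(store_prefix)]
```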
Sample Question 34
You need to implement a solution to resolve the retail store location data issue. Which
three Azure Blob features should you enable? Each correct answer presents part of the
solution. NOTE: Each correct selection is worth one point.
A. Immutability B. Snapshots C. Versioning D. Soft delete E. Object replication F. Change feed
Answer: C,D,F
Explanation:
Scenario: You must perform a point-in-time restoration of the retail store location data due
to an unexpected and accidental deletion of data.
Before you enable and configure point-in-time restore, enable its prerequisites for the
storage account: soft delete, change feed, and blob versioning.
Sample Question 35
You need to secure the Azure Functions to meet the security requirements. Which two
actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Store the RSA-HSM key in Azure Cosmos DB. Apply the built-in policies for customer-managed keys and allowed locations. B. Create a free tier Azure App Configuration instance with a new Azure AD service principal. C. Store the RSA-HSM key in Azure Key Vault with soft-delete and purge-protection features enabled. D. Store the RSA-HSM key in Azure Blob storage with an immutability policy applied to the container. E. Create a standard tier Azure App Configuration instance with an assigned Azure AD managed identity.
Answer: C,E
Explanation:
Scenario: All Azure Functions must centralize management and distribution of configuration
data for different environments and geographies, encrypted by using a company-provided
RSA-HSM key.
Microsoft Azure Key Vault is a cloud-hosted management service that allows users to
encrypt keys and small secrets by using keys that are protected by hardware security
modules (HSMs).
You need to create a managed identity for your application.
Sample Question 36
You need to access data from the user claim object in the e-commerce web app.
What should you do first?
A. Write custom code to make a Microsoft Graph API call from the e-commerce web app. B. Assign the Contributor RBAC role to the e-commerce web app by using the Resource Manager create role assignment API. C. Update the e-commerce web app to read the HTTP request header values. D. Using the Azure CLI, enable Cross-origin resource sharing (CORS) from the e-commerce checkout API to the e-commerce web app.
Answer: C
Explanation:
Methods to get the user identity and claims in a .NET Azure Functions app include:
ClaimsPrincipal from the request context: the ClaimsPrincipal object is available as part of the request context and can be extracted from HttpRequest.HttpContext.
User claims from the request headers: App Service passes user claims to the app by using special request headers.
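For illustration, App Service authentication passes the authenticated user's claims in the base64-encoded X-MS-CLIENT-PRINCIPAL request header. This sketch decodes a simulated header value (the payload shown is invented for the example):

```python
import base64
import json

def parse_client_principal(header_value: str) -> dict:
    """Decode the X-MS-CLIENT-PRINCIPAL header that App Service
    injects for authenticated requests (base64-encoded JSON)."""
    decoded = base64.b64decode(header_value).decode("utf-8")
    return json.loads(decoded)

# Simulated header value for illustration only
principal = {"auth_typ": "aad",
             "claims": [{"typ": "name", "val": "Jane Doe"}]}
header = base64.b64encode(json.dumps(principal).encode()).decode()

parsed = parse_client_principal(header)
name = next(c["val"] for c in parsed["claims"] if c["typ"] == "name")
print(name)  # Jane Doe
```

Reading this header is why answer C (read the HTTP request header values) works without any extra API calls.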
You need to ensure the security policies are met. What code do you add at line CS07 of ConfigureSSE.ps1?
A. –PermissionsToKeys create, encrypt, decrypt B. –PermissionsToCertificates create, encrypt, decrypt C. –PermissionsToCertificates wrapkey, unwrapkey, get D. –PermissionsToKeys wrapkey, unwrapkey, get
Answer: D
Explanation:
Scenario: All certificates and secrets used to secure data must be stored in Azure Key
Vault.
You must adhere to the principle of least privilege and provide privileges which are
essential to perform the intended function.
The Set-AzureRmKeyVaultAccessPolicy parameter -PermissionsToKeys specifies an array
of key operation permissions to grant to a user or service principal. The acceptable values
for this parameter are: decrypt, encrypt, unwrapKey, wrapKey, verify, sign, get, list, update,
create, import, delete, backup, restore, recover, and purge.
You need to resolve the log capacity issue. What should you do?
A. Create an Application Insights Telemetry Filter B. Change the minimum log level in the host.json file for the function C. Implement Application Insights Sampling D. Set a LogCategoryFilter during startup
Answer: C
Explanation:
Scenario, the log capacity issue: Developers report that the number of log messages in the
trace output for the processor is too high, resulting in lost log messages.
Sampling is a feature in Azure Application Insights. It is the recommended way to reduce
telemetry traffic and storage, while preserving a statistically correct analysis of application
data. The filter selects items that are related, so that you can navigate between items when
you are doing diagnostic investigations. When metric counts are presented to you in the
portal, they are renormalized to take account of the sampling, to minimize any effect on the
statistics.
Sampling reduces traffic and data costs, and helps you avoid throttling.
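Fixed-rate sampling keeps or drops all telemetry belonging to the same operation together, which is what preserves navigable, related items. A minimal sketch of that idea (this is not the Application Insights SDK; the hash-bucket scheme is illustrative):

```python
import hashlib

def should_sample(operation_id: str, sampling_percentage: float) -> bool:
    """Deterministically decide whether to keep a telemetry item.
    Hashing the operation id means every item from the same operation
    gets the same keep/drop decision, preserving correlated traces."""
    digest = hashlib.sha256(operation_id.encode()).digest()
    bucket = int.from_bytes(digest[:8], "big") / float(2**64)  # in [0.0, 1.0)
    return bucket < sampling_percentage / 100.0

# All telemetry sharing an operation id gets the same decision
assert should_sample("op-123", 50.0) == should_sample("op-123", 50.0)
```

At 100% everything is kept; lowering the percentage reduces volume while the count renormalization described above keeps portal metrics statistically correct.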
You need to resolve the capacity issue. What should you do?
A. Convert the trigger on the Azure Function to an Azure Blob storage trigger B. Ensure that the consumption plan is configured correctly to allow scaling C. Move the Azure Function to a dedicated App Service Plan D. Update the loop starting on line PC09 to process items in parallel
Answer: D
Explanation:
If you want to read the files in parallel, you cannot use forEach. Each of the async callback
function calls does return a promise. You can await the array of promises that you'll get
with Promise.all.
Scenario: Capacity issue: During busy periods, employees report long delays between the
time they upload the receipt and when it appears in the web application.
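The explanation above refers to JavaScript's Promise.all; the same fan-out-then-await pattern can be sketched in Python with asyncio.gather (function names here are illustrative, not code from the scenario):

```python
import asyncio

async def process_receipt(receipt_id: int) -> str:
    # Placeholder for per-item work (e.g., OCR on an uploaded receipt)
    await asyncio.sleep(0.01)
    return f"processed-{receipt_id}"

async def process_all(receipt_ids):
    # Start every task, then await them all together -- the asyncio
    # equivalent of awaiting Promise.all over an array of promises
    return await asyncio.gather(*(process_receipt(r) for r in receipt_ids))

results = asyncio.run(process_all([1, 2, 3]))
print(results)  # ['processed-1', 'processed-2', 'processed-3']
```

Because the items run concurrently instead of one per loop iteration, total latency approaches the slowest single item rather than the sum of all items.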
You need to ensure receipt processing occurs correctly. What should you do?
A. Use blob properties to prevent concurrency problems B. Use blob SnapshotTime to prevent concurrency problems C. Use blob metadata to prevent concurrency problems D. Use blob leases to prevent concurrency problems
Answer: D
Explanation:
A lease on a blob grants exclusive write and delete access to that blob for the lease
holder. While a lease is active, other clients cannot modify the blob without presenting the
lease ID, which makes leases a pessimistic concurrency mechanism that prevents
conflicting writes while a receipt is being processed.
Scenario: Processing is performed by an Azure Function that uses version 2 of the Azure
Function runtime. Once processing is completed, results are stored in Azure Blob Storage
and an Azure SQL database. Then, an email summary is sent to the user with a link to the
processing report. The link to the report must remain valid if the email is forwarded to another user.
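The lease idea can be illustrated with a toy in-memory model (this is not the Azure SDK; the class and method names here are invented for illustration):

```python
import uuid

class LeasedBlob:
    """Toy model of blob leasing: writes require the active lease id."""
    def __init__(self, content: str = ""):
        self.content = content
        self._lease_id = None

    def acquire_lease(self) -> str:
        if self._lease_id is not None:
            raise RuntimeError("lease already held")
        self._lease_id = str(uuid.uuid4())
        return self._lease_id

    def write(self, content: str, lease_id: str = None):
        # Writers without the active lease id are rejected
        if self._lease_id is not None and lease_id != self._lease_id:
            raise PermissionError("blob is leased; write rejected")
        self.content = content

    def release_lease(self, lease_id: str):
        if lease_id == self._lease_id:
            self._lease_id = None

blob = LeasedBlob("v1")
lease = blob.acquire_lease()
blob.write("v2", lease_id=lease)      # the lease holder can write
try:
    blob.write("v3")                  # a concurrent writer is blocked
except PermissionError as e:
    print(e)
blob.release_lease(lease)
```

In real Blob storage the lease also expires after a chosen duration, so a crashed processor cannot hold the blob forever.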
You need to deploy the CheckUserContent Azure function. The solution must meet the security and cost requirements. Which hosting model should you use?
A. Consumption plan B. Premium plan C. App Service plan
Answer: A
Sample Question 42
You need to investigate the http server log output to resolve the issue with the ContentUploadService. Which command should you use first?
A. az webapp log B. az ams live-output C. az monitor activity-log D. az container attach
Answer: C
Explanation:
Scenario: Users of the ContentUploadService report that they occasionally see HTTP 502 responses on specific pages. "502 bad gateway" and "503 service unavailable" are common errors in an app hosted in Azure App Service. Microsoft Azure publicizes each time there is a service interruption or performance degradation. The az monitor activity-log command manages activity logs.
Note: Troubleshooting can be divided into three distinct tasks, in sequential order:
Observe and monitor application behavior
Collect data
Mitigate the issue
Reference: https://docs.microsoft.com/en-us/cli/azure/monitor/activity-log
Sample Question 43
You need to resolve a notification latency issue.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. Set Always On to true. B. Ensure that the Azure Function is using an App Service plan. C. Set Always On to false. D. Ensure that the Azure Function is set to use a consumption plan.
Answer: A,B
Explanation:
Azure Functions can run on either a Consumption Plan or a dedicated App Service Plan. If you run in a
dedicated mode, you need to turn on the Always On setting for your Function App to run properly. The
Function runtime will go idle after a few minutes of inactivity, so only HTTP triggers will actually "wake up"
your functions. This is similar to how WebJobs must have Always On enabled.
Scenario: Notification latency: Users report that anomaly detection emails can sometimes arrive several
minutes after an anomaly is detected.
Anomaly detection service: You have an anomaly detection service that analyzes log information for
anomalies. It is implemented as an Azure Machine Learning model. The model is deployed as a web service. If
an anomaly is detected, an Azure Function that emails administrators is called by using an HTTP WebHook.
Note: This question is part of a series of questions that present the same scenario. Each question in the
series contains a unique solution that might meet the stated goals. Some question sets might have more
than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these
questions will not appear in the review screen.
You are developing an Azure solution to collect point-of-sale (POS) device data from 2,000 stores located
throughout the world. A single device can produce 2 megabytes (MB) of data every 24 hours. Each store
location has one to five devices that send data.
You must store the device data in Azure Blob storage. Device data must be correlated based on a device
identifier. Additional stores are expected to open in the future.
You need to implement a solution to receive the device data.
Solution: Provision an Azure Event Grid. Configure the machine identifier as the partition key and enable capture.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Event Grid is an event-routing service and does not offer partition keys or a capture feature. Instead, provision an Azure Event Hub, configure the device identifier as the partition key, and enable capture.
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an Azure Service application that processes queue data when it receives a message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
Queue size must not grow larger than 80 gigabytes (GB).
Use first-in-first-out (FIFO) ordering of messages.
Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .Net API to add a message to an Azure Storage Queue from the mobile application. Create an Azure Function App that uses an Azure Storage Queue trigger.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Create an Azure Function App that uses an Azure Service Bus Queue trigger. Azure Storage queues do not guarantee first-in-first-out ordering; Service Bus queues provide FIFO delivery when message sessions are used.
You have an application that includes an Azure Web app and several Azure Function apps. Application secrets
including connection strings and certificates are stored in Azure Key Vault.
Secrets must not be stored in the application or application runtime environment. Changes to Azure Active
Directory (Azure AD) must be minimized.
You need to design the approach to loading application secrets.
What should you do?
A. Create a single user-assigned Managed Identity with permission to access Key Vault and configure each
App Service to use that Managed Identity. B. Create a single Azure AD Service Principal with permission to access Key Vault and use a client secret
from within the App Services to access Key Vault C. Create a system assigned Managed Identity in each App Service with permission to access Key Vault. D. Create an Azure AD Service Principal with Permissions to access Key Vault for each App Service and
use a certificate from within the App Services to access Key Vault.
Answer: C
Explanation:
Use Key Vault references for App Service and Azure Functions.
Key Vault references currently only support system-assigned managed identities. User-assigned identities cannot be used.
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing a website that will run as an Azure Web App. Users will authenticate by using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal, and reader. A user's Azure AD group membership must be used to determine the permission level.
You need to configure authorization.
Solution: Create a new Azure AD application. In the application's manifest, define application roles that match the required permission levels for the application. Assign the appropriate Azure AD group to each role. In the website, use the value of the roles claim from the JWT for the user to determine permissions.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
To configure Manifest to include Group Claims in Auth Token
Go to Azure Active Directory to configure the Manifest. Click on Azure Active Directory, and go to
App registrations to find your application:
Click on your application (or search for it if you have a lot of apps) and edit the Manifest by clicking on
it.
Locate the “groupMembershipClaims” setting. Set its value to either “SecurityGroup” or “All”. To help
you decide which:
“SecurityGroup” - groups claim will contain the identifiers of all security groups of which the user is a
member.
“All” - groups claim will contain the identifiers of all security groups and all distribution lists of which
the user is a member
Now your application will include group claims in your manifest and you can use this fact in your code.
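A sketch of the relevant manifest fragment (other properties omitted; the appRoles entry and its GUID are placeholder values for illustration):

```json
{
  "groupMembershipClaims": "SecurityGroup",
  "appRoles": [
    {
      "allowedMemberTypes": [ "User" ],
      "displayName": "Admin",
      "id": "00000000-0000-0000-0000-000000000001",
      "isEnabled": true,
      "description": "Administrators of the website",
      "value": "admin"
    }
  ]
}
```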
You are developing an Azure Function App that processes images that are uploaded to an Azure Blob
container.
Images must be processed as quickly as possible after they are uploaded, and the solution must minimize
latency. You create code to process images when the Function App is triggered.
You need to configure the Function App.
What should you do?
A. Use an App Service plan. Configure the Function App to use an Azure Blob Storage input trigger. B. Use a Consumption plan. Configure the Function App to use an Azure Blob Storage trigger. C. Use a Consumption plan. Configure the Function App to use a Timer trigger. D. Use an App Service plan. Configure the Function App to use an Azure Blob Storage trigger. E. Use a Consumption plan. Configure the Function App to use an Azure Blob Storage input trigger.
Answer: D
Explanation:
On a Consumption plan, a Blob storage trigger can be delayed for several minutes while the runtime scans the container for new blobs. Running the Function App on an App Service plan with Always On enabled avoids that latency.
You are developing a medical records document management website. The website is used to store scanned
copies of patient intake forms. If the stored intake forms are downloaded from storage by a third party, the
content of the forms must not be compromised.
You need to store the intake forms according to the requirements.
Solution: Create an Azure Cosmos DB database with Storage Service Encryption enabled. Store the intake forms in the Azure Cosmos DB database.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Instead use an Azure Key Vault and public key encryption. Store the encrypted forms in Azure Blob storage.
Sample Question 51
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an Azure Service application that processes queue data when it receives a message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
Queue size must not grow larger than 80 gigabytes (GB).
Use first-in-first-out (FIFO) ordering of messages.
Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .Net API to add a message to an Azure Storage Queue from the mobile application. Create an Azure VM that is triggered from Azure Storage Queue events.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Don't use a VM, instead create an Azure Function App that uses an Azure Service Bus Queue trigger.
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data.
You need to ensure the app does not time out and processes the blob data.
Solution: Use the Durable Function async pattern to process the blob data.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Instead pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function and return an immediate HTTP success response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices include: Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the HTTP trigger payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work and return an immediate response.
Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices
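The defer-work-and-acknowledge pattern described above can be sketched with Python's standard library, with an in-process queue standing in for Service Bus and two plain functions standing in for the HTTP and queue triggers (all names here are illustrative):

```python
import queue
import threading

work_queue = queue.Queue()
results = []

def http_trigger(payload: dict) -> dict:
    """Accept the webhook payload, enqueue the real work,
    and return an immediate success response."""
    work_queue.put(payload)
    return {"status": 202, "body": "accepted"}

def queue_trigger_worker():
    """Long-running processing happens here, off the HTTP path."""
    while True:
        item = work_queue.get()
        if item is None:   # sentinel to stop the worker
            break
        results.append(f"processed {item['blob']}")

worker = threading.Thread(target=queue_trigger_worker)
worker.start()

response = http_trigger({"blob": "report.csv"})
print(response)  # {'status': 202, 'body': 'accepted'}

work_queue.put(None)
worker.join()
print(results)   # ['processed report.csv']
```

The caller gets its acknowledgment immediately; the slow blob processing completes later on the worker, which is exactly why the HTTP function no longer times out.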
Sample Question 54
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You develop and deploy an Azure App Service API app to a Windows-hosted deployment slot named Development. You create additional deployment slots named Testing and Production. You enable auto swap on the Production deployment slot.
You need to ensure that scripts run and resources are available before a swap operation occurs.
Solution: Disable auto swap. Update the app with a method named statuscheck to run the scripts. Re-enable auto swap and deploy the app to the Production slot.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Instead update the web.config file to include the applicationInitialization configuration element. Specify custom initialization actions to run the scripts.
Note: Some apps might require custom warm-up actions before the swap. The applicationInitialization configuration element in web.config lets you specify custom initialization actions. The swap operation waits for this custom warm-up to finish before swapping with the target slot. Here's a sample web.config fragment.
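A sample web.config fragment of this kind (the warm-up paths shown are placeholders):

```xml
<system.webServer>
  <applicationInitialization>
    <add initializationPage="/warmup" />
    <add initializationPage="/cache/load" />
  </applicationInitialization>
</system.webServer>
```

Each initializationPage is requested during warm-up, and the swap waits until those requests complete.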
You are developing an e-commerce solution that uses a microservice architecture.
You need to design a communication backplane for communicating transactional messages between various parts of the solution. Messages must be communicated in first-in-first-out (FIFO) order.
What should you use?
A. Azure Storage Queue B. Azure Event Hub C. Azure Service Bus D. Azure Event Grid
Answer: C
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an Azure Service application that processes queue data when it receives a message from a mobile application. Messages may not be sent to the service consistently.
You have the following requirements:
Queue size must not grow larger than 80 gigabytes (GB).
Use first-in-first-out (FIFO) ordering of messages.
Minimize Azure costs.
You need to implement the messaging solution.
Solution: Use the .Net API to add a message to an Azure Service Bus Queue from the mobile application. Create an Azure Windows VM that is triggered from Azure Service Bus Queue.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
A dedicated VM does not minimize Azure costs; instead create an Azure Function App that uses an Azure Service Bus Queue trigger.
You are developing a Java application that uses Cassandra to store key and value data. You plan to use a new Azure Cosmos DB resource and the Cassandra API in the application. You create an Azure Active Directory (Azure AD) group named Cosmos DB Creators to enable provisioning of Azure Cosmos accounts, databases, and containers.
The Azure AD group must not be able to access the keys that are required to access the data.
You need to restrict access to the Azure AD group. Which role-based access control should you use?
A. DocumentDB Accounts Contributor B. Cosmos Backup Operator C. Cosmos DB Operator D. Cosmos DB Account Reader
Answer: C
Explanation:
Azure Cosmos DB now provides a new RBAC role, Cosmos DB Operator. This new role lets you provision Azure Cosmos accounts, databases, and containers, but can't access the keys that are required to access the data. This role is intended for use in scenarios where the ability to grant access to Azure Active Directory service principals to manage deployment operations for Cosmos DB is needed, including the account, database, and containers.
Reference: https://azure.microsoft.com/en-us/updates/azure-cosmos-db-operator-role-for-role-based-access-control...
Sample Question 58
You are developing an ASP.NET Core website that uses Azure Front Door. The website is used to build custom weather data sets for researchers. Data sets are downloaded by users as Comma Separated Value (CSV) files. The data is refreshed every 10 hours.
Specific files must be purged from the Front Door cache based upon Response Header values.
You need to purge individual assets from the Front Door cache. Which type of cache purge should you use?
A. single path B. wildcard C. root domain
Answer: A
Explanation:
These formats are supported in the lists of paths to purge:
Single path purge: Purge individual assets by specifying the full path of the asset (without the protocol and domain), with the file extension, for example, /pictures/strasbourg.png
Wildcard purge: Asterisk (*) may be used as a wildcard. Purge all folders, subfolders, and files under an endpoint with /* in the path, or purge all subfolders and files under a specific folder by specifying the folder followed by /*, for example, /pictures/*.
Root domain purge: Purge the root of the endpoint with "/" in the path.
Reference: https://docs.microsoft.com/en-us/azure/frontdoor/front-door-caching
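As an illustrative CLI fragment (from the front-door Azure CLI extension; the profile and resource group names are placeholders and this requires a live subscription), a single-path purge looks like:

```shell
az network front-door purge-endpoint \
    --name myfrontdoor \
    --resource-group myresourcegroup \
    --content-paths '/pictures/strasbourg.png'
```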
Sample Question 59
You are developing a medical records document management website. The website is used to store scanned copies of patient intake forms. If the stored intake forms are downloaded from storage by a third party, the content of the forms must not be compromised.
You need to store the intake forms according to the requirements.
Solution: Store the intake forms as Azure Key Vault secrets.
Does the solution meet the goal?
A. Yes B. No
Answer: B
Explanation:
Key Vault secrets are intended for small configuration values such as passwords and connection strings (each secret is limited to 25 KB), not for document storage. Instead, encrypt the forms and store them in Azure Blob storage.
Sample Question 60
You develop an app that allows users to upload photos and videos to Azure storage. The app uses a storage REST API call to upload the media to a blob storage account named Account1. You have blob storage containers named Container1 and Container2.
Uploading of videos occurs on an irregular basis.
You need to copy specific blobs from Container1 to Container2 when a new video is uploaded.
What should you do?
A. Copy blobs to Container2 by using the Put Blob operation of the Blob Service REST
API B. Create an Event Grid topic that uses the Start-AzureStorageBlobCopy cmdlet C. Use AzCopy with the Snapshot switch to copy blobs to Container2 D. Download the blob to a virtual machine and then upload the blob to Container2
Answer: B
Explanation:
The Start-AzureStorageBlobCopy cmdlet starts to copy a blob.
Example 1: Copy a named blob
C:\PS>Start-AzureStorageBlobCopy -SrcBlob "ContosoPlanning2015" -DestContainer "ContosoArchives" -SrcContainer "ContosoUploads"
This command starts the copy operation of the blob named ContosoPlanning2015 from the container named ContosoUploads to the container named ContosoArchives.
Reference: https://docs.microsoft.com/en-us/powershell/module/azure.storage/start-azurestorageblobcopy?view=azur...
Exam Code: AZ-204
Exam Name: Developing Solutions for Microsoft Azure
Last Update: May 13, 2024
Questions: 383