Microsoft
Exam Questions AZ-204
Developing Solutions for Microsoft Azure
NEW QUESTION 1
- (Topic 8)
You are developing several Azure API Management (APIM) hosted APIs. The APIs have the following requirements:
• Require a subscription key to access all APIs.
• Include terms of use that subscribers must accept to use the APIs.
• Administrators must review and accept or reject subscription attempts.
• Limit the count of multiple simultaneous subscriptions. You need to implement the APIs.
What should you do?
Answer: B
NEW QUESTION 2
- (Topic 8)
You develop and deploy a web application to Azure App Service. The application accesses data stored in an Azure Storage account. The account contains several
containers with several blobs with large amounts of data. You deploy all Azure resources to a single region.
You need to move the Azure Storage account to the new region. You must copy all data to the new region.
What should you do first?
Answer: A
Explanation:
To move a storage account, create a copy of your storage account in another region. Then, move your data to that account by using AzCopy, or another tool of
your choice and finally, delete the resources in the source region.
To get started, export, and then modify a Resource Manager template.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-account-move?tabs=azure-portal
NEW QUESTION 3
- (Topic 8)
You are developing an Azure Durable Function to manage an online ordering process. The process must call an external API to gather product discount
information.
You need to implement Azure Durable Function.
Which Azure Durable Function types should you use? Each correct answer presents part of the solution
NOTE: Each correct selection is worth one point.
A. Orchestrator
B. Entity
C. Activity
D. Client
Answer: AB
Explanation:
https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-types-features-overview
NEW QUESTION 4
- (Topic 8)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the
stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You develop an HTTP triggered Azure Function app to process Azure Storage blob data. The app is triggered using an output binding on the blob.
The app continues to time out after four minutes. The app must process the blob data. You need to ensure the app does not time out and processes the blob data.
Solution: Update the functionTimeout property of the host.json project file to 10 minutes. Does the solution meet the goal?
A. Yes
B. No
Answer: B
Explanation:
Instead pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function and return an immediate HTTP success
response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices include:
Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger
function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the
HTTP trigger payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work and return an immediate response.
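The C# sketch below illustrates this pattern; the function name, queue name, and connection setting name are illustrative assumptions, not part of the question. The HTTP-triggered function forwards the payload to a Service Bus queue and returns immediately, while a separate queue-triggered function would perform the long-running blob processing.

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class EnqueueBlobWorkFunction
{
    [FunctionName("EnqueueBlobWork")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        [ServiceBus("blob-work-items", Connection = "ServiceBusConnection")] IAsyncCollector<string> queue)
    {
        // Forward the raw payload to the queue; a queue-triggered function does the long-running work.
        string payload = await new StreamReader(req.Body).ReadToEndAsync();
        await queue.AddAsync(payload);
        // Return 202 immediately so the HTTP call never approaches the timeout.
        return new AcceptedResult();
    }
}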
NEW QUESTION 5
- (Topic 8)
You develop and add several functions to an Azure Function app that uses the latest runtime host. The functions contain several REST API endpoints secured by
using SSL. The Azure Function app runs in a Consumption plan.
You must send an alert when any of the function endpoints are unavailable or responding too slowly.
You need to monitor the availability and responsiveness of the functions. What should you do?
Answer: B
Explanation:
You can create an Azure Function with TrackAvailability() that will run periodically
according to the configuration given in TimerTrigger function with your own business logic. The results of this test will be sent to your Application Insights resource,
where you will be able to query for and alert on the availability results data. This allows you to create customized tests similar to what you can do via Availability
Monitoring in the portal. Customized tests will allow you to write more complex availability tests than is possible using the portal UI, monitor an app inside of your
Azure VNET, change the endpoint address, or create an availability test even if this feature is not available in your region.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/availability-azure-functions
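A minimal C# sketch of such a test follows, assuming the Application Insights connection string is available to the Function app; the function name, schedule, test name, and endpoint URL are placeholders.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Azure.WebJobs;

public static class AvailabilityTest
{
    private static readonly HttpClient http = new HttpClient();
    private static readonly TelemetryClient telemetry =
        new TelemetryClient(TelemetryConfiguration.CreateDefault());

    [FunctionName("CheckApiAvailability")]
    public static async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer)
    {
        var availability = new AvailabilityTelemetry
        {
            Name = "functions-endpoint-test",
            RunLocation = "azure-function",
            Timestamp = DateTimeOffset.UtcNow
        };
        var start = DateTimeOffset.UtcNow;
        try
        {
            // Placeholder endpoint; replace with one of the secured REST API endpoints.
            var response = await http.GetAsync("https://contoso-func.azurewebsites.net/api/health");
            availability.Success = response.IsSuccessStatusCode;
        }
        catch (Exception ex)
        {
            availability.Success = false;
            availability.Message = ex.Message;
        }
        finally
        {
            availability.Duration = DateTimeOffset.UtcNow - start;
            // Results appear under Availability in the Application Insights resource and can drive alerts.
            telemetry.TrackAvailability(availability);
            telemetry.Flush();
        }
    }
}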
NEW QUESTION 6
HOTSPOT - (Topic 8)
You provisioned an Azure Cosmos DB for NoSQL account named account1 with the default consistency level.
You plan to configure the consistency level on a per-request basis. The level needs to be set to consistent prefix for read and write operations to account1.
You need to identify the resulting consistency level for read and write operations. Which levels should you configure? To answer, select the appropriate options in
the
answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
NEW QUESTION 7
- (Topic 8)
You are developing an ASP.NET Core website that uses Azure Front Door. The website is used to build custom weather data sets for researchers. Data sets are
downloaded by users as Comma Separated Value (CSV) files. The data is refreshed every 10 hours.
Specific files must be purged from the Front Door cache based upon Response Header values.
You need to purge individual assets from the Front Door cache. Which type of cache purge should you use?
A. single path
B. wildcard
C. root domain
Answer: A
Explanation:
These formats are supported in the lists of paths to purge:
• Single path purge: Purge individual assets by specifying the full path of the asset (without the protocol and domain), with the file extension, for example,
/pictures/strasbourg.png;
• Wildcard purge: Asterisk (*) may be used as a wildcard. Purge all folders, subfolders, and files under an endpoint with /* in the path or purge all subfolders and
files under a specific folder by specifying the folder followed by /*, for example,
/pictures/*.
• Root domain purge: Purge the root of the endpoint with "/" in the path.
Reference:
https://docs.microsoft.com/en-us/azure/frontdoor/front-door-caching
NEW QUESTION 8
DRAG DROP - (Topic 8)
You are developing Azure WebJobs.
You need to recommend a WebJob type for each scenario.
Which WebJob type should you recommend? To answer, drag the appropriate WebJob types to the correct scenarios. Each WebJob type may be used once,
more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Continuous
Continuous runs on all instances that the web app runs on. You can optionally restrict the WebJob to a single instance.
Box 2: Triggered
Triggered runs on a single instance that Azure selects for load balancing.
Box 3: Continuous
Continuous supports remote debugging.
Note:
The following table describes the differences between continuous and triggered WebJobs.
References:
https://docs.microsoft.com/en-us/azure/app-service/web-sites-create-web-jobs
NEW QUESTION 9
HOTSPOT - (Topic 8)
You develop and deploy the following staticwebapp.config.json file to the app_location value specified in the workflow file of an Azure Static Web app.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
NEW QUESTION 10
DRAG DROP - (Topic 8)
You are developing an ASP.NET Core website that can be used to manage photographs which are stored in Azure Blob Storage containers.
Users of the website authenticate by using their Azure Active Directory (Azure AD) credentials.
You implement role-based access control (RBAC) role permissions on the containers that store photographs. You assign users to RBAC roles.
You need to configure the website’s Azure AD Application so that user’s permissions can be used with the Azure Blob containers.
How should you configure the application? To answer, drag the appropriate setting to the correct location. Each setting can be used once, more than once, or not
at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: user_impersonation
Box 2: delegated Example:
* 1. Select the API permissions section
* 2. Click the Add a permission button and then: Ensure that the My APIs tab is selected
* 3. In the list of APIs, select the API TodoListService-aspnetcore.
* 4. In the Delegated permissions section, ensure that the right permissions are checked: user_impersonation.
* 5. Select the Add permissions button.
Box 3: delegated Example
* 1. Select the API permissions section
* 2. Click the Add a permission button and then, Ensure that the Microsoft APIs tab is selected
* 3. In the Commonly used Microsoft APIs section, click on Microsoft Graph
* 4. In the Delegated permissions section, ensure that the right permissions are checked: User.Read. Use the search box if necessary.
* 5. Select the Add permissions button
NEW QUESTION 10
DRAG DROP - (Topic 8)
You develop and deploy an Azure Logic App that calls an Azure Function app. The Azure Function App includes an OpenAPI (Swagger) definition and uses an
Azure Blob storage account. All resources are secured by using Azure Active Directory (Azure AD).
The Logic App must use Azure Monitor logs to record and store information about runtime data and events. The logs must be stored in the Azure Blob storage
account.
You need to set up Azure Monitor logs and collect diagnostics data for the Azure Logic App.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the
correct order.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Step 1: Create a Log Analytics workspace
Before you start, you need a Log Analytics workspace.
Step 2: Install the Logic Apps Management solution
To set up logging for your logic app, you can enable Log Analytics when you create your logic app, or you can install the Logic Apps Management solution in your
Log Analytics workspace for existing logic apps.
Step 3: Add a diagnostic setting to the Azure Logic App Set up Azure Monitor logs
• In the Azure portal, find and select your logic app.
• On your logic app menu, under Monitoring, select Diagnostic settings > Add diagnostic setting.
NEW QUESTION 15
- (Topic 8)
Your company is designing an application named App1 that will use data from Azure SQL Database. App1 will be accessed over the internet by many users.
You need to recommend a solution for improving the performance of App1. What should you include in the recommendation?
Answer: D
NEW QUESTION 16
HOTSPOT - (Topic 8)
You are developing a web application that makes calls to the Microsoft Graph API. You register the application in the Azure portal and upload a valid X509
certificate.
You create an appsettings.json file containing the certificate name, client identifier for the application, and the tenant identifier of the Azure Active Directory (Azure
AD). You create a method named ReadCertificate to return the X509 certificate by name.
You need to implement code that acquires a token by using the certificate.
How should you complete the code segment? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-app-configuration?tabs=dotnet#instantiate-the-confidential-client-application-with-a-client-certificate
https://docs.microsoft.com/en-us/azure/active-directory/develop/scenario-daemon-acquire-token?tabs=dotnet#acquiretokenforclient-api
NEW QUESTION 17
- (Topic 8)
You develop Azure Durable Functions to manage vehicle loans.
The loan process includes multiple actions that must be run in a specified order. One of the actions includes a customer credit check process, which may require
multiple days to process.
You need to implement Azure Durable Functions for the loan process. Which Azure Durable Functions type should you use?
A. orchestrator
B. client
C. activity
D. entity
Answer: A
NEW QUESTION 20
- (Topic 8)
You are developing an application that uses Azure Blob storage.
The application must read the transaction logs of all the changes that occur to the blobs and the blob metadata in the storage account for auditing purposes. The
changes must be in the order in which they occurred, include only create, update, delete, and copy operations and be retained for compliance reasons.
You need to process the transaction logs asynchronously. What should you do?
A. Process all Azure Blob storage events by using Azure Event Grid with a subscriber Azure Function app.
B. Enable the change feed on the storage account and process all changes for available events.
C. Process all Azure Storage Analytics logs for successful blob events.
D. Use the Azure Monitor HTTP Data Collector API and scan the request body for successful blob events.
Answer: B
Explanation:
Change feed support in Azure Blob Storage
The purpose of the change feed is to provide transaction logs of all the changes that occur to the blobs and the blob metadata in your storage account. The
change feed provides an ordered, guaranteed, durable, immutable, read-only log of these changes. Client applications can read these logs at any time, either in
streaming or in batch mode. The change feed enables you to build efficient and scalable solutions that process change events that occur in your Blob Storage
account at a low cost.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-change-feed
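A minimal C# sketch of reading the change feed with the Azure.Storage.Blobs.ChangeFeed package follows; the connection string variable name is an assumption.

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.ChangeFeed;

class ChangeFeedReader
{
    static async System.Threading.Tasks.Task Main()
    {
        var serviceClient = new BlobServiceClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION"));

        // The change feed client reads the ordered, immutable log that the
        // change feed feature writes to the $blobchangefeed container.
        BlobChangeFeedClient changeFeed = serviceClient.GetChangeFeedClient();

        await foreach (BlobChangeFeedEvent changeEvent in changeFeed.GetChangesAsync())
        {
            Console.WriteLine($"{changeEvent.EventTime:u} {changeEvent.EventType} {changeEvent.Subject}");
        }
    }
}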
NEW QUESTION 24
- (Topic 8)
You are developing a solution that will use a multi-partitioned Azure Cosmos DB database. You plan to use the latest Azure Cosmos DB SDK for development.
The solution must meet the following requirements:
• Send insert and update operations to an Azure Blob storage account.
A. Create an Azure App Service API and implement the change feed estimator of the SDK.
B. Scale the API by using multiple Azure App Service instances.
C. Create a background job in an Azure Kubernetes Service and implement the change feed feature of the SDK.
D. Create an Azure Function to use a trigger for Azure Cosmos DB.
E. Configure the trigger to connect to the container.
F. Create an Azure Function that uses a FeedIterator object that processes the change feed by using the pull model on the container.
G. Use a FeedRange object to parallelize the processing of the change feed across multiple functions.
Answer: CD
Explanation:
Azure Functions is the simplest option if you are just getting started using the change feed. Due to its simplicity, it is also the recommended option for most change
feed use cases. When you create an Azure Functions trigger for Azure Cosmos DB, you select the container to connect, and the Azure Function gets triggered
whenever there is a change in the container. Because Azure Functions uses the change feed processor behind the scenes, it automatically parallelizes change
processing across your container's partitions.
Note: You can work with change feed using the following options:
• Using change feed with Azure Functions
• Using change feed with change feed processor
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/read-change-feed
https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed-pull-model https://docs.microsoft.com/en-us/azure/cosmos-db/read-change-feed#azure-functions
https://docs.microsoft.com/en-us/azure/cosmos-db/change-feed-pull-model#using-feedrange-for-parallelization
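A minimal C# sketch of the Azure Functions trigger for Azure Cosmos DB described above follows, assuming the in-process Microsoft.Azure.WebJobs.Extensions.CosmosDB v3 extension; the database, container, and connection setting names are placeholders.

using System.Collections.Generic;
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ChangeFeedFunction
{
    [FunctionName("ProcessCosmosChanges")]
    public static void Run(
        [CosmosDBTrigger(
            databaseName: "appdb",
            collectionName: "items",
            ConnectionStringSetting = "CosmosDbConnection",
            LeaseCollectionName = "leases",
            CreateLeaseCollectionIfNotExists = true)] IReadOnlyList<Document> changes,
        ILogger log)
    {
        // Each invocation receives a batch of inserted or updated documents;
        // from here they could be written to the Azure Blob storage account.
        foreach (Document doc in changes)
        {
            log.LogInformation("Changed document id: {id}", doc.Id);
        }
    }
}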
NEW QUESTION 26
- (Topic 8)
Your company purchases an Azure subscription and plans to migrate several on-premises virtual machines to Azure. You need to design the infrastructure
required for the Azure virtual machines solution. What should you include in the design?
Answer: C
NEW QUESTION 28
HOTSPOT - (Topic 8)
You are developing a service where customers can report news events from a browser using Azure Web PubSub. The service is implemented as an Azure Function App
that uses the JSON WebSocket subprotocol to receive news events.
You need to implement the bindings for the Azure Function App.
How should you configure the binding? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
NEW QUESTION 29
- (Topic 8)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the
stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing a website that will run as an Azure Web App. Users will authenticate by using their Azure Active Directory (Azure AD) credentials.
You plan to assign users one of the following permission levels for the website: admin, normal, and reader. A user’s Azure AD group membership must be used to
determine the permission level. You need to configure authorization.
Solution:
• In the Azure AD application's manifest, set the value of the groupMembershipClaims option to All.
• In the website, use the value of the groups claim from the JWT for the user to determine permissions.
Does the solution meet the goal?
A. Yes
B. No
Answer: A
Explanation:
To configure Manifest to include Group Claims in Auth Token
* 1. Go to Azure Active Directory to configure the Manifest. Click on Azure Active Directory, and go to App registrations to find your application:
* 2. Click on your application (or search for it if you have a lot of apps) and edit the Manifest by clicking on it.
* 3. Locate the “groupMembershipClaims” setting. Set its value to either “SecurityGroup” or “All”. To help you decide which:
“SecurityGroup” - groups claim will contain the identifiers of all security groups of which the user is a member.
“All” - groups claim will contain the identifiers of all security groups and all distribution lists of which the user is a member
Now your application will include group claims in your manifest and you can use this fact in your code.
References:
https://blogs.msdn.microsoft.com/waws/2017/03/13/azure-app-service-authentication-aad-groups/
NEW QUESTION 32
- (Topic 8)
You are developing a mobile instant messaging app for a company. The mobile app must meet the following requirements:
• Support offline data sync.
• Update the latest messages during normal sync cycles. You need to implement Offline Data Sync.
Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Retrieve records from Offline Data Sync on every call to the PullAsync method.
B. Retrieve records from Offline Data Sync using an Incremental Sync.
C. Push records to Offline Data Sync using an Incremental Sync.
D. Return the updatedAt column from the Mobile Service Backend and implement sorting by using the column.
E. Return the updatedAt column from the Mobile Service Backend and implement sorting by the message id.
Answer: BE
Explanation:
B: Incremental Sync: the first parameter to the pull operation is a query name that is used only on the client. If you use a non-null query name, the Azure Mobile
SDK performs an incremental sync. Each time a pull operation returns a set of results, the latest updatedAt timestamp from that result set is stored in the SDK
local system tables. Subsequent pull operations retrieve only records after that timestamp.
E (not D): To use incremental sync, your server must return meaningful updatedAt values and must also support sorting by this field. However, since the SDK adds
its own sort on the updatedAt field, you cannot use a pull query that has its own orderBy clause.
References:
https://docs.microsoft.com/en-us/azure/app-service-mobile/app-service-mobile-offline-data-sync
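A minimal C# sketch of an incremental pull with the Azure Mobile Apps client SDK follows; the Message type, table, and query name are illustrative assumptions.

using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;
using Microsoft.WindowsAzure.MobileServices.Sync;

public class Message
{
    public string Id { get; set; }
    public string Text { get; set; }
}

public class SyncService
{
    private readonly IMobileServiceSyncTable<Message> messageTable;

    public SyncService(MobileServiceClient client)
    {
        messageTable = client.GetSyncTable<Message>();
    }

    public Task PullLatestMessagesAsync()
    {
        // Passing a non-null query name ("allMessages") makes the SDK perform an
        // incremental sync: it stores the latest updatedAt value it has seen and
        // only pulls records changed after that timestamp on later calls.
        return messageTable.PullAsync("allMessages", messageTable.CreateQuery());
    }
}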
NEW QUESTION 34
DRAG DROP - (Topic 8)
You are preparing to deploy an Azure virtual machine (VM) based application. The VMs that run the application have the following requirements:
• When a VM is provisioned the firewall must be automatically configured before it can access Azure resources.
• Supporting services must be installed by using an Azure PowerShell script that is stored in Azure Storage
You need to ensure that the requirements are met.
Which features should you use? To answer, drag the appropriate features to the correct requirements.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
NEW QUESTION 35
HOTSPOT - (Topic 8)
You are developing an application that runs in several customer Azure Kubernetes Service clusters. Within each cluster, a pod runs that collects performance data
to be analyzed later. A large amount of data is collected, so saving latency must be minimized.
The performance data must be stored so that pod restarts do not impact the stored data. Write latency should be minimized.
You need to configure blob storage.
How should you complete the YAML configuration? To answer, select the appropriate options in the answer area.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
NEW QUESTION 37
HOTSPOT - (Topic 8)
You are creating a CLI script that creates an Azure web app and related services in Azure App Service. The web app uses the following variables:
You need to automatically deploy code from GitHub to the newly created web app.
How should you complete the script? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: az appservice plan create
The az group create command returns a JSON result on success. We can then use the resource group to create an Azure App Service plan.
Box 2: az webapp create
Create a new web app.
Box 3: --plan $webappname
Use the App Service plan created in step 1.
Box 4: az webapp deployment
Continuous Delivery with GitHub. Example:
az webapp deployment source config --name firstsamplewebsite1 --resource-group websites--repo-url $gitrepo --branch master --git-token $token
Box 5: --repo-url $gitrepo --branch master --manual-integration
NEW QUESTION 42
- (Topic 8)
You are developing a SaaS application that stores data as key value pairs.
You must make multiple editions of the application available. In the lowest cost edition, the performance must be best-effort, and there is no regional failover.
In higher cost editions customers must be able to select guaranteed performance and support for multiple regions. Azure costs must be minimized.
Which Azure Cosmos DB API should you use for the application?
A. Core
B. MongoDB
C. Cassandra
D. Table API
Answer: D
NEW QUESTION 47
- (Topic 8)
You are developing an application that allows users to find musicians that are looking for work. The application must store information about musicians, the
instruments that they play, and other related data.
The application must also allow users to determine which musicians have played together, including groups of three or more musicians that have performed
together at a specific location.
Which Azure Cosmos DB API should you use for the application?
A. Core
B. MongoDB
C. Cassandra
D. Gremlin
Answer: B
NEW QUESTION 49
- (Topic 8)
You use Azure Table storage to store customer information for an application. The data contains customer details and is partitioned by last name. You need to
create a query that returns all customers with the last name Smith. Which code segment should you use?
Answer: C
Explanation:
Retrieve all entities in a partition. The following code example specifies a filter for entities where 'Smith' is the partition key. This example prints the fields of each
entity in the query results to the console.
Construct the query operation for all customer entities where PartitionKey="Smith".
TableQuery<CustomerEntity> query = new TableQuery<CustomerEntity>().Where(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal,
"Smith"));
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/table-storage-how-to-use-dotnet
NEW QUESTION 53
HOTSPOT - (Topic 8)
You are developing an application that uses Azure Storage Queues. You have the following code:
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: No
The QueueDescription.LockDuration property gets or sets the duration of a peek lock; that is, the amount of time that the message is locked for other receivers.
The maximum value for LockDuration is 5 minutes; the default value is 1 minute.
Box 2: Yes
You can peek at the message in the front of a queue without removing it from the queue by calling the PeekMessage method.
Box 3: Yes
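A minimal C# sketch of peeking messages follows, using the newer Azure.Storage.Queues client rather than the classic library named above; the queue name and connection variable are assumptions.

using System;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

class PeekSample
{
    static void Main()
    {
        var queue = new QueueClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION"), "orders");

        // PeekMessages reads up to the requested number of messages from the
        // front of the queue without dequeuing them or changing their visibility.
        PeekedMessage[] peeked = queue.PeekMessages(maxMessages: 5).Value;
        foreach (PeekedMessage message in peeked)
        {
            Console.WriteLine(message.MessageText);
        }
    }
}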
NEW QUESTION 57
DRAG DROP - (Topic 8)
You develop software solutions for a mobile delivery service. You are developing a mobile app that users can use to order from a restaurant in their area. The app
uses the following workflow:
* 1. A driver selects the restaurants for which they will deliver orders.
* 2. Orders are sent to all available drivers in an area.
* 3. Only orders for the selected restaurants will appear for the driver.
* 4. The first driver to accept an order removes it from the list of available orders.
You need to implement an Azure Service Bus solution.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the
correct order.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Create a single Service Bus Namespace
To begin using Service Bus messaging entities in Azure, you must first create a namespace with a name that is unique across Azure. A namespace provides a
scoping container for addressing Service Bus resources within your application.
Box 2: Create a Service Bus Topic for each restaurant for which a driver can receive messages.
Create topics.
Box 3: Create a Service Bus subscription for each restaurant for which a driver can receive orders.
NEW QUESTION 62
DRAG DROP - (Topic 8)
You are a developer for a Software as a Service (SaaS) company. You develop solutions that provide the ability to send notifications by using Azure Notification
Hubs.
You need to create sample code that customers can use as a reference for how to send raw notifications to Windows Push Notification Services (WNS) devices.
The sample code must not use external packages.
How should you complete the code segment? To answer, drag the appropriate code segments to the correct locations. Each code segment may be used once,
more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: windows
Example code:
var request = new HttpRequestMessage(method, $"{resourceUri}?api-version=2017-04");
request.Headers.Add("Authorization", createToken(resourceUri, KEY_NAME, KEY_VALUE));
request.Headers.Add("X-WNS-Type", "wns/raw");
request.Headers.Add("ServiceBusNotification-Format", "windows");
return request;
Box 2: application/octet-stream
Example code capable of sending a raw notification:
string resourceUri = $"https://{NH_NAMESPACE}.servicebus.windows.net/{HUB_NAME}/messages/";
using (var request = CreateHttpRequest(HttpMethod.Post, resourceUri))
{
request.Content = new StringContent(content, Encoding.UTF8, "application/octet-stream");
request.Content.Headers.ContentType.CharSet = string.Empty;
var httpClient = new HttpClient();
var response = await httpClient.SendAsync(request);
Console.WriteLine(response.StatusCode);
}
NEW QUESTION 66
- (Topic 8)
You develop an ASP.NET Core app that uses Azure App Configuration. You also create an App Configuration store containing 100 settings. The app must meet the
following requirements:
• Ensure the consistency of all configuration data when changes to individual settings occur.
• Handle configuration data changes dynamically without causing the application to restart.
• Reduce the overall number of requests made to App Configuration APIs.
You must implement dynamic configuration updates in the app.
What are two ways to achieve this goal? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Increase the App Configuration cache expiration from the default value.
B. Create and implement environment variables for each App Configuration store setting.
C. Decrease the App Configuration cache expiration from the default value.
D. Register all keys in the App Configuration store.
E. Set the refreshAll parameter of the Register method to false.
F. Create and register a sentinel key in the App Configuration store.
G. Set the refreshAll parameter of the Register method to true.
H. Create and configure Azure Key Vault.
I. Implement the Azure Key Vault configuration provider.
Answer: AE
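A minimal C# sketch of the selected approach (a longer cache expiration plus a sentinel key registered with refreshAll set to true) follows, assuming the Microsoft.Extensions.Configuration.AzureAppConfiguration provider in an ASP.NET Core app; the sentinel key name and the 30-minute expiration are examples only.

using System;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

builder.Configuration.AddAzureAppConfiguration(options =>
{
    options.Connect(builder.Configuration["AppConfig:ConnectionString"])
           .ConfigureRefresh(refresh =>
           {
               // Register a single sentinel key with refreshAll: true so that changing
               // it reloads every setting at once (consistency), and raise the cache
               // expiration so far fewer requests hit the App Configuration APIs.
               refresh.Register("Sentinel", refreshAll: true)
                      .SetCacheExpiration(TimeSpan.FromMinutes(30));
           });
});

builder.Services.AddAzureAppConfiguration(); // required for middleware-driven refresh

var app = builder.Build();
app.UseAzureAppConfiguration();
app.Run();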
NEW QUESTION 67
- (Topic 8)
You are designing a multi-tiered application that will be hosted on Azure virtual machines. The virtual machines will run Windows Server. Front-end servers will be
accessible from the Internet over port 443. The other servers will NOT be directly accessible over the internet
You need to recommend a solution to manage the virtual machines that meets the following requirements:
• Allows the virtual machine to be administered by using Remote Desktop.
• Minimizes the exposure of the virtual machines on the Internet. Which Azure service should you recommend?
A. Azure Bastion
B. Service Endpoint
C. Azure Private Link
D. Azure Front Door
Answer: C
NEW QUESTION 71
DRAG DROP - (Topic 8)
You have a web app named MainApp. You are developing a triggered App Service background task by using the WebJobs SDK. This task automatically invokes
function code whenever new data is received in a queue.
You need to configure the services.
Which service should you use for each scenario? To answer, drag the appropriate services to the correct scenarios. Each service may be used once, more than
once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: WebJobs
A WebJob is a simple way to set up a background job, which can process continuously or on a schedule. WebJobs differ from a cloud service as they give you
less fine-grained control over your processing environment, making them a truer PaaS service.
Box 2: Flow
NEW QUESTION 74
- (Topic 8)
You deploy an API to API Management
You must secure all operations on the API by using a client certificate.
You need to secure access to the backend service of the API by using client certificates. Which two security features can you use?
A. Azure AD token
B. Self-signed certificate
C. Certificate Authority (CA) certificate
D. Triple DES (3DES) cipher
E. Subscription key
Answer: BC
NEW QUESTION 77
HOTSPOT - (Topic 8)
You develop several Azure Event Grid topics to include hundreds of event types, such as billing, inventory, and shipping updates.
Events must be sent to a single endpoint for the Azure Functions app to process. The events must be filtered by event type before processing. You must have
authorization and authentication control to partition your tenants to receive the event data.
You need to configure Azure Event Grid.
Which configuration should you use? To answer, select the appropriate values in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
NEW QUESTION 81
HOTSPOT - (Topic 8)
All functions in the app meet the following requirements:
• Run until either a successful run or until 10 run attempts occur.
• Ensure that there are at least 20 seconds between attempts for up to 15 minutes. You need to configure the host.json file.
How should you complete the code segment? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
NEW QUESTION 86
HOTSPOT - (Topic 8)
You are developing an application that uses a premium block blob storage account. You are optimizing costs by automating Azure Blob Storage access tiers.
You apply the following policy rules to the storage account. You must determine the implications of applying the rules to the data. (Line numbers are included for
reference only.)
A. Mastered
B. Not Mastered
Answer: A
Explanation:
* 1. Yes
* 2. Yes
* 3. Yes
* 4. No
https://docs.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview?tabs=azure-portal#move-aging-data-to-a-cooler-tier
NEW QUESTION 87
DRAG DROP - (Topic 8)
You are developing a .NET Core model-view-controller (MVC) application hosted on Azure for a health care system that allows providers access to their
information.
You develop the following code:
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1:
Allow the ProviderAdmin and SysAdmin roles access to the Partner controller regardless of whether the user holds an editor claim of partner.
Box 2:
Limit access to the Manage action of the controller to users with an editor claim of partner who are also members of the SysAdmin role.
NEW QUESTION 88
- (Topic 8)
You are developing a web application that uses the Microsoft identity platform to authenticate users and resources. The web application calls several REST APIs.
The APIs require an access token from the Microsoft identity platform. You need to request a token.
Which three properties should you use? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Application name
B. Application secret
C. Application ID
D. Supported account type
E. Redirect URI/URL
Answer: ABC
NEW QUESTION 91
- (Topic 8)
You are developing a software solution for an autonomous transportation system. The solution uses large data sets and Azure Batch processing to simulate
navigation sets for entire fleets of vehicles.
You need to create compute nodes for the solution on Azure Batch. What should you do?
Answer: D
Explanation:
A Batch job is a logical grouping of one or more tasks. A job includes settings common to the tasks, such as priority and the pool to run tasks on. The app uses the
BatchClient.JobOperations.CreateJob method to create a job on your pool.
Note:
Step 1: Create a pool of compute nodes. When you create a pool, you specify the number of compute nodes for the pool, their size, and the operating system.
When each task in your job runs, it's assigned to execute on one of the nodes in your pool.
Step 2 : Create a job. A job manages a collection of tasks. You associate each job to a specific pool where that job's tasks will run.
Step 3: Add tasks to the job. Each task runs the application or script that you uploaded to process the data files it downloads from your Storage account. As each
task completes, it can upload its output to Azure Storage.
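A compact C# sketch of these three steps with the Microsoft.Azure.Batch SDK follows; the account credentials, pool and job identifiers, VM size, image reference, and task command line are placeholders that would need to be adjusted for a real deployment.

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Auth;

public static class BatchSetup
{
    public static async Task CreatePoolJobAndTasksAsync(string accountUrl, string accountName, string accountKey)
    {
        var credentials = new BatchSharedKeyCredentials(accountUrl, accountName, accountKey);
        using (BatchClient batchClient = BatchClient.Open(credentials))
        {
            // Step 1: create a pool of compute nodes (count, size, and OS image).
            CloudPool pool = batchClient.PoolOperations.CreatePool(
                poolId: "simulation-pool",
                virtualMachineSize: "STANDARD_D2_V3",
                virtualMachineConfiguration: new VirtualMachineConfiguration(
                    new ImageReference("UbuntuServer", "Canonical", "18.04-LTS"),
                    nodeAgentSkuId: "batch.node.ubuntu 18.04"),
                targetDedicatedComputeNodes: 4);
            await pool.CommitAsync();

            // Step 2: create a job associated with the pool.
            CloudJob job = batchClient.JobOperations.CreateJob(
                "simulation-job", new PoolInformation { PoolId = "simulation-pool" });
            await job.CommitAsync();

            // Step 3: add tasks to the job; each task runs one navigation simulation.
            var tasks = new List<CloudTask>
            {
                new CloudTask("task-fleet-1", "/bin/bash -c 'echo simulate fleet 1'")
            };
            await batchClient.JobOperations.AddTaskAsync("simulation-job", tasks);
        }
    }
}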
NEW QUESTION 95
HOTSPOT - (Topic 8)
You are developing a data storage solution for a social networking app.
The solution requires a mobile app that stores user information using Azure Table Storage. You need to develop code that can insert multiple sets of user
information.
How should you complete the code? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1, Box 2: TableBatchOperation
Create the batch operation:
TableBatchOperation op = new TableBatchOperation();
Box 3: ExecuteBatch
// Execute the batch operation.
table.ExecuteBatch(op);
Note: You can insert a batch of entities into a table in one write operation. Some other notes on batch operations:
You can perform updates, deletes, and inserts in the same single batch operation. A single batch operation can include up to 100 entities.
All entities in a single batch operation must have the same partition key.
While it is possible to perform a query as a batch operation, it must be the only operation in the batch.
References:
https://docs.microsoft.com/en-us/azure/cosmos-db/table-storage-how-to-use-dotnet
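The pieces above can be combined into the following short C# sketch using the legacy table SDK named in the explanation; the CustomerEntity type, table, and sample values are assumptions.

using Microsoft.Azure.Cosmos.Table;

public class CustomerEntity : TableEntity
{
    public CustomerEntity() { }
    public CustomerEntity(string lastName, string firstName)
        : base(lastName, firstName) { }
    public string Email { get; set; }
}

public static class BatchInsertSample
{
    public static void InsertUsers(CloudTable table)
    {
        // All entities in one batch must share the same partition key and the
        // batch may contain at most 100 operations.
        var batch = new TableBatchOperation();
        batch.Insert(new CustomerEntity("Smith", "Ben") { Email = "ben@contoso.com" });
        batch.Insert(new CustomerEntity("Smith", "Alice") { Email = "alice@contoso.com" });

        table.ExecuteBatch(batch);
    }
}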
NEW QUESTION 98
FILL IN THE BLANK - (Topic 8)
You are developing a web application by using the Azure SDK. The web application accesses data in a zone-redundant BlockBlobStorage storage account.
The application must determine whether the data has changed since the application last read the data. Update operations must use the latest data changes when
writing data to the storage account.
You need to implement the update operations.
Which values should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Register a new application using the Azure portal
• Sign in to the Azure portal using either a work or school account or a personal Microsoft account.
• If your account gives you access to more than one tenant, select your account in the upper right corner. Set your portal session to the Azure AD tenant that you
want.
• Search for and select Azure Active Directory. Under Manage, select App registrations.
• Select New registration. (Step 1)
• In Register an application, enter a meaningful application name to display to users.
• Specify who can use the application. Select the Azure AD instance. (Step 2)
• Under Redirect URI (optional), select the type of app you're building: Web or Public client (mobile & desktop). Then enter the redirect URI, or reply URL, for your
application. (Step 3)
• When finished, select Register.
Answer: AC
- (Topic 8)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the
solution meets the stated goals.
You are developing and deploying several ASP.Net web applications to Azure App Service. You plan to save session state information and HTML output. You
must use a storage mechanism with the following requirements:
•Share session state across all ASP.NET web applications
•Support controlled, concurrent access to the same session state data for multiple readers and a single writer
•Save full HTTP responses for concurrent requests You need to store the information.
Proposed Solution: Deploy and configure an Azure Database for PostgreSQL. Update the web applications.
Does the solution meet the goal?
A. Yes
B. No
Answer: B
Explanation:
Instead deploy and configure Azure Cache for Redis. Update the web applications. Reference:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/caching#managing-concurrency-in-a-cache
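A minimal C# sketch of the recommended approach with the StackExchange.Redis client follows; the connection variable and key names are illustrative only.

using System;
using StackExchange.Redis;

class CacheSample
{
    static void Main()
    {
        ConnectionMultiplexer redis = ConnectionMultiplexer.Connect(
            Environment.GetEnvironmentVariable("REDIS_CONNECTION"));
        IDatabase cache = redis.GetDatabase();

        // Shared session/output data is visible to every web app instance that
        // connects to the same cache, with optional expiration per key.
        cache.StringSet("session:42:state", "{\"cart\":3}", TimeSpan.FromMinutes(20));
        string state = cache.StringGet("session:42:state");
        Console.WriteLine(state);
    }
}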
A. Yes
B. No
Answer: B
Explanation:
The change feed is a log of changes that are organized into hourly segments but appended to and updated every few minutes. These segments are created only
when there are blob change events that occur in that hour.
Instead catch the triggered event, so move the photo processing to an Azure Function triggered from the blob upload.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-change-feed https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview
A. Yes
B. No
Answer: A
Explanation:
Large, long-running functions can cause unexpected timeout issues. General best practices include:
Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger
function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the
HTTP trigger payload into a queue to be processedby a queue trigger function. This approach lets you defer the actual work and return an immediate response.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Step 1: Get-AzSubscription
If you have multiple subscriptions, you might have to specify the one that was used to create your key vault. Enter the following to see the subscriptions for your
account: Get-AzSubscription
Step 2: Set-AzContext -SubscriptionId
To specify the subscription that's associated with the key vault you'll be logging, enter: Set-AzContext -SubscriptionId <subscriptionID>
Step 3: Get-AzStorageAccountKey You must get that storage account key.
Step 4: $secretvalue = ConvertTo-SecureString <storageAccountKey> -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName <vaultName> -Name <secretName> -SecretValue
$secretvalue
After retrieving your secret (in this case, your storage account key), you must convert that key to a secure string, and then create a secret with that value in your
key vault.
Step 5: Get-AzKeyVaultSecret
Next, get the URI for the secret you created. You'll need this URI in a later step to call the key vault and retrieve your secret. Run the following PowerShell
command and make note of the ID value, which is the secret's URI:
Get-AzKeyVaultSecret –VaultName <vaultName>
A. userFlowType
B. Status
C. invitedUser
D. resetRedemption
Answer: B
A. Mastered
B. Not Mastered
Answer: A
Explanation:
You can host native Linux applications in the cloud by using Azure Web Apps. To create a Web App for Containers, you must run Azure CLI commands that create
a group, then a service plan, and finally the web app itself.
Step 1: az group create
In the Cloud Shell, create a resource group with the az group create command.
Step 2: az appservice plan create
In the Cloud Shell, create an App Service plan in the resource group with the az appservice plan create command.
Step 3: az webapp create
In the Cloud Shell, create a web app in the myAppServicePlan App Service plan with the az webapp create command. Don't forget to replace with a unique app
name, and <docker-ID> with your Docker ID.
References:
https://docs.microsoft.com/mt-mt/azure/app-service/containers/quickstart-docker-go?view=sql-server-ver15
A. Yes
B. No
Answer: B
Explanation:
To configure Manifest to include Group Claims in Auth Token
• Go to Azure Active Directory to configure the Manifest. Click on Azure Active Directory, and go to App registrations to find your application:
• Click on your application (or search for it if you have a lot of apps) and edit the Manifest by clicking on it.
• Locate the “groupMembershipClaims” setting. Set its value to either “SecurityGroup” or “All”. To help you decide which:
• “SecurityGroup” - groups claim will contain the identifiers of all security groups of which the user is a member.
• “All” - groups claim will contain the identifiers of all security groups and all distribution lists of which the user is a member
Now your application will include group claims in your manifest and you can use this fact in your code.
Reference:
https://blogs.msdn.microsoft.com/waws/2017/03/13/azure-app-service-authentication-aad-groups/
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: No
ExpirationTime - The time that the message expires.
InsertionTime - The time that the message was added to the queue.
Box 2: Yes
maxDequeueCount - The number of times to try processing a message before moving it to the poison queue. Default value is 5.
Box 3: Yes
When there are multiple queue messages waiting, the queue trigger retrieves a batch of messages and invokes function instances concurrently to process them.
By default, the batch size is 16. When the number being processed gets down to 8, the runtime gets another batch and starts processing those messages. So the
maximum number of concurrent messages being processed per function on one virtual machine (VM) is 24.
Box 4: Yes References:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Step 1: Create a blank Logic app. Create and configure a Logic App.
Step 2: Add a logical app trigger that fires when one or more messages arrive in the queue. Configure the logic app trigger.
Under Triggers, select When one or more messages arrive in a queue (auto-complete).
Step 3: Add an action that reads IoT temperature data from the Service Bus queue.
Step 4: Add a condition that compares the temperature against the upper and lower thresholds.
Step 5: Add an action that sends an email to specified personnel if the temperature is outside of those thresholds
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Step 1: Create a Log Analytics workspace. First create the workspace.
Answer: AC
A. Mastered
B. Not Mastered
Answer: A
Explanation:
A. Yes
B. No
Answer: B
Explanation:
Instead deploy and configure Azure Cache for Redis. Update the web applications. Reference:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/caching#managing-concurrency-in-a-cache
- (Topic 8)
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the
stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You develop and deploy an Azure App Service API app to a Windows-hosted deployment slot named Development. You create additional deployment slots
namedTestingand Production. You enable auto swap on the Production deployment slot.
You need to ensure that scripts run and resources are available before a swap operation occurs.
Solution: Enable auto swap for the Testing slot. Deploy the app to the Testing slot. Does the solution meet the goal?
A. Yes
B. No
Answer: B
Explanation:
Instead update the web.config file to include the applicationInitialization configuration element. Specify custom initialization actions to run the scripts.
Note: Some apps might require custom warm-up actions before the swap. The applicationInitialization configuration element in web.config lets you specify custom
initialization actions. The swap operation waits for this custom warm-up to finish before swapping with the target slot. Here's a sample web.config fragment.
<system.webServer>
<applicationInitialization>
<add initializationPage="/" hostName="[app hostname]" />
<add initializationPage="/Home/About" hostName="[app hostname]" />
</applicationInitialization>
</system.webServer>
Reference:
https://docs.microsoft.com/en-us/azure/app-service/deploy-staging-slots#troubleshoot-swaps
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Step 1: WORKDIR /apps/ContosoApp
Step 2: COPY ./-
The Docker document must be created in the same folder where ContosoApp.dll and setupScript.ps1 are stored.
Step 3: EXPOSE ./ContosoApp/ /app/ContosoApp
Step 4: CMD powershell ./setupScript.ps1
ENTRYPOINT ["dotnet", "ContosoApp.dll"]
You need to create a Dockerfile document that meets the following requirements:
? Call setupScript.ps1 when the container is built.
? Run ContosoApp.dll when the container starts.
References:
https://docs.microsoft.com/en-us/azure/app-service/containers/tutorial-custom-docker-image
HOTSPOT - (Topic 8)
You are building a website that is used to review restaurants. The website will use an Azure CDN to improve performance and add functionality to requests.
You build and deploy a mobile app for Apple iPhones. Whenever a user accesses the website from an iPhone, the user must be redirected to the app store.
You need to implement an Azure CDN rule that ensures that iPhone users are redirected to the app store.
How should you complete the Azure Resource Manager template? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: iOS
Azure AD Conditional Access supports the following device platforms:
• Android
• iOS
• Windows Phone
• Windows
• macOS
Box 2: DeliveryRuleIsDeviceConditionParameters
The DeliveryRuleIsDeviceCondition defines the IsDevice condition for the delivery rule. parameters defines the parameters for the condition.
Box 3: HTTP_USER_AGENT
Box 4: DeliveryRuleRequestHeaderConditionParameters DeliveryRuleRequestHeaderCondition defines the RequestHeader condition for the delivery rule.
parameters defines the parameters for the condition.
Box 5: iOS
The Require approved client app requirement only supports the iOS and Android for device platform condition.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Step 1: Upgrade the storage account to GPv2
Object storage data tiering between hot, cool, and archive is supported in Blob Storage and General Purpose v2 (GPv2) accounts. General Purpose v1 (GPv1)
accounts don't support tiering.
You can easily convert your existing GPv1 or Blob Storage accounts to GPv2 accounts through the Azure portal.
Step 2: Copy the data to be archived to a Standard GPv2 storage account and then delete the data from the original storage account
Step 3: Change the storage account access tier from hot to cool Note: Hot - Optimized for storing data that is accessed frequently.
Cool - Optimized for storing data that is infrequently accessed and stored for at least 30
days.
Archive - Optimized for storing data that is rarely accessed and stored for at least 180 days with flexible latency requirements, on the order of hours.
Only the hot and cool access tiers can be set at the account level. The archive access tier can only be set at the blob level.
Answer: DF
A. Application secret
B. Redirect URI/URL
C. Application name
D. Supported account type
E. Application ID
Answer: ABE
A. Generate a shared access signature (SAS) for the Azure Blob storage account and provide the SAS to all developers.
B. Create and apply a new lifecycle management policy to include a last accessed date value.
C. Apply the policy to the Azure Blob storage account.
D. Provide all developers with the access key for the Azure Blob storage account.
E. Update the API to include the Coordinated Universal Time (UTC) timestamp for the request header.
F. Grant all developers access to the Azure Blob storage account by assigning role-based access control (RBAC) roles.
Answer: A
Explanation:
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Azure Resource
Box 2: Client cert
API Management allows to secure access to the back-end service of an API using client certificates.
References:
https://docs.microsoft.com/en-us/rest/api/apimanagement/apimanagementrest/azure-api-management-rest-api-backend-entity
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Answer: B
Explanation:
The Blob storage trigger starts a function when a new or updated blob is detected. The blob contents are provided as input to the function.
The Consumption plan limits a function app on one virtual machine (VM) to 1.5 GB of memory.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-trigger
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Soft delete
When soft-delete is enabled, resources marked as deleted are retained for a specified period (90 days by default). The service further provides a
mechanism for recovering the deleted object, essentially undoing the deletion.
Box 2: Purge protection
Purge protection is an optional Key Vault behavior and is not enabled by default. Purge protection can only be enabled once soft-delete is enabled.
When purge protection is on, a vault or an object in the deleted state cannot be purged until the retention period has passed. Soft-deleted vaults and objects can
still be recovered, ensuring that the retention policy will be followed.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Set-AzureRmRoleDefinition Input-File C:\SupportRole.json
The Set-AzureRmRoleDefinition cmdlet updates an existing custom role in Azure Role- Based Access Control. Provide the updated role definition as an input to
the command as a JSON file or a PSRoleDefinition object.
The role definition for the updated custom role MUST contain the Id and all other required properties of the role even if they are not updated: DisplayName,
Description, Actions, AssignableScope
Box 2: "*/read*."* Microsoft.Support/*" Microsoft.Support/* Create and manage support tickets
"Microsoft.Support" role definition azure
A. Yes
B. No
Answer: B
Explanation:
Instead, in the Azure AD application's manifest, set the value of the groupMembershipClaims option to All.
References:
https://blogs.msdn.microsoft.com/waws/2017/03/13/azure-app-service-authentication-aad-groups/
For each of the following statements, select Yes if the statement is true. Otherwise, select No,
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: No
The AzScheduledQueryRuleSource is Heartbeat, not CPU.
Box 2: Yes
The AzScheduledQueryRuleSource is Heartbeat!
Note: New-AzScheduledQueryRuleTriggerCondition creates an object of type Trigger Condition. This object is to be passed to the command that creates Alerting
Action object.
Box 3: No
The schedule is 60 minutes, not two hours.
-FrequencyInMinutes: The alert frequency.
-TimeWindowInMinutes: The alert time window
The New-AzScheduledQueryRuleSchedule command creates an object of type Schedule. This object is to be passed to the command that creates Log Alert
Rule.
A. Core
B. MongoDB
C. Cassandra
D. Gremlin
Answer: C
A. Option A
B. Option B
C. Option C
D. Option D
Answer: A
Explanation:
A service bus instance has already been created (Step 2 below). Next is step 3, Create a Service Bus queue.
Note: Steps:
Step 1: # Create a resource group resourceGroupName="myResourceGroup"
az group create --name $resourceGroupName --location eastus
Step 2: # Create a Service Bus messaging namespace with a unique name namespaceName=myNameSpace$RANDOM
az servicebus namespace create --resource-group $resourceGroupName --name
$namespaceName --location eastus
Step 3: # Create a Service Bus queue
az servicebus queue create --resource-group $resourceGroupName --namespace-name
$namespaceName --name BasicQueue
Step 4: # Get the connection string for the namespace
connectionString=$(az servicebus namespace authorization-rule keys list --resource-group
$resourceGroupName --namespace-name $namespaceName --name RootManageSharedAccessKey --query primaryConnectionString --output tsv)
Reference:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-quickstart-cli
A. Yes
B. No
Answer: B
Explanation:
Instead run the Invoke-RestMethod cmdlet to make a request to the local managed identity
for Azure resources endpoint.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-arm
A. Mastered
B. Not Mastered
Answer: A
Explanation:
HOTSPOT - (Topic 8)
You are developing an application that uses an Azure blob named data to store application data. The application creates blob snapshots to allow application state to
be reverted to an earlier state. The Azure storage account has soft delete enabled.
The system performs the following operations in order:
•The blob is updated
•Snapshot 1 is created.
•Snapshot 2 is created.
•Snapshot 1 is deleted.
A system error then deletes the data blob and all snapshots.
You need to determine which application states can be restored.
What is the restorability of the application data? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Can be restored
When enabled, soft delete enables you to save and recover your data when blobs or blob snapshots are deleted. This protection extends to blob data that is
erased as the result of an overwrite.
Box 2: Cannot be restored. It has been deleted.
Box 3: Can be restored. It has not been deleted.
References:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-soft-delete
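A minimal C# sketch of these operations with the Azure.Storage.Blobs client follows; the container name, blob name, and connection variable are assumptions.

using System;
using Azure.Storage.Blobs;

class SnapshotSample
{
    static void Main()
    {
        var blob = new BlobClient(
            Environment.GetEnvironmentVariable("STORAGE_CONNECTION"), "appdata", "data");

        // Capture point-in-time snapshots of the blob's current state.
        var snapshot1 = blob.CreateSnapshot().Value;
        var snapshot2 = blob.CreateSnapshot().Value;
        Console.WriteLine($"Snapshots: {snapshot1.Snapshot}, {snapshot2.Snapshot}");

        // With soft delete enabled, Undelete restores the soft-deleted base blob
        // together with any snapshots deleted within the retention period.
        blob.Undelete();
    }
}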
A. Mastered
B. Not Mastered
Answer: A
Explanation:
A. Yes
B. No
Answer: B
Explanation:
Instead pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function and return an immediate HTTP success
response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices include:
Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger
function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the
HTTP trigger payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work and return an immediate response.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices
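A minimal in-process C# sketch of the pattern described above: the HTTP trigger hands the payload to a Service Bus queue via an output binding and returns immediately, while a queue-triggered function does the long-running blob work. The queue name blob-work and the ServiceBusConnection app setting are assumptions.
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class DeferredProcessingSample
{
    // HTTP-triggered function: accept the payload, return an immediate success response,
    // and hand the work off through a Service Bus return binding.
    [FunctionName("EnqueueBlobWork")]
    [return: ServiceBus("blob-work", Connection = "ServiceBusConnection")]
    public static async Task<string> Enqueue(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        // The returned string is written to the queue by the output binding.
        string payload = await new StreamReader(req.Body).ReadToEndAsync();
        log.LogInformation("Payload queued for background processing.");
        return payload;
    }

    // Queue-triggered function: performs the blob processing without the HTTP timeout limit.
    [FunctionName("ProcessBlobWork")]
    public static void Process(
        [ServiceBusTrigger("blob-work", Connection = "ServiceBusConnection")] string message,
        ILogger log)
    {
        log.LogInformation($"Processing message of length {message.Length}");
    }
}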
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: getContext().getRequest();
Box 2: if (isNaN(i["tip"])) ...
In JavaScript, there are two ways to check whether a variable is a number:
isNaN() – stands for "is Not a Number"; it returns true if the variable is not a number, and false otherwise.
typeof – if the variable is a number, it returns the string "number".
Box 3: r.setBody(i);
// update the item that will be created
References:
https://docs.microsoft.com/bs-latn-ba/azure/cosmos-db/how-to-write-stored-procedures-triggers-udfs
https://mkyong.com/javascript/check-if-variable-is-a-number-in-javascript/
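For context, a pre-trigger like the one described above is registered on the container before it can run. The C# sketch below uses the Microsoft.Azure.Cosmos SDK (Container.Scripts.CreateTriggerAsync); the trigger id and the JavaScript body are illustrative guesses at the full sample, not the exam's exact code.
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.Cosmos.Scripts;

class PreTriggerRegistrationSample
{
    static async Task RegisterAsync(Container container)
    {
        var properties = new TriggerProperties
        {
            Id = "validateTipPreTrigger",                // assumed id
            TriggerType = TriggerType.Pre,               // runs before the operation
            TriggerOperation = TriggerOperation.Create,
            Body = @"function validate() {
                        var context = getContext();
                        var r = context.getRequest();
                        var i = r.getBody();
                        // illustrative check mirroring the isNaN() discussion above
                        if (isNaN(i['tip'])) { i['tip'] = 0; }
                        r.setBody(i); // update the item that will be created
                     }"
        };

        await container.Scripts.CreateTriggerAsync(properties);
    }
}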
A. AzCopy
B. Azure Storage Explorer
C. Azure portal
D. .NET Storage Client Library
Answer: A
Explanation:
You can copy blobs, directories, and containers between storage accounts by using the AzCopy v10 command-line utility.
The copy operation is synchronous so when the command returns, that indicates that all files have been copied.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-blobs-copy
the solution.
NOTE: Each correct selection is worth one point.
Answer: BCE
A. Mastered
B. Not Mastered
Answer: A
Explanation:
The New-AzResourceGroup cmdlet creates an Azure resource group.
The New-AzAppServicePlan cmdlet creates an Azure App Service plan in a given location.
The New-AzWebApp cmdlet creates an Azure Web App in a given resource group.
The New-AzWebAppSlot cmdlet creates an Azure Web App slot.
References:
https://docs.microsoft.com/en-us/powershell/module/az.resources/new-azresourcegroup?view=azps-2.3.2
https://docs.microsoft.com/en-us/powershell/module/az.websites/new-azappserviceplan?view=azps-2.3.2
https://docs.microsoft.com/en-us/powershell/module/az.websites/new-azwebapp?view=azps-2.3.2
https://docs.microsoft.com/en-us/powershell/module/az.websites/new-azwebappslot?view=azps-2.3.2
A. Download the blob to a virtual machine and then upload the blob to Container2.
B. Run the Azure PowerShell command Start-AzureStorageBlobCopy.
C. Copy blobs to Container2 by using the Put Blob operation of the Blob Service REST API.
D. Use AzCopy with the Snapshot switch to copy blobs to Container2.
Answer: B
Explanation:
The Start-AzureStorageBlobCopy cmdlet starts to copy a blob.
Example 1: Copy a named blob
C:\PS>Start-AzureStorageBlobCopy -SrcBlob "ContosoPlanning2015" -DestContainer "ContosoArchives" -SrcContainer "ContosoUploads"
This command starts the copy operation of the blob named ContosoPlanning2015 from the container named ContosoUploads to the container named
ContosoArchives.
References:
https://docs.microsoft.com/en-us/powershell/module/azure.storage/start-azurestorageblobcopy?view=azurermps-6.13.0
stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
Margie's Travel is an international travel and bookings management service. The company
is expanding into restaurant bookings. You are tasked with implementing Azure Search for the restaurants listed in their solution.
You create the index in Azure Search.
You need to import the restaurant data into the Azure Search service by using the Azure Search .NET SDK.
Solution:
* 1. Create a SearchServiceClient object to connect to the search index.
* 2. Create a DataContainer that contains the documents which must be added.
* 3. Create a DataSource instance and set its Container property to the DataContainer.
* 4. Set the DataSource property of the SearchServiceClient.
Does the solution meet the goal?
A. Yes
B. No
Answer: B
Explanation:
Use the following method:
* 1. Create a SearchIndexClient object to connect to the search index.
* 2. Create an IndexBatch that contains the documents which must be added.
* 3. Call the Documents.Index method of the SearchIndexClient and pass the IndexBatch.
References:
https://docs.microsoft.com/en-us/azure/search/search-howto-dotnet-sdk
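A minimal C# sketch of that method, assuming the legacy Microsoft.Azure.Search SDK referenced by the question. The Restaurant model, service name, index name, and API key are placeholders.
using System.Collections.Generic;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

class ImportRestaurantsSample
{
    // Hypothetical model matching the index schema.
    public class Restaurant
    {
        public string Id { get; set; }
        public string Name { get; set; }
    }

    static void Import(string serviceName, string indexName, string apiKey, IEnumerable<Restaurant> restaurants)
    {
        // 1. Connect to the search index with a SearchIndexClient.
        var indexClient = new SearchIndexClient(serviceName, indexName, new SearchCredentials(apiKey));

        // 2. Build an IndexBatch containing the documents to add.
        var batch = IndexBatch.Upload(restaurants);

        // 3. Call Documents.Index and pass the batch.
        indexClient.Documents.Index(batch);
    }
}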
A. Yes
B. No
Answer: B
Explanation:
Instead pass the HTTP trigger payload into an Azure Service Bus queue to be processed by a queue trigger function and return an immediate HTTP success
response.
Note: Large, long-running functions can cause unexpected timeout issues. General best practices include:
Whenever possible, refactor large functions into smaller function sets that work together and return responses fast. For example, a webhook or HTTP trigger
function might require an acknowledgment response within a certain time limit; it's common for webhooks to require an immediate response. You can pass the
HTTP trigger payload into a queue to be processed by a queue trigger function. This approach lets you defer the actual work and return an immediate response.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-best-practices
A. Mastered
B. Not Mastered
Answer: A
Explanation:
A. Mastered
B. Not Mastered
Answer: A
Explanation:
A. Create a Selenium web test and configure it to run from your workstation as a scheduled task.
B. Set up a URL ping test to query the home page.
C. Create an Azure function to query the home page.
D. Create a multi-step web test to query the home page.
E. Create a Custom Track Availability Test to query the home page.
Answer: D
Explanation:
You can monitor a recorded sequence of URLs and interactions with a website via multi-step web tests.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/availability-multistep
A. Mastered
B. Not Mastered
Answer: A
Explanation:
https://docs.microsoft.com/en-us/azure/storage/common/storage-account-move?tabs=azure-portal#modify-the-template
A. EnabledForDeployment
B. EnablePurgeProtection
C. EnabledForTemplateDeployment
D. EnableSoftDelete
Answer: BD
A. QueueClient
B. SubscriptionClient
C. TopicClient
D. CloudQueueClient
Answer: A
Explanation:
A queue allows processing of a message by a single consumer. Use the QueueClient class to receive messages from a Service Bus queue; CloudQueueClient belongs to the Azure Storage queue SDK and does not apply to Service Bus.
Reference:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions
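A minimal C# sketch of receiving with QueueClient, assuming the Microsoft.Azure.ServiceBus package; the queue name and connection string are placeholders.
using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

class QueueClientReceiverSample
{
    static void Listen(string connectionString)
    {
        // QueueClient targets a single Service Bus queue (assumed name: "orders").
        var queueClient = new QueueClient(connectionString, "orders");

        var options = new MessageHandlerOptions(args =>
        {
            Console.WriteLine($"Handler error: {args.Exception.Message}");
            return Task.CompletedTask;
        })
        {
            MaxConcurrentCalls = 1,
            AutoComplete = false
        };

        // Register a message pump; each message is completed explicitly after processing.
        queueClient.RegisterMessageHandler(async (message, cancellationToken) =>
        {
            Console.WriteLine($"Received: {Encoding.UTF8.GetString(message.Body)}");
            await queueClient.CompleteAsync(message.SystemProperties.LockToken);
        }, options);
    }
}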
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: config
To configure logging for a web app, use the command: az webapp log config
Box 2: --docker-container-logging
Syntax includes:
az webapp log config [--docker-container-logging {filesystem, off}]
Box 3: webapp
To download a web app's log history as a zip file, use the command: az webapp log download
Box 4: download
References:
https://docs.microsoft.com/en-us/cli/azure/webapp/log
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: NotificationHubClient
Box 2: NotificationHubClient
Box 3: CreateClientFromConnectionString
// Initialize the Notification Hub
NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString(listenConnString, hubName);
Box 4: SendWindowsNativeNotificationAsync
Send the push notification:
var result = await hub.SendWindowsNativeNotificationAsync(windowsToastPayload);
References:
https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-push-notification-registration-management
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/app-service-mobile/app-service-mobile-windows-store-dotnet-get-started-push.md
A. MongoDB
B. Gremlin
C. Cassandra
D. Core
Answer: A
You need to write an Azure CLI script that will create the jobs, tasks, and the pool.
In which order should you arrange the commands to develop the solution? To answer, move the appropriate commands from the list of command segments to the
answer area and arrange them in the correct order.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Step 1: az batch pool create
# Create a new Linux pool with a virtual machine configuration.
az batch pool create \
  --id mypool \
  --vm-size Standard_A1 \
  --target-dedicated 2 \
  --image canonical:ubuntuserver:16.04-LTS \
  --node-agent-sku-id "batch.node.ubuntu 16.04"
Step 2: az batch job create
# Create a new job to encapsulate the tasks that are added.
az batch job create \
  --id myjob \
  --pool-id mypool
Step 3: az batch task create
# Add tasks to the job. Here the task is a basic shell command.
az batch task create \
  --job-id myjob \
  --task-id task1 \
  --command-line "/bin/bash -c 'printenv AZ_BATCH_TASK_WORKING_DIR'"
Step 4: for i in {1..$numberOfJobs} do
References:
https://docs.microsoft.com/bs-latn-ba/azure/batch/scripts/batch-cli-sample-run-job
A. Mastered
B. Not Mastered
Answer: A
Explanation:
A. Column
B. Table
C. Trigger
D. Index
E. Schema
Answer: ABE
Explanation:
In the Dynamic Data Masking configuration page, you may see some database columns that the recommendations engine has flagged for masking. In order to
accept the recommendations, just click Add Mask for one or more columns and a mask is created based on the default type for this column. You can change the
masking function by clicking on the masking rule and editing the masking field format to a different format of your choice.
References:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-dynamic-data-masking-get-started-portal
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: user_impersonation
Box 2: delegated
Example:
* 1. Select the API permissions section
* 2. Click the Add a permission button and then: Ensure that the My APIs tab is selected
* 3. In the list of APIs, select the API TodoListService-aspnetcore.
* 4. In the Delegated permissions section, ensure that the right permissions are checked: user_impersonation.
* 5. Select the Add permissions button.
Box 3: delegated
Example:
* 1. Select the API permissions section
* 2. Click the Add a permission button and then, Ensure that the Microsoft APIs tab is selected
* 3. In the Commonly used Microsoft APIs section, click on Microsoft Graph
* 4. In the Delegated permissions section, ensure that the right permissions are checked: User.Read. Use the search box if necessary.
* 5. Select the Add permissions button
References:
https://docs.microsoft.com/en-us/samples/azure-samples/active-directory-dotnet-webapp-webapi-openidconnect-aspnetcore/calling-a-web-api-in-an-aspnet-core-web-application-using-azure-ad/
A. Mastered
B. Not Mastered
Answer: A
Explanation:
A. SSL port
B. Subscription name
C. Location
D. Host name
E. Access key
F. Subscription id
Answer: ACD
Explanation:
https://learn.microsoft.com/en-us/azure/azure-cache-for-redis/cache-web-app-howto
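For context, a StackExchange.Redis connection string is typically built from the cache host name, the SSL port, and an access key, which is where these values end up in code. A minimal C# sketch, with the host name and key passed in as placeholders:
using System;
using StackExchange.Redis;

class RedisConnectionSample
{
    static void Connect(string hostName, string accessKey)
    {
        // e.g. hostName = "mycache.redis.cache.windows.net" (placeholder); 6380 is the default SSL port.
        string configuration = $"{hostName}:6380,password={accessKey},ssl=True,abortConnect=False";

        using ConnectionMultiplexer connection = ConnectionMultiplexer.Connect(configuration);
        IDatabase cache = connection.GetDatabase();

        // Simple round trip against the cache.
        cache.StringSet("greeting", "Hello from Azure Cache for Redis");
        Console.WriteLine(cache.StringGet("greeting"));
    }
}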
A. Mastered
B. Not Mastered
Answer: A
Explanation:
To create an internal load balancer, create a service manifest named internal-lb.yaml with the service type LoadBalancer and the azure-load-balancer-internal
annotation as shown in the following example:
YAML:
apiVersion: v1
kind: Service
metadata:
  name: internal-app
  annotations:
    service.beta.kubernetes.io/azure-load-balancer-internal: "true"
spec:
  type: LoadBalancer
  ports:
  - port: 80
  selector:
    app: internal-app
References:
https://docs.microsoft.com/en-us/azure/aks/internal-lb
A. Yes
B. No
Answer: A
Explanation:
Specify custom warm-up.
Some apps might require custom warm-up actions before the swap. The applicationInitialization configuration element in web.config lets you specify custom
initialization actions. The swap operation waits for this custom warm-up to finish before swapping with the target slot. Here's a sample web.config fragment.
<system.webServer>
<applicationInitialization>
<add initializationPage="/" hostName="[app hostname]" />
<add initializationPage="/Home/About" hostName="[app hostname]" />
</applicationInitialization>
</system.webServer>
Reference:
https://docs.microsoft.com/en-us/azure/app-service/deploy-staging-slots#troubleshoot-swaps
Answer: A
Explanation:
With the Premium plan the max outbound connections per instance is unbounded compared to the 600 active (1200 total) in a Consumption plan.
Note: The number of available connections is limited partly because a function app runs in
a sandbox environment. One of the restrictions that the sandbox imposes on your code is a limit on the number of outbound connections, which is currently 600
active (1,200 total) connections per instance. When you reach this limit, the functions runtime writes the following message to the logs: Host thresholds exceeded:
Connections.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/manage-connections
https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#service-limits
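Related to the connection limit above, the manage-connections guidance recommends sharing clients across invocations rather than creating one per execution. A minimal C# sketch of that pattern in an in-process function; the function name and target URL are placeholders.
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class StaticClientSample
{
    // A single static HttpClient is shared across invocations so each execution
    // reuses sockets instead of opening new outbound connections.
    private static readonly HttpClient httpClient = new HttpClient();

    [FunctionName("CallBackendApi")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
    {
        // Placeholder URL; in a real app this would come from configuration.
        string body = await httpClient.GetStringAsync("https://example.com/");
        return new OkObjectResult(body.Length);
    }
}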
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Service bus queue
You are developing a back-end Azure App Service that scales based on the number of messages contained in a Service Bus queue.
Box 2: ActiveMessageCount
ActiveMessageCount: Messages in the queue or subscription that are in the active state and ready for delivery.
Box 3: Count
Box 4: Less than or equal to
You need to add a new rule that will continuously scale down the App Service as long as the scale up condition is not met.
Box 5: Decrease count by
When photos are uploaded, they must be processed to produce and save a mobile-friendly version of the image. The process to produce a mobile-friendly version
of the image must start in less than one minute.
You need to design the process that starts the photo processing.
Solution: Create an Azure Function app that uses the Consumption hosting model and that is triggered from the blob upload.
Does the solution meet the goal?
A. Yes
B. No
Answer: A
Explanation:
In the Consumption hosting plan, resources are added dynamically as required by your functions.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
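A minimal C# sketch of the proposed solution: a blob-triggered function on the Consumption plan that fires when a photo is uploaded. The container names ("photos" and "photos-mobile") are assumptions, and the actual image resizing is left as a placeholder.
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class PhotoProcessingSample
{
    [FunctionName("ProcessPhoto")]
    public static void Run(
        [BlobTrigger("photos/{name}")] Stream uploadedPhoto,            // fires on upload to the (assumed) photos container
        [Blob("photos-mobile/{name}", FileAccess.Write)] Stream mobilePhoto, // output blob for the mobile-friendly version
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing {name} ({uploadedPhoto.Length} bytes)");
        uploadedPhoto.CopyTo(mobilePhoto); // placeholder for the actual resize step
    }
}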
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
A. Mastered
B. Not Mastered
Answer: A
Explanation:
Box 1: Yes
The CreateDatabaseIfNotExistsAsync method checks if a database exists and, if it doesn't, creates it.
The Database.CreateContainerAsync method creates a container as an asynchronous operation in the Azure Cosmos service.
Box 2: Yes
The CosmosContainer.CreateItemAsync method creates an item as an asynchronous operation in the Azure Cosmos service.
Box 3: Yes
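A minimal C# sketch of the three calls discussed above, assuming the Microsoft.Azure.Cosmos SDK; the database, container, item type, and partition key path are placeholders.
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

class CosmosBootstrapSample
{
    // Hypothetical item type; the partition key path below must match this property.
    public class Order
    {
        public string id { get; set; }
        public string customerId { get; set; }
    }

    static async Task RunAsync(string endpoint, string key)
    {
        using var client = new CosmosClient(endpoint, key);

        // Create the database if it does not already exist.
        Database database = await client.CreateDatabaseIfNotExistsAsync("appdb");

        // Create a container as an asynchronous operation.
        Container container = await database.CreateContainerAsync("orders", "/customerId");

        // Create an item as an asynchronous operation.
        var order = new Order { id = "1", customerId = "c-42" };
        await container.CreateItemAsync(order, new PartitionKey(order.customerId));
    }
}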