[13-July-2020 Update] Exam DP-200 VCE Dumps and DP-200 PDF Dumps from PassLeader

Valid DP-200 Dumps shared by PassLeader for Helping Passing DP-200 Exam! PassLeader now offers the newest DP-200 VCE dumps and DP-200 PDF dumps. The PassLeader DP-200 exam questions have been updated and ANSWERS have been corrected. Get the newest PassLeader DP-200 dumps with VCE and PDF here: https://www.passleader.com/dp-200.html (241 Q&As Dumps –> 256 Q&As Dumps –> 272 Q&As Dumps)

BTW, DOWNLOAD part of PassLeader DP-200 dumps from Cloud Storage: https://drive.google.com/open?id=1CTHwJ44u5lT4tsb2qo8oThaQ5c_vwun1

NEW QUESTION 218
You are migrating a corporate research analytical solution from an internal datacenter to Azure. 200 TB of research data is currently stored in an on-premises Hadoop cluster. You plan to copy it to Azure Storage. Your internal datacenter is connected to your Azure Virtual Network (VNet) with ExpressRoute private peering. The Azure Storage service endpoint is accessible from the same VNet. Corporate policy dictates that the research data cannot be transferred over the public internet. You need to securely migrate the research data online. What should you do?

A. Transfer the data using Azure Data Box Disk devices.
B. Transfer the data using Azure Data Factory in distributed copy (DistCp) mode, with an Azure Data Factory self-hosted Integration Runtime (IR) machine installed in the on-premises datacenter.
C. Transfer the data using Azure Data Factory in native Integration Runtime (IR) mode, with an Azure Data Factory self-hosted IR machine installed on the Azure VNet.
D. Transfer the data using Azure Data Box Heavy devices.

Answer: C
Explanation:
You should transfer the data using Azure Data Factory in native IR mode, with an Azure Data Factory self-hosted IR machine installed on the Azure VNet. This approach supports data transfer over the ExpressRoute private peering connection and uses the Azure Data Factory integration runtime as the engine that copies the data, so the transfer never traverses the public internet.
Incorrect:
Not A: You should not transfer the data using Azure Data Box Disk devices. This approach is for offline transfer scenarios. Also, an Azure Data Box Disk device has a total capacity of only 40 TB.
Not B: You should not transfer the data using Azure Data Factory in DistCp mode, with an Azure Data Factory self-hosted IR machine installed in the on-premises datacenter. The DistCp tool does not support ExpressRoute private peering with an Azure Storage VNet endpoint.
Not D: You should not transfer the data using Azure Data Box Heavy devices. While these devices have a capacity of 1 PB, they are intended for offline transfer scenarios only.

NEW QUESTION 219
You are a data engineer for your company. Your company has an on-premises SQL Server instance that contains 16 databases. Four of the databases require Common Language Runtime (CLR) features. You must be able to manage each database separately because each database has its own resource needs. You plan to migrate these databases to Azure. You want to migrate the databases by using a backup and restore process with SQL commands. You need to choose the most appropriate deployment option to migrate the databases. What should you use?

A. Azure SQL Database with an elastic pool.
B. Azure Cosmos DB with the SQL (DocumentDB) API.
C. Azure SQL Database managed instance.
D. Azure Cosmos DB with the Table API.

Answer: C
Explanation:
You should use an Azure SQL Database managed instance deployment. This deployment option is almost 100% compatible with an on-premises instance, including the ability to use CLR features. After you back up the databases on-premises, you can execute a restore command to migrate them to Azure. This is referred to as a lift-and-shift migration.
Incorrect:
Not A: You should not use an Azure SQL Database with an elastic pool deployment. An elastic pool allows you to deploy multiple databases to a single logical instance and have all databases share a pool of resources. You configure the resource usage up front by choosing a purchasing model. You cannot take advantage of CLR features with an elastic pool.
Not B: You should not use an Azure Cosmos DB with the SQL (DocumentDB) API deployment. Cosmos DB is a multimodel database that supports five APIs for storage and queries, including SQL, Table, Cassandra, Gremlin, and MongoDB. The SQL API allows you to access data by using SQL-like queries. You cannot restore SQL Server databases to Cosmos DB by using SQL commands.
Not D: You should not use an Azure Cosmos DB with the Table API deployment. The Table API is similar to Azure Tables. This deployment is useful if you are migrating an application from Azure Tables to Cosmos DB. With Azure Tables, you can access data by using Language Integrated Query (LINQ) and OData. You cannot restore SQL Server databases to Cosmos DB by using SQL commands.
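For illustration only, the restore side of such a lift-and-shift migration can be scripted. The sketch below runs the T-SQL RESTORE FROM URL against a managed instance through pyodbc; the instance endpoint, credentials, storage container, SAS secret, and database name are all placeholder assumptions, not values from the question.

```python
# Minimal sketch: restore a native SQL Server backup from blob storage into a
# managed instance. All names below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<managed-instance-host>.database.windows.net;"
    "DATABASE=master;UID=<admin-login>;PWD=<password>",
    autocommit=True,  # RESTORE cannot run inside an implicit transaction
)
cur = conn.cursor()

# SAS-based credential; its name must be the container URL that holds the .bak file.
cur.execute("""
CREATE CREDENTIAL [https://<storage-account>.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<sas-token-without-leading-question-mark>'
""")

# Restore one of the on-premises databases from its uploaded backup.
cur.execute("""
RESTORE DATABASE [Db01]
FROM URL = 'https://<storage-account>.blob.core.windows.net/backups/Db01.bak'
""")
```

The same pattern is repeated per database, which is what makes the managed instance option suitable for migrating all 16 databases while keeping CLR support.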

NEW QUESTION 220
SIMULATION
……

Answer:
Step 1: Create a new SQL database named DB3:
1. Select SQL in the left-hand menu of the Azure portal. If SQL is not in the list, select All services, then type SQL in the search box.
2. Select + Add to open the Select SQL deployment option page. Select Single Database. You can view additional information about the different databases by selecting Show details on the Databases tile.
3. Select Create.
4. On the Basics tab, enter DB3 as the database name and complete the other required fields.
5. Leave the rest of the values as default and select Review + Create at the bottom of the form.
6. Review the final settings and select Create to deploy and provision the resource group, server, and database.
Step 2: Create your elastic pool using the Azure portal:
1. Select Azure SQL in the left-hand menu of the Azure portal. If Azure SQL is not in the list, select All services, then type Azure SQL in the search box.
2. Select + Add to open the Select SQL deployment option page.
3. Select Elastic pool from the Resource type drop-down in the SQL Databases tile. Select Create to create your elastic pool.
4. Configure your elastic pool with the following values:
– Name: Provide a unique name for your elastic pool, such as myElasticPool.
– Subscription: Select your subscription from the drop-down.
– Resource group: Select the resource group.
– Server: Select the server.
5. Select Configure elastic pool.
6. On the Configure page, select the Databases tab, and then choose to Add database.
7. Add the Azure SQL database named DB2, and the new SQL database named DB3 that you created in Step 1.
8. Select Review + create to review your elastic pool settings and then select Create to create your elastic pool.
Explanation:
https://docs.microsoft.com/bs-latn-ba/azure/sql-database/sql-database-elastic-pool-failover-group-tutorial
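As a rough, non-authoritative equivalent of the portal steps above, the same pool can be created with the azure-mgmt-sql Python SDK. The resource group, server name, location, and SKU values are assumptions, and model and property names (for example elastic_pool_id) can differ between SDK versions.

```python
# Sketch: create an elastic pool and place DB2 and DB3 into it.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import ElasticPool, Database, Sku

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, server, location = "myResourceGroup", "myserver10543936", "eastus"

# Create the pool; the SKU here is illustrative (Standard, 100 eDTUs).
pool = client.elastic_pools.begin_create_or_update(
    rg, server, "myElasticPool",
    ElasticPool(location=location, sku=Sku(name="StandardPool", tier="Standard", capacity=100)),
).result()

# Move DB2 and the new DB3 into the pool.
for db_name in ("DB2", "DB3"):
    client.databases.begin_create_or_update(
        rg, server, db_name,
        Database(location=location, elastic_pool_id=pool.id),
    ).result()
```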

NEW QUESTION 221
SIMULATION
……

Answer:
Create a general-purpose v2 storage account, which provides access to all of the Azure Storage services: blobs, files, queues, tables, and disks:
1. On the Azure portal menu, select All services. In the list of resources, type Storage Accounts. As you begin typing, the list filters based on your input. Select Storage Accounts.
2. On the Storage Accounts window that appears, choose Add.
3. Select the subscription in which to create the storage account.
4. Under the Resource group field, select Create new and enter a name for your new resource group.
5. Next, enter the name account10543936 for your storage account.
6. Select a location for your storage account, or use the default location.
7. Leave these fields set to their default values:
– Deployment model: Resource Manager
– Performance: Standard
– Account kind: StorageV2 (general-purpose v2)
– Replication: Read-access geo-redundant storage (RA-GRS)
– Access tier: Hot
8. Select Review + Create to review your storage account settings and create the account.
9. Select Create.
Explanation:
https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create
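For reference, a rough azure-mgmt-storage sketch of the same account creation follows; the subscription, resource group, and location are placeholders, and older SDK versions expose create instead of begin_create.

```python
# Sketch: create the general-purpose v2 storage account with RA-GRS replication.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

account = client.storage_accounts.begin_create(
    "myResourceGroup",
    "account10543936",
    StorageAccountCreateParameters(
        location="eastus",
        kind="StorageV2",                # general-purpose v2
        sku=Sku(name="Standard_RAGRS"),  # read-access geo-redundant storage
        access_tier="Hot",
    ),
).result()
print(account.provisioning_state)
```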

NEW QUESTION 222
SIMULATION
……

Answer:
You can enable geo-redundancy, multi-region writes, and additional regions for an Azure Cosmos account by using the Azure portal:
Step 1: Enable Geo-Redundancy and Multi-region Writes:
1. In the Azure portal, search for and select Azure Cosmos DB.
2. Locate the Cosmos DB account named cosmos10543936.
3. Open the properties for cosmos10543936.
4. Enable Geo-Redundancy and Multi-region Writes:
– Location: West US region
Step 2: Add region from your database account:
1. In the Azure portal, go to your Azure Cosmos account and open the Replicate data globally menu.
2. To add regions, select the hexagons on the map with the + label that corresponds to your desired region(s). Alternatively, to add a region, select the + Add region option and choose a region from the drop-down menu:
– Add: West US region
3. To save your changes, select OK.
Explanation:
https://docs.microsoft.com/en-us/azure/cosmos-db/high-availability
https://docs.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account
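A hedged azure-mgmt-cosmosdb sketch of the same change is shown below; the resource group is an assumption, and the exact model and property names (for example enable_multiple_write_locations) should be verified against the installed SDK version.

```python
# Sketch: enable multi-region writes and add a West US replica to cosmos10543936.
from azure.identity import DefaultAzureCredential
from azure.mgmt.cosmosdb import CosmosDBManagementClient
from azure.mgmt.cosmosdb.models import DatabaseAccountUpdateParameters, Location

client = CosmosDBManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, account = "myResourceGroup", "cosmos10543936"

existing = client.database_accounts.get(rg, account)

# Keep the current regions and append West US with the next failover priority.
locations = [
    Location(location_name=loc.location_name, failover_priority=loc.failover_priority)
    for loc in existing.read_locations
]
locations.append(Location(location_name="West US", failover_priority=len(locations)))

client.database_accounts.begin_update(
    rg, account,
    DatabaseAccountUpdateParameters(
        locations=locations,
        enable_multiple_write_locations=True,
    ),
).result()
```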

NEW QUESTION 223
SIMULATION
……

Answer:
Provision an Azure Active Directory administrator for your SQL server. Each Azure SQL server (which hosts a SQL Database or SQL Data Warehouse) starts with a single server administrator account that is the administrator of the entire server. A second administrator, which is an Azure AD account, must then be created. This principal is created as a contained database user in the master database:
1. In the Azure portal, in the upper-right corner, select your connection to drop down a list of possible Active Directories. Choose the correct Active Directory as the default Azure AD. This step links the subscription-associated Active Directory with Azure SQL server making sure that the same subscription is used for both Azure AD and SQL Server. (The Azure SQL server can be hosting either Azure SQL Database or Azure SQL Data Warehouse.)
2. Search for and select the SQL server SQL10543936.
3. In SQL Server page, select Active Directory admin.
4. In the Active Directory admin page, select Set admin.
5. In the Add admin page, search for user [email protected], select it, and then select Select. (The Active Directory admin page shows all members and groups of your Active Directory. Users or groups that are grayed out cannot be selected because they are not supported as Azure AD administrators.)
6. At the top of the Active Directory admin page, select SAVE.
Explanation:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-aad-authentication-configure?tabs=azure-powershell
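As an illustrative alternative to the portal, azure-mgmt-sql can set the Azure AD administrator as well. The login, object ID, tenant ID, and resource group below are placeholders, and the operation-group and model names may differ between SDK versions.

```python
# Sketch: set the Azure AD administrator on the logical server SQL10543936.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import ServerAzureADAdministrator

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.server_azure_ad_administrators.begin_create_or_update(
    "myResourceGroup",
    "SQL10543936",
    "ActiveDirectory",                      # the administrator name is fixed
    ServerAzureADAdministrator(
        administrator_type="ActiveDirectory",
        login="<aad-user-display-name>",
        sid="<aad-user-object-id>",
        tenant_id="<tenant-id>",
    ),
).result()
```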

NEW QUESTION 224
SIMULATION
……

Answer:
You can use Private Endpoints for your Azure Storage accounts to allow clients on a virtual network (VNet) to securely access data over a Private Link. Create your Private Endpoint:
1. On the upper-left side of the screen in the Azure portal, select Storage > Storage accounts, and then select your storage account storage10543936.
2. Select Networking.
3. Select Add Private Endpoint.
4. In Create Private Endpoint, enter or select this information:
– Virtual network: Select VNET1 from the resource group.
5. Select OK.
6. Select Review + create. You’re taken to the Review + create page where Azure validates your configuration.
Explanation:
https://docs.microsoft.com/en-us/azure/private-link/create-private-endpoint-storage-portal
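A rough azure-mgmt-network sketch of the same private endpoint follows; the subscription, resource group, region, and subnet name are assumptions, and the connection model's property names should be checked against the installed SDK version.

```python
# Sketch: create a private endpoint in VNET1 targeting the blob service of storage10543936.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import (
    PrivateEndpoint,
    PrivateLinkServiceConnection,
    Subnet,
)

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg = "myResourceGroup"

storage_id = (
    f"/subscriptions/<subscription-id>/resourceGroups/{rg}"
    "/providers/Microsoft.Storage/storageAccounts/storage10543936"
)
subnet_id = (
    f"/subscriptions/<subscription-id>/resourceGroups/{rg}"
    "/providers/Microsoft.Network/virtualNetworks/VNET1/subnets/default"
)

client.private_endpoints.begin_create_or_update(
    rg, "storage10543936-pe",
    PrivateEndpoint(
        location="eastus",
        subnet=Subnet(id=subnet_id),
        private_link_service_connections=[
            PrivateLinkServiceConnection(
                name="storage-blob-connection",
                private_link_service_id=storage_id,
                group_ids=["blob"],   # target the blob sub-resource
            )
        ],
    ),
).result()
```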

NEW QUESTION 225
SIMULATION
……

Answer:
1. In the Azure portal, browse to the database db1-copy10543936 that you want to set up for geo-replication.
2. On the SQL database page, select Geo-Replication, and then select the region in which to create the secondary database: West US region.
3. Select or configure the server and pricing tier for the secondary database.
4. Click Create to add the secondary.
5. The secondary database is created and the seeding process begins.
6. When the seeding process is complete, the secondary database displays its status.
Explanation:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-active-geo-replication-portal
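For reference, the same secondary can be created with azure-mgmt-sql by creating a database on the secondary server with create_mode set to Secondary; the resource group and both server names below are placeholders.

```python
# Sketch: create a readable geo-secondary of db1-copy10543936 in West US.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import Database

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

primary_id = (
    "/subscriptions/<subscription-id>/resourceGroups/myResourceGroup"
    "/providers/Microsoft.Sql/servers/<primary-server>/databases/db1-copy10543936"
)

client.databases.begin_create_or_update(
    "myResourceGroup",
    "<secondary-server-in-west-us>",   # must already exist in the West US region
    "db1-copy10543936",
    Database(
        location="westus",
        create_mode="Secondary",
        source_database_id=primary_id,
    ),
).result()
```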

NEW QUESTION 226
SIMULATION
……

Answer:
Enable soft delete for blobs on your storage account by using Azure portal:
1. In the Azure portal, select your storage account.
2. Navigate to the Data Protection option under Blob Service.
3. Click Enabled under Blob soft delete.
4. Under Retention policies, enter the number of days to retain soft-deleted data. Enter 10 here.
5. Choose the Save button to confirm your Data Protection settings.
Note: Azure Storage now offers soft delete for blob objects so that you can more easily recover your data when it is erroneously modified or deleted by an application or other storage account user. Currently you can retain soft deleted data for between 1 and 365 days.
Explanation:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-soft-delete
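The same setting can also be applied with the azure-storage-blob data-plane SDK; the connection string below is a placeholder.

```python
# Sketch: enable a 10-day blob soft-delete retention policy on the account.
from azure.storage.blob import BlobServiceClient, RetentionPolicy

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
service.set_service_properties(
    delete_retention_policy=RetentionPolicy(enabled=True, days=10)
)

# Verify the new retention setting.
props = service.get_service_properties()
print(props["delete_retention_policy"].days)  # expect 10
```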

NEW QUESTION 227
You are monitoring an Azure Stream Analytics job. You discover that the Backlogged Input Events metric is increasing slowly and is consistently non-zero. You need to ensure that the job can handle all the events. What should you do?

A. Change the compatibility level of the Stream Analytics job.
B. Increase the number of streaming units (SUs).
C. Create an additional output stream for the existing input stream.
D. Remove any named consumer groups from the connection and use $default.

Answer: B
Explanation:
Backlogged Input Events: the number of input events that are backlogged. A non-zero value for this metric implies that your job is not able to keep up with the number of incoming events. If this value is slowly increasing or consistently non-zero, you should scale out your job by increasing the number of Streaming Units (SUs). Note: Streaming Units represent the computing resources that are allocated to execute a Stream Analytics job. The higher the number of SUs, the more CPU and memory resources are allocated for your job.
https://docs.microsoft.com/bs-cyrl-ba/azure/stream-analytics/stream-analytics-monitoring

NEW QUESTION 228
SIMULATION
……

Answer:
Set up auditing for your database. The following section describes the configuration of auditing using the Azure portal:
1. Go to the Azure portal.
2. Navigate to Auditing under the Security heading in the pane for the SQL database db2 (or for its server).
3. If you prefer to enable auditing on the database level, switch Auditing to ON.
Note: By default the audit database data retention period is set to 100 days.
Explanation:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-auditing
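A hedged azure-mgmt-sql sketch for enabling database-level blob auditing follows; the resource group, server, storage endpoint, and access key are placeholders, and the policy model name may vary across SDK versions.

```python
# Sketch: enable blob auditing on database db2 with a 100-day retention period.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import DatabaseBlobAuditingPolicy

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.database_blob_auditing_policies.create_or_update(
    "myResourceGroup",
    "myserver10543936",
    "db2",
    DatabaseBlobAuditingPolicy(
        state="Enabled",
        storage_endpoint="https://<audit-storage-account>.blob.core.windows.net/",
        storage_account_access_key="<storage-access-key>",
        retention_days=100,
    ),
)
```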

NEW QUESTION 229
SIMULATION
……

Answer:
You can configure your storage account to accept requests from secure connections only by setting the Secure transfer required property for the storage account. Require secure transfer for an existing storage account:
1. Select the existing storage account storage10543936 in the Azure portal.
2. In the storage account menu pane, under SETTINGS, select Configuration.
3. Under Secure transfer required, select Enabled.
Explanation:
https://docs.microsoft.com/en-us/azure/storage/common/storage-require-secure-transfer
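The equivalent change can be scripted with azure-mgmt-storage, as in the sketch below; the subscription and resource group are placeholders.

```python
# Sketch: require secure transfer (HTTPS only) on storage10543936.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountUpdateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

account = client.storage_accounts.update(
    "myResourceGroup",
    "storage10543936",
    StorageAccountUpdateParameters(enable_https_traffic_only=True),
)
print(account.enable_https_traffic_only)  # True
```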

NEW QUESTION 230
SIMULATION
……

Answer:
SQL Data Warehouse compute resources can be scaled by increasing or decreasing data warehouse units:
1. Click SQL data warehouses in the left-hand menu of the Azure portal.
2. Select datawarehouse from the SQL data warehouses page. The data warehouse opens.
3. Click Scale.
4. In the Scale panel, move the slider left or right to change the DWU setting. Double the DWU setting.
5. Click Save. A confirmation message appears. Click yes to confirm or no to cancel.
Explanation:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/quickstart-scale-compute-portal
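A rough azure-mgmt-sql sketch of the same scale operation is shown below; the resource group, server, warehouse name, and target service objective (DW400c as an example of doubling DW200c) are assumptions.

```python
# Sketch: double the DWU setting of a SQL Data Warehouse by updating its SKU.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import DatabaseUpdate, Sku

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, server, dw = "myResourceGroup", "myserver10543936", "datawarehouse"

current = client.databases.get(rg, server, dw)
print(current.sku.name)  # e.g. DW200c

client.databases.begin_update(
    rg, server, dw,
    DatabaseUpdate(sku=Sku(name="DW400c")),
).result()
```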

NEW QUESTION 231
……


Get the newest PassLeader DP-200 VCE dumps here: https://www.passleader.com/dp-200.html (241 Q&As Dumps –> 256 Q&As Dumps –> 272 Q&As Dumps)

And, DOWNLOAD the newest PassLeader DP-200 PDF dumps from Cloud Storage for free: https://drive.google.com/open?id=1CTHwJ44u5lT4tsb2qo8oThaQ5c_vwun1