Use this quick start guide to gather the essential information about the Microsoft Implementing an Azure Data Solution (DP-200) certification exam. This study guide lists the objectives and resources that will help you prepare for the DP-200 Microsoft Implementing an Azure Data Solution exam. The sample questions will help you gauge the type and difficulty level of the questions, and the practice exams will familiarize you with the format and environment of the exam. Review this guide carefully before attempting your actual Microsoft Implementing an Azure Data Solution certification exam.
The Microsoft Implementing an Azure Data Solution certification is aimed at candidates who want to build their career in the Microsoft Azure domain. The Microsoft Certified - Azure Data Engineer Associate exam verifies that the candidate possesses fundamental knowledge and proven skills in implementing an Azure data solution.
Microsoft Implementing an Azure Data Solution Exam Summary:
| Exam Name | Microsoft Certified - Azure Data Engineer Associate |
|---|---|
| Exam Code | DP-200 |
| Exam Price | $165 (USD) |
| Duration | 120 mins |
| Number of Questions | 40-60 |
| Passing Score | 700 / 1000 |
| Books / Training | Course DP-200T01-A: Implementing an Azure Data Solution |
| Schedule Exam | Pearson VUE |
| Sample Questions | Microsoft Implementing an Azure Data Solution Sample Questions |
| Practice Exam | Microsoft DP-200 Certification Practice Exam |
Microsoft DP-200 Exam Syllabus Topics:
Implement Data Storage Solutions (40-45%)
Implement non-relational data stores
- implement a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage
- implement data distribution and partitions
- implement a consistency model in Cosmos DB
- provision a non-relational data store
- provide access to data to meet security requirements
- implement for high availability, disaster recovery, and global distribution
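To build intuition for the "data distribution and partitions" objective, the following Python sketch shows how a partition key maps items to partitions. This is only an illustration of the concept: Cosmos DB uses its own internal hashing, and the function and key names here are invented for the example.

```python
import hashlib

def assign_partition(partition_key: str, physical_partitions: int) -> int:
    """Map a partition key value to a partition by hashing it.
    Illustrative only -- Cosmos DB's internal hash differs."""
    digest = hashlib.md5(partition_key.encode("utf-8")).hexdigest()
    return int(digest, 16) % physical_partitions

# All items sharing a partition key value land in the same partition,
# which is why a high-cardinality key (e.g. a customer id) spreads load evenly.
placement = {key: assign_partition(key, 4)
             for key in ["customer-1", "customer-2", "customer-42"]}
```

The same idea explains why choosing a partition key with many distinct values matters for global distribution and throughput.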
Implement relational data stores
- provide access to data to meet security requirements
- implement for high availability and disaster recovery
- implement data distribution and partitions for Azure Synapse Analytics
- implement PolyBase
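For the Azure Synapse Analytics distribution objective, a dedicated SQL pool spreads a table's rows across 60 distributions; a hash-distributed table assigns each row by hashing one column. This minimal sketch (the hash itself is an assumption, not Synapse's internal algorithm) shows why a low-cardinality distribution column causes data skew:

```python
import hashlib
from collections import Counter

DISTRIBUTIONS = 60  # a dedicated SQL pool always uses 60 distributions

def hash_distribution(key_value: str, n: int = DISTRIBUTIONS) -> int:
    """Hash-distributed table: the distribution column decides where a row lives."""
    return int(hashlib.sha256(key_value.encode()).hexdigest(), 16) % n

# Two distinct column values can only ever occupy two of the 60 distributions,
# leaving the other 58 empty -- classic skew from a poor distribution column.
skewed = Counter(hash_distribution(v) for v in ["yes", "no"] * 300)
```

Picking a high-cardinality, evenly spread column as the distribution key avoids this bottleneck.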
Manage data security
- implement data masking
- encrypt data at rest and in motion
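Dynamic data masking in Azure SQL rewrites query results for non-privileged users without changing stored data. The sketch below approximates two documented masking rules, the email rule and the `partial(prefix, padding, suffix)` rule; these Python helpers are illustrative stand-ins, not an Azure API:

```python
def mask_email(email: str) -> str:
    """Approximates the built-in email masking rule: keep the first
    character, replace the rest with a fixed pattern."""
    return f"{email[0]}XXX@XXXX.com"

def mask_partial(value: str, prefix: int, suffix: int, padding: str = "XXXX") -> str:
    """Approximates the partial(prefix, padding, suffix) custom rule:
    expose the leading/trailing characters, pad the middle."""
    return value[:prefix] + padding + (value[-suffix:] if suffix else "")

# e.g. expose only the last four digits of a card number
masked_card = mask_partial("4111111111111111", 0, 4)
```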
Manage and Develop Data Processing (25-30%)
Develop batch processing solutions
- develop batch processing solutions by using Data Factory and Azure Databricks
- ingest data by using PolyBase
- implement the integration runtime for Data Factory
- create linked services and datasets
- create pipelines and activities
- create and schedule triggers
- implement Azure Databricks clusters, notebooks, jobs, and autoscaling
- ingest data into Azure Databricks
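A common pattern behind the pipeline and trigger objectives is a scheduled trigger passing a window date into a pipeline, which reads one date-partitioned folder per run. The sketch below enumerates such paths; the `year=/month=/day=` layout and container name are assumptions for illustration, not a Data Factory requirement:

```python
from datetime import date, timedelta

def daily_slice_paths(start: date, days: int, container: str = "raw") -> list:
    """Enumerate date-partitioned folder paths of the kind a Data Factory
    pipeline parameterizes per trigger window (layout is illustrative)."""
    return [f"{container}/year={d.year}/month={d.month:02d}/day={d.day:02d}/"
            for d in (start + timedelta(days=i) for i in range(days))]

paths = daily_slice_paths(date(2020, 1, 30), 3)
```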
Develop streaming solutions
- configure input and output
- select the appropriate built-in functions
- implement event processing by using Stream Analytics
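Among the Stream Analytics built-in functions, windowing is the one candidates ask about most. A tumbling window divides the stream into fixed, non-overlapping intervals, as in `GROUP BY TumblingWindow(second, 10)`. This local Python sketch mimics that grouping over plain epoch-second timestamps:

```python
def tumbling_window_counts(event_seconds, window_seconds):
    """Count events per fixed, non-overlapping window, keyed by window
    start time -- the behavior of TumblingWindow(second, N) in spirit."""
    buckets = {}
    for t in event_seconds:
        start = (t // window_seconds) * window_seconds
        buckets[start] = buckets.get(start, 0) + 1
    return buckets
```

Hopping and sliding windows differ in that their windows can overlap, so one event may count toward several windows.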
Monitor and Optimize Data Solutions (30-35%)
Monitor data storage
- monitor relational and non-relational data stores
- implement Blob storage monitoring
- implement Data Lake Storage Gen2 monitoring
- implement Azure Synapse Analytics monitoring
- implement Cosmos DB monitoring
- configure Azure Monitor alerts
- implement auditing by using Azure Log Analytics
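The logic behind an Azure Monitor metric alert is simple: compare recent metric samples against a static threshold and fire when enough of them violate it. The sketch below captures that idea only; the function name and parameters are invented for illustration and are not the Azure Monitor API:

```python
def should_alert(samples, threshold, min_violations=3):
    """Static-threshold rule in the spirit of an Azure Monitor metric
    alert: fire when enough recent samples exceed the threshold."""
    return sum(1 for v in samples if v > threshold) >= min_violations
```

Azure Monitor also supports dynamic thresholds, which learn the metric's normal range instead of using a fixed value.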
Monitor data processing
- monitor Data Factory pipelines
- monitor Azure Databricks
- monitor Stream Analytics
- configure Azure Monitor alerts
- implement auditing by using Azure Log Analytics
Optimize Azure data solutions
- troubleshoot data partitioning bottlenecks
- optimize Data Lake Storage Gen2
- optimize Stream Analytics
- optimize Azure Synapse Analytics
- manage the data lifecycle
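"Manage the data lifecycle" in Blob storage means a lifecycle management policy that tiers blobs to Cool, then Archive, then deletes them as they age. The sketch below builds such a policy document; the structure follows Azure's documented policy schema, while the rule name and day thresholds are illustrative:

```python
import json

def lifecycle_policy(to_cool=30, to_archive=90, to_delete=365):
    """Build a Blob storage lifecycle management rule that tiers and
    deletes block blobs by age since last modification."""
    day = "daysAfterModificationGreaterThan"
    return {"rules": [{
        "name": "age-based-tiering",   # illustrative rule name
        "enabled": True,
        "type": "Lifecycle",
        "definition": {
            "filters": {"blobTypes": ["blockBlob"]},
            "actions": {"baseBlob": {
                "tierToCool": {day: to_cool},
                "tierToArchive": {day: to_archive},
                "delete": {day: to_delete},
            }},
        },
    }]}

policy_json = json.dumps(lifecycle_policy(), indent=2)
```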
To ensure success in the Microsoft Implementing an Azure Data Solution certification exam, we recommend combining the authorized training course with practice tests and hands-on experience when preparing for the Microsoft Implementing an Azure Data Solution (DP-200) exam.