BONUS! Cyber Phoenix Subscription Included: All Phoenix TS students receive complimentary ninety (90) day access to the Cyber Phoenix learning platform, which hosts hundreds of expert asynchronous training courses in Cybersecurity, IT, Soft Skills, Management, and more!
Course Overview
Phoenix TS’ 3-day instructor-led Microsoft Implementing an Azure Data Solution training and certification boot camp in Washington, DC Metro, Tysons Corner, VA, Columbia, MD or Live Online teaches students how to implement data platform technologies in solutions that align with business and technical requirements, covering on-premises, cloud, and hybrid data scenarios that incorporate both relational and NoSQL data. Students will learn how to process data using a range of technologies and languages for both streaming and batch workloads, and how to implement data security, including authentication, authorization, data policies, and standards. They will define and implement monitoring for both data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimization and disaster recovery of big data, batch processing, and streaming data solutions.
What You’ll Learn
- Working with Data Storage
- Enabling Team Based Data Science with Azure Databricks
- Building Globally Distributed Databases with Cosmos DB
- Working with Relational Data Stores in the Cloud
- Performing Real-Time Analytics with Stream Analytics
- Orchestrating Data Movement with Azure Data Factory
- Securing Azure Data Platforms
- Monitoring and Troubleshooting Data Storage and Processing
- Integrating and Optimizing Data Platforms
Schedule
Currently, there are no public classes scheduled. Please contact a Phoenix TS Training Consultant to discuss hosting a private class at 301-258-8200.
Not seeing a good fit?
Let us know. Our team of instructional designers, curriculum developers, and subject matter experts can create a custom course for you.
Learn more about custom training
Program Level
Intermediate
Training Delivery Methods
Group Live
Duration
3 Days / 24 Hours of Training
CPE credits
13 NASBA CPE Credits
Field of Study
Information Technology
Advanced Prep
N/A
Course Registration
Candidates can register for the course via any of the methods below:
- Email: Sales@phoenixts.com
- Phone: 301-582-8200
- Website: www.phoenixts.com
Upon completion of registration, candidates are sent an automated course registration email that includes attachments with specific information on the class and location, as well as pre-course study and test preparation material approved by the course vendor. The body of the email contains a registration confirmation as well as the location, date, time, and contact person for the class.
Online enrollment closes three days before the course start date.
On the first day of class, candidates are provided with instructions to register with the exam provider before the exam date.
Complaint Resolution Policy
To view our complete Complaint Resolution Policy, please click here: Complaint Resolution Policy
Refunds and Cancellations
To view our complete Refund and Cancellation Policy, please click here: Refund and Cancellation Policy
Who Should Attend
The primary audience for this course is Data Professionals, Data Architects, and Business Intelligence Professionals who want to learn about the data platform technologies that exist on Microsoft Azure. The secondary audience is individuals who develop applications that deliver content from those data platform technologies.
Prerequisites
In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:
- Azure Fundamentals
Exam Information
Exam DP-200: Implementing an Azure Data Solution
| Domain | Weight |
| --- | --- |
| Implement data storage solutions | 40-45% |
| Manage and develop data processing | 25-30% |
| Monitor and optimize data solutions | 30-35% |
Duration
3 Days
Price
$1,945
Course Outline
Module 1: Azure for the Data Engineer
This module explores how the world of data has evolved and how cloud data platform technologies are providing new opportunities for businesses to explore their data in different ways. Students will gain an overview of the available data platform technologies and learn how a Data Engineer’s role and responsibilities have evolved to work in this new world to an organization’s benefit.
- Explain the evolving world of data
- Survey the services in the Azure Data Platform
- Identify the tasks that are performed by a Data Engineer
- Describe the use cases for the cloud in a Case Study
Lab : Azure for the Data Engineer
- Identify the evolving world of data
- Determine the Azure Data Platform Services
- Identify tasks to be performed by a Data Engineer
- Finalize the data engineering deliverables
After completing this module, students will be able to:
- Explain the evolving world of data
- Survey the services in the Azure Data Platform
- Identify the tasks that are performed by a Data Engineer
- Describe the use cases for the cloud in a Case Study
Module 2: Working with Data Storage
This module teaches the variety of ways to store data in Azure. Students will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for the data they want to store in the cloud. They will also see how Data Lake storage can be created to support a wide variety of big data analytics solutions with minimal effort. A short illustrative code sketch follows this module’s outline.
- Choose a data storage approach in Azure
- Create an Azure Storage Account
- Explain Azure Data Lake storage
- Upload data into Azure Data Lake
Lab : Working with Data Storage
- Choose a data storage approach in Azure
- Create a Storage Account
- Explain Data Lake Storage
- Upload data into Data Lake Store
After completing this module, students will be able to:
- Choose a data storage approach in Azure
- Create an Azure Storage Account
- Explain Azure Data Lake Storage
- Upload data into Azure Data Lake
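To give a concrete feel for the kind of task this module covers, here is a minimal sketch of uploading a local file to Azure Storage with the azure-storage-blob Python SDK. The connection string, container name, and file paths are hypothetical placeholders; the course labs may use the Azure portal or other tools instead.

```python
# A minimal sketch, not the official lab: upload a local CSV file to an
# Azure Storage container with the azure-storage-blob package. The connection
# string, container name, and file paths below are hypothetical placeholders.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<your-storage-account-connection-string>"  # placeholder
CONTAINER_NAME = "raw-data"                                      # placeholder

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

# Create the container on first use.
if not container.exists():
    container.create_container()

# Upload a local file as a block blob, overwriting any existing blob of the same name.
with open("sales.csv", "rb") as data:
    container.upload_blob(name="landing/sales.csv", data=data, overwrite=True)
```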
Module 3: Enabling Team Based Data Science with Azure Databricks
This module introduces students to Azure Databricks and how a Data Engineer works with it to enable an organization to perform Team Data Science projects. Students will learn the fundamentals of Azure Databricks and Apache Spark notebooks, how to provision the service and workspaces, and how to perform data preparation tasks that contribute to a data science project. A short illustrative code sketch follows this module’s outline.
- Explain Azure Databricks
- Work with Azure Databricks
- Read data with Azure Databricks
- Perform transformations with Azure Databricks
Lab : Enabling Team Based Data Science with Azure Databricks
- Explain Azure Databricks
- Work with Azure Databricks
- Read data with Azure Databricks
- Perform transformations with Azure Databricks
After completing this module, students will be able to:
- Explain Azure Databricks
- Work with Azure Databricks
- Read data with Azure Databricks
- Perform transformations with Azure Databricks
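As a rough illustration of the read-and-transform work described above, the following PySpark sketch assumes an Azure Databricks notebook where the `spark` session is predefined; the storage path, column names, and output table name are hypothetical.

```python
# A minimal sketch of a data preparation step in an Azure Databricks notebook,
# where a SparkSession named `spark` is predefined. The storage path, column
# names, and output table name are hypothetical placeholders.
from pyspark.sql import functions as F

# Read a CSV file from storage that the workspace can already reach.
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/mnt/datalake/landing/sales.csv"))

# Basic preparation: drop rows with missing keys and derive new columns.
clean = (df.dropna(subset=["order_id"])
           .withColumn("order_date", F.to_date("order_date"))
           .withColumn("revenue", F.col("quantity") * F.col("unit_price")))

# Aggregate revenue per day and persist the result as a Delta table.
daily = clean.groupBy("order_date").agg(F.sum("revenue").alias("daily_revenue"))
daily.write.format("delta").mode("overwrite").saveAsTable("sales_daily_revenue")
```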
Module 4: Building Globally Distributed Databases with Cosmos DB
In this module, students will learn how to work with NoSQL data using Azure Cosmos DB. They will learn how to provision the service and how to load and interrogate data in the service using Visual Studio Code extensions and the Azure Cosmos DB .NET Core SDK. They will also learn how to configure the availability options so that users are able to access the data from anywhere in the world. A short illustrative code sketch follows this module’s outline.
- Create an Azure Cosmos DB database built to scale
- Insert and query data in your Azure Cosmos DB database
- Build a .NET Core app for Cosmos DB in Visual Studio Code
- Distribute data globally with Azure Cosmos DB
Lab : Building Globally Distributed Databases with Cosmos DB
- Create an Azure Cosmos DB
- Insert and query data in Azure Cosmos DB
- Build a .Net Core App for Azure Cosmos DB using VS Code
- Distribute data globally with Azure Cosmos DB
After completing this module, students will be able to:
- Create an Azure Cosmos DB database built to scale
- Insert and query data in your Azure Cosmos DB database
- Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
- Distribute data globally with Azure Cosmos DB
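The course labs use the Azure Cosmos DB .NET Core SDK; purely as an illustration, the sketch below shows the equivalent provision-insert-query flow with the azure-cosmos Python SDK. The endpoint, key, database, container, and item values are hypothetical placeholders.

```python
# A minimal sketch with the azure-cosmos Python SDK (the labs use the .NET Core
# SDK, but the call pattern is similar). Endpoint, key, and names are placeholders.
from azure.cosmos import CosmosClient, PartitionKey

ENDPOINT = "https://<your-account>.documents.azure.com:443/"  # placeholder
KEY = "<your-primary-key>"                                    # placeholder

client = CosmosClient(ENDPOINT, credential=KEY)

# Provision a database and a container partitioned on /category.
database = client.create_database_if_not_exists("RetailDB")
container = database.create_container_if_not_exists(
    id="Products", partition_key=PartitionKey(path="/category"))

# Insert (upsert) a single document.
container.upsert_item({"id": "1", "category": "bikes",
                       "name": "Road-150", "price": 2443.35})

# Query documents with a SQL-like query across partitions.
for item in container.query_items(
        query="SELECT c.name, c.price FROM c WHERE c.category = 'bikes'",
        enable_cross_partition_query=True):
    print(item)
```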
Module 5: Working with Relational Data Stores in the Cloud
In this module, students will explore the Azure relational data platform options, including SQL Database and SQL Data Warehouse. Students will be able to explain why they would choose one service over another, and how to provision, connect, and manage each of the services. A short illustrative code sketch follows this module’s outline.
- Use Azure SQL Database
- Describe Azure SQL Data Warehouse
- Creating and Querying an Azure SQL Data Warehouse
- Use PolyBase to Load Data into Azure SQL Data Warehouse
Lab : Working with Relational Data Stores in the Cloud
- Use Azure SQL Database
- Describe Azure SQL Data Warehouse
- Creating and Querying an Azure SQL Data Warehouse
- Use PolyBase to Load Data into Azure SQL Data Warehouse
After completing this module, students will be able to:
- Use Azure SQL Database
- Describe Azure SQL Data Warehouse
- Create and Query an Azure SQL Data Warehouse
- Use PolyBase to Load Data into Azure SQL Data Warehouse
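As a small illustration of connecting to the relational services covered here, the following sketch queries an Azure SQL Database or SQL Data Warehouse from Python with pyodbc. The server, database, credentials, and table name are hypothetical, and an ODBC driver for SQL Server is assumed to be installed.

```python
# A minimal sketch: connect to an Azure SQL Database or SQL Data Warehouse
# (dedicated SQL pool) and run a query with pyodbc. Server, database,
# credentials, and the table name are hypothetical placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<your-server>.database.windows.net;"
    "DATABASE=<your-database>;"
    "UID=<your-user>;PWD=<your-password>"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Query a hypothetical table that was loaded earlier (for example via PolyBase).
    cursor.execute(
        "SELECT TOP 10 ProductName, ListPrice FROM dbo.DimProduct ORDER BY ListPrice DESC")
    for row in cursor.fetchall():
        print(row.ProductName, row.ListPrice)
```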
Module 6: Performing Real-Time Analytics with Stream Analytics
In this module, students will learn the concepts of event processing and streaming data and how they apply to Event Hubs and Azure Stream Analytics. Students will then set up a Stream Analytics job to stream data and learn how to query the incoming data to perform analysis on it. Finally, they will learn how to manage and monitor running jobs. A short illustrative code sketch follows this module’s outline.
- Explain data streams and event processing
- Data Ingestion with Event Hubs
- Processing Data with Stream Analytics Jobs
Lab : Performing Real-Time Analytics with Stream Analytics
- Explain data streams and event processing
- Data Ingestion with Event Hubs
- Processing Data with Stream Analytics Jobs
After completing this module, students will be able to:
- Explain data streams and event processing
- Understand data ingestion with Event Hubs
- Understand data processing with Stream Analytics jobs
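For a sense of the ingestion side of this pipeline, here is a minimal sketch that sends a small batch of JSON events to an Event Hub with the azure-eventhub Python SDK; a Stream Analytics job would then read the hub as an input. The connection string, hub name, and event fields are hypothetical.

```python
# A minimal sketch of the ingestion step: send a small batch of JSON events to
# an Azure Event Hub with the azure-eventhub package. The connection string,
# hub name, and event fields are hypothetical placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<your-event-hubs-namespace-connection-string>"  # placeholder
EVENT_HUB_NAME = "telemetry"                                      # placeholder

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENT_HUB_NAME)

with producer:
    batch = producer.create_batch()
    for reading in [{"deviceId": "sensor-01", "temperature": 21.7},
                    {"deviceId": "sensor-02", "temperature": 24.3}]:
        batch.add(EventData(json.dumps(reading)))
    # Once sent, a Stream Analytics job configured with this hub as an input
    # can run a windowed query over the incoming events.
    producer.send_batch(batch)
```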
Module 7: Orchestrating Data Movement with Azure Data Factory
In this module, students will learn how Azure Data Factory can be used to orchestrate data movement and transformation across a wide range of data platform technologies. They will be able to explain the capabilities of the technology and set up an end-to-end data pipeline that ingests and transforms data. A short illustrative code sketch follows this module’s outline.
- Explain how Azure Data Factory works
- Azure Data Factory Components
- Azure Data Factory and Databricks
Lab : Orchestrating Data Movement with Azure Data Factory
- Explain how Data Factory Works
- Azure Data Factory Components
- Azure Data Factory and Databricks
After completing this module, students will be able to:
- Understand Azure Data Factory and Databricks
- Understand Azure Data Factory components
- Explain how Azure Data Factory works
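As an illustration of driving Data Factory programmatically, the sketch below triggers an existing (hypothetical) pipeline and checks its run status with the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, pipeline, and parameter names are placeholders, and the pipeline itself is assumed to have been authored separately.

```python
# A minimal sketch, assuming azure-identity and azure-mgmt-datafactory are
# installed and a pipeline has already been authored in the factory. The
# subscription, resource group, factory, pipeline, and parameter names are
# hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-data-engineering"       # placeholder
FACTORY_NAME = "adf-demo"                    # placeholder
PIPELINE_NAME = "CopySalesToDataLake"        # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run of the existing pipeline, passing an optional parameter.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"sourceFolder": "landing/2024-01"})

# Check the run status (Queued, InProgress, Succeeded, Failed, ...).
pipeline_run = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print(pipeline_run.status)
```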
Module 8: Securing Azure Data Platforms
In this module, students will learn how Azure provides a multi-layered security model to protect data. They will explore how security can range from setting up secure networks and access keys, to defining permissions, to monitoring across a range of data stores. A short illustrative code sketch follows this module’s outline.
- An introduction to security
- Key security components
- Securing Storage Accounts and Data Lake Storage
- Securing Data Stores
- Securing Streaming Data
Lab : Securing Azure Data Platforms
- An introduction to security
- Key security components
- Securing Storage Accounts and Data Lake Storage
- Securing Data Stores
- Securing Streaming Data
After completing this module, students will be able to:
- Describe Azure’s multi-layered security model
- Understand key security components
- Understand how to secure Storage Accounts and Data Lake Storage
- Understand how to secure data stores
- Understand how to secure streaming data
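One of the access-control techniques in this module is handing out limited, time-boxed access instead of the account key; as a rough sketch, the code below generates a read-only shared access signature (SAS) for a single blob with the azure-storage-blob Python SDK. The account name, key, container, and blob names are hypothetical.

```python
# A minimal sketch: issue a short-lived, read-only shared access signature (SAS)
# for a single blob instead of sharing the account key. Account name, key,
# container, and blob names are hypothetical placeholders.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

ACCOUNT_NAME = "<your-storage-account>"  # placeholder
ACCOUNT_KEY = "<your-account-key>"       # placeholder; store it in Key Vault in practice

sas_token = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="raw-data",
    blob_name="landing/sales.csv",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),                 # read-only access
    expiry=datetime.now(timezone.utc) + timedelta(hours=1))   # valid for one hour

# A client appends the token to the blob URL to read it without the account key.
url = (f"https://{ACCOUNT_NAME}.blob.core.windows.net"
       f"/raw-data/landing/sales.csv?{sas_token}")
print(url)
```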
Module 9: Monitoring and Troubleshooting Data Storage and Processing
In this module, students will get an overview of the range of monitoring capabilities that are available to provide operational support should there be an issue with a data platform architecture. They will explore common data storage and data processing issues. Finally, disaster recovery options are covered to ensure business continuity. A short illustrative code sketch follows this module’s outline.
- Explain the monitoring capabilities that are available
- Troubleshoot common data storage issues
- Troubleshoot common data processing issues
- Manage disaster recovery
Lab : Monitoring and Troubleshooting Data Storage and Processing
- Explain the monitoring capabilities that are available
- Troubleshoot common data storage issues
- Troubleshoot common data processing issues
- Manage disaster recovery
After completing this module, students will be able to:
- Explain the monitoring capabilities that are available
- Troubleshoot common data storage issues
- Troubleshoot common data processing issues
- Manage disaster recovery
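As a small example of the monitoring signals this module works with, the sketch below runs a Kusto (KQL) query against a Log Analytics workspace with the azure-monitor-query package to count recent failed storage operations. The workspace ID is a placeholder, and the table and columns queried depend on which diagnostic settings are actually enabled.

```python
# A minimal sketch, assuming azure-identity and azure-monitor-query are installed
# and diagnostic logs are being sent to a Log Analytics workspace. The workspace
# ID is a placeholder, and the StorageBlobLogs table is only present when blob
# diagnostics are enabled.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<your-log-analytics-workspace-id>"  # placeholder

client = LogsQueryClient(DefaultAzureCredential())

# KQL: count non-successful blob operations over the last day, by operation.
query = """
StorageBlobLogs
| where TimeGenerated > ago(1d) and StatusText != "Success"
| summarize failures = count() by OperationName
"""

response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))
for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```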
Phoenix TS is registered with the National Association of State Boards of Accountancy (NASBA) as a sponsor of continuing professional education on the National Registry of CPE Sponsors. State boards of accountancy have final authority on the acceptance of individual courses for CPE credit. Complaints regarding registered sponsors may be submitted to the National Registry of CPE Sponsors through its website: www.nasbaregistry.org