Administering Microsoft Azure SQL Solutions (DP-300) Practice Questions
You are designing an enterprise data warehouse in Azure Synapse Analytics that will contain a table named Customers. Customers will contain credit card information.
You need to recommend a solution to provide salespeople with the ability to view all the entries in Customers. The solution must prevent all the salespeople from viewing or inferring the credit card information.
What should you include in the recommendation?
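For context, approaches based on masking alone still allow values to be inferred, so column-level protection of the credit card data is the pattern usually discussed for this scenario. The T-SQL below is a minimal sketch as it would look on SQL Server or Azure SQL Database; the column names are hypothetical, and support for these key and encryption functions in an Azure Synapse Analytics dedicated SQL pool should be verified before relying on them.

-- Illustrative column-level encryption with a symmetric key (hypothetical column names)
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
CREATE CERTIFICATE CreditCardCert WITH SUBJECT = 'Protects the credit card key';
CREATE SYMMETRIC KEY CreditCardKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE CreditCardCert;

OPEN SYMMETRIC KEY CreditCardKey DECRYPTION BY CERTIFICATE CreditCardCert;
UPDATE dbo.Customers
SET CreditCardNumberEnc = ENCRYPTBYKEY(KEY_GUID('CreditCardKey'), CreditCardNumber);
CLOSE SYMMETRIC KEY CreditCardKey;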
You're analyzing the performance metrics of an Azure SQL Database. Which combination of tools would provide both real-time and historical insights into query performance?
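One commonly cited combination is live dynamic management views for real-time activity plus Query Store for historical trends. The following T-SQL sketch uses standard system views; the TOP value is illustrative.

-- Real-time: requests currently executing
SELECT r.session_id, r.status, r.cpu_time, r.total_elapsed_time, t.text AS query_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;

-- Historical: Query Store aggregated runtime statistics
SELECT TOP (10) q.query_id, rs.avg_duration, rs.avg_cpu_time, qt.query_sql_text
FROM sys.query_store_query AS q
JOIN sys.query_store_query_text AS qt ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan AS p ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs ON p.plan_id = rs.plan_id
ORDER BY rs.avg_duration DESC;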
Case study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study -
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Overview -
General Overview -
Contoso, Ltd. is a financial data company that has 100 employees. The company delivers financial data to customers.
Physical Locations -
Contoso has a datacenter in Los Angeles and an Azure subscription. All Azure resources are in the West US 2 Azure region. Contoso has a 10-Gbps ExpressRoute connection to Azure.
The company has customers worldwide.
Existing Environment -
Active Directory -
Contoso has a hybrid Azure Active Directory (Azure AD) deployment that syncs to on-premises Active Directory.
Database Environment -
Contoso has SQL Server 2017 on Azure virtual machines, as shown in the following table.
SQL1 and SQL2 are in an Always On availability group and are actively queried. SQL3 runs jobs, provides historical data, and handles the delivery of data to customers.
The on-premises datacenter contains a PostgreSQL server that has a 50-TB database.
Current Business Model -
Contoso uses Microsoft SQL Server Integration Services (SSIS) to create flat files for customers. The customers receive the files by using FTP.
Requirements -
Planned Changes -
Contoso plans to move to a model in which they deliver data to customer databases that run as platform as a service (PaaS) offerings. When a customer establishes a service agreement with Contoso, a separate resource group that contains an Azure SQL database will be provisioned for the customer. The database will have a complete copy of the financial data. The data to which each customer will have access will depend on the service agreement tier. The customers can change tiers by changing their service agreement.
The estimated size of each PaaS database is 1 TB.
Contoso plans to implement the following changes:
Move the PostgreSQL database to Azure Database for PostgreSQL during the next six months.
Upgrade SQL1, SQL2, and SQL3 to SQL Server 2019 during the next few months.
Start onboarding customers to the new PaaS solution within six months.
Business Goals -
Contoso identifies the following business requirements:
Use built-in Azure features whenever possible.
Minimize development effort whenever possible.
Minimize the compute costs of the PaaS solutions.
Provide all the customers with their own copy of the database by using the PaaS solution.
Provide the customers with different table and row access based on the customer's service agreement (see the row-level security sketch after this list).
In the event of an Azure regional outage, ensure that the customers can access the PaaS solution with minimal downtime. The solution must provide automatic failover.
Ensure that users of the PaaS solution can create their own database objects but be prevented from modifying any of the existing database objects supplied by Contoso.
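The goal of varying table and row access by service agreement tier is the kind of scenario that row-level security addresses in Azure SQL Database. The following T-SQL is a minimal, hypothetical sketch: the table name FinancialData, the column TierLevel, and the session key ServiceTier are illustrative only, and the tier value is assumed to be placed in SESSION_CONTEXT at sign-in.

-- Hypothetical filter: expose only rows at or below the tier stored in SESSION_CONTEXT
CREATE FUNCTION dbo.fn_TierFilter (@RowTier int)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS allowed
    WHERE @RowTier <= CAST(SESSION_CONTEXT(N'ServiceTier') AS int);
GO

CREATE SECURITY POLICY dbo.ServiceTierPolicy
ADD FILTER PREDICATE dbo.fn_TierFilter(TierLevel) ON dbo.FinancialData
WITH (STATE = ON);

Table-level differences per tier would typically be handled separately through object or schema permissions.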
Technical Requirements -
Contoso identifies the following technical requirements:
Users of the PaaS solution must be able to sign in by using their own corporate Azure AD credentials or have Azure AD credentials supplied to them by Contoso. The solution must avoid using the internal Azure AD of Contoso to minimize guest users.
All customers must have their own resource group, Azure SQL server, and Azure SQL database. The deployment of resources for each customer must be done in a consistent fashion.
Users must be able to review the queries issued against the PaaS databases and identify any new objects created (see the audit-log sketch after this list).
Downtime during the PostgreSQL database migration must be minimized.
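The requirement to review issued queries and identify newly created objects is commonly met with Azure SQL auditing. Assuming the audit logs are written to Blob Storage, they can be inspected from T-SQL with sys.fn_get_audit_file; the storage URL below is a placeholder.

-- Placeholder URL; point this at the container where the audit logs are actually written
SELECT event_time, server_principal_name, database_name, statement
FROM sys.fn_get_audit_file(
    'https://<storageaccount>.blob.core.windows.net/sqldbauditlogs/',
    DEFAULT, DEFAULT)
WHERE statement LIKE 'CREATE %';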
Monitoring Requirements -
Contoso identifies the following monitoring requirements:
Notify administrators when a PaaS database has higher-than-average CPU usage.
Use a single dashboard to review security and audit data for all the PaaS databases.
Use a single dashboard to monitor query performance and bottlenecks across all the PaaS databases.
Monitor the PaaS databases to identify poorly performing queries and resolve query performance issues automatically whenever possible.
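For the goal of resolving query performance issues automatically whenever possible, automatic tuning can be enabled per database in Azure SQL Database. A minimal sketch (option values are illustrative):

-- Enable automatic plan correction and automatic index management on the current database
ALTER DATABASE CURRENT
SET AUTOMATIC_TUNING (FORCE_LAST_GOOD_PLAN = ON, CREATE_INDEX = ON, DROP_INDEX = ON);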
PaaS Prototype -
During prototyping of the PaaS solution in Azure, you record the compute utilization of a customer's Azure SQL database as shown in the following exhibit.
Role Assignments -
For each customer's Azure SQL Database server, you plan to assign the roles shown in the following exhibit.
Question: What should you use to migrate the PostgreSQL database?
Case study -
Overview -
ADatum Corporation is a retailer that sells products through two sales channels: retail stores and a website.
Existing Environment -
ADatum has one database server that has Microsoft SQL Server 2016 installed. The server hosts three mission-critical databases named SALESDB, DOCDB, and REPORTINGDB.
SALESDB collects data from the stores and the website.
DOCDB stores documents that connect to the sales data in SALESDB. The documents are stored in two different JSON formats based on the sales channel.
REPORTINGDB stores reporting data and contains several columnstore indexes. A daily process creates reporting data in REPORTINGDB from the data in SALESDB. The process is implemented as a SQL Server Integration Services (SSIS) package that runs a stored procedure from SALESDB.
Requirements -
Planned Changes -
ADatum plans to move the current data infrastructure to Azure. The new infrastructure has the following requirements:
Migrate SALESDB and REPORTINGDB to an Azure SQL database.
Migrate DOCDB to Azure Cosmos DB.
The sales data, including the documents in JSON format, must be gathered as it arrives and analyzed online by using Azure Stream Analytics. The analytics process will perform aggregations that must be done continuously, without gaps, and without overlapping.
As they arrive, all the sales documents in JSON format must be transformed into one consistent format.
Azure Data Factory will replace the SSIS process of copying the data from SALESDB to REPORTINGDB.
Technical Requirements -
The new Azure data infrastructure must meet the following technical requirements:
Data in SALESDB must be encrypted by using Transparent Data Encryption (TDE). The encryption must use your own key.
SALESDB must be restorable to any given minute within the past three weeks.
Real-time processing must be monitored to ensure that workloads are sized properly based on actual usage patterns.
Missing indexes must be created automatically for REPORTINGDB.
Disk IO, CPU, and memory usage must be monitored for SALESDB.
Question: Which windowing function should you use to perform the streaming aggregation of the sales data?
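For reference, a windowed aggregation in the Stream Analytics query language has the shape sketched below. The input, output, and column names are hypothetical; a tumbling window is shown because it produces contiguous, non-overlapping intervals, which matches the stated aggregation constraints.

-- Hypothetical input/output names; five-minute contiguous, non-overlapping windows
SELECT
    SalesChannel,
    COUNT(*) AS SalesCount,
    System.Timestamp() AS WindowEnd
INTO ReportingOutput
FROM SalesInput TIMESTAMP BY SaleTime
GROUP BY SalesChannel, TumblingWindow(minute, 5)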
Case study -
Overview -
ADatum Corporation is a financial services company that has a main office in New York City.
Existing Environment. Licensing Agreement
ADatum has a Microsoft Volume Licensing agreement that includes Software Assurance.
Existing Environment. Network Infrastructure
ADatum has an on-premises datacenter and an Azure subscription named Sub1.
Sub1 contains a virtual network named Network1 in the East US Azure region.
The datacenter is connected to Network1 by using a Site-to-Site (S2S) VPN.
Existing Environment. Identity Environment
The on-premises network contains an Active Directory Domain Services (AD DS) forest.
The forest contains a single domain named corp.adatum.com.
The corp.adatum.com domain syncs with a Microsoft Entra tenant named adatum.com.
Existing Environment. Database Environment
The datacenter contains the servers shown in the following table.
DB1 and DB2 are used for transactional and analytical workloads by an application named App1.
App1 runs on Microsoft Entra hybrid joined servers that run Windows Server 2022. App1 uses Kerberos authentication.
DB3 stores compliance data used by two applications named App2 and App3.
DB3 performance is monitored by using Extended Events sessions, with the event_file target set to a file share on a local disk of SVR3.
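For reference, an Extended Events session of this kind is typically defined on SVR3 along the following lines; the session name, event, and file path are hypothetical, with the .xel file written to a local folder that is exposed through the file share.

-- Hypothetical session writing to a local .xel file on SVR3
CREATE EVENT SESSION DB3_Monitoring ON SERVER
ADD EVENT sqlserver.sql_statement_completed
    (ACTION (sqlserver.sql_text, sqlserver.database_name)
     WHERE sqlserver.database_name = N'DB3')
ADD TARGET package0.event_file (SET filename = N'D:\XEvents\DB3_Monitoring.xel')
WITH (STARTUP_STATE = ON);

ALTER EVENT SESSION DB3_Monitoring ON SERVER STATE = START;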
Resource allocation for DB3 is managed by using Resource Governor.
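A Resource Governor configuration like this usually consists of a resource pool, a workload group, and a classifier function created in master. The pool, group, and function names below are hypothetical, as are the limits.

-- Hypothetical pool and workload group capping what App2 sessions can consume (run in master)
CREATE RESOURCE POOL App2Pool WITH (MAX_CPU_PERCENT = 40, MAX_MEMORY_PERCENT = 40);
CREATE WORKLOAD GROUP App2Group USING App2Pool;
GO
CREATE FUNCTION dbo.fn_AppClassifier ()
RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    -- Route App2 sessions to App2Group; everything else uses the default group
    RETURN CASE WHEN APP_NAME() = 'App2' THEN N'App2Group' ELSE N'default' END;
END;
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_AppClassifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;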
Requirements. Planned Changes -
ADatum plans to implement the following changes:
• Deploy an Azure SQL managed instance named Instance1 to Network1.
• Migrate DB1 and DB2 to Instance1.
• Migrate DB3 to Azure SQL Database.
• Following the migration of DB1 and DB2, hand over database development to remote developers who use Microsoft Entra joined Windows 11 devices.
• Following the migration of DB3, configure the database to be part of an auto-failover group.
Requirements. Availability Requirements
ADatum identifies the following post-migration availability requirements:
• For DB1 and DB2, offload analytical workloads to a read-only database replica in the same Azure region.
• Ensure that if a regional disaster occurs, DB1 and DB2 can be recovered from backups.
• After the migration, App1 must maintain access to DB1 and DB2.
• For DB3, manage potential performance issues caused by resource demand changes by App2 and App3.
• Ensure that DB3 will still be accessible following a planned failover.
• Ensure that DB3 can be restored if the logical server is deleted.
• Minimize downtime during the migration of DB1 and DB2.
Requirements. Security Requirements
ADatum identifies the following security requirements for after the migration:
• Ensure that only designated developers who use Microsoft Entra joined Windows 11 devices can access DB1 and DB2 remotely.
• Ensure that all changes to DB3, including ones within individual transactions, are audited and recorded.
Requirements. Management Requirements
ADatum identifies the following post-migration management requirements:
• Continue using Extended Events to monitor DB3.
• In Azure SQL Database, automate the management of DB3 by using elastic jobs that have database-scoped credentials.
Requirements. Business Requirements
ADatum identifies the following business requirements:
• Minimize costs whenever possible, without affecting other requirements.
• Minimize administrative effort.
You need to recommend a process to automate the management of DB3. The solution must meet the management requirements.
What should be the first step of the process?
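For orientation, elastic jobs connect to their target databases by using credentials defined in the job database, after the job agent and its job database have been provisioned. The credential name, identity, and passwords below are placeholders.

-- Run in the job database used by the elastic job agent; names and secrets are placeholders
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL JobRunCredential
    WITH IDENTITY = 'jobrunuser',
    SECRET = '<strong password>';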