Microsoft Fabric Configuration and Optimization Solutions

Fabric Tenant and Workspace Management

Scenario: You have a Fabric tenant that contains a workspace.

  • User Capabilities: User1 can connect to DW1 via the SQL analytics endpoint.
  • XMLA Read-Write Access: To ensure read-write access to DS1 via the XMLA endpoint, you must first modify the C1 settings.
  • Fabric Admin Portal Recommendations:
    • From the Tenant settings, set Allow XMLA endpoints and Analyze in Excel with on-premises datasets to Enabled.
    • From the Capacity settings, set XMLA Endpoint to Read Write.
    • From the Tenant settings, ensure the relevant feature is set to Enabled.
  • Model Optimization: To optimize the deployment of Model1 in Workspace1, select the Large semantic model storage format.
  • XMLA Connection Support: To ensure Model1 supports XMLA connections, modify the License mode of the workspace (see the connection-string sketch after this list).
  • Data Loss Prevention (DLP): To apply DLP1 to items in Workspace1, apply sensitivity labels to the semantic model and reports.
  • User Permissions: Enable the settings that allow User1 to create Fabric items and to synchronize workspace items with Git repositories.
  • Deployment Pipeline Structure: When the deployment in DeployPipeline1 completes, Workspace2 will have the following structure:
    \Pipeline1
    \Folder1\Lakehouse1
  • Version Control Sequence: To enable version control for Workspace1:
    1. Assign Workspace1 to a Fabric capacity.
    2. Connect Workspace1 to a Git repository and branch.
    3. Sync the content of Workspace1 with the branch.
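
For reference on the XMLA items above, client tools such as SSMS or Tabular Editor reach the workspace through the standard Power BI XMLA connection string; this is a minimal sketch that reuses the scenario's Workspace1 and Model1 names:

    Data Source=powerbi://api.powerbi.com/v1.0/myorg/Workspace1;
    Initial Catalog=Model1;

Read-write operations over this endpoint succeed only when the capacity's XMLA Endpoint setting is Read Write and the workspace License mode places it on that capacity.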

Lakehouse Operations and Data Integration

Scenario: The environment contains a lakehouse.

  • Connectivity: After connecting to Lakehouse1, you will be able to read Table3.
  • Copy Data Activity: In the Destination tab, set the Table action to Overwrite.
  • Shortcut Configuration: When recommending a file format for data exposed through a shortcut, use the Delta format and create the shortcut in the Tables section; only Delta-formatted folders are recognized as tables there.
  • V-Order Optimization: To convert CSV files with V-Order optimization enabled, use the Load to Tables feature from Lakehouse explorer.
  • Partition Columns: To specify a partition column in the Destination settings of a Copy activity, first set the Table action to Overwrite in the Destination tab; the partition option is unavailable when appending.
  • Pipeline Activities: To support Power Query M in Pipeline1, add a Dataflow activity.
  • ACID Properties: Delta tables provide ACID guarantees; run the VACUUM command to remove data files that are no longer referenced by the transaction log.
  • SQL Analytics: Configure the SQL analytics endpoint settings to manage connectivity.
  • AWS S3 Integration: To connect to an Amazon S3 subscription, provide the secret access key and the access key ID.
  • Automatic Semantic Models: To ensure tables are added automatically to the default semantic model, enable Sync the default Power BI semantic model in the LH1 settings pane.
  • Dimension Tables: Include ProductColor, ProductID, and ProductName in the DimProduct table.
  • External Data Loading: Use a copy job to add data to Lakehouse1 from an external Azure Storage account.
  • IoT Data Maintenance: For tables that ingest readings from 100 IoT devices, schedule the OPTIMIZE and VACUUM commands to compact small files and purge unreferenced ones (see the maintenance sketch after this list).
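
A minimal Spark SQL maintenance sketch for the ACID and IoT bullets above, assuming a Delta table named iot_readings (the name is an assumption; the scenario does not give one):

    -- OPTIMIZE compacts small files; VORDER applies V-Order sorting.
    OPTIMIZE iot_readings VORDER;

    -- VACUUM removes data files no longer referenced by the Delta
    -- transaction log; 168 hours matches the default 7-day retention.
    VACUUM iot_readings RETAIN 168 HOURS;

Run these in a notebook cell with the %%sql magic, or pass each statement to spark.sql().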

Fabric Warehouse and T-SQL Queries

Scenario: You have a Fabric warehouse.

  • Aggregation Queries: Use the following query structure: SELECT ProductID, [...] WHERE [...] GROUP BY [...] HAVING SUM(Amount) > 10000 (see the sketches after this list).
  • Security Objects: To implement security, create a FUNCTION and a SECURITY POLICY.
  • Window Functions: Run a T-SQL statement using CAST(1. * [...] OVER(PARTITION BY SalesOrderID) * 100).
  • Conditional Logic: Complete queries using the CASE, ELSE, and END syntax.
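
A minimal T-SQL sketch tying the aggregation, window-function, and CASE fragments above together; the Sales.OrderItem table and its columns are assumptions:

    -- Aggregation with a HAVING filter, per the fragment above.
    SELECT ProductID, SUM(Amount) AS TotalAmount
    FROM Sales.OrderItem
    GROUP BY ProductID
    HAVING SUM(Amount) > 10000;

    -- Percent-of-order window pattern, with CASE banding the result.
    SELECT SalesOrderID,
           ProductID,
           CAST(1. * Amount / SUM(Amount) OVER (PARTITION BY SalesOrderID) * 100
                AS decimal(5, 2)) AS PercentOfOrder,
           CASE WHEN Amount >= 1000 THEN 'High' ELSE 'Standard' END AS AmountBand
    FROM Sales.OrderItem;

And a sketch of the FUNCTION-plus-SECURITY POLICY pair for row-level security, following the documented T-SQL pattern; the Security schema, predicate name, and SalesRep column are assumptions:

    CREATE SCHEMA Security;
    GO
    CREATE FUNCTION Security.fn_securitypredicate (@SalesRep AS nvarchar(128))
        RETURNS TABLE
        WITH SCHEMABINDING
    AS
        RETURN SELECT 1 AS fn_securitypredicate_result
               WHERE @SalesRep = USER_NAME();
    GO
    CREATE SECURITY POLICY SalesFilter
        ADD FILTER PREDICATE Security.fn_securitypredicate(SalesRep)
        ON Sales.OrderItem
        WITH (STATE = ON);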

Power BI Reporting and Performance

Scenario: The environment contains a Microsoft Power BI report.

  • Visual Data Integrity: To ensure all rows appear in a visual, add a unique field to each row.
  • Performance Analyzer Sequence:
    1. From Performance analyzer, start recording and refresh the visuals.
    2. Sort the Duration (ms) column in descending order.
    3. Copy the query of the first (slowest) visual.
    4. Enable Query Plan and Server Timings in DAX Studio.
    5. View the Server Timings results.
  • Column Statistics: Use the Table.Profile Power Query M function to display column statistics (see the sketch after this list).
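
A minimal Power Query M sketch of Table.Profile; the inline sample table stands in for the scenario's unnamed source query:

    let
        Source = Table.FromRecords({
            [Product = "A", Amount = 120],
            [Product = "B", Amount = 80]
        }),
        Stats = Table.Profile(Source)
    in
        Stats

Table.Profile returns one row per column, with Min, Max, Average, StandardDeviation, Count, NullCount, and DistinctCount.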

Semantic Model Optimization and Security

Scenario: The environment contains a semantic model.

  • DAX Expressions: To return the stores opened since December 1, 2023, use DEFINE, EVALUATE, and SUMMARIZE (see the query sketch after this list).
  • Memory Analysis: To identify unnecessary columns, use the VertiPaq Analyzer tool or query the $System.DISCOVER_* dynamic management views (DMVs).
  • Memory Reduction: For large tables (100 million rows), split OrderDateTime into separate date and time columns and replace TotalSalesAmount with a measure.
  • Implicit Measures: To prevent the use of implicit measures, use Power BI Desktop or Tabular Editor.
  • Programmatic Metadata: Use Tabular Editor to programmatically change columns ending in “Key”.
  • RLS and Managed Identities: When modifying Row-Level Security (RLS) via XMLA from an App Service, first add a managed identity to App1.
  • Git Integration: To prevent data values from being pushed to a repository, reference the cache.abf file in .gitignore.
  • Object-Level Security (OLS): Use Tabular Editor to modify OLS for the model.
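
A hedged DAX query sketch for the DEFINE/EVALUATE/SUMMARIZE item above; the Store table and its StoreID and OpenDate columns are assumptions, since the scenario does not name them:

    DEFINE
        VAR __OpenedSince = DATE ( 2023, 12, 1 )
    EVALUATE
        CALCULATETABLE (
            SUMMARIZE ( Store, Store[StoreID], Store[OpenDate] ),
            Store[OpenDate] >= __OpenedSince
        )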

Advanced Data Evaluation and Logic

  • Data Statistics: To calculate the min, max, and mean of DataFrame columns in a notebook, use df.summary() (see the sketches after this list).
  • Maintenance Tracking: To identify if maintenance tasks were performed on a customer table, use DESCRIBE HISTORY customer.
  • DAX Performance: To reduce execution time, use NOT ISEMPTY(CALCULATETABLE('Order Item')).
  • User Identity: To return the User Principal Name (UPN) in a measure, use the USERPRINCIPALNAME() function.
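
A minimal PySpark sketch for the notebook items above; the customer table name comes from the bullet, while loading it into df is an assumption:

    # Load the Delta table into a DataFrame.
    df = spark.read.table("customer")

    # summary() computes the requested column statistics.
    df.summary("min", "max", "mean").show()

    # DESCRIBE HISTORY lists past operations (e.g., OPTIMIZE, VACUUM),
    # revealing whether maintenance tasks ran on the table.
    spark.sql("DESCRIBE HISTORY customer").select("timestamp", "operation").show(truncate=False)

And a short DAX sketch of the two measure patterns above; the measure names are assumptions:

    Has Order Items = NOT ISEMPTY ( CALCULATETABLE ( 'Order Item' ) )
    Current UPN = USERPRINCIPALNAME ()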