How Microsoft Fabric notebooks can help you in your life as a Power BI semantic model creator (& data analyst)
✔️ My personal context: Power BI analytics over a GCP BigQuery SQL DWH
✔️ As a consequence: no Fabric Lakehouse or DWH in my Fabric workspaces, only Power BI artifacts and notebooks
✔️ A new way to manage data analyses for the Power BI audience
✔️ Safety: run read-only scripts
✔️ Communication support: share your notebook code and its results with your data community
1) Allow Fabric item creation on your tenant
Admin portal setup ==>
2) Install libraries (semantic link)
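On recent Fabric runtimes the semantic link library (sempy) comes pre-installed; otherwise it can be installed in the notebook session. A minimal sketch (the last call simply lists the semantic models visible from the notebook):
%pip install semantic-link

import sempy.fabric as fabric

# Quick sanity check: list the semantic models available in the current workspace
fabric.list_datasets()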
3) Run DAX queries
Remark: prepare your DAX queries in DAX Studio or in the Power BI Desktop DAX query view
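A minimal sketch with fabric.evaluate_dax, which returns the query result as a pandas DataFrame ('myWS', 'mySM', the 'Date'[Year] column and the [Total Sales] measure are placeholder names, not from the article):
import sempy.fabric as fabric

# Run a DAX query against the semantic model; the result comes back as a pandas DataFrame
df = fabric.evaluate_dax(
    dataset = 'mySM',
    workspace = 'myWS',
    dax_string = """
        EVALUATE
        SUMMARIZECOLUMNS(
            'Date'[Year],
            "Total Sales", [Total Sales]
        )
    """
)
display(df)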
4) Test KPI values (DAX) during UAT sessions
Remark: prepare your DAX queries in DAX Studio or in the Power BI Desktop DAX query view
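To cross-check a KPI during UAT, fabric.evaluate_measure can evaluate an existing measure grouped by chosen columns; a sketch with placeholder names (the Total Sales measure and the Date[Year] column are assumptions):
import sempy.fabric as fabric

# Evaluate an existing measure grouped by a column, e.g. to compare a KPI against expected UAT values
kpi = fabric.evaluate_measure(
    dataset = 'mySM',
    workspace = 'myWS',
    measure = 'Total Sales',
    groupby_columns = ['Date[Year]']
)
display(kpi)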
5) Document my model
Query the Dynamic Management Views (DMVs) with the INFO() DAX functions
Show the model's tables and relationships
Show, for Import models, the incremental refresh policy
Great thanks to (as nice information source) ==>
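A sketch of both approaches, using placeholder model and workspace names and assuming the engine supports the INFO() DAX functions:
import sempy.fabric as fabric

# Relationships of the model, returned as a DataFrame
relationships = fabric.list_relationships(dataset = 'mySM', workspace = 'myWS')

# DMV-style documentation through the INFO() DAX functions
tables = fabric.evaluate_dax(dataset = 'mySM', workspace = 'myWS',
                             dax_string = "EVALUATE INFO.TABLES()")
measures = fabric.evaluate_dax(dataset = 'mySM', workspace = 'myWS',
                               dax_string = "EVALUATE INFO.MEASURES()")

# Partitions expose the incremental refresh partitions of an Import model
partitions = fabric.list_partitions(dataset = 'mySM', workspace = 'myWS')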
6) Best Practices Analyzer
My model
VertiPaq information
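The Best Practices Analyzer and the VertiPaq statistics are exposed through the semantic-link-labs library (an assumption here, since the library is not named above); a minimal sketch:
%pip install semantic-link-labs

import sempy_labs as labs

# Run the Best Practices Analyzer rules against the model (placeholder names)
labs.run_model_bpa(dataset = 'mySM', workspace = 'myWS')

# VertiPaq analyzer: table/column sizes, cardinalities, dictionary sizes, etc.
labs.vertipaq_analyzer(dataset = 'mySM', workspace = 'myWS')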
7) Manage my model directly from my notebook:
Refresh the model (Import mode - with traces)
import sempy.fabric as fabric

workspace = 'imyWS'
dataset = 'imySM'

# Full refresh of the semantic model
fabric.refresh_dataset(
    dataset = dataset,
    workspace = workspace,
    refresh_type = 'full',
    apply_refresh_policy = False  # boolean, not the string 'false'
)

# Control the refresh status
fabric.list_refresh_requests(dataset = dataset, workspace = workspace)
# OR refresh a single table and a partition
# Define the TMSL script.
# In the example below, I refresh a table called Order_Details and
# a partition named Customers-ROW from the Customers table.
# "database" is your semantic model name.
tmsl_script = {
    "refresh": {
        "type": "full",
        "objects": [
            {
                "database": "SL-Refresh",
                "table": "Order_Details"
            },
            {
                "database": "SL-Refresh",
                "table": "Customers",
                "partition": "Customers-ROW"
            }
        ]
    }
}

fabric.execute_tmsl(workspace="<workspace_name>", script=tmsl_script)
# Control the last refresh status per object
fabric.list_tables(dataset="SL-Refresh", workspace="<workspace_name>")
fabric.list_partitions(dataset="SL-Refresh", workspace="<workspace_name>")
Remark: a Direct Lake model can be switched to Import mode for consistency (stopping the real-time updates) - reframe.
Great thanks to (as nice information source) ==>
ETC..