Migrate/Copy Semantic Model Across Workspaces Using Semantic Link Labs

Here is a quick script to copy a semantic model from one workspace to another in the same tenant, assuming you have Contributor or higher permissions in both workspaces. I tested this with a Direct Lake model, but it should work for any other semantic model. The script only copies the metadata (not the data in the model), so be sure to set up the other configurations (RLS members, refresh schedule, settings, etc.) afterwards. Those can also be changed programmatically, thanks to Semantic Link Labs, but I will cover that in a future post.

If it’s a Direct Lake model, the copied model will still point to the lakehouse of the source model. You can repoint it to another lakehouse with the same schema using this function (see the sketch below). If you need to migrate an import model including the data in the semantic model, use the backup and restore functions (requires ADLS Gen2).
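To illustrate the repointing step, here is a minimal sketch. It assumes the Semantic Link Labs function directlake.update_direct_lake_model_lakehouse_connection and uses placeholder lakehouse/workspace names; the exact function name and parameters can vary between versions, so check the Semantic Link Labs documentation before using it.

# Minimal sketch: repoint a copied Direct Lake model to another lakehouse.
# Assumes sempy_labs.directlake.update_direct_lake_model_lakehouse_connection;
# the name/signature may differ in your Semantic Link Labs version.
from sempy_labs import directlake

directlake.update_direct_lake_model_lakehouse_connection(
    dataset="migrated_dataset",                  # the copied model
    workspace="<target workspace>",              # workspace holding the copy
    lakehouse="<target lakehouse>",              # lakehouse with the same schema
    lakehouse_workspace="<lakehouse workspace>", # workspace of that lakehouse
)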

Steps:

  • Install Semantic Link Labs

  • Specify the source and target workspace and dataset IDs

  • Configure data sources, parameters, RLS, and other options as required

#!pip install semantic-link-labs -q  # install Semantic Link Labs
import sempy_labs as labs
import sempy.fabric as fabric

def migrate_semantic_model(source_workspace_id, source_dataset_id, target_workspace_id, target_dataset_name):
    """
    Sandeep Pawar | fabric.guru | 02-10-2025
    Migrates/copies a semantic model from a source workspace to a target workspace.

    Parameters:
        source_workspace_id (str): The ID of the source workspace.
        source_dataset_id (str): The ID of the source dataset.
        target_workspace_id (str): The ID of the target workspace.
        target_dataset_name (str): The name of the target dataset.
    """
    # export the source model's BIM (metadata only, no data)
    bim_file = labs.get_semantic_model_bim(dataset=source_dataset_id, workspace=source_workspace_id)

    # check whether a model with the same name already exists in the target workspace
    datasets = fabric.list_datasets(workspace=target_workspace_id)
    if not any(datasets['Dataset Name'].isin([target_dataset_name])):
        # create a blank model first
        labs.create_blank_semantic_model(dataset=target_dataset_name, workspace=target_workspace_id)
        # then overwrite it with the source model's BIM
        labs.update_semantic_model_from_bim(dataset=target_dataset_name, bim_file=bim_file, workspace=target_workspace_id)
        print(f"Semantic model '{target_dataset_name}' has been created and updated in the target workspace.")

    else:
        print(f"Semantic model with the name '{target_dataset_name}' already exists in the target workspace.")
        datasets.query('`Dataset Name` == @target_dataset_name')
        display(datasets.query('`Dataset Name` == @target_dataset_name'))


source_workspace_id = "d70c1ed7--911f-3300a13145ff"
source_dataset_id = "368628e2--b089-87b3b84d93ec"
target_workspace_id = "e127c20c--a8d0-0947b7b87337"
target_dataset_name = "migrated_dataset"

migrate_semantic_model(source_workspace_id, source_dataset_id, target_workspace_id, target_dataset_name)
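
If you need to copy several models into the same target workspace, you can reuse the function in a loop. A minimal sketch, with placeholder dataset IDs and target names:

# Copy multiple models by reusing migrate_semantic_model() defined above.
# The source dataset IDs and target names below are placeholders.
models_to_copy = {
    "<source_dataset_id_1>": "migrated_dataset_1",
    "<source_dataset_id_2>": "migrated_dataset_2",
}

for dataset_id, target_name in models_to_copy.items():
    migrate_semantic_model(source_workspace_id, dataset_id, target_workspace_id, target_name)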

Update: 2/11/2025

Thanks!
