Custom Logging Package in Microsoft Fabric

February 17, 2026

Introduction

This guide is geared towards Microsoft Fabric specifically; I have not tested this process in other environments. The specific package we will be working with will likely only work in Microsoft Fabric, because it depends on notebookutils, which is specific to Fabric.

The general process, though, should work for building any Python package.

The code for this guide can be found in my fabric-logger-package GitHub repo. I am not going to focus much in this post on what the code does; that is covered in my recent Custom Logging in Microsoft Fabric Notebooks blog post.

Create Folder Structure For Custom Package

Create the folder structure needed to build the package.

fabric_logger_package/
├── fabric_logger/
│   ├── __init__.py
│   └── logger.py
└── setup.py
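
On a machine with a POSIX shell, the layout above can be created with a few commands (folder and file names taken directly from the structure shown):

```shell
# Create the package skeleton shown above
mkdir -p fabric_logger_package/fabric_logger
touch fabric_logger_package/fabric_logger/__init__.py
touch fabric_logger_package/fabric_logger/logger.py
touch fabric_logger_package/setup.py
```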

Create logger.py

This file contains the main logic of the package.

import notebookutils
from datetime import datetime
from notebookutils import mssparkutils
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

class FabricLogger:
    def __init__(self, eventhouse_uri, database="Logging", table="NotebookLogging"):
        self.logs = []
        self.eventhouse_uri = eventhouse_uri
        self.database = database
        self.table = table
        self.notebook_name = notebookutils.runtime.context.get("currentNotebookName")
        self.workspace_name = notebookutils.runtime.context.get("currentWorkspaceName")

    # Add records to the message log
    def log(self, log_level, log_message, log_trace=None):
        self.logs.append({
            "log_datetime": datetime.now(), 
            "workspace_name": self.workspace_name,
            "notebook_name": self.notebook_name,
            "log_level": log_level,
            "log_message": log_message,
            "log_trace": log_trace
        })

    # Write the message log to Eventhouse
    def write_to_eventhouse(self):
        # Define schema
        schema = StructType([
            StructField("log_datetime", TimestampType(), nullable=False),
            StructField("workspace_name", StringType(), nullable=True),
            StructField("notebook_name", StringType(), nullable=True),
            StructField("log_level", StringType(), nullable=False),
            StructField("log_message", StringType(), nullable=False),
            StructField("log_trace", StringType(), nullable=True)
        ])
        
        # Column order (must match schema order)
        columns_in_order = [
            "log_datetime", 
            "workspace_name",
            "notebook_name", 
            "log_level", 
            "log_message", 
            "log_trace"
        ]

        try:
            # Create spark session
            spark = SparkSession.builder.getOrCreate()
            
            # Create dataframe with schema, then select to ensure order
            df = spark.createDataFrame(self.logs, schema=schema).select(columns_in_order)
            
            # Write to Eventhouse
            df.write.format("com.microsoft.kusto.spark.synapse.datasource").\
            option("kustoCluster", self.eventhouse_uri).\
            option("kustoDatabase", self.database).\
            option("kustoTable", self.table).\
            option("accessToken", mssparkutils.credentials.getToken(self.eventhouse_uri)).\
            option("tableCreateOptions", "CreateIfNotExist").mode("Append").save()
        except Exception as e:
            print(f"Error writing to Eventhouse: {e}")

Create __init__.py

This file marks the folder as a Python package, records the version, and re-exports FabricLogger so it can be imported directly from the package root.

from .logger import FabricLogger

__version__ = "1.0.0"
__all__ = ['FabricLogger']
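
To see what the re-export in __init__.py buys you, here is a throwaway sketch that builds a tiny package with the same layout in a temp directory (the demo_pkg name is made up for illustration); the class becomes importable from the package root rather than from the logger submodule:

```python
import sys
import tempfile
from pathlib import Path

# Build a throwaway package with the same layout in a temp directory
pkg_root = Path(tempfile.mkdtemp())
pkg = pkg_root / "demo_pkg"
pkg.mkdir()
(pkg / "logger.py").write_text("class FabricLogger:\n    pass\n")
(pkg / "__init__.py").write_text("from .logger import FabricLogger\n")

# The re-export lets callers import from the package root
sys.path.insert(0, str(pkg_root))
from demo_pkg import FabricLogger

print(FabricLogger.__module__)  # demo_pkg.logger
```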

Create setup.py

This file defines the package metadata, such as its name and version, and tells setuptools how to build it.

from setuptools import setup, find_packages

setup(
    name="fabric_logger",
    version="1.0.0",
    packages=find_packages(),
    description="Eventhouse logging for Microsoft Fabric",
    author="Kevin of Tech",
    python_requires=">=3.8"
)

Build the Custom Package and Upload It

Now that we have our files and folder structure complete, it is time to build our package.

First, navigate to the base folder. In this case, that is fabric_logger_package.

Make sure you have build installed.

# The usual way to install it
pip install build

# If pip is not on your PATH, invoke it through Python instead
python -m pip install build

Now that build is installed, let's run it to build our package!

python -m build

If the build succeeds, a couple of new folders will appear. The one we are interested in for now is the "dist" folder.

cd dist

Looking at the contents of this folder, there are two files.

  • fabric_logger-1.0.0.tar.gz
  • fabric_logger-1.0.0-py3-none-any.whl

Open up Microsoft Fabric in your browser.

Navigate to a test workspace.

If you don't already have an Environment created, create a new Environment by clicking on the "+ New item" button at the top of the workspace.

Click on "All items" and search for "Environment".

Click on the "Environment" object. This will show a new pop up that asks you to name the new environment. Give it a name and click "Create".

Open the Environment. On the left navigation pane there is a section called "Libraries". Click on "Custom" under the "Libraries" section.

This should show you a "Custom libraries" page. Click the green "Upload" button.

Navigate to the "fabric_logger-1.0.0-py3-none-any.whl" file that you created. Select it and click "Open".

You should see the library on your screen now. You should also see a new banner across the top with a "Save" and a "Publish" button. Click "Save". It will take a couple of seconds to complete. After it is done, click "Publish". This may take a few minutes to complete.

Using the Package

Once you have uploaded the package, you can start using it with the code below in a Notebook. Make sure you change the Notebook's Environment to the custom Environment you created so it can see the library.

# Log without trace
from fabric_logger import FabricLogger

database = "Logging"
table = "NotebookLogging"
eventhouse_uri = "https://<your-kusto-cluster>.kusto.fabric.microsoft.com"

logger = FabricLogger(eventhouse_uri, database, table)

logger.log("Warning", "There was an error")

logger.write_to_eventhouse()

You can also import the traceback library to include a trace.

# Log with trace
from fabric_logger import FabricLogger
import traceback

database = "Logging"
table = "NotebookLogging"
eventhouse_uri = "https://<your-kusto-cluster>.kusto.fabric.microsoft.com"

logger = FabricLogger(eventhouse_uri, database, table)

try:
    1 / 0  # This raises a ZeroDivisionError
except Exception as e:
    log("Warning", str(e), traceback.format_exc())

logger.write_to_eventhouse()
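
For reference, traceback.format_exc() returns the full formatted traceback as a single string, which is what lands in the log_trace column. A quick standalone check:

```python
import traceback

# Capture the formatted traceback string inside an except block
try:
    1 / 0
except ZeroDivisionError:
    trace = traceback.format_exc()

# The string starts with the usual header and ends with the exception line
print(trace.splitlines()[0])   # Traceback (most recent call last):
print(trace.splitlines()[-1])  # ZeroDivisionError: division by zero
```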

I don't have a comments section yet, so feel free to send me feedback on this blog.


Kevin Williams

Kevin is a data engineer and is the Business Intelligence Practice Lead at Software Design Partners specializing in data warehousing. He is a father, an occasional gamer, and lover of many different types of music.


The opinions expressed on this site are my own and may not represent my employer's view.