Projects

Demonstrates a robust, fully automated process that loads the daily campaign performance CSV into a Microsoft Fabric Lakehouse with no manual intervention.
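
A minimal PySpark sketch of this kind of load, assuming a hypothetical landing path (`Files/landing/campaign_performance.csv`) and target table name (`campaign_performance`); in a Fabric notebook the `spark` session is already provided.

```python
from pyspark.sql import SparkSession, functions as F

# Fabric notebooks supply `spark`; this line supports standalone runs.
spark = SparkSession.builder.appName("daily-campaign-load").getOrCreate()

# Hypothetical landing path for the daily drop; adjust to your Lakehouse layout.
source_path = "Files/landing/campaign_performance.csv"

df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(source_path)
    .withColumn("ingested_at", F.current_timestamp())  # audit column
)

# Append the day's rows to a managed Delta table in the Lakehouse.
df.write.format("delta").mode("append").saveAsTable("campaign_performance")
```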

Demonstrates how Fabric can be used to architect and deploy a robust, automated, incremental pipeline that efficiently identifies and consolidates only each day's new log files into a single, queryable Delta Lake table in the Bronze layer of the Lakehouse.
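
One way to sketch the incremental pattern in PySpark, under assumed names: a control table (`bronze_ingest_log`, hypothetical) records which files have already been ingested, and a left anti-join keeps only the unseen files each run.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-log-ingest").getOrCreate()

LOGS_PATH = "Files/logs/*.json"      # hypothetical landing pattern
BRONZE_TABLE = "bronze_app_logs"     # hypothetical Bronze Delta table
CONTROL_TABLE = "bronze_ingest_log"  # hypothetical control table of processed files

# Tag every row with the file it came from, so files can be tracked.
logs = spark.read.json(LOGS_PATH).withColumn("source_file", F.input_file_name())

# Keep only rows from files not yet recorded in the control table.
if spark.catalog.tableExists(CONTROL_TABLE):
    processed = spark.table(CONTROL_TABLE).select("source_file")
    logs = logs.join(processed, "source_file", "left_anti")

if logs.head(1):  # anything new today?
    logs.write.format("delta").mode("append").saveAsTable(BRONZE_TABLE)
    (logs.select("source_file").distinct()
         .withColumn("processed_at", F.current_timestamp())
         .write.format("delta").mode("append").saveAsTable(CONTROL_TABLE))
```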


Demonstrates use of Dataflow Gen2 to build a robust, repeatable, no-code solution that cleans and standardizes customer data from multiple sources into the Silver layer of the Lakehouse.
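
Dataflow Gen2 is built visually in Power Query, so there is no code artifact to show; the PySpark sketch below only illustrates the kind of standardization the dataflow applies, with hypothetical Bronze source tables and column names.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("silver-customer-clean").getOrCreate()

# Hypothetical Bronze sources with overlapping but not identical schemas.
crm = spark.table("bronze_customers_crm")
web = spark.table("bronze_customers_web")

customers = (
    crm.unionByName(web, allowMissingColumns=True)
    .withColumn("email", F.lower(F.trim("email")))       # normalize casing/whitespace
    .withColumn("full_name", F.initcap(F.trim("full_name")))
    .na.drop(subset=["email"])                           # drop rows missing the key
    .dropDuplicates(["email"])                           # one row per customer
)

customers.write.format("delta").mode("overwrite").saveAsTable("silver_customers")
```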


Demonstrates a rules-driven ingestion pipeline in Microsoft Fabric using dynamic metadata, expressions, Switch routing, and automated archiving. The pipeline classifies files by type and pattern, enabling clean, scalable, hands-off data ingestion.
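
The pipeline itself is assembled from Data Factory activities (Get Metadata, ForEach, Switch) rather than code; this small Python sketch mirrors its classification-and-routing rules for illustration, with hypothetical patterns and destination folders.

```python
import fnmatch

# Hypothetical routing rules mirroring the pipeline's Switch cases: each file
# is classified by name pattern and routed to a destination folder; anything
# unmatched falls through to the archive, like the Switch's default case.
RULES = [
    ("sales_*.csv", "Files/bronze/sales/"),
    ("*.json",      "Files/bronze/events/"),
    ("*.parquet",   "Files/bronze/snapshots/"),
]
ARCHIVE_DIR = "Files/archive/unmatched/"

def route(filename: str) -> str:
    """Return the destination folder for a file name."""
    for pattern, destination in RULES:
        if fnmatch.fnmatch(filename, pattern):
            return destination
    return ARCHIVE_DIR

if __name__ == "__main__":
    for name in ("sales_2024-06-01.csv", "clicks.json", "notes.txt"):
        print(f"{name} -> {route(name)}")
```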


Demonstrates how PySpark optimization and Delta Lake reliability features were applied to fix a failing enterprise finance pipeline, delivering high-performance joins, guaranteed data quality, and a fully recoverable, production-grade SCD Type 2 system.
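
A condensed sketch of an SCD Type 2 merge with the Delta Lake Python API, using hypothetical table and column names (`dim_account`, with `account_hash` as a change-detection hash): changed current rows are closed first, then new versions and brand-new keys are appended as current rows.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-merge").getOrCreate()

updates = spark.table("staging_accounts")       # hypothetical staging table
dim = DeltaTable.forName(spark, "dim_account")  # hypothetical SCD2 dimension

# Step 1: close out current rows whose tracked attributes changed.
(dim.alias("d")
    .merge(updates.alias("u"),
           "d.account_id = u.account_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.account_hash <> u.account_hash",
        set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# Step 2: rows with no surviving current match are either changed or new keys.
new_rows = (updates.alias("u")
    .join(spark.table("dim_account").alias("d"),
          (F.col("u.account_id") == F.col("d.account_id")) & F.col("d.is_current"),
          "left_anti"))

(new_rows
    .withColumn("is_current", F.lit(True))
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .write.format("delta").mode("append").saveAsTable("dim_account"))
```

Because both steps write through Delta, the merge is ACID: a failed run can simply be retried or rolled back via table history, which is what makes the design recoverable.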