
Projects

Demonstrates a robust, fully automated process that loads the daily campaign performance CSV into a Microsoft Fabric Lakehouse without any manual intervention.
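Stripped of the Fabric plumbing, the heart of such a load is parsing and typing the daily file before it lands in a Delta table. The sketch below is a pure-Python illustration only; the column names (`campaign_id`, `impressions`, `clicks`) and sample data are assumptions, not the real schema:

```python
import csv
import io

def load_daily_campaign_csv(csv_text):
    """Parse a daily campaign-performance CSV and coerce numeric fields.

    Illustrative stand-in for the automated Fabric load; the column
    names here are assumptions, not the pipeline's actual schema.
    """
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append({
            "campaign_id": row["campaign_id"],
            "impressions": int(row["impressions"]),
            "clicks": int(row["clicks"]),
        })
    return rows

# Hypothetical daily file contents.
sample = "campaign_id,impressions,clicks\nC001,1000,50\nC002,2400,96\n"
rows = load_daily_campaign_csv(sample)
```

In Fabric itself this step would typically be a scheduled Copy activity or notebook writing straight to a Lakehouse Delta table.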

Demonstrates how Fabric can be used to architect and deploy a robust, automated, incremental pipeline that efficiently identifies and consolidates only each day's new log files into a single, queryable Delta Lake table in the Bronze layer of the Lakehouse.
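The "only the new files" logic reduces to a set difference against a record of files already consolidated. A minimal sketch, with illustrative file names (the real pipeline would track processed files in the Lakehouse, not in an in-memory set):

```python
def find_new_files(landed_files, processed):
    """Return only the files not yet consolidated, in deterministic order.

    Pure-Python stand-in for the incremental-ingestion check; in the
    actual pipeline the 'processed' record would be persisted state.
    """
    return sorted(f for f in landed_files if f not in processed)

# Hypothetical state: files consolidated on previous runs.
processed = {"logs/2024-01-01.json", "logs/2024-01-02.json"}

# Hypothetical listing of the landing folder today.
landed = [
    "logs/2024-01-02.json",
    "logs/2024-01-03.json",
    "logs/2024-01-04.json",
]

new_files = find_new_files(landed, processed)
```

Only `new_files` would then be read and appended to the Bronze Delta table, keeping each daily run proportional to the new data rather than the full history.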


Demonstrates use of Dataflow Gen2 to build a robust, repeatable, no-code solution that cleans and standardizes customer data from multiple sources into the Silver layer of the Lakehouse.


Demonstrates a rules-driven ingestion pipeline in Microsoft Fabric using dynamic metadata, expressions, Switch routing, and automated archiving. The pipeline classifies files by type and pattern, enabling clean, scalable, hands-off data ingestion.
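The Switch-style classification amounts to matching each incoming file name against an ordered list of rules and routing it to the first match. The sketch below illustrates the idea with glob patterns; the patterns and destination paths are assumptions, not the pipeline's real metadata:

```python
import fnmatch

# Ordered routing rules: (glob pattern, destination folder).
# Both columns are hypothetical; the real pipeline drives these
# from a metadata table evaluated by a Switch activity.
ROUTES = [
    ("sales_*.csv", "Files/bronze/sales"),
    ("*.json", "Files/bronze/logs"),
    ("*.parquet", "Files/bronze/curated"),
]
DEFAULT_ROUTE = "Files/quarantine"  # unmatched files are set aside

def classify(filename):
    """Return the destination for a file: first matching rule wins."""
    for pattern, destination in ROUTES:
        if fnmatch.fnmatch(filename, pattern):
            return destination
    return DEFAULT_ROUTE
```

Because the rules live in data rather than in pipeline logic, new file types can be onboarded by adding a row, not by editing the pipeline, which is what makes the ingestion hands-off and scalable.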


Demonstrates how PySpark optimization and Delta Lake reliability features were applied to fix a failing enterprise finance pipeline, delivering high-performance joins, guaranteed data quality, and a fully recoverable, production-grade SCD Type 2 system.
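An SCD Type 2 upsert expires the current version of any changed row and inserts a new current version, preserving full history. This pure-Python sketch mirrors the logic a Delta Lake MERGE would perform; the schema (`id`, `attr`, `valid_from`, `valid_to`, `current`) and sample data are illustrative, not the finance pipeline's actual model:

```python
from datetime import date

def scd2_upsert(dim, updates, as_of):
    """Apply SCD Type 2 changes to a dimension.

    dim:     list of row dicts with id/attr/valid_from/valid_to/current.
    updates: {id: new_attr} for the incoming batch.
    Changed keys get their current row expired and a new version
    inserted; unseen keys are inserted as brand-new current rows.
    """
    out, seen = [], set()
    for row in dim:
        key = row["id"]
        if row["current"] and key in updates and updates[key] != row["attr"]:
            # Expire the old version, then open a new current version.
            out.append({**row, "valid_to": as_of, "current": False})
            out.append({"id": key, "attr": updates[key],
                        "valid_from": as_of, "valid_to": None, "current": True})
        else:
            out.append(row)
        seen.add(key)
    for key, attr in updates.items():
        if key not in seen:  # brand-new dimension member
            out.append({"id": key, "attr": attr,
                        "valid_from": as_of, "valid_to": None, "current": True})
    return out

# Hypothetical dimension state and incoming batch.
dim = [{"id": 1, "attr": "A", "valid_from": date(2024, 1, 1),
        "valid_to": None, "current": True}]
result = scd2_upsert(dim, {1: "B", 2: "C"}, date(2024, 6, 1))
```

In the production system the same expire-and-insert semantics would be expressed as a Delta Lake `MERGE`, whose ACID guarantees and time travel provide the recoverability the blurb describes.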
