Building scalable pipelines & transforming raw data into analytics-ready datasets.
I specialize in building end-to-end ETL/ELT workflows and analytical infrastructure. My focus is turning messy raw data into clean, reliable datasets that power real decisions.
Currently pursuing my B.E. in Computer Science & Data Science at Lords Institute of Engineering and Technology. CGPA: 8.2/10.
Modular ELT pipeline using dbt and SQL with layered modeling — Staging, Intermediate, Marts — and automated testing for reliable analytics-ready datasets.
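The layered flow can be sketched conceptually in Python. The actual project uses dbt SQL models; the table, columns, and business rule below are hypothetical, and the final assertions play the role of dbt's `unique` / `not_null` schema tests.

```python
import pandas as pd

# Hypothetical raw source table standing in for warehouse data.
raw_orders = pd.DataFrame({
    "ORDER_ID": [1, 2, 2, 3],
    "AMOUNT": ["10.5", "20.0", "20.0", "7.25"],
    "STATUS": ["shipped", "PENDING", "PENDING", "shipped"],
})

# Staging layer: rename, cast, and deduplicate (what a dbt stg_ model does).
stg_orders = (
    raw_orders
    .rename(columns={"ORDER_ID": "order_id", "AMOUNT": "amount", "STATUS": "status"})
    .drop_duplicates()
    .assign(
        amount=lambda d: pd.to_numeric(d["amount"]),
        status=lambda d: d["status"].str.lower(),
    )
)

# Intermediate layer: business logic, e.g. keep only shipped orders.
int_orders_shipped = stg_orders[stg_orders["status"] == "shipped"]

# Marts layer: aggregated, analytics-ready output.
mart_revenue = int_orders_shipped["amount"].sum()

# Automated tests, analogous to dbt's unique / not_null checks.
assert stg_orders["order_id"].is_unique
assert stg_orders["amount"].notna().all()
```

In dbt the same structure lives in `models/staging`, `models/intermediate`, and `models/marts`, with the tests declared in schema YAML rather than inline.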
End-to-end pipelines orchestrated via Databricks Workflows with dependency management across dimension and fact processing layers.
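The dimension-before-fact dependency can be expressed directly in a Databricks Jobs (Workflows) spec via `depends_on`. The task keys and notebook paths below are hypothetical; only the `tasks` / `depends_on` shape follows the Jobs API:

```python
import json

# Sketch of a Workflows job spec: the fact task waits for both dimension tasks.
job_spec = {
    "name": "nightly_warehouse_load",
    "tasks": [
        {
            "task_key": "load_dim_customers",
            "notebook_task": {"notebook_path": "/pipelines/dim_customers"},
        },
        {
            "task_key": "load_dim_products",
            "notebook_task": {"notebook_path": "/pipelines/dim_products"},
        },
        {
            # Fact processing runs only after both dimension tasks succeed.
            "task_key": "load_fact_sales",
            "depends_on": [
                {"task_key": "load_dim_customers"},
                {"task_key": "load_dim_products"},
            ],
            "notebook_task": {"notebook_path": "/pipelines/fact_sales"},
        },
    ],
}

payload = json.dumps(job_spec, indent=2)
```

The same graph can be built in the Workflows UI; the spec form makes the dependency structure reviewable in version control.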
Used car market analysis with Python and machine learning to predict prices, with results visualized in Tableau.
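A minimal sketch of the price-prediction idea: fit a regression model on hypothetical car features (age, mileage) generated synthetically here. The real project's dataset, feature set, and model choice are not shown.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 200
age = rng.uniform(0, 15, n)            # years since manufacture
mileage = rng.uniform(0, 200_000, n)   # kilometres driven
# Synthetic prices: newer, lower-mileage cars cost more, plus noise.
price = 20_000 - 800 * age - 0.05 * mileage + rng.normal(0, 500, n)

X = np.column_stack([age, mileage])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, price)

# A nearly-new low-mileage car should be priced above an old high-mileage one.
p_new = model.predict([[3, 40_000]])[0]
p_old = model.predict([[12, 150_000]])[0]
```

The fitted predictions (or feature importances) would then be exported for visualization in Tableau.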
Python-based pipeline to fetch, clean, and transform historical stock price data. Automated ingestion via fetch_data.py and data processing with Databricks notebooks — delivering structured JSON datasets for financial analysis and modeling.
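The clean-and-transform step can be sketched as below. `fetch_data.py` presumably pulls raw rows from an API; a small hypothetical sample stands in here so the example is self-contained, and the field names are assumptions.

```python
import json

# Hypothetical raw rows as a fetch step might return them.
raw_rows = [
    {"Date": "2024-01-02", "Close": "187.15", "Volume": "82488700"},
    {"Date": "2024-01-03", "Close": "184.25", "Volume": "58414500"},
    {"Date": "2024-01-03", "Close": "184.25", "Volume": "58414500"},  # duplicate
    {"Date": "2024-01-04", "Close": None, "Volume": "71983600"},      # missing close
]

def clean(rows):
    """Deduplicate by date, drop rows with missing prices, and cast types."""
    seen, out = set(), []
    for r in rows:
        if r["Close"] is None or r["Date"] in seen:
            continue
        seen.add(r["Date"])
        out.append({
            "date": r["Date"],
            "close": float(r["Close"]),
            "volume": int(r["Volume"]),
        })
    return out

records = clean(raw_rows)
payload = json.dumps(records, indent=2)  # structured JSON for downstream analysis
```

In the pipeline itself this logic would run inside the Databricks notebooks, writing the JSON out for the financial-analysis step.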