Professional Experience
With over seven years of experience in data and analytics, I've worked on a wide range of projects and technologies. Here's a detailed look at my professional journey.
Senior Data Engineer
Department of Health and Social Care
Key Responsibilities:
- Led a major data architecture review and cloud migration from legacy systems to improve efficiency.
- Built and optimized Azure Databricks data pipelines using Databricks Asset Bundles (DABs), CI/CD, and dbt, improving the development cycle by 80%.
- Mentored junior engineers, driving best practices in Git, CI/CD, and Pytest for enhanced data quality (illustrated in the sketch below).
- Designed and implemented scalable data models, a data lakehouse, and pipelines enabling advanced analytics for reporting and decision-making.
Technologies Used:
Azure Databricks, dbt, Python, R, Spark SQL, PySpark, Databricks Asset Bundles, Power BI, CI/CD, Git, Dagster, SQL Server, SSIS
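Below is a minimal, hypothetical sketch of the kind of PySpark transformation and Pytest data-quality check referenced above; the function, table, and column names (deduplicate_events, event_id, updated_at) are illustrative placeholders, not code from the actual DHSC project.

import pytest
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F


def deduplicate_events(df: DataFrame) -> DataFrame:
    # Keep only the latest record per event_id, a common silver-layer cleanup step.
    latest = df.groupBy("event_id").agg(F.max("updated_at").alias("updated_at"))
    return df.join(latest, on=["event_id", "updated_at"], how="inner")


@pytest.fixture(scope="session")
def spark():
    # Local Spark session for unit tests; no cluster required.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_deduplicate_events_keeps_latest(spark):
    df = spark.createDataFrame(
        [("e1", 1), ("e1", 2), ("e2", 1)],
        ["event_id", "updated_at"],
    )
    result = {r["event_id"]: r["updated_at"] for r in deduplicate_events(df).collect()}
    assert result == {"e1": 2, "e2": 1}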
Senior Data Engineer
Quantiphi Inc
Key Responsibilities:
- Designed and deployed scalable data pipelines handling 100GB+ of daily streaming data from sources including IoT devices and APIs.
- Led big data processing initiatives with BigQuery, Dataflow, Pub/Sub, and Airflow, improving data accessibility and performance (see the sketch below).
- Built real-time dashboards using Looker, providing key business insights for the renewable energy sector.
- Collaborated with stakeholders to define KPIs and standardize data reporting in alignment with business strategy.
Technologies Used:
Google Cloud Platform (GCP), BigQuery, Apache Airflow, Looker, SQL, Python, Pub/Sub, Dataflow, Git, CI/CD
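Below is an illustrative sketch of a streaming Dataflow (Apache Beam) pipeline of the kind described above, reading from Pub/Sub, windowing, and writing to BigQuery; the project, subscription, table, and schema names are hypothetical placeholders rather than the actual client setup.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window


def run():
    # Streaming mode so the pipeline runs continuously on Dataflow.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/iot-telemetry"  # hypothetical
            )
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "Window" >> beam.WindowInto(window.FixedWindows(60))  # one-minute windows
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.iot_telemetry",  # hypothetical
                schema="device_id:STRING,reading:FLOAT,event_time:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()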
Business Intelligence Analyst
Sports Marketing Surveys USA
Key Responsibilities:
- Leveraged Amazon Web Services (AWS) to design and implement scalable, cost-effective data solutions, optimizing data integration, storage, and analytics capabilities.
- Developed predictive analytics models in Python, improving customer engagement metrics by 18% (see the sketch below).
- Led A/B testing and optimized data workflows for enhanced business insights using SQL and Excel.
- Built interactive dashboards in Shiny and Dash.
Technologies Used:
SQL, Python, R, AWS Glue, AWS S3, Redshift, Excel
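Below is a minimal sketch of the sort of predictive engagement model described above, built as a standard scikit-learn pipeline; the file path, feature columns, and target column are hypothetical placeholders for illustration only.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical extract of customer activity data (e.g. pulled from Redshift/S3).
df = pd.read_csv("survey_responses.csv")
features = ["sessions_last_30d", "avg_purchase_value", "email_open_rate"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["engaged_next_month"], test_size=0.2, random_state=42
)

# Scale features, then fit a simple logistic regression classifier.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("Test ROC AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))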
Software Quality Operation Analyst
Waymo (Google X)
Key Responsibilities:
- Assured software quality for the self-driving car by designing test cases, reviewing simulation test results, and filing and tracking bugs using internal tools and Google Cloud, ensuring the operational safety of the SDC fleet on the road.
- Identified process improvements that reduced backlog rates by 10%, enabling faster submission of test results.
Technologies Used:
BigQuery, Python, SQL, Looker
Business Analyst
Commonwealth of Massachusetts
Key Responsibilities:
- Involved in all phases of the software development life cycle (SDLC), gathering and analysing requirements and providing design solutions to support final product delivery.
- Maintained artefacts such as Business Requirement Documents (BRDs) and Functional Requirement Documents (FRDs), obtaining approval and sign-off to support the ongoing development phase and lock down the scope of the project.
Technologies Used:
SQL, Jira, Excel, Visio
Skills & Expertise
Technical Skills
- Data Modeling
- ETL/ELT pipelines
- Data warehouse design & management
- Big data processing
- Data visualisation
Soft Skills
- Team Leadership
- Project Management
- Technical Mentoring
- Problem Solving
- Communication