Professional Resume
Professional Summary
Results-driven Data Analyst with hands-on experience using analytical tools and designing AI-integrated dashboards.
Proficient in data collection, extraction, cleaning, processing, and visualization. Experienced in performing
statistical and spatial analysis, and developing actionable reports and dashboards. Strong understanding of core
analytical concepts such as data wrangling, exploratory data analysis, data modelling, data warehousing, and
performance metrics. Skilled at troubleshooting, optimizing analytical workflows, automating processes,
and documenting insights to improve efficiency and decision-making.
Experience
Intern | April 2023 – June 2023
- Designed customer support workflows using Amazon Connect, improving call routing and user experience.
- Worked with ServiceNow for data retrieval and navigation within ITSM workflows.
- Gained foundational understanding of cloud-based support tools and process automation in a business environment.
Intern | July 2025 – Present
- Integrated AI solutions into business intelligence platforms and web applications, gaining hands-on experience in AI-driven analytics.
- Embedded AI insights into Power BI reports and dashboards to enable smarter, data-driven decision-making.
- Developed a web application that generates AI insights from user activity, providing productivity and time-optimization suggestions.
Education
- Master of Information Communication Technology (Data Analytics), Western Sydney University – Nov 2024
- Bachelor of Science in Information Technology, Mumbai University – Apr 2022
Skills
Databricks & ETL Pipelines
Data Warehousing (Azure/GCP)
SQL (Advanced Queries, Joins, CTEs)
Python (Pandas, PySpark)
Data Modelling & Cleaning
Data Visualization (Power BI, Tableau)
Statistical Analysis & EDA
Workflow Automation & Reporting
Geospatial Analysis
Certifications
IBM Data Analyst
Microsoft Excel
Cisco Data Analytics Essentials
TAFE NSW Micro Skill
LinkedIn Data Analyst
Tesla Hackathon
Projects
1) NSW Road Accidents Analysis
- Cleaned and standardized ~200,000 records across 8 years of crash data using Excel, Python, and Databricks SQL.
- Loaded data from Azure into Databricks and refined it with Medallion architecture layers for data quality and usability.
- Automated jobs, parameterized pipelines, and optimized ETL processes for accuracy and efficiency.
- Executed complex SQL queries to identify high-risk intersections, accident hotspots, and recurring crash patterns.
- Conducted geospatial analysis, identifying over 2,500 high-risk zones.
- Developed dashboards in Databricks and Power BI, delivering insights projected to reduce crashes by 10–15%.
2) CAIXA Bank Analysis (Hackathon 2024)
- Cleaned and standardized historical customer transaction data, handling missing values, duplicates, and inconsistencies.
- Improved data loading efficiency by 82% using query optimization techniques.
- Integrated data from JSON, CSV, and XLSX sources to improve completeness and reliability.
- Performed time-series and exploratory analysis to uncover fraud patterns and detect high-risk merchants.
- Developed interactive reports and dashboards for customer behavior insights and fraud detection.
- Implemented machine learning models for automated fraud detection and personalized recommendations.