What You’ll Do:
- Design & Implement Data Models (star or snowflake schema) to support reporting and analytics (see the schema sketch after this list).
- Optimize Reporting Performance by building aggregation models and improving compute efficiency.
- Enable Self-Service Analytics by making data accessible and easy to query for business users.
- Ensure Data Quality & Validation through unit testing and well-defined validation rules.
- Collaborate with Data Engineers (DE) to ensure the availability and quality of required data sources.
- Define Data Requirements for DE by specifying the data structures and transformations needed for analytics and modeling.
- Work with Cross-Functional Teams to gather reporting requirements and provide well-structured data solutions.
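
To give candidates a concrete sense of the modeling work, here is a minimal sketch of a star schema plus an aggregation model. It uses SQLite via Python purely for illustration; the domain and all names (`dim_date`, `dim_customer`, `fact_orders`, `agg_monthly_revenue`) are hypothetical, not a prescribed design.

```python
import sqlite3

# Illustrative orders domain: one fact table surrounded by dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,   -- e.g. 20240131
        full_date    TEXT NOT NULL,
        month        INTEGER NOT NULL,
        year         INTEGER NOT NULL
    );
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        name         TEXT NOT NULL,
        segment      TEXT
    );
    -- Fact table: one row per order, foreign keys into the dimensions.
    CREATE TABLE fact_orders (
        order_key    INTEGER PRIMARY KEY,
        date_key     INTEGER REFERENCES dim_date(date_key),
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        quantity     INTEGER NOT NULL,
        revenue      REAL NOT NULL
    );
""")

# An aggregation model like this is what keeps reports fast: dashboards
# read the small pre-summed result instead of scanning the fact table.
conn.execute("""
    CREATE VIEW agg_monthly_revenue AS
    SELECT d.year, d.month, SUM(f.revenue) AS revenue
    FROM fact_orders f
    JOIN dim_date d USING (date_key)
    GROUP BY d.year, d.month
""")
```

In production the same shape would live in a warehouse (and the view would typically be a materialized or dbt-built table), but the modeling idea is identical.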
What You’ll Need:
- Strong proficiency in SQL, Python, and Git.
- Hands-on experience with Airflow, dbt, and data transformation tools (a minimal DAG sketch follows this list).
- Experience in optimizing reports and dashboards for better performance.
- Knowledge of data validation, unit testing, and error handling in data pipelines.
- Ability to understand business requirements and translate them into efficient data models.
- Ability to collaborate with Data Engineers to ensure smooth data integration and availability.
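
To make the Airflow and validation expectations concrete, here is a minimal DAG sketch, assuming the Airflow 2.x (2.4+) API. The DAG id, task names, and the quality rule are all illustrative placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task bodies; a real pipeline would call dbt, a warehouse
# client, etc.
def load_orders(**context):
    ...

def validate_orders(**context):
    # Fail the run loudly when a basic quality rule is violated, rather
    # than letting bad rows reach a dashboard.
    row_count = 42  # placeholder; would come from a warehouse query
    if row_count == 0:
        raise ValueError("orders load produced zero rows")

with DAG(
    dag_id="orders_reporting",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)
    validate = PythonOperator(task_id="validate_orders", python_callable=validate_orders)
    load >> validate  # validation gates downstream reporting
```

The point of the `load >> validate` ordering is the error-handling discipline mentioned above: checks run as first-class pipeline steps, not as an afterthought.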
It’d be great if you have:
- Experience with BI tools (Looker, Tableau, Power BI, etc.).
- Familiarity with ETL pipelines and cloud data warehouses (BigQuery, Snowflake, Redshift).
- Understanding of SLA management for data pipeline execution & reporting (see the freshness-check sketch below).
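
As one illustration of the SLA point, a data-freshness check might look like the sketch below. The two-hour window, the helper name, and the simulated timestamp are assumptions for the example; in practice the last-refresh time would come from warehouse metadata (e.g. a BigQuery `INFORMATION_SCHEMA` query).

```python
from datetime import datetime, timedelta, timezone

# Hypothetical SLA: the reporting table must have been refreshed
# within the last 2 hours.
SLA_WINDOW = timedelta(hours=2)

def check_freshness(last_refreshed_at: datetime) -> None:
    """Raise if the table's last refresh breaches the SLA window."""
    age = datetime.now(timezone.utc) - last_refreshed_at
    if age > SLA_WINDOW:
        raise RuntimeError(
            f"Reporting table is stale: refreshed {age} ago, SLA is {SLA_WINDOW}"
        )

# Simulated input: a table refreshed 30 minutes ago passes the check.
check_freshness(datetime.now(timezone.utc) - timedelta(minutes=30))
```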