What you’ll Do:
- Support the team in designing, building, and maintaining data pipelines and data products
- Assist in developing real-time or batch data processing workflows
- Help monitor data jobs and troubleshoot data issues
- Work with engineers and analysts to understand data needs and support internal projects
- Participate in improving data quality, data documentation, and data governance
- Learn and follow best practices for code quality, performance, and reliability
What you’ll Need:
- Currently pursuing, or recently graduated with, a degree in Computer Science, Computer Engineering, IT, or a related field
- Basic understanding of programming (Python preferred) and SQL
- Interest in big data technologies and data engineering concepts
- Eagerness to learn new tools, frameworks, and technologies
- Good problem-solving skills and a strong growth mindset
- Ability to communicate in Thai and basic English (reading documentation, writing simple updates)
It’d be Great if you have:
- Familiarity with database systems (e.g., MySQL, PostgreSQL)
- Exposure to data engineering tools such as Airflow, Spark, Kafka, or cloud platforms (AWS/GCP)
- Experience with Docker, Linux, or Git
- Coursework or personal projects related to data engineering, analytics, or distributed systems