[HCM] Data Engineer

Salary: Negotiable

Location: Ho Chi Minh Office

Team: Data Analytics

Application period: 19/09 to 31/10/2025

Job Description

Job Scope

We are seeking a motivated Data Engineer with at least 1 year of experience to join our regional Data & Analytics team. In this role, you will collaborate with senior engineers and analysts to design, build, and optimize data pipelines, ensuring reliable and scalable data flows across multiple markets (Vietnam, Hong Kong, Taiwan, Myanmar).

This role is ideal for someone who is eager to learn modern data engineering practices, thrives in a collaborative environment, and wants to grow into a strong data professional in a fast-paced F&B/retail environment.

Job Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines (batch and streaming) across multiple data sources.
  • Work with tools like Apache Airflow, Spark, dbt, and Kafka for orchestration and real-time data streaming.
  • Develop and optimize data workflows on Google Cloud Platform (BigQuery, GCS, Dataflow, GKE).
  • Implement data modeling and data validation to support BI dashboards and ML use cases.
  • Collaborate with data analysts, ML engineers, and business teams to deliver high-quality, business-ready datasets.
  • Support CI/CD pipelines, containerized deployments (Docker, Kubernetes), and infrastructure automation.

Requirements

Education:

  • Bachelor’s degree in a related field.

Experience:

  • 1+ years of experience in data engineering, data analytics engineering, or a related role.
  • Strong SQL skills and familiarity with at least one programming language (Python preferred, ideally with Spark experience).
  • Hands-on experience with at least some of the following:
    • Data orchestration: Airflow, dbt, or similar.
    • Streaming & messaging: Kafka or similar.
    • Cloud platforms: GCP (BigQuery, GCS, Dataproc, Dataflow) or equivalent AWS/Azure experience.
    • Containers & orchestration: Docker, Kubernetes.
  • Understanding of ETL/ELT concepts, partitioning, and data modeling.
  • Good communication skills and fluency in English (written and spoken).
  • Eagerness to learn and grow in a cross-regional, Agile/Scrum team environment.

Nice to have:

  • Exposure to machine learning pipelines or MLOps tools (MLflow, DVC, Great Expectations).
  • Familiarity with BI tools such as Power BI, Looker, or Data Studio.
  • Experience with Git-based CI/CD and Infrastructure as Code (Terraform).

Benefit package:

  • Attractive salary package depending on seniority
  • Hybrid Work Policy
  • 16 days of annual leave + 6 days of paid sick leave
  • Annual health check-up and extra PVI Healthcare coverage
  • Learning & training opportunities
  • Caring policies, supportive and employee-centric work environment
  • Engagement Activities
