One Bangkok: Senior Data Engineer
Job Description
We are seeking an experienced Data Engineer to support the design, implementation, and ongoing management of a cloud-based data platform. The role is responsible for building robust data pipelines, ensuring data quality and governance, and enabling scalable analytics and AI use cases across a mixed-use development. The ideal candidate combines strong technical expertise with practical experience in production environments and works closely with cross-functional teams, vendors, and business stakeholders.
Key Responsibilities
1. Data Pipeline Development & Integration
- Design, build, and maintain scalable data pipelines (batch and streaming)
- Ingest data from multiple sources (IoT systems, enterprise systems, third-party APIs)
- Implement ETL/ELT processes using GCP-native tools such as Cloud Dataflow, BigQuery, and Cloud Pub/Sub
2. Data Platform Implementation
- Support the setup and enhancement of the overall data architecture on GCP
- Work with solution architects to define data models, storage, and processing layers
- Ensure platform scalability, performance, and cost efficiency
3. Data Governance & Quality
- Implement data validation, monitoring, and quality controls
- Support data cataloging, lineage, and metadata management
- Ensure compliance with internal data governance and security policies
4. Operations & Maintenance
- Monitor pipeline performance and troubleshoot issues
- Optimize jobs for reliability and cost
- Manage production deployments and version control
- Collaborate with DevOps for CI/CD pipelines
5. Stakeholder & Vendor Coordination
- Work closely with internal teams (ICT, digital, operations)
- Coordinate with external vendors and system integrators
- Translate business requirements into technical data solutions
6. Enable Analytics & AI Use Cases
- Prepare clean, structured datasets for dashboards and AI models
- Support integration with BI tools (e.g., Looker, Power BI)
- Enable data availability for predictive analytics and smart city use cases
Key Qualifications
Education
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
Experience
- 5-10 years of experience in data engineering, data platforms, or related roles
- Proven experience building and maintaining production-grade data pipelines
- Hands-on experience with GCP is highly preferred
Technical Skills
- Strong proficiency in SQL and Python
- Experience with distributed data processing and streaming technologies (e.g., Apache Beam/Dataflow, Apache Spark, Apache Kafka, or Cloud Pub/Sub)
- Familiarity with GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Understanding of data warehousing and data lake concepts
- Experience with API integration and real-time data streaming
Data & Architecture Knowledge
- Strong understanding of data modeling (star schema, normalization)
- Knowledge of ETL vs. ELT approaches and when to apply each
- Understanding of data lifecycle management
- Exposure to data governance frameworks is a plus
Soft Skills
- Strong problem-solving and analytical thinking
- Ability to work independently and manage multiple priorities
- Strong communication skills for engaging both technical and non-technical stakeholders
Preferred Background
- Experience in smart buildings, real estate, or mixed-use developments
- Exposure to IoT data integration (e.g., BMS, Wi-Fi analytics, sensors)
- Experience working in multi-vendor environments (Service Integration and Management, SIAM)