Kyanon Digital is a Vietnam-based tech powerhouse delivering world-class solutions to clients across the globe. We offer end-to-end solutions that span every facet of the digital landscape. Our slogan, “Digital Impact that Matters”, has guided our team of over 300 employees for more than 12 years, driving positive change for major clients across a range of industries.
We are seeking a highly skilled Data Engineer to join our team and play a pivotal role in building and maintaining our data infrastructure. The ideal candidate will have a strong foundation in data engineering principles, a passion for data, and a proven ability to design and implement robust data pipelines. You will collaborate closely with data scientists, analysts, and business stakeholders to ensure data quality, accessibility, and reliability.
How You Can Contribute
- Engineering & Implementation
- Design, develop, and maintain scalable data pipelines and ETL processes to extract, transform, and load data from various sources into data warehouses and data lakes.
- Build and optimize data models and schemas to support business intelligence and analytics needs.
- Develop and implement data quality checks and monitoring systems to ensure data accuracy and integrity.
- Collaborate with data scientists, analysts, and AI Engineers to understand their data requirements and translate them into technical solutions.
- Optimize data lake and data warehouse performance through indexing, partitioning, and query optimization.
- Troubleshoot data-related issues and provide timely resolutions.
- Others
- Stay up to date with the latest data engineering technologies and trends.
- Collaborate with stakeholders and provide guidance on best practices.
- Participate in code reviews and knowledge sharing sessions.
- Contribute to the development of data governance policies and procedures.
What You Need To Maximize Your Contribution
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
- 5+ years of experience as a Data Engineer with a strong track record of delivering data solutions.
- Proficiency in SQL and in Python or another programming language.
- Experience with data warehousing and data lake technologies (e.g., Snowflake, AWS Redshift, S3, Azure Data Lake).
- Knowledge of ETL/ELT tools and frameworks (e.g., Airflow, Talend, Informatica).
- Experience with cloud platforms (AWS, GCP, Azure) is preferred.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.
- Excellent communication and interpersonal skills.
Ready to join our team? Send your updated resume highlighting your experience to [email protected]. We will contact you soon.