Senior Data Engineer (Apache Airflow, dbt, Superset)
Ho Chi Minh City, VN 700000 · Hanoi, VN 10000
The Role
We are seeking a pragmatic and autonomous Senior Data Engineer to accelerate our data roadmap. Your primary mission will be to expand our data ecosystem by ingesting new sources and building reliable data models, directly enabling our self-service analytics goals. You will take ownership of our Airflow DAGs and dbt models, balancing the development of new pipelines with the maintenance of our existing critical infrastructure.
This role is for a hands-on engineer who thrives on delivering business value. While you will collaborate with senior engineers on high-level architecture for novel data sources, you will be fully empowered to own the implementation, operation, and success of your projects.
Sign-on Bonus: Eligible for candidates who are currently employed elsewhere and able to join GFT within 30 days of offer acceptance.
Key Responsibilities
● Expand Data Coverage: Proactively own the end-to-end process of ingesting new data sources and building scalable pipelines using Airflow and dbt.
● Partner with Analysts: Work in close partnership with our data analysts to understand business requirements, define metrics, and build the specific, reliable data models they need for their dashboards and analyses in Superset.
● Deliver Pragmatic Solutions: Consistently make pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment.
● Operational Excellence: Own the day-to-day health of the data platform. This includes monitoring pipelines, debugging failures, and helping to establish and maintain data quality SLAs.
● Cross-functional Collaboration: Work with other product and engineering teams to understand source systems and ensure seamless data extraction.
● Refactor and Improve: Identify opportunities to improve and refactor existing ingestion and transformation logic for better performance and maintainability.
Required Skills & Experience
● 5+ years of dedicated experience as a Data Engineer.
● Expertise in Apache Airflow: Proven experience developing, deploying, and owning complex data pipelines. You can work independently to build and debug intricate DAGs.
● Deep Expertise in dbt: Strong proficiency in building modular, testable, and maintainable data models with dbt.
● Pragmatic Problem-Solving: A demonstrated ability to choose the right solution for the problem at hand, avoiding over-engineering while ensuring robustness.
● Business Acumen: Experience translating ambiguous business or product requirements into concrete technical data solutions. You are comfortable asking "why" to understand the core business driver.
● Expert-level SQL and Strong Python: Essential for all aspects of the role.
● Data Warehousing Fundamentals: Solid understanding of dimensional modeling and ETL/ELT best practices.
Preferred Skills (Nice-to-Haves)
● AWS Experience: Familiarity with core AWS services used in a data context (Aurora RDS, S3, IAM).
● Experience in a Startup Environment: Comfortable with ambiguity and a fast-paced setting.
● BI Tool Support: Experience working closely with users of BI tools like Superset, Metabase, or Tableau.
Due to the high volume of applications we receive, we are unable to respond to every candidate individually. If you have not received a response from GFT regarding your application within 10 working days, please consider that we have decided to proceed with other candidates. We truly appreciate your interest in GFT and thank you for your understanding.