Senior Data Engineer (Core Data Platform)
Ho Chi Minh City, VN, 700000 | Hanoi, VN, 10000
What do we do?
GFT Technologies is driving the digital transformation of the world’s leading financial institutions. Other sectors, such as industry and insurance, also leverage GFT’s strong consulting and implementation skills across all aspects of pioneering technologies, such as cloud engineering, artificial intelligence, the Internet of Things for Industry 4.0, and blockchain.
With its in-depth technological expertise, strong partnerships and scalable IT solutions, GFT increases productivity in software development. This provides clients with faster access to new IT applications and innovative business models, while also reducing risk.
We’ve been a pioneer of near-shore delivery since 2001 and now field an international team spanning 16 countries, with a global workforce of over 9,000 people. GFT is recognised by industry analysts, such as Everest Group, as a leader amongst global mid-sized Service Integrators, and is ranked in the Top 20 leading global Service Integrators across many exponential technologies, such as Open Banking, Blockchain, Digital Banking, and Apps Services.
Sign-on Bonus: Eligible for candidates who are currently employed elsewhere and able to join GFT within 30 days of offer acceptance.
Role Summary:
We are seeking an expert, autonomous Senior Data Engineer (contract) to accelerate the evolution of our core data platform. Your primary mission will be to solve our most critical infrastructure bottleneck: the commingling of production and analytical workloads, which creates performance risks and slows our ability to generate insights.
You will be a key architect on a new initiative, channeling your deep expertise in data warehousing, event-driven systems, and database operations into the foundational systems that will unlock data for the entire company. This role is for a hands-on engineer who is a master of their craft; you will be fully empowered to own the implementation and success of this project.
Key Activities:
- Engineer for Enterprise-Grade Governance: Architect and implement the technical backbone for our compliance initiatives (SOC 2, GDPR). You will build the systems behind our future data catalog, manage data access, and streamline the high-friction processes that currently hinder legitimate analysis.
- Modernize our Data Ingestion Layer: Design and build robust, scalable pipelines for both batch and event-driven data, setting the stage for future real-time analytics and ML feature capabilities.
- Establish Data Quality as a Core Practice: Implement the foundational patterns for data quality monitoring. You will build automated freshness and integrity checks for our most critical data assets, establishing a new level of trust and reliability in our data.
- Architect and Build the Next-Generation Data Warehouse: Own, end to end, the design and execution of migrating our analytical workloads from a shared production cluster to a modern, scalable cloud data warehouse (e.g., Snowflake).
Required Skills:
- 7+ years of dedicated data engineering experience, with a strong focus on building and operating core data platforms.
- Expertise in Modern Data Warehousing: You have a proven track record of architecting, building, and operating robust, scalable data warehouses like Snowflake, BigQuery, or Redshift.
- Experience with Event-Driven Architectures: You have hands-on experience with real-time data processing technologies and patterns (e.g., Kafka, Kinesis, Flink, Spark Streaming).
- Deep Knowledge of Database Operations: You have a strong understanding of database performance tuning, monitoring, disaster recovery, and the operational trade-offs of large-scale data systems.
- Data Governance & Compliance Experience: You have hands-on experience building technical solutions to meet compliance requirements like SOC 2 or GDPR, including data access controls and cataloging.
- Pragmatic Problem-Solving: A demonstrated ability to choose the right solution for the problem at hand, avoiding over-engineering while ensuring robustness in a startup environment.
- Proficiency in SQL and Python.
Nice-to-have skills:
- AWS Experience: Familiarity with core AWS services used in a platform context (RDS, S3, IAM, Kinesis, etc.).
- dbt (Data Build Tool) Experience: Hands-on experience using dbt for production data transformations is a strong plus.
- Infrastructure as Code Experience: Familiarity with tools like Terraform for managing data infrastructure.
- Experience in a Startup Environment: Comfortable with ambiguity and a fast-paced setting.
What can we offer you?
- Competitive salary
- 13th-month salary guarantee
- Performance bonus
- Professional English course for employees
- Premium health insurance
- Extensive annual leave
Due to the high volume of applications we receive, we are unable to respond to every candidate individually. If you have not received a response from GFT regarding your application within 10 workdays, please consider that we have decided to proceed with other candidates. We truly appreciate your interest in GFT and thank you for your understanding.