
GCP Data Engineer

Date: 23-Feb-2021
City: London
Company: GFT Technologies SE

About GFT

GFT is driving the digital transformation of the world’s leading financial institutions. Other sectors, such as industry and insurance, also leverage GFT’s strong consulting and implementation skills across all aspects of pioneering technologies, such as cloud engineering, artificial intelligence, the Internet of Things for Industry 4.0, and blockchain.


With its in-depth technological expertise, strong partnerships and scalable IT solutions, GFT increases productivity in software development. This provides clients with faster access to new IT applications and innovative business models, while also reducing risk.


Role Summary

GFT is seeking a Data Engineer with demonstrable experience of Big Data technologies, including those available on the Google Cloud Platform. You should not only be technically proficient in these technologies but also be confident articulating their use and capabilities to our clients. This position is based in London, but international travel may be required. The potential and variety of the work that GFT is supporting our clients with is genuinely exciting.

  • Provide deep technical skills for modern data migrations to the cloud and cloud native implementations
  • Be a “go-to” expert for data technologies and solutions
  • Perform hands on design and implementation of complex hybrid and cloud solutions in high availability, high scale environments
  • Provide on-the-ground troubleshooting and diagnosis for architecture and design challenges
  • Develop prototypes and proofs of concept to support, test and validate design and delivery assumptions
  • Advocate for data technologies within GFT, further enhance the data engineering capability in the UK, and influence the development of GFT’s global Cloud delivery capability
  • Communicate complex solutions in business terms to internal GFT and Client Stakeholders
  • Mentor and coach less experienced engineers to develop and grow GFT’s talent pool, share best practice, and establish common patterns and standards
  • Provide industry thought leadership by writing blog posts and white papers and being active on social media

Role Responsibilities

  • Spearhead design and development of modern data processing applications or Big Data-based solutions on the Google Cloud Platform.
  • Stakeholder relationship management
  • Understand complex data and aggregation requirements and translate them into robust data pipelines
  • Interface with SMEs, business analysts and other IT teams to understand the requirements and translate them into technical deliverables
  • Work with complex systems in the analytical space (e.g. risk/PnL and complex data processing)


Competencies and Skills Needed

Mandatory Skills

  • Demonstrable hands-on experience designing, developing and implementing Big Data Solutions using the tools and products available in the Google Cloud Platform
  • Demonstrable experience of Agile processes and tooling: Jira, Confluence, Scrum, Kanban, Scaled Agile (SAFe), etc.
  • Experience of Continuous Integration and build tooling, such as: Git, Maven, Bazel/Blaze, Nexus, Artifactory, Jenkins, Octopus Deploy, TeamCity
  • Demonstrable experience implementing data pipelines in both batch and stream processing
  • Understanding of DataOps and ability to create use case agnostic configurable ETL / ELT pipelines
  • Understanding and experience of Object-Oriented Programming and the DRY/SOLID principles
  • Experience with at least one of the following programming languages: Python, Java, Scala
  • Understanding and experience of TDD/BDD 
  • Knowledge of cryptography and cybersecurity principles: encryption, RBAC, network security
  • Excellent data mapping/modelling skills – understanding data requirements
  • Data serialisation formats: JSON, YAML, Parquet, ORC, Protobuf, Avro
  • Significant demonstrable expertise and experience of application architecture using GCP, including:
  • Compute and storage: GCE, App Engine, GCS, CloudSQL
  • Containers and orchestration: Kubernetes, Helm, Docker, GKE, Istio, OpenShift, Rancher
  • Data processing and messaging: DataProc, DataFlow/Apache Beam, BigQuery/BigTable, Snowflake, Pub/Sub, Confluent/Kafka
  • Analytics and ML: TensorFlow, Data Lab, Data Studio, Kyligence/Kylin, Looker
  • DevOps and IaC: GCP DevOps tooling or Jenkins, TeamCity; IaC technology, e.g. Terraform
  • Scripting: Python, Linux/Bash, SQL
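By way of illustration only (not part of the role requirements), the "use case agnostic configurable ETL / ELT pipelines" point above can be sketched in plain Python: stages are registered by name and a configuration list, rather than the code itself, decides which transforms run and in what order. All stage and field names here are hypothetical.

```python
# Hypothetical sketch of a configuration-driven ETL pipeline:
# each stage is a plain function registered under a config-friendly
# name, so the same engine serves many use cases.
from typing import Any, Callable, Dict, List

Rows = List[Dict[str, Any]]
STAGES: Dict[str, Callable[[Rows], Rows]] = {}

def stage(name: str):
    """Register a pipeline stage under a name usable in config files."""
    def register(fn: Callable[[Rows], Rows]) -> Callable[[Rows], Rows]:
        STAGES[name] = fn
        return fn
    return register

@stage("drop_nulls")
def drop_nulls(rows: Rows) -> Rows:
    # Transform: discard records missing a 'value' field.
    return [r for r in rows if r.get("value") is not None]

@stage("to_pence")
def to_pence(rows: Rows) -> Rows:
    # Transform: convert a decimal GBP amount to integer pence.
    return [{**r, "value": int(round(r["value"] * 100))} for r in rows]

def run_pipeline(rows: Rows, config: List[str]) -> Rows:
    """Apply the configured stages in order; the stage list defines
    the use case, not the engine code."""
    for name in config:
        rows = STAGES[name](rows)
    return rows

records = [{"id": 1, "value": 12.5}, {"id": 2, "value": None}]
result = run_pipeline(records, ["drop_nulls", "to_pence"])
# result == [{"id": 1, "value": 1250}]
```

In a production setting the same pattern would typically be expressed as Apache Beam PTransforms running on Dataflow, with the stage list supplied via pipeline options or a config file.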

Behavioural Skills

  • Ability to assist program and project managers in the design, planning, and governance of solutions
  • A strong communicator, able to interface with application teams and add value; excellent oral and written communication skills to meet the high standards of the consulting business
  • Self-motivated and self-driven, able to work autonomously as well as within a team, including the ability to multitask
  • Ability to take ownership of deliverables

What we offer you:
You will be working with some of the brightest people in business and technology on challenging and rewarding projects in a team of like-minded individuals. GFT prides itself on its international environment that promotes professional and cultural exchange and encourages further individual development.

Founded in 1987 and located in 13 countries to ensure close proximity to its clients, GFT employs over 5,500 people. GFT provides them with career opportunities in all areas of software engineering and innovation. The GFT Technologies SE share is listed in the Prime Standard segment of the Frankfurt Stock Exchange.

Apply now »