Data Engineer with GCP
Kraków, PL 30-302; Łódź, PL 90-118; Poznań, PL 61-725; Warszawa, PL 00-839
Your profile:
- Spearhead the design and development of modern data processing applications or Big Data-based solutions on public, private and hybrid clouds
- Stakeholder relationship management
- Understand complex data and aggregation requirements and translate them into working functionality
- Interface with SMEs, business analysts and other IT teams to understand the requirements and translate them into technical deliverables
- Think critically, connect the dots, and work both collaboratively and independently, while also being able to handle and resolve conflicts
Your responsibilities:
- Provide deep technical skills for modern data migrations to the cloud and cloud native implementations
- Be a “go-to” expert for data technologies and solutions
- Perform hands on design and implementation of complex hybrid and cloud solutions in high availability, high scale environments
- Provide on-the-ground troubleshooting and diagnosis for architecture and design challenges
- Develop prototypes and proofs of concept to support, test and validate design and delivery assumptions
- Advocate for data technologies within GFT, further enhance the data engineering capability in the UK, and influence the development of GFT’s global cloud delivery capability
- Communicate complex solutions in business terms to internal GFT and Client Stakeholders
- Mentor and coach less experienced engineers to develop and grow GFT’s talent pool, share best practices and establish common patterns and standards
- Provide industry thought leadership by writing blog posts and white papers and being active on social media
Your skills:
Knowledge of some or all of the following topics for the services listed below: use cases and ELT patterns, benefits and limitations, optimization, security, cost management, internal architecture, high availability (HA), orchestration, CI/CD, and API-specific functions (e.g. for Airflow & Apache Beam):
- BigQuery
- Dataproc for Spark
- Dataflow for Apache Beam
- Composer/Airflow
Nice to have:
- Cloud SQL for PostgreSQL
- Google Cloud operations suite for logging and monitoring
- GCS – Usage in conjunction with other services
- Low-code ETL: Data Fusion & Dataprep
- Using GKE to run industry-standard products, e.g. MongoDB & Kafka
We offer you:
- Working in a highly experienced and dedicated team
- Competitive salary and an extra benefits package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Permanent or B2B contract
- Online training and certifications matched to your career path
- Free online foreign language lessons
- Regular social events
- Access to e-learning platform
- Ergonomic and functional working space with 2 monitors (you can also borrow monitors for your home office)