Senior Data Engineer

Country/Region:  IN
Date:  20 Feb 2025
Location: Pune, IN, 411014

Work location: Remote (Pune preferred)

Work mode: Hybrid

About GFT 

GFT Technologies is driving the digital transformation of the world’s leading financial institutions. Other sectors, such as industry and insurance, also leverage GFT’s strong consulting and implementation skills across all aspects of pioneering technologies, such as cloud engineering, artificial intelligence, the Internet of Things for Industry 4.0, and blockchain.

With its in-depth technological expertise, strong partnerships and scalable IT solutions, GFT increases productivity in software development. This provides clients with faster access to new IT applications and innovative business models, while also reducing risk.

We’ve been a pioneer of near-shore delivery since 2001 and now offer an international team spanning 16 countries, with a global workforce of over 9,000 people. GFT is recognised by industry analysts such as Everest Group as a leader amongst global mid-sized Service Integrators, and is ranked in the Top 20 leading global Service Integrators in exponential technologies such as Open Banking, Blockchain, Digital Banking, and Apps Services.

Role Summary

As a Senior Data Engineer at GFT, you will play a pivotal role in designing, maintaining, and enhancing the analytical and operational services and infrastructure that are crucial to the organization's functions. You'll collaborate closely with cross-functional teams to ensure the seamless flow of data for critical decision-making processes.

Key Activities

  • Data Infrastructure Design and Maintenance: Architect, maintain, and enhance analytical and operational services and infrastructure, including data lakes, databases, data pipelines, and metadata repositories, to ensure accurate and timely delivery of actionable insights
  • Collaboration: Work closely with data science teams to design and implement data schemas and models, integrate new data sources with product teams, and collaborate with other data engineers to implement cutting-edge technologies in the data space
  • Data Processing: Develop and optimize large-scale batch and real-time data processing systems to support the organization's growth and improvement initiatives
  • Workflow Management: Utilize workflow scheduling and monitoring tools like Apache Airflow and AWS Batch to ensure efficient data processing and management (a brief example follows this list)
  • Quality Assurance: Implement robust testing strategies to ensure the reliability and usability of data processing systems
  • Continuous Improvement: Stay abreast of emerging technologies and best practices in data engineering, and propose and implement optimizations to enhance development efficiency
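
For illustration only, the sketch below shows the kind of scheduled batch pipeline the Workflow Management activity refers to: a minimal Apache Airflow DAG with two dependent tasks (assuming Airflow 2.4 or later). The DAG id, task ids, and the extract/aggregate callables are hypothetical placeholders, not part of any existing GFT pipeline.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_raw_events(**context):
    # Hypothetical extract step: pull one day of raw events from a source system.
    print(f"extracting raw events for {context['ds']}")


def build_daily_aggregates(**context):
    # Hypothetical transform step: build aggregate tables for analytics consumers.
    print(f"building aggregates for {context['ds']}")


with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older 2.x versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_raw_events",
        python_callable=extract_raw_events,
    )
    aggregate = PythonOperator(
        task_id="build_daily_aggregates",
        python_callable=build_daily_aggregates,
    )

    # The aggregation task runs only after the extract task succeeds.
    extract >> aggregate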

Required Skills

  • Technical Expertise: Proficient in Unix environments, distributed and cloud computing, Python frameworks (e.g., pandas, pyspark), version control systems (e.g., git), and workflow scheduling tools (e.g., Apache Airflow)
  • Database Proficiency: Experience with columnar and big data databases like Athena, Redshift, Vertica, and Hive/Hadoop
  • Cloud Services: Familiarity with AWS services such as Glue, EMR, EC2, S3, and Lambda, or with equivalent services on other cloud platforms
  • Containerization: Experience with container management and orchestration tools like Docker, ECS, and Kubernetes
  • CI/CD: Knowledge of CI/CD tools such as Jenkins, CircleCI, or AWS CodePipeline

Nice-to-have skills

  • Programming Languages: Familiarity with JVM languages like Java or Scala
  • Database Technologies: Experience with RDBMS (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., DynamoDB, Redis)
  • BI Tools: Exposure to enterprise BI tools like Tableau, Looker, or Power BI
  • Data Science Environments: Understanding of data science environments like AWS SageMaker or Databricks
  • Monitoring and Logging: Knowledge of log ingestion and monitoring tools like ELK stack or Datadog
  • Data Privacy and Security: Understanding of data privacy and security tools and concepts
  • Messaging Systems: Familiarity with distributed messaging and event streaming systems like Kafka or RabbitMQ

What we offer you

You will be working with some of the brightest people in business and technology on challenging and rewarding projects in a team of like-minded individuals. GFT prides itself on its international environment that promotes professional and cultural exchange and encourages further individual development.

About Us

We show commitment to our investors and stand for solid, long-term growth performance. Founded in Germany in 1987 and present in the Americas since 2008, GFT has expanded globally to over 10,000 experts across more than 15 markets, ensuring proximity to clients. With new opportunities from Asia to Brazil, the international growth story continues. We are committed to growing tech talent worldwide, because our team’s strong consulting and development skills across legacy and pioneering technologies, such as GreenCoding, underpin our success. We maintain a family atmosphere in an inclusive work environment.

At GFT, we have a strong passion for technology that thrives on collaboration with our clients and celebrates success with our team. We are an organisation that empowers you not only to explore but also to raise your potential and seek out opportunities that add value. Would you like to shape the future of digital business together with us? Apply now and join our team!
