Data Architect
Kraków, PL 30-302; Warszawa, PL 00-839; Poznań, PL 61-569; Łódź, PL 90-118
What will you do?
As a Data Architect you will design and implement state-of-the-art data processing systems for some of the biggest and most technologically advanced companies in the financial sector. Often working directly with stakeholders, up to and including C-level client representatives, our architects are experts in top-level system design and project scoping.
Your tasks
- Design and maintain conceptual, logical, and physical data models aligned with business needs, using Data Vault 2.0
- Drive data governance processes and documentation
- Lead data model and platform innovations and improvements
- Provide architectural guidance on SQL development, data quality, and metadata management
- Lead the development of analytical and data warehouse systems, ensuring data integrity and performance
- Optimize queries and storage for large-scale datasets, collaborating with engineering and operations teams
Your skills
- Proven experience as a Data Architect, Lead Data Engineer, or in a similar role
- Experience in designing scalable and flexible data models using the Data Vault methodology or another approach (e.g. Kimball, Inmon, Anchor)
- Expertise in relational and analytical database design principles
- Proficiency in dimensional modelling and data normalization techniques
- Knowledge of performance optimization for large-scale datasets
- Familiarity with enterprise-scale data warehouse environments
- Proficiency in SQL and strong understanding of performance optimization techniques
- Understanding of data lineage and auditability
- Excellent communication and stakeholder engagement skills
Nice to have
- Experience in BigQuery or similar Data Warehouse engines, including SQL querying, table partitioning, and clustering
- Experience with GCP-native services such as Cloud Storage and Cloud SQL
- Knowledge of GCP IAM roles and security best practices
- Familiarity with Google Cloud Data Fusion for data integration pipelines
- Experience with Apache Airflow for building and managing ETL workflows
- Proficiency in building efficient and reusable ELT/ETL pipelines
We offer you
- Remote or hybrid work (2 office days per week)
- Working in a highly experienced and dedicated team
- Competitive salary and extra benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
- Contract of employment or B2B contract
- Online training and certifications suited to your career path
- Free online foreign language lessons
- Regular social events
- Access to e-learning platform