Job
- Level
- Senior
- Job Field
- Data, Back End
- Employment Type
- Full Time
- Contract Type
- Permanent employment
- Salary
- from €67,000 gross/year
- Location
- Vienna
- Working Model
- Hybrid, Onsite
Job Summary
In this role, you will develop complex data pipelines using Databricks or Snowflake, integrate data from various sources, and design scalable data models to create powerful real-time data solutions.
Your role in the team
As a Data Engineer at Capgemini, you work at the forefront of digital transformation: processing real-time data in Databricks or building a new architecture with Snowflake.
You use your skills to tame complex data streams and shape them into powerful solutions.
In this position, you will play an important role in:
- Connecting data that moves: You integrate data from APIs, batch processes, and real-time data streams, ensuring smooth processing.
- Creating lasting structures: You design and implement data models that meet both technical and business requirements.
- Building scalable pipelines: You develop robust data pipelines that efficiently process large volumes of data and are continuously improved.
- Building architecture: With Snowflake, you design a modern data architecture from the ground up.
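The pipeline duties above can be sketched, in deliberately minimal form, as a three-stage extract/transform/load flow. This is only an illustrative outline: the source and the warehouse table here are in-memory stand-ins, not actual Databricks or Snowflake APIs.

```python
# Minimal ETL sketch. The source and target are hypothetical in-memory
# stand-ins for an API endpoint and a warehouse table.

def extract():
    # Stand-in for an API or batch source: returns raw event records.
    return [
        {"user_id": 1, "amount": "19.90", "currency": "EUR"},
        {"user_id": 2, "amount": "5.00", "currency": "EUR"},
        {"user_id": 1, "amount": "7.50", "currency": "EUR"},
    ]

def transform(records):
    # Normalize string amounts to floats and aggregate totals per user.
    totals = {}
    for rec in records:
        totals[rec["user_id"]] = totals.get(rec["user_id"], 0.0) + float(rec["amount"])
    return [
        {"user_id": uid, "total_eur": round(total, 2)}
        for uid, total in sorted(totals.items())
    ]

def load(rows, table):
    # Stand-in for writing to a warehouse table (e.g. Databricks or Snowflake).
    table.extend(rows)

warehouse_table = []
load(transform(extract()), warehouse_table)
print(warehouse_table)
# → [{'user_id': 1, 'total_eur': 27.4}, {'user_id': 2, 'total_eur': 5.0}]
```

In a production setting, each stage would be replaced by the platform's own primitives (connectors, SQL transformations, managed tables) and orchestrated with CI/CD, but the stage boundaries stay the same.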
Our expectations of you
Qualifications
- You have expertise in Databricks or Snowflake, ideally with certifications in the respective tool.
- You possess very good knowledge of SQL, Python, automation (CI/CD), and DevOps, and you understand downstream processes (BI, Data Science, AI) and their requirements.
- You have a very good understanding of data modeling (Kimball, Data Vault) and know how to effectively represent complex data structures.
- You are a true team player and communicate clearly and convincingly, in both German and English at C1 level.
Experience
- You have at least 3 years of experience in successfully implementing data projects.
