Job
- Level
- Senior
- Job Field
- Data
- Employment Type
- Full Time
- Contract Type
- Permanent employment
- Location
- Vienna
- Working Model
- Hybrid, Onsite
Your role in the team
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a variety of data sources using AWS technologies
- Implement data pipelines and data integration solutions in Python and Java
- Ensure the accuracy and timeliness of data in the data layer, working with stakeholders to align solutions and data integration with business objectives
- Clean and wrangle data into a usable state, ready for the Business Intelligence and Data Warehouse team
- Debug, troubleshoot, and improve cloud-based applications, and suggest improvements to the current data architecture
Our expectations of you
Qualifications
- Demonstrated ability to understand requirements and deliver accurate results according to scope
- Proficient in SQL, and at least one of the following languages: Python, Java, Scala
Experience
- 5+ years of related work experience, e.g., in data integration and data-gathering methodologies and APIs, along with an understanding of ETL/ELT infrastructures and cloud architectures
- Experience with AWS services, Service-Oriented Architecture, and Apache technologies such as Spark, plus knowledge of version control systems such as Git
- Experience working with a variety of APIs (e.g., REST API or SOAP API)