Job
- Level: Senior
- Job Field: BI, Data
- Employment Type: Full Time
- Contract Type: Permanent employment
- Salary: from €53,802 gross/year
- Location: Graz, Vienna, Pörtschach am Wörther See
- Working Model: Hybrid, Onsite
Job Summary
In this role, you will build scalable ETL/ELT processes on Google Cloud, optimize BigQuery queries, and implement data architectures and real-time processing using Dataflow and Pub/Sub.
Your role in the team
As a Data Engineer, you’ll play a key role in building modern, scalable, and high-performance data solutions on Google Cloud Platform (GCP). You’ll be part of our growing Data & AI team, designing and implementing data architectures that help clients unlock the full potential of their data.
Your job’s key responsibilities are:
- Building efficient and scalable ETL/ELT processes to ingest, transform, and load data from various structured and unstructured sources (databases, APIs, streaming platforms) into BigQuery and Cloud Storage
- Implementing data ingestion and real-time processing using Dataflow (Apache Beam) and Pub/Sub for batch and streaming workflows (see the pipeline sketch after this list)
- Developing SQL transformation workflows with Dataform, including version control, testing, and automated scheduling with built-in quality assertions
- Creating efficient, cost-optimized BigQuery queries with proper partitioning, clustering, and denormalization strategies
- Orchestrating complex workflows using Cloud Composer (Apache Airflow) and Cloud Functions for event-driven data processing
- Implementing centralized data governance and metadata management using Dataplex with automated cataloging and lineage tracking
- Monitoring and optimizing data pipelines for performance, scalability, and cost using Cloud Monitoring and Cloud Logging
- Collaborating with data scientists and analysts to understand data requirements and deliver actionable insights
- Staying up to date with GCP advancements in data services, BigQuery features, and data engineering best practices
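To give a feel for the streaming ingestion described above, here is a minimal Apache Beam (Python) sketch that reads JSON events from a Pub/Sub topic and appends them to a BigQuery table. It is an illustration rather than a reference implementation: the project, topic, and table names are placeholders, and the incoming JSON is assumed to already match the target table schema.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming pipeline: Pub/Sub -> parse JSON -> append to BigQuery.
    # All resource names below are placeholders for illustration.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```

Run locally, this uses the DirectRunner; the same code is deployed to Dataflow by launching it with the DataflowRunner and the usual project and region options.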
Our expectations of you
Qualifications
Essential Skills:
- 3+ years of hands-on experience as a Data Engineer with proven expertise in Google Cloud Platform (GCP)
- Strong experience with BigQuery (SQL, partitioning, clustering, optimization) and Dataflow (Apache Beam) (see the table-definition sketch after this list)
- Strong programming skills in Python with experience in data manipulation libraries (PySpark, pandas)
- Expert-level SQL proficiency for complex transformations, optimization, and analysis
- Proficiency with Dataform for modular SQL-based data transformations and data pipeline management
- Solid understanding of data warehousing principles, ETL/ELT processes, dimensional modeling, and data governance
- Experience integrating data from various APIs and streaming systems (Pub/Sub)
- Cloud Composer experience for workflow orchestration
- Excellent communication and collaboration skills in English (min. B2 level)
- Ability to work independently and as part of an agile team
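To make the partitioning and clustering point concrete, here is a minimal sketch that uses the google-cloud-bigquery Python client to create a date-partitioned, clustered table. The project, dataset, table, and column names are invented for the example.

```python
from google.cloud import bigquery

# Placeholder project/dataset/table and columns, for illustration only.
client = bigquery.Client(project="example-project")

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.analytics.orders` (
  order_id    STRING,
  customer_id STRING,
  order_ts    TIMESTAMP,
  amount      NUMERIC
)
PARTITION BY DATE(order_ts)   -- queries filtered on order date scan only matching partitions
CLUSTER BY customer_id        -- co-locate rows by customer for selective filters
"""

client.query(ddl).result()  # submit the DDL job and wait for it to finish
```

Queries that filter on DATE(order_ts), and ideally also on customer_id, then read only the relevant partitions and blocks, which is where most of the BigQuery cost savings come from.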
Beneficial Skills:
- Google Professional Data Engineer certification
- Knowledge of BigLake for unified access and management of structured and unstructured data
- Experience with Dataplex for managing metadata, lineage, and data governance
- Familiarity with Infrastructure-as-Code (Terraform) for automating GCP resource provisioning and CI/CD pipelines
- Experience with data visualization tools such as Looker, Looker Studio, or Power BI
- Interest in or experience with machine learning workflows using Vertex AI or similar platforms
Benefits
More Net Pay
- 🚙Poolcar
- 💻Company Notebook for Private Use
- 🛍Employee Discount
- 🎁Employee Gifts
- 📱Company Phone for Private Use
- 🚎Public Transport Allowance
Health, Fitness & Fun
- 👨🏻‍🎓Mentor Program
- ⚽️Tabletop Soccer, etc.
- 🧠Mental Health Care
- 👩‍⚕️Company Doctor
- 🎳Team Events
- 🚲Bicycle Parking Space
- 🙂Health Care Benefits
Work-Life-Integration
- 🕺No Dresscode
- 🧳Relocation Support
- 🅿️Employee Parking Space
- 🐕Animals Welcome
- 🏠Home Office
- ⏰Flexible Working Hours
- ⏸Educational Leave/Sabbatical
- 🚌Excellent Traffic Connections
This is your employer
NETCONOMY
Graz, Madrid, Belgrade, Novi Sad, Pörtschach, Zürich, Vienna, Dortmund, Amsterdam, Berlin
NETCONOMY is a leader in designing digital platforms and customer experience innovations, helping companies to identify and successfully capitalize on digital potential. Our flexible, scalable solutions are based on the latest technologies from SAP, Google Cloud, and Microsoft Azure. With over 20 years of experience and nearly 500 experts across Europe, we help our clients increase their innovation power and expand their core businesses into the digital world.
Description
- Company Size: 250+ Employees
- Founding Year: 2000
- Language: English
- Company Type: Digital Agency
- Working Model: Hybrid
- Industry: Internet, IT, Telecommunication
Dev Reviews
by devworkplaces.com
Total (2 Reviews)
- Engineering: 3.8
- Career Growth: 3.7
- Culture: 4.5
- Working Conditions: 4.2