Job
- Level
- Senior
- Job Field
- BI, Data
- Employment Type
- Full Time
- Contract Type
- Permanent employment
- Location
- Vienna
- Working Model
- Full Remote, Hybrid
AI Summary
In this role, you will build robust data pipelines and manage the integration of various data sources. Your main task is to optimize the data infrastructure to meet the demands of our AI products.
Job Technologies
Your role in the team
- Are you a seasoned Data professional, passionate about building and maintaining robust data pipelines and infrastructure?
- Are you ready to join us in our pivot towards an AI future?
- In this role, you will be instrumental in ensuring smooth data flow, enabling crucial data-driven insights, and supporting the growing demands of our latest AI product.
- You will collaborate closely with the Data Engineering team, Data Analytics, Product teams, and other Engineering teams to deliver high-quality data solutions.
- If you love solving complex data challenges with tech and thrive in fast-paced, cross-functional environments, this role is for you!
Our expectations of you
Qualifications
- Python Pro. You have solid expertise with Python and follow industry best practices.
- Integration Expert. You will create new integrations between diverse data sources and our data warehouse.
- Cost Conscious. You are adept at keeping data infrastructure costs under control and never lose sight of our spending.
- Proactive Problem Solver. You proactively identify and address areas for improvement in our data infrastructure.
- Strategic Contributor. You will contribute to strategic decisions regarding data infrastructure and architecture, making sure our setup is efficient and future-proof.
Experience
- 5+ Years' Experience. You've worked in the data space as an Engineer and have a proven track record of building complex data pipelines in AI-driven environments.
- Technical Acumen. You are proficient in SQL and have experience with data warehousing tools like BigQuery.
- Data Stack. You have hands-on experience with tools like Airflow and dbt, which are essential for our data operations.
- Data Pipeline Ownership. You will maintain and develop data pipelines, ensuring their reliability and efficiency. Less experienced team members can count on your guidance and mentorship.
- Cloud Credentials. You know your way around platforms (ideally GCP) and have worked with PubSub before. Experience with IaC and Kubernetes is a plus!
Benefits
Health, Fitness & Fun
Work-Life-Integration
Job Locations
Topics that you deal with on the job
This is your employer

MeisterLabs Software GmbH
Vienna
Meister has developed smart and intuitive web apps that help teams of all sizes and industries turn ideas into reality. Our flagship products MindMeister and MeisterTask support a complete creative workflow - from collaborative brainstorming to agile task management, making it easy for teams to get things done efficiently.
Description
- Company Size
- 1-49 Employees
- Founding year
- 2006
- Language
- English
- Company Type
- Startup
- Working Model
- Full Remote, Hybrid, Onsite
- Industry
- Internet, IT, Telecommunication
Dev Reviews
by devworkplaces.com
- Total
- 4.4 (2 Reviews)
- Engineering
- 4.4
- Working Conditions
- 4.6
- Career Growth
- 4.1
- Culture
- 4.5