Data Engineer for Data Integration and Big Data (m/f/d)


UNIQA Insurance Group is one of the leading insurance groups in our two core markets: Austria and Central & Eastern Europe. This requires fresh thinking - an attitude that is dynamic and solution-oriented. Do you think that's you?

UNIQA Group is optimizing and strengthening its IT structure. UNIQA IT Services GmbH has bundled all IT units of the UNIQA Group, both in Austria and abroad, into a single entity. UNIQA IT Services (UITS) is the internal IT service provider for all companies within the UNIQA Group.

We are looking for a

Data Engineer for Data Integration and Big Data (m/f/d)

Your main responsibilities

  • Designs, develops, maintains and tests Data Ingestion solutions "bottom-up" from source (e.g. legacy / mainframe / near real time) to target (files / interfaces / Data Marts) using Talend Data Fabric and NiFi
  • Implements Change Data Capture pipelines that enable UNIQA to use real-time data feeds (e.g. Kafka)
  • Supports the team in building up large-scale data processing systems and complex big data projects
  • Builds data processing systems in Hadoop, Spark and Hive (Cloudera Data Platform)
  • Understands how to apply technologies to solve big data problems and contributes to the design of Enterprise Data Use Cases
  • Focuses on collecting, parsing, managing, analyzing and visualizing large sets of data coming from heterogeneous domains
  • Contributes to Data Governance in terms of enabling Data Lineage, Data Cataloging and Data Modeling
  • Works in a highly motivated interdisciplinary team with different Business stakeholders, Architects, Data Engineers, and Data Scientists from both Business and IT

Your qualification

  • A minimum of 3-5 years of Java development in enterprise-grade environments as a must
  • Focus on cutting-edge Data Integration & ETL tools (e.g. Talend Data Fabric, NiFi), as well as Data Replication and Message Broker tools (e.g. Kafka), in Big Data ecosystems such as Hadoop/Spark/Hive/HBase/Impala/Kudu (Cloudera / Hortonworks) as a must
  • FinTech / Insurance know-how as a plus
  • Knowledge of programming and scripting languages such as SQL (ANSI and dialects) and Bash
  • Experience with the Software Development Lifecycle and with releasing applications via Git workflows and automation
  • Excellent oral and written communication skills in German and English
  • Knowledge of tools for Data Governance and Data Cataloging (e.g. Apache Atlas / Cloudera Navigator) as a plus

Your core competencies

  • Perseverance: single-mindedly and persistently pursuing your assignments
  • Drive for Results: achieving your goals dependably and on time
  • Action Oriented: being full of energy and seizing opportunities
  • Organizing: using resources effectively and efficiently
  • Learning on the Fly: learning quickly and having a focus on solutions

Annual minimum salary according to the collective agreement: EUR 60,000 gross. We are prepared to pay more depending on qualifications and experience.
We are looking forward to receiving your application!

Apply now!

Sandra Taschler
HR Business Partner
Untere Donaustraße 21
A - 1029 Wien
Tel.: +43 664 888 390 17
