Expert for Data Engineering in Data Integration and Big Data (f/m)

To support us in driving our ambitious initiative of building up and establishing UNIQA's Enterprise Data Platform, UNIQA IT Services GmbH, the IT provider for the UNIQA Group, is hiring an

Expert for Data Engineering in Data Integration and Big Data (f/m)

Your main responsibilities

  • Takes responsibility for Data Integration pipelines
  • Designs solutions for Enterprise Data Use Cases based on business demand and project requirements
  • Develops, maintains and tests Data Ingestion solutions "bottom-up" from source (e.g. legacy / mainframe / near real-time) to target (files / interfaces / Data Marts); see the sketch after this list for an illustration
  • Implements and integrates ETL workflows (e.g. Talend Data Fabric, NiFi)
  • Implements Change Data Capture pipelines that enable UNIQA to use real-time data feeds
  • Supports the team in building up large-scale data processing systems and complex big data projects
  • Builds data processing systems in Hadoop, Spark and Hive (Cloudera Data Platform)
  • Understands how to apply technologies to solve big data problems and contributes to the design of Enterprise Data Use Cases
  • Contributes to Data Governance in terms of enabling Data Lineage, Data Cataloging and Data Modeling
  • Focuses on collecting, parsing, managing, analyzing and visualizing large sets of data coming from heterogeneous domains
  • Works in a highly motivated interdisciplinary team with different Business stakeholders, Architects, Data Engineers, and Data Scientists from both Business and IT
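
As an illustration of the data ingestion work described above, the following is a minimal sketch in PySpark (Python and Spark are both named elsewhere in this posting). It reads raw extracts from a hypothetical HDFS landing path and appends them to a hypothetical Hive table on a Cloudera-style cluster; every path, table and column name here is an illustrative placeholder, not a UNIQA system.

```python
# Minimal PySpark sketch: batch-ingest raw source extracts from an HDFS
# landing zone into a partitioned Hive table. All names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("ingest-policies")        # hypothetical job name
    .enableHiveSupport()               # allow writing to Hive tables
    .getOrCreate()
)

# Read semi-structured source extracts (e.g. CSV exported from a legacy system).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/data/landing/policies/")    # hypothetical landing path
)

# Light standardisation before loading the target: trim the key, stamp a load date.
curated = (
    raw.withColumn("policy_id", F.trim(F.col("policy_id")))   # hypothetical key column
       .withColumn("load_date", F.current_date())
)

# Append into a partitioned Hive table that downstream Data Marts can consume.
(
    curated.write
    .mode("append")
    .partitionBy("load_date")
    .saveAsTable("staging.policies")   # hypothetical target table
)
```

In practice, a step like this would typically be orchestrated by an ETL tool such as Talend Data Fabric or NiFi rather than run on its own.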

Your qualifications

  • Proficiency in cutting-edge Data Integration & ETL tools (e.g. Talend Data Fabric, NiFi) as well as Data Replication tools is a must
  • Experience with Big Data Hadoop/Spark/Hive/HBase ecosystems such as Cloudera / Hortonworks is a must
  • Experience with BI & Analytics and Reporting platforms (e.g. SAS Analytics Platform / Viya, Information Builders WebFOCUS, Power BI) is preferred
  • Knowledge of tools for Data Governance and Data Cataloging (e.g. Apache Atlas / Cloudera Navigator) is preferred
  • Knowledge of programming or scripting languages such as SQL, Java, Python, R or Ruby
  • Good oral and written communication skills in German & English (Slovak, Bulgarian or Hungarian a plus)
  • Good team player, able to work in a problem-solving agile environment

Your core competencies

  • Problem Solving: methodically analyzing and solving issues
  • Peer Relationships: being a good team player
  • Action Oriented: being full of energy and seizing opportunities
  • Drive for Results: achieving your goals dependably and on time

Annual minimum gross salary according to the collective agreement: EUR 42,952. We are prepared to exceed this depending on qualifications and experience.
We look forward to receiving your application!

Apply now!

Mag. Wolfgang Küchl
HR Business Partner
Untere Donaustraße 21
A - 1029 Wien
Tel.: 0043 1 21175 - 3613
