System Engineer for Big Data & Data Science Use Cases (m/f/d)

UNIQA Insurance Group is one of the leading insurance groups in our two core markets: Austria and Central & Eastern Europe. This requires fresh thinking - an attitude that is dynamic and solution-oriented. Do you think that's you?

The UNIQA Group is optimizing and strengthening its IT structure. UNIQA IT Services GmbH has bundled all IT units of the UNIQA Group, in Austria and abroad, into a single entity. UNIQA IT Services (UITS) is the internal IT service provider for all companies within the UNIQA Group.

We are looking for a

System Engineer for Big Data & Data Science Use Cases (m/f/d)

Your main responsibilities

  • Drives Data Operations and System Engineering tasks
  • Designs solutions based on the business demand and project requirements for Enterprise Data Use Cases
  • Develops Infrastructure-as-Code (IaC) artifacts, CI/CD & automation solutions end-to-end, from source (e.g. legacy / mainframe / near real-time) to target (files / interfaces / data marts), by integrating ETL pipelines (Talend Data Fabric, NiFi, et al.)
  • Embraces the DataOps / DevOps methodology and improves our routines and procedures
  • Determines requirements for operating and running systems, understands the needs of Data Operations and Cloud Providers, and recognizes the importance of operational-level requirements (e.g. NFRs, Capacity Management, ITIL processes, Disaster Recovery procedures)
  • Operates & monitors Data Processing systems (consisting mainly of the Cloudera & Azure tool stack)
  • Works in a highly motivated interdisciplinary team with different Business stakeholders, Architects, Data Engineers, and Data Scientists from both Business and IT

Your qualification

  • High proficiency in operations skills (Linux) for middleware technology stacks such as Cloudera Data Platform / Hortonworks and enterprise application servers (WebSphere / JBoss / Geronimo, etc.)
  • High proficiency in scripting (UNIX shell, Perl, PHP, Python, or similar), DevOps tools (Git, Jenkins, Docker, OpenShift / Kubernetes), and development of IaC (Jenkins, Ansible / Kubernetes / Terraform)
  • At least 3 years of experience operating applications using cloud infrastructure provider APIs & management consoles (e.g. Azure / AWS / Google)
  • Experience in software engineering, including a minimum of 2 years of Java development in enterprise-grade environments
  • Know-how in networking & IT security (networks and network segments, firewalls, PKIs and certificates, Kerberos, load balancing, etc.) as well as web service interfaces (REST, SOAP)
  • Excellent oral and written communication skills in German and English (Slovak, Bulgarian, or Hungarian is a plus)
  • Excellent team player, able to work in a problem-solving agile environment
  • Experience with BI & Analytics platforms and reporting preferred

Your core competencies

  • Perseverance: single-mindedly and persistently pursuing your assignments
  • Drive for Results: achieving your goals dependably and on time
  • Action Oriented: being full of energy and seizing opportunities
  • Organizing: using resources effectively and efficiently
  • Learning on the Fly: learning quickly and having a focus on solutions

Annual minimum salary according to the collective agreement: EUR 43,946 gross. We are prepared to exceed this depending on qualifications and experience.
We look forward to receiving your application!

Apply now!

Sandra Taschler
HR Business Partner
Untere Donaustraße 21
A - 1029 Wien
Tel.: +43 664 888 390 17
