About working in our team
The world is changing, becoming digital, and so are we. We are leaving the traditional bank behind us and are choosing to move forward as a digital enterprise. This is exactly why we need talented people who will join us on this journey.
If you are that type of person, we have a role waiting for you as a Big Data Operations Engineer in the Sales Analytics Cluster.
The Sales Analytics cluster, as part of the BDAA key area, serves the corporate customer segment as well as private and business customers. We combine and consolidate the bank's data in our Data Lake to identify customer needs and transform them into sales opportunities using customized analytical models.
As a Big Data Operations Engineer, you will be responsible for the administration, operation, and deployment of applications on big data platforms. You will take a role in an innovation-driven team that builds innovative products using state-of-the-art technology such as the Commerzbank Data Lake and Google Cloud.

Benefits
- Online recruitment
- Multicultural environment
- Trainings and German courses from the very beginning
- Health insurance
- Pension Scheme
Main tasks
- Working in cross-functional cells and being deeply involved in the development process from the beginning, in order to support the cell in implementing operational requirements such as monitoring, traceability, and resilience
- Daily monitoring of production processes and incidents, taking the necessary measures, and coordinating problem solving within the cell and beyond, across the cluster and the key area
- Third-level support
- Maintaining, deploying, and configuring Big Data Sales Analytics use cases on Google Cloud and database environments: Hive, Docker Swarm, Kubernetes, Tomcat / Jetty (depending on the use case)
- Responsibility for technical users and all configuration items related to your product in prelive and production
- Installation and configuration of the dedicated applications in prelive and production and rollout of increments using our continuous integration and continuous deployment environment
Expectations
- Linux/Unix and shell scripting - good knowledge
- SQL / Databases - good knowledge
- Scheduling tools (e.g. cron, UC4, or similar)
- Experience with cloud solutions such as GCP – Google Cloud Platform (preferred), AWS, or Azure, or with Kubernetes
- Experience with application containers such as Docker, Docker Compose, and Docker Swarm
- Experience with application servers, e.g. Tomcat, Jetty
- Readiness for business travel to Germany (a few trips within the first months for training)
Nice to have:
- Experience with Hadoop ecosystem or willingness to learn
- Cloudera or Hortonworks Data Platform (HDP)
- Experience with monitoring tools such as Grafana, Nagios, Zabbix, Prometheus
- CI/CD tools, e.g. Jenkins, TeamCity
- Version control tools, e.g. Git (preferred), SVN
- Jira, Bitbucket, Confluence
- Knowledge of ITIL processes (a certificate is an advantage but not required)
- Data quality topics - practical experience
- Python, PySpark
- Spark/Beam
Foreign language skills:
- English B2 level mandatory
- German optional
Please add the following disclaimer to your application:
I have read the content of the information clause, including the information about the purpose and methods of processing personal data, the right to access my personal data, and the right to correct, rectify, and delete it.