About working in our team
The world is changing, becoming digital, and so are we. We are leaving the traditional bank behind us and are choosing to move forward as a digital enterprise. This is exactly why we need talented people who will join us on this journey.
If that sounds like you, we have a role waiting for you: Big Data Platform Engineer in the Big Data cluster.
Description of the cluster
You would work in the Big Data cluster, taking care of the needs of BDAA clients who want access to the Data Lake Platform built on Hadoop. We work in a cluster structure that allows project teams to handle their tasks in a self-organized and flexible manner. Each project team assumes product responsibility from start to end, from software development to operation. The teams organize themselves with agile working methods in order to deliver products to customers faster and more efficiently.
Description of the position:
In your role as a Big Data Platform Engineer, you will be responsible for the continuous development of Commerzbank's central Hadoop Platform, which is one of the foundations of the Commerzbank 4.0 transformation into a digital technology company.
You will join an innovation-driven team that provides this key technology, based on the modern Hortonworks Data Platform and Hortonworks Data Flow distributions. These form the base stack of our Data Lake and Analytics Platform. This software stack needs to be scaled out in terms of size, number of users, segments and use cases, and enhanced with new technology, to fit the requirements of our clients.
Your tasks:
- Roll out upgrades and configuration changes for the services of the Hadoop Platform
- Automate platform configuration with Ansible to provide a continuously scalable platform
- Support data scientists and use case developers with problems on your platform
- Solve challenges such as embedding modern technology into an existing infrastructure
- Support use case application operation as 3rd-level support
- Provide 24/7 on-call service
Required skills:
- Strong knowledge of UNIX/Linux (Red Hat RHEL 7.x)
- Good knowledge of Hortonworks Data Platform (HDP) 2.6.x or higher
- Basic knowledge of Hortonworks Data Flow (HDF) 3.x or higher
- Basic knowledge of Spark
- Basic knowledge of Hive
- Basic knowledge of Google Cloud Platform (GCP)
Foreign language skills:
- English B2 level - mandatory
- German - nice to have
Please add the following disclaimer to your application:
I have read the content of the information clause, including the information about the purpose and methods of processing personal data, and I am aware of my right to access my personal data and of my right to have it corrected, rectified and deleted.