POSITION INFO:
The Data Engineering & Streaming Lead is responsible for building, enhancing, and maintaining real-time data pipelines, working closely with infrastructure and operations teams to keep the data platform reliable. As a senior engineer, the candidate will develop and maintain data ingestion pipelines that support the bank's continued growth through efficient data handling, and will collaborate with multiple teams to translate data needs into scalable platforms and services.
Duties & Responsibilities
Operational Delivery
- Build, enhance, and maintain real-time data pipelines.
- Work with infrastructure and operations teams to maintain data infrastructure.
- Identify, document, and resolve feature gaps while implementing solutions.
- Modernize and maintain services and tooling to ensure resilience, fix data discrepancies, and enhance customer experience.
- Monitor daily execution, diagnose and log issues, and fix pipelines to ensure SLAs are met.
- Mentor junior engineers through code reviews, best practices, and knowledge sharing.
Technical Leadership
- Participate in the engineering community of practice.
- Share AWS knowledge and practical experience within the community.
- Challenge and contribute to the development of architectural principles and patterns.
Compliance
- Ensure solutions adhere to Olympus patterns, guidelines, and standards.
- Operate within project environments and participate in continuous improvement efforts.
Delivery Management
- Follow and participate in Agile methodologies such as sprint planning, backlog grooming, retrospectives, demos, and PI planning.
Skills & Experience Required
Essential:
- Strong AWS Data Engineering experience.
- Hands-on experience with Kafka, including the Confluent platform.
- Strong expertise in real-time data streaming tools such as Kafka and Flink.
- Proficiency in:
  - Apache Spark
  - Apache Storm
  - Cloudera Data Flow
  - Databricks
  - Hadoop
  - AWS Kinesis Streaming
- Experience in developing cloud-based solutions.
- 2-3 years of experience designing and developing streaming data pipelines for ingestion or transformation using AWS technologies.
- Experience with distributed log systems such as Kafka and AWS Kinesis.
Qualifications & Certifications
- Bachelor's degree in Computer Science, Information Systems, Big Data, Business Administration, Commerce, or an equivalent field.
- AWS Data Engineer Certification (Advantageous).
- Relevant technical certifications (Preferred).
* To comply with the POPI Act, we require your permission to retain your personal details on our database for future career opportunities. By completing and returning this form, you give PBT your consent to do so.
** If you have not received any feedback after 2 weeks, please consider your application unsuccessful.