We are looking for a Data Engineer who will be responsible for designing, developing, and implementing business use cases into data sets. This role involves overseeing the development of ETL flows using tools such as SSIS (SQL Server Integration Services), Informatica, and PySpark, and fostering a data-driven culture across the bank. The candidate will collaborate with stakeholders to ensure data accuracy, usability, and scalability.
Main Duties & Responsibilities:
- Work in a production environment and monitor ETL flows.
- Ability to work independently and as part of a team.
- Experience in the banking domain is preferred.
- Flexibility to work with the production support team.
- Enhance ETL flows based on business requirements.
- Resolve issues in the production environment within SLAs.
Qualifications
Experience:
- Minimum of 3 years of experience with data integration, ETL tools, and data warehousing concepts.
- At least 3 years of hands-on experience with the big data ecosystem.
- Proven experience designing and implementing business use cases into data sets and models.
- Strong background in shell scripting, SQL, and PySpark.
Education, Skillsets & Qualifications:
- Bachelor's degree in Computer Science or equivalent.
- Experience with ETL tools such as SSIS, Informatica, or equivalent.
- Deep knowledge of data warehousing concepts.
- Exposure to big data components (HDFS, Hive, Impala, YARN) is essential.
- Knowledge of relational and NoSQL databases, data modelling, and database design.
- Python and PySpark skills are an added advantage.
- Experience with Informatica Data Quality is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Knowledge of GoAnywhere and SFTP tools is an added advantage.