Outstanding Opportunity in Data Engineering in Saudi Arabia

As part of Saudi Arabia’s digital transformation journey under Vision 2030, Cody Software Solutions is partnering with a leading financial institution in the Kingdom to hire talented Data Engineers.
In this role, you will design, build, and optimize scalable data pipelines and platforms to support advanced analytics, AI, and business intelligence initiatives across the banking sector. You will work with big data technologies, modern ETL/ELT frameworks, and cloud-based infrastructures to ensure seamless data integration, transformation, and accessibility for enterprise-wide use.
Section I: Role Purpose

Cody Software Solutions is seeking Data Engineers – Big Data & Analytics on behalf of its distinguished Saudi partner in the banking sector.
The role is responsible for designing, developing, and maintaining robust data pipelines, ensuring reliable data availability, quality, and performance. The position plays a critical role in enabling advanced analytics, supporting AI-driven initiatives, and delivering business-ready data products that enhance decision-making and customer experiences.
Section II: Key Responsibilities / Accountabilities

Core Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines for ingesting, transforming, and integrating structured and unstructured data.
- Build and optimize data workflows on big data platforms (e.g., Hadoop, Cloudera).
- Develop and manage data integration solutions using Informatica and related tools.
- Write and optimize Apache Spark jobs for distributed data processing at scale.
- Support banking use cases by ensuring availability of accurate and high-quality data for credit risk, fraud detection, regulatory compliance, and customer insights.
- Collaborate with data scientists, BI teams, and AI engineers to provide clean and well-structured datasets for modeling and analysis.
- Ensure adherence to data governance, security, and quality standards.
- Contribute to MLOps and DataOps practices by enabling reproducible, automated, and scalable data workflows.
Other Key Accountabilities
- Operations Support: Monitor, troubleshoot, and optimize data jobs and pipelines to ensure high system availability and reliability.
- Policies & Procedures: Support the development of data engineering standards and best practices to ensure consistency and compliance.
- Reporting: Prepare accurate reports and system documentation for audits and regulatory purposes.
- Cybersecurity: Comply with information security policies, protect sensitive banking data, and report risks proactively.
Section III: Key Interactions

- Internal: Work closely with data scientists, AI/ML engineers, BI teams, and product owners to ensure high-quality data delivery.
- External: Engage with technology vendors (e.g., Cloudera, Informatica) to stay updated on best practices and new features.
Section IV: Qualification & Experience

Minimum Qualifications
- Proven experience as a Data Engineer at mid to senior level.
- Strong communication skills in English and Arabic.
- Expertise in ETL/ELT processes and data pipeline design.
- Hands-on experience with:
  - Big Data platforms (Cloudera, Hadoop ecosystem).
  - Informatica for enterprise-grade data integration.
  - Apache Spark for distributed processing.
- Familiarity with data operations, system monitoring, and troubleshooting.
Preferred Experience
- Previous work in the banking or financial services industry.
- Knowledge of regulatory compliance frameworks for data in the banking sector.
- Experience with cloud-based data platforms (AWS, GCP, or Azure).
- Proficiency in SQL, Python, or Scala for data engineering tasks.
Education
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
Preferred Certifications
- Cloudera Certified Data Engineer or Administrator.
- Informatica Professional Certification.
- Cloud certifications (AWS Data Analytics, GCP Data Engineer, or Azure Data Engineer).
Section V: Competencies

Core Competencies
- Driving Results – Deliver high-quality data systems aligned with organizational priorities.
- Collaboration – Work effectively within cross-functional teams.
- Customer Centricity – Ensure data products meet business and customer needs.
- Digital Mindset – Leverage modern tools and platforms to optimize data engineering.
- Effective Communication – Articulate complex technical concepts to non-technical stakeholders.
Technical Competencies
- Big Data Platforms (Hadoop, Cloudera).
- ETL/ELT tools (Informatica, Talend).
- Distributed data processing (Apache Spark, Kafka).
- SQL and programming languages (Python, Scala).
- Data Governance, Security, and Quality frameworks.
- Cloud Data Engineering (AWS, GCP, Azure).
- DataOps and system monitoring best practices.