Key Responsibilities:
- Develop and maintain Scala microservices (built with the Play Framework) for data processing and fraud detection.
- Design and implement Infrastructure as Code (IaC) using Terraform.
- Build and optimise large-scale data ingestion pipelines in AWS (S3, Step Functions, Lambda in Scala & Python, EFS, RDS Aurora).
- Develop and orchestrate data processing pipelines (Scala and Python) with Airflow, leveraging S3, RDS Aurora, and EMR.
- Contribute to cloud-native development using functional Scala and AWS services (S3, ECS, ECR, EC2).
- Ensure solutions are secure, robust, and aligned with government standards.
- Collaborate with cross-functional teams to improve fraud detection processes through innovative software and data engineering.
Essential Skills & Experience:
- Strong experience with Scala (including functional programming) and Python for data engineering.
- Proven track record in building microservices (Play Framework preferred).
- Hands-on expertise with AWS (S3, Step Functions, Lambda, ECS, ECR, EC2, RDS Aurora, EMR, EFS).
- Solid experience with Terraform and Infrastructure as Code.
- Proficiency with data pipeline orchestration tools (Airflow preferred).
- Background in data ingestion, transformation, and processing at scale.
- Understanding of fraud detection systems or similar large-scale analytics platforms.
- Eligibility for SC clearance (or currently active SC clearance).
Desirable:
- Previous experience in public sector projects.
- Experience with security-focused engineering in data-intensive environments.