- 3+ years of development / data engineering experience
- 2+ years developing with AWS cloud solutions
- 2+ years developing in Spark/PySpark
- 2+ years developing AWS Glue ETL
- 2+ years with AWS storage models (S3 and DynamoDB)
- Some hands-on development experience with on-prem ETL tooling (Ab Initio preferred; Informatica)
Requirements
- Bachelor's degree in computer science or a related field.
- 12+ years of experience in a similar role.
- Strong experience with AWS Glue, EMR, and Hudi; experience extracting data from multiple sources and loading data into data lakes and AWS Redshift
- Experience working with AWS Elasticsearch, RDS, and PostgreSQL preferred
- Experience with AWS services such as Lambda, EMR, SNS/SQS, EventBridge, Lake Formation, and Athena
- Experience integrating applications/systems (data producers) with enterprise Kafka topics (Confluent Kafka integration with AWS S3 and Redshift)
- Experience in requirements analysis, data analysis, application design, application development, and integration testing
- Working knowledge of on-prem extraction, transformation, cleansing, and loading (ETL) methodology and principles
- Experience implementing and contributing to DevOps practices (GitLab, maintaining CI/CD pipelines)
- Experience with Java services is nice to have.
Please note that this role is open to US Citizen (USC) and Green Card (GC) holders only.
Interested and qualified candidates should send their resume to uunyime@modus-lights.com using the job title as the subject of the email.