Job Description
Job Summary:
As a Senior Data Engineer, you will participate in discovery processes with stakeholders to identify business requirements and expected outcomes. You will also provide operational support and maintenance for products and services and mentor junior team members.
Responsibilities:
- Design and engineer high-performance, large-scale data engineering projects, producing maintainable and secure code with automated testing in a continuous integration environment
- Develop production grade, consumable data views
- Ensure solutions support all functional and non-functional requirements
- Participate in operational support and maintenance of products and services
- Participate in discovery processes with stakeholders to identify business requirements and expected outcomes
- Mentor junior team members
- Coordinate and collaborate with offshore teams
Basic Qualifications:
- Relevant cloud data engineering work experience
- Ability to perform across multiple phases of development for multiple complex projects, including technical design, build, and end-to-end testing
- Passionate about delivering data engineering projects and features in a team environment
- Demonstrated ability to quickly learn new technologies
- Strong troubleshooting skills: able to determine impacts, resolve complex issues, and exercise sound judgment and initiative in stressful situations
- Strong oral and written communication and interpersonal skills
Must-have technical qualifications:
- Fundamentals of data pipelining, ELT/ETL, data architecture, and the overall data lifecycle
- Cross-platform development languages: Python preferred (Java specialty OK)
- Experience with the Snowflake data warehouse
- SQL and scripting proficiency
- Experience with relational and NoSQL databases (e.g., MongoDB, DynamoDB, Redis, HBase, Cassandra)
- Cloud technologies including AWS and Google Cloud Platform (GCP)
Preferred Qualifications:
- Queuing Technology – Kafka, RabbitMQ, Redis, SQS, Kinesis Streams, Kinesis Firehose
- Data Processing – EMR, Spark, Glue, Spark Streaming/Flink
- Containers - Docker, Docker Swarm, Docker Applications
- CI/CD - Jenkins/Codebuild/GitLab
- Security - IAM roles, wire encryption, KMS, Kerberos, Authz, AD
- Infrastructure as Code - Terraform, CloudFormation, CDK
Preferred Education:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or related field OR equivalent work experience
This role is considered hybrid, which means the employee will work a portion of their time on-site from a Company designated location and the remainder of their time remotely.
Job ID: 109212