Responsibilities
• Partner with technical and non-technical colleagues to understand data and reporting requirements.
• Work with Engineering teams to collect required data from internal and external systems.
• Design table structures and an ETL strategy to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem.
• Develop data quality checks for source and target data sets. Develop UAT plans and conduct QA.
• Develop and maintain ETL routines using ETL and orchestration tools such as Airflow, Luigi, and Jenkins (see the sketch after this list).
• Document and publish metadata and table designs to facilitate data adoption.
• Perform ad hoc analysis as necessary.
• Perform SQL and ETL tuning as necessary.
• Develop and maintain dashboards/reports using Tableau and Looker.
• Coach and mentor team members to improve their designs and ETL processes.
• Create and conduct project/architecture design reviews.
• Create proofs of concept (POCs) when necessary to test new approaches.
• Design and build modern data management solutions.
• Enforce common data design patterns to increase code maintainability.
• Conduct peer code reviews and provide constructive feedback.
• Partner with team leads to identify, design, and implement internal process improvements.
• Automate manual processes, optimize data delivery, and understand when to re-design architecture for greater scalability.
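
To give a concrete sense of the orchestration work described above, here is a minimal sketch of an Airflow DAG wiring an extract step to a data-quality check and a load step. It assumes Airflow 2.4+ and the TaskFlow API; the DAG id, table name, sample rows, and checks are hypothetical placeholders, not part of any actual pipeline here.

# Minimal illustrative Airflow DAG: extract -> data-quality check -> load.
# All names (dag_id, table, thresholds) are hypothetical; assumes Airflow 2.4+.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling rows from a source system.
        return [{"id": 1, "amount": 42.0}, {"id": 2, "amount": 17.5}]

    @task
    def quality_check(rows: list[dict]) -> list[dict]:
        # Fail the run early if the source extract looks wrong.
        assert rows, "extract returned no rows"
        assert all(r["amount"] >= 0 for r in rows), "negative amounts found"
        return rows

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for writing to a hypothetical warehouse target table.
        print(f"loading {len(rows)} rows into analytics.daily_amounts")

    # TaskFlow infers the task dependencies from these calls.
    load(quality_check(extract()))


example_etl()

Placing the quality check between extract and load means a bad source pull fails the run before anything is written to the target, which is the usual motivation for the source/target checks mentioned above.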
Basic Qualifications
• 3+ years of relevant professional experience.
• 2+ years’ experience implementing and reporting on business key performance indicators (KPIs) in data warehousing environments. Strong understanding of data modeling principles, including dimensional modeling and data normalization.
• 2+ years’ experience using analytic SQL, working with traditional relational databases and/or distributed systems such as Hadoop/Hive, BigQuery, or Redshift (a minimal example follows this list).
• 1+ years of experience with programming languages (e.g., Python, R, Bash).
• 1+ years of experience with workflow management tools (e.g., Airflow, Oozie, Azkaban, UC4).
• Expert-level understanding of SQL engines, with the ability to conduct advanced performance tuning.
• Experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase).
• Familiarity with data exploration/visualization tools such as Tableau, Chartio, etc.
• Ability to think strategically, analyze and interpret market and consumer information.
• Strong communication skills – written and verbal presentations.
• Excellent conceptual and analytical reasoning competencies.
• A degree in an analytical field such as economics, mathematics, or computer science is desired.
• Comfortable working in a fast-paced and highly collaborative environment. A great team player who embraces collaboration but also works well individually while supporting multiple projects in parallel.
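
As an illustration of the analytic SQL called out above, the snippet below runs a window-function query computing a simple running-total KPI. SQLite (via Python's built-in sqlite3, version 3.25+ for window functions) stands in for a warehouse engine; the table and column names are invented for this sketch, and the same pattern carries over to Hive, BigQuery, or Redshift.

# Illustrative analytic SQL: a window function computing a running-total KPI.
# Table/column names are invented; SQLite stands in for a warehouse engine.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE daily_signups (day TEXT, signups INTEGER);
    INSERT INTO daily_signups VALUES
        ('2024-01-01', 10), ('2024-01-02', 14), ('2024-01-03', 9);
    """
)

rows = conn.execute(
    """
    SELECT day,
           signups,
           SUM(signups) OVER (ORDER BY day) AS running_total
    FROM daily_signups
    ORDER BY day
    """
).fetchall()

for day, signups, running_total in rows:
    print(day, signups, running_total)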
Job ID: 107251