Position Summary
As an AI Engineer, you’ll support the development and deployment of transformational AI capabilities for large clients. You’ll combine leading open-source tooling and techniques with a suite of customer experience libraries and solutions that automate the management of cross-channel communications with consumers. We make heavy use of the Python machine learning ecosystem and build systems that deliver massive decisioning throughput under tight latency constraints in our real-time systems. If you have deep experience designing, implementing, automating, and deploying machine learning pipelines and workflows, we want to hear from you!
Your responsibilities will include:
Participate in all phases of the model development lifecycle
Productionalize machine learning solutions to drastically reduce total cost per decision, replacing expensive human-driven decisions with cheaper, more effective machine-driven ones
Help design and implement functional requirements for client engagements
Collaborate with our services data scientists to deploy and use libraries and APIs which make machine learning for customer use cases both easy and powerful, and help brands gain deep understandings of their consumers
Prepare technical documentation
Integrate with surrounding technology components and services
Coach junior team members
Skills and traits that make for success in this role:
Deep interest in data science and software development
Eager to work with data scientists, fellow engineers, and product owners
Experienced with collaborative techniques like pair programming and whiteboard design sessions
Continuously learning and improving, and constantly exploring new languages, tools, and techniques
Our team
You’ll join a team of passionate, talented "pure" data scientists and hybrid AI engineers who collaborate to design, build, and maintain cutting-edge AI solutions that arm our clients with real-time customer insights that deliver significant value. If you are intellectually curious, hardworking, and solution-oriented, you will fit right into our fast-paced, collaborative environment.
Qualifications
Required:
Bachelor’s Degree in computer science, engineering, or a related field
3+ years of experience authoring, supporting, or providing a data science platform to data scientists:
Deep knowledge of the machine learning lifecycle, and of ways to facilitate collaboration and productivity in each of its phases
Exposure to working with data scientists and expertise in finding solutions to workflow problems
Knowledge of common machine learning frameworks and libraries, and of ways to productionalize their inputs and outputs
Comfort with various machine learning techniques and their practical implementation, particularly reinforcement learning
Experience with one or more common workflow / pipelining frameworks (Kubeflow, MLflow, Argo, or equivalents)
Strong knowledge of the Python ecosystem, the Jupyter ecosystem (Lab, Notebook, Binder) and their libraries, norms, and tooling
Exposure to AutoML tooling (H2O, DataRobot or equivalents)
Experience in deploying and maintaining enterprise-scale machine learning applications in production
2+ years of experience writing well-tested production software
1+ years of experience with distributed, high-throughput, low-latency architectures
1+ years of experience building software on top of major container technology (Kubernetes, Docker, or similar)
Strong testing mindset with experience writing tests at various levels of granularity
Familiarity with Continuous Integration tools (GitHub Actions, Travis CI, etc.)
A history of good collaboration with DevOps and Project Managers on meeting project goals
Proven track record working with products from major cloud providers (AWS, GCP, Azure, etc.)
Limited immigration sponsorship may be available
Helpful, but not required:
Experience with large consumer data sets used in performance marketing
Exposure to or expertise in writing and running Terraform or other infrastructure-as-code automation
Well-versed in, or a contributor to, data-centric open-source projects
Experience in performance analysis and optimization of machine learning applications, e.g., in optimizing code written by others
Presence in, or contributions to, projects within the wider open-source ecosystem
Proven ability to communicate both verbally and in writing within a high-performance, collaborative environment
Exposure to commonly used relational and non-relational databases
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.
Job ID: 107246