Lead DevOps Engineer

Job Description

Our Centre for Intelligent Power, located at Eaton’s global headquarters in Dublin, applies data science to transform all aspects of our company. We’re working on solving the technical challenges of today’s digital world while developing groundbreaking technology that will power the next generation of products and services. Our multidisciplinary team of data scientists, data engineers, and UX/UI designers is developing products based on AI, edge computing, and machine learning.
We’re continuing to expand our organization and are now recruiting a Lead DevOps Engineer to support the expansion and management of our large-scale, highly distributed edge and cloud data processing pipelines.
You will work within our world-class, diverse, and inclusive team and will have the opportunity to support all of Eaton’s businesses (Electrical, eMobility, Vehicle, and Aerospace), addressing some of the toughest climate change challenges.

Your key deliverables:

  • Own and develop the deployment strategy for our data processing and machine learning platform using best practices in infrastructure as code on Microsoft Azure.
  • Work with data scientists on refining their algorithms for production deployment, artifact management, performance monitoring, and model accuracy validation, and automate and streamline parts of this process.
  • Ensure that infrastructure and code are delivered incrementally and in accordance with our internal security deployment policies.
  • Ensure pipelines are deployable and sustainable at scale and meet business needs.
  • Author high-quality, high-performance, unit-tested code to extract and transform data based on business and data science needs.
  • Work directly with stakeholders, engineering, and test teams to create high-quality solutions that solve end-user problems.
  • Help build, support, and mentor a small team of DevOps and MLOps engineers.
  • Develop and execute agile work plans for iterative and incremental project delivery.
  • Explore and recommend new tools and processes that add capabilities and efficiencies across the data preparation pipeline.
  • Deliver projects, products, and platforms with exceptional impact, in terms of scalable data processing and application architectures and technical deliverables, throughout the project lifecycle.

Required:

  • Bachelor’s degree (Master’s advantageous) in computer science or an equivalent software engineering discipline.
  • Experience managing deployments to cloud systems (preferably Microsoft Azure) through an infrastructure-as-code solution (preferably Terraform).
  • Knowledge of cloud-based networking, security and access management, log analysis, etc.
  • Experience deploying and managing Kubernetes clusters, building Docker images, and managing images.
  • Experience researching and applying DevOps best practices and processes, such as Git branching strategies, version control, and code review.
  • Full design and implementation of DevOps pipelines to validate, build, test, and deploy artifacts such as Scala packages, Python packages, and Docker containers.
  • Experience implementing automated solutions/workflows to improve code quality, e.g. static code analysis, unit testing, integration testing, etc.
  • Strong problem solving and software debugging skills.
  • Good judgment, time management, and decision-making skills.
  • Experience with Agile development methodologies and concepts.
  • Excellent verbal and written communication skills, including the ability to explain technical concepts effectively and to whiteboard solutions.

Desired:

  • Proficient in Linux environments.
  • Solid knowledge of Python; PySpark and Scala advantageous.
  • Proficiency with cloud technologies (IaaS, PaaS, serverless), microservice design, and CI/CD and DevOps (Azure) technologies.
  • Knowledge of the basic concepts of data science & machine learning.
  • Experience in developing and maintaining AI and ML DevSecOps pipelines to support model training and prediction requirements.
  • Experience with deploying code and artifacts that use ML frameworks and libraries such as TensorFlow, Keras, PyTorch, and Kubeflow.
  • Experience with big data orchestration frameworks such as Apache Airflow.
  • Keeps abreast of emerging software development and engineering tools, trends, and methodologies.

What Eaton offers:

  • Challenging projects in a dynamic, collaborative team.
  • Excellent working environment – safety and ethics are very important to us.
  • Inclusion & Diversity – Openness to diversity widens our access to the best talent; inclusion allows us to engage that talent fully.
  • Learning & Development – We invest in our employees for the long term, not just with salary and benefits but with ongoing learning and development opportunities made available through Eaton University.