Software Developer / Engineer / Architect

Staff Data Engineer

Zendesk is a service-first CRM company that builds powerful, customizable software crafted to improve customer relationships. At Zendesk, we encourage growth and innovation, and we believe in giving back to the communities we call home.

Our Enterprise Data & Analytics (EDA) team is looking for a talented and experienced Staff Data Engineer to join our growing platform & engineering team. You’ll work in a collaborative Agile environment, using the latest engineering best practices with involvement in all aspects of the software development lifecycle. You will be responsible for ensuring the team makes sound design and configuration decisions to develop curated data products, applies standard architectural practices, and supports the Data Product Managers in evolving core data products. You will primarily develop on Google Cloud Platform, working with technologies such as Linux, Docker, Kubernetes, BigQuery, Kafka, Airflow, and Python.

What you get to do every single day:

  • Solve complex problems and provide sustainable solutions
  • Help build and operate our internal analytics platform
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources
  • Work in an agile development environment
  • Participate in an on-call rotation along with the rest of the team
  • Use a wide variety of technologies, learn new things, wear lots of hats (actual hats optional)
  • Participate in diverse projects and collaborate with engineering, product, and analytics teams
  • Have fun and enjoy your time @ Zendesk

What the stack looks like:

  • Our platform runs on GCP and AWS, fully orchestrated by Kubernetes
  • We use Terraform and Helm to deploy and manage platform components
  • Our data pipeline is built on Kafka, EMR, GCS, BigQuery and Airflow
  • The underlying code is written in Python and Go (a minimal pipeline sketch follows below)
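
To give a flavor of the pipeline work described above, here is a minimal sketch of a daily Airflow DAG that loads newline-delimited JSON from GCS into BigQuery, in the spirit of the stack listed here. This is an illustrative sketch, not part of the actual codebase: the DAG id, bucket, dataset, and table names are hypothetical placeholders, and it assumes Airflow 2.4+ with the apache-airflow-providers-google package installed.

    # A hypothetical daily load: newline-delimited JSON in GCS -> BigQuery table.
    # Assumes Airflow 2.4+ and the apache-airflow-providers-google package.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="gcs_to_bq_daily",         # placeholder DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ):
        # Tasks defined inside the `with` block attach to the DAG above.
        GCSToBigQueryOperator(
            task_id="load_events",
            bucket="example-raw-events",                # placeholder bucket
            source_objects=["events/{{ ds }}/*.json"],  # templated daily partition
            source_format="NEWLINE_DELIMITED_JSON",
            destination_project_dataset_table="example_project.analytics.events",
            write_disposition="WRITE_TRUNCATE",         # idempotent daily reload
        )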

What you bring to the role:

  • Ability to drive data architecture and integration design and development discussions with engineering and other teams
  • Experience programming and automating data processing systems in Linux/cloud environments
  • Ability to quickly master new technologies
  • 7+ years of software development or DevOps/DataOps experience
  • Ability to work effectively in a dynamic, occasionally interrupt-driven environment that includes geographically spread teams and customers
  • Experience in developing and operating high-volume, high-availability environments
  • Previous experience with Linux, Docker and Kubernetes
  • Previous experience with AWS or GCP
  • Good command of Python or Go
  • BA/BS degree in Engineering, CS, or equivalent