Software Developer / Engineer / Architect

Data Engineer

Description

About Us:

Our mission is to elevate leading brands through unforgettable digital connections with their customers. Sitecore delivers a composable digital experience platform that empowers the world’s smartest and largest brands to build lifelong relationships with their customers. A highly decorated industry leader, Sitecore brings together content, commerce, and data into one connected platform that delivers millions of digital experiences every day. Thousands of blue-chip companies, including American Express, Porsche, Starbucks, L’Oréal, and Volvo Cars, rely on Sitecore to provide more engaging, personalized experiences for their customers. Learn more at Sitecore.com.

Sitecore’s foundation is our diverse group of passionate, smart, innovative, and collaborative individuals located across four continents and over 25 countries. Having a wide range of perspectives, experiences, and skills is what makes us the company we are today. The Sitecore values are what drive and unite us across the globe.

About the Role / The Opportunity:

The Data Engineering and Analytics team is focused on making the large volumes of data ingested by the platform available for query and analysis. Our challenges include building pipelines to extract, transform, and load (ETL) data from multiple sources, and designing storage layouts and schemas to improve query and ETL performance, all while ensuring our solutions are fully automated, cost-effective, and massively scalable for many clients with large volumes of data. From the configuration of our analytical pipelines to the development of the jobs that run on them, we take full ownership of our features. We are hosted entirely on AWS and build on services such as AWS CDK with CloudFormation, Step Functions, EMR, and Athena, with the ETL and enrichment jobs themselves written in Scala using Spark 3. We are looking for a Data Engineer to join us and help improve our data pipelines.
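As an illustrative sketch only (the posting does not describe the pipeline internals), a Scala Spark 3 ETL job of the general shape this team works on might read raw events, enrich them, and write partitioned Parquet for downstream Athena queries. All paths, column names, the app name, and the `enrich` step below are hypothetical, not Sitecore's actual code:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object EventEtlJob {
  // Hypothetical enrichment step: derive a date column used for partitioning.
  def enrich(events: DataFrame): DataFrame =
    events.withColumn("event_date", to_date(col("event_timestamp")))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-etl") // hypothetical job name
      .getOrCreate()

    // Hypothetical S3 locations; a real job would take these as arguments.
    val raw = spark.read.json("s3://example-bucket/raw/events/")

    enrich(raw)
      .write
      .mode("overwrite")
      .partitionBy("event_date") // date partitions let Athena/Spark prune data at query time
      .parquet("s3://example-bucket/curated/events/")

    spark.stop()
  }
}
```

Partitioning by date is one common schema-design choice for the query-performance work the paragraph mentions; the team's actual storage layout may differ.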

What You’ll Do:

  • Help us to improve the scalability, reliability, automation and cost of our data pipelines
  • Design, develop, test, deploy, maintain and improve our software stack
  • Lead complex technical conversations and decisions
  • Own individual project priorities, deadlines and deliverables

What You Need to Succeed:

  • Experience with Big Data tools such as Spark, Hive, or Kafka
  • Good knowledge of SQL and query optimization
  • Strong functional and object-oriented programming experience
  • Experience running ETL over large volumes of data
  • DevOps experience - not necessarily with the AWS services listed above, but familiarity with CI/CD and Infrastructure as Code tools is a plus

Additional Skills That Could Set You Apart:

  • Experience with OpenAPI/Swagger and JSON Schema
  • Experience building microservices and RESTful APIs