The Global Data Strategy & Intelligence team at Salesforce provides strategic, actionable insights to executives, Sales teams and other internal stakeholders. We build analytics, predictive & descriptive models and data products in a fast-paced environment where a self-starter will thrive. The Data Engineer will be responsible for designing, developing and documenting data pipelines related to the global standard dashboards built for Sales. This role provides an opportunity to work closely with business stakeholders and technical teams across the organization. The successful candidate should have a strong interest in data analytics, data integration and automation, as well as a strong zeal to understand and learn about the business.
- Lead, design, and develop analytical solutions for our partners to help drive smarter business decisions
- Work with Business Stakeholders to scope requirements, conduct UAT and gather feedback
- Build ETL solutions with SQL based technologies like Oracle, Snowflake and Spark
- Serve the team as a subject matter expert & mentor for ETL design and other related data and programming technologies
- Proactively identify performance & data quality problems and drive the team to remediate them. Advocate for architectural and code improvements that increase execution speed and reliability
- Design, develop and document database objects following sound architectural principles and patterns
- Work effectively in an unstructured and fast-paced environment both independently and in a team setting, with a high degree of self-management
- Prioritize and execute multiple tasks in a highly dynamic environment with a results-oriented mindset
- Strong problem-solving skills with acute attention to detail and the ability to meet tight deadlines and project plans
- Demonstrate operational excellence and a strong can-do attitude
- You find satisfaction in a job well done and thrive on solving head-scratching problems
- Be prepared to learn new platforms and syntax with short ramp-up cycles
- Be prepared for changes in business direction and understand when to adjust designs
- Bachelor's degree in Computer Science or an equivalent/related field
- At least 2 years of professional experience with SQL
- 2+ years of experience implementing and managing Python open source data tooling such as Airflow, pandas, Jupyter
- Strong experience working with RDBMS & MPP databases (Oracle, PostgreSQL, Snowflake), including the ability to optimize queries for high-volume environments, along with a background in Data Warehousing concepts and schema design
- History of designing, building and launching highly efficient & reliable data pipelines to move data (both large and small volumes) throughout a Data Warehouse
- Experience with version control systems (GitHub, Subversion) and deployment tools (e.g. continuous integration) required.
- Experience with programming languages like Scala & scripting in Bash.
- Experience working with Public Cloud platforms like GCP, AWS, or Azure.
- Experience working with Linux and debugging performance issues. Experience with the Salesforce Platform is a big plus.
- Ability to research, analyze, interpret, and produce accurate results within reasonable turnaround times, with an iterative, rapid-prototyping mindset.
- Familiarity with Scrum/Agile project management methodologies
Prior experience in the technologies below is an added bonus:
- Data Visualization: Tableau CRM, Tableau
- ETL Technologies: MuleSoft
- Programming Languages: Java, Scala, Bash Scripting