Analyst (Data / Business / Application)

Data Architects

About this role

Our ability to deliver brilliant, personalised experiences for our 30 million customers, drawing on BT's wealth of data (around 28 petabytes), is fundamental to our future success. As a business we are investing extensively in the automation of our operations and networks, and in building our base management and data capabilities. This will allow us to make better decisions with data and to act on them in an automated way across all our customer interactions.

Data-driven automation and decision making have never been more important to BT. We’re creating the best personal experiences for our customers to help them stay connected, and that mission is underpinned by high-quality, low-latency data. This role will play a critical part in BT Digital, helping to build our data pipelines and data systems, and the data products built on top of them.

As part of a team of highly skilled data engineers, you will create and maintain our logical data models, integrating data from multiple sources into a central cloud repository. You will apply data cleansing and data standardisation rules, providing clear documentation of the business rules embedded in the system, and resolving data quality issues through your team where needed. You will work closely with the wider team to understand what the data journey needs to look like, and with the Data Architect, Data Science and AI teams to develop our products and services. You will write and maintain data engineering user documentation to provide transparency and maintain the knowledge base within the team.
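
By way of illustration only, here is a minimal sketch (assuming a simple Python-based pipeline) of how a single cleansing or standardisation rule might be written and documented; the field name and the rule itself are hypothetical examples rather than anything from BT's actual systems:

    # Illustrative sketch only: one hypothetical standardisation rule, documented inline.
    import re

    def standardise_postcode(raw):
        """Uppercase, remove whitespace and reject clearly invalid values."""
        if raw is None:
            return None
        value = re.sub(r"\s+", "", raw.upper())
        # UK postcodes are 5-7 alphanumeric characters once spaces are removed.
        return value if 5 <= len(value) <= 7 and value.isalnum() else None

    def cleanse(record):
        """Apply the standardisation rules before loading into the central repository."""
        record["postcode"] = standardise_postcode(record.get("postcode"))
        return record

    print(cleanse({"customer_id": 1, "postcode": " sw1a 1aa "}))
    # {'customer_id': 1, 'postcode': 'SW1A1AA'}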

You'll have the following responsibilities

  • Applies specialist data expertise to develop and advise on the approach for a range of complex, high-impact data solutions and services
  • Assures data availability and quality in Digital and across all of BT's CFUs
  • Helps to resolve technical problems
  • Proactively identifies potential new data sources and assesses the feasibility of ingesting them
  • Develops tangible and ongoing standardisation, simplification and efficiency of engineering processes, reviewing and acting on continuous improvement opportunities
  • Assures high-quality, comprehensive data flows and manages a team that provides a consumption layer giving the business access to all the data it needs
  • Ensures all data is handled in compliance with relevant policies and regulations
  • Ensures that all data acquired is fully described/understood and communicated using appropriate tools
  • Productionises any tactical data feeds, including their documentation

Essential skills & experience

  • Implements the strategy for Data Quality, aligned to the overall Base Management and Data strategy
  • Coordinates the data solutions and data quality plan within the team
  • Drives data availability and quality standards, tools and frameworks in Consumer and across our systems
  • Coaches and works together with members of the Data Engineering team
  • Resolves critical technical problems at the team or department level
  • Drives tangible and ongoing standardisation, simplification and efficiency of engineering processes, reviewing and acting on continuous improvement opportunities
  • Ensures high-quality, well-monitored data flow jobs and manages a team that provides a consumption layer giving the business access to all the data it needs
  • Sound awareness of Data Management best practice, including data lifecycle management
  • Extensive skills in SQL and data quality tools, at both production grade and analytical level, gained through intensive application in a commercial business environment
  • Can deliver complex big data solutions with structured and unstructured data
  • Excellent oral and written communication skills for all levels of an organisation
  • Collaborates across teams to identify how work activities are related and to highlight inefficiencies, helping to remove barriers and find the resources or support needed to improve processes
  • Some knowledge of cloud computing patterns, workflows and services, and how they relate to a big data platform
  • Advanced maths skills (linear algebra, Bayesian statistics, group theory)
  • Background in machine learning and software engineering, including frameworks such as TensorFlow or Keras

Desirable skills & experience

  • Experience in the big data domain and cloud technologies (AWS, GCP)
  • Experience in a story-driven, agile environment.
  • Expertise in migrating from on-premise technologies to cloud-based warehouse technologies (e.g. GCP BigQuery)
  • Experienced in deploying data solutions and cloud infrastructure via CI/CD pipelines
  • Understanding of Infrastructure as Code tools (e.g. Terraform, CloudFormation)
  • Knowledge of Docker/Kubernetes, and how these can be used to simplify deployments.
  • Experience of building large-scale data pipelines on at least one cloud platform (GCP or AWS preferred)