AWS/ETL Developer – Software Engineer

  • International Personal Finance
  • Leeds, UK
  • Sep 25, 2019
Full time · Information Technology

Job Description

To support the Data Science & Strategy (DSS) function in delivering data initiatives across workstreams relating to IPF group change, governance processes and system developments, as a key part of IPF’s data programme.

You will work as an experienced ETL developer implementing strategic data initiatives, using your experience and expertise to deliver complex data transformations that drive the development of data science, business intelligence and insights. You will ensure data sets are extracted from a wide range of business systems, transformed and loaded into a data lake, working with project and data teams to deliver the data requirements that improve insights through data analysis and modelling.

What will you be required to deliver in your new role?

  • Take ownership of Data Lake CI/CD activities and automation of route to live.
  • Installation, configuration and maintenance of Cloud server environments for the purposes of development/testing, including supporting the creation of test data sets.
  • Conduct regular monitoring/audits and investigations of the AWS estate to identify cost control options and actions.
  • Create Data Lake AWS access roles in line with company policy.
  • Liaise with the Cloud Management team on AWS best practices, and plan and implement them on the Data Lake.
  • Serve as AWS technical lead on Data Lake projects and support the upskilling of other team members on AWS.
  • Translation of functional specifications and story elaboration for the development of code/packages, based on in-house standards, supporting data programme initiatives and achieving agreed scope, data conformance and delivery plans.
  • Provision of estimates and timings to Delivery Manager for the creation and support of structured code and test criteria within work packages.
  • Matching business requirements to technical specifications, developing appropriate solutions across multiple systems and personally delivering them.
  • Documentation of source-to-target mappings for both data integration and web services.
  • Documentation and coding of data quality and/or transformation rules that can be easily understood by data science team members.
  • Critical evaluation of information gathered from multiple sources; reconciling conflicts and classifying information into logical categories.
  • Contributions to the development of coding standards, development techniques and data compliance standards.
  • Willingness to learn new technologies and move between different technology stacks.
  • Documentation and contribution using online collaboration tools JIRA/Confluence/Bitbucket.
  • Collaboration with third-party suppliers and products, as well as in-country IT teams, to identify and deliver appropriate technologies and solutions.

What skills are we looking for?

Qualifications:

  • AWS Certified Architect/Developer or equivalent experience, 3+ years 
  • ETL Data Integration certified, preferably Pentaho Data Integrator 2+ years 
  • DevOps experience 2+ years (Desirable)
  • Degree qualified (2:2 or above), or equivalent (Desirable)

 Skills:

  • Agile/iterative development frameworks.
  • Fluent English speaking/writing.
  • Excellent verbal & written communication.
  • Analytically minded, with the ability to develop insights from data sets.

 Knowledge:

  • Deep understanding of data and analytics, gained within data warehouse and/or data lake environments.
  • Demonstrable knowledge of Amazon Web Services and architecture.
  • Demonstrable knowledge of ETL practices and patterns, preferably Pentaho Data Integrator.
  • Deep knowledge of T-SQL and one or more languages of choice (Python, PowerShell, Java, Scala, Ruby, JavaScript, C#).
  • Structured, semi-structured and unstructured data types, and their functional strengths/weaknesses.
  • Working knowledge of Change Management processes and Project Management methodologies.
  • Online collaboration tools JIRA/Confluence.
  • Financial services IT solutions.
  • Data modelling and mapping techniques.
  • Data quality and cleansing processes.
  • IT industry and ways of working within multi-disciplinary teams.

 Experience:

  • Working within agile/lean delivery teams and frameworks.
  • Code development using T-SQL, version control and change management process controls.
  • Demonstrable experience with Amazon Web Services data management technologies (Aurora, Glue/Athena, Kinesis, SageMaker, RDS, EC2, ALB).
  • Data source identification, data profiling, interpretation of patterns and functional specifications.
  • Assessment and improvement of data quality and latency factors.
  • Demonstrable experience within data warehouse and/or data lake solution environments.
  • Data quality, MDM, ETL, ELT and CDC processes.
  • Demonstrable experience with Data Integration patterns and/or Cloud Data Warehouse platforms, e.g. Snowflake.
  • Operating in an international environment.
  • Experience in financial services would be highly beneficial.

 Other: You have a desire to work internationally and are willing and able to travel overseas.