Senior Data Engineer - Contractor
OpenCredo, United Kingdom

Experience: 1 Year
Traveling: No
Telecommute: No
Qualification: As mentioned in job details
Total Vacancies: 1
Posted on: Jul 28, 2023
Last Date: Aug 28, 2023
Location(s): London, United Kingdom

Job Description

**Please note this is an inside IR35 contract for 3-6 months initially**

OpenCredo (OC) is a UK-based software development consultancy helping clients achieve more by leveraging modern technology and delivery approaches. We are a community of passionate technologists who thrive on delivering pragmatic solutions for our clients' most complex challenges. Curious and tenacious, but always sensitive to our clients' context, we are not afraid to speak our minds to help steer our clients towards understanding and achieving their key goals.

We are looking for a hands-on senior data engineer / architect who has worked with modern data technologies and operated with data at scale. You relish building modular, configurable pipelines and solutions. You are well versed in the challenges of working with streaming data, including handling windowing and aggregation. You know the use cases for streaming, ETL, ELT and batch processing, and have experience bringing these approaches together in a coherent solution. Whilst we welcome broad experience across multiple clouds, you should have recent experience with, and be comfortable working in, the AWS data services ecosystem (e.g. Kinesis, EMR, S3, Athena). This should include hands-on integration and development using core open source technologies such as Apache Kafka, Flink, NiFi, Spark and Airflow.

What we’re looking for:

  • A Data Expert with Great Communication Skills: You will be comfortable articulating and explaining key data concepts, as well as diving into the details and the nitty-gritty where required. You are confident in your ability to deliver, upskill others and transfer knowledge in your role as a technical data expert. You see the value in defining and documenting standard operating procedures (SOPs) and/or playbooks, and make them part of ensuring everyone is on the same page.
  • Data Pipelines & Lifecycle: You are comfortable designing and building scalable data pipelines; you know your ETL from your ELT, and you have hands-on experience of all phases of the data lifecycle, from ingestion through cleansing and transformation to workflow orchestration.
  • A Problem Solver with a Can-Do Attitude: You can be relied on as the person who gets stuck in and makes things happen.
  • Big Data Architectures: You have developed and worked with big data architectures. You know what is required to support and run large-scale real-time and batch data processing workloads, including how to distribute load appropriately.
  • A Skilled Technologist: You have a background in programming and creating data-centric solutions using code. You are an accomplished programmer in one or more programming languages, ideally Java, Python or both.


Requirements

  • Real-Time Streaming: Design & development of real-time data streaming solutions (including handling time series data), leveraging modern technologies and industry practices, using technologies such as Apache Kafka, Flink, Spark and Beam.
  • Open Source & AWS Data Solutions: Design & development of data pipelines and solutions within one or more cloud providers, with a focus on AWS, utilising a mixture of open source offerings (for example via EMR) as well as vendor-specific data offerings.
  • Data Modelling & Engineering: A solid understanding of data modelling, including handling RDBMS, structured and unstructured data, along with different storage formats such as CSV, Avro and Parquet.
  • Data Governance & MDM: Experience with the challenges and solutions involved in making legacy operational data available for onward analytical consumption. Practical approaches to building data quality and validation into different stages of data workflows, enabling good Master Data Management (MDM), including the use of progressive data storage layers (i.e. bronze, silver, gold).
  • Handling Data at Scale: Understanding of, or experience with, one or more of: data lakehouse, data warehouse, data lake.
  • Orchestration & Dataflow Management Solutions: Building and orchestrating pipelines and workflows using Apache Airflow, as well as automating the flow of data between systems using frameworks like Apache NiFi.

Benefits

Need more reasons? Here are a few more...

  • Work with some of the most exciting new technologies
  • Spark off co-workers who'll challenge your thinking and help you to achieve your potential
  • Deal openly and honestly with customers
  • Work alongside senior leaders who understand and value passionate technologists
