Senior Data Engineer

Technology

Brussels / Contract / 6 - 12 months / €500 - €700 per day

Senior Data Engineer – Banking – Brussels (onsite in 2022)

Day rate: €600 – €700

Duration: 6 – 12 months

Start: ASAP

My client is currently looking for a Senior Data Engineer with design skills, whose core objectives will be to:
– Collect, clean, prepare and load the necessary data onto Hadoop, our Data Analytics Platform, so that it can be used for reporting, creating insights and responding to business challenges.
– Act as a liaison between the team and other stakeholders, and contribute to supporting the Hadoop cluster and the compatibility of the different software components that run on the platform (Scala, Spark, Python, …).

Job description:

  • Identify the most appropriate data sources to use for a given purpose and understand their structure and content, in collaboration with subject matter experts.
  • Extract structured and unstructured data from the source systems (relational databases, data warehouses, document repositories, file systems, …), prepare the data (cleanse, restructure, aggregate, …) and load it onto Hadoop (a minimal pipeline along these lines is sketched after this list).
  • Actively support the reporting teams in the data exploration and data preparation phases.
  • Implement data quality controls and, where data quality issues are detected, liaise with the data supplier for joint root cause analysis.
  • Autonomously design data pipelines, develop them and prepare the launch activities.
  • Properly document your code and share your knowledge with the rest of the team to ensure a smooth transition into maintenance and support of production applications.
  • Liaise with IT infrastructure teams to address infrastructure issues and to ensure that the components and software used on the platform are consistent.
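
For illustration only, a minimal Scala/Spark sketch of the kind of pipeline described above: extract a table over JDBC, cleanse and restructure it, and load it onto Hadoop as Parquet. The connection URL, table and column names, credentials and HDFS path are hypothetical placeholders, not details of this role.

import org.apache.spark.sql.{SparkSession, functions => F}

object CustomerIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-ingest")
      .getOrCreate()

    // Extract: read a source table over JDBC (all connection details are placeholders)
    val raw = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://source-db:5432/core") // hypothetical source system
      .option("dbtable", "public.customers")
      .option("user", sys.env("DB_USER"))
      .option("password", sys.env("DB_PASSWORD"))
      .load()

    // Prepare: cleanse and restructure before loading onto the cluster
    val cleaned = raw
      .filter(F.col("customer_id").isNotNull)
      .withColumn("country", F.upper(F.trim(F.col("country"))))
      .dropDuplicates("customer_id")

    // Load: write as Parquet onto HDFS, partitioned for the reporting teams
    cleaned.write
      .mode("overwrite")
      .partitionBy("country")
      .parquet("hdfs:///data/landing/customers")

    spark.stop()
  }
}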

Required skills:

  • Experience with analysis and creation of data pipelines, data architecture, ETL/ELT development and with processing structured and unstructured data
  • Proven experience using data stored in RDBMSs, and experience with or a good understanding of NoSQL databases
  • Ability to write performant Scala code and SQL statements
  • Ability to design with focus on solutions that are fit for purpose whilst keeping options open for future needs
  • Ability to analyse data, identify issues (e.g. gaps, inconsistencies) and troubleshoot them (see the sketch after this list)
  • A true agile mindset: capable and willing to take on tasks outside your core competencies to help the team
  • Experience in working with customers to identify and clarify requirements
  • Strong verbal and written communication skills, good customer relationship skills
  • Strong interest in the financial industry and related data.
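
As a purely illustrative example of the analysis and SQL abilities listed above, the following Scala/Spark SQL sketch runs a simple data-quality control over a Parquet dataset; the HDFS path, view name and customer_id column are hypothetical placeholders.

import org.apache.spark.sql.SparkSession

object CustomerQualityCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("customer-quality-check")
      .getOrCreate()

    // Read curated data back from HDFS and expose it to Spark SQL (path is a placeholder)
    spark.read.parquet("hdfs:///data/landing/customers")
      .createOrReplaceTempView("customers")

    // One pass over the data: total rows, rows missing the key, duplicate non-null keys
    val report = spark.sql(
      """SELECT COUNT(*)                                             AS total_rows,
        |       SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_keys,
        |       COUNT(customer_id) - COUNT(DISTINCT customer_id)     AS duplicate_keys
        |FROM customers""".stripMargin)

    report.show(truncate = false)
    spark.stop()
  }
}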

The following will be considered assets:

  • Knowledge of Python and Spark
  • Understanding of the Hadoop ecosystem including Hadoop file formats like Parquet and ORC
  • Experience with open source technologies used in Data Analytics like Spark, Pig, Hive, HBase, Kafka, …
  • Ability to write MapReduce & Spark jobs
  • Knowledge of Cloudera
  • Knowledge of IBM mainframe
  • Knowledge of Agile development methods such as Scrum

Job Information

Job Reference: JO-2108-245863_1629198998
Salary: €500 - €700 per day
Salary per: day
Job Duration: 6 - 12 months
Job Start Date: ASAP
Job Industries: Technology
Job Locations: Brussels
Job Types: Contract
