
Hadoop SME

  • Location

    Paris

  • Sector:

    Data Science

  • Job type:

    Permanent

  • Contact:

    Gary Tiller

  • Contact email:

    gary@intelletec.com

  • Job ref:

    GB_HE_gs

  • Start date:

    May

  • Consultant:

    Gary Tiller

A leading in-memory computing platform provider is looking for a Hadoop SME to join its cross-functional Big Data infrastructure team and help develop and operate the data exploitation solution.

The company provides the leading in-memory computing platform for fast data analytics and extreme transaction processing. Its clients include Tier-1 and Fortune-listed global organisations across financial services, retail, transportation, telecom, healthcare, and more.

Responsibilities

  • Create data solutions covering IT architecture, infrastructure, and security

  • Maintain the production environment and respond to and implement requests for the Data Lake solution

  • Document requirements and updates to the solution

  • Manage stakeholders and consult with clients

Requirements

  • 3+ years' commercial experience with Hadoop, working in agile environments
  • Hadoop, HDFS, Kafka, NiFi, Flume, Spark, HBase, Hive, Oozie, ELK
  • Java Enterprise Edition (J2EE) and PostgreSQL databases
  • English to business level (French not essential)