Location: Paris
Contact: Gary Tiller
Contact email: gary@intelletec.com
Job ref: GB_HE_gs
Start date: May
Consultant: Gary Tiller
A leading in-memory computing platform provider is looking for a Hadoop SME to join its cross-functional Big Data infrastructure team, helping to develop and operate the data exploitation solution.
The company provides the leading in-memory computing platform for fast data analytics and extreme transaction processing. Its clients include Tier-1 and Fortune-listed global organisations across financial services, retail, transportation, telecom, healthcare, and more.
Responsibilities
- Create data solutions covering IT architecture, infrastructure, and security
- Maintain the production Data Lake solution, responding to and implementing requests
- Document requirements and updates to the solution
- Manage stakeholders and consult with clients
Requirements
- 3+ years of commercial experience with Hadoop, working in agile environments
- Hadoop, HDFS, Kafka, NiFi, Flume, Spark, HBase, Hive, Oozie, ELK
- Java Enterprise Edition (J2EE) and PostgreSQL databases
- English at business level (French not essential)