
Hadoop Developer – Big Data & Java

Amyantek Inc.

This is a Full-time position in Toronto, ON posted August 10, 2017.

Seeking Hadoop, Java, and Big Data developers.

Open to both full-time (FTE) and contract roles.

Location: Downtown Toronto

• Design and build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real time.

• Collaborate with other teams to design and develop data tools that support both operational needs, such as data quality, and product use cases.

• Source large volumes of data from diverse data platforms into the Hadoop platform.
• Perform offline analysis of large data sets using components from the Hadoop ecosystem.

• Knowledge of the banking domain is an added advantage.

Skills:

• 7+ years of hands-on programming experience, with 5+ years on the Hadoop platform

• Knowledge of the various components of the Hadoop ecosystem and experience applying them to practical problems

• Proficiency in Java and at least one scripting language, such as Python

• Experience building ETL frameworks in Hadoop using Pig, Hive, MapReduce, or DataTorrent

• Experience creating custom UDFs, custom input/output formats, and SerDes (a minimal UDF sketch follows the technology list below)

• Ability to acquire, compute, store, and provision various types of datasets on the Hadoop platform

• Experience with Hadoop or understanding of its components (HDFS, Pig, Hive, HBase, Spark)

• Experience with data modeling and data management tools.

• Experience working in an agile environment.

• Strong object-oriented design and analysis skills

• Excellent technical and organizational skills

• Excellent written and verbal communication skills

Top skill sets / technologies:

• Java / Python (mandatory)

• Unix / ETL / data warehouse / SQL knowledge

• Sqoop / Flume / Kafka / Pig / Hive / (Talend, Pentaho, Informatica, or similar ETL tool) / HBase / NoSQL / MapReduce / Spark
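
As a concrete illustration of the custom UDF item above, here is a minimal sketch of a Hive UDF written in Java against Hive's classic org.apache.hadoop.hive.ql.exec.UDF API. The class name MaskAccountNumber and the masking logic are hypothetical examples chosen for this sketch, not requirements from the posting.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical example: mask all but the last four characters of an account number.
    public final class MaskAccountNumber extends UDF {
        // Hive calls evaluate() once per row; return null for null input.
        public Text evaluate(final Text accountNumber) {
            if (accountNumber == null) {
                return null;
            }
            String value = accountNumber.toString();
            int visible = Math.min(4, value.length());
            return new Text("****" + value.substring(value.length() - visible));
        }
    }

Packaged as a JAR, a UDF like this would typically be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being called in a query.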

Skills Required

  • Education level: Not specified
  • Work experience (years): Not specified

Package

Salary: N/D