Hadoop Architect

Oalva

This is a Contract position in Toronto, ON posted May 21, 2018.

Job Description

As a Hadoop Architect, you will work with the customer’s technical teams to devise and recommend solutions based on an understanding of their requirements. Responsibilities include:

· Presenting complex technical architectures to the customer’s technical teams, working alongside the software vendor teams.

· Designing and implementing the agreed-upon Hadoop architecture.

· Driving customer projects to successful completion.

· Ensuring the Hadoop architecture performs above service levels.

· Working with vendor support to resolve issues in a timely manner.

· Writing and producing technical documentation for the customer’s technical teams.

· Ensuring the customer has everything they need to be successful before leaving the engagement.

· Participating in pre- and post-sales processes, designing and implementing POCs that demonstrate the business case, and checking with the customer for additional business needs the Hadoop ecosystem can address.

We are looking for someone with expertise in building Hadoop platforms and extensive knowledge of Scala and Spark.

Requirements:

· Experience working with customers to design and architect large-scale enterprise Hadoop solutions.

· Ability to understand and translate customer requirements into technical requirements.

· Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments.

· Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.

· Experience implementing software in the enterprise Linux environment.

· Extensive experience using Hadoop-based data technologies such as HDFS, MapReduce, Hive, Pig, YARN, and Spark SQL, and working in Scala/Java/Python.

· Experience in capacity planning, cluster design, and deployment.

· Experience with RDBMS/ETL platforms.

· Strong scripting skills in languages such as Bash, Perl, or Python.

· Expert ability in using and troubleshooting Java in the Hadoop environment.

· Strong understanding of enterprise security solutions such as LDAP and/or Kerberos.

· Experience architecting large-volume systems for optimal performance, with a firm understanding of CPU/memory/disk/network design points.

· Hortonworks Certified Professional – HDPCD and/or HDPCA.

Company Description

We are a big data (Hadoop) solutions and consulting services provider based in Overland Park, KS, doing business in the central US, south-central US, and Canada. Our customers range from the Fortune 50 to the Fortune 1000.

We have delivered mission-critical network analytics, business intelligence dashboards, and NLP-based sentiment analytics. We also offer the Hadoop Data Migrator, which enables customers to migrate from their expensive Teradata and Netezza data warehouse appliances to a modern, cost-effective, open Hadoop data architecture. Our consulting experts perform these implementations and fulfill other customer needs in the big data and analytics space.