Requirement for the Senior Hadoop Architect in Pleasanton, CA

Hi,

 

Hope you are doing well.


Below is the requirement for the position of Senior Hadoop Architect. Please share qualified candidates at harish.reddy@idctechnologies.com.


Position: Senior Hadoop Architect

Location: Pleasanton, CA

Duration: 12 months


Technical/Functional Skills  

• Strong hands-on knowledge of:

o Core Java, J2EE, JDBC, Spring, Struts, Hibernate, Hadoop

o Distributed file systems: HDFS

o Hadoop MapReduce, parallel processing, stream processing, Flume, Splunk

o Architectural patterns

o Analytical tools: SAS, SPSS, R, Mahout

• Hands-on experience with BI tools and reporting software (e.g. MicroStrategy, Cognos, Pentaho)

• Hands-on experience with the Hadoop stack (e.g. MapReduce, Sqoop, Pig, Hive, HBase, Flume)

• Hands-on experience with related/complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef)

• Hands-on experience with ETL (Extract-Transform-Load) tools (e.g. Informatica, Talend, Pentaho)

• Hands-on experience with at least one analytical tool, language, or library (e.g. SAS, SPSS, R, Mahout)

• Hands-on experience with "productionalizing" Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning)

• Ability to critically evaluate multiple architecture options

• Strong technical documentation skills

• Ability to translate business requirements into analytical requirements

• Strong business development focus with an opportunistic attitude

• Responsible for talent management and recruiting to grow the team while infusing the right "analytics-driven" culture

• Knowledge in any one of the following:

o Previous experience with high-scale or distributed RDBMS (Teradata, Netezza, Greenplum, Aster Data, Vertica)

o Knowledge of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems

o Knowledge of NoSQL platforms (e.g. key-value stores, graph databases, RDF triple stores)

• Experience in applying Big Data technology to any of the following use cases will be a definite plus:

o Unstructured Search

o Telematics

o Fraud Detection

o Call Centre Analytics

o Social Media Analytics

• Good understanding of the organization's goals and objectives

• Excellent customer focus attitude

• Good written and oral communication skills

• Ability to present ideas in a user-friendly business language

• Proven analytical and problem-solving abilities

• Experience working in a team-oriented, collaborative environment

• 8+ years of experience


Roles & Responsibilities

Grow the business: identify opportunities, communicate them, and reach out within TCS to achieve growth

Have the technical and business acumen to work with different levels of the customer team

Recruitment and Talent Management

Serve as the key liaison between customer and TCS teams on harnessing Big Data

Understand the entire lifecycle of Big Data Analytics projects

Exhibit leadership qualities in Big Data with both the customer and TCS

Conduct regular trainings and translate results into business language, using visualization

Market and sell the value of actionable insights to customers across the industry


Thanks,

Harish Reddy

Technical Recruiter

IDC Technologies, Inc.

Direct: (408) 520 2481    

Email: harish.reddy@idctechnologies.com
