Detailed Job Description
Hands-on big data architecture and end-to-end project execution
14+ years of software development experience building large-scale distributed data processing systems/applications or large-scale internet systems
At least 3 years of experience architecting Big Data solutions at enterprise scale, with at least one end-to-end implementation
Strong understanding of and experience with the Hadoop ecosystem, including HDFS, MapReduce, YARN, Spark, Scala, Hive, HBase, Phoenix, ZooKeeper, Pig, Hadoop Streaming, and Sqoop
Knowledge of Hadoop security, data management, and governance
Ability to articulate the pros and cons of to-be design/architecture decisions across a wide spectrum of factors
Work closely with the Operations team to size, scale, and tune existing and new architectures
Experience working on core Big Data development projects; should be able to perform hands-on development, particularly in Spark, HBase/Cassandra, Hive, and shell scripting
Responsible for designing, developing, testing, tuning, and building large-scale data processing systems
Troubleshoot and develop on Hadoop technologies including HDFS, Hive, HBase, Phoenix, Spark, Scala, MapReduce, and Hadoop ETL development via tools such as Talend
Must have strong knowledge of Hadoop security components such as Kerberos, SSL, and encryption using TDE
Ability to engage in senior-level technology discussions.
The ideal candidate is proactive, sees the big picture, and can prioritize the right work items to optimize overall team output
Should have worked in agile environments; exposure to DevOps is a plus
Excellent oral and written communication skills
Hadoop Certified Developer/Architect certification will be an added advantage
Should be able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
Should be able to clearly articulate the pros and cons of various technologies and platforms
Should be able to document use cases, solutions, and recommendations
Must have good knowledge of third-party tools supporting big data
Must possess good knowledge of NoSQL database systems

Other Skills
Excellent oral and written communication skills; HDFS, Hive, HBase, Phoenix, Spark, Scala, MapReduce, and Hadoop ETL development via tools such as Talend

Interview Information
Job Location : Bangalore
Interview Location : Bangalore