Must have:
• 9+ years of IT experience
• NO OPT/CPT
• Demonstrated work experience with distributed, scalable Big Data programming models and technologies such as Hadoop, Hive, and Pig
• Strong experience and exposure across the entire data landscape, with the ability to extend traditional data architecture techniques to include big data components
• Data strategy and architecture development, data governance, master data management, metadata management, data integration/ingestion, data quality management, data modeling, data warehousing, business intelligence, and advanced analytics
• 8+ years of in-depth knowledge and experience across the Hadoop ecosystem (HDP, HDF, MapReduce, Hive, Pig, Spark/Scala, Kafka, HBase; Elasticsearch and Logstash a plus)
• 4+ years of experience working in Linux/Unix
• Demonstrated hands-on experience with at least one of the major Hadoop distributions
• Good understanding of and experience with performance tuning for complex software projects, mainly around large-scale, low-latency systems
• Experience leading design and architecture efforts
• Hadoop/Java certifications are a plus
• Excellent communication skills
• Ability to work in a fast-paced, team-oriented environment
• Ability to identify and articulate domain-specific use cases that can take advantage of big data tools and technologies