Experience: 3-6 yrs
Job Role: Software Engineer - Big Data, Java/ETL
1. At least 3-4 years of experience in the Hadoop ecosystem
2. Develops and tests software applications, including ongoing refactoring of code, and drives continuous improvement in code structure and quality
3. Primary focus on application builds, conducting code reviews and testing in ongoing sprints, or performing proofs of concept and building automation tools
4. Applies visualization and other techniques to fast-track concepts
5. Functions as a core member of an Agile team, driving user story analysis and elaboration, design and development of software applications, testing, and build automation tools
6. Works on a specific platform/product or as part of a dynamic resource pool assigned to projects based on demand and business priority
7. Identifies opportunities to adopt innovative technologies
1. Bachelor's degree in Engineering or Computer Science (or equivalent), OR a Master's in Computer Applications or equivalent
2. 3-6 years of total experience required
1. Expert in Hadoop architecture, with sound knowledge of HDFS, MapReduce, HBase, Pig, Hive, YARN, ZooKeeper, Spark, Solr, Sqoop, and Flume
2. Must be proficient with SQL
3. Familiarity with Storm, Kafka, Solace, MQ
4. Experience in scripting languages (shell, Python, etc.) is good to have
5. Experience in graph databases and machine learning is good to have
6. Should have experience in the analysis, design, development, testing, and implementation of system applications
7. Demonstrated ability to develop and document technical and functional specifications and analyze software and system processing flows.
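As a purely illustrative sketch of the MapReduce programming model listed in the skills above, here is a minimal word-count example in plain Python (no Hadoop cluster involved; the function and variable names are hypothetical, and a real job would run the map and reduce phases distributed across HDFS blocks):

```python
def map_phase(line):
    # Map step: emit a (word, 1) pair for every word in one input line
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    # Shuffle + reduce step: group pairs by key and sum the counts per word
    counts = {}
    for word, n in pairs:
        counts[word] = counts.get(word, 0) + n
    return counts

# Sample input standing in for lines of a file stored on HDFS
lines = ["big data big pipelines", "data flows"]
pairs = [p for line in lines for p in map_phase(line)]
print(reduce_phase(pairs))  # {'big': 2, 'data': 2, 'pipelines': 1, 'flows': 1}
```

In Hadoop proper, the framework handles the shuffle between the map and reduce phases; this sketch only shows the shape of the two user-written functions.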