Hadoop Administrator - Big Data Platform Engineer
- Full-time
- Job Family Group: Technology and Operations
Company Description
At Visa, your individuality fits right in. Working here gives you an opportunity to impact the world, invest in your career growth, and be part of an inclusive and diverse workplace. We are a global team of disruptors, trailblazers, innovators and risk-takers who are helping drive economic growth in even the most remote parts of the world, creatively moving the industry forward, and doing meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers.
You're an Individual. We're the team for you. Together, let's transform the way the world pays.
Job Description
Essential Functions
- Perform Big Data administration and engineering activities on multiple Hadoop, Kafka, HBase and Spark clusters
- Continuously tune performance and improve operational efficiency
- Monitor platform health, generate performance reports, and drive continuous improvements
- Work closely with development, engineering and operations teams on key deliverables, ensuring production scalability and stability
- Develop and enhance platform best practices
- Ensure the Hadoop platform can effectively meet performance and SLA requirements
- Take responsibility for the Big Data production environment, which includes Hadoop (HDFS and YARN), Hive, Spark, Livy, Solr, Oozie, Kafka, Airflow, NiFi, HBase, etc.
- Perform optimization, debugging and capacity planning of a Big Data cluster
- Perform security remediation, automation and self-healing as required
Qualifications
Basic Qualifications
8+ years of work experience as a Big Data Engineer with a Bachelor's degree or an advanced degree (e.g. Master's, MBA, JD, MD, or PhD)
Preferred Qualifications
- Minimum 3 years of work experience maintaining, optimizing and resolving issues in large-scale Big Data clusters, supporting business users and batch processes
- Hands-on experience with Hadoop (HDFS and YARN), Hive, Spark and Kafka is a must
- Hands-on experience with NoSQL databases such as HBase is a plus
- Prior experience with Linux/Unix OS services, administration, and shell/awk scripting is a plus
- Excellent oral and written communication, presentation, analytical and problem-solving skills
- Self-driven, with a proven ability to work independently and as part of a team
- Minimum of four year technical degree required
- Experience with the Hortonworks distribution or open-source Hadoop preferred
Additional Information
All your information will be kept confidential according to EEO guidelines.