Senior Hadoop Engineer

  • Bengaluru, India
  • Full-time

Company Description

Common Purpose, Uncommon Opportunity. Everyone at Visa works with one goal in mind – making sure that Visa is the best way to pay and be paid, for everyone everywhere. This is our global vision and the common purpose that unites the entire Visa team. As a global payments technology company, tech is at the heart of what we do: Our VisaNet network processes over 13,000 transactions per second for people and businesses around the world, enabling them to use digital currency instead of cash and checks. We are also global advocates for financial inclusion, working with partners around the world to help those who lack access to financial services join the global economy. Visa’s sponsorships, including the Olympics and FIFA™ World Cup, celebrate teamwork, diversity, and excellence throughout the world. If you have a passion to make a difference in the lives of people around the world, Visa offers an uncommon opportunity to build a strong, thriving career. Visa is fueled by our team of talented employees who continuously raise the bar on delivering the convenience and security of digital currency to people all over the world. Join our team and find out how Visa is everywhere you want to be.

Job Description

·       Develop Hadoop architecture, HDFS commands, and utilities

·       Design & optimize analytical jobs and queries against data in the HDFS/Hive environments

·       Develop Bash or Python scripts using Linux utilities and commands

·       Develop data models that help in platform analytics and hardening

·       Develop self-healing systems at scale

·       Help and guide L1/L2 support engineers in resolving day-to-day operational issues

·       Perform tuning and improve operational efficiency on a continuous basis

·       Build a framework to centrally gather deep system-level metrics across platforms as part of the control center

·       Develop central dashboards for all system, data, utilization, and availability metrics

·       Build data transformation and processing solutions

·       Build high volume data integrations


Required skills:

·       Hadoop (preferably Cloudera or Hortonworks distribution), HDFS, Hive, Impala, Kafka, Spark, Oozie, HBase

·       Strong knowledge of SQL and HQL

·       Strong Linux knowledge and scripting skills, including Python

·       Java, J2EE, web applications, Tomcat (or any equivalent app server), RESTful services, JSON, design patterns

·       Kerberos, TLS, Sentry, data encryption


·       Minimum 3 years of work experience in developing, maintaining, optimizing, and troubleshooting Hadoop clusters and supporting business users

·       A four-year technical degree in computer science or an IT-related field is required

·       Experience in Linux/Unix OS services, administration, and shell and awk scripting

·       Experience in building scalable Hadoop applications

·       Experience in Core Java and Hadoop (MapReduce, Hive, Pig, Spark, Kafka, HBase, HDFS, HCatalog, ZooKeeper, and Oozie)

·       Experience in Hadoop security (Kerberos, Knox, TLS)

·       Hands-on experience in SQL and NoSQL databases (HBase/Cassandra/MongoDB)

·       Experience in building large-scale, real-world backend and middle-tier systems in Java

·       Experience in tool integration, automation, and configuration management using Git and Jira

·       Excellent oral and written communication, presentation, analytical, and problem-solving skills

·       Self-driven; able to work independently and as part of a team, with a proven track record of developing and launching products at scale

·       Develop and enhance platform best practices, and educate Visa developers on them

Additional Information

All your information will be kept confidential according to EEO guidelines.

Privacy Policy