Sr. Analytics Engineer - Hadoop
- Austin, TX
Common Purpose, Uncommon Opportunity. Everyone at Visa works with one goal in mind – making sure that Visa is the best way to pay and be paid, for everyone everywhere. This is our global vision and the common purpose that unites the entire Visa team. As a global payments technology company, tech is at the heart of what we do: Our VisaNet network processes over 13,000 transactions per second for people and businesses around the world, enabling them to use digital currency instead of cash and checks. We are also global advocates for financial inclusion, working with partners around the world to help those who lack access to financial services join the global economy. Visa’s sponsorships, including the Olympics and FIFA™ World Cup, celebrate teamwork, diversity, and excellence throughout the world. If you have a passion to make a difference in the lives of people around the world, Visa offers an uncommon opportunity to build a strong, thriving career. Visa is fueled by our team of talented employees who continuously raise the bar on delivering the convenience and security of digital currency to people all over the world. Join our team and find out how Visa is everywhere you want to be.
Visa operates the world's largest retail electronic payments network and is one of the most recognized global financial services brands. Visa facilitates global commerce through the transfer of value and information among financial institutions, merchants, consumers, businesses, and government entities.
We offer a range of branded payment product platforms, which our financial institution clients use to develop and offer credit, debit, prepaid and cash access programs to cardholders. Visa's card platforms provide consumers, businesses, merchants, and government entities with a secure, convenient, and reliable way to pay and be paid in more than 200 countries and territories.
The Corporate IT (CIT) organization is responsible for all facets of architecture, software development and production support of key internal systems supporting areas such as Finance, Revenue, Treasury, Human Resources, Legal, Risk, Compliance, CRM, Contact Center and End User. Within this organization, the GRC & HRIT team supports full life-cycle delivery and ongoing support of strategic technology initiatives for the Governance, Risk, Compliance and HR organizations, and the Analytics team provides analytical solutions for these business units.
The Sr. Software Engineer (Hadoop/BI Development) within the GRC Analytics Delivery team will be responsible for designing, developing and implementing data integration and analytics solutions on Hadoop, using BI tools, data mining, and/or machine learning to provide insights from the data. This position requires close collaboration with business partners to understand their business goals, gather requirements, and design, develop and implement data analytics solutions following SDLC and agile methodologies.
The ideal candidate for this position will have a minimum of five years' experience in the development and delivery of analytics projects, and two years' development experience with Hadoop.
- Participate in technology project delivery activities such as gathering business requirements, conceptual approach, design, development, test case preparation, unit/integration test execution, and support process documentation
- Develop source-to-target mapping documents and ETL code to load data from source to target systems
- Develop workflows and ETL using Hive scripts, Sqoop, Pig, Oozie, Spark and other utilities on Hadoop
- Build analytical applications on the Hadoop ecosystem using packaged or open-source technologies
- Partner with IT groups such as CIT, Engineering, Product, Security, and Infrastructure on project delivery activities
- Provide insights from data and present to IT and non-technical users to improve operations and productivity
- Unit test data loads and write scripts for data validation
- Support QA, UAT and performance testing phases of development cycle and implement DevOps principles from development to deployment to production
- Bachelor’s Degree in Computer Science or related discipline
- Proven experience in Hadoop development and data integration on Hadoop is required, preferably with Hive, Spark, etc.
- 5 years of experience in BI/analytics tools is required, including 2 years of experience in Tableau or another BI tool
- 2 years of experience in delivering solutions using Hive scripts, Sqoop, Pig, Oozie, Spark and other utilities on Hadoop
- Proven experience in developing and deploying BI solutions including ETL and dashboard development is required
- Experience in RDBMS (Oracle and SQL Server) is required, and knowledge of emerging technologies (NoSQL/graph database/time series) is nice to have
- Experience in writing complex SQL and HiveQL queries is required
- Experience in Python, shell/Perl scripting, job scheduling and version control technologies is required
- Experience in real time data ingestion using Kafka/Spark Streaming is nice to have
- Experience in advanced analytics, machine learning and data mining is nice to have
- Experience in at least one ETL technology is required; Informatica, Talend or Ab Initio experience is preferred
- Strong follow-through, problem identification, analysis and problem-solving skills
- Demonstrated ability to learn new information technologies and apply them quickly
- Exceptional communication and customer-facing skills; able to interact effectively with diverse groups of global stakeholders, both technical and business users
- Experience in payment processing is beneficial
Visa will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of Article 49 of the San Francisco Police Code.
All your information will be kept confidential according to EEO guidelines.