- Baner Road, Pune, Maharashtra, India
PubMatic is a publisher-focused sell-side platform for an open digital media future. Featuring leading omni-channel revenue automation technology for publishers and enterprise-grade programmatic tools for media buyers, PubMatic's publisher-first approach enables advertisers to access premium inventory at scale. Processing over one trillion ad impressions per month, PubMatic has created a global infrastructure to drive publisher monetization and control over their ad inventory. Since 2006, PubMatic's focus on data and technology innovation has fueled the rise of the programmatic industry as a whole. Headquartered in Redwood City, California, PubMatic operates 13 offices and six data centers worldwide.
PubMatic's data center team is looking for a Hadoop Administrator who will assist with the design, implementation, and ongoing support of our Big Data platforms.
As a Hadoop Administrator, you will install, configure, and maintain multiple Hadoop clusters. You will be responsible for the design and architecture of the Big Data platform, work with development teams to optimize Hadoop services, and deploy code to multiple environments.
Major Duties and Responsibilities:
- Manage large-scale Hadoop cluster environments, handling all environment builds, cluster setup, performance tuning, and ongoing monitoring.
- Evaluate and recommend systems software and hardware for the enterprise, including capacity modeling.
- Architect our Hadoop infrastructure to meet changing requirements for scaling, reliability, performance, and manageability.
- Work with core production support personnel in IT and Engineering to automate deployment and operation of the infrastructure; manage, deploy, and configure infrastructure with Ansible or other automation toolsets.
- Create metrics and measures of utilization and performance.
- Plan capacity for, and implement, new and upgraded hardware and software releases, as well as storage infrastructure.
- Monitor the Linux community and report important changes and enhancements to the team.
- Ability to work well with a global team of highly motivated and skilled personnel - interaction and dialogue are requisites in this dynamic environment.
- Research and recommend innovative, and where possible, automated approaches for system administration tasks. Identify approaches that leverage our resources, provide economies of scale, and simplify remote/global support issues.
Qualifications:
- 2 years of professional experience supporting medium- to large-scale production Linux environments.
- 2 years of professional experience working with Hadoop (HDFS and MapReduce) and the related technology stack.
- A deep understanding of Hadoop design principles, cluster connectivity, security, and the factors that affect distributed system performance.
- Experience with Kafka, HBase, HDFS, YARN, Spark, and Hortonworks is mandatory.
- MapR and MySQL experience a plus.
- Experience with automation tools (e.g., Ansible) a plus.
- Expert experience with at least one (ideally several) of the following languages: Python, Perl, Ruby, or Bash.
- Prior experience with remote monitoring and event handling using Nagios and the ELK stack.
- Solid ability to create automation with Ansible or shell scripting.
- Good collaboration and communication skills, and the ability to participate in an interdisciplinary team.
- Strong written communications and documentation experience.
- Knowledge of best practices related to security, performance, and disaster recovery.
- BE/BTech/BS/BCS/MCS/MCA in Computer Science or an equivalent field
- Excellent interpersonal, written, and verbal communication skills
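By way of illustration of the scripting and monitoring skills listed above, here is a minimal Python sketch of the kind of automation this role involves: parsing cluster capacity figures from the text output of `hdfs dfsadmin -report` so they can be fed to a monitoring system such as Nagios. The sample report values are illustrative, not real cluster data.

```python
import re

def parse_dfs_report(report: str) -> dict:
    """Extract raw byte counts from the summary lines of `hdfs dfsadmin -report`.

    The labels below follow the report's standard summary output; only the
    leading integer (bytes) on each matched line is captured.
    """
    stats = {}
    for label in ("Configured Capacity", "DFS Used", "DFS Remaining"):
        match = re.search(rf"^{label}: (\d+)", report, re.MULTILINE)
        if match:
            stats[label] = int(match.group(1))  # value in bytes
    return stats

# Example report snippet (values are made up for illustration).
sample = """Configured Capacity: 1000000 (976.56 KB)
Present Capacity: 900000 (878.91 KB)
DFS Remaining: 700000 (683.59 KB)
DFS Used: 200000 (195.31 KB)
DFS Used%: 22.22%
"""

stats = parse_dfs_report(sample)
used_pct = 100 * stats["DFS Used"] / stats["Configured Capacity"]
```

In practice the report text would come from running the command via `subprocess`, and `used_pct` could be compared against an alerting threshold.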
PubMatic is proud to be an equal opportunity employer; we don’t just value diversity, we promote and celebrate it. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
All your information will be kept confidential according to EEO guidelines.