Image of Pavan Kumar

Pavan Kumar

Sr. Hadoop Administrator @ Quest Diagnostics

Iselin, New Jersey

Ranked #318 out of 6,356 for Sr. Hadoop Administrator in New Jersey

Pavan Kumar's Email Addresses & Phone Numbers

Pavan Kumar's Work Experience

Quest Diagnostics

Sr. Hadoop Administrator

Greensboro/Winston-Salem, North Carolina Area

Wells Fargo

Hadoop Administrator

June 2014 to November 2014

San Leandro, CA

ExactTarget

Hadoop Admin

January 2013 to August 2013

Indianapolis, Indiana Area

Pavan Kumar's Education

Jawaharlal Nehru Technological University

Bachelor's degree, Electronics & Communications Engineering

2004 to 2008

Pavan Kumar's Professional Skills Radar Chart

Based on our findings, Pavan Kumar is ...

Visionary
Methodical
Cause-and-effect oriented

What's on Pavan Kumar's mind?

Based on our findings, Pavan Kumar is ...

56% Left Brained
44% Right Brained

Pavan Kumar's Estimated Salary Range

About Pavan Kumar's Current Company

Quest Diagnostics

• Experienced in installing, configuring, monitoring, maintaining and troubleshooting Hadoop clusters.
• Extensively worked with Cloudera Distribution Hadoop, CDH 5.x and CDH 4.x.
• Involved in cluster capacity planning, hardware planning, installation and performance tuning of the Hadoop cluster.
• Worked on installing the cluster, commissioning & decommissioning of DataNodes, NameNode recovery and slots configuration.
• Hands-on experience working with HDFS, MapReduce,...
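As a loose illustration of the cluster monitoring work described above, the following minimal Python sketch checks the number of live DataNodes by parsing the output of hdfs dfsadmin -report. It is only a sketch under assumptions: the alert threshold is a hypothetical placeholder, it assumes the hdfs CLI is on the PATH with access to the cluster, and the profile indicates the actual monitoring was done with Cloudera Manager and shell scripts.

    import re
    import subprocess
    import sys

    # Hypothetical threshold for this example; a real check would read it from config.
    EXPECTED_LIVE_DATANODES = 10

    def live_datanodes() -> int:
        """Return the live DataNode count reported by `hdfs dfsadmin -report`."""
        report = subprocess.run(
            ["hdfs", "dfsadmin", "-report"],
            capture_output=True, text=True, check=True,
        ).stdout
        # Hadoop 2.x report format: "Live datanodes (N):"
        match = re.search(r"Live datanodes\s*\((\d+)\)", report)
        if match is None:
            raise RuntimeError("could not parse the dfsadmin report")
        return int(match.group(1))

    if __name__ == "__main__":
        live = live_datanodes()
        if live < EXPECTED_LIVE_DATANODES:
            # A production check would page on-call or raise a Nagios/Cloudera alert here.
            print(f"WARNING: only {live} live DataNodes", file=sys.stderr)
            sys.exit(1)
        print(f"OK: {live} live DataNodes")

Run from cron on an edge node, a check like this exits non-zero whenever the live DataNode count dips below the expected cluster size.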

Frequently Asked Questions about Pavan Kumar

What company does Pavan Kumar work for?

Pavan Kumar works for Quest Diagnostics


What is Pavan Kumar's role at Quest Diagnostics?

Pavan Kumar is a Sr. Hadoop Administrator at Quest Diagnostics


What is Pavan Kumar's personal email address?

Pavan Kumar's personal email address is pa****[email protected]


What is Pavan Kumar's business email address?

Pavan Kumar's business email addresses are not available


What is Pavan Kumar's Phone Number?

Pavan Kumar's phone number is (201) ***-*173


About Pavan Kumar

📖 Summary

Sr. Hadoop Administrator @ Quest Diagnostics

• Experienced in installing, configuring, monitoring, maintaining and troubleshooting Hadoop clusters.
• Extensively worked with Cloudera Distribution Hadoop, CDH 5.x and CDH 4.x.
• Involved in cluster capacity planning, hardware planning, installation and performance tuning of the Hadoop cluster.
• Worked on installing the cluster, commissioning & decommissioning of DataNodes, NameNode recovery and slots configuration.
• Hands-on experience working with HDFS, MapReduce, Hive, Pig, Sqoop, Impala, Hadoop HA, YARN, Cloudera Manager and Hue.
• Configured various property files like core-site.xml, hdfs-site.xml, yarn-site.xml, mapred-site.xml and hadoop-env.sh based upon the job requirements.
• Provided timely and reliable support for all production and development environments: deploy, upgrade, operate and troubleshoot.
• Implemented a major version upgrade from CDH 4.7.1 to 5.2.3 and performed minor version upgrades from 5.2.5 to 5.3.3 and from 5.3.3 to 5.4.3.
• Utilized cluster coordination services through ZooKeeper.
• Designed, implemented and managed the backup and recovery environment.
• Experienced in the security requirements for Hadoop and integrating with a Kerberos authentication infrastructure: KDC server setup, creating the realm/domain, managing principals, generating a keytab file for each service and managing keytabs using keytab tools (a keytab-generation sketch follows this summary).
• Managed and scheduled jobs on the Hadoop cluster using Oozie.
• Configured NameNode high availability and NameNode federation.
• Used Sqoop to import and export data between RDBMS and HDFS.
• Performance-tuned the Hadoop cluster to improve efficiency.
• Worked on a POC project on the Hortonworks distribution, setting up a cluster on Hortonworks HDP 2.2.
• Worked with Ambari for installing, configuring and monitoring the cluster.
• Involved in configuring quorum-based HA for the NameNode, making the cluster more resilient.

Greensboro/Winston-Salem, North Carolina Area

Hadoop Administrator @ Wells Fargo

• Installed/configured/maintained Hadoop clusters for application development and Hadoop tools like Hive, Pig, HBase, ZooKeeper and Sqoop.
• Extensively involved in installation and configuration of Cloudera Distribution Hadoop CDH 3.x and CDH 4.x.
• Wrote shell scripts to monitor the health of Hadoop daemon services and respond accordingly to any warning or failure conditions.
• Installed and configured Hadoop, MapReduce and HDFS (Hadoop Distributed File System); developed multiple MapReduce jobs for data cleaning.
• Involved in clustering of Hadoop across a network of 70 nodes.
• Experienced in loading data from the UNIX local file system to HDFS.
• Developed data pipelines using Flume, Sqoop, Pig and Java MapReduce to ingest customer behavioral data and financial histories into HDFS for analysis.
• Involved in collecting and aggregating large amounts of log data using Apache Flume and staging the data in HDFS for further analysis.
• Involved in developing new workflow MapReduce jobs using the Oozie framework.
• Collected log data from web servers and integrated it into HDFS using Flume.
• Worked on upgrading the cluster, commissioning & decommissioning of DataNodes, NameNode recovery, capacity planning and slots configuration.
• Developed Pig Latin scripts to extract data from web server output files and load it into HDFS.
• Used Pig as an ETL tool for transformations, event joins and some pre-aggregations before storing the data in HDFS.
• Involved in the installation of CDH3 and the upgrade from CDH3 to CDH4.
• Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
• Used Sqoop to import and export data between RDBMS and HDFS (a rough Sqoop/DistCp sketch follows this summary).
• Created Hive external tables, loaded data into the tables and queried the data using HQL.
• Wrote shell scripts for rolling day-to-day processes and automated them.
• Automated workflows using shell scripts to pull data from various databases into Hadoop.

From June 2014 to November 2014 (6 months), San Leandro, CA

Hadoop Admin @ ExactTarget

• Installed/configured/maintained Apache Hadoop clusters and Cloudera Distribution Hadoop (CDH) for application development and Hadoop tools like Hive, Pig, HBase, ZooKeeper and Sqoop.
• Extensively involved in cluster capacity planning, hardware planning, installation and performance tuning of the Hadoop cluster.
• Performed installation and configuration of a 90-node Hadoop cluster with the Cloudera distribution CDH3.
• Installed the NameNode, Secondary NameNode, JobTracker, DataNodes and TaskTrackers.
• Performed benchmarking and analysis using TestDFSIO and TeraSort.
• Implemented commissioning and decommissioning of DataNodes, killed unresponsive TaskTrackers and dealt with blacklisted TaskTrackers.
• Implemented rack awareness for data locality optimization.
• Transferred data from a MySQL database to HDFS and vice versa using Sqoop.
• Used Ganglia and Nagios to monitor the cluster around the clock.
• Created a local YUM repository for installing and updating packages.
• Copied data from one cluster to another using DistCp and automated the copy procedure using shell scripts.
• Implemented NameNode backup using NFS.
• Performed various configurations, including networking and iptables, resolving hostnames, user accounts and file permissions, HTTP, FTP and SSH passwordless (key-based) login.
• Worked with the Linux administration team to prepare and configure the systems to support the Hadoop deployment.
• Created volume groups, logical volumes and partitions on the Linux servers and mounted file systems on the created partitions.
• Implemented the Capacity Scheduler on the JobTracker to share cluster resources among users' MapReduce jobs.
• Worked on importing and exporting data into HDFS and Hive using Sqoop.
• Helped with day-to-day operational support.
• Performed a minor upgrade from CDH3-u4 to CDH3-u6.
• Implemented Kerberos for authenticating all the services in the Hadoop cluster.

From January 2013 to August 2013 (8 months), Indianapolis, Indiana Area

Unix Linux System Administrator @ JPMorgan Chase & Co.

• Installed, configured and administered RHEL 5/6 on VMware Server 3.5.
• Converted many physical servers on Dell R820 hardware into virtual machines for a lab environment.
• Managed file space, created logical volumes and extended file systems using LVM.
• Performed daily maintenance of servers and tuned systems for optimum performance by turning off unwanted peripherals and vulnerable services.
• Managed RPM packages for Linux distributions.
• Monitored system performance using top, free, vmstat & iostat.
• Set up user and group login IDs, passwords and ACL file permissions, and assigned user and group quotas.
• Configured and troubleshot networking, including TCP/IP.
• Designed firewall rules to enable communication between servers.
• Monitored scheduled jobs and workflows related to day-to-day system administration.
• Responded to tickets through ticketing systems.

From January 2011 to December 2012 (2 years), Greater Chicago Area

Unix Linux System Administrator @ American Family Insurance

• Installation and configuration of Red Hat Enterprise Linux (RHEL) 5.x and 6.x servers on HP and Dell hardware and in a VMware virtual environment.
• Installation, setup and configuration of RHEL, CentOS, OEL and VMware ESX on HP, Dell and IBM hardware.
• Installation and configuration of Sun Enterprise servers, HP and IBM blade servers, HP 9000, RS/6000 and IBM P series.
• Expertise in enterprise-class storage, including SCSI, RAID and Fibre Channel technologies.
• Configuration and maintenance of a virtual server environment using VMware ESX 5.1/5.5 and vCenter.
• Created user accounts, performed user administration, and managed local and global groups on Solaris and Red Hat Linux platforms.
• Setup, implementation, configuration and documentation of backup/restore solutions for disaster/business recovery of clients using TSM backup on UNIX, SUSE & Red Hat Linux platforms.
• Installed and configured Netscape and Apache web servers and a Samba server.
• Installed and configured WebSphere Application Server on Linux and AIX.
• Installed and configured MySQL on Linux servers.
• Setup, implementation and configuration of SFTP/FTP servers.
• Set up a JBoss cluster and configured Apache with JBoss on Red Hat Linux; proxy serving with Apache; troubleshot Apache with JBoss and mod_jk for clients.
• Installed Jenkins, created users, and maintained Jenkins to deploy Java code developed by developers and to run the build framework.
• System performance monitoring and tuning.
• Provided 24x7 on-call production support on a rotation basis.

From August 2009 to June 2011 (1 year 11 months), Madison, Wisconsin Area

Linux Administrator @ IBM

• Performed Red Hat Enterprise Linux, Oracle Enterprise Linux (OEL), Solaris and Windows Server deployments to build new environments using Kickstart and JumpStart.
• Performed installation, adding and replacement of resources like disks, CPUs, memory and NIC cards, increased swap space, and maintained Linux/UNIX and Windows servers.
• Implemented NFS and Samba file servers and Squid caching proxy servers.
• Implemented centralized user authentication using OpenLDAP and Active Directory.
• Worked with VMware ESX Server configured for Red Hat Enterprise Linux.
• Configured IT hardware: switches, hubs, desktops and rack servers.
• Structured datacenter stacking, racking and cabling.
• Installed, configured, troubleshot and administered VERITAS and Logical Volume Manager, and managed file systems.
• Monitored system performance, tuned kernel parameters, and added/removed/administered hosts and users.
• Created and administered user accounts using native tools and managed access using sudo.
• Actively participated in and supported the migration of 460+ production servers from the old data center to the new data center.
• Used RPM for package management and patching.
• Created documentation for datacenter hardware setups, standard operating procedures and security policies.
• Created and maintained technical documentation for new installations and systems changes as required.

From May 2007 to June 2009 (2 years 2 months), Hyderabad Area, India
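Several bullets in the summary above mention importing RDBMS tables into HDFS with Sqoop and copying data between clusters with DistCp, automated through scripts. Here is a minimal Python sketch of that kind of wrapper, offered only as an illustration: the JDBC URL, credentials file, table name, HDFS paths and NameNode addresses are hypothetical placeholders, and the original automation was written as shell scripts.

    import subprocess

    # Hypothetical endpoints and paths, for illustration only.
    MYSQL_JDBC_URL = "jdbc:mysql://db-host/sales"
    SOURCE_NAMENODE = "hdfs://nn-prod:8020"      # production cluster (placeholder)
    BACKUP_NAMENODE = "hdfs://nn-backup:8020"    # backup cluster (placeholder)

    def sqoop_import(table: str, target_dir: str) -> None:
        """Import one RDBMS table into HDFS with Sqoop using 4 parallel mappers."""
        subprocess.run(
            [
                "sqoop", "import",
                "--connect", MYSQL_JDBC_URL,
                "--username", "etl_user",                        # placeholder account
                "--password-file", "/user/etl_user/.sqoop.pwd",  # password kept in a file, not on the command line
                "--table", table,
                "--target-dir", target_dir,
                "--num-mappers", "4",
            ],
            check=True,
        )

    def distcp_to_backup(path: str) -> None:
        """Copy an HDFS path from the production cluster to the backup cluster with DistCp."""
        subprocess.run(
            ["hadoop", "distcp",
             f"{SOURCE_NAMENODE}{path}",
             f"{BACKUP_NAMENODE}{path}"],
            check=True,
        )

    if __name__ == "__main__":
        sqoop_import("orders", "/data/raw/orders")
        distcp_to_backup("/data/raw/orders")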
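The Kerberos bullet in the Quest Diagnostics section above mentions creating service principals and generating a keytab file for each service. The sketch below shows one way this could be scripted with kadmin.local on the KDC host; the realm, host names, service list and keytab directory are hypothetical, and the original work may have relied on ktutil or other keytab tools instead.

    import subprocess

    # Hypothetical realm, hosts, services and keytab location; run as root on the KDC host.
    REALM = "EXAMPLE.COM"
    HOSTS = ["dn1.example.com", "dn2.example.com"]
    SERVICES = ["hdfs", "yarn"]
    KEYTAB_DIR = "/etc/security/keytabs"

    def kadmin(query: str) -> None:
        """Run a single kadmin.local query (requires local KDC admin privileges)."""
        subprocess.run(["kadmin.local", "-q", query], check=True)

    for host in HOSTS:
        for service in SERVICES:
            principal = f"{service}/{host}@{REALM}"
            # Create the service principal with a randomly generated key.
            kadmin(f"addprinc -randkey {principal}")
            # Export the key into a per-service, per-host keytab (xst is kadmin's ktadd).
            kadmin(f"xst -k {KEYTAB_DIR}/{service}.{host}.keytab {principal}")

The generated keytabs would then be distributed to the matching hosts and referenced from the Hadoop service configuration.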


Pavan Kumar's Personal Email Address, Business Email, and Phone Number are curated by ContactOut on this page.

In a nutshell

Pavan Kumar's Personality Type

Introversion (I), Intuition (N), Thinking (T), Judging (J)

Average Tenure

1 year(s), 5 month(s)

Pavan Kumar's Willingness to Change Jobs

Open to opportunity?

There's a 92% chance that Pavan Kumar is seeking new opportunities

Pavan Kumar's Social Media Links

/company/q... /school/ja...