• 6+ years of experience in the telecommunications sector, including 4+ years of technical experience developing, architecting, and deploying software and managing large clusters and cloud services
• Leadership experience managing individual contributors as well as software teams. Recognized and nominated for the leadership and supervisor track within the organization
• Proven track record in customer-facing roles, the complete SDLC, Scrum Agile, Kanban, Waterfall, and project estimation
• Led multiple projects from concept through planning, execution, deployment, and evolution
• Technical and product lead for a number of internal and public-facing web-scale products
• Extensive experience with cloud infrastructure – ETL, monitoring, alerting, PaaS, SaaS, data collection, policy management, security, and application-level networking protocols
SKILLS AND TOOLS
• Java, Python, Bash; prior experience with C and C++; working knowledge of Ruby, JavaScript, and Scala
• Hadoop, Map/Reduce, Kafka, Elasticsearch, Spark, Flume, Avro
• Warehousing ETL/automation – Hive & Impala, Pig, Luigi, Oozie, Oracle, MySQL
• Monitoring – Graphite, Grafana, Sensu, Ganglia, Nagios, OpenTSDB
• Infrastructure – Docker, Ansible, Chef, Amazon Web Services (AWS), Apache & NGINX
• Data analysis – SQL, R, Tableau
• Networking – Custom TCP/IP, UDP application protocols, Network Management Systems (NMS/EMS), SNMP, Cisco SCE, NetFlow, IPFIX, REST APIs, Syslog
• Linux, JIRA, Maven, Ant, JUnit, Artifactory, Git, GitHub, Perforce, Eclipse, IntelliJ IDEA, Wireshark
Technical Lead & Senior Big Data Engineer @
⇨ Data Analysis & ETL
• Designed the ETL pipeline to efficiently support underlying schema changes for the reporting platform based on Hadoop, Pig, Luigi, and Impala, and added failure recovery and monitoring.
• Technical lead of the in-line capture and custom classification appliance that provides new insights into application network usage, not currently available through existing traffic classifiers.
• Developed ad-hoc and automated reports including technical and business analysis for a number of internal teams including marketing, operations and business development.
• Developed the highly available live backend system to provide insights into traffic usage patterns for external customer care agents and operations teams.
⇨ Data Collection
• Design lead for the data collection architecture spanning 5 geographically separate collection points across the US, with the scalability to handle over 2 TB/day of real-time network metrics.
• Identified technologies, designed, and implemented the complete in-line data processing pipeline. Extended Apache Flume for UDP/TCP event processing, Cisco SCE RDR extraction, encryption, real-time stream filtering, compression, and Kafka storage. The data collection system provides significant redundancy, scalability, and maintenance benefits over the previous scheme.
• Part of a team developing a unified data collection platform based on Kafka and Storm.
⇨ Subscriber Page Experience Measurement
• Proposed a project to develop a browser plugin for reporting metrics to the cloud platform. The approach provides actual page load time and file download performance and is crucial for HTTPS sites where no other performance metric is available.
• Supervised two interns over the summer of 2014 and continued developing the solution.
• Scaled the system to support thousands of users by architecting the HTTPS endpoint, app servers, and storage mechanism.
• Currently deployed on the Firefox Add-ons store and Chrome Web Store for Exede subscribers.
From 2012 to Present (3 years)

Software Engineer @
• Part of a team developing the Element Management System (EMS), a Tomcat 6 web app used extensively by Tier 4 Operations for performance management of network components.
• Developed the service layer, Hibernate ORM layer with Oracle 10g and MySQL, realtime SNMP metric collection and Web UI with Google Web Toolkit.
• Followed a Scrum Agile process with 3-week sprints for requirements gathering from stakeholders and rapid delivery of new features.
• Exclusively developed a Java portal based on Liferay 6.0 CE for submitting Pig and Map/Reduce jobs to the Hadoop cluster for ad-hoc and scheduled execution. The plug-in-based architecture allowed new reports to be easily added, a capability that did not exist before the portal.
• Authored a number of Pig UDFs to load and store custom data sources and to transform datasets.
From 2011 to 2012 (1 year)

Software Engineer @
• Architected the distributed system responsible for collecting globally distributed NMS logs into an AWS-based Hadoop cluster. Designed a custom event-logging framework using Log4j and Apache Flume, including a real-time, secure, XML-based network status feed to external partners.
• Solely responsible for the billing system design of the ArcLight mobile broadband network.
• Engaged with external customers to gather requirements and exchange development feedback.
• Developed Java portlets on GlassFish 2 with a modular Python processing backend for reporting.
• The billing system was solely responsible for multi-million-dollar revenue calculation for over 1,500 global customers. The system was in operation for over 3 years, providing crucial insights into business development and operations.
From 2009 to 2011 (2 years)

Graduate Research Assistant @
• Graduate research – computer networking protocols, software-defined radios, service discovery
From August 2008 to July 2009 (1 year)

Software Engineering Intern @
• Part of the development team for the ArcLight satellite mobile broadband system. Developed a client-server remote event logging scheme and a distributed alerting system for the Network Management System (NMS) component using Java RMI. Collaborated with project leads, test engineers, and systems engineers, and presented the effort to management.
From May 2008 to August 2008 (4 months)
Master of Science, Electrical Engineering @ Virginia Tech
From 2007 to 2009

Bachelor of Engineering, Electronics @ V.J.T.I., University of Mumbai
From 2003 to 2007

Rohit Rangnekar is skilled in: Java, Python, Hibernate, SQL, Tomcat, Oracle, Software Development, Hadoop, Cloud Computing, JUnit, Git, DevOps, Amazon Web Services (AWS), Apache Kafka, Apache Spark
Websites:
http://www.viasat.com
http://www.cognitiveradio.wireless.vt.edu/dokuwiki/doku.php?id=home
http://www.isa.org.vt.edu