Over sixteen years of IT experience performing analysis, design, development, testing, implementation and maintenance of database applications, data warehousing and business intelligence solutions, enhanced with strong programming and DBA skills on an assortment of hardware platforms and in varied industries including Finance, Telecom, Insurance, Distribution, Sales, Marketing and Consumer Products.
Exceptional analytical and problem-solving skills; a team player with the ability to communicate effectively at all levels of the development process.
Ability to work independently with minimal supervision to meet deadlines.
Specialties: technologies in database, data warehousing, business intelligence, business analytics, performance management, change management, and solution development and delivery.
Hadoop (MapReduce, Storm, Spark, Scala, Hive, Pig, Oozie, Sqoop, Talend Big Data Edition, Tableau, Kafka, Elasticsearch, Flume, AWS), NoSQL (HBase, MongoDB, Cassandra), Metadata Hub, Ab Initio, Informatica, DataStage, Business Objects, Cognos, Oracle, DB2 UDB, Teradata, Netezza, Sybase IQ, MS SQL Server, SSIS, SSRS, MySQL, Java, J2EE, Unix shell scripts, Python, Perl, webMethods, WebSphere, WebLogic; UNIX: AIX, HP-UX, SunOS, Linux, Red Hat.
Sr Big Data Engineer/Hadoop Administrator @ Responsible for architecting and building a distributed Hadoop solution.
Installed and built a Hadoop cluster; configured and tuned the Hadoop environment to ensure high throughput and availability.
Implemented multitenancy and integrated security: authentication with Kerberos and LDAP integration via PAM and ACLs.
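For illustration, a minimal sketch of the Kerberos authentication and HDFS ACL steps such a multitenant setup involves; the principal, realm and tenant paths below are hypothetical:

    kinit svc_etl@EXAMPLE.COM                              # authenticate against the Kerberos KDC
    klist                                                  # confirm a valid ticket was issued
    hdfs dfs -setfacl -m group:finance:r-x /data/finance   # grant one tenant group read/traverse access
    hdfs dfs -getfacl /data/finance                        # verify the resulting ACL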
Designed and developed data ingestion, data processing, data export and visualization frameworks.
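As a sketch of the ingestion side, a typical Sqoop import into Hive might look like the following; the JDBC URL, credentials and table names are hypothetical:

    sqoop import \
      --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
      --username etl_user -P \
      --table CUSTOMER_TXN \
      --hive-import --hive-table customer_stage \
      --num-mappers 4          # parallelize the import across 4 map tasks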
Technologies used: MapR, YARN, Drill, Kylin, Hadoop, Java, XML, JSON, Spark, Python, Hive, HBase, SQL, Sqoop, Oozie, Talend and Tableau. From May 2015 to Present (8 months)

Sr Big Data/Hadoop Consultant @ Responsible for building scalable distributed data solutions for different business units such as GE Aviation, Healthcare, Oil & Gas and Transportation.
Drove the development of Big Data platform data lake architectural artifacts on a Pivotal Hadoop cluster.
Designed and implemented data pipelines, Tableau visualizations and reports.
Technologies used: Hadoop, Java, Python, MapReduce, Hive, Pig, HBase, HAWQ, MongoDB, Sqoop, Talend, Tableau, Maven, Git, YARN, Kafka, Storm, Spark, Scala. From May 2014 to Present (1 year 8 months)

Big Data/Hadoop Consultant @ Involved in integrating Oracle RTD (Real-Time Decisions offer management software) with schwab.com, which determines the right message/offer for the right client via the schwab.com web channel. Created the necessary stubs and skeletons to be used by the web client for web services method invocation.
Developed customer profile data using Hive from IDW customer data and loaded it into an HBase table using Storm (spout/bolt topology) and MapReduce.
Wrote a JAX-RPC provider handler/servlet filter that integrates with RTD and performs enterprise logging, authentication and authorization. From August 2013 to July 2014 (1 year)

Sr Systems Analyst/Big Data Consultant @ Developed multiple MapReduce jobs in Java, along with Pig, Hive, Sqoop, HBase, Cassandra and Spark jobs, for data cleaning and preprocessing.
Performed data quality analysis, identified data issues, assessed gaps and implemented resolutions.
Built data systems that interoperate with different database platforms, enabling product and business teams to make data-driven decisions.
Worked on the Metadata Hub customization workbench for web services integration; involved in extending the object model schema, writing schema extensions and reports. From September 2012 to August 2013 (1 year)

Sr ETL Architect/Big Data Consultant @ Redefined the system architecture for a functional-correctness and performance-optimization project; redesigned process flows and recreated some of the ETL scripts without breaking the original functionality of the code, optimizing the IPDS application.
Installed, configured and deployed Hadoop clusters for development, test and production environments.
Designed and allocated HDFS quotas for multiple groups. Developed multiple MapReduce jobs.
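A minimal sketch of how such per-group quotas are allocated; the directories and limits are hypothetical:

    hdfs dfsadmin -setQuota 1000000 /data/marketing    # name quota: max number of files and directories
    hdfs dfsadmin -setSpaceQuota 10t /data/marketing   # space quota: cap raw disk usage at 10 TB
    hdfs dfs -count -q -h /data/marketing              # report quota usage and remaining headroom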
Created Hive tables, both internal (managed) and external, and defined static and dynamic partitions for efficiency.
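For illustration, a hedged sketch of the external-table and dynamic-partition pattern via the Hive CLI; the table, column and path names are hypothetical:

    # external table: dropping it removes only Hive metadata, not the files at LOCATION
    hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS customer_txn (id BIGINT, amount DOUBLE)
             PARTITIONED BY (load_date STRING) STORED AS ORC
             LOCATION '/data/warehouse/customer_txn';"

    # dynamic partitions: load_date values are taken from the SELECT at run time
    hive -e "SET hive.exec.dynamic.partition=true;
             SET hive.exec.dynamic.partition.mode=nonstrict;
             INSERT INTO TABLE customer_txn PARTITION (load_date)
             SELECT id, amount, load_date FROM staging_txn;"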
Analyzed and developed Oozie workflows to manage interdependent Hadoop jobs, including Apache Hive, Apache Pig and Sqoop actions. From December 2011 to August 2012 (9 months)

Sr Lead Application Developer @ Involved in four major data warehouse project implementations; managed project tasks from the design phase through implementation and support, and was responsible for building a global ETL support/offshore development team.
Responsible for interacting with the business team to gather and formulate requirements, and for translating them into detailed design specifications for developers on the team.
Participated in kickoff meetings for various projects that need data stored in the ODS for real-time access.
Worked on the CDI (Initiate Systems) Customer Data Integration team and was responsible for hosting Capital One's enterprise-level customer data on PEDW, using DDE as the data movement and processing environment. From August 2009 to November 2011 (2 years 4 months)

Analyst/ETL Lead Developer @ Worked on the "FAS91 Amortization" stream of Restatement and Get Current, which relates to amortizing Guarantee Fee components, securities, loans and other financial products.
The project scope involved implementing the inbound/outbound interfaces between the Amortization Data Repository (ADR), Guarantee Fee (GFAS) and corporate tax amortization for the "FAS91 Amortization" stream using Ab Initio. It also involved implementing analytical reporting ETL for GFAS/Securities, building the Guarantee Fee data mart (GFASDM) and the Security data mart (SCBSLDM) to generate various monthly, quarterly and yearly reports.
Project involvement included all phases of the system development lifecycle, with emphasis on scoping, enterprise data analysis, data management, data design, data movement architecture and implementing optimal, effective enterprise solutions. From December 2005 to August 2009 (3 years 9 months)

Analyst/Lead Developer @ Designed and performed detailed, accurate source-system analysis; developed logical and physical data models, source-to-target data mappings and data movement tasks in support of the IDN data management enterprise data warehouse. From January 2005 to November 2005 (11 months)

ETL/Data Warehouse Developer @ Designed and developed the REVAMART data warehouse/reporting system for a broadband ADSL provisioning network. From April 2004 to January 2005 (10 months)

Data Warehouse Developer @ Designed and developed a data warehouse system to automate the CART (Contacts Adhoc Retention and Telemarketing) procedure, and developed logical and physical data models in support of data warehousing, decision support and executive information systems for the enterprise data warehouse. From October 2003 to March 2004 (6 months)

Software Analyst @ Developed a product and customer information integration system using event-based middleware (webMethods Enterprise Platform); supported the Oracle Financials and Purchasing applications; wrote a Perl interface for Rational Software's e*shop to enable communication with Oracle and Vantive databases using VanAPI and oraperl. From August 2001 to September 2003 (2 years 2 months)

Software Analyst/DBA @ Designed and structured an accounting and financial management system in support of analyzing budget and project-costing applications. From July 1999 to July 2001 (2 years 1 month)

Software Analyst/DBA @ Designed and developed an integrated information system. From May 1998 to June 1999 (1 year 2 months)
Yaseen Mohammed is skilled in: Data Warehousing, Business Intelligence, ETL, SDLC, Unix Shell Scripting, Teradata, Oracle, Databases, Enterprise Architecture, Shell Scripting, DB2, Business Objects, PL/SQL, SSIS, Microsoft SQL Server