About Me
11+ years of total IT experience building data warehouse and data lake environments on premises and in the cloud (Azure and AWS). Leading the Data Center of Excellence for Morgan Stanley Wealth Management. Expertise in managing platforms both on premises and in the cloud...
Skills
Portfolio Projects
Description
- Operationalized services on Azure from scratch with a target of onboarding at least 3 beta customers in a quarter.
- Built automated cross-region backup of hundreds of terabytes of S3 data for disaster recovery.
- Built custom cloud auto-scaling models for scaling app servers.
- Built a CI/CD pipeline from scratch, including automated testing for Git PRs, promotion of builds to staging and production, and sanity tests.
- Developed a RESTful API platform that allowed programmatic access to resources; 20% of existing customers signed up for it during the beta phase.
- Developed an Apache Spark based log processor that processes and enriches logs at 15,000-20,000 QPS and delivers them to customers' Amazon S3 buckets.
- Architected and developed a highly available, scalable log dispatcher that dispatches logs to downstream analytics systems.
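The log enrichment described above can be sketched as a per-record transform. This is a minimal, hypothetical sketch in plain Python (the production version ran on Apache Spark); the field names, the `GEO_BY_PREFIX` lookup, and the `enrich` function are all illustrative assumptions, not the actual implementation.

```python
import json
from datetime import datetime, timezone

# Hypothetical lookup table; a real service would resolve regions dynamically.
GEO_BY_PREFIX = {"10.": "internal", "203.": "ap-south"}

def enrich(raw_line: str) -> dict:
    """Parse one JSON log line and attach derived fields."""
    record = json.loads(raw_line)
    ip = record.get("client_ip", "")
    # Tag the record with a coarse region based on the IP prefix.
    record["region"] = next(
        (region for prefix, region in GEO_BY_PREFIX.items() if ip.startswith(prefix)),
        "unknown",
    )
    # Normalize the epoch timestamp to ISO-8601 UTC for downstream analytics.
    record["ts_iso"] = datetime.fromtimestamp(record["ts"], tz=timezone.utc).isoformat()
    return record

line = '{"client_ip": "10.0.0.4", "ts": 1700000000, "msg": "GET /health"}'
print(enrich(line)["region"])  # internal
```

On Spark, the same transform would typically be applied with `map` over a stream of lines before writing the enriched batches to S3.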
Description
- Leading the Hadoop Center of Excellence team at Morgan Stanley; managing a team of 15 people developing best practices and guidelines for Hadoop.
- Actively involved in migration of workloads from Teradata to the Hadoop cluster.
- Developed best-practice guidelines for application teams, including guidance for Sqoop jobs extracting and moving data from RDBMS to the Hadoop cluster.
- Developed an automated sync script that moves data from one environment to another in the Hadoop cluster.
- Worked on a POC to find the best tool for data processing in Hadoop, covering:
  - MapReduce jobs (Hive/Pig)
  - Impala
  - Hive on Spark
  - Spark
- Actively involved in Hadoop administration:
  - Upgrade activities
  - Benchmark testing
  - Production issues and troubleshooting
  - Workload management in Hadoop using YARN and admission control
  - Services and their scheduled maintenance
  - Access rights and authentication mechanisms
  - Space management
  - HDFS rebalancing
- Actively involved with application teams, with responsibilities including:
  - Resolving and troubleshooting application issues
  - Performance tuning of production jobs
  - All communication with the app dev team
  - Defining optimized guidelines for app teams
  - Judging the best tool to accomplish a task
  - Compression analysis and data storage formats
  - PDM review process
  - Production review process
- Manage the real-time streaming platform of MSWM, leveraging Confluent Kafka.
- Expertise in designing applications to be search efficient using Apache Solr.
- Actively involved with workload management of the Hadoop cluster using the Fair Scheduler in YARN and admission control.
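The cross-environment sync script mentioned above can be sketched as a diff of file listings between two clusters. This is an illustrative sketch only: the `plan_sync` function, the checksum maps, and the paths are assumptions; a production script would read HDFS listings and hand the "copy" set to a tool such as distcp.

```python
def plan_sync(source: dict, target: dict) -> dict:
    """Given {path: checksum} listings for two clusters, decide what to copy
    and what to delete so the target matches the source.

    Hypothetical helper; names and the checksum scheme are illustrative.
    """
    copy = [p for p, checksum in source.items() if target.get(p) != checksum]
    delete = [p for p in target if p not in source]
    return {"copy": sorted(copy), "delete": sorted(delete)}

prod = {"/data/t1/part-0": "a1", "/data/t2/part-0": "b2"}
uat = {"/data/t1/part-0": "a1", "/data/t3/part-0": "c3"}
print(plan_sync(prod, uat))
# {'copy': ['/data/t2/part-0'], 'delete': ['/data/t3/part-0']}
```

Comparing checksums rather than only paths ensures files that changed in place are re-copied, not just newly created ones.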
Description
- Individual contributor, working directly with stakeholders.
- Designed and developed an automated process for the DEV PDM review process.
- Designed and developed an automated process for reviewing production job performance.
- Actively involved with 80+ application teams for troubleshooting issues.
- Actively monitored and resolved production Teradata issues.
- Designed and developed a data validation process between the Production and BCP clusters, covering row counts, attribute-level checks, and complete data validation for disaster recovery compliance.
- Designed and developed an automated process for compression.
- Coordinated with development teams to perform code reviews before production implementation.
- Prepared dashboards for jobs running in lower environments and, on that basis, approved their promotion to production.
- Implemented efficient role strategies and streamlined user security management.
- Generated performance audit reports and presented them to the client quarterly.
- Worked on production issues and troubleshooting.
- Resolved spool-out errors in Teradata to minimize resource wastage.
- Identified poorly performing queries in the system and provided recommendations to improve them.
- Deployed production changes and handled data copies using UDM.
- Responsible for Viewpoint monitoring to check system health and flag impactful queries/sessions.
- Created and managed objects in the production system; created alerts in Viewpoint.
- Performed database maintenance activities monthly.
- Worked on MVC analysis for several production tables to save space on production systems.
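The Production-vs-BCP row-count validation described above reduces to comparing two table-to-count maps. A minimal sketch, assuming counts have already been pulled from both Teradata systems; the `validate_rowcounts` function and its message format are illustrative, and the real process also compared attribute-level data.

```python
def validate_rowcounts(prod_counts: dict, bcp_counts: dict) -> list:
    """Return discrepancy messages between two {table: rowcount} maps.

    Hypothetical helper for the DR-compliance check; in practice the counts
    would come from SELECT COUNT(*) queries on each system.
    """
    issues = []
    for table, prod_rows in sorted(prod_counts.items()):
        bcp_rows = bcp_counts.get(table)
        if bcp_rows is None:
            issues.append(f"{table}: missing on BCP")
        elif bcp_rows != prod_rows:
            issues.append(f"{table}: prod={prod_rows} bcp={bcp_rows}")
    return issues

prod = {"sales_fct": 98001, "cust_dim": 98001}
bcp = {"sales_fct": 98001, "cust_dim": 97950}
print(validate_rowcounts(prod, bcp))  # ['cust_dim: prod=98001 bcp=97950']
```

Row counts catch gross divergence cheaply; attribute-level checksums are the natural second pass for tables whose counts match.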
Description
EBI Operations is the production support team of Freescale, managing the entire EBI cycle, which includes developing change requests received from the client in DataStage and Teradata SQL.
- Actively involved in managing the entire EBI operations cycle of Freescale.
- Daily activities included working on change requests received from the client (i.e., changing BTEQ scripts to load data per new business logic, creating new table-loading scripts, and documenting the details).
- Involved in writing scripts for loading data into the target data warehouse using Teradata utilities such as BTEQ, FastLoad, and MultiLoad.
- Automated business processes through standardization and scheduling.
- Abend fixing and discrepancy resolution.
- Defined strategies for migration of database objects, data, and data quality.
- Monitored 1600+ jobs running in the production environment and provided quick resolution of load failures.
- Experienced in archiving, restoring, and recovering data on Teradata using the ARC utility and TARA GUI.
- Well trained and experienced in scheduling backups and recovery of the entire EDW databases across various geographical locations for business continuity and response time.
- Handled data and DDL backups and environment refreshes efficiently.
- Expert in UNIX and shell scripts for database extraction and filtering.
- Wrote SQL scripts for backend databases, custom stored procedures, macros, and packages, along with referential integrity triggers.
- Enhanced and customized load scripts for newly developed and in-design projects.
- Controlled and tracked access to the Teradata database.
- Good experience creating dynamic BTEQ queries using shell scripts.
- Created one-shots to fix discrepancies in data or processing.
- Involved in unit testing and preparing test cases.
- Involved in gathering requirements for the forms.
- Wrote scripts using BTEQ, FastLoad, MultiLoad, FastExport, TPump, and Queryman.
- Involved in error handling with the help of tables such as ET, UV, and WT.
- Worked extensively with joins, subqueries, and set operations.
- Performed tuning at various levels.
- Prepared BTEQ import and export scripts for tables.
- Prepared FastExport scripts for tables.
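The "dynamic BTEQ queries using shell scripts" item above amounts to rendering a BTEQ script from parameters at run time. Here is a hedged sketch in Python (used for consistency with the other examples; the original work used shell); the `render_bteq_load` function, the logon placeholder, and the column layout are all illustrative, not a real production template.

```python
def render_bteq_load(table: str, source_file: str) -> str:
    """Render a minimal BTEQ import script for a single-column staging load.

    Hypothetical template; real scripts carried logon files, error handling,
    and business-specific load logic.
    """
    return "\n".join([
        ".LOGON tdprod/loader,${TD_PASSWORD};",   # credentials injected at run time
        f".IMPORT DATA FILE = {source_file};",
        ".REPEAT *",
        f"USING (c1 VARCHAR(100)) INSERT INTO {table} (col1) VALUES (:c1);",
        ".QUIT;",
    ])

script = render_bteq_load("stg.sales_fct", "/data/in/sales.dat")
print(script.splitlines()[1])  # .IMPORT DATA FILE = /data/in/sales.dat;
```

Generating the script from parameters keeps one template per load pattern instead of one hand-edited script per table.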
Description
Product Cost Planning is an area within SAP used to plan costs for manufactured materials, services, and other intangible goods, and to set prices for materials and other cost accounting objects. The purpose was to calculate the cost of goods manufactured (COGM) and determine the cost of goods sold (COGS) for each product unit, break down costs by product, and calculate value added at each stage of completion. It assists in strategic decisions such as make vs. buy, pricing, market analysis, inventory valuation, variances, and reserve calculations. The objective of this project was to migrate the live running database (for Freescale Semiconductor Inc.) from SAP to Teradata.
Description
- Worked in all phases of the software development life cycle, from requirement gathering to production implementation of an entity.
- Understood complex business requirements and transformations for ETL and reporting, with continuous interaction with clients and BAs.
- Analyzed and designed new requirements and found the optimum solution to develop them in DB, data modeling, ETL, and reporting.
- Wrote scripts using BTEQ, FastLoad, MultiLoad, FastExport, TPump, and Queryman.
- Involved in error handling with the help of tables such as ET, UV, and WT.
- Created DataStage jobs.
- Worked extensively with joins, subqueries, and set operations.
- Performed tuning at various levels.
- As part of the SEI CMM Level 5 process, created various documents, viz. estimation sheets, requirement analysis, design documentation, mapping documents, test cases, RTT logs, MOMs, review logs, timesheets, and checklists for ETL, scripting, and reporting.
- Continuous, effective coordination and communication with team members, team leads, the project manager, the onsite coordinator, the client, and BAs.
- Understood and developed complex ETL (graphs and shell scripting), SQL queries, and reports for various projects and modules.
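The ET/UV/WT error handling mentioned above refers to the acquisition, application, and work tables that Teradata load utilities populate when rows are rejected. A minimal sketch of generating the post-load checks; the `ET_`/`UV_`/`WT_` prefix naming convention is an assumption (sites name their error tables differently), and `error_table_checks` is an illustrative helper.

```python
def error_table_checks(target_table: str) -> list:
    """Generate row-count queries for a load job's ET/UV/WT error tables.

    Assumes an ET_/UV_/WT_ prefix naming convention, which is site-specific.
    A nonzero count in ET or UV signals rejected or duplicate rows to review.
    """
    return [
        f"SELECT COUNT(*) FROM {prefix}_{target_table};"
        for prefix in ("ET", "UV", "WT")
    ]

for query in error_table_checks("sales_fct"):
    print(query)
```

Running these after each MultiLoad and alerting on nonzero ET/UV counts turns silent row rejections into actionable failures.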
Description
- Developed effective sourcing and recruiting strategies, from initial screening to offer closure.
- Responsible for providing desired recruitment solutions to internal and external projects within the stipulated time.
- Closed candidates proactively for niche skills, generating higher revenue for the company.
- Extensive experience working with C2C, W2, and 1099 arrangements.
- Recruited and placed consultants for global leaders such as Johnson & Johnson, Wachovia, BCBS, IBM, JPMC, Freddie Mac, Fannie Mae, AT&T, Verizon Telecom, Verizon Business, AIG, and more.
- Successfully recruited consultants for software projects across verticals such as financial, banking, brokerage, insurance, and consulting firms on contract.
- Interacted with technical panels, technical leads, and resource managers for requirement understanding and technical screening; involved in the overall end-to-end recruitment process.
- Recruited via Internet job boards (Dice.com, Monster.com), direct phone sourcing, vendor networks, referrals, and expanded Internet sourcing.
- Submitted qualified candidates to the US team.
- Responsible for screening candidates to ensure their qualifications met open positions.
- Conducted phone interviews for prospective candidates and coordinated interviews.
- Internet savvy, performing targeted Internet research/searches to identify candidates meeting client needs.
Description
CATS (Collabera Applicant Tracking System) is a project of Collabera Consulting Pvt. Ltd.
- Worked as a SQL developer.
- Interacted with application developers in creating SQL queries for the functional module.
- Converted MS Excel worksheets into an MS SQL Server 2000 database: created the database and tables/views, set up relationships among tables, and wrote stored procedures and triggers.
- Optimized SQL Server performance using tools such as MS Query Analyzer, MS SQL Profiler, and the Index Tuning Wizard.
- Used Crystal Reports 11 to generate various reports for business analysis.
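The worksheet-to-database conversion above follows a common pattern: read tabular rows, create a typed table, and bulk-insert. A self-contained sketch; the original work targeted MS SQL Server 2000, so sqlite3 stands in here, and the `employees` table and sample rows are purely illustrative.

```python
import csv
import io
import sqlite3

# Stand-in for an exported Excel worksheet (CSV form); columns are illustrative.
worksheet = io.StringIO("emp_id,name\n1,Asha\n2,Ravi\n")

# sqlite3 stands in for MS SQL Server 2000 so the sketch runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id INTEGER PRIMARY KEY, name TEXT)")

# Parse the worksheet rows into typed tuples and bulk-insert them.
rows = [(int(r["emp_id"]), r["name"]) for r in csv.DictReader(worksheet)]
conn.executemany("INSERT INTO employees VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
print(count)  # 2
```

On SQL Server the same shape would use DTS or a bulk-insert path, with relationships and triggers added after the base tables load.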