Gudipati S.

Datastage / ETL Developer

Hyderabad, India

Experience: 14 Years

33364 USD / Year

  • Notice Period: Days

About Me

12+ years of experience in the field of Information Technology as a DataStage/ETL Developer in Data Warehousing, with strong experience in all phases. 4.5 years of experience using Hadoop and Hive for a data warehousing project using DataStage 11.3...

Portfolio Projects

Description

Citibank was looking to develop a reporting solution to mitigate its non-quantifiable franchise risks and comply with the OCC (Office of the Comptroller of the Currency) stipulations of having contract-level data provisioned for timely, accurate submissions. It needed the vendor to take ownership of building a single data warehouse to host a common pool of contracts, positions and balances, organized on an enterprise-wide basis spanning all its LOBs.

• Conducted a POC on using Hadoop and Hive technologies with the DataStage ETL tool.
• Served as Technical Lead for this project.
• Responsible for UAT and PROD support.
• Involved in Jenkins code promotion.

Description

The objective of the project is to deliver a compliance data platform (CDP) that enables HSBC to extract, integrate and house FCC-identified data centrally for system control analytics (SAC) and management information and data governance (MI & DG) purposes, providing consistent descriptions and views of data from existing federated solutions. The primary focus is on generating reports that provide insight and control for future operation of the business and allow presentation of fully productionised MI and analytics with improved data quality.

• CDP is a fully secure model, with data access available to the respective regions/business units.
• Consistent and quality view of data from existing federated solutions.
• An analytics environment with access to data in its raw format as well as normalized and dimensional data models.
• An audit-controlled application architecture that meets data management strategic requirements.

At a high level, CDP is a combination of a data ingestion process, a storage layer named the data lake, a management information layer named the normalized area, and framework components.
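As a hedged illustration of that layered flow (ingestion into the data lake, then normalization for the MI layer), the sketch below uses hypothetical record and field names; it is not the actual CDP implementation.

```python
# Hypothetical sketch of the layered flow described above: raw records land
# in a "data lake" store unchanged, then are normalized into consistent
# field names and types for the management information layer.
# All structures and field names are illustrative only.

raw_feed = [{"acct": "A1", "Bal": " 100.50 "}, {"acct": "A2", "Bal": "75"}]

# Ingestion: keep records in their raw format in the lake.
data_lake = list(raw_feed)

# Normalized area: uniform names and numeric types for reporting.
normalized = [{"account_id": r["acct"], "balance": float(r["Bal"])}
              for r in data_lake]
```

The point of the two layers is that analytics can run against either the raw or the normalized view, as the bullet list above describes.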

Description

As part of the Bell IPTV Personalized Recommendation and Enhanced Search project, the RECO Engine, which sits in the network and integrates with Fibe TV to provide recommendations to customers, requires Bell TV subscribers’ information.

The Business Intelligence team’s work on this project is to ensure that daily changes to Bell TV subscribers’ information and/or their subscriptions are captured, and that new subscribers’ information and/or subscription data is sent to the RECO Engine. BI will also produce full-refresh Channel Mapping and Service Authorization daily feeds for the RECO Engine.

The BI team will also generate the one-time initial load of Bell TV subscribers’ information, Bell TV subscriptions and Media Room information from the existing BVu tables.
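The daily change-capture step described above can be sketched as a diff between two snapshots keyed by subscriber id. This is a minimal in-memory illustration; the record layout and the `daily_delta` helper are assumptions, not the actual BI implementation.

```python
# Hypothetical sketch: keep yesterday's and today's subscriber snapshots,
# and emit only records that are new or changed since yesterday.
# Subscriber ids and fields are illustrative.

def daily_delta(yesterday, today):
    """Return subscriber records that are new or changed vs. yesterday."""
    return {sid: rec for sid, rec in today.items()
            if sid not in yesterday or yesterday[sid] != rec}

yesterday = {"S1": {"pkg": "Basic"}, "S2": {"pkg": "Sports"}}
today = {"S1": {"pkg": "Basic"},          # unchanged: excluded from feed
         "S2": {"pkg": "Premium"},        # changed subscription
         "S3": {"pkg": "Basic"}}          # new subscriber
feed = daily_delta(yesterday, today)      # what would be sent to RECO
```

Only the changed and new records travel in the daily feed; the full-refresh feeds (Channel Mapping, Service Authorization) would instead send the whole snapshot.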

Description

My Bell: the Phase1B-EPG system is designed to provide the normalized EPG (Electronic Programming Guide) to Bell TV, containing 14 days of schedule data from the current date. On a scheduled daily basis, the EPG source systems (Nagra and MSTV) send linear channel data (which includes metadata for broadcast channels and for scheduled PPV events) to My Bell-EPG. My Bell-EPG generates a standard normalized EPG file for both DTH and Fibe and publishes it to an FTP server, along with the images, in a zip file format. If any of the source systems sends files outside the daily schedule, My Bell-EPG generates a new copy of the normalized EPG file and places it on the FTP server. My Bell-EPG then calls the notification service so that the ESB can make bell.ca aware of the availability of the updated normalized EPG guide.
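The 14-day windowing rule described above can be sketched as a simple date filter. The schedule record layout below is an assumption for illustration, not the actual normalized EPG format.

```python
from datetime import date, timedelta

# Minimal sketch of the windowing rule: the normalized EPG keeps 14 days
# of schedule data starting from the current date. Fields are hypothetical.

def epg_window(schedule, today):
    """Keep only schedule entries within the 14-day EPG window."""
    end = today + timedelta(days=14)
    return [e for e in schedule if today <= e["air_date"] < end]

schedule = [
    {"channel": "CH1", "air_date": date(2013, 5, 1)},
    {"channel": "CH1", "air_date": date(2013, 5, 10)},
    {"channel": "CH2", "air_date": date(2013, 5, 20)},  # beyond the window
]
guide = epg_window(schedule, date(2013, 5, 1))
```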

Description

The new iService system is designed to fortify and replace the current iService application, creating a repository for normalized EPG data.

The new iService system processes the scheduling and programming information for DTH and IPTV services and creates EPG (Electronic Programming Guide) data for a variety of target systems. The new system receives data from multiple source systems, over a variety of protocols and data formats, which is then integrated and stored in a normalized database. The data is further extracted to files, in formats agreed between the iService system and the corresponding downstream systems, and published to the agreed locations. File transfers are also notified in real time via a web service call.

The iService2 system is redesigned to create a single all-encompassing repository of programming and scheduling information. The repository will provide the infrastructure to store the data in one database, and the capability to feed existing downstream systems with the information they require without changing the current file formats and delivery systems.

Description

The DataStage environment is scheduled for an upgrade from the current version, DataStage 7.5.1, to DataStage 8.1. The NA Tools team has to maintain the current DataStage 7.5.1 version so that the NA DWH team and OHBI Account Opening projects can run their applications until they are ready for migration. A new set of servers is planned for DataStage 8.1, and once the migration is completed the current DataStage 7.5.1 servers will be retired. This DataStage 8.1 upgrade is R2-driven. DataStage 8.1 is a significant change from its predecessor versions and has additional hardware requirements: a WAS Server Layer and a DB2 Metadata Layer have been introduced to replace the proprietary database repository used in earlier versions. A two-server cluster will be created in each environment: Server 1 will host the DataStage Engine layer, and Server 2 will host the WAS Service Layer and the DB2 Metadata Layer.

Description

HSBC/Household runs many websites that are extensively used by its customers. From a business perspective, continuous monitoring of these websites is required to make them more attractive. The main objective of this project is to analyze user/customer activity on the websites so that management can analyze the reports created in the system and decide how to modify the websites. Catalyst is the CCS effort to marry online customer behavioral data (e.g. website usage) with offline customer data and attributes (e.g. demographics, profitability, etc.). This allows business users to customize web pages to present products, offers and account services to internet customers based on observed customer behavior.
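The "marry online with offline" step described above amounts to enriching each web-usage event with the customer's offline attributes, joined by customer id. The sketch below is illustrative only; all field names are assumptions.

```python
# Hypothetical sketch of the Catalyst join: attach offline customer
# attributes (e.g. segment, demographics) to online behavioral events.
# Ids and fields are illustrative, not the actual CCS schema.

def enrich_events(events, customers):
    """Join each web event to the offline attributes for its customer."""
    return [{**e, **customers.get(e["cust_id"], {})} for e in events]

events = [{"cust_id": "C7", "page": "/offers"}]
customers = {"C7": {"segment": "high-value", "state": "IL"}}
enriched = enrich_events(events, customers)
```

An enriched event carries both the observed behavior and the offline attributes, which is what lets business users target offers by segment.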

Description

We extract HSBC employee data from the legacy database (PeopleSoft) and process it into the HR database. HR Analytics process: the process copies the previous month’s snapshot to the current month, with SNAP_DT set to the current month’s snap date. The details from the employee extract file (i.e. the delta file, which in turn is created from the PeopleSoft file) are then applied to each record in the database for the current month.
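The snapshot-plus-delta process above can be sketched in memory: copy last month's rows forward under the new SNAP_DT, then overlay the delta records from the extract file. Table layout and field names are hypothetical, not the actual HR schema.

```python
from datetime import date

# Hypothetical sketch of the monthly snapshot process: carry last month's
# snapshot forward with the new SNAP_DT, then apply the delta updates.
# Employee ids and fields are illustrative.

def next_snapshot(prev_snap, snap_dt, delta):
    """Copy the previous month's rows with the new SNAP_DT, then apply deltas."""
    # Step 1: carry every employee row forward under the new snapshot date.
    snap = {emp_id: {**row, "SNAP_DT": snap_dt}
            for emp_id, row in prev_snap.items()}
    # Step 2: overlay the delta (changed/new employees from the extract file).
    for emp_id, changes in delta.items():
        row = snap.setdefault(emp_id, {})
        row.update(changes)
        row["SNAP_DT"] = snap_dt
    return snap

prev = {101: {"name": "A. Rao", "grade": "G5", "SNAP_DT": date(2011, 3, 31)}}
delta = {101: {"grade": "G6"},                       # changed employee
         102: {"name": "B. Iyer", "grade": "G4"}}    # new employee
cur = next_snapshot(prev, date(2011, 4, 30), delta)
```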

Description

This is a development project. Its basic function is to extract data from different source systems and load it into a staging database, applying the transformation rules provided by the functional team. From the staging database, the ETL-XML team generates a set of XML files, which are loaded into the WCC database through the WCC MDM tool. After processing these files, WCC triggers messages to EAI and SAP.

The project has two parts: DI and ETL-XML. The DI part takes data from sources such as JDE, mainframe and XLS files and loads it into the staging database, which runs on Informix. The ETL-XML part takes data from the staging database, performs transformations and generates the XMLs consumed by WCC, which loads the data into its own DB2 database. A further layer, the EAI layer, transforms the data and transfers it to SAP to generate response messages.

The project runs on two tracks: Location and Supplier. The Location staging database holds retail data that follows a hierarchy of Segment, Subdivision, Region, Market, Location and Department. The DataStage ETL generates 9 XML files based on these hierarchy items, plus one separate hierarchy XML file; all of these are loaded into WCC. The DataStage ETL tool is used to generate the XML files, and Lombardi and .NET serve as the GUI interfaces for the project.
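The ETL-XML step above (staging rows serialized into XML files for WCC) can be sketched with the standard library. Element and field names are hypothetical, not the actual WCC schema.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of the ETL-XML step: rows read from the staging
# database are serialized into an XML document for WCC to consume.
# Tag and field names are assumptions, not the real WCC format.

def rows_to_xml(root_tag, rows):
    """Serialize staging rows into one XML document string."""
    root = ET.Element(root_tag)
    for row in rows:
        node = ET.SubElement(root, "Location")
        for field, value in row.items():
            ET.SubElement(node, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

staging_rows = [{"LocationId": 9001, "Region": "North", "Market": "Retail"}]
xml_doc = rows_to_xml("LocationFeed", staging_rows)
```

In the real pipeline one such file would be produced per hierarchy item (Segment, Subdivision, Region, and so on), plus the separate hierarchy file.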

Description

Wells Fargo is an American banking company under private ownership, holding a dominant position in banking services. The main aim of developing the Single Platform is to migrate the existing business and logical groupings of raw source files, based on EDW and ADW subject areas, and merge them into the single EIW platform. This project performs Extraction, Transformation and Load (ETL) from DB2 to Teradata using DataStage PX 8.0.1, as per the requirements. The development work is divided into a number of groups according to the requirements.

Description

Cognizant is implementing key applications for capturing its business information through PeopleSoft applications, as part of the Compass initiative. The next step is to make this information available to users for reporting, analysis and informed decision making. This is to be achieved through the Insight Compass project, which implements a Business Intelligence (BI) solution enabling end-to-end visibility into Cognizant’s business. The BI solution aims to tie together customer, associate, project and financial information housed in PeopleSoft and non-PeopleSoft applications, enabling powerful analytics, drilldowns and reporting capabilities, and a single source of management reporting. The solution will leverage the existing work done in the current reporting solution (IBIS), the operational scorecard developed as part of the Reflect project, and the technology investment made in the PeopleSoft EPM solution.

Description

Centrica is the largest supplier of gas to domestic customers in the UK, and one of the largest suppliers of electricity, operating under the trading names Scottish Gas in Scotland and British Gas in the rest of the UK. The Meter to Cash Dashboard project involves developing KPI jobs. The project gathers data from different systems, termed source systems here. This data is extracted, transformed (as per the report logic) and loaded into the VMTC001 (target) database, using IBM WebSphere DataStage (Enterprise Edition) as the ETL tool. The project is divided into multiple releases based on the Centrica roadmap, so it becomes necessary to manage the changing items of the project; thus, configuration management and release management need to be in place. The manual is intended to give the audience a clear explanation of the operations management of the Meter to Cash Dashboard application.
