Kranthi V.

ETL Developer

Malaysia

Experience: 7 Years

68571.5 USD / Year

  • Immediate: Available

About Me

7 years of experience in the IT industry in a data warehouse environment in the banking domain: 6 years of experience as an ETL developer and 1 year of experience in application support. Good knowledge of FSLDM and dimensional modeling. 7 years of experience in Teradata...

Portfolio Projects

ETL Developer and Analyst in the Banking Domain

Contribute

As per MAS 610 regulatory requirements, wrote mappings from the different source tables, developed code and generated reports. Performed unit testing, wrote complex SQLs in Teradata and prepared DataStage jobs.
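To give a flavour of the report queries involved, below is a minimal sketch of a granular asset/liability roll-up in Teradata-style SQL, wrapped in a small Python helper only for parameterisation. The table and column names (EDW.BAL_FACT, CCY_CD, CNTRY_CD, INDUSTRY_CD, ASSET_LIAB_IND, BAL_AMT, RPT_DT) are hypothetical placeholders, not the actual MAS 610 mapping or bank schema.

# Hypothetical sketch only: a granular balance roll-up by currency, country
# and industry, in the spirit of the MAS 610 reporting described below.
# All table and column names are placeholders.

GRANULAR_BALANCE_SQL = """
SELECT  f.CCY_CD       AS currency
     ,  f.CNTRY_CD     AS country
     ,  f.INDUSTRY_CD  AS industry
     ,  SUM(CASE WHEN f.ASSET_LIAB_IND = 'A' THEN f.BAL_AMT ELSE 0 END) AS asset_amt
     ,  SUM(CASE WHEN f.ASSET_LIAB_IND = 'L' THEN f.BAL_AMT ELSE 0 END) AS liability_amt
FROM    EDW.BAL_FACT f
WHERE   f.RPT_DT = DATE '{report_date}'
GROUP BY 1, 2, 3
"""

def build_report_sql(report_date: str) -> str:
    """Return the report SQL for one reporting date (YYYY-MM-DD)."""
    return GRANULAR_BALANCE_SQL.format(report_date=report_date)

if __name__ == "__main__":
    # In the real project this ran via BTEQ / DataStage; here we only print it.
    print(build_report_sql("2020-10-01"))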

Description

The Monetary Authority of Singapore (MAS) has issued revised regulatory requirements which set out the revised reporting standards for banks in Singapore. These changes will take effect on 1 October 2020. This is in line with MAS’ objectives to collect data in machine-readable format and to reduce duplicate data submissions by financial institutions (FIs).

The key changes to the regulatory requirements include:

(i) Collecting more granular data of banks’ assets and liabilities by currency, country and industry. Greater granularity allows better identification of potential risks to the banking system;

(ii) Rationalizing the collection of data on RMB business activities and deposit rates. The standardized requirements will provide greater consistency and reusability of the data; and

(iii) Removing the Domestic Banking Unit and Asian Currency Unit, and for banks to report their regulatory returns in Singapore dollar and foreign currency instead.

Description

OCBC EDW has upstream and downstream flows. Data from multiple sources is loaded into the core EDW (upstream). From the EDW, the data is transformed as per different user requirements and loaded into multiple datamarts such as ALMDM, GFRDM, RPMDM, RRDM, etc.
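As a rough sketch of this upstream/downstream split, the snippet below routes rows of one core-EDW extract into per-datamart feeds. It assumes a hypothetical CSV extract with a subject_area column; the real per-datamart transformations were driven by user requirements, not a single flag.

import csv

# Illustrative routing rules only; the datamart names are from the project,
# the filter conditions and record layout are assumptions.
DATAMART_RULES = {
    "ALMDM": lambda r: r["subject_area"] == "ASSET_LIABILITY",
    "GFRDM": lambda r: r["subject_area"] == "FINANCE",
    "RPMDM": lambda r: r["subject_area"] == "RISK",
    "RRDM":  lambda r: r["subject_area"] == "REGULATORY",
}

def split_edw_extract(path: str) -> dict:
    """Bucket rows from a core-EDW extract into one feed per downstream datamart."""
    feeds = {name: [] for name in DATAMART_RULES}
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            for name, rule in DATAMART_RULES.items():
                if rule(row):
                    feeds[name].append(row)
    return feeds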

ETL Developer

Contribute

Prepared LLD documents and source-to-target mappings, wrote complex BTEQ scripts for transformations and loading, prepared DataStage jobs and Autosys JILs, performed unit testing and provided application support.
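For the scheduling part, here is a minimal sketch of rendering an Autosys JIL command-job definition from Python; the job name, machine and script path are invented for illustration and are not the actual PRDS batch setup.

# Hypothetical JIL generation sketch; attribute values are placeholders.
JIL_TEMPLATE = """\
insert_job: {job_name}   job_type: CMD
command: {command}
machine: {machine}
owner: etl_batch
days_of_week: all
start_times: "02:00"
description: "Daily PRDS customer load (illustrative only)"
"""

def render_jil(job_name: str, command: str, machine: str) -> str:
    """Fill the JIL template for one command job."""
    return JIL_TEMPLATE.format(job_name=job_name, command=command, machine=machine)

if __name__ == "__main__":
    print(render_jil(
        job_name="PRDS_CUST_DAILY_LOAD",
        command="/app/prds/scripts/load_customer.sh",
        machine="prds_etl_node",
    ))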

Description

PRDS (Party Referential Data Store) is an ADS (Authorized Data Store) for client/customer data in the bank. It holds customer/client data, customer-to-account linkage and AML/KYC information received from multiple upstream applications and sends it to multiple downstream applications. PRDS is a component of the Master Reference Data Strategy, serving as a data store to support enterprise initiatives and as a repository for combined wholesale and retail customers.
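A minimal, assumed sketch of the three kinds of records PRDS holds (party, party-to-account linkage, AML/KYC status); the field names are illustrative and not the bank's actual data model.

from dataclasses import dataclass
from datetime import date

@dataclass
class Party:
    party_id: str
    party_name: str
    party_type: str          # e.g. "RETAIL" or "WHOLESALE"

@dataclass
class PartyAccountLink:
    party_id: str
    account_id: str
    relationship_cd: str     # e.g. "PRIMARY", "JOINT"

@dataclass
class AmlKycStatus:
    party_id: str
    kyc_status: str          # e.g. "VERIFIED", "PENDING"
    last_review_dt: date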

Description

The aim of CIF MDM is to create master customer and account tables and views in an Oracle database as part of the mainframe migration. In this project, we receive the source data as .dat files from the CIF servers, perform basic cleansing and other transformations specific to the client requirements, and load it into the stage tables. We then create views by joining different stage tables and applying transformations. These views are used as the source for the CODS process.
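A hedged sketch of the basic cleansing step, assuming a hypothetical pipe-delimited .dat layout of CUST_ID|CUST_NAME|BRANCH_CD; the real layout and cleansing rules came from the client mapping documents.

from typing import Optional

def cleanse_record(line: str) -> Optional[dict]:
    """Cleanse one raw .dat line into a stage-ready record, or reject it."""
    fields = [f.strip() for f in line.rstrip("\n").split("|")]
    if len(fields) != 3 or not fields[0]:
        return None                              # malformed or keyless row
    cust_id, cust_name, branch_cd = fields
    return {
        "CUST_ID": cust_id,
        "CUST_NAME": cust_name.upper() or None,  # standardise case, blank -> NULL
        "BRANCH_CD": branch_cd.zfill(4),         # pad branch code to 4 digits
    }

print(cleanse_record("1001|john tan |7\n"))
# {'CUST_ID': '1001', 'CUST_NAME': 'JOHN TAN', 'BRANCH_CD': '0007'}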

ETL Developer

Contribute

Understood mainframe COBOL programming logic, migrated mainframe jobs to DataStage, prepared source-to-target mappings, used DataStage 9.1 extensively for ETL processing and created drafts in Control-M.

Description

The CODS Transactions module contains details of all the different types of transactions performed by customers in North America. In this project, we receive the source data as COBOL copybook files from different servers, perform basic cleansing and other transformations specific to the client requirements, and load it into the stage and target tables, which form the ODS. Further, based on the COBOL program logic, we generate outbound files. The main intention of this project is to migrate from mainframe to ETL.
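As an illustration of the copybook-driven parsing, the sketch below unpacks one fixed-width record; the offsets, lengths, implied-decimal handling and field names are assumptions, not the actual CODS copybook.

# Hypothetical layout: (field, start, length), echoing COBOL PIC X / PIC 9 fields.
LAYOUT = [
    ("ACCT_NO",   0, 10),
    ("TXN_TYPE", 10,  2),
    ("TXN_AMT",  12, 11),   # e.g. PIC 9(9)V99, implied 2 decimals
    ("TXN_DATE", 23,  8),   # YYYYMMDD
]

def parse_record(line: str) -> dict:
    """Slice one fixed-width record into named fields and re-apply the implied decimal."""
    rec = {name: line[start:start + length].strip() for name, start, length in LAYOUT}
    rec["TXN_AMT"] = int(rec["TXN_AMT"]) / 100
    return rec

print(parse_record("0012345678DP0000012345620200115"))
# {'ACCT_NO': '0012345678', 'TXN_TYPE': 'DP', 'TXN_AMT': 1234.56, 'TXN_DATE': '20200115'}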

Description

HSBCNET is one of the important applications in HSBC. It contains the corporate users' profile and audit information. In this project, we receive the source data as log files from different servers, perform basic cleansing and other transformations specific to the client requirements, and load it into the stage and details tables, which form the EDW. The data is later aggregated and loaded into summary tables.
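A brief sketch of the detail-to-summary step described above, assuming each detail row is a dict with business_dt, user_id and action fields; the real summary grain and measures came from the client requirements.

from collections import Counter

def summarise(detail_rows):
    """Roll detail audit events up to one count per day, user and action."""
    counts = Counter((r["business_dt"], r["user_id"], r["action"]) for r in detail_rows)
    return [
        {"business_dt": d, "user_id": u, "action": a, "event_cnt": n}
        for (d, u, a), n in counts.items()
    ]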

Description

P2G and HSBCNET are two different applications: P2G deals with end users' profile and audit details, whereas HSBCNET deals with corporate users' profile and audit details. The architecture of the two applications is the same: we receive the source data as log files from different servers, perform basic cleansing and other transformations specific to the client requirements, and load it into the stage and details tables, which form the EDW. The data is later aggregated and loaded into summary tables.

Description

The objective of this project is to maintain a centralized repository for the EDW metadata (data definitions) captured from data modeling tools, relational databases, XML schemas, COBOL copybooks and other sources, along with the details of the transformations applied in the ETL tools.
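One way to picture a single entry in such a repository, as an assumed sketch: where a target column's definition came from and which transformation produced it. The field names are illustrative only.

from dataclasses import dataclass

@dataclass
class LineageEntry:
    source_system: str       # e.g. a data model, XML schema or COBOL copybook
    source_field: str
    target_table: str
    target_column: str
    transformation: str      # rule applied in the ETL tool

example = LineageEntry("CIF_COPYBOOK", "CUST-NAME", "STG_CUSTOMER",
                       "CUST_NAME", "TRIM + UPPER")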
