Venu V.

Service Info Developer III

Commitment
0/5
Competency
0/5
Reliability
0/5
  • Overall Experience: 13 Years
  • Agile Software Development
  • Amazon Relational Database Service
  • Application Lifecycle Management
  • Artificial Intelligence
  • Big Data

Time zones ready to work

  • Eastern Daylight [UTC -4]
  • Central Daylight [UTC -5]
  • Mountain Daylight [UTC -6]
  • Pacific Daylight [UTC -7]
  • Eastern European [UTC +2]
  • Eastern EST [UTC +3]
  • Greenwich Mean [UTC ±0]
  • Further EET [UTC +3]
  • Australian CDT [UTC +10:30]
  • Australian EDT [UTC +11]
  • Dubai [UTC +4]
  • New Delhi [UTC +5:30]
  • China (West) [UTC +6]
  • Singapore [UTC +7]
  • Hong Kong (East China) [UTC +8]

Willing to travel to client location: Yes  

About Me 

8.1 years of experience in the development of various projects across domains such as banking, insurance, retail marketing and healthcare using ETL tools. Expertise in the ETL tool IBM InfoSphere DataStage 11.5. Knowledge of data warehouse concepts such as star schema, snowflake schema, and dimension and fact tables. Hands-on experience in performance tuning: identifying and resolving performance issues in parallel jobs. Experienced in understanding business rules completely and implementing the data transformations and methodology. Hands-on experience in designing, developing and debugging DataStage PX jobs using DataStage Designer. Knowledge of the Big Data tools Hive, Sqoop and HBase, and of ETL testing. A versatile, determined and extremely self-motivated individual who thrives on meeting tight deadlines and is comfortable in a pressurized environment. Broad technical awareness with the ability to communicate at all levels.
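The star-schema concept mentioned above can be illustrated with a minimal sketch: one fact table joined to one dimension table. The table and column names are invented for illustration, and sqlite3 stands in for the project databases.

```python
import sqlite3

# Minimal star schema: a fact table of measures keyed to a dimension table
# of descriptive attributes. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER, amount REAL);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# The typical star-schema query shape: aggregate the fact table,
# described and grouped by dimension attributes.
rows = conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 15.0)]
```

A snowflake schema differs only in that dimension tables are further normalized into sub-dimensions.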

Portfolios

CMR Project - AIMIA

Role:

Project title: CMR Project
Client: Aimia
Role: Senior ETL Consultant
Environment: IBM InfoSphere DataStage 11.3, Oracle

Project Description:

AIMIA, formerly Groupe Aeroplan, is a data-driven marketing and loyalty analytics company based in Montreal, Canada. It has close to 4,000 employees in 20 countries and is publicly listed on the Toronto Stock Exchange. It manages various loyalty programs, including Aeroplan in Canada and Nectar (Italy and the UK), and provides loyalty strategy, program development and management services to clients, underpinned by product and technology platforms such as the AIMIA Loyalty Platform and Smart Button, and through its analytics and insights business, including Intelligent Shopper Solutions. Merchant Rewards is a loyalty program in which the cardholders of AIMIA's customers have a dizzying array of points, miles and cash-back rewards programs to choose from. Customers of AIMIA can exchange the points they have earned by purchasing any product of their choice available through AIMIA's merchant-provided sources. They can also exchange their points for miles by booking rental cars and redeeming their points.

Roles & Responsibility:

Worked on enhancements to update existing DataStage jobs.
Developed and modified DataStage parallel jobs using the Transformer, Join, Lookup and Filter stages to load data into the warehouse tables.
Prepared and executed unit test cases and captured the results.
Integrated individual DataStage PX jobs into DataStage sequences based on their dependencies.
Scheduled DataStage jobs using the Control-M scheduler as per the requirements.
Prepared Control-M scripts for DataStage jobs and ran them to load data into the tables.
Prepared the delivery checklist and the code to be migrated to higher environments.
Participated in code migration activities and verified the code after migration.
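Integrating PX jobs into a sequence based on their dependencies amounts to computing a dependency ordering. A minimal sketch of that ordering, with hypothetical job names and Python's standard-library graphlib standing in for the DataStage sequencer:

```python
from graphlib import TopologicalSorter

# Each job maps to the set of jobs that must finish before it can start.
# Job names are illustrative, not from the actual project.
deps = {
    "load_dim_customer": {"extract_customers"},
    "load_fact_sales":   {"load_dim_customer", "extract_sales"},
    "extract_customers": set(),
    "extract_sales":     set(),
}

# static_order yields jobs so every job appears after its prerequisites.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

A DataStage sequence expresses the same constraints graphically via job activity stages and triggers; the relative order of independent jobs is not fixed.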

Skills: Data Warehousing, Linux, Oracle, PL/SQL, SQL, Unix

Tools: Control-M, PuTTY, WinSCP

UK Business Intelligence - HSBC

Role:

Project title: UK Business Intelligence
Client: HSBC
Role: ETL Consultant
Environment: IBM InfoSphere DataStage 11.3, Teradata

Project Description:

HSBC is one of the largest banks in the world and a growing multinational financial services company. The main aim of the project is to maintain the existing data warehouse and to enhance or create new flows according to the requests received from UK Business. The project uses DataStage 11.3 as the ETL tool and Cognos for its reporting and analysis requirements. The major applications involved in the warehouse for UK business intelligence include the (UCD) mart, SPM, CARDs, Complaints, CTA and the UK Warehouse applications.

Roles & Responsibility:

Developed DataStage parallel jobs using the Transformer, Join, Lookup and Filter stages to load data into the warehouse tables.
Prepared and executed unit test cases and captured the results.
Integrated individual DataStage PX jobs into DataStage sequences based on their dependencies.
Scheduled DataStage jobs using the DataStage scheduler as per the requirements.
Ran jobs from the Director to load data into the tables.
Prepared system test cases for system testing.

Skills: Teradata

Tools: Control-M, PuTTY, WinSCP

Investment ODS - Prudential Financial

Role:

Project title: Investment ODS
Client: Prudential Financial
Role: ETL Consultant
Environment: IBM InfoSphere DataStage 11.5, DB2, SQL Server

Project Description:

Prudential Financial has helped individual and institutional customers grow and protect their wealth. The company is known for delivering on its promises to customers and is recognized as a trusted brand and one of the world's most admired companies. It strives to create long-term value for its stakeholders through strong business fundamentals, consistent with its mission, guided by its vision and directed by its core values.

Regarding the IODS project: previously the business asked us to provide all of the data to Markit Digital and to let them work with MD on the business rules. As the complexities of the business rules became more apparent, there was growing concern in the technology team about information consistency and the risk of multiple interpretations of the business rules. The concern was voiced that we should not leave the interpretation of business rules to any of our consumers, including Markit Digital. IODS's main responsibilities are information acquisition, refinement, mastering and publication; the refining and mastering processes are where the business rules are implemented and the data is prepared for publication.

Roles & Responsibility:

Extracted data from different source systems and loaded it into the target warehouse.
Designed DataStage parallel jobs using the Transformer, Join, Lookup and Filter stages to load data into the warehouse tables.
Prepared and executed unit test cases and captured the results.
Carried out performance tuning and resolved performance issues.
Integrated individual DataStage PX jobs into DataStage sequences based on their dependencies.
Provided timely updates to the mapping team on queries and issues.
Worked in shifts to give continuous support to the onshore team.
Provided technical and investigative support for defects, change requests and reported issues.
Maintained high-quality documentation in the form of application design documents and test cases.
Prepared DataStage sequencers as per client requirements.

Skills: DB2

Tools: FileZilla, PuTTY, WinSCP

Employment

ETL Developer

2011/01 - 2012/02

Skills:

Your Role and Responsibilities:

Migrated jobs from DataStage 7.5 to DataStage 8.1 using DS export.
Identified problems in failing jobs and debugged them.
Modified and ran UNIX scripts.
Executed SQL queries to check the data in the database.
Tested jobs, uploaded files to the dashboard and checked that the data moved to the target database.
Prepared test cases for testing and updated them in HP Quality Center.
Carried out performance tuning and resolved performance issues.
Worked on Application Change Requests (ACRs).
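The SQL data checks mentioned above often reduce to a source-vs-target row-count reconciliation after a load. A minimal sketch, with sqlite3 as a stand-in database and invented table names:

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table):
    """Return (source_count, target_count, match) for a post-load check.

    Table names are interpolated directly, so they must come from a
    trusted list, not user input. Names here are illustrative.
    """
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src == tgt

# Demo with a hypothetical staging table and warehouse table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customers (id INTEGER);
    CREATE TABLE dw_customers  (id INTEGER);
    INSERT INTO stg_customers VALUES (1), (2), (3);
    INSERT INTO dw_customers  VALUES (1), (2), (3);
""")
print(reconcile_counts(conn, "stg_customers", "dw_customers"))  # (3, 3, True)
```

Real checks usually go beyond counts (checksums, key-level diffs), but the count comparison is the first gate after any load.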

Data Specialist

2012/03 - 2018/04

Skills: Data Cleansing, DB2

Your Role and Responsibilities:

Extracted data from different source systems and loaded it into the target warehouse.
Designed DataStage parallel jobs using the Transformer, Join, Lookup and Filter stages to load data into the warehouse tables.
Prepared and executed unit test cases and captured the results.
Carried out performance tuning and resolved performance issues.
Integrated individual DataStage PX jobs into DataStage sequences based on their dependencies.
Provided timely updates to the mapping team on queries and issues.
Worked in shifts to give continuous support to the onshore team.
Provided technical and investigative support for defects, change requests and reported issues.
Maintained high-quality documentation in the form of application design documents and test cases.
Prepared DataStage sequencers as per client requirements.

Senior Developer

2018/05 - 2018/07

Skills: Teradata

Your Role and Responsibilities:

Developed DataStage parallel jobs using the Transformer, Join, Lookup and Filter stages to load data into the warehouse tables.
Prepared and executed unit test cases and captured the results.
Integrated individual DataStage PX jobs into DataStage sequences based on their dependencies.
Scheduled DataStage jobs using the DataStage scheduler as per the requirements.
Ran jobs from the Director to load data into the tables.
Prepared system test cases for system testing.

Service Info Developer III

2018/09 -

Skills: Data Warehousing, Linux, Oracle, PL/SQL, SQL, Unix

Your Role and Responsibilities:

Worked on enhancements to update existing DataStage jobs.
Developed and modified DataStage parallel jobs using the Transformer, Join, Lookup and Filter stages to load data into the warehouse tables.
Prepared and executed unit test cases and captured the results.
Integrated individual DataStage PX jobs into DataStage sequences based on their dependencies.
Scheduled DataStage jobs using the Control-M scheduler as per the requirements.
Prepared Control-M scripts for DataStage jobs and ran them to load data into the tables.
Prepared the delivery checklist and the code to be migrated to higher environments.
Participated in code migration activities and verified the code after migration.

Education

2003 - 2005


2000 - 2003


Skills

Agile Software Development, Amazon Relational Database Service, Application Lifecycle Management, Artificial Intelligence, Big Data

Tools

WinSCP, PuTTY, FileZilla

Preferred Languages

English - Fluent