Shifa M.

ETL Lead, Data Analytics, Data Scientist

Bengaluru, India

Experience: 6 Years

10971.4 USD / Year

  • Immediate: Available

About Me

  • 6.5 years of experience in ETL tools and Data Warehousing.
  • Worked on ETL tools such as Informatica PowerCenter 9.1, Informatica PowerCenter 10.1.0, SSIS, SAP BODS, and Talend Open Studio in data conversion and transformation projects u...
  • Strong knowledge of Data Warehousing concepts.
  • Experience in data modeling using Star Schema and Snowflake Schema.
  • Strong skills in data analysis and data mapping for ETL processes.
  • Participated in requirement analysis.
  • Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions.
  • Extensive experience in ETL design, development, and maintenance using SQL and Informatica PowerCenter 8.6 and 9.x.
  • Well versed in developing complex SQL queries, unions, and multiple-table joins, with knowledge of indexes, views, and functions.
  • Implemented Slowly Changing Dimensions (Type I and Type II).
  • Involved in quality assurance (unit testing, integration testing, negative testing) and experienced at creating effective test data to ensure successful execution of the data loads.
  • Excellent documentation and presentation skills.
  • Excellent verbal and written communication skills; willing and quick to learn.
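The Slowly Changing Dimension work mentioned above (Type I and Type II) follows a standard pattern: on a tracked attribute change, Type II expires the current dimension row and inserts a new version. A minimal sketch in plain Python; the `city` attribute, key values, and dates are hypothetical illustrations, not data from any project here:

```python
from datetime import date

def scd2_upsert(dim_rows, business_key, incoming, as_of):
    """SCD Type 2: expire the current row and append a new version
    when a tracked attribute ('city' here) changes."""
    current = next((r for r in dim_rows
                    if r["key"] == business_key and r["current"]), None)
    if current is None or current["city"] != incoming["city"]:
        if current is not None:
            current["end"] = as_of          # expire the old version
            current["current"] = False
        dim_rows.append({"key": business_key,
                         "city": incoming["city"],
                         "start": as_of, "end": None, "current": True})
    return dim_rows

dim = [{"key": 101, "city": "Pune", "start": date(2019, 1, 1),
        "end": None, "current": True}]
scd2_upsert(dim, 101, {"city": "Bengaluru"}, date(2020, 6, 1))
# dim now holds two versions of key 101; only the newest is current.
```

In Informatica or Talend the same pattern is typically built with a lookup followed by update/insert flows; a Type I dimension would simply overwrite the attribute in place instead of versioning.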

Portfolio Projects

Description

Project Name: Network International

Domain: Banking

Team size: 4

Description: Network International provides technology-enabled payment solutions to merchants and financial institutions in the Middle East. It deals with Card Acceptance, Authorization and Processing, ATM Management and Monitoring, Merchant Acquiring, Consumer Finance Services, Point-of-Sale (POS), Merchant Settlement and Chargeback Services, Dynamic Currency Conversion, E-commerce Services, Loyalty Programmes, Multi-Currency, and Fraud Monitoring.

Mashreq Bank needed to be on-boarded on Mercury Payment Systems as a PoS and ATM acquirer, enabling Mercury card acceptance on Mashreq Bank merchant/PoS terminals and ATM terminals. ADIB needed to be on-boarded on Mercury Payment Systems as a PoS acquirer, enabling Mercury card acceptance on ADIB merchant/PoS terminals.

Diners card reports were generated using Talend and a DataMart, calculating Diners transaction and discount amounts and incorporating them with the other schemes (Visa, MC, AMEX, CUP, JCB, MERCURY, Private Label).

Role and Responsibilities:

  • Enhanced Talend mappings to accommodate Mashreq Bank as a PoS and ATM acquiring member.
  • Enhanced Talend mappings to accommodate ADIB Bank as a PoS acquiring member.
  • Input data came from flat files, XML files, or databases.
  • Used the Talend Administration Center Job Conductor to schedule ETL jobs on daily, weekly, monthly, and yearly bases.
  • Created complex SQL scripts to get data from various tables, then transformed and normalized it in Talend to create reports.
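The report logic described above (calculating transaction and discount amounts per card scheme) amounts to a per-scheme aggregation. A rough sketch; the field names, schemes, and discount rates are hypothetical placeholders, not the project's actual rules, which would live in the DataMart:

```python
from collections import defaultdict

# Hypothetical per-scheme discount rates, for illustration only.
DISCOUNT_RATES = {"DINERS": 0.025, "VISA": 0.015, "MC": 0.015}

def scheme_totals(transactions):
    """Aggregate transaction and discount amounts per card scheme."""
    totals = defaultdict(lambda: {"amount": 0.0, "discount": 0.0})
    for txn in transactions:
        scheme = txn["scheme"]
        totals[scheme]["amount"] += txn["amount"]
        totals[scheme]["discount"] += txn["amount"] * DISCOUNT_RATES.get(scheme, 0.0)
    return dict(totals)

# Made-up sample transactions; a real job would read them via Talend.
txns = [{"scheme": "DINERS", "amount": 1000.0},
        {"scheme": "VISA", "amount": 200.0},
        {"scheme": "DINERS", "amount": 500.0}]
report = scheme_totals(txns)
```

In the actual job this grouping would be expressed with Talend components and SQL rather than Python, but the shape of the calculation is the same.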

Description

Project Name: United States Cellular Corporation (USCC)

Domain: Consumer and Industrial Products

Team size: 12

Description: United States Cellular Corporation (USCC) is a regional carrier that owns and operates the fifth-largest wireless telecommunications network in the United States.

USCC underwent a major technology transformation to update its billing system, CRM, financial and order management systems, enterprise data warehouse, and self-care portal.

Deloitte USI helped them with an assessment of their testing work and with setting up a Testing CoE.

Role and Responsibilities:

  • Validated the movement of data elements on a field-by-field basis from source to target; validation was done using the approved source-to-target mapping sheet.
  • Designed test cases for each column.
  • Checked for data integrity and data correctness.
  • Ran historic ETL loads and executed the designed test cases.
  • Identified and raised defects.
  • Ran incremental ETL loads and executed incremental test cases.
  • Identified and raised incremental defects.
  • Checked query performance and optimized queries.
  • Listed the counts of null, zero, positive, and negative values for each KPI column and submitted this report to the client.
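The KPI profiling step above (counting nulls, zeros, positives, and negatives per column) is a simple column scan. A minimal sketch, with made-up sample values; a real run would read the loaded target table instead:

```python
def profile_kpi_column(values):
    """Count nulls, zeros, positives, and negatives in one KPI column."""
    counts = {"null": 0, "zero": 0, "positive": 0, "negative": 0}
    for v in values:
        if v is None:
            counts["null"] += 1
        elif v == 0:
            counts["zero"] += 1
        elif v > 0:
            counts["positive"] += 1
        else:
            counts["negative"] += 1
    return counts

# Hypothetical sample column for illustration.
sample = [12.5, 0, None, -3, 7, None, 0]
stats = profile_kpi_column(sample)
```

Running this per KPI column and tabulating the four counts yields exactly the kind of report submitted to the client.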

Description

Domain: Consumer and Industrial Products

Team size: 8

Description: Clarivate Analytics owns and operates a collection of leading subscription-based businesses focused on scientific and academic research, patent analytics and regulatory standards, pharmaceutical and biotech intelligence, trademark protection, domain brand protection and intellectual property management.

The key objective of Project Elevate was to lead and facilitate the end-to-end implementation of core business information systems while supporting the transformation of the business.

I was involved in mapping and transforming finance objects such as Vendor Bill, Vendor, and Purchase Order from SAP to NetSuite using the Talend ETL tool. Data was loaded from SAP to Oracle, then Talend was used to map and convert the Oracle source data into the NetSuite-required format using components such as tWriteJSON and tMap.

Role and Responsibilities:

  • Created Talend jobs (both design and code) to process data from the Oracle source to staging and then to NetSuite.
  • Used Talend components such as tMap, tFilterRow, tJava, tOracle, tWriteJSON, tFileInputDelimited, tLogRow, and tJavaRow.
  • Worked on Joblets (reusable code) and Java routines in Talend.
  • Created local and global context variables in jobs.
  • Responsible for creating lookup and staging tables and other database objects such as views, functions, indexes, and constraints.
  • Implemented error handling in Talend jobs.
  • Followed the organization-defined naming conventions for flat file structures and Talend jobs.
  • Tuned sources, targets, and jobs to improve performance.
  • Developed complex ETL jobs to source from flat files and load into target databases using the Talend OS ETL tool.
  • Interacted with the business community and gathered requirements based on changing needs; incorporated identified factors into Talend jobs to build the Data Mart.
  • Performance tuning: used tMap cache properties, multi-threading, and parallelized components for better performance with large source data; tuned SQL source queries to restrict unwanted data in the ETL process.
  • Involved in preparing detailed design and technical documents from functional specifications.
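The Oracle-to-NetSuite conversion described in this project (a tMap field mapping followed by tWriteJSON serialization) can be approximated in Python. The source column names and the target JSON shape below are hypothetical illustrations, not the actual NetSuite schema:

```python
import json

# Hypothetical source-to-target field map (what tMap would express visually).
FIELD_MAP = {"VENDOR_ID": "entityid",
             "VENDOR_NAME": "companyname",
             "PO_NUMBER": "tranid"}

def row_to_netsuite_json(row):
    """Rename Oracle columns per the mapping and serialize to JSON,
    mimicking a tMap + tWriteJSON step."""
    payload = {target: row[source] for source, target in FIELD_MAP.items()
               if source in row}
    return json.dumps(payload, sort_keys=True)

doc = row_to_netsuite_json({"VENDOR_ID": "V-001",
                            "VENDOR_NAME": "Acme Supplies",
                            "PO_NUMBER": "PO-778"})
```

In the Talend job this mapping is drawn graphically in tMap and the JSON structure is configured in the tWriteJSON component rather than written in code.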
