Gagandeep
I have a total of 12 years of experience in the IT industry.
For the past 4.5 years I have been working on technologies like:
- Apache Spark
- Python/PySpark
Developed PySpark code that reads data from a database source, performs data manipulations such as grouping, sorting, and joins depending on the use-case requirements, and stores the result back in a database table, which web services then use to display the information on the web. Prepared graphs and visualizations for different business needs, e.g. identifying the age group in which a particular fund is most popular.
Skills: Apache Cassandra, PySpark
SCV 2.0 (Single Customer View)
Developed Informatica PowerCenter mappings to perform the initial data load from the legacy database to the SCV 2.0 Cassandra database, which has a flattened data model compared to the legacy model.
Developed PySpark code that reads real-time data and, after applying the required business logic and data quality checks, stores it in Cassandra tables. Nightly PySpark jobs then read these tables and generate EOD feeds for the warehouse team; these feeds are made available in HDFS for downstream consumers.
In addition, analysis is performed on top of these EOD feeds using Hive to assist portfolio management teams, e.g. identifying the top-performing funds in a given geography or the fund most popular with each age group.
My responsibilities as part of the Informatica India team included analyzing the BRD (Business Requirement Document) and then working with the business analyst team to discuss and close any open points in it. This covered new BRDs as well as older BRDs requiring updates.
After a BRD was finalized, I implemented the code per the document, performed unit testing, and obtained sign-off from the business before moving the code to production.
Skills: Informatica, Shell Scripting, SQL, Unix
Plan Sponsor Site
I was part of the ETL team for this project, where my role was to handle monthly data processing for all clients. Three categories of service are offered for this product: DB (Defined Benefit), DC (Defined Contribution), and HW (Health & Welfare). We received client-specific fixed-width flat files each month, depending on the services each client had opted for. These files are loaded into the data warehouse using Unix and Informatica. The data is then used to prepare data sources, which are delimited flat files, and Cognos cubes are built on top of these data sources.
I also resolved data issues raised by clients, tracing an issue (e.g. a mismatching figure in a report) back to the source file to determine whether something had gone wrong in our data processing or the discrepancy was already present in the source file.
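A minimal Python sketch of parsing one record from a fixed-width client file of the kind described above. The field names and widths are illustrative assumptions, not the actual client layout (the real loading was done in Informatica).

```python
# Hypothetical fixed-width layout: (field name, width) pairs.
LAYOUT = [("client_id", 6), ("service", 2), ("amount", 10)]

def parse_fixed_width(line: str, layout=LAYOUT) -> dict:
    """Slice one fixed-width record into named, stripped fields."""
    record, pos = {}, 0
    for name, width in layout:
        record[name] = line[pos:pos + width].strip()
        pos += width
    return record

# "service" would carry the category the client opted for: DB, DC, or HW.
sample = "C00042DB0000123.50"
print(parse_fixed_width(sample))
```

Validating each record against such a layout before loading is also a cheap way to catch the "was it wrong in the source file?" class of issue early.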
In this project, my role was to update data (received daily from different sources as flat files) in staging tables using Informatica and Unix. Worked on various enhancements at the Informatica level to improve the quality of the data delivered and make the system more stable. Identified and implemented automation techniques that reduced manual monitoring effort and the turnaround time for resolving issues.
Apart from this, my responsibilities included looking out for system enhancements to reduce manual effort and improve performance and stability, and mentoring new team members.
Skills: Informatica, Shell Scripts, SQL, Unix
Core banking Project
I was a member of the application support team, performing daily back-end tasks such as monitoring the system for congestion or performance issues and running end-of-day activities, including interest calculations for deposit and loan accounts. Other activities included month-end and year-end processing and distributing files to all branches live on centralised banking.
Skills: Shell Scripts, SQL, Unix