Vijay P.

Data Analyst at Barnes and Noble

New Delhi, India

Experience: 9 Years

About Me

  • A seasoned IT professional with around 8 years of progressive experience in BI/Data Warehousing and Machine Learning using R and Python.
  • Previously worked at Capgemini, Eccella Consulting, and Deloitte, from March 2010 to May 2013 and May 2013 to May 2014 respectively.
  • Currently associated with StudyMode (Barnes & Noble), working as a Data Analyst and Analytics/ML consultant.
  • Experience in Data Science, Machine Learning, and Statistical Data Analytics using R and Python, including exploratory and descriptive data analysis with ggplot2.
  • Two years of experience teaching Mathematics and Physics for IIT JEE.
  • Associated with www.goiit.com, a leading education website, during college; worked as an online subject expert and counsellor for students across the globe.

Portfolio Projects

Description

The project aimed at Customer Churn Analysis: identifying customer loyalty through descriptive analytics and providing insights on how to retain the company's existing customers. It involved a deep understanding of the data, which was analyzed with SQL, regression, correlation, and the programming languages Python and R.

Role:

  1. Fetch the data from the database based on business criteria and load it into a CSV file for further processing in Python.
  2. Calculate the tenure of each customer and create a new variable that indicates the customer's loyalty to the organization.
  3. Select the important variables and deep-dive into the patterns in the data set to understand customer sentiment.
  4. Develop a predictive model to classify whether a customer will churn or not.
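The steps above can be sketched as a minimal Python workflow. This is an illustrative assumption, not the project's actual code: the column names, dates, and the tiny in-memory frame (standing in for the CSV extracted from SQL) are all made up, and logistic regression is one plausible choice for the churn classifier.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Step 1 stand-in: the project loaded a CSV extracted via SQL; here we fake a tiny frame.
df = pd.DataFrame({
    "signup_date": pd.to_datetime(["2015-01-01", "2016-06-15", "2018-03-10", "2014-11-20"]),
    "monthly_spend": [20.0, 55.0, 10.0, 75.0],
    "churned": [1, 0, 1, 0],  # hypothetical target label
})

# Step 2: derive tenure (days since signup) as a loyalty indicator.
as_of = pd.Timestamp("2019-01-01")
df["tenure_days"] = (as_of - df["signup_date"]).dt.days

# Steps 3-4: select variables and fit a simple churn classifier.
X, y = df[["tenure_days", "monthly_spend"]], df["churned"]
model = LogisticRegression().fit(X, y)
preds = model.predict(X)
```

In practice the model would be evaluated on held-out data rather than the rows it was trained on.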

Description

StudyMode is an online education platform spread across the US, Canada, Europe, Australia, and South America that helps students prepare for their education; it has been acquired by Barnes & Noble Education.

Role:

  1. As a Business Analyst, keep track of the business, perform data analysis, and help the business run smoothly.
  2. Build and maintain dashboards/reports to track various KPIs.
  3. Track offers and discounts and report on their performance.
  4. Prepare financial metrics and deferral calculations every month.
  5. Perform deep analysis of any data anomalies and find the root cause.
  6. Report any data discrepancies in the live system and alert the business.
  7. Write complex SQL and Python scripts to extract raw data for further analysis.
  8. Reduced report-generation time from 30 minutes to 1 second by writing a Python script that extracts the data and generates the report automatically.
  9. Monitor key metrics such as revenue per unit, gross margin, provisioning liquidation, conversion, average order value, customer acquisition, top-performing styles, etc.
  10. Measure the effectiveness of marketing campaigns to evaluate and improve future performance.
  11. Identify, analyze, and interpret trends or patterns in complex data sets; filter and clean data by reviewing computer reports.
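The SQL-plus-Python extraction described above can be sketched roughly as follows. This is a hedged illustration, not the production script: sqlite3 stands in for the actual database, and the `orders` table, column names, and KPI are invented for the example.

```python
import sqlite3
import pandas as pd

# Stand-in for the production database connection (assumption: sqlite3 in memory).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, revenue REAL);
    INSERT INTO orders VALUES (1, 'US', 120.0), (2, 'EU', 80.0), (3, 'US', 40.0);
""")

# The production SQL was far more complex; here a simple KPI aggregate
# (revenue and average order value per region) illustrates the pattern.
report = pd.read_sql_query(
    "SELECT region, SUM(revenue) AS total_revenue, AVG(revenue) AS avg_order_value "
    "FROM orders GROUP BY region ORDER BY region",
    conn,
)
```

Scheduling a script like this (and writing `report` out to a file) is what turns a 30-minute manual pull into a one-command refresh.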

Description

The client is a global automobile manufacturing company. The project involved collecting data from various sources, aggregating it into one common place in the data warehouse, performing an initial level of cleanup there, and finally gathering the required data for analysis in R. This included finalizing the attributes to be selected from various tables via joins and loading the data into an R data frame.

Perform descriptive and exploratory data analysis using various R packages such as the base package, ggplot2, and dplyr. Generate the analysis report in HTML using R Markdown and R Notebook, to be sent to peers for review. The goal was to analyze customers' purchase behavior across four-wheeler segments and, based on the data, predict the range in which a customer will spend, so the right offers and promotions can be targeted to increase sales.

Perform data cleaning, data transformation, and feature engineering; build the model on the training data and evaluate it on the test data to measure accuracy.

We used various algorithms such as rpart, Random Forest, glm, and Gradient Boosting to find the best performer, and used repeated k-fold cross-validation to trade off the variance across the models built in each iteration.

Finally, write the newly predicted column into the target table directly from R via the RODBC package.
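The model-comparison workflow above (rpart, Random Forest, glm, and Gradient Boosting with repeated k-fold cross-validation) was done in R; an equivalent sketch in Python with scikit-learn looks like the following. The synthetic data and all parameters are illustrative assumptions, and the scikit-learn estimators are named analogues of the R packages, not the same implementations.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RepeatedKFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the customer-purchase data set.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Repeated k-fold reduces the variance of any single train/test split.
cv = RepeatedKFold(n_splits=5, n_repeats=3, random_state=0)

models = {
    "tree": DecisionTreeClassifier(random_state=0),     # rpart analogue
    "rf": RandomForestClassifier(random_state=0),       # Random Forest
    "glm": LogisticRegression(max_iter=1000),           # glm analogue
    "gbm": GradientBoostingClassifier(random_state=0),  # Gradient Boosting
}
scores = {name: cross_val_score(m, X, y, cv=cv).mean() for name, m in models.items()}
best = max(scores, key=scores.get)
```

Comparing mean cross-validated scores this way is how the "best performance" among the candidate algorithms would be chosen before the winning model's predictions are written back to the database.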

Description

The client is a global biotechnology tools company providing premier systems, consumables, and services for scientific researchers around the world. The project comprises various small sub-projects that enhance and support the BI jobs.

These sub-projects include DW, FTP, and EDW jobs that run every day and ultimately trigger cube processing.

Overall, the aim is to keep the daily job train running efficiently so it produces the reports that are ultimately reviewed and discussed by the business.

The project involves managing the client's EDW (Enterprise Data Warehouse) and EDM (Enterprise Data Management). The EDW is loaded daily with data from the APAC (Asia Pacific), Europe, and North America regions. The loads are scheduled through tools such as Control-M and Maestro, with ETL loads running on Informatica PowerCenter and BO Data Integrator. Finally, the data is presented to users through Cognos reports and cubes. Data quality is maintained with the Informatica Data Quality application, and enhancements and issues are tracked and resolved through the Remedy tool. Overall, the applications are managed so that critical decision-making data is available to business users at the right time.

Vijay was introduced to a live Informatica project through Life Tech. He began by actively monitoring EDW loads using Control-M and PowerCenter 9.1.1.
Through internal knowledge-transfer sessions on the DW and E1 jobs, he developed an understanding of the data flow. He has also worked with the team on various enhancement requests that required database and ETL knowledge, and has observed several issue resolutions in the course of monitoring loads.
