- Seventeen years of IT experience with a focus on designing and implementing Data Warehousing, Data Marts, EDW, and Data Analytics.
- Eight years of experience driving various data analytics programs, including six years leading offshore resources.
- Three years of experience in the financial industry (Lehman Brothers and Moody's Investors Service) and the hospitality industry (IHG).
- Eight years of experience in the telecommunications industry (AT&T, Verizon, and RIM).
- Two years of experience in the auto industry (GPC/NAPA).
- Four years of experience in Big Data and cloud technologies (Google Cloud).
- Google Cloud Certified Associate Cloud Engineer (Linux Academy)
- LinkedIn - Transitioning from Individual Contributor to Manager
- Programming for Everybody (Python) - License: BVDVFPNWCPJ5
- The Data Scientist’s Toolbox (DataCamp) - License: PUWSYAT95DK3
- Intro to Python for Data Science (DataCamp) - License: 4805515
- Intermediate Python for Data Science (DataCamp) - License: 4832292
- Introduction to R Course (DataCamp) - License: 3998794
- Informatica Certified Mapping Developer
- Oracle 9i PL/SQL Developer Certified Associate (SQL and PL/SQL)
- Teradata Certified Professional V2R5
Data & Analytics
IHG Sept 2016 – Current
Solution Lead / Sr. Product Owner
The team was responsible for supporting the CMH Bonus Plan, the IHG Loyalty (rewards/membership) program, New Brand Integration, and SMART (Sales Mart). We were also responsible for data quality issues raised by the Legal, Sales, and Reservation business teams, and served as IT data stewards for enterprise reference data.
- Used GCP BigQuery to analyze colleague data by comparing it with legacy Teradata tables.
- Coordinated with solution architects to identify sources, define conformed and junk dimensions, and design ETL logic.
- Coordinated with offshore resources to delegate various ETL development and data analysis tasks.
- Led the cyber-breach analysis to assess the impact of a potential breach at 2,000+ hotels, saving about $2 million by identifying and de-duplicating potentially impacted customers.
- Led many code reviews for ongoing development, bug fixes, and data analysis.
- Analyzed eight years of billing data for AMEA hotels to identify overstated room revenues that resulted in higher fees collected from various hotels.
- Worked strategically to reduce the Data Quality ticket count from 95 to 18 over a span of 6 months.
- Worked with Booking, Commissions, and other teams to load and analyze data from Regent and Principal hotels, mapping their metrics (rate category, rate code, sales channels) to IHG data.
- Saved IHG around $50K per hotel: our analysis proved that only $400K of revenue had been overcharged, versus the $800K initially reported by the hotels.
- Supported Legal teams by providing subpoena information on hotel guests.
- Led many data analysis initiatives related to IHG Loyalty members and their rewards.
- Designed and developed a Data Quality Engine to report the metadata match percentage between the data model (Erwin) and databases (Oracle and Teradata).
- Worked with the Revenue Analytics business team to define use cases accounting for mismatches between guest reservations and the corresponding stay information.
- Worked as IT steward for enterprise reference data, handling changes proposed for ISO and non-ISO standard reference data.
Environment: GCP (Google Cloud), Hortonworks HDP, Zeppelin 0.7.0, HDFS, Hive, Teradata 15.00, Informatica PowerCenter 9.6.1, Oracle 11g, Linux, and VersionOne.
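As an illustration of the BigQuery-versus-legacy-Teradata comparison described above, a minimal reconciliation sketch might look like the following. All dataset, table, and column names (`analytics.colleague_bq`, `colleague_id`, etc.) are hypothetical placeholders, not the actual IHG objects:

```shell
#!/bin/sh
# Hypothetical reconciliation of a BigQuery copy of the colleague table
# against an extract of the legacy Teradata table staged alongside it.
# All dataset/table/column names are illustrative placeholders.

QUERY=$(cat <<'SQL'
SELECT COALESCE(b.colleague_id, t.colleague_id) AS colleague_id,
       CASE
         WHEN t.colleague_id IS NULL THEN 'missing_in_teradata_extract'
         WHEN b.colleague_id IS NULL THEN 'missing_in_bigquery'
         ELSE 'present_in_both'
       END AS match_status
FROM analytics.colleague_bq AS b
FULL OUTER JOIN analytics.colleague_td_extract AS t
  ON b.colleague_id = t.colleague_id
SQL
)

# The query would be submitted with the bq CLI, e.g.:
#   bq query --use_legacy_sql=false "$QUERY"
echo "$QUERY" | head -n 1   # show the first line of the generated SQL
```

BigQuery standard SQL supports FULL OUTER JOIN, which makes row-level presence checks like this straightforward once the legacy extract has been staged in a dataset.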
- Designed the delta process from PUB (Hadoop) to Stage, Core, and Netezza DAQ tables.
- Worked with the architect on partitioning the Hadoop fact table for optimal query performance.
- Created Hive external tables for Staging and Core for development and unit testing.
- Led various sprint stories in which multiple developers were responsible for Big Data and Informatica tasks.
- Created Informatica BDE mappings for dimension delta loads.
- Created HiveQL for fact (impressions and clicks) hourly delta loads.
- Created Sqoop jobs for exporting data from Hadoop staging tables to Netezza DAQ tables.
- Created audit jobs to compare Hadoop stage record counts with Netezza DAQ row counts.
- Created Oracle tables, indexes, constraints, synonyms, and other objects for the Chassis team.
- Created UNIX wrapper scripts for executing ETL mappings, Sqoop jobs, audit jobs, and others.
- Created Informatica deployment groups for migration from Dev and QA to Production.
- Designed Informatica stub mappings for dimension tables when the Operative source is unavailable.
- Coordinated with BI Operations, the infrastructure team, and DBAs for various deployments.
- Participated in various agile ceremonies: sprint planning, tasking, backlog grooming, sprint demos, and retrospectives.
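The Sqoop export, audit job, and UNIX wrapper bullets above could be sketched roughly as below. The connection string, table names, and file paths are illustrative assumptions, not the real project configuration:

```shell
#!/bin/sh
# Hypothetical Sqoop export from a Hadoop staging directory to a
# Netezza DAQ table, plus a row-count audit of the kind described above.
# Connection details, paths, and table names are placeholders.

export_to_netezza() {
  sqoop export \
    --connect "jdbc:netezza://nz-host:5480/DAQ" \
    --username etl_user \
    --password-file /secure/nz.pwd \
    --table DAQ_FACT \
    --export-dir /warehouse/stage/fact \
    --num-mappers 4
}

# Audit: compare a Hadoop stage record count with the Netezza row count
# and fail the wrapper on any mismatch.
audit_counts() {
  src_count=$1   # e.g. from: hive -e 'SELECT COUNT(*) FROM stage.fact'
  tgt_count=$2   # e.g. from a corresponding nzsql count query
  if [ "$src_count" -ne "$tgt_count" ]; then
    echo "AUDIT FAIL: stage=$src_count netezza=$tgt_count" >&2
    return 1
  fi
  echo "AUDIT OK: $src_count rows"
}

audit_counts 10 10   # prints: AUDIT OK: 10 rows
```

Returning a non-zero status from the audit step lets the scheduler treat a count mismatch as a hard job failure rather than a silent discrepancy.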
Skills: Cloudera Hadoop, Hive, Hue, Apache Sqoop, Informatica BDM, Informatica, Data Analytics, Database Administration, Netezza
EDW Phase 1
- Worked with the PeopleSoft source team to identify the interfaces needed to produce APG financial reports related to P&L, fixed-asset depreciation, and AR.
- Created the JIA (Joint Interface Agreement) between PeopleSoft and EDW for the financial consolidation project.
- Created STM (source-to-target mapping) documents for ODS, EDW (dimensions and facts), and dimensional stubbing.
- Worked with the Store Governance (MDM) team to design the publication feeds needed for the Store dimension in EDW.
- Designed the Event, Status, and Store Change tables to capture all the information required for the store count report in the Store Governance (MDM) project.
- Created and maintained the Joint Interface Agreement documents for ServiceNow (MDM) inbound, recurring, and outbound feeds.
- Worked with the data architect to capture ServiceNow data in Oracle Exadata structures (MDM DB).
- Designed change data capture methodology for various dimension and fact tables.
- Designed and developed the dependencies between Informatica processes using event waits and command tasks.
- Created and used reusable Informatica schedules to manage ETL processes that needed to run at different times.
- Created mappings to populate the dimensional and fact tables using relational sources.
- Loaded 3 years of historical data from legacy systems before starting the ongoing loads.
- Coordinated with DBAs, source teams, and architects on EDW launch tasks.
- Used Microsoft Project to plan and execute historical and incremental EDW loads.
- Led the production simulation effort to load EDW data from various source systems.
- Identified performance bottlenecks in long-running ETL processes.
- Tuned the Informatica processes for optimal performances.
- Performed various Informatica administration tasks, such as creating a repository, Informatica server maintenance (restarts, etc.), creating Informatica users, and assigning roles.
- Set up parameter files for historical fact loads and the corresponding incremental loads.
- Worked with Exadata DBAs to create the process to load fact tables using external tables.
- Worked with QA testers to identify the test cases for various ETL loads.
- Coordinated with various consumer systems (APP400, EDW, JDEdwards) to create interfaces with Store Governance.
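A hedged sketch of the parameter-file-driven workflow execution described above, using Informatica's `pmcmd` command-line client. The domain, service, folder, and workflow names here are assumed placeholders, not the actual environment:

```shell
#!/bin/sh
# Hypothetical wrapper that starts an Informatica workflow with a
# run-specific parameter file (historical vs. incremental loads).
# Domain, service, folder, and workflow names are assumptions.

PARAM_FILE=${1:-/etl/params/wf_edw_fact_incr.par}

run_workflow() {
  wf_name=$1
  # pmcmd is Informatica's command-line client; -wait blocks until the
  # workflow finishes and reflects its status in the exit code.
  pmcmd startworkflow \
    -sv INT_SVC_EDW -d DOMAIN_EDW \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f EDW_FOLDER \
    -paramfile "$PARAM_FILE" \
    -wait "$wf_name"
}

# Usage (needs a live Integration Service, so left commented out):
#   run_workflow wf_EDW_FACT_INCR || { echo "load failed" >&2; exit 1; }
echo "would start workflow with params: $PARAM_FILE"
```

Swapping the parameter file per run is what lets one workflow definition serve both the one-time historical load and the ongoing incremental loads.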
Skills: Informatica, Informatica Administration, Oracle Exadata, SQL Server 2008 R2, Apache Subversion (SVN), Jira, Cognos
BlackBerry Operations
- Redesigned the existing link-stat ETL process to handle new SPIRIT source files.
- Participated in a Teradata POC project to replace the existing HP Neoview platform for the EDW.
- Worked closely with the Teradata professional services team and the RIM functional architect to evaluate Teradata for the EDW implementation.
- Created Teradata BTEQ scripts to replace existing HP Neoview scripts.
- Converted Neoview-related processes into Teradata processes using Teradata utilities.
- Designed and developed daily ETL processes to capture South Africa PIN Traffic.
- Designed and developed weekly ETL process to generate the enterprise PFS reports.
- Coordinated with the Business Objects team to understand the impacts of ETL changes.
- Developed ETL process to capture enterprise PFS data for paying sales compensations.
- Worked on the Intellisync project to capture historical data from SAP into the EDW.
- Worked closely with the Data architect to design ETL processes for various projects.
- Designed and created rollback scripts for APPWORLD tables for restartability after failure.
- Designed and developed a fix for the Appworld revenue fact table to handle source changes.
- Created frameworks for daily and weekly processes using a control process table.
- Tuned the link-stat ETL process using Informatica and database tuning techniques.
- Led the effort to transition existing production jobs to the production support team.
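As a rough illustration of the Neoview-to-BTEQ conversion work above: the SQL body of a script typically stays similar, while session control moves to BTEQ dot-commands. The host, credentials, and table names below are invented placeholders:

```shell
#!/bin/sh
# Hypothetical shape of a converted Teradata BTEQ script, generated and
# run from a UNIX wrapper. Host, credentials, and table names are
# invented placeholders, not the actual RIM objects.

build_bteq_script() {
  cat <<'BTEQ'
.LOGON tdhost/etl_user,secret;
.SET ERROROUT STDOUT;

INSERT INTO edw.pin_traffic_za
SELECT src.*
FROM stage.pin_traffic src
WHERE src.country_code = 'ZA';

.IF ERRORCODE <> 0 THEN .QUIT 1;
.QUIT 0;
BTEQ
}

# The script would be piped to the BTEQ client:  build_bteq_script | bteq
build_bteq_script | grep -c '^\.'   # prints 4 (the dot-command lines)
```

The `.IF ERRORCODE <> 0 THEN .QUIT 1;` pattern propagates a SQL failure into the script's exit code, so the calling scheduler can detect and restart a failed load.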