Sanchita S.

Experienced Big Data / Hadoop Solution Developer, Solution Architect / Manager

Kolkata, India

Experience: 15 Years

Rate: 96,000 USD / Year

  • Availability: Immediate


About Me

  • Customer-focused Big Data Solution Architect with leadership experience in solution design, cloud-based development, Data Warehousing, Data Mining, Pre-Sales, Agile software development, relationship management, Business Analytics, Process im...
  • 15+ years of experience, including 4.9 years in Big Data solution implementation.
  • Helped clients choose the best cloud implementations for their infrastructure by engaging in pre-sales activities and creating technology roadmaps. Designed ETL architecture and developed visualization, trend-oriented data warehousing, and reporting solutions.
  • Implemented complex data-processing needs and optimized file formats for cloud-based environments. Consistently delivers enhancements and solutions that drive customer satisfaction and loyalty.
  • Proven mentor and trainer, skilled at communicating with cross-functional teams to develop a shared vision and foster a culture of excellence. Experience across domains including Healthcare, Retail, Banking, Manufacturing, Supply Chain, Data Security, and Transportation.


Portfolio Projects

Pre-sales Solution Architect

Company

Pre-sales Solution Architect

Role

Product Manager

Description

Project Brief: The current opportunity is a Pre-sales Solution Architect position with Wipro Technologies, supporting pre-sales for seven BFSI business units across on-prem and cloud-based opportunities. The scope of work starts with technology evaluation for any sales request, defining a technology roadmap supported by POCs and artifacts, and contributing to the overall sales target.

 

Key Responsibilities: The key responsibilities involve system analysis, technology evaluation, enterprise technology roadmap preparation, and presentation of the roadmap to the customer and sales team. Additional responsibilities include POCs for technology evaluation and system automation, risk assessment, and consultation to the development team.


Skills

Hadoop Big Data

Software Architect

Company

Software Architect

Description

Project Brief: The Crawlers project crawls data from Cerner Millennium and non-Millennium systems. This data is placed in HBase for business analytics. The PopHealth Analytics project pulls the data from these different systems, processes it, and pushes it to Vertica for further analysis.

 

Key Responsibilities: The key responsibilities involve system analysis and enhancements, POCs for technology evaluation and system automation, development, project management, risk assessment, client communication, and team management.
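The pull, process, and push flow described in the project brief can be sketched roughly as follows. This is an illustrative Python sketch only: the record fields, function names, and the in-memory stand-ins for the HBase source and Vertica target are assumptions, not the project's actual code.

```python
# Illustrative pull -> process -> push analytics step. In the real
# pipeline, pull_records would scan crawled rows out of HBase and
# push_to_warehouse would bulk-load into Vertica; plain Python lists
# and dicts stand in for both here. All field names are hypothetical.

def pull_records(hbase_table):
    """Stand-in for scanning crawled rows out of an HBase table."""
    return list(hbase_table)

def process(record):
    """Normalize one crawled record for downstream analytics."""
    return {
        "patient_id": record["id"],
        "source": record.get("source", "millennium"),
        "value": float(record["value"]),
    }

def push_to_warehouse(rows, warehouse):
    """Stand-in for a bulk load into the analytics warehouse."""
    warehouse.extend(rows)

# Sample crawled rows from Millennium and non-Millennium systems.
hbase_table = [
    {"id": "p1", "value": "7.5"},
    {"id": "p2", "value": "3.0", "source": "non-millennium"},
]
warehouse = []
push_to_warehouse([process(r) for r in pull_records(hbase_table)], warehouse)
```

The point of the sketch is the shape of the flow (extract, normalize, load), not the specific clients or schema, which the brief does not detail.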

 


Skills

Hadoop Big Data

Tools

Hive, Python, Java

Cornerstone Data Management

Company

Cornerstone Data Management

Description

Client:    American Express, Phoenix (USA)

Duration:    April 2015–Nov 2015

Employer:    Infosys Ltd

Project Title:    Capillary

Environment: Hadoop, Hive, Pig, Java, MySQL, Shell Scripting, Tumbleweed SFTP, Spring Batch workflow

Tools:    MS Visio, MPP

 

Project Brief: Retail customer data from Mom&Me stores is captured and analyzed to display the most relevant promotions and offers at the retail stores. As part of this project, customer and offer data received from Mom&Me stores goes through a series of business-logic processing steps to produce resultant data compatible with the POS system display.

 

Key Responsibilities: Project Manager (April 2015): The key responsibilities involve project management, risk analysis, budget-overflow analysis, vendor management, client communication, and the logical and physical design, modeling, and implementation of enterprise data; defining, designing, developing, and optimizing ETL processes for the enterprise data warehouse; and coding (Hive, Pig, Java, Spring Batch workflow, Shell Script), testing, release management, UAT, user documentation, and support handover.
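As a rough illustration of the kind of business-logic step the brief describes, joining customer and offer feeds into records a POS display could consume, here is a minimal Python sketch. The schema and the segment-based matching rule are invented for illustration; the real pipeline implemented its logic in Hive, Pig, and Java.

```python
# Hypothetical sketch: match incoming customer records against the
# current offer feed and emit POS-display-compatible records.
# Field names ("cust_id", "segment", "promo") are assumptions.

customers = [
    {"cust_id": "c1", "segment": "new-parent"},
    {"cust_id": "c2", "segment": "toddler"},
]
offers = [
    {"segment": "new-parent", "promo": "10% off diapers"},
    {"segment": "toddler", "promo": "BOGO toys"},
]

# Index offers by segment so each customer lookup is O(1).
offer_by_segment = {o["segment"]: o["promo"] for o in offers}

# Emit one record per customer with a matching offer.
pos_records = [
    {"cust_id": c["cust_id"], "promo": offer_by_segment[c["segment"]]}
    for c in customers
    if c["segment"] in offer_by_segment
]
```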

 



Skills

Hadoop

Tools

Python

Global Customer Analytics

Company

Global Customer Analytics

Description

Client:    Apple Inc, Cupertino, CA (USA)

 

Duration:    April 2012–Sep 2013 (Onsite)

Employer:    Infosys Ltd

Project Title:    Global Customer Analytics

Environment: Teradata, Oracle, AIX, Tableau, Shell Scripting, Java

 

Tools:    SVN, MPP, Fasttrack, OmniGraffle, Radar, Espresso, Tableau

 

Project Brief: The Global Customer Analytics system is focused on providing a unified, aggregated view of Apple customers from different angles (personal, purchase, and product) for the Apple marketing team. In this project, customer, order, and invoice data, along with product information, flows through different systems and is consolidated into a single view for easy access to the information when reporting through Tableau.

 

Key Responsibilities: Lead Consultant role: The responsibilities involve developing aggregated views using Teradata, Oracle, and Java, plus Tableau reporting. The role also covered data analysis and requirement gathering, along with testing, release, and user documentation. As part of this project, mentored a team of 4-7 offshore team members.
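The unified-customer-view idea above amounts to folding per-order rows into one aggregate record per customer. A minimal Python sketch of that aggregation follows; the actual work was done as Teradata and Oracle views with Tableau on top, and every field name here is invented for illustration.

```python
# Hypothetical sketch of building a unified customer view: collapse
# order-level rows into one aggregate record per customer. In the
# project this was a SQL aggregation in Teradata/Oracle, not Python.
from collections import defaultdict

orders = [
    {"cust": "a", "product": "iPhone", "amount": 999.0},
    {"cust": "a", "product": "iPad", "amount": 499.0},
    {"cust": "b", "product": "Mac", "amount": 1999.0},
]

# One aggregate record per customer: lifetime spend plus product list.
view = defaultdict(lambda: {"total": 0.0, "products": []})
for o in orders:
    view[o["cust"]]["total"] += o["amount"]
    view[o["cust"]]["products"].append(o["product"])
```

A reporting tool like Tableau would then read such a per-customer table directly instead of re-joining the raw order feeds on every query.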

 


Skills

Teradata