Vikash A.

Big Data/Hadoop developer with 13 years of IT experience in application development using Hadoop and .NET

Mumbai, India

Experience: 13 Years

41304 USD / Year

  • Start Date / Notice Period end date: 2019-12-15

About Me

Skilled in Cloudera, Apache Kafka, Hive, Apache Spark, Scala, .NET MVC (C#), APIs, and Scrum.

...

Portfolio Projects

It’s a tool for storing and processing all global GM product data digitally.


Description

· Created high-level and detailed architecture diagrams for the end-to-end project using UML.

· Created and maintained an optimal data pipeline architecture using Kafka and StreamSets.

· Read messages from Kafka with Spark (Scala) to process XML documents, and created the data model in HDFS.

· Created and maintained development patterns, standards, processes, and norms.

· Analyzed existing processes and user development requirements to ensure maximum efficiency.

· Worked with stakeholders, including the Executive, Product Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.

· Analyzed the solution's impact on existing systems and processes, and worked with various teams to finalize the scope and depth of testing required for production deployment.

· Designed and developed actionable advanced analytics solutions (Consumer Insight) that provide insights to decision makers.

· Proficient in data identification and preparation, data mining, and integrating structured and unstructured data from various sources.

· Followed the Agile Scrum and WBS processes.
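The per-message step above (Kafka XML payload to a flat record bound for HDFS) can be sketched in plain Scala. This is a minimal illustration only: the `ProductRecord` fields and XML tag names are invented for the example, not the actual GM product schema, and the Kafka read and HDFS write are omitted.

```scala
// Hypothetical sketch: parse one XML message (as consumed from Kafka)
// into a flat record before it is written to HDFS.
// Tag names and fields are illustrative assumptions, not the real schema.
import java.io.ByteArrayInputStream
import javax.xml.parsers.DocumentBuilderFactory

case class ProductRecord(partId: String, description: String)

object XmlToRecord {
  def parse(xml: String): ProductRecord = {
    val doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
      .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")))
    // Pull the text content of the first element with the given tag.
    def text(tag: String): String =
      doc.getElementsByTagName(tag).item(0).getTextContent
    ProductRecord(text("partId"), text("description"))
  }
}
```

In the actual pipeline a function like this would run inside the Spark job, once per Kafka message, with the resulting records written out as HDFS files.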


Ingested structured and unstructured data into HDFS and built different data models

Description

· Used Sqoop to move various structured data from Oracle into Hadoop.

· Processed Ocado client CSV data into Hadoop with Scala, implementing a dynamic unpivot of the data.

· Built the common Pynamic tool, which enables quick data movement from any system into HDFS; built using Pynamic and MongoDB.

· Built the common data-mover tool, which helps mine the data before a Sqoop or Pynamic load.

· Created data models and performed data mining with different data sets, per customer requirements.
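The dynamic unpivot mentioned above can be sketched in plain Scala. The assumed input shape is illustrative: a wide CSV row whose leading key columns identify the record, with the remaining columns melted into one long row each (in the project this logic would run inside the Spark job).

```scala
// Hypothetical sketch of a dynamic unpivot: each wide CSV row
// (key columns followed by N value columns) becomes N long rows.
object Unpivot {
  /** header and row come from one parsed CSV line; the first keyCols
    * columns identify the record, the rest are columns to melt. */
  def unpivot(header: Seq[String], row: Seq[String],
              keyCols: Int): Seq[(String, String, String)] = {
    val key = row.take(keyCols).mkString("|")
    header.drop(keyCols).zip(row.drop(keyCols)).map {
      case (col, value) => (key, col, value)
    }
  }
}
```

Because the column names are taken from the header at runtime, the same function handles any number of pivoted columns, which is what makes the unpivot "dynamic".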


It’s a tool for storing and managing holistic marketing data.

Description

The tool stores and manages holistic marketing data from different sources and helped the business plan for the future. It was developed using SSIS 2010.

Involved in daily Scrum and sprint planning.

Developed packages to pull data from different APIs into Oracle.


Description

Under this project, developed and maintained various enterprise-level web applications, APIs, and services to support the company’s digital marketing, e-commerce sales, and integrations.

· Developed small, medium, and large applications to support the company’s different platform services.

· Developed applications that deliver digital recipes to company sites and customer integrations.

· Maintained a complex product hierarchy, with its nutrition details, on the site and for customers.

· Involved in daily stand-up updates with a self-managed team, following the Kanban process.
