Ankesh J.

Technical Services Specialist

Pune, India

Experience: 6 Years

32000 USD / Year

  • Notice Period: Days

About Me

6.5 years of experience with Hadoop/Big Data technologies. Experienced in developing applications using Apache Spark, Scala, Kafka, NiFi & Kylo, and Azure. Knowledge of GCP, Python, and Pandas. Experience with MongoDB, Redis, Neo4j, Shell Scripting,...

Portfolio Projects

Description

The main purpose of this project was to transform customer and account data for two banks using Spark and Scala, store the results in Hive after transformation, and maintain the client-account relationships in separate Hive tables.
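The normalization described above can be sketched in plain Python (the real project used Spark and Scala; all record and field names here are hypothetical, not the project's actual schema): joined customer/account rows are split into per-table record sets, as one would before writing each set to its own Hive table.

```python
def split_customer_accounts(rows):
    """Split raw joined rows into account records and client-account links.

    Each output list corresponds to one Hive table in the described design.
    """
    accounts = [(r["bank"], r["account_id"], r["balance"]) for r in rows]
    links = []
    for r in rows:
        link = (r["customer_id"], r["account_id"])
        if link not in links:  # keep the relationship table distinct
            links.append(link)
    return accounts, links
```

In the actual pipeline this separation would be expressed as DataFrame `select`/`distinct` operations before `saveAsTable`; the sketch only shows the shape of the split.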

Description

Processed data coming from multiple sources, applied transformations, and stored the results in Hive. We developed data pipelines and scheduled them on Kylo.

Description

This project was launched in collaboration with Tech-M for the AT&T client. The main purpose of the SSOT graph DB was to provide a centralized data store that would help us find other interconnected data sources. The final outcome needed to be viewed in a graphical format (a graph of nodes and edges).

Contribute

As a software developer

Description

Worked on processing a real-time JSON data stream using Spark and Kafka integration. We stored intermediate data in Redis and the final output in MongoDB.

MongoDB records were then updated continuously from the real-time data stream.
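The per-record update logic described above can be sketched as follows, with plain dicts standing in for Redis (intermediate state) and MongoDB (final store); the real pipeline used Spark with Kafka integration, and all keys and field names here are hypothetical.

```python
import json


def process_event(raw_event, redis_cache, mongo_store):
    """Apply one JSON event: accumulate intermediate state, upsert the final doc.

    redis_cache plays the role of Redis (running aggregate per key);
    mongo_store plays the role of MongoDB (latest document per key).
    """
    event = json.loads(raw_event)
    key = event["id"]
    # Intermediate state: running total per key (kept in Redis in the real pipeline).
    redis_cache[key] = redis_cache.get(key, 0) + event["value"]
    # Final output: upsert the aggregated document (MongoDB in the real pipeline).
    mongo_store[key] = {"id": key, "total": redis_cache[key]}
```

In production this would run inside the Spark streaming job's per-batch handler, with real Redis and MongoDB clients replacing the dicts.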

OPENITEM - Convert Openitem Mainframe job into Hadoop

Contribute

Analyzed client requirements and understood the program logic. Prepared the code in the Hadoop environment using Pig and Hive. Performed code reviews while maintaining the client's standards.

Description

Open Item batch is an item information maintenance system. It holds product information. An item identifies the Division, Category, Store location and Selling price (Retail) of the merchandise. It assigns the Key Stock Number (KSN) which signifies the product purchased by the customer. KSN identifies the style, size, color and flavor. Open Item batch is not an ordering system.

Di-Hadoop

Contribute

Analyzed the LLD (low-level design document) and understood the program logic. Involved in unit testing, integration testing, and prod-parallel testing. Prepared issue logs to explain issues to clients.

Description

Data for the tables is extracted from Teradata. This data is used by other systems such as Local Publishing and Dynamic Pricing. Item and pricing information must be extracted from the Enterprise Data Hub (Hadoop).

HTC - Design NiFi Flow and Schedule on Kylo

Contribute

As a developer

Description

The main purpose of this project is to process data coming from multiple sources, apply transformations, and store the data in Hive. We developed data pipelines and scheduled them on Kylo.

Technology Used - Spark, Scala, NiFi, Kylo, Azure
