Prashanth R.

Big Data Engineer / Data Engineer

Hyderabad, India

Experience: 5 Years

32,000 USD / Year

  • Notice Period: 45 Days

About Me

  • 3+ years of experience in Hadoop and PeopleSoft.
  • Industry experience includes projects in the Manufacturing and Banking domains.
  • Proficient in the Big Data technologies Hive, HDFS, Impala, Spark and Sqoop.
  • Knowledge of writing complex SQL queries.
  • Experience in importing and exporting data between HDFS and Relational Database Systems (RDBMS) using Sqoop.
  • Applied transformation logic using Spark Core, Hive and DataFrames per client requirements.
  • Knowledge of data science and machine learning concepts (chatbots, computer vision, Pandas, NumPy, Matplotlib).
  • Hands-on experience in the PeopleSoft HCM module - Core HR, Position Management.
  • Experience in customizing and developing various reports using Application Engine, PS Query and SQR.
  • Being a good team player and listener has helped build rapport with teams.
  • Effective working independently and collaboratively in teams.
  • Flexible and ready to take on new challenges.
  • Experience working in development and production support.
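The complex SQL work mentioned above can be illustrated with a minimal, self-contained sketch. It uses Python's built-in sqlite3 as a stand-in engine; the tables and columns below are hypothetical, but the join-plus-aggregate pattern is the same one used in Hive or any RDBMS.

```python
import sqlite3

# Hypothetical schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'acme'), (2, 'globex');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 50.0);
""")

# Join + aggregate + HAVING: customers whose order total exceeds 100.
rows = conn.execute("""
    SELECT c.name, COUNT(o.id) AS n_orders, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    HAVING SUM(o.amount) > 100
""").fetchall()
print(rows)  # [('acme', 2, 200.0)]
```

The same statement runs largely unchanged in HiveQL or on an RDBMS; only the DDL and connection setup differ.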


Portfolio Projects

Healthcare Domain

Company

Healthcare domain

Description

Finding anomalous data related to claims and providers.

WORLD BANK

Company

WORLD BANK

Description

Position

Technical Consultant

Environment

People Soft 9.1, PeopleTools 8.53

Overview:

  • The World Bank is an international financial institution that provides loans to developing countries for capital programs.
  • The World Bank's official goal is the reduction of poverty.
  • HCL provided me the opportunity to work with the World Bank as a PeopleSoft technical resource.

Roles & Responsibilities:

  • Core HR and Position Management simplification.
  • Went through the application functionality, understood the business process, and shared the knowledge with team members.
  • Analysed and resolved problem tickets in a timely manner.
  • Prepared and executed test cases per system requirements.
  • Understood Software Requirement Specifications and identified the required test scenarios.
  • Provided technical support in the design, development, testing, and deployment of PeopleSoft applications.
  • Coordinated with technical leads and functional users to understand requirements.
  • Analysed and fixed bugs in the existing HCM system.
  • Maintained project technical documentation for management review.
  • Migrated objects and coordinated go-live activities.


Skills

PeopleSoft

Company

Supply chain

Description

I have worked in the retail domain, especially handling supply-chain data from different sources as well as order details.


Commonwealth bank of Australia

Company

Commonwealth bank of Australia

Description

Position

Hadoop Developer

Environment

Hadoop Ecosystem (HDFS, HIVE, SQOOP, IMPALA, SPARK)

Overview:

The Commonwealth Bank of Australia is an Australian multinational bank with businesses across New Zealand, Asia, the United States and the United Kingdom. It provides a variety of financial services including retail, business and institutional banking, funds management, superannuation, insurance, investment and broking services. The Commonwealth Bank is the largest Australian listed company on the Australian Securities Exchange as of August 2015, with brands including Bankwest, Colonial First State Investments, ASB Bank (New Zealand), Commonwealth Securities (CommSec) and Commonwealth Insurance (CommInsure). Commonwealth Bank is also the largest bank in the Southern Hemisphere.

Roles & Responsibilities:

  • Imported and exported data from relational databases using Sqoop.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries.
  • Used Hive to analyse the partitioned data and compute various metrics for reporting.
  • Applied Hive optimization techniques during joins and followed best practices when writing Hive scripts in HiveQL.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Used Spark SQL to load data in different formats, create schema RDDs, load them into Hive tables, and handle structured data.
  • Created Impala tables and used Impala queries to build reports based on requirements.
  • Performed interactive analysis of Hive tables through various DataFrame operations using Spark SQL.
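The Hive pattern described above, querying partitioned tables to compute reporting metrics, can be sketched with standard SQL. This uses Python's built-in sqlite3 as a stand-in engine since a Hive cluster isn't assumed; the table, the txn_date "partition" column, and all values are hypothetical. In Hive, filtering on the partition column would additionally prune the directories scanned.

```python
import sqlite3

# Stand-in for a Hive table partitioned by txn_date. In Hive the partition
# column maps to HDFS directories; here it is just an ordinary column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (txn_date TEXT, branch TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("2020-01-01", "sydney", 100.0),
     ("2020-01-01", "perth", 40.0),
     ("2020-01-02", "sydney", 60.0)],
)

# Metric query restricted to a single "partition" (one txn_date value),
# then aggregated per branch for reporting.
rows = conn.execute("""
    SELECT branch, SUM(amount) AS total
    FROM transactions
    WHERE txn_date = '2020-01-01'
    GROUP BY branch
    ORDER BY branch
""").fetchall()
print(rows)  # [('perth', 40.0), ('sydney', 100.0)]
```

In HiveQL the query body is identical; only the CREATE TABLE would declare `PARTITIONED BY (txn_date STRING)` so the WHERE clause can skip whole partitions.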


Sellin

Company

Sellin

Description

Position

Hadoop Developer

Environment

Hadoop Ecosystem (HDFS, HIVE, SQOOP, IMPALA, SPARK)

Overview:

Microsoft Corporation is an American multinational technology company with headquarters in Redmond, Washington. It develops, manufactures, licenses, supports, and sells computer software, consumer electronics, personal computers, and related services.

Roles & Responsibilities:

  • Imported data from relational databases using Spark.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries.
  • Used Hive to analyse the partitioned data and compute various metrics for reporting.
  • Applied Hive optimization techniques during joins and followed best practices when writing Hive scripts in HiveQL.
  • Used Spark SQL to load data in different formats, create schema RDDs, load them into Hive tables, and handle structured data.
  • Worked with Spark SQL on top of Hive.


Supply Chain

Company

Supply Chain

Description

Position

Data Engineer

Environment

Hadoop Ecosystem (HDFS, HIVE, SQOOP, IMPALA, SPARK, SCALA)

Overview:

Supply-chain management coordinates all parts of the supply chain, from sourcing raw materials to delivering and/or taking back products, and tries to minimize total costs in the face of conflicts among the chain partners. An example of these conflicts is the tension between the sales department, which wants higher inventory levels to fulfil demand, and the warehouse, which wants lower inventories to reduce holding costs.

Roles & Responsibilities:

  • Imported and exported data from relational databases using Sqoop.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries.
  • Used Hive to analyse the partitioned data and compute various metrics for reporting.
  • Worked on the custom build tool Automic for scheduling.
  • Used Spark SQL to load data in different formats, create schema RDDs, load them into Hive tables, and handle structured data using Spark SQL and DataFrames.
  • Worked with Spark and Kafka.
  • Worked with GitHub.

Technologies: Hive, Sqoop, Oracle, HDFS, Impala, Spark.


Verifications

  • Phone Verified

Preferred Language

  • English - Fluent

Available Timezones

  • New Delhi [UTC +5]
  • Dubai [UTC +4]
  • China (West) [UTC +6]
  • Singapore [UTC +7]
  • Hong Kong (East China) [UTC +8]
  • Australian EDT [UTC +11]
  • Australian CDT [UTC +10:30]
  • Greenwich Mean [UTC ±0]
  • Eastern European [UTC +2]
  • Further EET [UTC +3]
  • Eastern EST [UTC +3]
  • Eastern Daylight [UTC -4]
  • Central Daylight [UTC -5]
  • Mountain Daylight [UTC -6]
  • Pacific Daylight [UTC -7]