Venkata Raghavendra Bhanu Prasad P.


Specialist Programmer

Chennai, India



40036.8 USD / Year


About Me

Seeking an IT position that uses my expertise in a creative and challenging environment, where I can acquire knowledge and contribute to the development of the industry. Planning, organizing, and getting things done are among my virtues.


Portfolio Projects


Description

Client : JPMC

Organisation : Virtusa

Duration : JUL 2010 - APR 2011

Description: As part of this project, we added multiple-email and mobile-number support as an enhancement to the previous release, allowing the user to register multiple email addresses and a mobile number for a payee as recipient details. A customer can then make a payment to any of the available email addresses or to the mobile number of the recipient.

Technology Involved : Java, Spring IOC, Oracle SQL

Roles & Responsibility:

Involved in development and testing of the application.

Coded the business logic classes for the most critical use cases in Java, following best coding standards across the different functional modules.

Implemented stored procedures for most of the functionalities on an Oracle 10g database using the Toad 9 tool.

Resolved many issues during the IST and QA phases of the project as an active offshore team member.


Description

Client : JPMC

Organisation : Virtusa

Duration : MAY 2011 - JUN 2011

Description: The Profile Re-host project mainly migrates the architecture of the Payments and Transfers module (in Chase.com, which allows customers to manage online transactions) from Oracle to DB2. In this project we are dealing with the Bill Pay and Quick Pay modules.

Technology Involved: Java, Spring IOC, DB2

Roles & Responsibility:

Requirement gathering & detailed design.

Identifying contradictions between the Oracle technical specifications and the DB2 specifications, and implementing the changes accordingly.

Identifying and fixing the ERA defects.

Identifying and fixing IST and QA Defects.


Description

Client : JPMC

Organisation : Virtusa

Duration : JUL 2011 – NOV 2011

Description: This project aims to enhance the Online BillPay experience by enabling users to add BillPay payees from a mobile device by capturing an image of the payment coupon, making it easier for the user and eliminating manual entry where possible.

Technology: Java, Spring IOC, SQL

Roles & Responsibility:

Requirement gathering & detailed design.

Use case development & integration testing by locally deploying the application using WebSphere.

SOAP UI testing.

Identifying and fixing IST and QA Defects.


Description

Client : JPMC

Duration : DEC 2011 – DEC 2012

Organisation : Virtusa

Description: This project is a complete re-architecture of Chase Online; as part of it, several new functionalities were introduced by Chase.

Technology: Java, Spring IOC, SQL

Roles & Responsibility:

Requirement gathering & detailed design.

Use case development & integration testing by locally deploying the application using WebSphere.

SOAP UI testing.

Identifying and fixing IST and QA Defects.


Description

Client : Thomson Reuters

Organisation : Virtusa

Duration : JAN 2013 – DEC 2013

Description: Integrated Metric System (IMS) replaces the legacy database process of generating zip files containing the complete information for all books updated through that year, which were delivered to clients on request. The existing system took a long time to answer queries, so we are replacing it with the Hadoop Distributed File System (HDFS), which stores the information as text files, making searches for a given query much faster than before. We worked with one of the top business-analytics service providers to achieve this goal.

Technology Involved: Map-Reduce, Hive

Roles & Responsibility:

Using Map-Reduce, formed structured files conforming to our data model, taking XMLs as input.

Using Map-Reduce and Hive, developed post-processing jobs such as re-categorization of articles, country normalization, and institution unification.

Involved in ongoing maintenance issues.

Coordinated with onsite members.

Involved in storing raw data into HDFS.
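The first bullet above — map jobs turning source XMLs into structured, delimited records — can be sketched in Python. This is a hedged illustration only: the actual jobs were Map-Reduce, and the element names (`uid`, `title`, `pubyear`) are invented stand-ins for the real data model.

```python
import xml.etree.ElementTree as ET

def map_article(xml_text):
    """Mimic the map step: parse one article XML and emit a
    tab-separated record matching a (hypothetical) data model."""
    root = ET.fromstring(xml_text)
    uid = root.findtext("uid", default="")
    title = root.findtext("title", default="")
    year = root.findtext("pubyear", default="")
    return "\t".join([uid, title, year])

record = map_article("<article><uid>WOS:001</uid>"
                     "<title>Graph Mining</title>"
                     "<pubyear>2013</pubyear></article>")
# record == "WOS:001\tGraph Mining\t2013"
```

In the real pipeline each mapper would emit such records for HDFS storage and downstream Hive processing.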


Description

Client : Thomson Reuters

Organisation : Virtusa

Duration : JAN 2014 – JUN 2014

Description: Integrated Metric System (IMS) replaces the legacy database process of generating zip files containing the complete information for all books updated through that year, which were delivered to clients on request. The existing system took a long time to answer queries, so we are replacing it with the Hadoop Distributed File System (HDFS), which stores the information as text files, making searches for a given query much faster than before. We worked with one of the top business-analytics service providers to achieve this goal.

Technology Involved : Map-Reduce, HBase, Hive

Roles & Responsibility:

Requirement Analysis & Approach - As a team lead, analysed the requirements and distributed tasks to the members as per their areas of expertise.

With a team of 5 members, developed a dataset which is now in production and is used by most TR customers.

Coding & Unit Testing

Involved in ongoing maintenance issues.

Coordinated with onsite members.


Description

Client : Thomson Reuters

Organisation : Virtusa

Duration : JUL 2014 – SEP 2015

Description: Integrated Metric System (IMS) InCites replaces the old InCites applications, which depended on an Oracle database, using HBase to view different metrics of Web of Science (WoS) data & Journal Citation Reports (JCR).

Technology Involved: Map-Reduce, HBase, Hive

Map-Reduce jobs were used to load the HBase tables, a one-time job per baseline.

The HBase Java API was used to interact with HBase tables and fetch different metrics; co-processors were used to calculate those metrics.

Hive to generate & deliver Journal Citation Reports.
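A co-processor computes a metric server-side, next to the data, so only the aggregate crosses the network. The aggregation itself is simple; a hedged Python sketch of one such metric — citation counts per journal, with invented row keys and schema — looks like:

```python
from collections import Counter

def citations_per_journal(rows):
    """Aggregate what a co-processor would compute region-side:
    each row is (row_key, journal, citation_count). Schema invented."""
    totals = Counter()
    for _key, journal, cites in rows:
        totals[journal] += cites
    return dict(totals)

sample = [("WOS:001", "Nature", 12),
          ("WOS:002", "Nature", 3),
          ("WOS:003", "Science", 7)]
# citations_per_journal(sample) == {"Nature": 15, "Science": 7}
```

In HBase the same fold would run inside each region server over its own rows, with the client merging per-region partial totals.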

Roles & Responsibility:

Requirement Analysis & Approach - As a team lead, analysed the requirements and distributed tasks to the members as per their areas of expertise.

With a team of 5 members, developed a dataset which is now in production and is used by most TR customers.

Coding & Unit Testing

Involved in ongoing maintenance issues.

Coordinated with onsite members.


Description

Client : Apple

Organisation : Infosys

Duration : Oct 2015 – Dec 2017

Description: Development of cluster monitoring & automation tools for NoSQL Databases like Couchbase, Solr & Elastic Search as part of Apple Connect application.

Technology Involved: Couchbase, Solr, Elastic Search, Java, Python

Roles & Responsibility: (Technology Lead)

Creating central monitoring scripts in Python for Couchbase, Solr & Elasticsearch clusters, monitoring cluster health and different KPIs such as ops/sec, CPU usage, memory usage, and DB-specific parameters at cluster and host level.

Publishing weekly health-check reports capturing cluster/host capacity, data size and data-growth trends, weekly cluster stats, etc., and automating the audit.
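The core of such a monitoring script is comparing collected KPIs against configured limits. A minimal Python sketch, assuming hypothetical KPI names and thresholds (the real scripts polled the cluster APIs for these values):

```python
def check_health(stats, limits):
    """Flag any KPI that exceeds its configured limit.
    `stats` and `limits` map KPI name -> value; names are invented."""
    return [kpi for kpi, value in stats.items()
            if kpi in limits and value > limits[kpi]]

stats = {"cpu_pct": 91.0, "mem_pct": 62.0, "ops_per_sec": 14000}
limits = {"cpu_pct": 85.0, "mem_pct": 90.0}
# check_health(stats, limits) == ["cpu_pct"]
```

A scheduler would run this per host and per cluster, alerting on any non-empty result.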


Description

Client : EA

Organisation : Infosys

Duration : Jan 2018 – Sep 2019

Description: To maintain the cluster's HDFS footprint by implementing retention policies over datasets and archiving datasets beyond the configured retention to Amazon S3; to implement the GDPR publication-subscriber module using AWS Step Functions; and to benchmark the migration of Hive ETLs to Spark.

Technology Involved: Java, Scala, Hadoop (HDFS, Hive, Oozie and Airflow), AWS S3, AWS Step Functions, Spark SQL, Presto, MySQL, REST services, Dockerized services.

Roles & Responsibility: (Technology Lead)

Development & maintenance of Data management & GDPR modules.

ETL job migration from Hive to Spark & Presto.
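The retention logic described above reduces to selecting dataset partitions older than the retention window, which are then archived to S3 and removed from HDFS. A hedged Python sketch, with invented partition paths (the real module read retention policies per dataset):

```python
from datetime import date, timedelta

def partitions_to_archive(partitions, retention_days, today):
    """Pick dataset partitions older than the retention window;
    these would be copied to S3 and then dropped from HDFS.
    `partitions` is a list of (path, partition_date) pairs."""
    cutoff = today - timedelta(days=retention_days)
    return [path for path, d in partitions if d < cutoff]

parts = [("ds/dt=2019-01-01", date(2019, 1, 1)),
         ("ds/dt=2019-08-01", date(2019, 8, 1))]
# with 90-day retention as of 2019-09-01, only the January partition qualifies
```

Keeping the selection separate from the copy/delete steps makes the policy easy to dry-run before anything is moved.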


Description

Client : Microsoft

Organisation : Infosys

Duration : Oct 2019 – May 2020

Description: To develop a testing framework for the HDInsight (HDI) product, in which we implemented scenarios to test all the Spark functionalities. This module is integrated with the Azure HDInsight product build pipeline so that, for any future Spark release, the implemented scenarios must execute successfully.

Technology Involved:

Spark (with Scala), Kafka, Azkaban, Azure Event hubs, HBase, SQL

Roles & Responsibility: (Specialist Programmer)

Requirement Analysis & Approach – Implemented the scenarios with a Spark Scala framework; as part of this, we developed a generic reader and writer framework used during scenario implementation.

Developed a data generation framework that supports data parallelization during Spark execution.

Developing ETL jobs using Spark, SQL and Hive.

Coding & Unit Testing
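The data-generation idea above — splitting synthetic data into independent partitions so Spark tasks can generate them in parallel — can be sketched in Python. A hedged illustration with an invented two-column schema; the real framework was Spark Scala:

```python
def generate_partition(part_id, rows_per_part):
    """Generate one partition's synthetic rows; in the real framework
    each Spark task would run this for its own partition id, so no
    two tasks produce overlapping ids."""
    start = part_id * rows_per_part
    return [{"id": start + i, "part": part_id}
            for i in range(rows_per_part)]

# 3 partitions x 2 rows each -> ids 0..5, generated independently
data = [row for p in range(3) for row in generate_partition(p, 2)]
```

Because each partition is a pure function of its id, the generation scales with the number of executors and is fully reproducible.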


Description

Client : Microsoft

Organisation : Infosys

Duration : Jun 2020 – Till Date

Description: The Azure Intelligence Platform uses Azure cloud services and Hadoop in its acquisition, integration, and distribution layers. We use ADF to orchestrate and build ETL jobs.

We are building different data models on this platform that integrate all subscriptions across the globe under a single data platform.
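ADF pipelines are authored as JSON documents of activities. A minimal, hypothetical copy-activity definition (all names invented, shown here as a Python dict for readability; not the project's actual pipelines) has roughly this shape:

```python
import json

# Hypothetical ADF pipeline: one Copy activity moving data from a
# Blob dataset to an ADLS dataset; all names here are invented.
pipeline = {
    "name": "CopySubscriptions",
    "properties": {
        "activities": [{
            "name": "CopyBlobToAdls",
            "type": "Copy",
            "inputs": [{"referenceName": "BlobSubscriptions",
                        "type": "DatasetReference"}],
            "outputs": [{"referenceName": "AdlsSubscriptions",
                         "type": "DatasetReference"}],
        }]
    },
}
print(json.dumps(pipeline, indent=2))
```

Orchestration then chains such activities with dependencies and triggers, which is how the ETL jobs above are scheduled.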

Technology Involved:

Azure Data Factory, Azure Databricks, Spark Scala, Cosmos, SQL, Kusto with different cloud storages and compute services (Blob, ADLS Gen1 and Gen2, Kusto, SQL Server, ADLA)

Roles & Responsibility: (Specialist Programmer)

Requirement Analysis & Approach – Creating ETL pipelines using Azure Data Factory.

Developing ETL jobs using Spark, SQL and Hive.

Coding & Unit Testing
