V.s.n R.

Full Stack Developer with 12+ years of experience in Java, J2EE and Big Data technologies

Bengaluru, India

Experience: 12 Years

Rate: 72,000 USD / Year

  • Availability: Immediate


About Me

I have 12+ years of IT experience, specializing in Java-based web technologies, SQL, PL/SQL, UNIX shell scripting and Big Data technologies.

Technical Skills


Languages : Java, Scala and PL/SQL

RDBMS : Oracle, MySQL

Big Data Technologies : Hadoop ecosystem, Spark, Hive, HBase, NoSQL databases

Cloud : Azure Data Lake and Databricks

Markup/Scripting : HTML, JavaScript and AngularJS

Web Technologies : Servlets, JSP

Tools : TOAD (9.6.1), SQL Developer, Jasper Reports

Messaging : JMS

Frameworks : Spring, Struts, Aspose, Quartz API, JAXB API and web services (REST and SOAP)

Web & App Servers : Apache Tomcat, WebLogic, JBoss and WebSphere

IDEs : Eclipse 3.4 and RAD 7.1

Source Control : IBM ClearCase, Tortoise SVN, CVS and Microsoft VSS

Automation Tools : ANT, Maven

Scheduling Tools : Falcon, Quartz API, Task Scheduler

 


Portfolio Projects

Description

We mainly provide credit (CIBIL) scores for bank customers. I work as the technical architect; the stack mainly consists of Scala, Spark, Kafka, Hive and Impala.
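A minimal sketch of the kind of ingestion this stack implies: Spark Structured Streaming reading score events from Kafka and landing them for downstream Hive/Impala queries. The broker, topic, schema and paths are hypothetical placeholders, not the project's actual configuration.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types.{IntegerType, StringType, StructType}

object ScoreIngestion {
  def main(args: Array[String]): Unit = {
    // Requires the spark-sql-kafka connector on the classpath
    val spark = SparkSession.builder()
      .appName("score-ingestion")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical event schema: customer id plus computed score
    val schema = new StructType()
      .add("customer_id", StringType)
      .add("score", IntegerType)

    // Read raw score events from a Kafka topic (placeholder broker/topic names)
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "customer-scores")
      .load()
      .select(from_json(col("value").cast("string"), schema).as("e"))
      .select("e.*")

    // Append parsed events as Parquet files that Hive/Impala tables can sit on top of
    events.writeStream
      .format("parquet")
      .option("path", "/warehouse/scores")
      .option("checkpointLocation", "/checkpoints/scores")
      .start()
      .awaitTermination()
  }
}
```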


Description

Company : IntelliFour Software Pvt Ltd

Client : HPE

Duration : July 2016 – Till Date

Environment : Java, Spark, Scala, HDFS, Hive, HBase, Azure Data Lake, Unix, QlikView

Role : BigData Technical Lead

About HPE:

Hewlett Packard Enterprise (commonly referred to as HPE) is an American multinational enterprise information technology company based in Palo Alto, California, founded in 2015 as part of the split of the Hewlett-Packard company. The company operates through five segments: Enterprise Group, Software, Enterprise Services, Financial Services and Corporate Investments.

Description:

The Entaly Platform is a proven, purpose-built platform that addresses the specific challenges of traditional enterprises trying to go digital. It is an innovative platform built on the best open-source technologies, using pre-defined modeling constructs to create representations of flows that capture the enterprise's physical and decision-making activities. Typical flows include sales and marketing, supply chain and service processes. By modeling all behavior and flow aspects of the enterprise, the platform provides a real-time view of what is happening in the business.

Responsibilities:

  • Loaded data from the source systems into HDFS and stored the net change raw data in Hive tables.
  • Processed the data using Spark.
  • Created Spark DataFrames and optimized code for faster performance.
  • Prepared unit test cases and performed unit testing.
  • Imported data from different source systems, including delimited text files and structured & semi-structured data, into the Hadoop data lake.
  • Wrote generic, configuration-file-driven code to ingest data from multiple source systems to the destination system.
  • Wrote business rules to process the data using Spark/Scala; these are configurable per business requirements.
  • Wrote code to handle duplicates and process only the net change records to the destination (see the sketch after this list).
  • Wrote Java and Scala code to make the Entaly platform generic.
  • Automated the jobs on a trigger-based system.
  • Created views for the reporting team to extract the data and generate reports.
  • Provided post-production support to the support team.
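A minimal sketch of the net change handling described above, assuming a Spark/Scala job that compares an incoming batch against the current Hive table and keeps only new or changed rows. The table names, key column and hashing approach are illustrative assumptions, not the project's actual schema.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, concat_ws, sha2}

object NetChange {
  // Keep only rows from the incoming batch whose business key is new,
  // or whose non-key column values differ from the current target rows.
  def netChange(incoming: DataFrame, current: DataFrame, keyCols: Seq[String]): DataFrame = {
    val valueCols = incoming.columns.filterNot(c => keyCols.contains(c)).toSeq

    def withHash(df: DataFrame): DataFrame =
      df.withColumn("row_hash", sha2(concat_ws("||", valueCols.map(col): _*), 256))

    val inc = withHash(incoming)
    val cur = withHash(current).select((keyCols :+ "row_hash").map(col): _*)

    // Left anti join: drop rows whose key + hash already exist unchanged in the target
    inc.join(cur, keyCols :+ "row_hash", "left_anti").drop("row_hash")
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("net-change").enableHiveSupport().getOrCreate()
    // Hypothetical source/target tables and key column
    val incoming = spark.table("staging.orders_raw")
    val current  = spark.table("dw.orders")
    netChange(incoming, current, Seq("order_id"))
      .write.mode("append").insertInto("dw.orders_net_change")
  }
}
```

The left anti join is just one way to express "only new or changed rows"; the point is that downstream tables receive the net change rather than a full reload.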


Description

Company : IntelliFour Software Pvt Ltd

Client: HPE

Duration : Sept 2015 – Nov 2016

Environment : Java, Spring, MySQL, Hive, Vertica, Unix, QlikView, AngularJS

Role : BigData Technical Lead

Description:

The analysis mainly covers the number of orders in ACTIVE, SHIPPED, CANCELED, REJECTED and ESD status.

Data from multiple source systems, such as an FTP server, a Windows shared drive and SharePoint, is ingested into HDFS using Java. The raw data is transformed per the business rules using Java, and the cleansed net change data is loaded into Hive tables. Data from Hive is pushed to a Vertica database, where further CRUD operations are applied before the target tables are finally populated. The QlikView/QlikSense reporting tool connects to Vertica and extracts reports for business users for analysis and decision making.
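A minimal sketch of the Hive-to-Vertica hand-off described above. The original pipeline used Java; this Spark/Scala snippet with the JDBC writer is purely illustrative, and the Hive table, Vertica host, schema and credentials are invented placeholders.

```scala
import org.apache.spark.sql.SparkSession

object HiveToVertica {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-to-vertica")
      .enableHiveSupport()
      .getOrCreate()

    // Read the cleansed net change data from a Hive table (placeholder name)
    val orders = spark.table("cleansed.orders_net_change")

    // Push the rows to Vertica over JDBC; reporting views are built on top of this table.
    // Requires the Vertica JDBC driver on the classpath.
    orders.write
      .format("jdbc")
      .option("url", "jdbc:vertica://vertica-host:5433/analytics")
      .option("driver", "com.vertica.jdbc.Driver")
      .option("dbtable", "public.orders_net_change")
      .option("user", sys.env.getOrElse("VERTICA_USER", "dbadmin"))
      .option("password", sys.env.getOrElse("VERTICA_PASSWORD", ""))
      .mode("append")
      .save()
  }
}
```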

Responsibilities:

  • Responsible for managing data from multiple sources.
  • Transformed data per the business requirements using Java.
  • Coordinated with both the on-site and offshore teams.
  • Created the tables in Hive.
  • Loaded the data into HDFS/Hive tables.
  • Exported the processed data from HDFS to Vertica.
  • Created views for the QlikView reporting team and worked with that team.
  • Handled job workflow/scheduling using the cron job API.
  • Prepared the test case documents, wrote unit test scripts and documented the test results.


Description

Company : Intellifour

Role : Developer

Team Size : 16

Environment : Hadoop, Hive and Sqoop

Duration : Nov 2014 to Sept 2015

Description:

This project mainly involves redesigning a platform running on an Oracle 11g database onto Hadoop, in order to process large data sets from different sources as data volumes grow and to meet the client's requirements. The data is stored in the Hadoop file system and processed using Hive HQL queries.

Data from different source systems is loaded into HDFS. The raw data is then processed with Hive queries per the business requirements and populated into the Hive target tables.
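A minimal sketch of this kind of Hive processing. The original work ran HiveQL directly (with Sqoop for the RDBMS imports); here the same style of HQL is simply wrapped in Spark's Hive support so the example stays in one language, and the database, table and column names are invented for illustration.

```scala
import org.apache.spark.sql.SparkSession

object HiveTransform {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-transform")
      .enableHiveSupport()
      .getOrCreate()

    // Target table in ORC format, mirroring the "Hive tables with ORC format" step
    spark.sql(
      """CREATE TABLE IF NOT EXISTS dw.retail_sales_orc (
        |  retailer_id STRING,
        |  sale_date   STRING,
        |  amount      DOUBLE
        |) STORED AS ORC""".stripMargin)

    // Business transformation: aggregate the raw feed and load the target table
    spark.sql(
      """INSERT INTO TABLE dw.retail_sales_orc
        |SELECT retailer_id, sale_date, SUM(amount) AS amount
        |FROM staging.retail_sales_raw
        |GROUP BY retailer_id, sale_date""".stripMargin)
  }
}
```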

Responsibilities:

  • Worked with the business team, architects and SMEs to understand the requirements.
  • Created the design document.
  • Moved flat files generated by various retailers to HDFS for further processing.
  • Imported structured data from the RDBMS into Hive tables using Sqoop.
  • Created Hive tables in ORC format for structured data using Hive QL.
  • Processed the data per the BRD and loaded the processed data into the final target tables.
  • Wrote Oozie workflows and used Falcon to schedule Hadoop jobs.
  • Prepared the test case documents, wrote unit test scripts and documented the test results.


Description

Role : Full Stack Developer

Environment : Java, JSP, Struts, Oracle 11g, Quartz, JAXB and WebLogic 10.3

Duration : Feb 2011 to Nov 2011

Description:

The Interface is a fully automated system that receives data from and sends data to different systems, such as the K+ system (a front-office system), the core banking system and the referential application (BDR). It can reconcile the data it receives between the front office (FO) and the back office (BO). Technically, it has a mature Quartz job process to handle the ASCII files, and it also provides sophisticated reports to the system administrator. It is capable of processing FX rate files (FIMMDA and Reuters).
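A minimal sketch of the kind of Quartz-based processing described above: a job scheduled on a cron trigger that picks up incoming ASCII files from a drop directory. The job class, directory and schedule are hypothetical, and the actual parsing/reconciliation logic is only indicated by a comment.

```scala
import java.nio.file.{Files, Paths}
import scala.jdk.CollectionConverters._
import org.quartz.{CronScheduleBuilder, Job, JobBuilder, JobExecutionContext, TriggerBuilder}
import org.quartz.impl.StdSchedulerFactory

// Quartz job that scans a drop directory and processes each ASCII file it finds
class AsciiFileJob extends Job {
  override def execute(ctx: JobExecutionContext): Unit = {
    val dropDir = Paths.get("/data/interface/incoming")
    Files.list(dropDir).iterator().asScala
      .filter(p => p.toString.endsWith(".txt"))
      .foreach { p =>
        // Real processing (parsing, FO/BO reconciliation, DB load) would go here
        println(s"Processing ${p.getFileName}")
      }
  }
}

object InterfaceScheduler {
  def main(args: Array[String]): Unit = {
    val scheduler = StdSchedulerFactory.getDefaultScheduler()
    val job = JobBuilder.newJob(classOf[AsciiFileJob]).withIdentity("asciiFileJob").build()
    // Run every 15 minutes (illustrative schedule)
    val trigger = TriggerBuilder.newTrigger()
      .withIdentity("asciiFileTrigger")
      .withSchedule(CronScheduleBuilder.cronSchedule("0 0/15 * * * ?"))
      .build()
    scheduler.scheduleJob(job, trigger)
    scheduler.start()
  }
}
```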

Responsibilities:

  • Gathered requirements from various subsidiaries and discussed feasibility, technicality and functionality.
  • Discussed with the business analyst team to validate the requirements.
  • Set up the environment and deployed the application using an ANT script.
  • Took ownership of the development of the Interface application.
  • Involved in writing Java code, Action classes, Form Bean classes and DAOs.
  • Involved in writing the presentation layer for the views (JSP).
  • Responsible for coding database connections using JDBC.
  • Responsible for writing PL/SQL scripts.
  • Responsible for unit testing.


Description

Role : Developer

Environment : Web services, Axis2, WebLogic 10.3, Oracle 11g

Duration : Nov 2012 to Nov 2013

Description:

E-banking is a fully automated system that maintains the customers' outstanding deal information, so a Customer Service (web service) was implemented to provide that information to the E-Banking system. Whenever a customer checks his outstanding deal information, the E-Banking system sends a request to the Customer Service with the customer ID, and the Customer Service sends all outstanding deal information back to the E-Banking system as the response.
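A minimal sketch of that request/response contract. The project used Axis2; this version uses the JDK's built-in JAX-WS (Java 8, or the JAX-WS RI on newer JDKs) purely to keep the example self-contained, and the service name, deal fields and sample data are invented.

```scala
import javax.jws.{WebMethod, WebParam, WebService}
import javax.xml.ws.Endpoint
import scala.beans.BeanProperty

// JAXB-friendly deal holder: no-arg constructor plus getters/setters
class Deal() {
  @BeanProperty var dealId: String = _
  @BeanProperty var amount: Double = 0.0
  @BeanProperty var currency: String = _
}

@WebService(serviceName = "CustomerService")
class CustomerService {
  // Returns all outstanding deals for the given customer id (stubbed data here)
  @WebMethod
  def outstandingDeals(@WebParam(name = "customerId") customerId: String): Array[Deal] = {
    val d = new Deal()
    d.setDealId("D-001"); d.setAmount(125000.0); d.setCurrency("EUR")
    Array(d)
  }
}

object CustomerServicePublisher {
  def main(args: Array[String]): Unit = {
    // Publish the SOAP endpoint locally for testing; WSDL is served at ?wsdl
    Endpoint.publish("http://localhost:8080/customerService", new CustomerService)
    println("CustomerService published at http://localhost:8080/customerService?wsdl")
  }
}
```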

Responsibilities:

  • Gathered requirements from various subsidiaries and discussed feasibility, technicality and functionality.
  • Discussed with the business analyst to validate the requirements.
  • Set up the environment and deployed the application using an ANT script.
  • Responsible for implementing the Customer Service (web service).
  • Responsible for writing PL/SQL scripts.
  • Responsible for unit testing.
  • Responsible for deploying the web service in the production environment.


Description

Client : ABN AMRO Bank (NL)

Role : Developer

Team Size : 2

Environment : Java, JSP, Struts, Oracle 10g

Duration : May 2009 to Feb 2011

Description:

GUA is an application developed mainly to provide common database services to other applications.

Global Transaction Services (GTS) is a set of applications used to carry out various operations of ABN AMRO Bank. Within GTS, several applications make use of the same data for their own operations. To ensure that all of these applications use the same data, and to facilitate its maintenance, the common data needs to be centralized. The centralization of this data in a single database is known as the Common Database, also referred to as the GUA application.

Responsibilities:

  • Involved in writing Java code, JSPs, Action classes, Form Bean classes and DAOs.
  • Involved in writing the presentation layer for the views (JSP).
  • Responsible for coding database connections using JDBC.
  • Responsible for writing PL/SQL scripts.
  • Responsible for unit testing.


Description

Client : ABN AMRO Bank (NL)

Role : Developer

Team Size : 2

Environment : Java, XML, Hibernate, IBM MQ Series and Oracle 10g

Duration : Jan 2009 to April 2009

Description:

Though the name suggests a database component, it is just a combination of Java programs and PL/SQL procedures used to populate the common database.

The mainframe system puts the data or records, in the form of messages, on a particular queue on a daily or weekly basis. A Java program keeps requesting the C-Bus layer, through the CBus API, for any messages; hence, in the CBus layer there must be a program that keeps watching a particular queue for messages of any kind. Once the Java program receives a message through CBus, it calls the respective PL/SQL procedures to populate it into the common database. All the extraction and population of data is therefore done only by the PL/SQL procedures and not by the Java programs.
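A minimal sketch of that message-driven population flow: a JMS listener reads a message from a queue and hands it to a PL/SQL procedure over JDBC. The JNDI name, queue, procedure, connection URL and credentials are illustrative placeholders (the actual C-Bus API and IBM MQ configuration are not shown), and the Oracle JDBC driver is assumed to be on the classpath.

```scala
import javax.jms.{Connection, ConnectionFactory, Message, MessageListener, Session, TextMessage}
import javax.naming.InitialContext
import java.sql.DriverManager

object QueueToPlsql {
  // Resolve the JMS ConnectionFactory from JNDI (in the real setup this would be the IBM MQ factory)
  def lookupConnectionFactory(): ConnectionFactory =
    new InitialContext().lookup("jms/ConnectionFactory").asInstanceOf[ConnectionFactory]

  def main(args: Array[String]): Unit = {
    val connection: Connection = lookupConnectionFactory().createConnection()
    val session: Session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE)
    val consumer = session.createConsumer(session.createQueue("COMMON.DB.FEED"))

    consumer.setMessageListener(new MessageListener {
      override def onMessage(msg: Message): Unit = msg match {
        case text: TextMessage =>
          // Delegate population to a PL/SQL procedure, as in the original design
          val db = DriverManager.getConnection("jdbc:oracle:thin:@db-host:1521:ORCL", "app", "secret")
          try {
            val call = db.prepareCall("{ call pkg_common_db.populate_record(?) }")
            call.setString(1, text.getText)
            call.execute()
          } finally db.close()
        case _ => // ignore non-text messages
      }
    })

    connection.start() // begin delivering messages to the listener
  }
}
```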

Responsibilities:

  • Involved in writing Java classes, Java Beans, XML and Unix shell scripts.
  • Responsible for coding database connections using JDBC.
  • Responsible for writing PL/SQL scripts.
  • Responsible for unit testing.


Description

Client : American Airlines (USA)

Role : Developer

Team Size : 10

Environment : Webservice and EJB.

Duration : Jan 2008 to Dec 2008

Description:

Air SOA is an online flight reservation system. Using this application, any user can book AA flight tickets online, and if the user wants, he can also change or cancel tickets online.

Responsibilities:

  • Involved in writing Java code, web service implementation classes and Session Beans.
  • Responsible for Unit Testing.


Description

Client : HP (India)

Role : Developer

Team Size : 4

Environment : Java, JSP.

App. Server : JBoss

Duration : Jan 2007 –Dec 2007

Description:

The HP Server Migration Pack – Universal Edition (SMP Universal) simplifies the server consolidation process. HP Server Migration Pack – Universal Edition migrations involve migrating an operating system, applications, and data from one server to another, instead of manually redeploying these elements on a new server. SMP Universal provides the following migration capabilities.

  • Physical-to-ProLiant (P2P) migration: migrates a physical machine to a ProLiant server
  • Physical-to-virtual (P2V) migration: migrates a physical machine to a virtual machine guest
  • Virtual-to-ProLiant (V2P) migration: migrates a virtual machine guest to a ProLiant server
  • Virtual-to-virtual (V2V) migration: migrates a virtual machine guest between different virtualization layers

Responsibilities:

  • Responsible for writing Java Class and Form Beans.
  • Involved in Writing Presentation layer for the views (JSP).
  • Responsible for Unit Testing.



Description

Maalive is an entertainment website for live TV and movies; it is developed and managed by me.

I used AngularJS, HTML5 and Bootstrap 3 for the front-end development, and Spring MVC, Hibernate, MySQL and PL/SQL for the back-end development.


Description

Company : Innovations

Client : Emaar Dubai

Duration : Jan 2019 – Till Date

Environment : Java, Spark, Scala, Hive, HBase, Azure Data Lake, Event Hub, Unix, Power BI

Role : BigData Principal Consultant

Description:

Emaar Properties is one of the world’s most valuable and admired real estate development companies, with proven competencies in properties, shopping malls & retail, and hospitality & leisure.

The company operates internationally, providing property development and management services. With six business segments and 60 active companies, Emaar has a collective presence in 36 markets across the Middle East, North Africa, Pan-Asia, Europe and North America.

The goal is to create a uniform platform that pulls data from multiple sources and stores it in Azure Data Lake (the Big Data platform), which in turn supports dashboards for reporting purposes.

The current data sources are Salesforce, Avaya, Vista and social networks.
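A minimal sketch of one such ingestion path: pulling a table from a relational source over JDBC with Spark and landing it as Parquet in Azure Data Lake Storage. The SQL Server host, database, table, storage account, container and partition column are placeholders, not the actual configuration.

```scala
import org.apache.spark.sql.SparkSession

object SourceToDataLake {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("source-to-adls")
      .getOrCreate()

    // Pull one source table over JDBC (illustrative SQL Server connection; driver assumed on classpath)
    val sales = spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://sql-host:1433;databaseName=crm")
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .option("dbtable", "dbo.sales_orders")
      .option("user", sys.env.getOrElse("SQL_USER", "etl"))
      .option("password", sys.env.getOrElse("SQL_PASSWORD", ""))
      .load()

    // Land the raw extract in ADLS Gen2 as Parquet, partitioned for downstream Hive/Power BI use
    sales.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("abfss://raw@datalakeaccount.dfs.core.windows.net/crm/sales_orders")
  }
}
```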

Responsibilities:

  • Involved in designing the data flow from the source systems to Azure Data Lake and storing the net change raw data in Hive tables.
  • Processed the data using Spark.
  • Created Spark DataFrames and optimized code for faster performance.
  • Prepared unit test cases and performed unit testing.
  • Imported data from different source systems (MS SQL Server, Salesforce, Avaya, Vista), covering structured and semi-structured data, into Azure Data Lake.
  • Wrote generic, configuration-file-driven code to ingest data from multiple source systems to the destination system (see the sketch after this list).
  • Wrote business rules to process the data using Spark/Scala; these are configurable per business requirements.
  • Wrote code to handle duplicates and process only the net change records to the destination.
  • Created views on the Hive tables for the reporting team to extract the data and generate reports.
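A minimal sketch of the configuration-driven ingestion mentioned above: a plain text config lists source tables and target paths, and the job loops over its entries so that adding a new source is only a config change. The config format, JDBC details and paths are invented for illustration.

```scala
import scala.io.Source
import org.apache.spark.sql.SparkSession

object ConfigDrivenIngestion {
  // One config line: "<source table>,<target path>"
  final case class IngestEntry(sourceTable: String, targetPath: String)

  def loadConfig(path: String): Seq[IngestEntry] = {
    val src = Source.fromFile(path)
    try {
      src.getLines()
        .map(_.trim)
        .filter(l => l.nonEmpty && !l.startsWith("#"))
        .map { line =>
          val Array(table, target) = line.split(",", 2).map(_.trim)
          IngestEntry(table, target)
        }
        .toList
    } finally src.close()
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("config-driven-ingestion").getOrCreate()

    // Every entry is ingested the same way; only the config file changes per source
    loadConfig("/etc/ingestion/sources.conf").foreach { entry =>
      spark.read
        .format("jdbc")
        .option("url", "jdbc:sqlserver://sql-host:1433;databaseName=crm")
        .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
        .option("dbtable", entry.sourceTable)
        .option("user", sys.env.getOrElse("SQL_USER", "etl"))
        .option("password", sys.env.getOrElse("SQL_PASSWORD", ""))
        .load()
        .write
        .mode("overwrite")
        .parquet(entry.targetPath)
    }
  }
}
```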
