Portfolio Projects
Description
This was a standalone application responsible for populating catalogue data into a database. The catalogue data was a collection of XML files packaged into a zip file. Contents of the zip file:
- The zip file contained one descriptor file and multiple data files.
- Every data file belonged to some category of data.
- Every data file had a maximum size; if the data for a category exceeded that size, it was split across multiple files for that category.
- The descriptor file described all the data files and their categories.
The application took this zip file as input and processed it to extract the information to be inserted into the database. It had the following components:
- Unzipper: responsible for unzipping the file.
- File Processor: executed in multiple thread pools. Thread pools were of two types - one for the success flow and one for the failure flow.
  1. Success Flow: inserted the data into the database in auto-commit mode.
  2. Failure Flow: rolled back the data in case of failure. All inserts were rolled back if even a single insert failed.
Role and Responsibility: I worked as an individual contributor on this project.
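The two-pool design above can be sketched as follows. This is a minimal illustration, not the project's actual code: the class, method names, and the in-memory store standing in for the real database are all invented, and a simple list simulates the commit/rollback semantics.

```java
import java.util.*;
import java.util.concurrent.*;

// Hypothetical sketch of the two-pool file processor: the success pool inserts
// records one at a time (auto-commit style), while the failure pool treats a
// whole file as one transaction and discards everything if any record fails.
public class CatalogueLoader {
    // Simulated transactional store standing in for the real database.
    static class Store {
        final List<String> committed = Collections.synchronizedList(new ArrayList<>());

        // Auto-commit insert used by the success flow: each record lands immediately.
        void insertAutoCommit(String record) { committed.add(record); }

        // All-or-nothing insert used by the failure flow: stage everything,
        // commit only if no record failed (an empty record simulates a failure).
        void insertAll(List<String> records) {
            List<String> staged = new ArrayList<>();
            for (String r : records) {
                if (r.isEmpty()) return;   // any failure -> roll back (drop staged rows)
                staged.add(r);
            }
            committed.addAll(staged);
        }
    }

    public static List<String> load(List<String> goodFile, List<String> badFile) throws Exception {
        Store store = new Store();
        ExecutorService successPool = Executors.newFixedThreadPool(2);
        ExecutorService failurePool = Executors.newFixedThreadPool(2);
        successPool.submit(() -> goodFile.forEach(store::insertAutoCommit));
        failurePool.submit(() -> store.insertAll(badFile));
        successPool.shutdown();
        failurePool.shutdown();
        successPool.awaitTermination(5, TimeUnit.SECONDS);
        failurePool.awaitTermination(5, TimeUnit.SECONDS);
        return store.committed;
    }
}
```

Keeping the two flows on separate pools means a file that needs rollback never blocks the auto-commit traffic.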
Description
This project was about enabling an enterprise application to be deployed on the VMware EAAS cloud. The following activities were carried out to implement this:
- Implementing Consul: all the application configuration properties were moved to Consul tokens.
- Implementing RPM: all the application artifacts, i.e. EAR, WAR and other related configurations and archives, were packaged into an RPM (Red Hat Package Manager) file at build time. The RPM file was uploaded to a Nexus repository.
- Implementing Ansible: application-specific Ansible playbooks were created around the RPM file. These playbooks were responsible for downloading the RPM from its Nexus location to the target machine and then deploying the artifacts on the application or web server.
- Implementing Consul Service Discovery: this application was part of a large software system in which it was a service provider to many other applications as well as a client of many others. It registered itself with the Consul service registry so that other applications could discover it by name, and it discovered its own service providers by name from the registry.
Role and Responsibility: I worked as an individual contributor on this project.
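The self-registration step can be sketched against Consul's agent HTTP API, which accepts a PUT of a JSON service definition at /v1/agent/service/register. This is a hedged illustration only: the service name, port, and agent URL below are placeholders, not the project's real values, and the actual send is left commented out since it needs a running Consul agent.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Hypothetical sketch of registering a service with the local Consul agent so
// that peer applications can discover it by name. Values are illustrative.
public class ConsulRegistrar {
    // Builds the minimal JSON service definition Consul expects.
    static String registrationJson(String name, int port) {
        return String.format("{\"Name\":\"%s\",\"Port\":%d}", name, port);
    }

    // Builds the PUT request against the agent's registration endpoint.
    public static HttpRequest registerRequest(String agentUrl, String name, int port) {
        return HttpRequest.newBuilder()
                .uri(URI.create(agentUrl + "/v1/agent/service/register"))
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(registrationJson(name, port)))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = registerRequest("http://localhost:8500", "inventory-service", 8080);
        // Sending requires a running Consul agent:
        // HttpClient.newHttpClient().send(req, HttpResponse.BodyHandlers.ofString());
        System.out.println(req.uri());
    }
}
```

Discovery then works in reverse: clients resolve the registered name through the same agent instead of hard-coding host and port.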
Description
This project was about automating the manual work performed by the connection team. The team had to perform a manual activity to adapt a vendor domain to our standard data model. Details of the activity:
- A team member had to download a large CSV file from FTP.
- They had to compare this file with a previous file, based on some key fields in the CSV, in order to determine the insert and update candidates.
- There was no guarantee of the order of the lines in the new file: new or updated lines could appear at the start, in between, or at the end of the file.
- Team members had to check each line of the new file against the previous file to determine the new or updated records.
- After determining the insert and update candidates, SQL queries were generated for the database schema.
- These queries were executed in a staging area to verify their correctness.
- After verification, the SQL queries were packed into an installable package.
- The activity was performed every month and took at least two weeks to produce the installable package.
This project automates the above activities. Components of this project:
- Scheduler: the user of the application, which keeps executing the application at specific intervals (15 days). It is not part of the actual application.
- Launcher: the entry point of the application, which handles all activities from reading input to writing output.
- Downloader: encapsulates downloading the vendor input file from the specified servers.
- Input: encapsulates parsing of the input files and populating them into the input model objects (Java objects).
- Mapping: encapsulates parsing of the mapping file and populating it into mapping model objects (Java objects).
- Query: encapsulates the creation of SQL queries. It uses the input and mapping components while creating the SQL queries.
- Query Executor and Package Creator: executes the queries in the staging area and then creates the installable package.
Role and Responsibility: I worked as an individual contributor on this project and completed it single-handedly.
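The order-independent comparison step can be sketched as follows: index the previous file's rows by their key field in a map, then classify each new row as an insert (key unseen) or an update (key seen, value changed), regardless of where it sits in the file. The column layout (key in the first field) and class name are assumptions for illustration.

```java
import java.util.*;

// Illustrative delta step: previous rows are keyed in a HashMap, so line order
// in the new file is irrelevant when deciding insert vs. update candidates.
public class CsvDelta {
    public static Map<String, List<String>> classify(List<String> previous, List<String> current) {
        // Index the previous file by its key field (assumed to be field 0).
        Map<String, String> prevByKey = new HashMap<>();
        for (String line : previous) {
            prevByKey.put(line.split(",", 2)[0], line);
        }
        List<String> inserts = new ArrayList<>();
        List<String> updates = new ArrayList<>();
        for (String line : current) {
            String key = line.split(",", 2)[0];
            String old = prevByKey.get(key);
            if (old == null) inserts.add(line);          // key never seen -> insert candidate
            else if (!old.equals(line)) updates.add(line); // key seen, row changed -> update candidate
        }
        Map<String, List<String>> result = new LinkedHashMap<>();
        result.put("insert", inserts);
        result.put("update", updates);
        return result;
    }
}
```

A single map lookup per row replaces the line-by-line scan the team had to do by hand, which is what collapses two weeks of comparison into one pass over the file.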
Description
This was an Eclipse plugin used to create and update a mapping file. The mapping file was used to map vendor data to specific columns in a standard schema. The application had the following two components:
- File Creation and Update: responsible for creating a new mapping file or updating an existing one. This was done in an Eclipse editor in the mapping designer perspective.
- File Execution: a simulation activity to test and correct the mapping. Execution was done in the run perspective, where one could select a mapping file, an input file and a database schema to check whether an input field was inserted into the intended table and column.
Role and Responsibility: I worked as an individual contributor on this project.
Description
This was a web-based J2EE application deployed centrally on an application server, with a web interface accessed by end users from different sites. The application broadly had the following three scenarios to deal with:
- File Upload: functionality to upload JT files (large media files with the .jt extension) to the central file server and store the metadata in the database (DB). Each time a file is uploaded, the metadata is first indexed using SOLR.
- Search: free-text search over the indexed metadata. The search results are fetched and displayed to the user in the UI.
- JT Part View: functionality to display the selected part in the viewer. On selection of a part, its metadata is fetched from the DB and the JT file is fetched from the user's local cache. If the file is not found there, it is fetched from the central file cache and displayed in the viewer along with the metadata. A file fetched from the central file cache is also stored in the local cache, so that any future reference can be served locally.
Role and Responsibility: I worked as an individual contributor on this project and completed it single-handedly.
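The two-level fetch in the JT Part View can be sketched as below. This is a minimal sketch, not the project's code: in-memory maps stand in for the real local and central file stores, and all names are illustrative.

```java
import java.util.Map;

// Sketch of the cache-fallback fetch: try the local cache, fall back to the
// central cache on a miss, and warm the local cache with whatever was fetched
// so future requests for the same part are served locally.
public class JtFileFetcher {
    private final Map<String, byte[]> localCache;
    private final Map<String, byte[]> centralCache;

    public JtFileFetcher(Map<String, byte[]> local, Map<String, byte[]> central) {
        this.localCache = local;
        this.centralCache = central;
    }

    // Returns the file bytes, or null if the file exists in neither cache.
    public byte[] fetch(String fileName) {
        byte[] data = localCache.get(fileName);
        if (data == null) {
            data = centralCache.get(fileName);
            if (data != null) {
                localCache.put(fileName, data); // warm the local cache for next time
            }
        }
        return data;
    }
}
```

The write-back on a central-cache hit is what keeps repeat views of the same part off the network.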
Description
This was an integration project in which two different applications had to exchange data from the same application domain but with different representations and terminologies. It was a standalone project which converted a specific XML file into another specific file by applying complex business logic.
Role and Responsibility: I played the role of architect cum developer, with two more associate developers, to complete the project.
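One simple form such a terminology conversion can take is rewriting one application's XML vocabulary into the other's via a lookup table. The project applied much richer business logic than this; the sketch below, including the tag map and class name, is entirely invented for illustration and uses only the JDK's DOM and transformer APIs.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Hedged sketch: rename elements from one application's vocabulary to the
// other's using a lookup table (the mappings here are made up).
public class XmlVocabularyConverter {
    static final Map<String, String> TAG_MAP = Map.of("custNo", "customerId", "amt", "amount");

    public static String convert(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        rename(doc.getDocumentElement(), doc);
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        StringWriter out = new StringWriter();
        t.transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    // Depth-first walk: rename children first, then the node itself.
    private static void rename(Node node, Document doc) {
        NodeList children = node.getChildNodes();
        for (int i = 0; i < children.getLength(); i++) {
            rename(children.item(i), doc);
        }
        if (node.getNodeType() == Node.ELEMENT_NODE && TAG_MAP.containsKey(node.getNodeName())) {
            doc.renameNode(node, null, TAG_MAP.get(node.getNodeName()));
        }
    }
}
```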
Description
This was a project on an Event Management System.
Responsibility: This was a client-server project where I was responsible for developing the server side and the database. There were multiple mobile clients and a PHP web client which requested responses from the server. The project was developed from scratch; I was the analyst, designer and developer for the server side and the database, and I created the communication protocol for exchanging data with the clients.
Description
This was a project on a Change Management System.
Responsibility: This was the first project that I completed single-handedly; there was no team member other than me. The project was developed from scratch, with me as the analyst, designer and developer. I was also responsible for creating the testing strategies for the QA team.
Description
This was a framework and middle-tier application that let various front tiers interact with various information systems, including databases, mainframes, ERPs, etc.
Responsibility: I was involved as a developer in developing and analyzing the code, resolving production tickets and bugs, debugging, and testing.
Description
Pay-Per-View (often abbreviated PPV) is a system in which television viewers can purchase events to be seen on TV and pay for the private telecast of that event to their homes. The event is shown at the same time to everyone ordering it, as opposed to video-on-demand systems, which allow viewers to watch the event at any time. Events can be purchased using an on-screen guide, an automated telephone system, or through a live customer service representative. Events include feature films, sporting events, and pornographic movies.
Responsibility: I was involved as a programmer.