N S ANIL
About Me
2+ years of experience developing machine learning pipelines for image analysis. 3 years of experience developing and deploying web services at production scale. 5 years of experience in backend development for real-time applications in ...
Skills
Portfolio Projects
Description
- Researchers need to run existing sequences through multiple bioinformatics pipelines and then review the hits found across different diseases. Once they have resolved their hits, they can apply to the regulatory wing for further greenhouse testing. Before approval is granted, the sequence is screened by different experts, and the researcher receives approval based on their comments.
- We provide a website where researchers can run their analyses as required. They can further edit the DNA in the tool and resubmit the same sequence for analysis. Once they are satisfied with the results, they can tag an approver for sign-off. The approver can enter comments directly on the results page and, once convinced by the researcher's results, approve or reject the submission on that same page.
- Integrated the existing bioinformatics pipelines into this application.
- Developed Python services for serving results, using ORACLE and MYSQL to store data depending on the input source.
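As a minimal sketch of the Python results service described above (the route, payload fields, and in-memory store are assumptions; the real service reads from ORACLE or MYSQL depending on the input source):

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory stand-in for the ORACLE/MYSQL result store.
RESULTS = {
    "seq-001": {"pipeline": "blast", "hits": 12, "status": "pending-approval"},
}

@app.route("/results/<sequence_id>")
def get_results(sequence_id):
    # Look up the analysis results for one submitted sequence.
    result = RESULTS.get(sequence_id)
    if result is None:
        return jsonify({"error": "sequence not found"}), 404
    return jsonify(result)
```

The approver workflow would hang off the same results resource, e.g. a POST endpoint recording comments and an approve/reject decision against the sequence ID.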
Description
- Researchers store experiment information for all interesting activities in a database. Before experimenting with a new homolog, they mine the existing database to find which remaining homologs have shown better results.
- Provided a web application that tracks all data entered by users and the ownership of each activity.
- Provided web services for the UI (Angular 4) backed by an ORACLE database, using the Flask framework with the blueprints concept for modularity and SQLAlchemy as the ORM. Alembic is used for database migrations and tracking future schema changes.
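The blueprint-based modularity mentioned above can be sketched as follows (the blueprint name, URL prefix, and sample payload are hypothetical; real data would come from ORACLE via SQLAlchemy):

```python
from flask import Blueprint, Flask, jsonify

# Hypothetical blueprint grouping the activity-tracking endpoints; in the
# real app each module (activities, ownership, ...) gets its own blueprint.
activities_bp = Blueprint("activities", __name__, url_prefix="/api/activities")

@activities_bp.route("/")
def list_activities():
    # Served to the Angular 4 UI as JSON.
    return jsonify([{"id": 1, "owner": "researcher-a", "homolog": "H1"}])

def create_app():
    # Application factory: register each module's blueprint here.
    app = Flask(__name__)
    app.register_blueprint(activities_bp)
    return app
```

Keeping each feature area in its own blueprint lets the modules evolve independently while Alembic tracks the shared schema.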
Description
Researchers run BLAST for the initial analysis of genes. This project arose from the company's migration from in-house infrastructure to the cloud; its goals are to secure our internal datasets and reduce the time required for analysis.
Wrote the dbman, jobman, and jobresults services and deployed them to the cloud using Lambda.
Set up BioMAJ as an automatic dataset-synchronisation tool and created our own Docker image with the required functionality. The synchronisation task is triggered by CloudWatch event-based or schedule-based rules.
Job execution is handled by AWS Batch, split across two queues: one for small jobs, backed by a predefined instance that is always up and running, and a second for large jobs, which scales according to demand.
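The two-queue split can be sketched as a simple routing rule (the queue names and the size threshold are assumptions; the actual submission would go through the AWS Batch API):

```python
# Hypothetical routing rule for the two AWS Batch queues described above:
# small BLAST jobs go to the always-on queue, large ones to the
# auto-scaling queue.
SMALL_JOB_QUEUE = "blast-small-jobs"   # predefined instance, always running
LARGE_JOB_QUEUE = "blast-large-jobs"   # scales according to demand

def pick_queue(query_length_bp, threshold_bp=50_000):
    """Route a BLAST job to a Batch queue based on query size."""
    if query_length_bp <= threshold_bp:
        return SMALL_JOB_QUEUE
    return LARGE_JOB_QUEUE
```

Keeping one warm queue avoids cold-start latency for the common small jobs, while the scaling queue absorbs occasional large workloads without paying for idle capacity.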
Description
Identify plant diseases from pictures taken by drone cameras. Convert RGB images into Digital Elevation Models to capture the actual shape of the plant. Train the model on existing ground-truth images with expert input as bias. Deep learning is done with the TensorFlow Keras library, using the ReLU activation function for hidden layers and softmax for the output layer. TensorFlow training runs on GPU instances (Amazon EC2 P3) for fast execution. Hyperparameters are tuned for model selection, and the model is integrated into a pipeline fed with real-time images.
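A minimal Keras sketch of the described architecture (ReLU hidden layers, softmax output); the input shape, layer sizes, and class count are assumptions, not values from the original project:

```python
import tensorflow as tf

def build_classifier(input_shape=(64, 64, 3), num_classes=4):
    # Sketch only: ReLU activations in the hidden layers, softmax on the
    # output layer, as described above. Sizes are illustrative.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

On the EC2 P3 instances, the same `model.fit` call picks up the GPU automatically when a GPU-enabled TensorFlow build is installed.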
Description
Corteva in India recruits several thousand farmers every year for seed production and invests heavily in farmer training. This expenditure recurs because the retention rate is very low. Worked with the business team to understand the current data and perform feature engineering. Applied a random forest with 5000 trees and a batch size of 300, producing a model that predicts retention with 80 percent accuracy. The business team will use this model on future years' data. The model was developed in R, with visualisations provided as Spotfire files.
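The original model was built in R; as a hedged Python sketch of the same random-forest approach (synthetic stand-in data, and a reduced tree count so the sketch runs quickly — the real model used 5000 trees on the business team's retention features):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the farmer-retention dataset.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# n_estimators mirrors the trees-count idea above, shrunk for speed.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

With real data, the fitted model's `feature_importances_` would show which engineered features drive the retention prediction.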
Description
Proposed a pipeline to analyse data from a real-time feed and provide recommendations to end users as part of a hack-a-thon. Created a cluster using AWS EMR (one master and 11 slaves) for Spark jobs to train the model. Designed the architecture for pipeline integration on the AWS cloud, storing location details along with weather-prediction values for future use. The main idea is to process the location details and weather predictions to identify the geographic spread of disease.
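The final step of the idea — combining location details with weather predictions to flag likely disease spread — might look like this in plain Python (the record fields and the humidity/temperature thresholds are assumptions; the real pipeline ran as Spark jobs on the EMR cluster):

```python
def flag_risk_locations(records, humidity_threshold=80, temp_range=(20, 30)):
    """Return locations whose predicted weather favours disease spread.

    records: iterable of dicts with keys "location", "humidity", "temp_c".
    The thresholds are illustrative, not the project's actual rule.
    """
    lo, hi = temp_range
    at_risk = []
    for rec in records:
        if rec["humidity"] >= humidity_threshold and lo <= rec["temp_c"] <= hi:
            at_risk.append(rec["location"])
    return at_risk
```

In the Spark version the same predicate would run as a filter over the stored location/weather records, partitioned by region.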
Description
Capture RGB and LIDAR images from drones. Construct DEMs (Digital Elevation Models) from the RGB images, build models on the different image formats, and select the best format. TensorFlow training runs on GPU instances (Amazon EC2 P3) for fast execution. Hyperparameters are tuned for model selection, and the model is integrated into a pipeline fed with real-time images.