Ankit S.

Lead Data Scientist with 9.4 Years of Experience

New Delhi, India

Experience: 9 Years

66736.7 USD / Year

  • Notice Period: Days


About Me

  • MCA from Harcourt Butler Technological Institute (HBTI) Kanpur.
  • Manage a team of data scientists, machine learning engineers and big data specialists.
  • Lead data mining and collection procedures.
  • Ensure data quality and integrity.
  • Interpret and analyze data problems.
  • Conceive, plan and prioritize data projects.
  • Build analytic systems and predictive models.
  • Test performance of data-driven products.
  • Visualize data and create reports.
  • Experiment with new models and techniques.
  • Align data projects with organizational goals.
  • Highly focused and process-driven.
  • Proven experience implementing and deploying advanced statistical solutions using R/Python.
  • Knowledge of full application life cycle design tools and methodologies.
  • Apply machine learning algorithms, statistical data analysis, computational linguistics, and similar tools to find useful patterns in data collections.
  • Practical experience doing hands-on Data Analysis and prototyping Data Products.
  • Strong technical background with proven success in projects involving image and video processing and computer vision – including open source platforms in computer vision and machine learning.
  • Knowledge of the OpenCV imaging API (Filters, Transforms, Colour Domain, Image Codecs, Object Classification, Colour Supervision); an illustrative sketch follows at the end of this section.
  • Practical experience with TensorFlow, Keras, PyTorch, Caffe, and Theano, and with developing on Linux with GPU machines (GTX 1080).
  • Measure, interpret, and derive learnings from results of analysis that will lead to improvements in underlying customer business processes, products or services.
  • Natural Language Processing (NLP), linguistics, advanced semantic design, information extraction, information retrieval, probabilistic decision making, image recognition, deep learning, machine learning, cognitive science, and analytics.
  • Ability to work with both data and analysis at scale.
  • Deep understanding of a variety of Statistical Methods and Machine Learning Algorithms.
  • Ability to test and debug applications effectively.
  • Capable of learning new technologies quickly and evaluating their architectural applicability.

Self-starter who works with minimal supervision and delivers under pressure and tight deadlines.
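As a minimal illustration of the OpenCV imaging work listed above, here is a hedged sketch, assuming a local image file (the path, kernel size, and target resolution are hypothetical placeholders, not project values), that applies a filter, a colour-domain conversion, a geometric transform, and a codec step:

```python
import cv2

# Hypothetical input path; any BGR image on disk works here.
image = cv2.imread("sample_frame.png")

# Filter: smooth high-frequency noise with a 5x5 Gaussian kernel.
blurred = cv2.GaussianBlur(image, (5, 5), 0)

# Colour domain: convert from BGR to HSV for colour-based checks.
hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)

# Transform: resize to a fixed resolution before feeding a classifier.
resized = cv2.resize(hsv, (224, 224), interpolation=cv2.INTER_AREA)

# Codec: re-encode the processed frame as PNG bytes.
ok, encoded = cv2.imencode(".png", resized)
print(resized.shape, "encoded bytes:", len(encoded) if ok else 0)
```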


Portfolio Projects

Description

  • Installed the workstation and configured an NVIDIA GTX 1080 with CUDA and TensorFlow GPU.
  • Created input images (Black, BSOD, Green, Green Patch, and Flicker) using a Python script.
  • Performed pre-processing and feature engineering on images.
  • Built a CNN model from scratch in Python.
  • Generated data for real-time anomalies.
  • Tuned model parameters and evaluated image parameters.
  • Performed image and video analysis using a CNN (a minimal CNN sketch follows below).

Deployed the project using Python.
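A minimal sketch of the kind of CNN classifier described above, using Keras; the input size, class count, and layer choices are illustrative assumptions, not the confidential Intel configuration:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 5  # e.g. black, BSOD, green, green patch, flicker (illustrative)

def build_anomaly_cnn(input_shape=(224, 224, 3)):
    """Small CNN for classifying display-anomaly frames; an illustrative sketch."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Rescaling(1.0 / 255),            # normalise pixel values
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_anomaly_cnn()
model.summary()
```

The same model can then be trained on the generated anomaly images with `model.fit(...)` and exported for deployment from the Python side.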


Description

VC Firm is a venture capital investment company that constantly monitors start-ups and other companies to identify prospective companies to invest in and fund. They had been following a manual decision-making process for choosing which companies to back, which had become increasingly unreliable given the nature of the business, and the decisions made from that research were not accountable. Their objective is to automate decision making with advanced analytical and statistical modelling techniques driven by data, rather than relying solely on empirical experience, and to make the process accountable, reliable, and able to scale as they expand.
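A minimal sketch of how such a data-driven investment decision could be modelled, assuming a hypothetical CSV of historical deals with engineered features and an "invested" outcome column (the file name and every column name here are illustrative assumptions, not the firm's actual data):

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical historical deal data; column names are illustrative assumptions.
deals = pd.read_csv("historical_deals.csv")
features = deals[["founder_experience_yrs", "revenue_growth",
                  "market_size_usd", "burn_rate"]]
target = deals["invested"]  # 1 = the firm invested, 0 = passed

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.2, random_state=42, stratify=target)

# Gradient boosting as one reasonable choice for tabular decision modelling.
model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```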


Description

Nuance needs to manage a high volume of a wide variety of data (structured and unstructured) at different frequencies, with virtually no scaling limits (for example: EMRs, HIE feeds, Nuance's products, third-party vendors, etc.). NDL (the Nuance Data Lake) provides a workflow engine and scheduling for data curation, cleaning, computing, filtering, and formatting. The goal of NDL is to enable products with a federated pipeline of curated data for reporting, analytics, and product consumption.
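As a minimal illustration of a single curation step in such a pipeline (the file paths, column names, and cleaning rules below are hypothetical, not the actual NDL workflow):

```python
import pandas as pd

def curate_emr_extract(raw_path: str, curated_path: str) -> pd.DataFrame:
    """One illustrative cleaning/filtering/formatting step for a data-lake pipeline."""
    raw = pd.read_csv(raw_path)

    # Cleaning: drop exact duplicates and rows missing a patient identifier.
    cleaned = raw.drop_duplicates().dropna(subset=["patient_id"])

    # Formatting: normalise timestamps and lowercase categorical codes.
    cleaned["encounter_ts"] = pd.to_datetime(cleaned["encounter_ts"], errors="coerce")
    cleaned["encounter_type"] = cleaned["encounter_type"].str.lower()

    # Filtering: keep only records from the last two years.
    cutoff = pd.Timestamp.now() - pd.DateOffset(years=2)
    curated = cleaned[cleaned["encounter_ts"] >= cutoff]

    # Write the curated slice in a columnar format for downstream analytics.
    curated.to_parquet(curated_path, index=False)
    return curated
```

In a real data lake, a step like this would be registered with the workflow engine and scheduled, with each curated output feeding the federated reporting and analytics pipeline.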


Description

WIFIRE is an integrated system for wildfire analysis, with specific regard to changing urban dynamics and climate. The system integrates networked observations such as heterogeneous satellite data and real-time remote sensor data, with computational techniques in signal processing, visualization, modelling, and data assimilation to provide a scalable method to monitor such phenomena as weather patterns that can help predict a wildfire's rate of spread.


Description

AC Miner is a coal mining company based in Australia and one of the largest coal miners in the world. They operate hundreds of pieces of heavy machinery, each costing millions of dollars, and needed a way to constantly monitor downtime to understand the resource utilization of the various machines used in coal mining. Even a short outage on any of these machines can cost thousands to millions of dollars a day, so they wanted to predict whether a machine was about to fail as part of their predictive maintenance strategy.
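A minimal sketch of the kind of failure-prediction model such a predictive maintenance strategy might use, assuming hypothetical hourly sensor telemetry labelled with whether the machine failed within the next 24 hours (the file name and all feature names are illustrative assumptions):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical telemetry: one row per machine per hour; column names are assumptions.
telemetry = pd.read_csv("machine_telemetry.csv")
features = telemetry[["vibration_rms", "bearing_temp_c",
                      "hydraulic_pressure", "hours_since_service"]]
target = telemetry["fails_within_24h"]  # 1 if a failure occurred in the next 24 hours

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.25, random_state=7, stratify=target)

# Class weighting helps because failures are rare relative to normal operation.
model = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=7)
model.fit(X_train, y_train)

# Rank machines by predicted failure risk so maintenance can be scheduled early.
risk = model.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, risk))
```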


Description

Team Size: 6
Role: Lead Data Scientist
Platform: Linux
Technical Skills: Python, NumPy, Pandas, Scikit-Learn, CNN, Image Analysis, Video Analysis, Computer Vision, Audio Analysis
IDE/Tools: Python IDLE, Jupyter Notebook, Spyder, TensorFlow, Keras, Deep Learning
Project Overview: Intel Confidential

Project Responsibilities:
  • Installed the workstation and configured an NVIDIA GTX 1080 with CUDA and TensorFlow GPU.
  • Created input images (Black, BSOD, Green, Green Patch, and Flicker) using a Python script.
  • Performed pre-processing and feature engineering on images.
  • Built a CNN model from scratch in Python.
  • Generated data for real-time anomalies.
  • Tuned model parameters and evaluated image parameters.
  • Performed image and video analysis using a CNN.
