ASHISH P.

Sr. AWS AI/ML Solution Architect

  • Overall Experience: 9 Years  


Time zones ready to work

  • Pacific Daylight [UTC -7]
  • Eastern European [UTC +2]
  • Eastern European Summer [UTC +3]
  • Greenwich Mean [UTC ±0]
  • Further Eastern European [UTC +3]
  • Eastern Daylight [UTC -4]
  • Central Daylight [UTC -5]
  • Mountain Daylight [UTC -6]
  • New Delhi [UTC +5:30]
  • Dubai [UTC +4]

Willing to travel to client location: Yes  

About Me 

Result-driven AI/ML leader with 8+ years of specialized experience in AI/ML and over 12 years in delivering enterprise-grade, scalable solutions. Expertise spans AI Agents, Generative AI, predictive modeling, MLOps, data preprocessing, feature engineering, machine learning, deep learning, computer vision, natural language processing, audio processing, satellite image analysis, quantum computing, and enterprise LLM services. Proven ability to integrate cutting-edge technologies to solve complex business challenges and drive innovation.


Portfolios

Time Series Forecasting (generic solution supporting all kinds of time-related data)

Role:

Developed a time series web service in Python. The service is divided into three modules: data analysis, ARIMA model preparation, and forecast prediction. The first module decomposes the data into trend, seasonal, and residual components; the second automatically generates an ARIMA model based on the nature of the data; and the third predicts forecast values using the ARIMA model built in the second phase.

Model: ARIMA
Technique: Time series forecasting
Packages: Flask, ARIMA, pandas, NumPy, pickle
Language: Python

The time series solution is exposed purely as a web service (API) developed in Python using the Flask package.
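
To make the three-module design concrete, here is a minimal sketch of such a Flask service, assuming statsmodels for decomposition and ARIMA fitting; the endpoint names, payload fields, and the fixed (1, 1, 1) order are illustrative assumptions, not the original implementation.

```python
# Minimal sketch: decomposition endpoint (module 1) and ARIMA forecast endpoint (modules 2-3).
import pandas as pd
from flask import Flask, request, jsonify
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.arima.model import ARIMA

app = Flask(__name__)

@app.route("/analyse", methods=["POST"])
def analyse():
    # Module 1: split the submitted series into trend, seasonal, and residual components.
    series = pd.Series(request.json["values"])
    parts = seasonal_decompose(series, model="additive", period=request.json.get("period", 12))
    return jsonify({
        "trend": parts.trend.dropna().tolist(),
        "seasonal": parts.seasonal.tolist(),
        "residual": parts.resid.dropna().tolist(),
    })

@app.route("/forecast", methods=["POST"])
def forecast():
    # Modules 2 and 3: fit an ARIMA model on the data and return the forecast.
    series = pd.Series(request.json["values"])
    model = ARIMA(series, order=(1, 1, 1)).fit()  # the order could instead be chosen automatically
    steps = request.json.get("steps", 10)
    return jsonify({"forecast": model.forecast(steps=steps).tolist()})

if __name__ == "__main__":
    app.run(port=5000)
```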


Skills: Python, TensorFlow, Keras, Flask, Machine Learning, scikit-learn, Pandas, NumPy

Tools: Anaconda, VSCode, Sublime Text, GitHub

Audio/Sound Classification with an Environmental Sounds Dataset

Role:

The ESC-50 dataset is a labeled collection of 2,000 environmental audio recordings suitable for benchmarking methods of environmental sound classification. The dataset consists of 5-second recordings organized into 50 semantic classes (40 examples per class) loosely arranged into 5 major categories. I trained a convolutional neural network for sound classification and achieved a classification accuracy of approximately 83%. MFCC (mel-frequency cepstral coefficient) features were used to train the models; other features such as the short-time Fourier transform, chroma, and mel spectrogram can also be extracted. The main challenge of this 50-class dataset is prediction accuracy: the 83% accuracy was reached by mapping weights from an earlier, more mature version of the model, and roughly 80% of predictions were correct.
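
A rough sketch of this MFCC-plus-CNN approach is shown below, assuming librosa for feature extraction and Keras for the network; the layer sizes, sample rate, and frame count are illustrative choices rather than the exact configuration used.

```python
# Sketch of MFCC-based sound classification on 5-second clips (file paths are placeholders).
import numpy as np
import librosa
from tensorflow.keras import layers, models

def extract_mfcc(path, n_mfcc=40, max_frames=216):
    """Load a 5-second clip and return a fixed-size MFCC matrix."""
    signal, sr = librosa.load(path, sr=22050, duration=5.0)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=n_mfcc)
    # Pad / truncate along the time axis so every clip has the same shape.
    mfcc = np.pad(mfcc, ((0, 0), (0, max(0, max_frames - mfcc.shape[1]))))[:, :max_frames]
    return mfcc

def build_model(input_shape=(40, 216, 1), n_classes=50):
    """Small CNN over MFCC 'images' for the 50 ESC-50 categories."""
    model = models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    return model

# Usage (clip_paths and labels are hypothetical):
# X = np.stack([extract_mfcc(p) for p in clip_paths])[..., np.newaxis]
# build_model().fit(X, labels, epochs=30, validation_split=0.2)
```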


Skills: SciPy, Anaconda, PyAudio

Tools: Anaconda

Image Classification

Role:

The largest memory manufacturer wanted to classify defective and non-defective images with high accuracy and a low overkill rate.

In this project, the client provided 10K images containing defective and non-defective samples. The main objective was to classify the two types of images and store the results. The project required reading massive volumes of images from a network path defined in the configuration. We developed a system that runs every hour, reads the files from the network, and passes them to the trained model for prediction. The results are saved in two formats, a flat file and a database table, which can be used to generate reports.

The challenge was to identify defective images with an accuracy of nearly 99%. As a first step, we needed more images to train the deep learning model, which were generated using image augmentation techniques. In the second step, we read the images using OpenCV and applied transfer learning to develop the model.

As a result, our image classification model achieved satisfactory defective-image classification with a Type 2 error of zero.
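
The following is a hedged sketch of the transfer-learning step described above; the MobileNetV2 backbone, image size, and directory layout are assumptions made for illustration, not the client's actual setup.

```python
# Transfer-learning sketch: frozen ImageNet backbone plus a binary defective/non-defective head.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (224, 224)

base = tf.keras.applications.MobileNetV2(input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # reuse ImageNet features; train only the new classification head
model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),   # MobileNetV2 expects inputs in [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),      # defective vs. non-defective
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Augmented training data; the directory layout "images/train/<class>" is hypothetical.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "images/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
augment = tf.keras.Sequential([layers.RandomFlip("horizontal"), layers.RandomRotation(0.1)])
train_ds = train_ds.map(lambda x, y: (augment(x, training=True), y))

model.fit(train_ds, epochs=5)
```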

Responsibilities:

• Collected requirements from the business team.
• Increased training data using image augmentation techniques.
• Used transfer learning to obtain good accuracy.
• Implemented the deep learning image classification model.
• Validated the model on thousands of images.
• Deployed the model on GPU.
• Weekly Scrum meetings.


Skills:

Tools: Anaconda

News Analytics to Predict Stock Price Performance (Financial Assets)

Role:

The ubiquity of information today empowers investors at any scale to make better investment decisions. The challenge in this project was ingesting and interpreting the data to figure out which information is helpful, finding the signal in this ocean of data. By analyzing news data to anticipate (predict) stock prices, the project explores how far the predictive power of news can be harnessed to help forecast financial outcomes and produce significant monetary impact around the world. Because the undertaking depends on the effect of news data, we first analyzed sentiment using TF-IDF and other NLP techniques, generated features from the news, and checked their effect on stock prices.
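
As an illustration of this pipeline, the sketch below builds TF-IDF features from news text and trains a LightGBM classifier to predict price direction; the label semantics and hyperparameters are assumptions, not the project's exact configuration.

```python
# Illustrative sketch: TF-IDF features from news headlines feeding a LightGBM classifier
# that predicts whether the related stock moves up (labels are hypothetical).
import lightgbm as lgb
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split

def train_news_model(headlines, went_up):
    """headlines: list of news strings; went_up: 0/1 labels for next-day price direction."""
    vectorizer = TfidfVectorizer(max_features=5000, stop_words="english", ngram_range=(1, 2))
    X = vectorizer.fit_transform(headlines)
    X_train, X_val, y_train, y_val = train_test_split(X, went_up, test_size=0.2, random_state=42)
    model = lgb.LGBMClassifier(n_estimators=500, learning_rate=0.05, num_leaves=63)
    model.fit(X_train, y_train, eval_set=[(X_val, y_val)])  # validation set tracks generalization
    return vectorizer, model
```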

Responsibilities:

• Requirement gathering from the business team.
• Gathered market and news data.
• Explored the data to understand sentiment.
• Generated features from the news data using TF-IDF.
• Designed and implemented different parameters for the ML model (Microsoft's LightGBM).
• Model validation/deployment.


Skills: Gensim

Tools: Anaconda, Google Cloud, AWS

Predictive Maintenance

Role:

In this project, one of the leading automotive manufacturers wanted to detect when their robots need maintenance before a shutdown. To address this, we built an LSTM network to predict the remaining useful life (or time to failure) of the robots. The network uses simulated robot sensor values to predict when a robot will fail so that maintenance can be planned in advance.

Time series analysis: How many more cycles will an in-service robot last before it fails?
Binary classification: Will this robot fail within a given number of cycles?
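
A minimal sketch of the binary-classification variant is shown below, assuming Keras; the window length, sensor count, and layer sizes are placeholders rather than the values used with the simulated robot data.

```python
# Sketch: will this robot fail within the labeling horizon? Windowed sensor history -> LSTM.
import numpy as np
from tensorflow.keras import layers, models

SEQ_LEN, N_SENSORS = 50, 25  # 50 past cycles of 25 simulated sensor readings (assumed sizes)

def make_windows(sensor_values, labels):
    """Slice per-cycle sensor readings into overlapping sequences for the LSTM."""
    X, y = [], []
    for i in range(len(sensor_values) - SEQ_LEN):
        X.append(sensor_values[i:i + SEQ_LEN])
        y.append(labels[i + SEQ_LEN])  # 1 if failure occurs within the horizon after this window
    return np.array(X), np.array(y)

model = models.Sequential([
    layers.LSTM(64, return_sequences=True, input_shape=(SEQ_LEN, N_SENSORS)),
    layers.Dropout(0.2),
    layers.LSTM(32),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(*make_windows(sensor_values, labels), epochs=20, validation_split=0.1)
```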

Responsibilities:
• Requirement gathering from the Business Team.
• Gathering data.
• Used a sequence-to-sequence modeling approach.
• Trained the LSTM model.
• Model validation/deployment.


Skills: Azure ML

Tools: Anaconda, Visual Studio Code


Employment

Data Scientist

2018/01 -

Skills: Power BI, Tableau

Your Role and Responsibilities:

• Data mining using state-of-the-art methods.
• Hands-on experience in selecting features and building and optimizing classifiers using machine learning techniques (a sketch follows the project list below).
• Enhancing data collection procedures to include information relevant for building analytic systems.
• Processing, cleansing, and verifying the integrity of data used for analysis.
• Doing ad-hoc analysis and clearly presenting results.
• Creating anomaly detection systems and constantly tracking their performance.
• Strong knowledge of statistics, feature engineering, model selection, model evaluation, and feature scaling to build accurate machine learning models.
• Experience in tuning and optimizing ML/DL models for performance improvement.
• Hands-on experience working on live projects with different machine learning techniques.
• Strong problem-solving capabilities.
• Working with the most in-demand industry frameworks, such as pandas, NumPy, Matplotlib, NLTK (NLP), Keras and TensorFlow (deep learning), scikit-learn (machine learning), and OpenCV.

Projects: Image Classification, Facial Expression Recognition, Recommendation System, Predictive Modeling, Regression Analysis
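
As a generic illustration of the feature-selection and classifier-tuning work listed above, the sketch below uses a scikit-learn pipeline with grid search; the dataset and parameter grid are placeholders, not values from these projects.

```python
# Illustrative feature-selection + classifier-tuning pipeline on a public toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

pipeline = Pipeline([
    ("scale", StandardScaler()),          # feature scaling
    ("select", SelectKBest(f_classif)),   # feature selection
    ("clf", RandomForestClassifier(random_state=42)),
])

grid = GridSearchCV(
    pipeline,
    param_grid={"select__k": [10, 20, "all"], "clf__n_estimators": [100, 300], "clf__max_depth": [None, 10]},
    cv=5,
    scoring="f1",
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```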


Jr. Data Scientist

2016/12 - 2017/12

Skills: NLTK (The Natural Language Toolkit), Python, Natural Language Processing, Machine Learning, Deep Learning, Keras, PyTorch, TensorFlow, spaCy

Your Role and Responsibilities:

● Researched raw data and disseminated easily understood insights to provide organizational direction.
● Produced a company guidelines and best-practices document based on consumer insight analysis.
● Performed regular web scraping to synthesize data.

Project: Image Classification, Seasonality Trends Analysis, Recommendation System


Sr. Data Scientist AI Consultant

2020/04 -

Skills:

Your Role and Responsibilities:

• Interact with customers, develop ML/AI use cases, and integrate them into solutions for the existing business.
• Select features and build and optimize classifiers using machine learning and deep learning techniques.
• Data mining using state-of-the-art methods and handling large-scale data with ETL processing tools such as PySpark and Databricks (a PySpark sketch follows below).
• Extend the company's data with third-party sources of information when needed.
• Enhance data collection procedures to include information relevant for building analytic systems.
• Process, cleanse, and verify the integrity of data used for analysis.
• Perform ad-hoc analysis and present results in a transparent manner.
• Create automated anomaly detection systems and constantly track their performance.
• Research and devise innovative statistical models for data analysis.
• Work as the lead data strategist, identifying and integrating new datasets that can be leveraged through our product capabilities, and work closely with the engineering team to strategize and execute the development of data products.
• Collaborate with product design and engineering to develop an understanding of needs and quality.
• Analyze data for trends and patterns and interpret data with a clear objective in mind.
• Provide deep learning and machine learning-based solutions to solve various real-world problems.
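
A hedged sketch of the kind of PySpark ETL step mentioned above; the paths, column names, and cleansing rules are purely illustrative assumptions.

```python
# PySpark ETL sketch: read raw events, cleanse them, and write a curated, partitioned table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")   # hypothetical source path
clean = (raw
         .dropDuplicates(["event_id"])                     # remove duplicate events
         .filter(F.col("amount").isNotNull())              # drop rows with missing values
         .withColumn("event_date", F.to_date("event_ts"))) # derive a partition column
clean.write.mode("overwrite").partitionBy("event_date").parquet("s3://example-bucket/curated/events/")
```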

Sr. Data Scientist

2019/09 - 2020/02

Skills:

Your Role and Responsibilities:

• Research and devise innovative statistical models for data analysis.
• Work as the lead data strategist, identifying and integrating new datasets that can be leveraged through our product capabilities, and work closely with the engineering team to strategize and execute the development of data products.
• Collaborate with product design and engineering to develop an understanding of needs and quality.
• Formulate, suggest, and manage data-driven projects geared at furthering the business's interests.
• Analyze data for trends and patterns and interpret data with a clear objective in mind.
• Cross-validate models to ensure their generalizability.
• Keep current with technical and industry developments in the dairy sector.
• Execute analytical experiments methodically to help solve various problems and make a true impact across the dairy domain.
• Monitor the performance of junior data scientists and provide them with practical guidance, as needed.
• Select and employ advanced deep learning and machine learning-based solutions to solve various real-world problems.

Jr. Data Scientist

2017/01 - 2018/01

Skills:

Your Role and Responsibilities:

• Requirement gathering from the business team.
• Increased training data using image augmentation techniques.
• Used transfer learning to achieve reasonable accuracy.
• Implemented a deep learning image classification model.
• Validated the model on thousands of images.


Education

2013 - 2020


2007 - 2011


Skills

Agile Software Development, Algorithm Development, Apache Solr, Apache Spark, API Development

Tools

Anaconda, NumPy, VSCode, Sublime Text, PyCharm

Hobbies

  • Exploring new places
  • Learning new experiences
  • Reading self-help books
  • Exercise

Achievements

  • Kaggle Kernel Master
  • Three-time Kaggle award recipient

Certification

  • Completed more than 35 Coursera certifications; visit my LinkedIn for full details: https://www.linkedin.com/in/ashishpatel2604/
  • IBM certified

Preferred Languages

English - Conversational, Hindi - Fluent