About Me
Contribute to the growth of the organization by applying best business practices through innovative solutions, while utilizing my technical expertise and analytical abilities and continually updating my skill set to achieve the goals of the organization.
Skills
Positions
Portfolio Projects
Description
Deployed a healthcare AI application on Google Cloud Platform: Bigtable as the database, Pub/Sub for messaging, a Google Cloud Storage bucket, Google App Engine, FastAPI for the HTTP REST APIs, and authentication via Google Identity-Aware Proxy. The application handles parallel requests from multiple devices and delivers HTTP responses with low latency. It predicts cardiac arrest within the next 3 hours, fetching data from Bigtable and alerting doctors in the app when patients become critical.
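The alerting step described above can be sketched as simple threshold logic; the `Vitals` fields, score weights, and cutoffs below are illustrative assumptions, not the production model:

```python
from dataclasses import dataclass

@dataclass
class Vitals:
    heart_rate: float  # beats per minute
    spo2: float        # oxygen saturation, percent
    resp_rate: float   # breaths per minute

def arrest_risk_score(v: Vitals) -> float:
    """Toy risk score in [0, 1]; the weights and cutoffs are illustrative."""
    score = 0.0
    if v.heart_rate > 120 or v.heart_rate < 40:
        score += 0.4
    if v.spo2 < 90:
        score += 0.4
    if v.resp_rate > 30:
        score += 0.2
    return min(score, 1.0)

def should_alert(v: Vitals, threshold: float = 0.5) -> bool:
    """Raise an in-app alert for the doctor when the score crosses the threshold."""
    return arrest_risk_score(v) >= threshold

print(should_alert(Vitals(heart_rate=135, spo2=85, resp_rate=32)))  # True
```

In the deployed system this decision would sit behind the FastAPI endpoint, with readings fetched from Bigtable rather than passed in directly.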
Description
Reviewed multiple research papers on ICO success criteria. Scraped several sites for data collection, pipelining, and cleansing, and merged the results into an unbiased dataset. Worked through EDA challenges and performed feature engineering and selection for modeling. Created one classification model (RandomForest and XGBoost) and two regression models (multi-target / RandomForest) to predict:
Is Trading: whether the token will hit any exchange after the ICO
Amount Raised: the total amount raised during the ICO
ROI: market-corrected return on investment 4 months after exchange listing
Achieved 72
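The classification step above (e.g. predicting Is Trading) can be sketched with scikit-learn's RandomForestClassifier; the two-feature synthetic dataset below is a stand-in for the scraped ICO data, not the real features:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for the scraped ICO features and the binary "Is Trading" label.
X = rng.normal(size=(500, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

The same `fit`/`predict` pattern applies to the regression targets by swapping in `RandomForestRegressor` or an XGBoost estimator.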
Description
Electricity demand for commercial spaces will grow to 390 TWh by 2040, and 70% of this demand will be covered by renewable solar PV energy. This sector will experience one of the biggest energy transitions, an opportunity for a more modern architecture for the grid of the future.
A few deliverables include:
Predictive analytics for designing solar or clean-energy solutions for clients
Need Energy uses its smart energy monitors for electricity, supporting business use cases and decision-making for C&I clients transitioning to renewable energy
If possible, this will also provide insight for energy suppliers
Detecting anomalies in consumption data for a given region/state/country
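The anomaly-detection deliverable above can be sketched as a rolling z-score check over hourly consumption readings; the window size, threshold, and sample series are illustrative assumptions:

```python
from statistics import mean, stdev

def rolling_zscore_anomalies(readings, window=24, threshold=3.0):
    """Flag indices whose reading deviates from the trailing window's mean
    by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# 48 hourly readings with mild cyclic variation and one spike at hour 30.
series = [10.0 + 0.1 * (i % 5) for i in range(48)]
series[30] = 100.0
print(rolling_zscore_anomalies(series))  # [30]
```

A trailing window keeps the baseline local, so the check adapts to gradual seasonal drift while still catching sudden spikes.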
Description
Web-scraped data from Twitter, UNICEF, and Instagram (Python) for sentiment analysis of thoughts and aspirations (NLP). Performed EDA and sentiment analysis. Visualization: Plotly (basemaps), word clouds, violin plots, scatter plots, and dynamic heatmaps to understand thoughts and sentiments; Tableau visualizations over geographic data. Built an LDA ML model to predict the important topics in the data. Built LSTM and BERT deep learning models to classify labels for the text data. Worked on models such as RandomForest, XGBoost, and Logistic Regression for classification and regression case studies. Automated models using AWS SageMaker.
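A minimal sketch of lexicon-based sentiment scoring, the simplest form of the analysis above; the word lists here are illustrative placeholders, not the lexicon actually used:

```python
# Tiny illustrative sentiment lexicons (placeholders, not a real lexicon).
POSITIVE = {"hope", "dream", "love", "happy", "opportunity"}
NEGATIVE = {"fear", "worry", "sad", "angry", "crisis"}

def sentiment(text: str) -> str:
    """Label text by counting positive vs. negative lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I hope and dream of a better future"))  # positive
```

Model-based approaches like the LSTM/BERT classifiers mentioned above replace the fixed lexicon with learned representations, but the input/output shape is the same: text in, label out.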
Description
• Working on data pipelines in MySQL; design and implement data mining techniques on large data sets; delivery of patches before release.
• Implemented and followed Agile development methodology within the cross-functional team and acted as a liaison between the business user group and the technical team.
• Performed data cleaning, feature scaling, and feature engineering using pandas, scikit-learn, Seaborn, TensorFlow, and ggplot2 in Python and R.
• Created drill-down, drill-through, and sub-reports using SSRS, and managed subscription reports per client requirements.
• Work with development IDEs such as Jupyter Notebook and Spyder.
• Design and develop analytics, machine learning models, and visualizations that drive performance and provide insights, from prototyping to production deployment, product recommendation, and allocation planning.
• Performed data validation and cleansing of staged input records before loading into the data warehouse.
• Identifying and executing process improvements; hands-on in various technologies such as Oracle.
• Implemented classification using supervised algorithms such as Logistic Regression and Decision Trees; used open-source tools (Spyder, Python) for statistical analysis and building machine learning algorithms.
• Performed data collection, data cleaning, feature engineering (Deep Feature Synthesis), validation, visualization, and reporting of findings, and developed strategic uses of data.
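One of the preprocessing steps mentioned above, feature scaling, can be sketched without any libraries as min-max scaling (scikit-learn's `MinMaxScaler` does the same thing column-wise):

```python
def min_max_scale(column):
    """Scale a numeric column to [0, 1]; constant columns map to all zeros."""
    lo, hi = min(column), max(column)
    if hi == lo:
        return [0.0] * len(column)
    return [(x - lo) / (hi - lo) for x in column]

print(min_max_scale([10, 20, 30]))  # [0.0, 0.5, 1.0]
```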
Description
Used Python to scrape, clean, and analyze large datasets. Transformed raw data into MySQL with a custom-made ETL application to prepare unruly data for machine learning. Design and implement data mining techniques on large data sets. Delivery of patches before release. Working on RCM workflows. Collaborating with Dev on testing checks in UAT environments. Working on customer file issues (proactive and reactive).
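The extract-transform-load flow above can be sketched as follows; `sqlite3` stands in for the MySQL target here, and the table name and sample records are illustrative:

```python
import sqlite3

# Extract: raw records as scraped (names with stray whitespace, ages as strings).
raw = [("  Alice ", "42"), ("Bob", "n/a"), ("Carol", "37")]

def transform(rows):
    """Clean each record; drop rows whose age cannot be parsed."""
    cleaned = []
    for name, age in rows:
        if not age.strip().isdigit():
            continue
        cleaned.append((name.strip(), int(age)))
    return cleaned

# Load: sqlite3 stands in for the MySQL target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO records VALUES (?, ?)", transform(raw))
print(conn.execute("SELECT COUNT(*) FROM records").fetchone()[0])  # 2
```

Keeping the transform step as a pure function makes it easy to unit-test the cleaning rules independently of the database.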