About Me
Sadegh Astaneh's value proposition as a PhD combines four skills: IT, J2EE development, data engineering, and ML pipelines (MLOps), which he uses to run change initiatives in strict alignment with the business, of which he has an in-depth understanding.
Portfolio Projects
Description
With funding from the European Commission's H2020 programme, 19 organisations from 11 EU Member States, including SMEs, NGOs, and industrial, research and academic entities, together with six public law-enforcement agencies in the domains of justice, police and interior security, started a three-year project on May 1, 2021 to support the fight against radicalisation and thus prevent future terrorist attacks.
Under this project, an open platform for analysis and early alert, the CounteR solution, will be launched, collecting and analysing data from dispersed sources in order to predict which communities are at critical risk of radicalisation and violent extremism and to help law enforcement detect radicalisation processes more easily. The system aims to support the fight against organised (cyber) crime and terrorism threats and to foster information-sharing and collaboration between the diverse European agencies in charge of countering radical propaganda, fundraising, recruitment, and the planning of terrorist acts. The CounteR project will develop a tool for quickly and accurately taking down terrorist content online while preserving the privacy and anonymisation of the data involved.
The system will incorporate state-of-the-art NLP technologies, combined with expert knowledge of the psychology of radicalisation processes, to provide a complete solution for law enforcement authorities to understand the when, where and why of radicalisation in the community.
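As a purely illustrative sketch (the CounteR models themselves are not described here), the core NLP step can be pictured as a text classifier that scores content for analyst review. The pipeline below, its toy corpus and its TF-IDF/logistic-regression choice are assumptions, not the project's actual stack.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder corpus and labels; 1 = flag for analyst review.
texts = ["example post one", "another example post",
         "yet another post", "a fourth post"]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a linear classifier: a minimal scoring pipeline.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

# Probability that a new post should be routed to an analyst.
print(clf.predict_proba(["new post to screen"])[:, 1])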
Description
Smart Growth: a system of solutions to improve watering and plant nutrition in professional crops. The objectives are to:
1. Reduce water and fertilizer consumption
2. Save pumping energy and treatments
3. Enhance crop production (quantity and quality).
The main goal is to improve and automate the current processes for physical and chemical soil-response analysis with new IT technologies such as Big Data processing and Machine Learning. This goal is pursued through the following sub-goals:
1. Automation of the calculation of main watering variables
2. Automatic follow up of the soil chemical behaviour, nutrient availability and nutrient leaching
3. Automatic extrapolation of watering and fertilization variables from a point to an area (one possible interpolation approach is sketched after this list)
4. Automatic watering and fertilization proposals to the farmers
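A hedged sketch of sub-goal 3: spreading a variable measured at a few sensor points over a whole plot. Inverse-distance weighting is just one simple choice; the project's actual spatial method is not specified in this description, and the coordinates and moisture values below are made up.

import numpy as np

def idw(points, values, query, p=2.0):
    """Inverse-distance-weighted estimate of a value at `query`."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d == 0):  # query coincides with a sensor
        return float(values[np.argmin(d)])
    w = 1.0 / d**p
    return float(np.sum(w * values) / np.sum(w))

# Made-up sensor coordinates (metres) and soil-moisture readings.
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
moisture = np.array([0.21, 0.33, 0.27])
print(idw(sensors, moisture, np.array([4.0, 4.0])))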
Responsibilities:
● Planned, implemented and monitored the AWS cloud infrastructure, built and configured the production systems, and managed updates of the server-based technologies.
● Implemented a generic, highly available ETL framework for bringing related data from various sources into Hadoop & InfluxDB using Spark (see the sketch at the end of this project description).
● Queried and analyzed data from InfluxDB for quick searching, sorting and grouping
● Implemented various Data Modeling techniques
● Participated in various upgrade and troubleshooting activities across the enterprise.
● Performed performance troubleshooting and tuning of Hadoop clusters.
● Applied advanced Spark techniques such as text analytics using in-memory processing.
● Implemented a process to join data from SQL and NoSQL databases and store it in Hadoop.
● Created an architecture stack blueprint for data access with NoSQL.
● Brought data from various sources into Hadoop using Kafka.
● Applied Spark Streaming for real-time data transformation.
● Created multiple dashboards for multiple business needs.
● Implemented a Composite server for data virtualization needs and created multiple views for restricted data access using a REST API.
● Devised and led the implementation of a next-generation architecture for more efficient data ingestion and processing.
● Created and implemented various shell scripts for automating the jobs.
● Worked with enterprise data support teams to install Hadoop updates, patches and version upgrades as required, and fixed problems that arose after the upgrades.
● Implemented test scripts to support test-driven development and continuous integration.
● Used Spark for parallel data processing and better performance.
Business case: a solution to improve watering and plant nutrition in professional crops, to reduce water and fertilizer consumption, save pumping energy and treatments, and enhance crop production (quantity and quality).
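The following is a minimal sketch of the ingestion path named in the list above (Kafka into Hadoop, with Spark Streaming doing the transformation). The broker address, topic name, schema and HDFS paths are placeholders, not the project's actual configuration.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("sensor-etl").getOrCreate()

# Toy schema for the JSON payload carried in each Kafka message.
schema = StructType([
    StructField("sensor_id", StringType()),
    StructField("value", DoubleType()),
])

# Read the stream from Kafka (broker and topic are placeholders).
raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "sensor-readings")
       .load())

# Kafka delivers the payload as bytes: cast, parse, flatten.
parsed = (raw.select(from_json(col("value").cast("string"), schema).alias("r"))
          .select("r.*"))

# Land the transformed records in Hadoop as Parquet.
query = (parsed.writeStream.format("parquet")
         .option("path", "hdfs:///data/sensors")
         .option("checkpointLocation", "hdfs:///checkpoints/sensors")
         .start())
query.awaitTermination()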
Description
My position focused on research and development of a parser:
● The THREAT-ARREST Emulation Compiler API exploits the concepts developed in the TOREADOR project to provide an API for parsing and transforming the XML description of the emulated environment into a specific HEAT template, ready to be deployed on the OpenStack target platform.
● Generated the YAML HEAT template from the received XML using the JAXB framework, mapping the information in the XML onto the OpenStack HEAT template (a sketch of this mapping follows the list).
● The Emulation Compiler Tools are composed of function blocks that manage the logical/virtual resources deployed on physical resources, a function block that mediates between the OpenStack function blocks, and a function block that orchestrates a set of virtual resources.
● The parser provides services to support the creation and management of virtual machine instances on the OpenStack platform. It takes the XML file as input and executes a series of steps to create the HEAT template, HEAT being the OpenStack component responsible for orchestration: a template-driven technology for building virtual-resource environments on OpenStack. The output of the parser is a deployable template, from which stacks, i.e. sets of virtual resources, are built.
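A minimal sketch of the XML-to-HEAT idea, under stated assumptions: the original compiler used JAXB in Java, whereas this illustration uses Python's xml.etree with PyYAML, and the <vm> element with name/image/flavor attributes is invented for the example. OS::Nova::Server is a real HEAT resource type.

import xml.etree.ElementTree as ET
import yaml

def xml_to_heat(xml_path):
    # Read the emulated-environment description (toy schema).
    root = ET.parse(xml_path).getroot()
    resources = {}
    for vm in root.iter("vm"):  # hypothetical element name
        resources[vm.get("name")] = {
            "type": "OS::Nova::Server",
            "properties": {"image": vm.get("image"), "flavor": vm.get("flavor")},
        }
    # Assemble the HEAT template and serialize it as YAML.
    template = {
        "heat_template_version": "2018-08-31",
        "description": "Emulated environment generated from XML",
        "resources": resources,
    }
    return yaml.safe_dump(template, sort_keys=False)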
Description
My position focused on research and development of a parser:
● Created a parser for transforming the ontologies of the Big Data services for the TOREADOR project, allowing the creation, manipulation and serialization of OWL ontologies.
● OWL-S to WORKFLOW: given the OWL-S of a service with the TRANSFORME parameter and the type of transform, automatically parses the RDF of the service via SAX (the Simple API for XML) and generates a new XML file describing the workflow for a target platform such as Oozie (a SAX sketch follows the list).
● WSDL2OWL-S Converter is a web-based tool that provides a partial conversion from WSDL Web service descriptions to OWL-S descriptions. The tool provides a complete specification of the Grounding and of the atomic processes of the OWL-S Process Model; it also provides a partial specification of the OWL-S Profile. After the transformation, the only work that remains is specifying the complex processes in the Process Model, providing the XSLT transformation between the data types used by WSDL and the OWL ontologies used by the OWL-S description, and completing the description of the OWL-S Profile.
● OWL-S to Spring Dataflow: given the OWL-S of a service with the TRANSFORME parameter and the type of transform, automatically parses the RDF of the service via SAX and generates a new DSL file describing the workflow for a target platform such as Spring Dataflow.
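A minimal sketch of the SAX pass shared by the transformations above, assuming a simplified OWL-S input: the handler streams the RDF/XML and collects atomic-process names from which a workflow file (e.g. an Oozie workflow.xml) could later be generated. The element and attribute matching below is a simplification, not a full OWL-S grammar.

import xml.sax

class OwlsHandler(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.processes = []

    def startElement(self, name, attrs):
        # OWL-S atomic processes; matched by qualified name for brevity.
        if name.endswith("AtomicProcess"):
            self.processes.append(attrs.get("rdf:ID", "unnamed"))

def parse_owls(path):
    handler = OwlsHandler()
    xml.sax.parse(path, handler)
    return handler.processes  # later serialized as the target workflow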
Description
Implementation and evolution of the ICT solutions architecture for the Risk Management Competence Line for the centralized calculation and the monitoring of key indicators for all risk categories.
My position focused on the development of new data products. This includes process automation, delivery of data-driven recommendations, and tool prototyping and development in R and Python.
● Managing application solutions implemented on the “Big Data” technological framework;
● Managing design, development and evolution of methods and algorithms of advanced analytics, data science, machine learning, cognitive computing;
● Design and build complex Big Data architecture (Hadoop, Hive, Spark, Kafka, AKKA, etc.)
● Managing interactions between external and internal stakeholders, defining IT activities plan;
● Fostering innovative methodologies, technologies and approaches to the processing, integration and analysis of services and applications based on data and analytics;
● Supporting the scouting of new-generation products and platforms with the goal of innovating and optimizing services.
● Designed a new machine learning pipeline, replacing the existing one, for portfolio risk prediction, automated classification of market and trading activity, and anomaly detection.
● Developed a machine learning test-bed with 8 different model-learning and feature-learning algorithms (a minimal version is sketched after this list).
● Designing, implementing and supporting fully automated Continuous Integration and Continuous Delivery processes
● Working with and supporting multiple worldwide development teams delivering a wide range of software applications
● Introducing and implementing Continuous Integration principles and practices for the Billing Development Team using Jenkins, Subversion, JUnit, Atlassian JIRA
● Developing automated process for builds and deployments; Jenkins, Ant, Maven, Sonatype Nexus, Shell Script, Java
● Automating the installation, deployment and maintenance of Middleware Application Servers Environment: Jenkins, Subversion, Continuous Integration, Agile, Maven, Nexus, Ant, Java, Linux
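A minimal sketch of such a test-bed, assuming synthetic data and a reduced model list (the production risk models and their 8 algorithms are not reproduced here): several candidate learners are evaluated side by side under the same cross-validation protocol.

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the real portfolio data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Reduced model list; the original test-bed compared 8 algorithms.
models = {
    "logreg": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gbm": GradientBoostingClassifier(random_state=0),
}

# Same cross-validation protocol for every candidate.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: AUC = {scores.mean():.3f} +/- {scores.std():.3f}")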
Description
Data Architect / Big Data Engineer and manager of the applied research project OPLON (Smart Cities and Communities: OPportunities for active and healthy LONgevity).
The project develops monitoring systems and integrated models to prevent and manage frailty and to promote active aging and health in pre-frail and frail older adults.
In this role I've designed and implemented:
● A predictive instrument for the risk of functional decline in the elderly, extending health-clinical-administrative data on a wider scale, integrated with other social data, as a proxy for the functional tests used to calculate the frailty indicators.
● The frailty calculator is based on a logistic regression model, trained on two years of historical data. In particular, the model was built from 11 databases and exploits 26 socio-clinical variables to return the expected frailty risk (a minimal sketch follows this list).
● Specific high-performance computing technologies such as AKKA, Spark and Hadoop, Service Oriented Architecture technologies, and data analysis techniques typical of data warehousing and data mining, based on the exchange of data in an open and safe format.
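A minimal sketch of the frailty calculator's core, assuming synthetic placeholders for the 26 socio-clinical variables and the 11 source databases: a logistic regression that returns the expected frailty risk per subject.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for the 26 socio-clinical variables.
X = rng.normal(size=(5000, 26))
# Synthetic outcome loosely driven by the first three variables.
y = (X[:, :3].sum(axis=1) + rng.normal(size=5000) > 0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Expected frailty risk for the first five subjects.
risk = model.predict_proba(X[:5])[:, 1]
print(np.round(risk, 3))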
Description
Data engineer and architect of the CReG (Chronic Related Group) project of the Lombardy region, aimed at facilitating the integration of the present applications and allowing new ones to be added to the platform in the future. Tele-monitoring management is one subsystem of the CReG platform. The functional processes are implemented through a number of integrations:
● Research on machine learning for structured and unstructured data, with an emphasis on developing statistical topic models for the diagnosis and management of chronic patients.
● Developed machine learning methods for analyzing the structure, content, and dynamics of chronic-patient data.
● Monitoring and follow-up of chronic-failure patients, the main data categories being demographic, clinical, examination, exercise-test, drug, procedure and imaging data.
● Gathered data for intelligent data analysis and modeling with decision-tree methods.
● Applied model validation techniques to assess how the results of a statistical analysis generalize to an independent data set (a minimal sketch follows this list).
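A minimal sketch of the decision-tree modeling and validation steps above, assuming a synthetic placeholder for the patient data: k-fold cross-validation estimates how the fitted tree would generalize to an independent data set.

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the chronic-patient data.
X, y = make_classification(n_samples=1000, n_features=12, random_state=0)

tree = DecisionTreeClassifier(max_depth=4, random_state=0)

# 5-fold cross-validation as the model validation technique.
scores = cross_val_score(tree, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")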