Browsing by Author "Heidari, Arash"
Now showing 1 - 20 of 37
Article | Citation - WoS: 32 | Citation - Scopus: 37
Implementation of a Product-Recommender System in an IoT-Based Smart Shopping Using Fuzzy Logic and Apriori Algorithm (IEEE-Inst Electrical Electronics Engineers Inc, 2022)
Yan, Shu-Rong; Pirooznia, Sina; Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet
The Internet of Things (IoT) has recently become important in accelerating various functions, from manufacturing and business to healthcare and retail. A recommender system can handle the problem of information and data buildup in IoT-based smart commerce systems. These technologies are designed to determine users' preferences and filter out irrelevant information. Identifying items and services that customers might be interested in, and then convincing them to buy, is one of the essential parts of effective IoT-based smart shopping systems. Given the relevance of product-recommender systems from both the consumer and shop perspectives, this article presents a new IoT-based smart product-recommender system based on the Apriori algorithm and fuzzy logic. The suggested technique employs association rules to display the interdependencies and linkages among many data objects. The most common use of association rule discovery is shopping cart analysis: customers' buying habits and behavior are studied based on the goods they place in their shopping carts. The association rules are generated using a fuzzy system, and the Apriori algorithm then selects the product based on the resulting fuzzy association rules. The results revealed that the suggested technique achieved acceptable results in terms of mean absolute error, root-mean-square error, precision, recall, diversity, novelty, and catalog coverage compared to cutting-edge methods. Finally, the method helps increase recommender systems' diversity in IoT-based smart shopping.
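The association-rule mining step this abstract describes can be illustrated with a minimal, self-contained Apriori sketch. The toy transactions and the min_support / min_confidence thresholds below are illustrative assumptions, not values from the paper, which additionally derives its rules through a fuzzy system:

```python
# Minimal Apriori sketch: mine frequent itemsets from shopping-cart
# transactions, then derive association rules X -> Y. Data and thresholds
# are toy assumptions for illustration only.
from itertools import combinations

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def apriori(transactions, min_support=0.5):
    """Return frequent itemsets mapped to their support values."""
    n = len(transactions)
    k_sets = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    while k_sets:
        # Count support of each candidate itemset.
        counts = {s: sum(1 for t in transactions if s <= t) for s in k_sets}
        survivors = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(survivors)
        # Join step: build (k+1)-item candidates from surviving k-itemsets.
        k_sets = {a | b for a in survivors for b in survivors
                  if len(a | b) == len(a) + 1}
    return frequent

def rules(frequent, min_confidence=0.6):
    """Yield rules X -> Y with confidence = support(X u Y) / support(X)."""
    for itemset, supp in frequent.items():
        for r in range(1, len(itemset)):
            for lhs in map(frozenset, combinations(itemset, r)):
                conf = supp / frequent[lhs]
                if conf >= min_confidence:
                    yield set(lhs), set(itemset - lhs), conf

freq = apriori(transactions)
for lhs, rhs, conf in rules(freq):
    print(f"{lhs} -> {rhs} (confidence {conf:.2f})")
```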
Article | Citation - WoS: 50 | Citation - Scopus: 56
A hybrid approach for latency and battery lifetime optimization in IoT devices through offloading and CNN learning (Elsevier, 2023)
Heidari, Arash; Navimipour, Nima Jafari; Jamali, Mohammad Ali Jabraeil; Akbarpour, Shahin
Offloading helps overcome the resource constraints of specific elements, making it one of the primary technical enablers of the Internet of Things (IoT). IoT devices with low battery capacities can use the edge to offload some of their operations, which can significantly reduce latency and lengthen battery lifetime. Deep Learning (DL) techniques are energy-intensive to run on IoT devices because of their restricted battery capacity, and many studies have relied on energy-harvester modules that real-world IoT devices lack. In this study, we describe the offloading problem using a Markov Decision Process (MDP). We then develop a Deep Reinforcement Learning (DRL) method that supports partial offloading in IoT devices and can efficiently learn the policy by adapting to network dynamics. A Convolutional Neural Network (CNN) is then implemented on Mobile Edge Computing (MEC) devices to expedite learning. These two techniques operate together to offer the proper offloading approach throughout the system's operation. Moreover, transfer learning was employed to initialize the Q-table values, which increased the system's effectiveness. The simulation in this article, which employed Cooja and TensorFlow, revealed that the strategy outperformed five benchmarks in terms of latency by 4.1%, IoT device efficiency by 2.9%, energy utilization by 3.6%, and job failure rate by 2.6% on average.
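As a rough illustration of the offloading decision formulated as an MDP above, here is a deliberately simplified tabular Q-learning loop. The paper itself uses a CNN-based DRL agent with transfer-learned Q-values; the states, actions, reward weights, and toy dynamics below are all illustrative assumptions:

```python
# Simplified tabular Q-learning stand-in for an offloading policy:
# learn, per network state, whether to run a task locally or offload it.
import random

STATES = range(5)          # e.g. a discretized congestion / channel level
ACTIONS = (0, 1)           # 0 = execute locally, 1 = offload to the edge
alpha, gamma, eps = 0.1, 0.9, 0.2

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def reward(state, action):
    # Toy reward: offloading is cheap in energy but its latency grows with
    # congestion; local execution has fixed latency but costs more energy.
    latency = 1.0 + (state if action == 1 else 2)
    energy = 0.5 if action == 1 else 1.5
    return -(0.6 * latency + 0.4 * energy)

def step(state, action):
    return random.choice(list(STATES))   # toy network dynamics

state = 0
for _ in range(5000):
    # Epsilon-greedy action selection.
    action = random.choice(ACTIONS) if random.random() < eps \
        else max(ACTIONS, key=lambda a: Q[(state, a)])
    r, nxt = reward(state, action), step(state, action)
    # Standard Q-learning update: Q <- Q + alpha * (TD target - Q).
    Q[(state, action)] += alpha * (
        r + gamma * max(Q[(nxt, a)] for a in ACTIONS) - Q[(state, action)])
    state = nxt

for s in STATES:
    print(s, "->", "offload" if Q[(s, 1)] > Q[(s, 0)] else "local")
```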
Article | Citation - WoS: 7 | Citation - Scopus: 7
Leveraging Explainable Artificial Intelligence for Transparent and Trustworthy Cancer Detection Systems (Elsevier, 2025)
Toumaj, Shiva; Heidari, Arash; Navimipour, Nima Jafari
Timely detection of cancer is essential for enhancing patient outcomes. Artificial Intelligence (AI), especially Deep Learning (DL), demonstrates significant potential in cancer diagnostics; however, its opaque nature raises notable concerns. Explainable AI (XAI) mitigates these issues by improving transparency and interpretability. This study provides a systematic review of recent applications of XAI in cancer detection, categorizing the techniques by cancer type, including breast, skin, lung, colorectal, brain, and others. It emphasizes interpretability methods, dataset utilization, simulation environments, and security considerations. The results indicate that Convolutional Neural Networks (CNNs) account for 31% of model usage, SHAP is the predominant interpretability framework at 44.4%, and Python is the leading programming language at 32.1%. Only 7.4% of studies address security issues. This study identifies significant challenges and gaps, guiding future research in trustworthy and interpretable AI within oncology.

Review | Citation - WoS: 92 | Citation - Scopus: 116
Adventures in Data Analysis: A Systematic Review of Deep Learning Techniques for Pattern Recognition in Cyber-Physical Systems (Springer, 2023)
Amiri, Zahra; Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet; Mousavi, Ali
Machine Learning (ML) and Deep Learning (DL) have achieved high success in many textual, auditory, medical-imaging, and visual recognition patterns. Given the importance of ML/DL in recognizing patterns with high accuracy, many researchers have proposed solutions for improving pattern-recognition performance using ML/DL methods. Given the intelligent pattern recognition that machines need for image processing and the outstanding role of big data in generating modern and classical approaches to pattern recognition, we conducted a thorough Systematic Literature Review (SLR) of DL approaches for big-data pattern recognition. We discuss different research issues and possible paths in which these techniques might help materialize the pattern-recognition notion. We classified 60 of the most cutting-edge articles addressing pattern-recognition issues into ten categories based on the DL/ML method used: Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Generative Adversarial Network (GAN), Autoencoder (AE), Ensemble Learning (EL), Reinforcement Learning (RL), Random Forest (RF), Multilayer Perceptron (MLP), Long Short-Term Memory (LSTM), and hybrid methods. The SLR method was used to investigate each one in terms of influential properties such as the main idea, advantages, disadvantages, strategies, simulation environment, datasets, and security issues. The results indicate that most of the articles were published in 2021. Moreover, important parameters such as accuracy, adaptability, fault tolerance, security, scalability, and flexibility were involved in these investigations.

Article | Citation - WoS: 186 | Citation - Scopus: 219
A Secure Intrusion Detection Platform Using Blockchain and Radial Basis Function Neural Networks for Internet of Drones (IEEE-Inst Electrical Electronics Engineers Inc, 2023)
Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet
The Internet of Drones (IoD) is built on the Internet of Things (IoT) by replacing Things with Drones while retaining incomparable features. Because of its vital applications, IoD technologies have attracted much attention in recent years. Nevertheless, gaining the necessary degree of public acceptance of IoD without demonstrating safety and security for human life is exceedingly difficult. In addition, intrusion detection systems (IDSs) in IoD confront several obstacles because of the dynamic network architecture, particularly in balancing detection accuracy and efficiency. To increase the performance of the IoD network, we propose a blockchain-based radial basis function neural network (RBFNN) model in this article. The proposed method can improve data integrity and storage for smart decision-making across different IoDs. We discuss the use of blockchain to create decentralized predictive analytics and a model for effectively applying and sharing deep learning (DL) methods in a decentralized fashion. We also assessed the model using a variety of datasets to demonstrate the viability and efficacy of implementing the blockchain-based DL technique in IoD contexts. The findings showed that the suggested model is an excellent option for developing classifiers while adhering to the constraints imposed by network intrusion detection. Furthermore, the proposed model can outperform cutting-edge methods in terms of specificity, F1-score, recall, precision, and accuracy.
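A radial basis function network of the kind this platform trains can be sketched in a few lines of NumPy: Gaussian hidden units around sampled centers and a least-squares output layer. The centers, kernel width, and toy flow data below are illustrative assumptions, not the paper's setup:

```python
# Minimal RBF network sketch: Gaussian features around fixed centers,
# linear output weights fitted by least squares on toy intrusion data.
import numpy as np

def rbf_features(X, centers, width=1.0):
    # phi_ij = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                 # toy network-flow features
y = (X[:, 0] + X[:, 1] > 0).astype(float)     # toy intrusion label

centers = X[rng.choice(len(X), 10, replace=False)]  # sample 10 centers
Phi = rbf_features(X, centers)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # fit the output layer

pred = (rbf_features(X, centers) @ w > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```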
Review | Citation - WoS: 49 | Citation - Scopus: 65
Resilient and Dependability Management in Distributed Environments: A Systematic and Comprehensive Literature Review (Springer, 2023)
Amiri, Zahra; Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet
With the galloping progress of the Internet of Things (IoT) and related technologies in multiple facets of science, distributed environments, namely cloud, edge, fog, Internet of Drones (IoD), and Internet of Vehicles (IoV), deserve special attention because they provide a resilient infrastructure in which users can rely on a secure connection among smart devices in the network. By considering particular parameters that overshadow resiliency in distributed environments, we found several gaps in the investigated review papers, which did not comprehensively touch on the significantly related topics that we cover. Based on resilient and dependable management approaches, we therefore put forward a beneficial evaluation in this regard. As a novel taxonomy of distributed environments, we presented a well-organized classification of distributed systems. At the terminal stage, we selected 37 papers in the research process. We classified them into seven divisions and separately investigated each one's main ideas, advantages, challenges, strategies, security issues, simulation environments, and datasets, to draw a cohesive, qualitative taxonomy of reliable methods in distributed computing environments. This well-performed comparison enables us to evaluate all papers comprehensively and analyze their advantages and drawbacks. The SLR indicated that security, latency, and fault tolerance are the most frequent parameters in the studied papers, showing that they play pivotal roles in the resiliency management of distributed environments. Most of the articles reviewed were published in 2020 and 2021. Besides, we proposed several future works based on existing deficiencies that can be considered for further studies.

Article | Citation - WoS: 84 | Citation - Scopus: 114
A new lung cancer detection method based on the chest CT images using Federated Learning and blockchain systems (Elsevier, 2023)
Heidari, Arash; Javaheri, Danial; Toumaj, Shiva; Navimipour, Nima Jafari; Rezaei, Mahsa; Unal, Mehmet
With an estimated five million fatal cases each year, lung cancer is one of the significant causes of death worldwide. Lung diseases can be diagnosed with a Computed Tomography (CT) scan. The scarcity and limited reliability of human readers is the fundamental issue in diagnosing lung cancer patients. The main goal of this study is to detect malignant lung nodules in a CT scan of the lungs and categorize lung cancer by severity. In this work, cutting-edge Deep Learning (DL) algorithms were used to detect the location of cancerous nodules. A real-life issue is sharing data with hospitals around the world while respecting the organizations' privacy concerns, and the main problems in training a global DL model are creating a collaborative model and maintaining privacy. This study presents an approach that takes a modest amount of data from multiple hospitals and uses blockchain-based Federated Learning (FL) to train a global DL model. The data were authenticated using blockchain technology, and FL trained the model internationally while maintaining each organization's anonymity. First, we presented a data-normalization approach that addresses the variability of data obtained from various institutions using different CT scanners. Furthermore, using a CapsNet method, we classified lung cancer patients in local mode. Finally, we devised a way to train a global model cooperatively using blockchain technology and FL while maintaining anonymity. We also gathered data from real-life lung cancer patients for testing purposes. The suggested method was trained and tested on the Cancer Imaging Archive (CIA) dataset, the Kaggle Data Science Bowl (KDSB), LUNA16, and the local dataset. Finally, we performed extensive experiments with Python and its well-known libraries, such as Scikit-Learn and TensorFlow, to evaluate the suggested method. The findings showed that the method effectively detects lung cancer patients, delivering 99.69% accuracy with the smallest possible categorization error.
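The federated training scheme described above can be reduced to a bare-bones FedAvg round, stripped of the blockchain layer and the CapsNet classifier. The linear model and the per-hospital toy data below are illustrative assumptions:

```python
# Bare-bones federated averaging (FedAvg) sketch: each hospital trains
# locally on private data; only model weights are aggregated globally.
import numpy as np

rng = np.random.default_rng(1)
hospitals = [(rng.normal(size=(50, 8)), rng.integers(0, 2, 50))
             for _ in range(3)]          # three sites with private data
w_global = np.zeros(8)

def local_update(w, X, y, lr=0.1, epochs=5):
    """A few steps of logistic-regression SGD on one site's data."""
    w = w.copy()
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)   # gradient of the logistic loss
    return w

for rnd in range(10):
    # Each site trains locally; raw data never leaves the hospital.
    local = [local_update(w_global, X, y) for X, y in hospitals]
    sizes = np.array([len(y) for _, y in hospitals], dtype=float)
    # FedAvg: weight each site's model by its share of the total data.
    w_global = np.average(local, axis=0, weights=sizes / sizes.sum())

print("global weights after 10 rounds:", np.round(w_global, 3))
```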
Article | Citation - WoS: 13 | Citation - Scopus: 11
A New Flow-Based Approach for Enhancing Botnet Detection Using Convolutional Neural Network and Long Short-Term Memory (Springer London Ltd, 2025)
Asadi, Mehdi; Heidari, Arash; Navimipour, Nima Jafari
Despite the growing research and development of botnet detection tools, an ever-increasing spread of botnets and their victims is being witnessed. Because botnets frequently adapt to the evolving responses offered by host-based and network-based detection mechanisms, traditional methods lack adequate defense against botnet threats. In this regard, we suggest employing flow-based detection methods and conducting behavioral analysis of network traffic. To enhance the performance of these approaches, this paper proposes a hybrid deep-learning method that combines a convolutional neural network (CNN) and long short-term memory (LSTM). The CNN efficiently extracts spatial features from network traffic, such as patterns in flow characteristics, while the LSTM captures temporal dependencies critical to detecting sequential patterns in botnet behaviors. Experimental results reveal the effectiveness of the proposed CNN-LSTM method in classifying botnet traffic. Compared with the results obtained by the leading method on the identical dataset, the proposed approach showed noteworthy enhancements, including a 0.61% increase in precision, a 0.03% gain in accuracy, a 0.42% improvement in recall, a 0.51% improvement in F1-score, and a 0.10% reduction in the false-positive rate. Moreover, the CNN-LSTM framework exhibited robust overall performance and notable speed in botnet traffic identification. Additionally, we evaluated the impact of three widely recognized adversarial attacks on the Information Security Centre of Excellence dataset and the Information Security and Object Technology dataset. The findings underscored the proposed method's ability to deliver promising performance in the face of these adversarial challenges.
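The CNN-LSTM pipeline described here maps naturally onto a few Keras layers: Conv1D blocks for spatial flow features, an LSTM for their temporal ordering, and a sigmoid head for the botnet/benign decision. The input shape and layer sizes below are illustrative assumptions, not the paper's exact architecture:

```python
# Compact Keras sketch of a CNN-LSTM botnet-traffic classifier:
# Conv1D extracts spatial flow patterns, LSTM models their sequence.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(100, 20)),          # 100 time steps x 20 flow features
    layers.Conv1D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling1D(2),
    layers.Conv1D(32, 3, activation="relu", padding="same"),
    layers.LSTM(64),                        # temporal dependencies across flows
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # botnet vs. benign traffic
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])
model.summary()
```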
Book Part | Citation - Scopus: 2
Machine/Deep Learning Techniques for Multimedia Security (Inst Engineering Tech-IET, 2023)
Heidari, Arash; Navimipour, Nima Jafari; Azad, Poupak
Multimedia security based on Machine Learning (ML) / Deep Learning (DL) is a field of study that focuses on using ML/DL techniques to protect multimedia data such as images, videos, and audio from unauthorized access, manipulation, or theft. Its main subject is developing and implementing algorithms and systems that use ML/DL techniques to detect and prevent security breaches in multimedia data. These systems use techniques like watermarking, encryption, and digital-signature verification to protect multimedia data. The advantages of using ML/DL in multimedia security include improved accuracy, scalability, and automation. ML/DL algorithms can improve the accuracy of detecting security threats and help identify vulnerabilities in multimedia data. Additionally, ML models can be scaled up to handle large amounts of multimedia data, making them helpful in protecting big datasets. Finally, ML/DL algorithms can automate the process of multimedia security, making it easier and more efficient to protect multimedia data. The disadvantages include data availability, complexity, and black-box models. ML and DL algorithms require large amounts of data to train the models, which can be challenging to obtain. Developing and implementing ML algorithms can also be complex, requiring specialized skills and knowledge. Finally, ML/DL models are often black boxes, meaning it can be difficult to understand how they reach their decisions, which is a challenge when explaining those decisions to stakeholders or auditors. Overall, multimedia security based on ML/DL is a promising area of research with many potential benefits. However, it also presents challenges that must be addressed to ensure the security and privacy of multimedia data.

Article | Citation - WoS: 8 | Citation - Scopus: 11
A Nano-Scale Design of Vedic Multiplier for Electrocardiogram Signal Processing Based on a Quantum Technology (AIP Publishing, 2025)
Wang, Yuyao; Darbandi, Mehdi; Ahmadpour, Seyed-Sajad; Navimipour, Nima Jafari; Navin, Ahmad Habibizad; Heidari, Arash; Anbar, Mohammad
An electrocardiogram (ECG) measures the electric signals of the heartbeat to diagnose various heart issues; nevertheless, it is susceptible to noise. ECG signal noise must be removed because it significantly affects ECG signal characteristics. In addition, speed and occupied area play a fundamental role in ECG structures. The Vedic multiplier is an essential part of signal processing and is necessary for various applications, such as ECG, clusters, and finite-impulse-response filter architectures. Every ECG has a Vedic multiplier circuit unit that performs the multiplication and accumulation steps needed to execute continuous and complex operations in signal-processing programs. In the Vedic multiplier framework, however, circuit speed and occupied area are the main limitations, and fixing these defects can drastically improve the performance of this crucial circuit. Quantum technologies are among the most popular solutions for overcoming these shortcomings: a technology such as quantum-dot cellular automata (QCA) can readily address both the occupied area and the speed. Thus, based on quantum technology, this paper proposes a multiplier for ECG using carry-skip adder, half-adder, and XOR circuits. All suggested frameworks use a single-layer design without rotated cells to increase their operability in complex architectures. All designs adopt a coplanar configuration, which improves the circuits' durability and stability. All proposed architectures were designed and validated with QCADesigner 2.0.3. All designed circuits showed a simple structure with minimal quantum cells, area, and delay with respect to state-of-the-art structures.
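The Urdhva Tiryagbhyam scheme behind such Vedic multipliers is easy to verify at the gate level. The sketch below builds the classic 2x2-bit multiplier from AND partial products and two half adders (an XOR for sum, an AND for carry); it models only the logic that the QCA cells implement, not the cell layout, and the decomposition shown is the textbook one rather than the paper's specific design:

```python
# Gate-level model of a 2x2-bit Vedic (Urdhva Tiryagbhyam) multiplier:
# four AND partial products combined with two half adders.
def half_adder(a, b):
    return a ^ b, a & b              # (sum, carry)

def vedic_2x2(a1, a0, b1, b0):
    """Multiply two 2-bit numbers (a1 a0) x (b1 b0) -> 4 result bits."""
    p0 = a0 & b0                            # vertical product (LSBs)
    s1, c1 = half_adder(a1 & b0, a0 & b1)   # crosswise products
    s2, c2 = half_adder(a1 & b1, c1)        # vertical product (MSBs) + carry
    return c2, s2, s1, p0                   # bits q3 q2 q1 q0

# Exhaustive check against ordinary integer multiplication.
for a in range(4):
    for b in range(4):
        bits = vedic_2x2(a >> 1 & 1, a & 1, b >> 1 & 1, b & 1)
        value = sum(bit << i for i, bit in enumerate(reversed(bits)))
        assert value == a * b
print("2x2 Vedic multiplier matches a*b for all 2-bit inputs")
```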
Publication | Citation - WoS: 47 | Citation - Scopus: 55
Everything You Wanted to Know About ChatGPT: Components, Capabilities, Applications, and Opportunities (John Wiley & Sons Ltd, 2024)
Heidari, Arash; Navimipour, Nima Jafari; Zeadally, Sherali; Chamola, Vinay
Conversational Artificial Intelligence (AI) and Natural Language Processing have advanced significantly with the creation of the Generative Pre-trained Transformer chatbot (ChatGPT) by OpenAI. ChatGPT uses deep-learning techniques like the transformer architecture and self-attention mechanisms to replicate human speech and provide coherent, situation-appropriate replies. The model depends mainly on the patterns discovered in its training data, which might result in incorrect or illogical conclusions. In the context of open-domain chats, we investigate the components, capabilities, constraints, and potential applications of ChatGPT, along with future opportunities. We begin by describing the components of ChatGPT, followed by a definition of chatbots. We present a new taxonomy to classify them: rule-based chatbots, retrieval-based chatbots, generative chatbots, and hybrid chatbots. Next, we describe the capabilities and constraints of ChatGPT. Finally, we present potential applications of ChatGPT and future research opportunities. The results showed that ChatGPT, a transformer-based chatbot model, utilizes encoders to produce coherent responses.

Article | Citation - WoS: 28 | Citation - Scopus: 33
A Fuzzy-Based Method for Objects Selection in Blockchain-Enabled Edge-IoT Platforms Using a Hybrid Multi-Criteria Decision-Making Model (MDPI, 2022)
Gardas, Bhaskar B.; Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet
The broad availability of connected and intelligent devices has increased the demand for Internet of Things (IoT) applications that require more intense data storage and processing. However, cloud-based IoT systems are typically located far from end-users and face several issues, including high cloud-server load, slow response times, and a lack of global mobility. Some of these flaws can be addressed with edge computing. In addition, node selection helps avoid common IoT difficulties, including network lifespan, resource allocation, and trust in the acquired data, by selecting the correct nodes at a suitable time. On the other hand, the interconnection of edge and blockchain technologies in the IoT offers a fresh perspective on access-control framework design. This article provides a novel node-selection approach for blockchain-enabled edge IoT that delivers quick and dependable node selection. Moreover, fuzzy logic was used to handle numerical and linguistic data simultaneously. In addition, the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), a powerful tool for examining Multi-Criteria Decision-Making (MCDM) problems, is used. The suggested fuzzy-based technique employs three input criteria to select the correct IoT node for a given mission in IoT-edge situations. The outcomes of the experiments indicate that the proposed framework enhances the parameters under consideration.
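The TOPSIS ranking at the core of this selection method can be sketched compactly: normalize the decision matrix, weight it, and rank candidates by closeness to the ideal solution. The three criteria, their weights, and the decision matrix below are illustrative assumptions (the paper additionally fuzzifies the inputs before this ranking step):

```python
# Minimal TOPSIS sketch: rank candidate IoT nodes by relative closeness
# to the ideal solution across benefit and cost criteria.
import numpy as np

# Rows: candidate nodes; columns: e.g. residual energy, trust, link latency.
M = np.array([[0.8, 0.9, 30.0],
              [0.6, 0.7, 10.0],
              [0.9, 0.5, 20.0]])
weights = np.array([0.4, 0.4, 0.2])
benefit = np.array([True, True, False])    # latency is a cost criterion

R = M / np.linalg.norm(M, axis=0)          # vector-normalize each criterion
V = R * weights                            # weighted normalized matrix

ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
closeness = d_neg / (d_pos + d_neg)

print("closeness scores:", np.round(closeness, 3))
print("best node:", int(np.argmax(closeness)))
```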
Review | Citation - WoS: 24 | Citation - Scopus: 24
The History of Computing in Iran (Persia) Since the Achaemenid Empire (MDPI, 2022)
Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet
Persia was the early name for the territory that is currently recognized as Iran. Iran's proud history starts with the Achaemenid Empire, which began in the 6th century BCE (c. 550). From the Achaemenid Empire's early days, Iranians provided numerous innovative ideas, breakthroughs, and technologies that are often taken for granted today or whose origins are mostly unknown. To trace the history of computing systems in Iran, we must pay attention to everything that can perform computing. Because of Iran's position in the ancient world, studying the history of computing in this country is an exciting subject. The history of computing in Iran started very far from the digital systems of the 20th century, with the invention of mathematical theories and methods for performing simple calculations; the Achaemenid Empire provides the first recorded sign of computing systems in Persia. This paper attempts to shed light on Persia's computing heritage, dating back to 550 BC, looking at both the ancient and modern periods. In the ancient section, we go through the history of computing in the Achaemenid Empire, followed by a description of the tools used for calculations. In the modern section, we discuss the transition to the Internet era, the formation of a computer-related educational system, the evolution of data networks, the growth of the software and hardware industry, cloud computing, and the Internet of Things (IoT). We highlight the findings in each period that mark vital sparks of computing evolution, tracing the gradual growth of computing in Persia from its early stages to the present. The findings indicate that the development of computing and related technologies has been accelerating rapidly in recent years.
Article | Citation - WoS: 107 | Citation - Scopus: 182
The Applications of Machine Learning Techniques in Medical Data Processing Based on Distributed Computing and the Internet of Things (Elsevier Ireland Ltd, 2023)
Aminizadeh, Sarina; Heidari, Arash; Toumaj, Shiva; Darbandi, Mehdi; Navimipour, Nima Jafari; Rezaei, Mahsa; Talebi, Samira
Medical data processing has grown into a prominent topic in recent decades, with the primary goal of maintaining patient data via new information technologies, including the Internet of Things (IoT) and sensor technologies, which generate patient indexes in hospital data networks. Innovations like distributed computing, Machine Learning (ML), blockchain, chatbots, wearables, and pattern recognition can adequately enable the collection and processing of medical data for decision-making in the healthcare era. In particular, distributed computing assists experts in the disease-diagnosis process by digesting huge volumes of data swiftly and producing personalized smart suggestions. On the other side, the world is confronting the COVID-19 outbreak, so an early diagnosis technique is crucial to lowering the fatality rate. ML systems aid radiologists in examining the enormous number of medical images, but they demand a huge quantity of training data that must be unified for processing. Hence, developing Deep Learning (DL) confronts multiple issues, such as conventional data collection, quality assurance, knowledge exchange, privacy preservation, administrative laws, and ethical considerations. In this research, we convey an inclusive analysis of the most recent studies on distributed computing platform applications based on five categorized platforms: cloud computing, edge, fog, IoT, and hybrid platforms. We evaluated 27 articles regarding the proposed frameworks, deployed methods, and applications, noting the advantages, drawbacks, and applied datasets, and screening for security mechanisms and the presence of Transfer Learning (TL). As a result, most recent research (about 43%) used the IoT platform as the environment for the proposed architecture, and most of the studies (about 46%) were done in 2021. In addition, the most popular DL algorithm was the Convolutional Neural Network (CNN), at 19.4%. However technology changes, delivering appropriate therapy for patients remains the primary aim of healthcare-associated departments. Therefore, further studies are recommended to develop more functional architectures based on DL and distributed environments and to better evaluate the present healthcare data-analysis models.

Article | Citation - WoS: 81 | Citation - Scopus: 95
The Applications of Nature-Inspired Algorithms in Internet of Things-Based Healthcare Service: A Systematic Literature Review (Wiley, 2024)
Amiri, Zahra; Heidari, Arash; Zavvar, Mohammad; Navimipour, Nima Jafari; Esmaeilpour, Mansour
This work revolves around the intersection of nature-inspired algorithms and the IoT within the healthcare domain, addressing the emerging trends and potential synergies between nature-inspired computational approaches and IoT technologies for advancing healthcare services. Our research aims to fill gaps in addressing algorithmic integration challenges, real-world implementation issues, and the efficacy of nature-inspired algorithms in IoT-based healthcare. We provide insights into the practical aspects and limitations of such applications through a systematic literature review. Specifically, we address the need for a comprehensive understanding of the applications of nature-inspired algorithms in IoT-based healthcare, identifying gaps such as the lack of standardized evaluation metrics and of studies on integration challenges and security considerations. By bridging these gaps, our paper offers insights and directions for future research in this domain, exploring the diverse landscape of nature-inspired algorithms in healthcare. Our chosen methodology is a Systematic Literature Review (SLR) to investigate related papers rigorously. Categorizing these algorithms into groups such as genetic algorithms, particle swarm optimization, cuckoo algorithms, ant colony optimization, other approaches, and hybrid methods, we employ meticulous classification based on critical criteria. MATLAB emerges as the predominant programming language, constituting 37.9% of cases. Our evaluation emphasizes adaptability as the paramount parameter, accounting for 18.4% of considerations. By shedding light on attributes, limitations, and potential directions for future research and development, this review aims to contribute to a comprehensive understanding of nature-inspired algorithms in the dynamic landscape of IoT-based healthcare services. Key contributions include: providing a complete overview of the current issues associated with nature-inspired algorithms in IoT-based healthcare services; providing a thorough overview of the methodologies present in research studies on IoT-based healthcare services; evaluating each tailored nature-inspired algorithm from many perspectives, such as advantages, restrictions, datasets, security involvement, and simulation settings; outlining the critical aspects that motivate the cited approaches to enhance future research; and illustrating descriptions of certain IoT-based healthcare services used in various studies.
Article | Citation - WoS: 75 | Citation - Scopus: 86
A GSO-based multi-objective technique for performance optimization of blockchain-based industrial Internet of things (Wiley, 2024)
Zanbouri, Kouros; Darbandi, Mehdi; Nassr, Mohammad; Heidari, Arash; Navimipour, Nima Jafari; Yalcin, Senay
The latest developments in the industrial Internet of things (IIoT) have opened up possibilities for many industries. Blockchain is considered a potential approach to the massive data security and efficiency problems of the IIoT, satisfying its main needs for high throughput, high security, and high efficiency, and the blockchain mechanism is a significant means of boosting data protection and performance. In the quest to amplify the capabilities of blockchain-based IIoT, a pivotal role is accorded to the Glowworm Swarm Optimization (GSO) algorithm. Inspired by the collaborative brilliance of glowworms in nature, the GSO algorithm offers a unique approach to harmonizing the blockchain's conflicting objectives. This paper proposes a new approach to the performance optimization of blockchain-based IIoT using the GSO algorithm. The proposed system addresses the scalability challenges typically associated with blockchain technology by efficiently managing interactions among nodes and dynamically adapting to network demands. The GSO algorithm optimizes resource allocation and decision-making, reducing inefficiencies and bottlenecks. Extensive simulation and computational study showed that the proposed GSO-based method considerably improves the objective function and the performance of blockchain-based IIoT systems compared to traditional algorithms, providing a more scalable, efficient, and secure solution for industrial applications. We introduced a blockchain-based IIoT using a glowworm swarm optimization algorithm motivated by glowworms' behavior, their probabilistic movement toward each other, and luciferin quantity. The proposed approach significantly improves four-way trade-offs among scalability, decentralization, cost, and latency.
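The GSO mechanics this paper leans on, luciferin decay plus probabilistic movement toward brighter neighbors, can be shown on a toy single-objective problem. All constants and the 2-D objective below are illustrative assumptions, not the paper's multi-objective IIoT formulation:

```python
# Minimal glowworm swarm optimization (GSO) loop: luciferin update plus
# probabilistic movement toward brighter neighbors within a fixed radius.
import numpy as np

rng = np.random.default_rng(2)
n, dims, iters = 20, 2, 100
rho, gamma_l, step, radius = 0.4, 0.6, 0.03, 1.5

def objective(x):                       # toy objective: maximize -(x^2 + y^2)
    return -np.sum(x ** 2, axis=-1)

pos = rng.uniform(-3, 3, size=(n, dims))
luciferin = np.zeros(n)

for _ in range(iters):
    # Luciferin update: decay the old value, add fitness at the position.
    luciferin = (1 - rho) * luciferin + gamma_l * objective(pos)
    new_pos = pos.copy()
    for i in range(n):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = np.where((dist < radius) & (luciferin > luciferin[i]))[0]
        if len(nbrs) == 0:
            continue
        # Pick a brighter neighbor with probability proportional to its
        # luciferin advantage, then take a small step toward it.
        p = luciferin[nbrs] - luciferin[i]
        j = rng.choice(nbrs, p=p / p.sum())
        direction = pos[j] - pos[i]
        new_pos[i] += step * direction / np.linalg.norm(direction)
    pos = new_pos

print("best position found:", pos[np.argmax(objective(pos))])
```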
Review | Citation - WoS: 54 | Citation - Scopus: 61
Botnets Unveiled: A Comprehensive Survey on Evolving Threats and Defense Strategies (Wiley, 2024)
Asadi, Mehdi; Jamali, Mohammad Ali Jabraeil; Heidari, Arash; Navimipour, Nima Jafari
Botnets have emerged as a significant internet security threat, comprising networks of compromised computers under the control of command-and-control (C&C) servers. These malevolent entities enable a range of malicious activities, from denial-of-service (DoS) attacks to spam distribution and phishing. Each bot operates as malicious binary code on a vulnerable host, granting remote control to attackers, who can harness the combined processing power of these compromised hosts for synchronized, highly destructive attacks while maintaining anonymity. This survey explores botnets and their evolution, covering their life cycles, C&C models, communication protocols, detection methods, the unique environments botnets operate in, and strategies to evade detection tools. It analyzes research challenges and future directions related to botnets, with a particular focus on evasion and detection techniques, including methods like encryption and the use of covert channels, and on the reinforcement of botnets. By reviewing existing research, the survey provides a comprehensive overview of botnets, from their origins to their evolving tactics, and evaluates how botnets evade detection and how to counteract their activities. Its primary goal is to inform the research community about the changing landscape of botnets and the challenges in combating these threats, offering guidance on addressing security concerns effectively by highlighting evasion and detection methods. The survey concludes by presenting future research directions, including the use of encryption and covert channels for detection and strategies to strengthen botnets, aiming to guide researchers in developing more robust security measures to combat botnets effectively.

Article | Citation - WoS: 109
A reliable method for data aggregation on the industrial internet of things using a hybrid optimization algorithm and density correlation degree (Springer, 2024)
Heidari, Arash; Shishehlou, Houshang; Darbandi, Mehdi; Navimipour, Nima Jafari; Yalcin, Senay
The Internet of Things (IoT) is a new information technology sector in which each device may receive and distribute data across a network. The Industrial IoT (IIoT) and related areas, such as Industrial Wireless Networks (IWNs), big data, and cloud computing, have made significant strides recently. Using the IIoT requires a reliable and effective data-collection structure, such as a spanning tree. Many previous spanning tree algorithms ignore failure and mobility; in such cases, the spanning tree breaks, making data delivery to the base station difficult. This study proposes an algorithm that constructs an optimal spanning tree by combining an artificial bee colony, genetic operators, and density correlation degree. In this technique, the trees' fitness is measured using the devices' hop-count distances from the base station, their residual energy, and their mobility probabilities. The simulation outcomes highlight the enhanced data-collection reliability achieved by the suggested algorithm compared to established methods such as the Reliable Spanning Tree (RST) construction algorithm for the IIoT and the Hop Count Distance (HCD) based construction algorithm. The proposed algorithm shows improved reliability across diverse node numbers, considering key parameters including reliability, energy consumption, displacement probability, and distance.
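The tree-fitness evaluation this abstract describes (hop count, residual energy, mobility probability) can be sketched directly. The weights, the normalization, and the example tree below are illustrative assumptions; the paper evolves such trees with an artificial bee colony plus genetic operators rather than scoring a single fixed tree:

```python
# Sketch of a spanning-tree fitness function: score a candidate tree by its
# devices' hop counts to the base station, residual energy, and mobility.

# parent[node] = its parent in the candidate tree; node 0 is the base station.
parent = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2}
energy = {1: 0.9, 2: 0.6, 3: 0.8, 4: 0.4, 5: 0.7}    # residual energy (0..1)
mobility = {1: 0.1, 2: 0.3, 3: 0.2, 4: 0.5, 5: 0.1}  # prob. of moving away

def hop_count(node):
    """Hops from a device up the tree to the base station."""
    hops = 0
    while node != 0:
        node, hops = parent[node], hops + 1
    return hops

def fitness(w_hop=0.4, w_energy=0.4, w_mob=0.2, max_hops=5):
    """Higher is better: short paths, high energy, low mobility."""
    score = 0.0
    for node in parent:
        score += (w_hop * (1 - hop_count(node) / max_hops)
                  + w_energy * energy[node]
                  + w_mob * (1 - mobility[node]))
    return score / len(parent)

print(f"tree fitness: {fitness():.3f}")
```

A candidate tree with lower hop counts to the base station, more residual energy, and less mobile relay nodes scores higher, which is what the evolutionary search then maximizes.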
Article | Citation - WoS: 117 | Citation - Scopus: 185
Opportunities and Challenges of Artificial Intelligence and Distributed Systems to Improve the Quality of Healthcare Service (Elsevier, 2024)
Aminizadeh, Sarina; Heidari, Arash; Dehghan, Mahshid; Toumaj, Shiva; Rezaei, Mahsa; Navimipour, Nima Jafari; Unal, Mehmet
The healthcare sector, characterized by vast datasets and many diseases, is pivotal in shaping community health and overall quality of life. Traditional healthcare methods, often limited in disease prevention, predominantly react to illnesses after their onset rather than proactively averting them. The advent of Artificial Intelligence (AI) has ushered in a wave of transformative applications designed to enhance healthcare services, with Machine Learning (ML) as a noteworthy subset of AI. ML empowers computers to analyze extensive datasets, while Deep Learning (DL), a specific ML methodology, excels at extracting meaningful patterns from these data troves. Despite notable technological advancements in recent years, the full potential of these applications within medical contexts remains largely untapped, primarily due to the medical community's cautious stance toward novel technologies. The motivation of this paper lies in recognizing the pivotal role of the healthcare sector in community well-being and the necessity of a shift toward proactive healthcare approaches. To our knowledge, no comprehensive published review delves into ML, DL, and distributed systems together with the aim of elevating the Quality of Service (QoS) in healthcare. This study seeks to bridge that gap by presenting a systematic and organized review of prevailing ML, DL, and distributed-system algorithms as applied in healthcare settings. We outline key challenges that both current and future developers may encounter, with a particular focus on approach, data utilization, strategy, and development processes. Our findings reveal that the Internet of Things (IoT) stands out as the most frequently utilized platform (44.3%), with disease diagnosis emerging as the predominant healthcare application (47.8%). Notably, discussions center significantly on the prevention and identification of cardiovascular diseases (29.2%). The studies under examination employ a diverse range of ML and DL methods, along with distributed systems, with Convolutional Neural Networks (CNNs) being the most commonly used (16.7%), followed by Long Short-Term Memory (LSTM) networks (14.6%) and shallow learning networks (12.5%). In evaluating QoS, the predominant emphasis is on the accuracy parameter (80%). This study highlights how ML, DL, and distributed systems reshape healthcare, contributing to advancing healthcare quality, bridging the gap between technology and medical adoption, and benefiting practitioners and patients.
Article | Citation - WoS: 38 | Citation - Scopus: 48
Applications of Deep Learning in Alzheimer's Disease: A Systematic Literature Review of Current Trends, Methodologies, Challenges, Innovations, and Future Directions (Springer, 2024)
Toumaj, Shiva; Heidari, Arash; Shahhosseini, Reza; Navimipour, Nima Jafari
Alzheimer's Disease (AD) constitutes a significant global health issue and is expected to affect 106 million people within the next 40 years. Although more and more people are developing AD, there are still no effective drugs to treat it, which underscores how important it is to detect and treat AD quickly. Recently, Deep Learning (DL) techniques have increasingly been used to diagnose AD, with claims of better accuracy in drug repurposing, medication recognition, and labeling. This review meticulously examines works that apply DL to Alzheimer's disease, covering methods such as Natural Language Processing (NLP), drug repurposing, classification, and identification. For these methods, we examine their pros and cons, paying special attention to explainability, safety, and applicability in medical settings. One important finding is that Convolutional Neural Networks (CNNs) are the models most often used in AD research and that Python is the most common choice for DL work. In our assessment, security problems, like data protection and model stability, are not examined closely enough in the present research. This study thoroughly examines present methods and also points out areas that need more work, such as better data integration and explainable AI systems. The findings should help guide further research and speed up the creation of DL-based AD identification tools in the future.

