Jafari Navimipour, Nima
Name Variants
Jafari Navimipour,Nima
JAFARI NAVIMIPOUR, Nima
N. Jafari Navimipour
Jafari Navimipour, Nima
Jafari Navimipour,N.
J.,Nima
JAFARI NAVIMIPOUR, NIMA
Jafari Navimipour, N.
Nima Jafari Navimipour
Nima JAFARI NAVIMIPOUR
Jafari Navimipour, NIMA
Jafari Navimipour N.
NIMA JAFARI NAVIMIPOUR
J., Nima
Nima, Jafari Navimipour
Navimipour, Nima Jafari
Navimipour, N.J.
Navimpour, Nima Jafari
Navımıpour, Nıma Jafarı
Jafari Navimipour, Nima Jafari
Job Title
Assoc. Prof. Dr. (Turkish: Doç. Dr.)
Main Affiliation
Computer Engineering, Faculty of Engineering and Natural Sciences, Kadir Has University
Status
Current Staff
Sustainable Development Goals

| SDG | Goal | Research Products |
|---|---|---|
| 15 | Life on Land | 0 |
| 16 | Peace, Justice and Strong Institutions | 0 |
| 14 | Life Below Water | 2 |
| 6 | Clean Water and Sanitation | 0 |
| 3 | Good Health and Well-Being | 12 |
| 17 | Partnerships for the Goals | 0 |
| 4 | Quality Education | 0 |
| 2 | Zero Hunger | 5 |
| 10 | Reduced Inequalities | 0 |
| 7 | Affordable and Clean Energy | 10 |
| 13 | Climate Action | 1 |
| 1 | No Poverty | 0 |
| 9 | Industry, Innovation and Infrastructure | 18 |
| 12 | Responsible Consumption and Production | 5 |
| 8 | Decent Work and Economic Growth | 0 |
| 11 | Sustainable Cities and Communities | 9 |
| 5 | Gender Equality | 0 |

This researcher does not have a Scopus ID.

This researcher does not have a WoS ID.

| Metric | Value |
|---|---|
| Scholarly Output | 113 |
| Articles | 101 |
| Views / Downloads | 1,331 / 14,462 |
| Supervised MSc Theses | 3 |
| Supervised PhD Theses | 1 |
| WoS Citation Count | 3,263 |
| Scopus Citation Count | 4,063 |
| WoS h-index | 32 |
| Scopus h-index | 34 |
| Patents | 0 |
| Projects | 0 |
| WoS Citations per Publication | 28.88 |
| Scopus Citations per Publication | 35.96 |
| Open Access Source | 28 |
| Supervised Theses | 4 |
| Journal | Count |
|---|---|
| Nano Communication Networks | 6 |
| Sustainable Computing-Informatics & Systems | 5 |
| Cluster Computing | 5 |
| International Journal of Communication Systems | 4 |
| Multimedia Tools and Applications | 4 |
Showing page 1 of 14.
Scopus Quartile Distribution
Competency Cloud

Scholarly Output Search Results
Now showing 1 - 10 of 113
Review · Citations: WoS 25, Scopus 24
The History of Computing in Iran (Persia): Since the Achaemenid Empire (MDPI, 2022)
Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet
Persia was the early name for the territory that is currently recognized as Iran. Iran's proud history starts with the Achaemenid Empire, which began in the 6th century BCE (c. 550 BCE). From the empire's early days, Iranians contributed numerous innovative ideas and technologies that are often taken for granted today or whose origins are mostly unknown. To trace the history of computing systems in Iran, we must pay attention to everything that can perform computation, and because of Iran's position in the ancient world, the history of computing in this country is an exciting subject. That history began very far from the digital systems of the 20th century: the Achaemenid Empire provides the first recorded signs of computing in Persia, starting with the invention of mathematical theories and methods for performing simple calculations. This paper sheds light on elements of Persia's computing heritage dating back to 550 BCE, examining both the ancient and modern periods. The ancient section reviews computing in the Achaemenid Empire and describes the tools used for calculation. The modern section discusses the transition to the Internet era, the formation of a computer-related educational system, the evolution of data networks, the growth of the software and hardware industries, cloud computing, and the Internet of Things (IoT). For each period, we highlight findings that mark vital sparks in the evolution of computing, tracing its gradual growth in Persia from its early stages to the present. The findings indicate that the development of computing and related technologies has been accelerating rapidly in recent years.

Article · Citations: WoS 21, Scopus 26
An Energy-Aware IoT Routing Approach Based on a Swarm Optimization Algorithm and a Clustering Technique (Springer, 2022)
Sadrishojaei, Mahyar; Navimipour, Nima Jafari; Reshadi, Midia; Hosseinzadeh, Mehdi
The Internet of Things (IoT) comprises many nodes dispersed around a particular target region and has lately been applied in a variety of sectors, such as smart cities, farming, climatology, smart metering, and waste treatment. Even though the IoT has tremendous potential, some difficulties must be addressed: when building clustering and routing protocols for large-scale IoT networks, uniform energy usage and its optimization are two significant concerns, and clustering and routing are well-known NP-hard optimization problems in the IoT. Among population-based metaheuristics, the chicken swarm algorithm has garnered much interest for solving IoT optimization problems because of the ease with which it can be implemented. Aiming to reduce and balance node energy consumption by choosing the most suitable cluster heads, this work seeks to extend network lifetime. It proposes a new cost function for the homogeneous dispersion of cluster heads and balances exploration and exploitation to create a node clustering protocol based on chicken swarm search, a considerable step forward from previous state-of-the-art protocols. The number of packets received, total power consumption, the number of active nodes, and latency are used to evaluate the integrated clustered routing protocol's overall performance. The proposed strategy is demonstrated to improve power consumption by at least 16 percent.

Article · Citations: WoS 110, Scopus 187
The Applications of Machine Learning Techniques in Medical Data Processing Based on Distributed Computing and the Internet of Things (Elsevier Ireland Ltd, 2023)
Aminizadeh, Sarina; Heidari, Arash; Toumaj, Shiva; Darbandi, Mehdi; Navimipour, Nima Jafari; Rezaei, Mahsa; Talebi, Samira
Medical data processing has grown into a prominent topic in recent decades, with the primary goal of maintaining patient data via new information technologies, including the Internet of Things (IoT) and sensor technologies, which generate patient indexes in hospital data networks. Innovations like distributed computing, Machine Learning (ML), blockchain, chatbots, wearables, and pattern recognition can adequately enable the collection and processing of medical data for decision-making in healthcare. In particular, distributed computing assists experts in the disease-diagnosis process by digesting huge volumes of data swiftly and producing personalized smart suggestions. Meanwhile, the world is confronting the COVID-19 outbreak, so early diagnosis techniques are crucial to lowering the fatality rate, and ML systems help radiologists examine enormous numbers of medical images. Nevertheless, these systems demand huge quantities of training data that must be unified for processing, so developing Deep Learning (DL) solutions confronts multiple issues, such as conventional data collection, quality assurance, knowledge exchange, privacy preservation, administrative regulation, and ethical considerations. In this research, we present an inclusive analysis of the most recent studies of distributed computing platform applications across five categories of platforms: cloud computing, edge, fog, IoT, and hybrid. We evaluated 27 articles with respect to the proposed framework, deployed methods, and applications, noting advantages, drawbacks, and the datasets used, and screening for security mechanisms and the presence of Transfer Learning (TL). The results show that most recent research (about 43%) used the IoT platform as the environment for the proposed architecture, and most studies (about 46%) were published in 2021. The most popular DL algorithm was the Convolutional Neural Network (CNN), at 19.4%. However technology changes, delivering appropriate therapy for patients remains the primary aim of healthcare-associated departments; further studies are therefore recommended to develop more functional architectures based on DL and distributed environments and to better evaluate present healthcare data analysis models.

Article · Citations: WoS 32, Scopus 37
A Cost- and Energy-Efficient SRAM Design Based on a New 5-Input Majority Gate in QCA Nanotechnology (Elsevier, 2024)
Kassa, Sankit; Ahmadpour, Seyed-Sajad; Lamba, Vijay; Misra, Neeraj Kumar; Navimipour, Nima Jafari; Kotecha, Ketan
Quantum-dot Cellular Automata (QCA) is a revolutionary paradigm in the nano-scale VLSI market with the potential to replace the traditional Complementary Metal-Oxide-Semiconductor (CMOS) system. To demonstrate its usefulness, this article presents a QCA-based structure comprising a 5-input majority gate, one of the basic gates in QCA, and a Static Random Access Memory (SRAM) cell with set and reset functionality. The suggested design, with nominal clock zones, provides a reliable, compact, efficient, and durable configuration that helps achieve optimal size and latency while decreasing power consumption. Based on the suggested 5-input majority gate, the realized SRAM architecture improves energy dissipation by 33.95%, cell count by 31.34%, and area by 33.33% compared with the most recent designs. Delay and cost are decreased by 30% and 53.95%, respectively.

Article · Citations: WoS 84, Scopus 100
The Applications of Nature-Inspired Algorithms in Internet of Things-Based Healthcare Service: A Systematic Literature Review (Wiley, 2024)
Amiri, Zahra; Heidari, Arash; Zavvar, Mohammad; Navimipour, Nima Jafari; Esmaeilpour, Mansour
This review addresses the intersection of nature-inspired algorithms and the IoT within the healthcare domain, covering the emerging trends and potential synergies between nature-inspired computational approaches and IoT technologies for advancing healthcare services. Our research aims to fill gaps concerning algorithmic integration challenges, real-world implementation issues, and the efficacy of nature-inspired algorithms in IoT-based healthcare. Through a Systematic Literature Review (SLR), we provide insights into the practical aspects and limitations of such applications. Specifically, we address the need for a comprehensive understanding of the applications of nature-inspired algorithms in IoT-based healthcare, identifying gaps such as the lack of standardized evaluation metrics and of studies on integration challenges and security considerations. By bridging these gaps, the paper offers insights and directions for future research in this domain. The reviewed algorithms are categorized into genetic algorithms, particle swarm optimization, cuckoo algorithms, ant colony optimization, other approaches, and hybrid methods, with meticulous classification based on critical criteria. MATLAB emerges as the predominant programming language, used in 37.9% of cases, and our evaluation identifies adaptability as the paramount parameter, accounting for 18.4% of considerations. By shedding light on attributes, limitations, and potential directions for future research and development, this review aims to contribute to a comprehensive understanding of nature-inspired algorithms in the dynamic landscape of IoT-based healthcare services. Its contributions include providing a complete overview of the current issues associated with nature-inspired algorithms in IoT-based healthcare services; giving a thorough overview of present methodologies in research studies; evaluating each tailored nature-inspired algorithm from multiple perspectives, such as advantages, restrictions, datasets, security involvement, and simulation settings; outlining the critical aspects that motivate the cited approaches; and describing certain IoT-based healthcare services used in various studies.

Article · Citations: WoS 2, Scopus 7
Multimedia Big Data Computing Mechanisms: A Bibliometric Analysis (Springer, 2023)
Rivai, Faradillah Amalia; Navimipour, Nima Jafari; Yalcin, Senay
Massive multimedia data are being created due to the growing reach of the Internet and user-generated content, low-cost commodity devices with cameras (such as cellphones and surveillance systems), and the proliferation of social networks, forming a unique type of big data. Several studies have been conducted in this research area using survey and event-analysis approaches; however, none has investigated the state of knowledge, its features, its evolution, and the emerging trends of multimedia big data. Therefore, in this paper, a bibliometric study using the VOSviewer software is carried out on 1,865 documents from 2008 to 2020. Based on the results, 2013 was the first year in which total publications exceeded 100 articles; the leading countries, most productive organizations, and most prolific authors are identified. The most cited journals, popular publication venues, and hot research topics are also examined. Our investigation uncovered useful information, such as annual publishing patterns, the hottest research topics, the top 10 most important authors and articles, and the most active funding organizations and venues.

Article · Citations: Scopus 10
A New Fog-Based Transmission Scheduler on the Internet of Multimedia Things Using a Fuzzy-Based Quantum Genetic Algorithm (IEEE Computer Society, 2023)
Zanbouri, K.; Al-Khafaji, H.M.R.; Jafari Navimipour, N.; Yalcin, S.
The Internet of Multimedia Things (IoMT) has recently experienced a considerable surge in multimedia-based services. Due to the fast proliferation and transfer of massive data, the IoMT faces service-quality challenges. This paper proposes a novel fog-based multimedia transmission scheme for the IoMT using a Sugeno fuzzy inference system with a quantum genetic optimization algorithm. The fuzzy system devises a mathematically organized strategy for generating fuzzy rules from input and output variables. The Quantum Genetic Algorithm (QGA) is a metaheuristic that combines genetic algorithms with quantum computing theory, incorporating critical elements of quantum computing such as superposition and entanglement; this provides a robust representation of population diversity and the capacity for rapid convergence and high accuracy. Simulations and computational analysis show that the proposed fuzzy-based QGA scheme improves packet delivery ratio and throughput while reducing end-to-end latency and delay compared with traditional algorithms such as the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Heterogeneous Earliest-Finish-Time (HEFT), and Ant Colony Optimization (ACO). Consequently, it provides a more efficient scheme for multimedia transmission in the IoMT.

Review · Citations: WoS 97, Scopus 123
Adventures in Data Analysis: A Systematic Review of Deep Learning Techniques for Pattern Recognition in Cyber-Physical Systems (Springer, 2023)
Amiri, Zahra; Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet; Mousavi, Ali
Machine Learning (ML) and Deep Learning (DL) have achieved high success in many textual, auditory, medical-imaging, and visual recognition patterns. Given the importance of ML/DL in recognizing patterns with high accuracy, many researchers have proposed solutions for improving pattern-recognition performance using ML/DL methods. Because of the intelligent pattern recognition that machines require in image processing, and the outstanding role of big data in generating state-of-the-art modern and classical approaches to pattern recognition, we conducted a thorough Systematic Literature Review (SLR) of DL approaches for big-data pattern recognition. We discuss different research issues and possible paths in which these techniques might help materialize the pattern-recognition notion, and we classify 60 of the most cutting-edge articles addressing pattern-recognition issues into ten categories based on the DL/ML method used: Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Generative Adversarial Network (GAN), Autoencoder (AE), Ensemble Learning (EL), Reinforcement Learning (RL), Random Forest (RF), Multilayer Perceptron (MLP), Long Short-Term Memory (LSTM), and hybrid methods. Each is investigated in terms of influential properties such as the main idea, advantages, disadvantages, strategies, simulation environment, datasets, and security issues. The results indicate that most of the articles were published in 2021. Moreover, important parameters such as accuracy, adaptability, fault tolerance, security, scalability, and flexibility were covered in these investigations.

Article · Citations: WoS 191, Scopus 220
A Secure Intrusion Detection Platform Using Blockchain and Radial Basis Function Neural Networks for Internet of Drones (IEEE, 2023)
Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet
The Internet of Drones (IoD) is built on the Internet of Things (IoT) by replacing things with drones while retaining incomparable features. Because of its vital applications, IoD technologies have attracted much attention in recent years. Nevertheless, gaining the necessary degree of public acceptance of the IoD without demonstrating safety and security for human life is exceedingly difficult. In addition, intrusion detection systems (IDSs) in the IoD confront several obstacles because of the dynamic network architecture, particularly in balancing detection accuracy and efficiency. To increase the performance of the IoD network, we propose a blockchain-based radial basis function neural network (RBFNN) model. The proposed method can improve data integrity and storage for smart decision-making across different IoDs. We discuss the use of blockchain to create decentralized predictive analytics and a model for effectively applying and sharing deep learning (DL) methods in a decentralized fashion. We also assess the model on a variety of datasets to demonstrate the viability and efficacy of the blockchain-based DL technique in IoD contexts. The findings show that the suggested model is an excellent option for developing classifiers while adhering to the constraints imposed by network intrusion detection. Furthermore, the proposed model can outperform cutting-edge methods in terms of specificity, F1 score, recall, precision, and accuracy.

Article · Citations: WoS 4, Scopus 8
Quantum-Based Serial-Parallel Multiplier Circuit Using an Efficient Nano-Scale Serial Adder (Society for Microelectronics, Electronic Components and Materials (MIDEM), 2024)
Wu, Hongyu; Jiang, Shuai; Seyedi, Saeid; Navimipour, Nima Jafari
Quantum-dot cellular automata (QCA) is one of the newest nanotechnologies and a strong candidate to replace conventional complementary metal-oxide-semiconductor (CMOS) technology. Rather than defining voltage levels, it identifies logic states by the positions of individual electrons, and it covers a wide range of optimization factors, including reduced power consumption, quick transitions, and an extraordinarily dense structure. The serial-parallel multiplier (SPM) is an important circuit in its own right and central to the design of larger circuits: it integrates the benefits of serial and parallel processing to increase efficiency and decrease computation time, making it a crucial element in numerous applications, including complex arithmetic computation and signal processing. This paper presents a new optimized QCA-based SPM circuit that improves the multiplier's performance and enhances the overall design, combining a high-performance architecture with efficient path planning. The proposed circuit is built on the majority gate and a 1-bit serial adder (BSA); the BSA circuit has 34 cells, a 0.04 µm² area, and uses 0.5 clock cycles. The results show that the proposed QCA-based SPM circuit occupies a mere 0.28 µm² area, requires 222 QCA cells, and exhibits a latency of 1.25 clock cycles. This work contributes to the existing literature on QCA technology and emphasizes its capabilities in advancing VLSI circuit layout through optimized performance.
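Several of the QCA papers listed above build their circuits from majority gates, the primitive of QCA design. For readers unfamiliar with the primitive, a 5-input majority gate simply outputs the majority vote of its inputs, and the 3-input variant reduces to AND or OR when one input is fixed. A minimal Python sketch of the Boolean functions involved (this models only the logic, not the authors' QCA circuits):

```python
def maj3(a: int, b: int, c: int) -> int:
    """3-input majority gate: 1 when at least two inputs are 1."""
    return int(a + b + c >= 2)

def maj5(a: int, b: int, c: int, d: int, e: int) -> int:
    """5-input majority gate: 1 when at least three inputs are 1."""
    return int(a + b + c + d + e >= 3)

# Fixing one input of maj3 yields the basic gates QCA designs rely on:
#   maj3(a, b, 0) == a AND b        maj3(a, b, 1) == a OR b
for a in (0, 1):
    for b in (0, 1):
        assert maj3(a, b, 0) == (a & b)
        assert maj3(a, b, 1) == (a | b)
```

This reduction to AND/OR is why a single efficient majority-gate layout can lower the cell count of a whole design, as the SRAM paper's reported savings illustrate.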

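The serial-parallel multiplier in the final entry implements, in hardware, the classic shift-add scheme: one bit of the serial operand is consumed per clock cycle while the other operand is held in parallel. A behavioral Python sketch of that scheme (illustrative only; the paper's contribution is the QCA circuit realization, not this algorithm):

```python
def serial_parallel_multiply(a: int, b: int, n_bits: int = 8) -> int:
    """Model of a bit-serial shift-add multiplier: each loop iteration
    corresponds to one clock cycle consuming one bit of `a`, while `b`
    is available in parallel the whole time."""
    product = 0
    for i in range(n_bits):
        if (a >> i) & 1:          # next serial bit of `a`
            product += b << i     # add the shifted parallel operand
    return product
```

In hardware the accumulation is done by the 1-bit serial adder the paper optimizes, which is why the adder's cell count and latency dominate the multiplier's cost.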

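The intrusion-detection entry above uses radial basis function neural networks (RBFNNs) as its classifier. An RBFNN's forward pass is a weighted sum of Gaussian kernels centered on prototype points; a minimal NumPy sketch of that forward pass (the centers and weights here are hypothetical placeholders, not the paper's trained model):

```python
import numpy as np

def rbfnn_forward(x: np.ndarray, centers: np.ndarray,
                  widths: np.ndarray, weights: np.ndarray,
                  bias: float = 0.0) -> float:
    """Forward pass of an RBF network:
    output = sum_j weights[j] * exp(-||x - centers[j]||^2 / (2 * widths[j]^2)) + bias
    """
    d2 = ((centers - x) ** 2).sum(axis=1)       # squared distance to each center
    phi = np.exp(-d2 / (2.0 * widths ** 2))     # Gaussian hidden activations
    return float(phi @ weights + bias)

# Two hypothetical hidden units on a 1-D input:
centers = np.array([[0.0], [1.0]])
widths = np.array([1.0, 1.0])
weights = np.array([1.0, -1.0])
score = rbfnn_forward(np.array([0.0]), centers, widths, weights)
```

Because each hidden unit responds only near its center, training reduces to placing centers and fitting the linear output weights, which keeps RBFNNs cheap enough for the per-node deployment the paper's decentralized setting requires.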