Topic Review
Fine-Grained Change Detection
Fine-grained change detection in sensor data is critically important in practice, yet it remains very challenging for artificial intelligence. It is the process of identifying differences in the state of an object or phenomenon where the differences are class-specific and difficult to generalise. As a result, many recent technologies that leverage big data and deep learning struggle with this task.
  • 825
  • 12 Jul 2021
Topic Review
Machine Learning for Crop Diseases and Pests
Rapid population growth has resulted in an increased demand for agricultural goods, and pests and diseases are major obstacles to meeting this demand. It is therefore very important to develop efficient methods for the automatic detection, identification, and prediction of pests and diseases in agricultural crops. To perform such automation, Machine Learning (ML) techniques can be used to derive knowledge and relationships from the data under analysis.
  • 823
  • 16 Sep 2022
Topic Review
Quote Notation
Quote notation is a representation of the rational numbers based on Kurt Hensel's p-adic numbers. In quote notation, arithmetic operations take particularly simple, consistent forms, producing exact answers with no roundoff error. Quote notation’s arithmetic algorithms work in a right-to-left direction; addition, subtraction, and multiplication algorithms are the same as for natural numbers, and division is easier than the usual division algorithm. The notation was invented by Eric Hehner of the University of Toronto and Nigel Horspool, then at McGill University, and published in the SIAM Journal on Computing, v.8, n.2, May 1979, pp. 124–134.
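For illustration, the sketch below (in Python, not part of the original publication) computes a quote-notation string for a rational number by generating its 10-adic digits until the numerator state repeats; it assumes a denominator coprime to the base, so the expansion is purely periodic, and the repeating digits are written before the quote mark.

from math import gcd

def quote_notation(a, b, base=10):
    """Return the quote-notation string of a/b (repeating part before the quote).

    Minimal sketch: assumes b > 0 and gcd(b, base) == 1, so the expansion
    is purely periodic with no radix point.
    """
    g = gcd(a, b)
    a, b = a // g, b // g
    assert gcd(b, base) == 1, "denominator must be coprime to the base"
    inv_b = pow(b, -1, base)          # b^-1 mod base
    digits, seen = [], {}             # seen maps numerator state -> position
    while a not in seen:
        seen[a] = len(digits)
        d = (a * inv_b) % base        # next 10-adic digit, least significant first
        digits.append(d)
        a = (a - d * b) // base       # exact division: a - d*b is divisible by base
    start = seen[a]                   # digits[start:] repeat forever to the left
    rep = digits[start:][::-1]        # write most significant (leftmost) first
    fixed = digits[:start][::-1]
    return "".join(map(str, rep)) + "'" + "".join(map(str, fixed))

print(quote_notation(1, 3))   # 6'7   since ...6667 * 3 = ...0001 = 1
print(quote_notation(-1, 1))  # 9'    since ...999 = -1 in 10-adic arithmetic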
  • 823
  • 24 Nov 2022
Topic Review
Analysing Hucul Horses by AI
The neural classification system proposed in this paper, a multi-layered artificial neural network, was implemented in the MATLAB programming environment. MATLAB is a tool focused mainly on scientific and technical calculations; it offers a wide spectrum of software libraries, the so-called Toolboxes, which can be used, for example, to create and optimize neural networks, and it is fully compatible with other programming environments. MATLAB supports rapid prototyping, enabling a wide range of learning algorithms, the selection of an optimal neural network architecture, the selection of the most efficient neuron activation functions, and the tuning of learning parameters. The design of the network is of key significance both for the learning process and for the quality of its operation in later stages. The set of input data, the purpose, and the expected results have a significant impact on its configuration; a key assumption is accounting for the factual links between the set of explanatory (input) variables and the output. Artificial neural networks make it possible to capture relationships and dependencies between data in circumstances where traditional analytical methods would not yield satisfactory solutions. The use of ANNs enables objective assessment of individual animals by taking into account only the factors essential for determining horses' performance and breeding value. Preliminary results of applying artificial neural networks to predict the utility value of Hucul horses from a specific set of features seem promising, offering the potential to evaluate animals based on the available information about them.
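The original system was built in MATLAB; as a language-neutral illustration, here is a minimal sketch of a comparable multi-layered network in Python with scikit-learn. The feature dimensions, class labels, and data are hypothetical placeholders, not the study's actual variables.

# Minimal sketch of a multi-layer perceptron classifier of the kind described,
# using scikit-learn instead of MATLAB; features and labels are synthetic
# stand-ins, not the study's actual horse data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))        # e.g. conformation and performance scores
y = rng.integers(0, 3, size=200)     # e.g. three utility-value classes

model = make_pipeline(
    StandardScaler(),                          # scale inputs before training
    MLPClassifier(hidden_layer_sizes=(16, 8),  # two hidden layers
                  activation="tanh",           # one of several activations to compare
                  max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.score(X, y))             # training accuracy on the synthetic data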
  • 819
  • 13 Oct 2020
Topic Review
Application Scenarios of Using Knowledge Graph
In dynamic, complex cyber environments, Cyber Threat Intelligence (CTI) and the risk of cyberattacks are both increasing, which means that organizations need a strong understanding of both their internal and their external CTI. The potential of cybersecurity knowledge graphs lies in their ability to aggregate and represent knowledge about cyber threats, as well as to manage and reason with that knowledge. While most existing research has focused on how to construct a complete knowledge graph, how to utilize the knowledge graph to tackle real-world industrial problems in cyberattack and defense scenarios remains unclear.
  • 819
  • 12 Aug 2022
Topic Review
Discrimination, Bias, Fairness, and Trustworthy AI
A set of specialized variables, such as security, privacy, and responsibility, has been identified as the means by which the principles in the Principled AI International Framework are operationalized. Bias, discrimination, and fairness are approached by the Principled AI International Framework mainly with an operational interest.
  • 817
  • 01 Jul 2022
Topic Review
Smartphone Security and Privacy
There is an exponential rise in the use of smartphones in government and private institutions due to business dependencies such as communication, virtual meetings, and access to global information. These smartphones are an attractive target for cybercriminals and are one of the leading causes of cyber espionage and sabotage. A large number of sophisticated malware attacks as well as advanced persistent threats (APTs) have been launched on smartphone users. These attacks are becoming significantly more complex, sophisticated, and persistent, and they remain undetected for extended periods. Traditionally, devices are targeted by exploiting a vulnerability in the operating system (OS) or device sensors. Nevertheless, there is a rise in APTs, side-channel attacks, sensor-based attacks, and attacks launched through the Google Play Store.
  • 816
  • 25 Jun 2023
Topic Review
Data Fusion in Agriculture
The term “data fusion” can be defined as “the process of combining data from multiple sources to produce more accurate, consistent, and concise information than that provided by any individual data source”. Stricter definitions exist to better fit narrower contexts. This type of approach has been applied to agricultural problems since the first half of the 1990s, and its use has been increasing ever since. Arguably, the main challenge in using data fusion techniques is finding the best approach to fully exploit the synergy and complementarity that potentially exist between different types of data and data sources.
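As a toy illustration of feature-level fusion, the sketch below aligns measurements from two hypothetical sources on a shared sample key and concatenates them into a single feature matrix; all names and values are invented for the example.

# Minimal sketch of feature-level data fusion: measurements from two
# hypothetical sources (e.g. a weather station and a soil sensor) are
# aligned on a shared key and concatenated into one feature matrix.
import numpy as np

def fuse_features(source_a, source_b):
    """Join two {sample_id: feature_vector} dicts on their common samples."""
    shared = sorted(source_a.keys() & source_b.keys())
    return shared, np.array([np.concatenate([source_a[k], source_b[k]])
                             for k in shared])

weather = {"field_1": np.array([21.5, 0.60]), "field_2": np.array([19.8, 0.72])}
soil    = {"field_1": np.array([6.8, 33.0]),  "field_2": np.array([7.1, 29.5])}
ids, X = fuse_features(weather, soil)
print(ids, X.shape)   # ['field_1', 'field_2'] (2, 4)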
  • 815
  • 07 Apr 2022
Topic Review
Dragonfly Algorithm and Its Hybrids: A Survey
Optimization algorithms are essential for numerous applications in which certain parameters are minimized or maximized with respect to an objective function. These algorithms include exact methods and heuristic algorithms such as swarm intelligence algorithms. Swarm intelligence is a discipline that makes use of a number of agents, forming a population in which individuals interact among themselves and with their environment to give rise to a global intelligent behavior. The Dragonfly Algorithm (DA) is a swarm intelligence algorithm, proposed in 2016, that is inspired by the behavior of dragonflies in nature. It has been found to outperform some of the most popular evolutionary algorithms, such as the genetic algorithm (GA), and swarm intelligence algorithms such as particle swarm optimization (PSO). Owing to its high effectiveness and efficiency, it has been utilized in multifarious applications, and attempts to further improve its performance have led to a number of proposed DA hybrids.
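A simplified sketch of the DA position update follows: each agent combines separation, alignment, cohesion, attraction to the best solution ("food"), and repulsion from the worst ("enemy"). It treats the whole swarm as each agent's neighbourhood and uses fixed behaviour weights, so it illustrates the structure of the algorithm rather than reproducing the published version exactly.

# Simplified Dragonfly Algorithm sketch; weights and the neighbour rule are
# deliberately reduced (all agents are treated as neighbours).
import numpy as np

def dragonfly(objective, dim=2, n=30, iters=200, lb=-5.0, ub=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n, dim))      # positions
    dX = np.zeros((n, dim))                # step vectors
    for t in range(iters):
        fit = np.apply_along_axis(objective, 1, X)
        food, enemy = X[fit.argmin()], X[fit.argmax()]
        w = 0.9 - t * (0.5 / iters)        # inertia decays over the run
        s, a, c, f, e = 0.1, 0.1, 0.7, 1.0, 1.0
        S = X.mean(axis=0) - X             # separation term (simplified aggregate)
        A = dX.mean(axis=0) - dX           # alignment toward the mean step
        C = X.mean(axis=0) - X             # cohesion toward the swarm centre
        F = food - X                       # attraction to the best solution
        E = X - enemy                      # repulsion away from the worst
        dX = w * dX + s * S + a * A + c * C + f * F + e * E
        X = np.clip(X + dX, lb, ub)
    fit = np.apply_along_axis(objective, 1, X)
    return X[fit.argmin()], fit.min()

best, val = dragonfly(lambda x: np.sum(x ** 2))   # sphere function test
print(best, val)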
  • 814
  • 25 Nov 2021
Topic Review
AI Revolution in Digital Finance in Saudi Arabia
In recent years, Artificial Intelligence (AI) has become widespread, driven by abundant daily data production and increased computing power. It finds applications across various sectors, including transportation, education, healthcare, banking, and finance. The financial industry, in particular, is rapidly adopting AI to achieve significant cost savings. AI has the potential to revolutionize financial services by offering tailored, faster, and more cost-effective solutions. Saudi Arabia is emerging as a growing market in this field, emphasizing technology-driven institutions. Despite gaining prominence and government support, AI has yet to play a crucial role in improving the efficiency of financial transactions.
  • 814
  • 27 Nov 2023
Topic Review
Predictive Maintenance Solutions for SMEs
Small- and medium-sized enterprises (SMEs) play an important role in the economy of societies. Environmental factors, such as COVID-19, as well as non-environmental factors, such as equipment failure, make these industries more vulnerable, but their impact can be minimized by better understanding the concerns and threats these industries face. Only a few SMEs have the capacity to implement the innovative manufacturing technologies of Industry 4.0.
  • 813
  • 29 Oct 2021
Topic Review
Digital Twins
Digital Twins, which are virtual representations of physical systems mirroring their behavior, enable real-time monitoring, analysis, and optimization. Understanding and identifying the temporal dependencies included in the multivariate time series data that characterize the behavior of the system are crucial for improving the effectiveness of Digital Twins. Long Short-Term Memory (LSTM) networks have been used to represent complex temporal dependencies and identify long-term links in the Industrial Internet of Things (IIoT).
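As a minimal sketch of the idea, the PyTorch model below maps a window of multivariate sensor readings to a prediction of the next reading; the shapes, hyperparameters, and synthetic data are illustrative assumptions, not a published Digital Twin design.

# Minimal LSTM forecaster for multivariate sensor streams of the kind a
# Digital Twin might mirror: given a window of past readings, predict the
# next reading.
import torch
import torch.nn as nn

class TwinForecaster(nn.Module):
    def __init__(self, n_sensors=4, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_sensors, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_sensors)

    def forward(self, x):                 # x: (batch, window, n_sensors)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # predict the next multivariate sample

model = TwinForecaster()
x = torch.randn(8, 50, 4)                # 8 windows of 50 steps, 4 sensors
y = torch.randn(8, 4)                    # next readings (dummy targets)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()                               # one illustrative training step
print(float(loss))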
  • 812
  • 03 Nov 2023
Topic Review
MoRAL-AI: AI-based Liver Transplantation Model
A novel deep learning-based model to predict hepatocellular carcinoma (HCC) recurrence after liver transplantation.
  • 808
  • 29 Jan 2021
Topic Review
Abstractive vs. Extractive Summarization
Due to the huge and continuously growing size of the textual corpora existing on the Internet, important information may go unnoticed or become lost. At the same time, the task of summarizing these resources by human experts is tedious and time-consuming. This necessitates the automation of the task. Natural language processing (NLP) is a multidisciplinary research field, merging aspects and approaches from computer science, artificial intelligence and linguistics; it deals with the development of processes that semantically and efficiently analyze vast amounts of textual data. Text summarization (TS) is a fundamental NLP subtask, which has been defined as the process of the automatic creation of a concise and fluent summary that captures the main ideas and topics of one or multiple documents.
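A minimal sketch of the extractive approach follows: sentences are scored by the frequency of their words and the top scorers are returned in document order. This is a deliberately simple baseline for illustration, not a specific method from the entry; abstractive methods, by contrast, generate new text rather than selecting existing sentences.

# Minimal extractive summarizer: score each sentence by the average
# frequency of its words, then keep the top-scoring sentences in order.
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    scores = [sum(freq[w] for w in re.findall(r"[a-z']+", s.lower()))
              / max(len(s.split()), 1) for s in sentences]
    top = sorted(range(len(sentences)), key=lambda i: -scores[i])[:n_sentences]
    return " ".join(sentences[i] for i in sorted(top))

doc = ("Text summarization condenses documents. Extractive methods select "
       "existing sentences. Abstractive methods paraphrase the document in "
       "new words. Selecting sentences is simpler than generating them.")
print(extractive_summary(doc))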
  • 806
  • 07 Jul 2023
Topic Review
Artificial Neural Networks and Energy Forecasting
Load prediction with higher accuracy and less computing power has become an important problem in the smart grid domain in general, and especially in demand-side management (DSM), as it can help minimize global warming and better integrate renewable energies. Artificial neural networks (ANNs) are the most widely used methods for forecasting electrical load, employed in this field for their numerous advantages. The complexity of the task is considerable because of the many factors involved, such as weather and holidays, which contribute both linear and non-linear relationships; this makes it a well-suited problem for ANNs and their capacity to model non-linear relationships.
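As a minimal illustration, the sketch below trains a small feed-forward network on synthetic load data built from hour-of-day, holiday, and temperature inputs; the cyclic hour encoding shows one common way to feed calendar features to an ANN. The data and network size are assumptions for the example.

# Minimal ANN load-forecasting sketch on synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
hours = rng.integers(0, 24, 500)
holiday = rng.integers(0, 2, 500)
temp = rng.normal(15, 8, 500)
# Encode the daily cycle with sin/cos so hour 23 sits next to hour 0.
X = np.column_stack([np.sin(2 * np.pi * hours / 24),
                     np.cos(2 * np.pi * hours / 24), holiday, temp])
load = (50 + 20 * np.sin(2 * np.pi * hours / 24) - 5 * holiday
        + 0.8 * temp + rng.normal(0, 2, 500))       # synthetic load signal

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
model.fit(X, load)
print(model.score(X, load))   # in-sample R^2 on the synthetic data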
  • 804
  • 21 Jun 2022
Topic Review
Developing IoT Artifacts in a MAS Platform
The Internet of Things (IoT) is a growing computational paradigm in which a massive number (perhaps billions) of everyday objects are endowed with interconnection capabilities, enabling them to communicate and cooperate with other (surrounding) devices, generally via the Internet, and forming a vast cyber-physical environment at the edge between the virtual and the real world. Since the emergence of the IoT, Multi-Agent Systems (MAS) technology has been successfully applied in this area, proving itself an appropriate paradigm for developing distributed, intelligent systems containing sets of IoT devices. However, this technology still lacks effective mechanisms to systematically integrate the enormous diversity of existing IoT devices.
  • 803
  • 15 Mar 2022
Topic Review Peer Reviewed
Tokenization in the Theory of Knowledge
Tokenization is a procedure for recovering the elements of interest in a sequence of data. This term is commonly used to describe an initial step in the processing of programming languages, and also for the preparation of input data in the case of artificial neural networks; however, it is a generalizable concept that applies to reducing a complex form to its basic elements, whether in the context of computer science or in natural processes. In this entry, the general concept of a token and its attributes are defined, along with its role in different contexts, such as deep learning methods. Included here are suggestions for further theoretical and empirical analysis of tokenization, particularly regarding its use in deep learning, as it is a rate-limiting step and a possible bottleneck when the results do not meet expectations.
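As a concrete illustration of the general concept, the sketch below implements a simple regex tokenizer and maps tokens to integer ids, the form in which deep learning models typically consume them; the token pattern and vocabulary scheme are illustrative choices, not a canonical design.

# Minimal tokenization sketch: split raw text into word, number, and symbol
# tokens, then map each token to an integer id.
import re

TOKEN_RE = re.compile(r"\d+|\w+|[^\w\s]")   # numbers, words, single symbols

def tokenize(text):
    return TOKEN_RE.findall(text.lower())

def build_vocab(tokens):
    vocab = {"<unk>": 0}                     # reserve an id for unseen tokens
    for t in tokens:
        vocab.setdefault(t, len(vocab))
    return vocab

tokens = tokenize("Tokenization reduces a complex form to its basic elements.")
vocab = build_vocab(tokens)
ids = [vocab.get(t, 0) for t in tokens]
print(tokens)
print(ids)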
  • 802
  • 11 Apr 2023
Topic Review
Speaker Recognition Systems
With the growing prevalence and influence of speaker recognition technology, its security has drawn broad attention. Although speaker recognition systems (SRSs) have reached high recognition accuracy, their security remains a major concern, since a minor perturbation of the audio input may result in reduced recognition accuracy.
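The fragility described here can be sketched with a fast-gradient-sign-style perturbation; the toy one-layer "recognizer" below is a stand-in for a real SRS, and the example only illustrates the mechanism, not an actual attack result.

# FGSM-style sketch: nudge a waveform in the gradient direction that
# increases the recognizer's loss. The one-layer model is a toy stand-in.
import torch
import torch.nn as nn

recognizer = nn.Sequential(nn.Linear(16000, 10))   # toy 10-speaker classifier
wave = torch.randn(1, 16000, requires_grad=True)   # one second at 16 kHz
label = torch.tensor([3])                          # true speaker id

loss = nn.functional.cross_entropy(recognizer(wave), label)
loss.backward()
epsilon = 0.001                                    # small perturbation magnitude
adversarial = wave + epsilon * wave.grad.sign()    # FGSM step
print(recognizer(adversarial).argmax().item(), "vs true", label.item())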
  • 800
  • 28 Jul 2022
Topic Review
A Patch-Based CNN Built on the VGG-16 Architecture
Facial recognition is a prevalent method of biometric authentication used in a variety of software applications. The technique is susceptible to spoofing attacks, in which an imposter gains access to a system by presenting an image of a legitimate user to the sensor, thereby increasing security risks. Consequently, facial liveness detection has become an essential step in the authentication process before granting access to users. A patch-based convolutional neural network (CNN) with a deep component, built on the VGG-16 architecture, was developed for facial liveness detection to enhance security.
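A minimal sketch of the patch-based idea follows: the face image is cut into fixed-size patches, each patch is scored live/spoof by a VGG-16-based classifier, and the scores are averaged. The patch size, two-class head, and input resizing are assumptions for illustration, not the entry's exact design.

# Patch-based liveness scoring sketch on a VGG-16 backbone (PyTorch).
import torch
import torch.nn as nn
from torchvision import models

backbone = models.vgg16(weights=None)              # load pretrained weights as needed
backbone.classifier[6] = nn.Linear(4096, 2)        # 2 classes: live vs. spoof
backbone.eval()

def patch_scores(image, patch=96):                 # image: (3, H, W)
    patches = image.unfold(1, patch, patch).unfold(2, patch, patch)
    patches = patches.reshape(3, -1, patch, patch).permute(1, 0, 2, 3)
    patches = nn.functional.interpolate(patches, size=224)   # VGG input size
    with torch.no_grad():
        logits = backbone(patches)                 # one live/spoof score per patch
    return logits.softmax(dim=1)[:, 0].mean()      # mean "live" probability

face = torch.rand(3, 288, 288)                     # dummy face crop
print(float(patch_scores(face)))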
  • 800
  • 13 Sep 2022
Topic Review
Computer-Aided Breast Cancer Diagnosis
A computer-aided diagnosis (CAD) expert system is a powerful tool for efficiently assisting a pathologist in reaching an early diagnosis of breast cancer. The process identifies the presence of cancer in breast tissue samples and the distinct type and stage of the cancer. In a standard CAD system, the main pipeline involves image pre-processing, segmentation, feature extraction, feature selection, classification, and performance evaluation. Breast cancer tumours can be distinguished as benign (non-cancerous) or malignant (cancerous/metastatic). Benign tissue refers to changes in the normal tissue of the breast parenchyma that are unrelated to the development of malignancy. In contrast, malignant tissue can be categorised into two types: in-situ carcinoma and invasive carcinoma.
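As a minimal sketch of the classification end of such a pipeline, the code below chains feature scaling, feature selection, classification, and cross-validated performance evaluation on synthetic feature vectors; in a full system the inputs would come from the image pre-processing, segmentation, and feature-extraction stages.

# Feature selection + classification + evaluation stages of a CAD pipeline,
# sketched with scikit-learn on synthetic data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 30))          # e.g. texture/shape features per sample
y = rng.integers(0, 2, size=150)        # 0 = benign, 1 = malignant (synthetic)

cad = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=10),   # feature selection step
                    SVC(kernel="rbf"))              # classification step
print(cross_val_score(cad, X, y, cv=5).mean())      # performance evaluation step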
  • 796
  • 15 Jun 2021