Topic Review
Machine Learning in Cereal Crops Disease Detection
Cereals are a major source of the human diet. They constitute more than two-thirds of the world’s food supply and cover more than 56% of the world’s cultivable land. These important food sources are affected by a variety of damaging diseases, causing significant losses in annual production. In this regard, detecting diseases at an early stage and quantifying their severity have attracted urgent attention from researchers worldwide. One emerging and popular approach to this task is the use of machine learning techniques.
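As a minimal illustrative sketch of such a classical machine learning pipeline (assuming scikit-learn; the color-histogram features, random placeholder images, and labels below are hypothetical and not the methods surveyed in the entry):

```python
# Hypothetical sketch: classify cereal leaf images as healthy/diseased
# from simple color-histogram features (illustrative, not the entry's method).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def color_histogram(image_rgb, bins=8):
    """Concatenate per-channel histograms as a simple feature vector."""
    feats = [np.histogram(image_rgb[..., c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    return np.concatenate(feats).astype(float)

# Placeholder data: random "images" stand in for a real labeled dataset.
rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(200, 64, 64, 3))
labels = rng.integers(0, 2, size=200)          # 0 = healthy, 1 = diseased

X = np.stack([color_histogram(img) for img in images])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```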
  • 1.9K
  • 03 Mar 2022
Topic Review
Evolution of Intelligent Vehicle Technology
The evolution of intelligent vehicle technology over time is explained, highlighting the development of intelligent vehicles and their safety applications, with a focus on the various uses of perception sensors in production.
  • 1.9K
  • 25 Nov 2020
Topic Review
Bayesian Nonlinear Mixed Effects Models
Nonlinear mixed effects models have become a standard platform for analysis when data take the form of continuous, repeated measurements of subjects from a population of interest and the temporal profiles of subjects follow a nonlinear tendency. While frequentist analysis of nonlinear mixed effects models has a long history, Bayesian analysis of these models received comparatively little attention until the late 1980s, primarily because of the time-consuming nature of Bayesian computation. Since the early 1990s, Bayesian approaches for these models have emerged to leverage rapid developments in computing power, and they have recently received significant attention due to (1) their superiority in quantifying the uncertainty of parameter estimation; (2) their utility in incorporating prior knowledge into the models; and (3) their flexibility in matching the increasing complexity of scientific research arising from diverse industrial and academic fields.
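A minimal sketch of such a model, assuming PyMC is available and using a hypothetical exponential-decay response with subject-level random effects (the data are simulated; none of this is tied to a particular study):

```python
# Sketch: Bayesian nonlinear mixed effects model y = a_i * exp(-b_i * t) per subject i.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_subjects, n_times = 8, 10
t = np.tile(np.linspace(0, 5, n_times), n_subjects)
subj = np.repeat(np.arange(n_subjects), n_times)
# Simulated repeated measurements with between-subject variability.
a_true = rng.normal(4.0, 0.5, n_subjects)
b_true = rng.normal(0.8, 0.1, n_subjects)
y = a_true[subj] * np.exp(-b_true[subj] * t) + rng.normal(0, 0.1, t.size)

with pm.Model():
    # Population-level (fixed) effects and between-subject variability.
    mu_a = pm.Normal("mu_a", 0.0, 5.0)
    mu_b = pm.Normal("mu_b", 1.0, 1.0)
    sigma_a = pm.HalfNormal("sigma_a", 1.0)
    sigma_b = pm.HalfNormal("sigma_b", 1.0)
    # Subject-level (random) effects.
    a = pm.Normal("a", mu_a, sigma_a, shape=n_subjects)
    b = pm.Normal("b", mu_b, sigma_b, shape=n_subjects)
    sigma = pm.HalfNormal("sigma", 1.0)
    # Nonlinear mean function and likelihood.
    mu = a[subj] * pm.math.exp(-b[subj] * t)
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```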
  • 1.9K
  • 23 Mar 2022
Topic Review
Interpretable Machine Learning in Healthcare
Recently, machine learning (ML) has been widely used in many areas, such as speech recognition and image processing. The revolution in industrial technology driven by ML demonstrates its great success in analyzing the complex patterns present in a variety of applications across a wide range of sectors, including healthcare.
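As a small illustration of one common model-agnostic interpretability technique (permutation feature importance with scikit-learn; the tabular data here are synthetic placeholders, not clinical data):

```python
# Sketch: permutation importance as an interpretability tool for a fitted model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic stand-in for tabular healthcare data (features are placeholders).
X, y = make_classification(n_samples=500, n_features=6, n_informative=3,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
result = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)

# Larger score drop when a feature is shuffled = the model relies on it more.
for i in np.argsort(result.importances_mean)[::-1]:
    print(f"feature_{i}: importance drop = {result.importances_mean[i]:.3f}")
```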
  • 1.9K
  • 29 Dec 2021
Topic Review
Virtual Synchrony
Virtual synchrony is an interprocess message-passing (sometimes called ordered, reliable multicast) technology. Virtual synchrony systems allow programs running in a network to organize themselves into process groups and to send messages to groups (as opposed to sending them to specific processes). Each message is delivered to all group members in the identical order, even when two messages are transmitted simultaneously by different senders. Application design and implementation are greatly simplified by this property: every group member sees the same events (group membership changes and incoming messages) in the same order.

A virtually synchronous service is typically implemented using a style of programming called state machine replication, in which a service is first implemented as a single program that receives inputs from clients through some form of remote message-passing infrastructure, then enters a new state and responds in a deterministic manner. The initial implementation is then transformed so that multiple instances of the program can be launched on different machines, using a virtually synchronous message-passing system to replicate the incoming messages over the members. Because the replicas see the same events in the same order and are in the same states, they make the same state transitions and remain consistent. The replication of the service provides a form of fault tolerance: if a replica fails (by crashing), the others remain and can continue to provide responses. Different members of the replica group can also be programmed to subdivide the workload, typically by using the group membership to determine their respective roles. This permits a group of N members to run as much as N times faster than a single member, or to handle N times as many requests, while continuing to offer fault tolerance in the event of a crash.

Virtual synchrony is distinguished from classical state machine replication because the model includes features whereby a programmer can request early (optimistic) delivery of messages or relaxed forms of ordering. When used appropriately, these features can enable substantial speedups. However, the programmer needs to be sure that the relaxation of guarantees will not compromise correctness. For example, in a service that uses locking to protect concurrently updated data, the messaging system can be instructed to use an inexpensive form of message ordering in which it respects the order in which individual senders send messages (the FIFO guarantee) but does not attempt to impose an agreed order on messages sent concurrently by different senders. Provided that the sender indeed held locks on the data, it can be shown that FIFO ordering suffices for correctness. The benefit is that FIFO ordering is much less costly to implement than total ordering of concurrent messages.

To give another example, by delivering messages optimistically, virtual synchrony systems can outperform Paxos, which is normally required for implementations of state machine replication: Paxos normally requires a two-phase protocol, whereas optimistic virtual synchrony protocols can deliver messages immediately upon their arrival. However, this could result in a violation of the safety property of the state machine replication model.
To prevent such problems, the programmer who uses this feature is required to invoke a primitive called flush, which delays the caller until any optimistically delivered messages have reached all of the group members. Provided that the programmer understands this behavior and is careful to call flush before interacting with external clients or persistent storage, higher performance can be achieved without loss of safety. The flexibility associated with these limited forms of event reordering and optimistic early delivery permit virtual synchrony platforms to achieve extremely high data rates while still preserving very strong fault-tolerance and consistency guarantees.
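The send/flush pattern can be sketched as a toy, single-process simulation (a deliberately simplified assumption; real virtual synchrony platforms implement this across a network, track membership views, and gather real acknowledgements, none of which this sketch attempts):

```python
# Toy sketch of optimistic FIFO delivery plus a flush barrier (no networking).
# Assumption: "stable" simply means every member has acknowledged the message.

class Group:
    def __init__(self, members):
        self.members = list(members)          # member names
        self.unstable = []                    # optimistically delivered messages
        self.log = {m: [] for m in members}   # per-member delivery log

    def send(self, sender, msg):
        """Optimistic delivery: hand the message to every member immediately,
        preserving each sender's FIFO order (the order of send() calls)."""
        for m in self.members:
            self.log[m].append((sender, msg))
        self.unstable.append((sender, msg))

    def flush(self):
        """Conceptually block until optimistic messages are stable (acknowledged
        by all members); only then interact with clients or persistent storage."""
        acked = {msg: set(self.members) for msg in self.unstable}   # simulated acks
        self.unstable = [m for m in self.unstable
                         if len(acked[m]) < len(self.members)]
        assert not self.unstable, "flush() must not return with unstable messages"

g = Group(["A", "B", "C"])
g.send("A", "update x=1")
g.send("B", "update y=2")
g.flush()            # safe to reply to external clients / write to disk now
print(g.log["C"])    # every member saw the same messages in the same order
```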
  • 1.9K
  • 01 Dec 2022
Topic Review
Rain Fade Models
Developing a rain fade model involves the mathematical analysis of rain attenuation phenomena through reasoning about cause-based interactions.
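A common starting point is the power-law form for specific attenuation, gamma = k * R^alpha (dB/km), where R is the rain rate and k, alpha depend on frequency and polarization. The sketch below uses placeholder coefficient values for illustration only, not recommended values:

```python
# Sketch of the power-law rain attenuation model: gamma = k * R**alpha (dB/km).
# k and alpha depend on frequency and polarization; values here are placeholders.

def specific_attenuation(rain_rate_mm_per_h, k=0.02, alpha=1.2):
    """Specific attenuation in dB/km for a given rain rate R (mm/h)."""
    return k * rain_rate_mm_per_h ** alpha

def path_attenuation(rain_rate_mm_per_h, path_km, k=0.02, alpha=1.2):
    """Total attenuation over an (effective) path length, in dB."""
    return specific_attenuation(rain_rate_mm_per_h, k, alpha) * path_km

print(path_attenuation(rain_rate_mm_per_h=50, path_km=5))  # ~11 dB with these placeholders
```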
  • 1.9K
  • 27 May 2021
Topic Review
Artificial Intelligence in Education
The application of Artificial Intelligence (AI) in education has been the subject of academic research. The field examines learning wherever it occurs, in traditional classrooms or in workplaces, in order to support formal education and lifelong learning. It combines interdisciplinary AI and the learning sciences (education, psychology, neuroscience, linguistics, sociology, and anthropology) to facilitate the development of effective adaptive learning environments and other flexible, inclusive tools. Nowadays, there are several new challenges in the field of education technology in the era of smartphones, tablets, cloud computing, Big Data, etc. Current research questions focus on concepts such as ICT-enabled personalized learning, mobile learning, educational games, collaborative learning on social media, MOOCs, augmented reality applications in education, and so on. To meet these new challenges in education, several fields of research using AI have therefore emerged over time to improve teaching and learning with digital technologies.
  • 1.9K
  • 03 Mar 2022
Topic Review
Objective Diagnosis for Histopathological Images
Histopathology refers to the examination by a pathologist of biopsy samples. Histopathology images are captured by a microscope to locate, examine, and classify many diseases, such as different cancer types. They provide a detailed view of different types of diseases and their tissue status. These images are an essential resource with which to define biological compositions or analyze cell and tissue structures, and this imaging modality is very important for diagnostic applications. The analysis of histopathology images is a prolific and relevant research area supporting disease diagnosis. In this entry, the challenges of histopathology image analysis are evaluated, and an extensive review of conventional and deep learning techniques that have been applied in histological image analyses is presented. The entry also summarizes many current datasets and highlights important challenges and constraints of recent deep learning techniques, alongside possible future research avenues. Despite the progress made in this research area so far, it remains a significant area of open research because of the variety of imaging techniques and disease-specific characteristics.
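As a minimal sketch of the deep learning side (assuming PyTorch; the patch size, class count, and architecture are arbitrary illustrations, not any specific model from the literature):

```python
# Sketch: a tiny CNN for classifying histopathology image patches (PyTorch).
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes),
        )

    def forward(self, x):            # x: (batch, 3, H, W) RGB patches
        return self.head(self.features(x))

model = PatchClassifier()
dummy = torch.randn(4, 3, 96, 96)    # four placeholder 96x96 patches
print(model(dummy).shape)            # -> torch.Size([4, 2]) class logits
```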
  • 1.9K
  • 29 Jan 2021
Topic Review
Bio-Inspired Optimization Algorithms
The application of artificial intelligence in everyday life is becoming all-pervasive and unavoidable. Within that vast field, a special place belongs to biomimetic/bio-inspired algorithms for multiparameter optimization, which find their use in a large number of areas. Novel methods and advances are being published at an accelerated pace. 
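Particle swarm optimization, one classic bio-inspired method, can be sketched in a few lines of NumPy (minimizing a simple sphere function; the parameter values are common textbook choices, not recommendations for any particular problem):

```python
# Sketch: particle swarm optimization (PSO) minimizing f(x) = sum(x**2).
import numpy as np

def pso(f, dim=5, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))       # particle positions
    v = np.zeros_like(x)                             # particle velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Inertia + attraction to personal best + attraction to global best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, f(gbest)

best_x, best_val = pso(lambda x: float(np.sum(x ** 2)))
print(best_val)   # should be close to 0
```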
  • 1.9K
  • 24 Jul 2023
Topic Review
AI and Self-Learning: Opportunities and Challenges
The integration of artificial intelligence (AI) and self-learning algorithms has revolutionized the field of machine learning. This research explores the opportunities, challenges, and risks associated with AI and self-learning, with a particular focus on the impact of different types of AI on self-learning. The research examines how AI algorithms are designed to learn from data and improve their performance over time with little or no human intervention. While AI and self-learning present significant opportunities for automation, efficiency, and innovation, they also pose challenges such as data privacy, security, and ethical concerns. The research provides several success stories of AI and self-learning in various industries and applications. Furthermore, the research outlines future directions for the development and implementation of AI and self-learning algorithms and provides recommendations for all involved parties.
  • 1.9K
  • 22 May 2023
Topic Review
Wireless Sensor Networks based IoT
WSN-based IoT (WSN-IoT) design problems include network coverage and connectivity issues, energy consumption, bandwidth requirements, network lifetime maximization, communication protocols, and state-of-the-art infrastructure. In this paper, the authors propose machine learning methods as an optimization tool for regular WSN-IoT nodes deployed in smart city applications.
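One simple illustration of machine learning as a deployment aid (hypothetical here, not the authors' method) is clustering node positions with k-means so that each cluster's most central node can act as a cluster head or aggregation point:

```python
# Hypothetical sketch: k-means over sensor node positions to pick cluster heads.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
nodes = rng.uniform(0, 100, size=(120, 2))   # (x, y) positions in a 100 m x 100 m field

k = 6                                         # desired number of clusters (assumption)
km = KMeans(n_clusters=k, n_init=10, random_state=1).fit(nodes)

# For each cluster, the node nearest the centroid can serve as cluster head.
for c in range(k):
    members = np.where(km.labels_ == c)[0]
    d = np.linalg.norm(nodes[members] - km.cluster_centers_[c], axis=1)
    print(f"cluster {c}: head = node {members[np.argmin(d)]}, size = {len(members)}")
```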
  • 1.8K
  • 26 Jul 2021
Topic Review
Artificial Intelligence in Agriculture: Benefits, Challenges, and Trends
The world’s population has reached 8 billion and is projected to reach 9.7 billion by 2050, increasing the demand for food production. Artificial intelligence (AI) technologies that optimize resources and increase productivity are vital in an environment marked by supply chain tensions and increasingly frequent weather events.
  • 1.8K
  • 05 Jul 2023
Topic Review
AI-Based Wormhole Attack Detection Techniques
The popularity of wireless sensor networks for establishing different communication systems is increasing daily. A wireless network consists of sensors prone to various security threats, and these sensor nodes make the network vulnerable to denial-of-service attacks. One of these is the wormhole attack, which uses a low-latency link between two malicious sensor nodes to affect the routing paths of the entire network. This attack is severe because it is resistant to many cryptographic schemes and hard to observe within the network.
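One widely cited countermeasure idea, geographical packet leashes, can be sketched as a simple plausibility check: if two nodes that claim to be one-hop neighbors are farther apart than the radio range allows, the link is suspicious. The radio range, coordinates, and link list below are illustrative assumptions:

```python
# Sketch of a geographical-leash style check for suspicious (wormhole) links.
# Assumption: nodes know their positions (e.g., via GPS) and the nominal radio range.
import math

RADIO_RANGE_M = 50.0          # illustrative maximum one-hop transmission range

positions = {                 # hypothetical node coordinates in meters
    "n1": (0.0, 0.0),
    "n2": (30.0, 10.0),
    "n3": (400.0, 250.0),     # physically far away
}
claimed_links = [("n1", "n2"), ("n1", "n3")]   # links reported by the routing layer

def distance(a, b):
    (x1, y1), (x2, y2) = positions[a], positions[b]
    return math.hypot(x2 - x1, y2 - y1)

for a, b in claimed_links:
    d = distance(a, b)
    if d > RADIO_RANGE_M:
        print(f"suspicious link {a}-{b}: {d:.0f} m exceeds radio range (possible wormhole)")
    else:
        print(f"link {a}-{b} looks plausible ({d:.0f} m)")
```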
  • 1.8K
  • 12 Aug 2022
Topic Review Peer Reviewed
Machine Learning in Healthcare Communication
Machine learning (ML) is the study of computer algorithms that improve automatically through experience. ML is a subset of artificial intelligence (AI) that develops computer systems able to perform tasks generally requiring human intelligence. Healthcare communication is important for tactfully translating and disseminating information to support and educate patients and the public, and ML has proven applicable in healthcare through its capacity for complex dialogue management and conversational flexibility. In this topical review, we highlight how the application of ML/AI in healthcare communication can benefit humans, including chatbots for COVID-19 health education, cancer therapy, and medical imaging.
  • 1.8K
  • 13 Apr 2022
Topic Review
Overview of Deep Learning-Based Visual Multi-Object Tracking
Multi-object tracking is a challenging task in computer vision that is essential for understanding the autonomous driving environment. Owing to the excellent performance of deep learning in visual object tracking, many state-of-the-art multi-object tracking algorithms have been developed.
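A core building block of many such trackers is associating new detections with existing tracks by IoU overlap via the Hungarian algorithm, sketched below (assuming SciPy; boxes are in (x1, y1, x2, y2) format, and the box values and the 0.3 IoU gate are made-up examples):

```python
# Sketch: IoU-based detection-to-track association (a SORT-style step).
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

tracks = np.array([[10, 10, 50, 80], [200, 40, 260, 120]], float)   # existing tracks
dets   = np.array([[12, 12, 52, 83], [198, 38, 258, 118]], float)   # new detections

cost = np.array([[1.0 - iou(t, d) for d in dets] for t in tracks])
row, col = linear_sum_assignment(cost)               # minimize 1 - IoU
for t, d in zip(row, col):
    if cost[t, d] < 0.7:                             # keep matches with IoU > 0.3
        print(f"track {t} matched to detection {d} (IoU = {1 - cost[t, d]:.2f})")
```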
  • 1.8K
  • 22 Nov 2022
Topic Review
Secure Bluetooth Communication in Smart Healthcare Systems
Smart health presents an ever-expanding attack surface due to the continuous adoption of a broad variety of Internet of Medical Things (IoMT) devices and applications. IoMT is a common approach to smart city solutions that deliver long-term benefits to critical infrastructures, such as smart healthcare. Many of the IoMT devices in smart cities use Bluetooth technology for short-range communication due to its flexibility and low resource consumption. As smart healthcare applications rely on distributed control optimization, artificial intelligence (AI) and deep learning (DL) offer effective approaches to mitigate cyber-attacks.
  • 1.8K
  • 21 Nov 2022
Topic Review
Models for Evaluation Intrusion Detection Systems in IoT
With the Internet of Things (IoT) used for various applications, such as home and wearable devices, network applications, and even self-driving vehicles, detecting abnormal traffic is one of the problematic areas for researchers seeking to protect network infrastructure from adversarial activities. Several network systems suffer from drawbacks that allow intruders to use malicious traffic to obtain unauthorized access. Attacks such as Distributed Denial of Service (DDoS) attacks, Denial of Service (DoS) attacks, and service scans demand a unique automatic system capable of identifying traffic abnormalities at the earliest stage to avoid system damage. Numerous automatic approaches can detect abnormal traffic. However, accuracy is not the only issue with current Intrusion Detection Systems (IDS); their efficiency, flexibility, and scalability also need to be enhanced to detect attack traffic from various IoT networks.
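As a minimal sketch of automatic anomaly detection on traffic features (the flow features and data below are synthetic placeholders, and Isolation Forest is only one of many possible detectors, not the models evaluated in the entry):

```python
# Sketch: unsupervised anomaly detection on flow features with Isolation Forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic "flows": columns could stand for packet count, bytes, duration.
normal  = rng.normal(loc=[100, 5_000, 2.0], scale=[10, 500, 0.3], size=(500, 3))
attacks = rng.normal(loc=[900, 80_000, 0.1], scale=[50, 5_000, 0.05], size=(15, 3))
X = np.vstack([normal, attacks])

detector = IsolationForest(contamination=0.03, random_state=0).fit(X)
flags = detector.predict(X)                 # -1 = anomaly, +1 = normal
print("flagged flows:", int(np.sum(flags == -1)))
```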
  • 1.8K
  • 27 May 2022
Topic Review
Deep Learning for Accurate Real-Time Weed Detection
This article discusses the possibility of accurately detecting the position of weeds in real time under real conditions. It presents detailed recommendations for dealing with scene density and considers ways to increase accuracy and FPS.
  • 1.8K
  • 09 Jan 2022
Topic Review
Deep Learning and Lung Disease
Recent developments in deep learning support the identification and classification of lung diseases in medical images.
  • 1.8K
  • 26 Jan 2021
Topic Review
Data Locality in High Performance Computing
Big data has revolutionized science and technology, leading to the transformation of our societies. High-performance computing (HPC) provides the necessary computational power for big data analysis using artificial intelligence methods. Data locality is a broad term that encapsulates different aspects, including bringing computations to data, minimizing data movement by efficiently exploiting cache hierarchies, reducing intra- and inter-node communications, locality-aware process and thread mapping, and in situ and in-transit data analysis.
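A tiny illustration of one locality aspect, contiguous versus strided memory access over the same array (a NumPy sketch; absolute timings depend on the machine and the array size is an arbitrary choice):

```python
# Sketch: the cost of strided vs. contiguous memory access (C-order NumPy array).
import timeit
import numpy as np

a = np.random.rand(4000, 4000)       # rows are contiguous in memory (C order)

row_time = timeit.timeit(lambda: a[123, :].copy(), number=2000)  # contiguous read
col_time = timeit.timeit(lambda: a[:, 123].copy(), number=2000)  # one element per cache line

print(f"copy one row   : {row_time:.3f} s")
print(f"copy one column: {col_time:.3f} s  (typically slower: poorer spatial locality)")
```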
  • 1.8K
  • 17 Jan 2023