Topic Review
Artificial Intelligence in Edge-Based IoT Applications
Given its advantages in low latency, fast response, context-aware services, mobility, and privacy preservation, edge computing has emerged as the key support for intelligent applications and 5G/6G Internet of Things (IoT) networks. This technology extends the cloud by providing intermediate services at the edge of the network, improving the quality of service for latency-sensitive applications. Many AI-based solutions employing machine learning, deep learning, and swarm intelligence have exhibited high potential for intelligent cognitive sensing, intelligent network management, big data analytics, and security enhancement in edge-based smart applications. 
  • 1.4K
  • 15 Feb 2023
Topic Review
Multimedia Steganalysis
Steganography techniques aim to hide the existence of secret messages in an innocent-looking medium, where the medium before and after embedding looks symmetric. Steganalysis techniques aim to breach steganography techniques and detect the presence of invisible messages. 
  • 1.4K
  • 08 Feb 2022
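To make the steganography/steganalysis duel above concrete, the sketch below embeds message bits into the least-significant bits (LSBs) of an image and then applies a simplified chi-square attack. The synthetic cover image and all parameters are illustrative assumptions, not taken from the entry: full random LSB embedding equalises the counts of each pair of values (2k, 2k+1), which drives the statistic down and betrays the hidden payload.

```python
import numpy as np

def embed_lsb(cover, bits):
    """Replace the least-significant bits of the cover image with message bits."""
    stego = cover.copy().ravel()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits
    return stego.reshape(cover.shape)

def chi_square_statistic(image):
    """Chi-square statistic over the pairs of values (2k, 2k+1).

    Full random LSB embedding equalises the two counts in every pair,
    driving the statistic towards zero -- the telltale sign the attack
    looks for in a stego image."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    stat = 0.0
    for k in range(128):
        expected = (hist[2 * k] + hist[2 * k + 1]) / 2.0
        if expected > 0:
            stat += (hist[2 * k] - expected) ** 2 / expected
    return stat

rng = np.random.default_rng(0)
# Synthetic 8-bit "cover" whose even/odd value counts are deliberately unbalanced.
cover = (rng.integers(0, 128, size=(64, 64)) * 2).astype(np.uint8)
cover[rng.random(cover.shape) < 0.2] |= 1        # a minority of odd values
bits = rng.integers(0, 2, size=cover.size).astype(np.uint8)
stego = embed_lsb(cover, bits)
```

On a cover with unbalanced pair counts, the statistic drops sharply after embedding, which is what the detector exploits.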
Topic Review
Efficient Real-Time Decision Making in IoT
Efficient real-time decision making in the IoT (Internet of Things) is about using fresh sensor data that represent the current real-world status, so that decisions are based on the present state of the environment rather than on stale readings.
  • 1.4K
  • 09 Feb 2022
Topic Review
AI and Self-Learning: Opportunities and Challenges
The integration of artificial intelligence (AI) and self-learning algorithms has revolutionized the field of machine learning. This research explores the opportunities, challenges, and risks associated with AI and self-learning, with a particular focus on the impact of different types of AI on self-learning. The research examines how AI algorithms are designed to learn from data and improve their performance over time with little or no human intervention. While AI and self-learning present significant opportunities for automation, efficiency, and innovation, they also pose challenges such as data privacy, security, and ethical concerns. The research provides several success stories of AI and self-learning in various industries and applications. Furthermore, the research outlines future directions for the development and implementation of AI and self-learning algorithms and provides recommendations for all involved parties.
  • 1.4K
  • 22 May 2023
Topic Review
Autonomous Vehicle
An Autonomous Vehicle (AV), also known as a driverless car or self-driving vehicle, is a car, bus, truck, or any other vehicle that is able to drive from point A to point B and perform all necessary driving operations and functions without any human intervention. An Autonomous Vehicle is normally equipped with different types of sensors to perceive the surrounding environment, including normal vision cameras, infrared cameras, RADAR, LiDAR, and ultrasonic sensors.  An autonomous vehicle should be able to detect and recognise all types of road users, including surrounding vehicles, pedestrians, cyclists, traffic signs, and road markings, and to segment free spaces, intersections, buildings, and trees in order to perform the driving task safely.  Currently, no realistic prediction expects fully autonomous vehicles to appear earlier than 2030. 
  • 1.3K
  • 17 Feb 2021
Topic Review
Food-Waste-Reduction Based on IoT and Big Data
IoT technology, through ICT infrastructure and smart devices, gathers huge amounts of data in real time, commonly known as big data. The big data generated by IoT devices are stored in a big data storage system and used for analysis. The importance of Food Wastage Reduction (FWR) is related to the loss of natural resources along the supply chain, including expenditures related to the use of land, water supply, and energy consumption. The application of IoT to FWR systems is also examined: some works use RFID sensors as a key tool to monitor each individual's food waste in accordance with the proposed model, while others describe the application of IoT-based technologies to agricultural supply chain management in developing countries.
  • 1.3K
  • 08 Dec 2023
Topic Review
ANN in Intelligent Attendance System
Determining the rate of student attendance is an important task in assessing course completion. Despite advances in technology, the systems currently used in many academic institutions to detect student absences remain inadequate. Furthermore, a crucial shortcoming of such attendance systems is that they do not take a student's background into account when deciding whether the student may continue in a course. This work proposes an intelligent approach for calculating student attendance based on Grade Point Average (GPA) and activities; the approach uses an Artificial Neural Network (ANN) to calculate the attendance rating accurately, meaning that the system provides a new rating for each student based on their background. The aim of this research is to develop an attendance system that motivates students either to attend class or to achieve high grades. The resulting approach helps the instructor allow students who have more activities but more absences to continue in a course, while students with low activity must maintain high attendance. Such a system is more efficient for monitoring students, as it allows activity to compensate for absence.
  • 1.3K
  • 28 Oct 2020
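As a rough sketch of how an ANN could map a student's background to an attendance rating, the snippet below runs a one-hidden-layer network over a (GPA, activity, absence) profile. The input features, network size, and the random weights are all illustrative assumptions; in the proposed system the weights would be learned from historical student records.

```python
import numpy as np

def attendance_rating(gpa, activity, absence, W1, b1, W2, b2):
    """One-hidden-layer network mapping a student profile to a rating in (0, 1)."""
    x = np.array([gpa / 4.0, activity, absence])         # roughly normalised inputs
    h = np.tanh(W1 @ x + b1)                             # hidden layer
    return float(1.0 / (1.0 + np.exp(-(W2 @ h + b2))))   # sigmoid output

# Random weights stand in for parameters learned from historical records.
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=4), 0.0
rating = attendance_rating(3.5, 0.8, 0.1, W1, b1, W2, b2)
```

The sigmoid output keeps the rating in (0, 1), so it can be compared directly against an attendance threshold per student.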
Topic Review
Artificial Intelligence Techniques in Surveillance Video Anomaly Detection
The Surveillance Video Anomaly Detection (SVAD) system is a sophisticated technology designed to detect unusual or suspicious behavior in video surveillance footage without human intervention. The system operates by analyzing the video frames and identifying deviations from normal patterns of movement or activity. This is achieved through advanced algorithms and machine learning techniques that can detect and analyze the position of pixels in the video frame at the time of an event.
  • 1.3K
  • 10 May 2023
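A minimal, non-learning illustration of "deviations from normal patterns of movement": score each frame by its motion energy (mean absolute difference from the previous frame) and flag frames that deviate strongly from the clip's statistics. Real SVAD systems replace this hand-crafted statistic with learned models; the threshold rule and data here are assumptions for illustration only.

```python
import numpy as np

def anomaly_scores(frames, k=3.0):
    """Score frames by motion energy and flag deviations from the norm.

    Motion energy = mean absolute difference between consecutive frames.
    A frame is flagged when its energy exceeds mean + k * std over the clip."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0)).mean(axis=(1, 2))
    mu, sigma = diffs.mean(), diffs.std() + 1e-9
    return diffs, diffs > mu + k * sigma

# 21 near-static frames; a bright object suddenly appears in the last one.
frames = np.zeros((21, 8, 8))
frames[-1] = 200.0
diffs, flags = anomaly_scores(frames)
```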
Topic Review
Machine Learning in Gastroenterology/Endoscopy
Over time, machine learning (ML), a component of artificial intelligence (AI), has been implemented in a variety of medical specialties, such as radiology, pathology, gastroenterology, neurology, obstetrics and gynecology, ophthalmology, and orthopedics, with the goal of improving the quality of healthcare and medical diagnosis. In clinical gastroenterology practice, due to technological developments, estimates show that AI could have the ability to create a predictive model; for instance, it could develop an ML model that can stratify the risk in patients with upper gastrointestinal bleeding, establish the existence of a specific gastrointestinal disease, define the best treatment, and offer prognosis and prediction of the therapeutic response. In this context, by applying ML or deep learning (DL) (AI using neural networks), clinical management in gastroenterology can begin to focus on more personalized treatment centered on the patient and based on making the best individual decisions, instead of relying mostly on guidelines developed for a specific condition. Moreover, the goal of implementing these AI-based algorithms is to increase the possibility of diagnosing a gastrointestinal disease at an early stage or the ability to predict the development of a particular condition in advance. Because both AI and gastroenterology encompass many subdomains, the interaction between them might take on various forms. In recent years, we have witnessed an explosion of research attempting to improve various fields of gastroenterology, such as endoscopy, hepatology, inflammatory bowel diseases, and many others, with the aid of ML. We also note that, because of the need to diagnose more patients with gastrointestinal cancers at an early stage of the disease, which is associated with curative treatment and better prognosis, many studies have addressed improving the detection of these tumors with the aid of AI. 
The term ML, introduced for the first time in 1959 by Arthur Samuel from the IBM company, refers to an IT domain whereby a computer system can acquire the ability to “learn” by using data without specific programming and can therefore develop a predictive mathematical algorithm based on input data, using recognition of “features”. The ML “model” is subsequently able to adapt to new situations in which it becomes able to predict and make decisions.
  • 1.3K
  • 02 Feb 2021
Topic Review
Artificial Neural Networks for Navigation Systems
Several machine learning (ML) methodologies are gaining popularity as artificial intelligence (AI) becomes increasingly prevalent. An artificial neural network (ANN) may be used as a “black-box” modeling strategy without the need for a detailed physical model of the system, since it explains the system's behaviour solely from input and output data. ANNs have been extensively researched as artificial intelligence has progressed to enhance navigation performance. In some circumstances, the Global Navigation Satellite System (GNSS) can offer consistent and dependable navigational options. A key advancement in contemporary navigation is the fusion of GNSS and the inertial navigation system (INS). Numerous strategies have been put forward to increase accuracy under jamming and in GNSS-denied environments, to integrate GNSS/INS with other technologies by means of a Kalman filter, and to solve the signal-blockage issue in metropolitan areas. A neural-network-based fusion approach is suggested to address GNSS outages. 
  • 1.3K
  • 21 Apr 2023
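The GNSS/INS fusion mentioned above is classically done with a Kalman filter. The sketch below is a deliberately simplified one-dimensional version (constant-velocity state, scalar GNSS position measurement, made-up noise parameters) showing how INS data drive the prediction while GNSS, when available, drives the correction, and how the update step is simply skipped during an outage.

```python
import numpy as np

def kf_step(x, P, accel, gnss_pos, dt=1.0, q=0.01, r=4.0):
    """One predict/update cycle of a 1-D GNSS/INS Kalman filter.

    State x = [position, velocity]. INS acceleration drives the prediction;
    a GNSS position fix (if available) drives the correction. During a GNSS
    outage, pass gnss_pos=None and only the prediction runs."""
    F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity dynamics
    B = np.array([0.5 * dt**2, dt])             # how acceleration enters
    x = F @ x + B * accel                       # INS-based prediction
    P = F @ P @ F.T + q * np.eye(2)             # process noise inflates P
    if gnss_pos is not None:
        H = np.array([[1.0, 0.0]])              # GNSS observes position only
        S = H @ P @ H.T + r                     # innovation covariance
        K = (P @ H.T) / S                       # Kalman gain, shape (2, 1)
        x = x + (K * (gnss_pos - x[0])).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a target moving at 1 m/s using noiseless GNSS fixes and zero INS accel.
x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
for t in range(1, 30):
    x, P = kf_step(x, P, accel=0.0, gnss_pos=float(t))
```

During an outage the covariance, and hence the position uncertainty, grows without bound; neural-network-based fusion approaches aim to supply corrections in precisely this regime.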
Topic Review
Associative Classification Method
Machine learning techniques are ever prevalent as datasets continue to grow daily. Associative classification (AC), which combines classification and association rule mining algorithms, plays an important role in understanding big datasets that generate a large number of rules. Clustering, on the other hand, can contribute by reducing the rule space to produce compact models. 
  • 1.3K
  • 20 Sep 2022
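A minimal sketch of the associative-classification idea described above: mine class association rules (itemset → label) that pass support and confidence thresholds, then classify a new instance with the highest-confidence matching rule. The toy data, thresholds, and tie-breaking are illustrative assumptions, not a specific published AC algorithm.

```python
from itertools import combinations
from collections import Counter

def mine_car_rules(rows, labels, min_sup=0.2, min_conf=0.7, max_len=2):
    """Mine class association rules (itemset -> label) passing the thresholds."""
    n = len(rows)
    rules = []
    for size in range(1, max_len + 1):
        counts, hits = Counter(), Counter()
        for items, y in zip(rows, labels):
            for combo in combinations(sorted(items), size):
                counts[combo] += 1          # itemset frequency
                hits[(combo, y)] += 1       # itemset-with-label frequency
        for (combo, y), h in hits.items():
            support, confidence = h / n, h / counts[combo]
            if support >= min_sup and confidence >= min_conf:
                rules.append((combo, y, confidence))
    return sorted(rules, key=lambda r: -r[2])   # best rules first

def classify(rules, items, default):
    """Predict with the highest-confidence rule whose itemset matches."""
    for combo, y, _ in rules:
        if set(combo) <= set(items):
            return y
    return default

rows = [{"a", "b"}, {"a", "c"}, {"a", "b"}, {"b", "c"}, {"a", "b", "c"}]
labels = ["pass", "fail", "pass", "fail", "pass"]
rules = mine_car_rules(rows, labels)
```

Clustering, as the entry notes, can then be applied to the mined rule set to merge similar rules and produce a more compact model.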
Topic Review
Journalistic Knowledge Platform
A Journalistic Knowledge Platform (JKP) is an information system that employs artificial intelligence and big data techniques, such as machine learning and knowledge graphs, to manage and support the knowledge work needed in all stages of news production. JKPs automate the process of annotating metadata and support daily workflows like news production, archiving, monitoring, management, and distribution. They harvest and analyse news and social media information over the net in real time, leverage encyclopaedic sources, and provide journalists with both meaningful background knowledge and newsworthy information. JKPs can provide a digitalisation path towards reduced production costs and improved information quality while adapting the current workflows of newsrooms to new forms of journalism and readers’ demands.
  • 1.3K
  • 20 Jun 2022
Topic Review
AI&ML for Medical Sector
This work presents a comprehensive analysis of the potential of AI, ML, and IoT technologies for defending against the COVID-19 pandemic. The existing and potential applications of AI, ML, and IoT, along with a detailed analysis of the enabling tools and techniques, are outlined. A critical discussion of the risks and limitations of the aforementioned technologies is also included.
  • 1.3K
  • 21 Jan 2021
Topic Review
Radar Depth and Velocity Estimation
Radar can measure range and Doppler velocity, but neither can be directly used for downstream tasks. The range measurements are sparse and therefore difficult to associate with their visual correspondences. The Doppler velocity is measured along the radial axis only and therefore cannot be directly used for tracking.
  • 1.3K
  • 08 Jun 2022
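The radial-velocity limitation can be stated in one equation: the Doppler measurement is the projection of the target's true velocity onto the line of sight, so the tangential component goes unobserved (the notation here is generic, not taken from a specific source):

```latex
% Doppler velocity as the radial projection of the true velocity
v_r \;=\; \mathbf{v} \cdot \hat{\mathbf{r}}
     \;=\; \frac{\mathbf{v} \cdot \mathbf{p}}{\lVert \mathbf{p} \rVert},
\qquad
\mathbf{v} \;=\; v_r\,\hat{\mathbf{r}} + \mathbf{v}_t,
\quad \mathbf{v}_t \perp \hat{\mathbf{r}}
```

Here $\mathbf{p}$ is the target position relative to the radar; the tangential component $\mathbf{v}_t$ is invisible to the Doppler measurement, which is why radial velocity alone cannot support tracking.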
Topic Review
Bayesian Nonlinear Mixed Effects Models
Nonlinear mixed effects models have become a standard platform for analysis when data is in the form of continuous and repeated measurements of subjects from a population of interest, while temporal profiles of subjects commonly follow a nonlinear tendency. While frequentist analysis of nonlinear mixed effects models has a long history, Bayesian analysis of the models has received comparatively little attention until the late 1980s, primarily due to the time-consuming nature of Bayesian computation. Since the early 1990s, Bayesian approaches for the models began to emerge to leverage rapid developments in computing power, and have recently received significant attention due to (1) superiority to quantify the uncertainty of parameter estimation; (2) utility to incorporate prior knowledge into the models; and (3) flexibility to match exactly the increasing complexity of scientific research arising from diverse industrial and academic fields. 
  • 1.3K
  • 23 Mar 2022
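A generic two-stage formulation of the Bayesian nonlinear mixed effects model described above may help fix ideas (the symbols and prior families below are the conventional textbook choices, not taken from a specific source):

```latex
% Stage 1: within-subject model with a nonlinear mean function f
y_{ij} = f(t_{ij}, \theta_i) + \varepsilon_{ij},
\qquad \varepsilon_{ij} \sim \mathcal{N}(0, \sigma^2)

% Stage 2: between-subject model (random effects)
\theta_i = \mu + b_i, \qquad b_i \sim \mathcal{N}(\mathbf{0}, \Omega)

% Stage 3 (the Bayesian layer): priors on the population parameters
\mu \sim \mathcal{N}(\mu_0, \Sigma_0), \quad
\sigma^2 \sim \mathrm{Inv\text{-}Gamma}(a, b), \quad
\Omega \sim \mathrm{Inv\text{-}Wishart}(\nu, \Lambda)
```

The third stage is what distinguishes the Bayesian treatment: prior knowledge enters through $\mu_0$, $\Sigma_0$, and the variance priors, and posterior distributions quantify the uncertainty of every parameter estimate.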
Topic Review
Wireless Sensors for Brain Activity
Over the last decade, the area of electroencephalography (EEG) witnessed a progressive move from high-end large measurement devices, relying on accurate construction and providing high sensitivity, to miniature hardware, more specifically wireless wearable EEG devices. While accurate, traditional EEG systems need a complex structure and long periods of application time, unwittingly causing discomfort and distress on the users. Given their size and price, aside from their lower sensitivity and narrower spectrum band(s), wearable EEG devices may be used regularly by individuals for continuous collection of user data from non-medical environments. This allows their usage for diverse, nontraditional, non-medical applications, including cognition, BCI, education, and gaming. Given the reduced need for standardization or accuracy, the area remains a rather incipient one, mostly driven by the emergence of new devices that represent the critical link of the innovation chain.
  • 1.3K
  • 26 Jan 2021
Topic Review
AI-Assisted Design-on-Simulation for Life Prediction
Many researchers have adopted the finite-element-based design-on-simulation (DoS) technology for the reliability assessment of electronic packaging. DoS technology can effectively shorten the design cycle, reduce costs, and effectively optimize the packaging structure. However, the simulation analysis results are highly dependent on the individual researcher and are usually inconsistent between them. Artificial intelligence (AI) can help researchers avoid the shortcomings of the human factor. 
  • 1.3K
  • 28 Sep 2021
Topic Review
Explainable Artificial Intelligence for Smart Cities
The emergence of Explainable Artificial Intelligence (XAI) has enhanced the lives of humans and envisioned the concept of smart cities using informed actions, enhanced user interpretations and explanations, and firm decision-making processes. The XAI systems can unbox the potential of black-box AI models and describe them explicitly.
  • 1.3K
  • 19 Apr 2023
Topic Review
Rainfall Prediction System
Rainfall prediction is one of the most challenging tasks in the weather forecasting process. Accurate rainfall prediction is now more difficult than before due to extreme climate variations.
  • 1.3K
  • 18 May 2022
Topic Review
Deep Learning in SOC Estimation for Li-Ion Batteries
As one of the critical state parameters of the battery management system, the state of charge (SOC) of lithium batteries can provide an essential reference for battery safety management, charge/discharge control, and the energy management of electric vehicles (EVs). Deep learning methods for SOC estimation of a Li-ion battery use deep learning theory to build a model that approximates the relationship between input data (voltage, current, temperature, power, capacity, etc.) and the output (SOC) from available data. According to the neural network structure, such models can be classified as single, hybrid, or trans structures. 
  • 1.3K
  • 02 Nov 2022
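A minimal sketch of the deep-learning mapping described above: fit a small feedforward network from (voltage, current, temperature) to SOC by plain gradient descent. The synthetic data-generating relationship, network size, and learning rate are invented for illustration; a real SOC estimator would be trained on measured charge/discharge cycles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: (voltage, current, temperature) -> SOC.
# The "true" relationship below is invented purely for illustration.
X = rng.uniform([3.0, -2.0, 10.0], [4.2, 2.0, 40.0], size=(256, 3))
soc = np.clip((X[:, 0] - 3.0) / 1.2 - 0.05 * X[:, 1], 0.0, 1.0)

# Normalise inputs and fit a one-hidden-layer network by gradient descent.
Xn = (X - X.mean(0)) / X.std(0)
W1, b1 = rng.normal(0.0, 0.5, (3, 16)), np.zeros(16)
W2, b2 = rng.normal(0.0, 0.5, 16), 0.0

losses = []
for _ in range(500):
    H = np.tanh(Xn @ W1 + b1)            # hidden activations
    pred = H @ W2 + b2                   # SOC prediction
    err = pred - soc
    losses.append(float((err ** 2).mean()))
    # Backpropagation for the mean-squared-error loss.
    gW2 = H.T @ err / len(err)
    gb2 = err.mean()
    gH = np.outer(err, W2) * (1.0 - H ** 2)
    gW1 = Xn.T @ gH / len(err)
    gb1 = gH.mean(0)
    W1 -= 0.1 * gW1; b1 -= 0.1 * gb1
    W2 -= 0.1 * gW2; b2 -= 0.1 * gb2
```

This single-structure network is the simplest of the three families the entry mentions; hybrid approaches combine such networks with recurrent layers or filtering stages.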