Topic Review
Reliable Storage of Cloud Data
The prime objective of cloud data storage is to provide a service that is not only highly extensible but also reliable and low-cost, while supporting different data storage types. The storage process must satisfy cloud users’ prerequisites. Nevertheless, storing massive amounts of data becomes critical, as it affects data quality and integrity, and this poses various challenges for existing methodologies. To overcome these challenges, an efficient, reliable cloud storage model is proposed using a hybrid heuristic approach. The prime intention of the proposed system is to store data effectively in the cloud environment by resolving two kinds of constraints: general and specific (structural). The cloud data were initially gathered and used to analyze storage performance. Since the data were extensive, different datasets and storage devices were considered. Each piece of data was specified by its corresponding features, whereas the devices were characterized by their hardware or software components. Subsequently, the objective function was formulated using the network’s structural and general constraints. The structural constraints were determined by the interactions between the devices and data instances in the cloud; the general constraints covered the data allocation rules and device capacity. To satisfy the constraints, the components were optimized using the Hybrid Pelican–Billiards Optimization Algorithm (HP-BOA) to store the cloud data. Finally, the performance was validated, and the results were analyzed and compared against existing approaches. The proposed model exhibited the desired results for storing cloud data appropriately.
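The entry's HP-BOA metaheuristic is not detailed here, so the following is only a loose illustration of the underlying allocation problem under a device-capacity (general) constraint: a greedy first-fit placement, with all item and device names and sizes hypothetical, not the authors' algorithm.

```python
def allocate(data_items, devices):
    """Greedy first-fit allocation of data items under a capacity constraint.

    data_items: {item_name: size}; devices: {device_name: capacity}.
    Returns {item: device}, or raises if some item cannot be placed.
    """
    remaining = dict(devices)
    placement = {}
    # Place the largest items first to reduce fragmentation.
    for item, size in sorted(data_items.items(), key=lambda kv: -kv[1]):
        target = next((d for d, cap in remaining.items() if cap >= size), None)
        if target is None:
            raise ValueError(f"no device can hold {item}")
        remaining[target] -= size
        placement[item] = target
    return placement

# Hypothetical workload: sizes and capacities in arbitrary units.
plan = allocate({"logs": 30, "images": 70, "db": 55},
                {"dev_a": 100, "dev_b": 80})
```

A metaheuristic such as HP-BOA would instead search over many candidate placements, scoring each with an objective function that also encodes the structural (device–data interaction) constraints.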
  • 851
  • 15 May 2023
Topic Review
Association Rule Mining in Facility Management
Maintenance represents a substantial share of work in various industries. Due to its significant financial impact, industry and research focus on improving the effectiveness of maintenance. Predictive Maintenance (PM) is one way to reduce costs and downtimes by planning maintenance work based on an asset’s actual condition rather than relying on fixed time-based maintenance cycles. Association rule mining (ARM) is a suitable method for PM tasks when data are unlabeled and less structured, as is the case in the facility management domain.
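As a minimal sketch of how ARM surfaces co-occurring maintenance observations (the event names and thresholds below are hypothetical, not from the study), single-antecedent rules can be mined directly from support and confidence counts:

```python
from itertools import combinations

# Hypothetical maintenance logs: each "transaction" is the set of
# observations recorded together during one asset inspection.
transactions = [
    {"vibration_high", "bearing_wear", "temp_high"},
    {"vibration_high", "bearing_wear"},
    {"temp_high", "filter_clogged"},
    {"vibration_high", "bearing_wear", "filter_clogged"},
    {"temp_high", "filter_clogged"},
]

def support(itemset, transactions):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def mine_rules(transactions, min_support=0.4, min_confidence=0.8):
    """Enumerate rules A -> B over item pairs meeting both thresholds."""
    items = set().union(*transactions)
    rules = []
    for a, b in combinations(sorted(items), 2):
        for ante, cons in ((a, b), (b, a)):
            supp = support({ante, cons}, transactions)
            if supp < min_support:
                continue
            conf = supp / support({ante}, transactions)
            if conf >= min_confidence:
                rules.append((ante, cons, supp, conf))
    return rules

rules = mine_rules(transactions)
# With this toy data, high vibration and bearing wear imply each other.
```

In a PM setting, a high-confidence rule such as "vibration_high -> bearing_wear" would suggest scheduling a bearing inspection whenever the vibration symptom appears.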
  • 851
  • 07 Feb 2024
Topic Review
BENS−B5G
Fifth-generation (5G) technology is anticipated to enable a slew of novel applications across a variety of industries. The wireless communication of 5G and Beyond-5G (B5G) networks will accommodate a wide variety of services and user expectations, including dense end-user connectivity, sub-1 ms delay, and a transmission rate of 100 Gbps. Network slicing is envisioned as an appropriate technique to meet these disparate requirements. The intrinsic qualities of blockchain, which has lately gained prominence, make it critical for 5G and B5G networks. In particular, incorporating blockchain technology into B5G enables the network to effectively monitor and control resource utilization and sharing. Using blockchain technology, a network-slicing architecture referred to as the Blockchain Consensus Framework is introduced that allows resource providers to dynamically contract resources, especially the radio access network (RAN) schedule, to guarantee that their end-to-end services are executed effortlessly.
  • 850
  • 30 Sep 2022
Topic Review
A Stakeholder-Specific View on Impact Sourcing
Impact Sourcing is the outsourcing of activities to disadvantaged social groups in order to help them become participants in the globalized digital world and thus benefit from higher incomes and wealth creation.
  • 844
  • 23 Nov 2022
Topic Review
HAGGIS
HAGGIS is a high-level reference programming language used primarily in the assessment of Computing Science for Scottish pupils taking SQA courses in the subject. HAGGIS is used as a tool to bridge the gap between pseudocode and typical computer programming. HAGGIS is not based on any one language but on a mixture, intended to allow a pupil familiar with any of the many languages used in classrooms to easily understand the syntactic construct being used in an example. It supports the functional, imperative, and object-oriented programming paradigms to suit this purpose. There are three separate language definitions, one for each level at which computing is assessed by the SQA; these are proper subsets of each other, so, for example, any program contained in the National 5 level language is also well defined at the Higher and Advanced Higher levels. Higher adds the definition of procedures and functions and the use of record types and files, while Advanced Higher adds object orientation. Online HAGGIS interpreters have been developed to provide a way for examiners and teachers to check that their programs are correctly defined and behave as expected.
  • 839
  • 19 Oct 2022
Topic Review
Integrating Brazilian health databases
The volume of data generated by health systems is substantial and is likely to continue growing exponentially with the growing adoption of the Internet of Things. Efforts to improve data discovery and integration are complicated by the complexity, dimensionality and heterogeneity of the data, inadequate data, and other data quality issues. This work in progress has as its main goal the integration of two Brazilian health databases in order to improve the quality of tuberculosis mortality data. A phonetic encoding technique (Soundex) and a pattern-matching technique (Jaro similarity) are proposed as solutions, and their results are compared. Both techniques identified over 500 true matches, with Jaro discovering more true matches than Soundex.
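Soundex, the phonetic technique named above, maps a name to a letter plus three digits so that similar-sounding names (e.g. misspelled patient surnames in two databases) collide on the same code. A minimal implementation of basic American Soundex, independent of the study's record-linkage pipeline:

```python
def soundex(name: str) -> str:
    """Basic American Soundex: first letter plus three digits.

    Consonants map to digit classes; vowels (and Y) are dropped;
    H and W do not separate consonants of the same class.
    """
    codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
             **dict.fromkeys("DT", "3"), "L": "4",
             **dict.fromkeys("MN", "5"), "R": "6"}
    name = name.upper()
    result = name[0]
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        if ch in "HW":
            continue  # H/W are transparent: previous code is retained
        code = codes.get(ch, "")
        if code and code != prev:
            result += code
        prev = code
    return (result + "000")[:4]  # pad or truncate to 4 characters
```

Two spellings of the "same" name then match on code, e.g. "Robert" and "Rupert" both encode to R163, which is exactly the fuzziness that blocking on Soundex codes exploits during linkage.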
  • 838
  • 19 Feb 2021
Topic Review
Computer Navigation
Computer-navigated surgery has been used in neurosurgery since the 1980s, where improved accuracy in resections for cancer was achieved by mapping brain tumours pre-operatively and using this to plan surgical resection. It is only at the turn of the 21st century that it has been adopted within the orthopaedic community, principally in the field of spinal surgery. A technology that provides real-time feedback within a field that has a small margin for error has an obvious home in the specialty of orthopaedic oncology. Computer navigation encompasses all techniques using computing to augment surgical procedures. The two main types of navigation currently used in orthopaedic surgery are “image-based” and “patient-specific instrumentation and reconstruction”.
  • 838
  • 09 Jul 2021
Topic Review
LPWAN Key Exchange
The Internet of Things (IoT) is one of the fastest-emerging technologies in the industry. It includes diverse applications with different requirements to provide services to users. Secure, low-powered, and long-range transmission is among the most vital requirements in developing IoT applications. IoT uses several communication technologies to fulfill transmission requirements. However, Low Power Wide Area Network (LPWAN) transmission standards have been gaining attention because of their exceptional low-power and long-distance transmission capabilities. These features make LPWAN standards a perfect candidate for IoT applications. However, the current LPWAN standards lack state-of-the-art security mechanisms because of the energy and computational limitations of IoT devices. Most LPWAN standards, such as Sigfox, NB-IoT, and Weightless, use static keys for node authentication and encryption. LoRaWAN is the only LPWAN technology providing a session key mechanism for better security. However, the session key mechanism is vulnerable to replay attacks.
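To illustrate the replay problem in general terms (this is a generic nonce-based challenge-response sketch, not the LoRaWAN join procedure or any specific LPWAN protocol), the server can bind each authentication to a fresh random nonce so that a captured exchange cannot be resubmitted:

```python
import hashlib
import hmac
import os

def make_challenge() -> bytes:
    """Server issues a fresh random nonce for each authentication attempt."""
    return os.urandom(16)

def node_response(key: bytes, nonce: bytes) -> bytes:
    """Node proves knowledge of the shared key, bound to this nonce."""
    return hmac.new(key, nonce, hashlib.sha256).digest()

def verify(key: bytes, nonce: bytes, response: bytes, seen: set) -> bool:
    """Reject replayed nonces and bad MACs."""
    if nonce in seen:
        return False  # replay: this exchange was already used
    seen.add(nonce)
    expected = hmac.new(key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(response, expected)

key = os.urandom(16)      # pre-shared key between node and server
seen = set()
nonce = make_challenge()
resp = node_response(key, nonce)
ok_first = verify(key, nonce, resp, seen)   # fresh exchange is accepted
ok_replay = verify(key, nonce, resp, seen)  # replayed exchange is rejected
```

A static-key scheme with no nonce fails exactly this second check: an eavesdropper can resend yesterday's valid response and be accepted.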
  • 834
  • 22 Jul 2022
Topic Review
Ethernet Frame Format
Ethernet is a widely used networking technology that finds its application in local area networks (LANs), metropolitan area networks (MANs), wide area networks (WANs), and other fields, such as industry, avionics, telecommunication, and multimedia. The Ethernet technology was introduced in 1980, and the first standardization was conducted in 1983 by IEEE 802.3.
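The classic Ethernet II frame begins with a 14-byte header: a 6-byte destination MAC address, a 6-byte source MAC address, and a 2-byte EtherType in network byte order. A minimal parser for that standard layout:

```python
import struct

def parse_ethernet_header(frame: bytes):
    """Split the 14-byte Ethernet II header into dst MAC, src MAC, EtherType."""
    dst, src, ethertype = struct.unpack("!6s6sH", frame[:14])
    fmt = lambda mac: ":".join(f"{b:02x}" for b in mac)
    return fmt(dst), fmt(src), ethertype

# A hand-built broadcast frame; 0x0800 marks an IPv4 payload.
frame = bytes.fromhex("ffffffffffff" "001122334455" "0800") + b"payload..."
dst, src, ethertype = parse_ethernet_header(frame)
```

The EtherType field is how the receiver demultiplexes the payload (e.g. 0x0800 for IPv4, 0x86DD for IPv6, 0x0806 for ARP).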
  • 833
  • 09 Nov 2023
Topic Review
Qualitative Research Methods for Large Language Models
In the current era of artificial intelligence, large language models are being increasingly used for various applications, such as language translation, text generation, and human-like conversation. The fact that these models are trained on large amounts of data, reflecting many different opinions and perspectives, opens the possibility of a new qualitative research approach: due to the probabilistic character of their answers, “interviewing” these large language models could give insights into public opinion in a way that otherwise only interviews with large groups of subjects could deliver.
  • 832
  • 27 Nov 2023
Topic Review
Link Lifetime Prediction in Vehicular Ad Hoc Networks
In urban mobility, Vehicular Ad Hoc Networks (VANETs) support a variety of intelligent applications. By enhancing automobile traffic management, these technologies enable advancements in safety and help decrease the frequency of accidents. Thanks to safety applications that include collision alerts, real-time traffic information, and safe-driving aids, among others, the transportation system can now follow the development and growth of cities without sacrificing the quality and organisation of its services. Applications can occasionally demand substantial computing power, making their processing impractical for vehicles with limited onboard capacity. Such a restriction encourages computation offloading. However, because vehicle mobility is dynamic, communication times (also known as link lifetimes) between nodes are frequently short. Such short communication windows affect VANET applications and processes (e.g., the offloading decision when using the computational offloading technique). Making an accurate prediction of the link lifetime between vehicles is therefore challenging.
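As a simplified illustration of what "link lifetime" means (a one-dimensional, constant-velocity model of two vehicles on a straight road, not the prediction method any particular VANET study uses), the remaining connection time follows from the current gap, the relative velocity, and the communication range:

```python
def link_lifetime(pos_a: float, vel_a: float,
                  pos_b: float, vel_b: float,
                  comm_range: float) -> float:
    """Time until two vehicles on a straight road drift out of radio range.

    Assumes constant velocities (m, m/s); returns float('inf') if the
    separation never exceeds the communication range.
    """
    gap = pos_b - pos_a   # current signed separation
    rel = vel_b - vel_a   # signed relative velocity
    if abs(gap) > comm_range:
        return 0.0        # already out of range
    if rel == 0:
        return float("inf")  # separation is constant; link persists
    if gap * rel >= 0:
        # Moving apart: |gap| grows until it reaches comm_range.
        return (comm_range - abs(gap)) / abs(rel)
    # Closing in: the vehicles pass each other, then separate.
    return (comm_range + abs(gap)) / abs(rel)
```

For example, with a 250 m range and a 100 m gap, a 10 m/s closing speed yields 35 s of connectivity (pass, then separate), while a 10 m/s separating speed yields only 15 s. Real predictors must also handle lane changes, turns, and varying speeds, which is what makes the problem hard.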
  • 828
  • 09 Sep 2022
Topic Review
Smart Chatbot for User Authentication
Despite being the most widely used authentication mechanism, password-based authentication is not very secure, as passwords are easily guessed or brute-forced. To address this, many systems that especially value security adopt Multi-Factor Authentication (MFA), in which multiple different authentication mechanisms are used concurrently. JitHDA (Just-in-time human dynamics based authentication engine) is a new authentication mechanism which can add another option to MFA capabilities. JitHDA observes human behaviour and human dynamics to gather up-to-date information on the user, from which authentication questions can be dynamically generated.
  • 827
  • 23 Dec 2022
Topic Review
Foot-Detection Approach Based on Seven-Foot Dimensions
Unsuitable shoe shapes and sizes are a critical cause of unhealthy feet and may severely contribute to chronic injuries such as foot ulcers in susceptible people (e.g., diabetes patients); accurate measurements, in the manner of expert-based procedures, are therefore needed.
  • 827
  • 25 Jun 2023
Topic Review
Trust Management Model for Secure Internet of Vehicles
The Internet of Vehicles (IoV) enables vehicles to share data that help them perceive the surrounding environment. However, vehicles can spread false information to other IoV nodes; this incorrect information misleads vehicles and causes confusion in traffic. Therefore, a vehicular trust model is needed to check the trustworthiness of messages.
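The entry does not specify the model's internals; as a generic illustration of how vehicular trust models commonly combine evidence (the weighting scheme and values below are assumptions, not the proposed model), first-hand observations can be blended with credibility-weighted reports from neighbours:

```python
def aggregate_trust(direct: float, recommendations: list,
                    alpha: float = 0.6) -> float:
    """Combine a vehicle's own observation with neighbours' reports.

    direct: trust from first-hand interactions, in [0, 1].
    recommendations: (reported_trust, recommender_credibility) pairs.
    alpha: weight on first-hand evidence (an assumed value).
    """
    if recommendations:
        total_cred = sum(c for _, c in recommendations)
        # Credibility-weighted mean of second-hand reports.
        indirect = sum(t * c for t, c in recommendations) / total_cred
    else:
        indirect = direct  # no second-hand evidence available
    return alpha * direct + (1 - alpha) * indirect

# One trusted neighbour reports 0.8; a half-credible one reports 0.2.
score = aggregate_trust(0.9, [(0.8, 1.0), (0.2, 0.5)])
```

A message would then be accepted only if the sender's aggregated score clears some threshold, so that low-credibility gossip cannot easily drag down, or inflate, a node's reputation.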
  • 826
  • 28 Jul 2023
Topic Review
Programming/2 BeanShell
You can write your scripts using BeanShell. Edit a file and save it with the extension ".bsh"; the program is executed by pressing [F8]. Text output goes to "BeanShell" (a window below the main editor). See the BeanShell article to read more about this scripting language. This manual is provided here for your convenience and is based on these external resources. This document is about BeanShell. BeanShell is a small, free, embeddable Java source interpreter with object scripting language features, written in Java. BeanShell executes standard Java statements and expressions but also extends Java into the scripting domain with common scripting language conventions and syntax. BeanShell is a natural scripting language for Java. Traditionally, the primary difference between a scripting language and a compiled language has been in its type system: the way in which you define and use data elements. You might be thinking that there is a more obvious difference here, that of "interpreted" code vs. compiled code. But the compiler in and of itself does not fundamentally change the way you work with a language. Nor does interpreting a language necessarily make it more useful for what we think of as "scripting". It is the type system of a language that makes it possible for a compiler to analyze the structure of an application for correctness. Without types, compilation is reduced to just a grammar check and an optimization for speed. From the developer's perspective, it is also the type system that characterizes the way in which we interact with the code. Types are good. Without strongly typed languages it would be very hard to write large-scale systems and make any assertions about their correctness before they are run. But working with types imposes a burden on the developer. Types are labels, and labeling things can be tedious.
It can be especially tedious during certain kinds of development, or in special applications where flexibility rather than program structure is paramount. There are times when simplicity and ease of use are the more important criteria. This is not just a rationalization to cover some underlying laziness. Productivity affects what people do, and more importantly do *not* do, in the real world, much more than you might think. There is a lot of important software that exists in the world today only because the cost/benefit ratio in some developer's mind reached a certain threshold. Unit testing, one of the foundations of writing good code, is a prime example. Unit tests for well-written code are, in general, vitally important as a collective but almost insignificant individually. It is a "tragedy of the commons" that leads individual developers to repeatedly weigh the importance of writing another unit test against working on "real code". Given a tool that makes it easy to perform a test with a line or two of code, developers will probably use it. If, moreover, it is a tool that they enjoy using during their development process, one that saves them time, they will be even more inclined to use it. Customizability through scripting also opens the door to applications that are more powerful than the sum of their parts. When users can extend, enhance, and add to their applications, they use them in new and unexpected ways. Scripting is powerful. Traditionally, scripting languages have traded the power of types for simplicity. Most scripting languages distill the type system to just one or a handful of types, such as strings, numbers, or simple lists. This is sufficient for many kinds of scripting. Many scripting languages operate in a loose, unstructured land, a place dominated by text and coarse-grained tools. As such, these scripting languages have evolved sophisticated mechanisms for working with these simple types (regular expressions, pipes, etc.).
As a result, a chasm has developed between the scripting languages and the application languages, created by the collapse of the type system in between. The scripting languages have remained a separate species, isolated and speaking a different dialect from their brothers, the application languages. BeanShell is a new kind of scripting language. BeanShell begins with the standard Java language and bridges it into the scripting domain in a natural way, allowing the developer to relax types where appropriate. It is possible to write BeanShell scripts that look exactly like Java method code. But it is also possible to write scripts that look more like a traditional scripting language, while still maintaining the framework of the Java syntax. BeanShell emulates typed variables and parameters when they are used. This allows you to "seed" your code with strong types where appropriate. You can "shore up" repeatedly used methods as you work on them, migrating them closer to Java. Eventually you may find that you want to compile these methods and maintain them in standard Java. With BeanShell this is easy. BeanShell does not impose a syntactic boundary between your scripts and Java. But the bridge to Java extends much deeper than simple code similarity. BeanShell is one of a new breed of scripting languages made possible by Java's advanced reflection capabilities. Since BeanShell can run in the same Java virtual machine as your application, you can freely work with real, live Java objects, passing them into and out of your scripts. Combined with BeanShell's ability to implement Java interfaces, you can achieve seamless and simple integration of scripting into your Java applications. BeanShell does not impose a type boundary between your scripts and Java.
  • 820
  • 29 Nov 2022
Topic Review
Social Behavioral Biometrics
Social Behavioral Biometrics (SBB) is a novel biometric category. This innovative field of study investigates a person’s social interactions and communication patterns to ascertain their identity. 
  • 815
  • 12 Oct 2023
Topic Review
Evaluation Criteria for Tools Supporting Remote Work
The pandemic period has made remote work a reality in many organizations. Despite the possible negative aspects of this form of work, many employers and employees appreciate its flexibility and effectiveness. Employers are therefore looking for the tools that best support this form of work. However, choosing may be difficult due to the tools' complexity, their differing functionality, or the different conditions of a company's operations. Decisions on the choice of a given solution are usually made by a group of decision makers. Often their subjective assessments differ from each other, making the decision even more difficult.
  • 812
  • 06 Jul 2023
Topic Review
Real-Time Detection of Red Fruit
The real-time and accurate recognition of fruits and vegetables is crucial for the intelligent control of fruit and vegetable robots.
  • 811
  • 11 Jan 2024
Topic Review
Barriers and Support Factors of Open Data
Obstacles to Open Data can manifest at various levels. Fourteen potential supporting factors and thirteen barriers to the provision and anonymization of personal data were identified. These encompassed technical prerequisites as well as institutional, personnel, ethical, and legal considerations. The findings offer insights into the obstacles and supportive structures that shape effective implementation of Open Data processes.
  • 802
  • 29 Dec 2023
Topic Review
Natural-language Generation
Natural-language generation (NLG) is a software process that produces natural language output. While it is widely agreed that the output of any NLG process is text, there is some disagreement on whether the inputs of an NLG system need to be non-linguistic. Common applications of NLG methods include the production of various reports, for example weather and patient reports; image captions; and chatbots. Automated NLG can be compared to the process humans use when they turn ideas into writing or speech. Psycholinguists prefer the term language production for this process, which can also be described in mathematical terms, or modeled in a computer for psychological research. NLG systems can also be compared to translators of artificial computer languages, such as decompilers or transpilers, which also produce human-readable code generated from an intermediate representation. Human languages tend to be considerably more complex and allow for much more ambiguity and variety of expression than programming languages, which makes NLG more challenging. NLG may be viewed as complementary to natural-language understanding (NLU): whereas in natural-language understanding, the system needs to disambiguate the input sentence to produce the machine representation language, in NLG the system needs to make decisions about how to put a representation into words. The practical considerations in building NLU vs. NLG systems are not symmetrical. NLU needs to deal with ambiguous or erroneous user input, whereas the ideas the system wants to express through NLG are generally known precisely. NLG needs to choose a specific, self-consistent textual representation from many potential representations, whereas NLU generally tries to produce a single, normalized representation of the idea expressed. NLG has existed since ELIZA was developed in the mid 1960s, but the methods were first used commercially in the 1990s. 
NLG techniques range from simple template-based systems like a mail merge that generates form letters, to systems that have a complex understanding of human grammar. NLG can also be accomplished by training a statistical model using machine learning, typically on a large corpus of human-written texts.
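The simplest end of that range, mail-merge-style template filling, can be shown in a few lines (the weather-report slots and values here are invented for illustration): structured data is substituted into fixed text with placeholder slots.

```python
from string import Template

# A minimal template-based NLG "system": slots in canned text are
# filled from a structured data record, mail-merge style.
weather_template = Template(
    "$city will be $condition on $day, with a high of $high degrees."
)

record = {"city": "Aberdeen", "condition": "cloudy",
          "day": "Tuesday", "high": 14}
report = weather_template.substitute(record)
```

More sophisticated systems replace the fixed string with grammar-aware realization (agreement, aggregation, referring expressions), and statistical or neural approaches learn the data-to-text mapping from corpora instead of hand-written templates.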
  • 800
  • 07 Nov 2022