Topic Review
Data Envelopment Analysis (DEA)
Data Envelopment Analysis (DEA) is a non-parametric methodology for measuring the efficiency of Decision Making Units (DMUs) that convert multiple inputs into multiple outputs. It is the most commonly used tool for frontier estimation in assessments of productivity and efficiency across all fields of economic activity.
  • 10.6K
  • 28 Jan 2022
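As an illustration of the envelopment form that DEA solvers typically use, the following Python sketch evaluates the input-oriented CCR model for a few hypothetical DMUs with SciPy's linear-programming routine; the sample data and the helper name ccr_efficiency are illustrative assumptions, not from the entry.

```python
# Minimal sketch of an input-oriented CCR DEA model solved as a linear program.
# The inputs X and outputs Y below are illustrative, not real data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])   # rows: DMUs, cols: inputs
Y = np.array([[1.0], [1.0], [2.0]])                   # rows: DMUs, cols: outputs
n_dmu, n_in = X.shape
n_out = Y.shape[1]

def ccr_efficiency(k):
    """Efficiency of DMU k: minimise theta such that a composite DMU (weights
    lambda_j) uses at most theta * inputs of k and produces at least outputs of k."""
    c = np.r_[1.0, np.zeros(n_dmu)]                  # decision vars: [theta, lambda_1..lambda_n]
    A_in = np.c_[-X[k].reshape(-1, 1), X.T]          # sum_j lambda_j * x_ij <= theta * x_ik
    A_out = np.c_[np.zeros((n_out, 1)), -Y.T]        # sum_j lambda_j * y_rj >= y_rk
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(n_in), -Y[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n_dmu))
    return res.x[0]

for k in range(n_dmu):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```

A score of 1 marks a DMU on the estimated frontier; scores below 1 indicate how far its inputs could be contracted proportionally while still producing its outputs.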
Topic Review
Mobile Technology in Tourism
The influence of mobile technology on tourism is very significant. With the support of mobile devices (smartphones, glasses, and other wearables), technology, data, and services, multiple travel concepts and travel modes, including mobile tourism, smart tourism, e-tourism, and sustainable tourism, have emerged or developed further. Mobile technology is touted as the next technology wave that can fundamentally change tourism and the hotel industry. Moreover, mobile technology is playing an increasing role in the travel experience, and a growing body of travel research is concentrated in this field. Research findings show that, first, research on mobile technology in tourism can be divided into three phases that are, to a certain extent, synchronized with the development of mobile technology itself. Second, within the social sciences, research on mobile technology in tourism needs further exploration and should draw on related work in transportation and IT to broaden its perspective. Top journal analysis, journal co-citation analysis, author co-citation analysis, and collaboration network analysis reveal the most representative journals, authors, institutions, and countries/regions in this research field, providing a valuable reference for scholars. Additionally, the analysis of keywords identifies the hot and cutting-edge topics in the field. Finally, the clustering of co-cited references presents the knowledge base of mobile technology research in tourism: mobile technology, travel mode, mobile instrument, travel behavior research, mobile applications, and geo-based technology.
  • 10.4K
  • 28 Sep 2020
Topic Review
Vocaloid
Vocaloid is a singing voice synthesizer and the first engine released in the Vocaloid series. It was succeeded by Vocaloid 2. This version was able to sing in both English and Japanese.
  • 10.3K
  • 17 Oct 2022
Topic Review
Snakes and Ladders
Snakes and Ladders, known originally as Moksha Patam, is an ancient Indian board game for two or more players, regarded today as a worldwide classic. It is played on a game board with numbered, gridded squares. A number of "ladders" and "snakes" are pictured on the board, each connecting two specific board squares. The object of the game is to navigate one's game piece, according to die rolls, from the start (bottom square) to the finish (top square), helped by climbing ladders but hindered by falling down snakes. The game is a simple race based on sheer luck, and it is popular with young children. The historic version had its roots in morality lessons, in which a player's progression up the board represented a life journey complicated by virtues (ladders) and vices (snakes). The game is also sold under other names, such as Chutes and Ladders, Bible Ups and Downs, etc., some with a morality motif; Chutes and Ladders was published by Milton Bradley beginning in 1943.
  • 10.3K
  • 09 Oct 2022
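To illustrate the "race based on sheer luck" described above, here is a small Monte Carlo sketch in Python of a single-player run to the final square; the 100-square board and the snake/ladder positions are hypothetical examples, not taken from the entry.

```python
# Illustrative Monte Carlo sketch of a Snakes and Ladders race; the board size
# and the jump positions below are hypothetical, not from the entry.
import random

JUMPS = {                      # start square -> end square
    4: 25, 13: 46, 33: 49, 42: 63, 50: 69, 62: 81, 74: 92,           # ladders (climb up)
    27: 5, 40: 3, 43: 18, 54: 31, 66: 45, 76: 58, 89: 53, 99: 41,    # snakes (slide down)
}

def play(n_squares=100):
    """Return the number of die rolls one player needs to reach the final square."""
    pos, rolls = 0, 0
    while pos < n_squares:
        rolls += 1
        roll = random.randint(1, 6)
        if pos + roll <= n_squares:        # overshooting the last square wastes the turn
            pos = JUMPS.get(pos + roll, pos + roll)
    return rolls

games = [play() for _ in range(10_000)]
print("average rolls to finish:", sum(games) / len(games))
```

Because every move is determined entirely by the die and the fixed board layout, no player decision affects the outcome, which is what makes the game a pure game of chance.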
Topic Review
Root-Finding Algorithm
In mathematics and computing, a root-finding algorithm is an algorithm for finding zeroes, also called "roots", of continuous functions. A zero of a function f, from the real numbers to the real numbers or from the complex numbers to the complex numbers, is a number x such that f(x) = 0. Since, in general, the zeroes of a function cannot be computed exactly or expressed in closed form, root-finding algorithms provide approximations to zeroes, expressed either as floating-point numbers or as small isolating intervals (or disks for complex roots); an interval or disk output is equivalent to an approximate output together with an error bound. Solving an equation f(x) = g(x) is the same as finding the roots of the function h(x) = f(x) – g(x), so root-finding algorithms allow solving any equation defined by continuous functions. However, most root-finding algorithms do not guarantee that they will find all the roots; in particular, if such an algorithm does not find any root, that does not mean that no root exists. Most numerical root-finding methods use iteration, producing a sequence of numbers that hopefully converges towards a root as a limit. They require one or more initial guesses of the root as starting values; each iteration of the algorithm then produces a successively more accurate approximation to the root. Since the iteration must be stopped at some point, these methods produce an approximation to the root, not an exact solution. Many methods compute subsequent values by evaluating an auxiliary function on the preceding values. The limit is thus a fixed point of the auxiliary function, which is chosen for having the roots of the original equation as fixed points and for converging rapidly to these fixed points. The behaviour of general root-finding algorithms is studied in numerical analysis; for polynomials, however, root-finding belongs generally to computer algebra, since algebraic properties of polynomials are fundamental for the most efficient algorithms. The efficiency of an algorithm may depend dramatically on the characteristics of the given functions; for example, many algorithms use the derivative of the input function, while others work on every continuous function. For polynomials, there are specific algorithms that use algebraic properties to certify that no root is missed and to locate the roots in separate intervals (or disks for complex roots) small enough to ensure the convergence of numerical methods (typically Newton's method) to the unique root so located.
  • 10.2K
  • 14 Oct 2022
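As a concrete instance of the iterative, approximation-producing methods described above, here is a minimal Python sketch of the bisection method, which assumes only a continuous function with a sign change on the starting interval; the example function and interval are illustrative.

```python
# Minimal sketch of the bisection method: repeatedly halve an isolating
# interval [a, b] on which the continuous function f changes sign.
def bisect(f, a, b, tol=1e-10, max_iter=200):
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = 0.5 * (a + b)              # midpoint of the current isolating interval
        fm = f(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m                   # approximate root within tolerance
        if fa * fm < 0:
            b, fb = m, fm              # the root lies in the left half
        else:
            a, fa = m, fm              # the root lies in the right half
    return 0.5 * (a + b)

# Example: approximate sqrt(2) as the positive root of x**2 - 2.
print(bisect(lambda x: x * x - 2, 0.0, 2.0))
```

The returned midpoint, together with the final interval half-width, is exactly the "approximation plus error bound" form of output mentioned in the entry.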
Topic Review
Smart Clothing
Smart clothing can be defined as the intelligent system that senses and reacts to the changes and stimuli of the environment and the wearer’s conditions, such as electrical, thermal and magnetic ones. Smart clothing has various functions (e.g., protection, temperature regulation, monitoring, entertainment, expression of personality, etc.) and embodies many features (e.g., efficient, intelligent, computable, etc.), combining cutting-edge technologies in related fields such as electronic information, sensors and materials. Smart clothing has emerged to meet consumers’ personalized needs in healthcare, work, entertainment, etc., and has rapidly become a hotspot in the clothing industry and research field. However, as smart clothing gets popular, sustainability issues are becoming increasingly prominent during its development and circulation. 
  • 10.2K
  • 19 Apr 2022
Topic Review
Government Censorship of Telegram Messenger
The Telegram Messenger application has been blocked by multiple countries.
  • 9.7K
  • 14 Nov 2022
Topic Review
Well-Defined
In mathematics, an expression is called well-defined or unambiguous if its definition assigns it a unique interpretation or value. Otherwise, the expression is said to be not well-defined, ill-defined or ambiguous. A function is well-defined if it gives the same result when the representation of the input is changed without changing the value of the input. For instance, if f takes real numbers as input, and if f(0.5) does not equal f(1/2), then f is not well-defined (and thus not a function). The term well-defined can also be used to indicate that a logical expression is unambiguous or uncontradictory. A function that is not well-defined is not the same as a function that is undefined. For example, if f(x) = 1/x, then the fact that f(0) is undefined does not mean that f is not well-defined; rather, 0 is simply not in the domain of f.
  • 9.0K
  • 01 Dec 2022
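A small Python sketch, not taken from the entry, that mirrors the representation-dependence point above: a rule on the integers modulo 4 is well-defined only if its result depends on the residue class rather than on the particular representative chosen.

```python
# Illustrative sketch: maps on Z/4Z defined by choosing an integer representative.
# g is well-defined because its result depends only on the residue class;
# h is not, because two representatives of the same class give different results.
def g(n):            # intended rule: g([n]) = [2 * n] in Z/4Z  -> well-defined
    return (2 * n) % 4

def h(n):            # intended rule: h([n]) = n // 4            -> depends on the representative
    return n // 4

print(g(1), g(5))    # 2 2  -> same class (1 = 5 mod 4), same value
print(h(1), h(5))    # 0 1  -> same class, different values: h is not well-defined on Z/4Z
```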
Topic Review
Decimation (Signal Processing)
In digital signal processing, decimation is the process of reducing the sampling rate of a signal. The term downsampling usually refers to one step of the process, but sometimes the terms are used interchangeably. Complementary to upsampling, which increases the sampling rate, decimation is a specific case of sample rate conversion in a multi-rate digital signal processing system. A system component that performs decimation is called a decimator. When decimation is performed on a sequence of samples of a signal or other continuous function, it produces an approximation of the sequence that would have been obtained by sampling the signal at a lower rate (or density, as in the case of a photograph). The decimation factor is usually an integer or a rational fraction greater than one; this factor multiplies the sampling interval or, equivalently, divides the sampling rate. For example, if compact disc audio at 44,100 samples/second is decimated by a factor of 5/4, the resulting sample rate is 35,280.
  • 8.9K
  • 23 Nov 2022
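The following Python sketch, using an illustrative test tone rather than real CD audio, shows integer-factor decimation with an anti-aliasing filter as well as the entry's rational-factor example of 5/4 via SciPy's polyphase resampler.

```python
# Minimal sketch of decimation; the signal and the factors below are illustrative.
import numpy as np
from scipy import signal

fs = 44_100                                   # original sampling rate (CD audio)
t = np.arange(0, 0.1, 1 / fs)
x = np.sin(2 * np.pi * 1_000 * t)             # 1 kHz test tone

# Integer-factor decimation: anti-alias low-pass filter, then keep every 5th sample.
y_int = signal.decimate(x, 5)                 # new rate: 44100 / 5 = 8820 samples/s

# Rational-factor decimation by 5/4, as in the entry's example:
# upsample by 4, filter, downsample by 5 -> 44100 * 4 / 5 = 35280 samples/s.
y_rat = signal.resample_poly(x, up=4, down=5)

print(len(x), len(y_int), len(y_rat))
```

The low-pass filtering step matters: simply discarding samples would alias any content above the new Nyquist frequency back into the retained band.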
Topic Review
Burden of Proof
The burden of proof (Latin: onus probandi, shortened from Onus probandi incumbit ei qui dicit, non ei qui negat) is the obligation on a party in a dispute to provide sufficient warrant for its position.
  • 8.6K
  • 25 Nov 2022