Topic Review
Persuasive Technology
Persuasive technology is broadly defined as technology that is designed to change the attitudes or behaviors of users through persuasion and social influence, but not through coercion. Such technologies are regularly used in sales, diplomacy, politics, religion, military training, public health, and management, and may potentially be used in any area of human-human or human-computer interaction. Most self-identified persuasive technology research focuses on interactive, computational technologies, including desktop computers, Internet services, video games, and mobile devices, but the field incorporates and builds on the results, theories, and methods of experimental psychology, rhetoric, and human-computer interaction. The design of persuasive technologies can be seen as a particular case of design with intent.
4.1K
28 Sep 2022
Topic Review
DualShock
The DualShock (originally Dual Shock; trademarked as DUALSHOCK or DUAL SHOCK; with the PlayStation 5 version named DualSense) is a line of gamepads with vibration feedback and analog controls developed by Sony Interactive Entertainment for the PlayStation family of systems. Introduced in November 1997, it was initially marketed as a secondary peripheral for the original PlayStation. After updated versions of the PlayStation console began including the DualShock, Sony phased out the controller originally bundled with the console, known as the PlayStation controller, as well as the Sony Dual Analog Controller. The DualShock is the best-selling gamepad of all time in terms of units sold, excluding bundled controllers.
4.1K
24 Nov 2022
Topic Review
Matthews Correlation Coefficient
The Matthews correlation coefficient (MCC) or phi coefficient is used in machine learning as a measure of the quality of binary (two-class) classifications; it was introduced by the biochemist Brian W. Matthews in 1975. The MCC is defined identically to Pearson's phi coefficient, introduced by Karl Pearson, and is also known as the Yule phi coefficient from its introduction by Udny Yule in 1912. Despite these antecedents, which predate Matthews's use by several decades, the term MCC is widely used in the fields of bioinformatics and machine learning.

The coefficient takes into account true and false positives and negatives and is generally regarded as a balanced measure that can be used even if the classes are of very different sizes. The MCC is in essence a correlation coefficient between the observed and predicted binary classifications; it returns a value between −1 and +1. A coefficient of +1 represents a perfect prediction, 0 a prediction no better than random, and −1 total disagreement between prediction and observation. However, when the MCC is not exactly −1, 0, or +1, it is not by itself a reliable indicator of how close a predictor is to random guessing, because the MCC depends on the dataset. The MCC is closely related to the chi-square statistic for a 2×2 contingency table: $|\mathrm{MCC}| = \sqrt{\chi^2 / n}$, where $n$ is the total number of observations. While there is no perfect way of describing the confusion matrix of true and false positives and negatives by a single number, the Matthews correlation coefficient is generally regarded as one of the best such measures. Other measures, such as the proportion of correct predictions (also termed accuracy), are not useful when the two classes are of very different sizes: assigning every object to the larger class achieves a high proportion of correct predictions but is not generally a useful classification.

The MCC can be calculated directly from the confusion matrix using the formula

$$\mathrm{MCC} = \frac{TP \cdot TN - FP \cdot FN}{\sqrt{(TP+FP)\,(TP+FN)\,(TN+FP)\,(TN+FN)}}.$$

In this equation, TP is the number of true positives, TN the number of true negatives, FP the number of false positives, and FN the number of false negatives. If any of the four sums in the denominator is zero, the denominator can be arbitrarily set to one; this results in a Matthews correlation coefficient of zero, which can be shown to be the correct limiting value. The MCC can also be calculated as

$$\mathrm{MCC} = \sqrt{PPV \cdot TPR \cdot TNR \cdot NPV} - \sqrt{FDR \cdot FNR \cdot FPR \cdot FOR},$$

using the positive predictive value (PPV), the true positive rate (TPR), the true negative rate (TNR), the negative predictive value (NPV), the false discovery rate (FDR), the false negative rate (FNR), the false positive rate (FPR), and the false omission rate (FOR). The original formula as given by Matthews was

$$\mathrm{MCC} = \frac{TP/N - S \times P}{\sqrt{P\,S\,(1-S)\,(1-P)}}, \qquad N = TN+TP+FN+FP, \quad S = \frac{TP+FN}{N}, \quad P = \frac{TP+FP}{N},$$

which is equal to the formula given above.

As a correlation coefficient, the Matthews correlation coefficient is the geometric mean of the regression coefficients of the problem and its dual. The component regression coefficients of the Matthews correlation coefficient are markedness (Δp) and Youden's J statistic (informedness, or Δp′). Markedness and informedness correspond to different directions of information flow and generalize Youden's J statistic, the Δp statistics, and (as their geometric mean) the Matthews correlation coefficient to more than two classes. Some scientists claim the Matthews correlation coefficient to be the most informative single score for establishing the quality of a binary classifier prediction in a confusion-matrix context.
4.1K
18 Nov 2022
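A minimal computational sketch of the confusion-matrix formula in the Matthews Correlation Coefficient entry above, written in Python; the function name and the example counts are illustrative assumptions rather than anything from the entry.

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-matrix counts."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # If any of the four sums is zero, set the denominator to one,
    # giving the limiting value MCC = 0 mentioned in the entry.
    if denom == 0:
        denom = 1
    return (tp * tn - fp * fn) / denom

# With heavily imbalanced classes, predicting the majority class everywhere
# gives 95% accuracy but an uninformative MCC of 0.
print(mcc(tp=0, tn=95, fp=0, fn=5))   # 0.0
print(mcc(tp=5, tn=95, fp=0, fn=0))   # 1.0 (perfect prediction)
```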
Topic Review
Closed Graph
In mathematics, particularly in functional analysis and topology, closed graph is a property of functions. A function f : X → Y between topological spaces has a closed graph if its graph is a closed subset of the product space X × Y. A related property is open graph. This property is studied because there are many theorems, known as closed graph theorems, giving conditions under which a function with a closed graph is necessarily continuous. One particularly well-known class of closed graph theorems consists of the closed graph theorems in functional analysis.
4.0K
18 Oct 2022
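A standard counterexample, not taken from the Closed Graph entry itself, showing why closed graph theorems need hypotheses beyond closedness of the graph: a function can have a closed graph and still fail to be continuous.

```latex
f : \mathbb{R} \to \mathbb{R}, \qquad
f(x) = \begin{cases} 1/x, & x \neq 0, \\ 0, & x = 0, \end{cases}
\qquad
\operatorname{Graph}(f) = \{(x, 1/x) : x \neq 0\} \cup \{(0, 0)\}.
```

The graph is a closed subset of the plane (it is the hyperbola xy = 1 together with the origin), yet f is not continuous at 0.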
Topic Review
False Position Method
In mathematics, the false position method or regula falsi is a very old method for solving an equation in one unknown that, in modified form, is still in use. In simple terms, the method is the trial-and-error technique of using test ("false") values for the variable and then adjusting the test value according to the outcome. This is sometimes also referred to as "guess and check". Versions of the method predate the advent of algebra and the use of equations. As an example, consider problem 26 in the Rhind papyrus, which asks for a solution of (written in modern notation) the equation x + x/4 = 15. This is solved by false position, using a technique that predates formally written equations. First, guess that x = 4 to obtain, on the left, 4 + 4/4 = 5. This guess is a good choice since it produces an integer value. However, 4 is not the solution of the original equation, as it gives a value which is three times too small. To compensate, multiply x (currently set to 4) by 3 and substitute again to get 12 + 12/4 = 15, verifying that the solution is x = 12. Modern versions of the technique employ systematic ways of choosing new test values and are concerned with the questions of whether or not an approximation to a solution can be obtained and, if it can, how fast the approximation can be found.
4.0K
15 Nov 2022
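A minimal sketch of a modern regula falsi iteration, applied to the Rhind papyrus example from the False Position Method entry above, rewritten as f(x) = x + x/4 − 15 = 0. The function name, bracketing interval, and tolerance are illustrative assumptions, not details from the entry.

```python
def regula_falsi(f, a, b, tol=1e-10, max_iter=100):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must bracket a root")
    for _ in range(max_iter):
        # The secant through (a, f(a)) and (b, f(b)) gives the next "false position".
        c = b - fb * (b - a) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            return c
        # Keep the sub-interval that still brackets the root.
        if fa * fc < 0:
            b, fb = c, fc
        else:
            a, fa = c, fc
    return c

print(regula_falsi(lambda x: x + x / 4 - 15, 0, 20))  # 12.0
```

Because the example equation is linear, the very first false position already lands on the exact solution x = 12.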
Topic Review
Adaptive Clustering
The paper is devoted to an overview of multi-agent principles, methods, and technologies intended for adaptive real-time data clustering. The proposed methods provide new principles of self-organization of records and clusters, represented by software agents, making it possible to increase the adaptability of different clustering processes significantly. The paper also presents a comparative review of the methods and results recently developed in this area and their industrial applications. The ability of items and clusters to self-organize suggests a new way to form groups in a bottom-up, online fashion, together with continuous adaptation of previously obtained decisions. Multi-agent technology allows this methodology to be implemented in a parallel and asynchronous multi-threaded manner, providing highly flexible, scalable, and reliable solutions. Industrial applications of the approach, intended for solving overly complex engineering problems, are discussed together with several practical examples of data clustering in manufacturing applications, such as the pre-analysis of customer datasets in the sales process, pattern discovery, ongoing forecasting and consolidation of orders and resources in logistics, and clustering of semantic networks in insurance document processing. Future research is outlined in areas such as capturing the semantics of problem domains and guided self-organization on the virtual market.
4.0K
08 Feb 2021
Topic Review
Concept Learning
Concept learning, also known as category learning, concept attainment, and concept formation, is defined by Bruner, Goodnow, & Austin (1967) as "the search for and listing of attributes that can be used to distinguish exemplars from non exemplars of various categories". More simply put, concepts are the mental categories that help us classify objects, events, or ideas, building on the understanding that each object, event, or idea has a set of common relevant features. Thus, concept learning is a strategy which requires a learner to compare and contrast groups or categories that contain concept-relevant features with groups or categories that do not contain concept-relevant features. In a concept learning task, a human or machine learner is trained to classify objects by being shown a set of example objects along with their class labels. The learner simplifies what has been observed by condensing it in the form of an example. This simplified version of what has been learned is then applied to future examples. Concept learning may be simple or complex because learning takes place over many areas. When a concept is difficult, it is less likely that the learner will be able to simplify, and therefore will be less likely to learn. Colloquially, the task is known as learning from examples. Most theories of concept learning are based on the storage of exemplars and avoid summarization or overt abstraction of any kind.
4.0K
17 Oct 2022
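A minimal sketch of the exemplar-storage view described in the Concept Learning entry above: the learner keeps labeled example objects and classifies a new object by its closest stored exemplar. The numeric feature encoding and the distance function are illustrative assumptions, not a specific model from the concept learning literature.

```python
def nearest_exemplar(exemplars, query):
    """Classify `query` by the label of the closest stored exemplar.

    `exemplars` is a list of (feature_vector, label) pairs; distance is
    squared Euclidean distance over the feature vectors.
    """
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))

    _, label = min(exemplars, key=lambda ex: dist(ex[0], query))
    return label

# Toy "concept": small round objects are category A, large elongated ones are B.
exemplars = [((1.0, 1.0), "A"), ((1.2, 0.9), "A"),
             ((4.0, 9.0), "B"), ((4.5, 8.5), "B")]
print(nearest_exemplar(exemplars, (1.1, 1.2)))  # A
print(nearest_exemplar(exemplars, (4.2, 9.1)))  # B
```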
Topic Review
Windows 10 Version History (Version 1607)
The Windows 10 Anniversary Update (also known as version 1607 and codenamed "Redstone 1") is the second major update to Windows 10 and the first in a series of updates under the Redstone codenames. It carries the build number 10.0.14393.
4.0K
20 Oct 2022
Topic Review
Salient Object Detection
Detection and localization of regions of images that attract immediate human visual attention is currently an intensive area of research in computer vision. The capability of automatic identification and segmentation of such salient image regions has immediate consequences for applications in the fields of computer vision, computer graphics, and multimedia. A large number of salient object detection (SOD) methods have been devised to effectively mimic the capability of the human visual system to detect salient regions in images. These methods can be broadly divided into two categories based on their feature engineering mechanism: conventional and deep learning-based. In this survey, most of the influential advances in image-based SOD from both the conventional and deep learning-based categories are reviewed in detail. Relevant saliency modeling trends with key issues, core techniques, and the scope for future research work are discussed in the context of difficulties often faced in salient object detection. Results are presented for various challenging cases on some large-scale public datasets. Different metrics considered for assessment of the performance of state-of-the-art salient object detection models are also covered. Some future directions for SOD are presented towards the end.
4.0K
19 Nov 2020
Topic Review
Limit Superior and Limit Inferior
In mathematics, the limit inferior and limit superior of a sequence can be thought of as limiting (i.e., eventual and extreme) bounds on the sequence. They can be thought of in a similar fashion for a function (see limit of a function). For a set, they are the infimum and supremum of the set's limit points, respectively. In general, when there are multiple objects around which a sequence, function, or set accumulates, the inferior and superior limits extract the smallest and largest of them; the type of object and the measure of size are context-dependent, but the notion of extreme limits is invariant. Limit inferior is also called infimum limit, limit infimum, liminf, inferior limit, lower limit, or inner limit; limit superior is also known as supremum limit, limit supremum, limsup, superior limit, upper limit, or outer limit. The limit inferior of a sequence $x_n$ is denoted by $\liminf_{n\to\infty} x_n$ or $\varliminf_{n\to\infty} x_n$; the limit superior of a sequence $x_n$ is denoted by $\limsup_{n\to\infty} x_n$ or $\varlimsup_{n\to\infty} x_n$.
4.0K
30 Nov 2022
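A small worked example, not taken from the Limit Superior and Limit Inferior entry itself, showing the two quantities for an oscillating sequence:

```latex
x_n = (-1)^n \left(1 + \tfrac{1}{n}\right):
\qquad
\limsup_{n\to\infty} x_n = \lim_{n\to\infty} \sup_{m \ge n} x_m = 1,
\qquad
\liminf_{n\to\infty} x_n = \lim_{n\to\infty} \inf_{m \ge n} x_m = -1.
```

The sequence itself has no limit, since the limit superior and limit inferior disagree.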
Topic Review
Yao's Millionaires' Problem
Yao's Millionaires' problem is a secure multi-party computation problem which was introduced in 1982 by computer scientist and computational theorist Andrew Yao. The problem discusses two millionaires, Alice and Bob, who are interested in knowing which of them is richer without revealing their actual wealth. This problem is analogous to a more general problem where there are two numbers $a$ and $b$ and the goal is to determine whether the inequality $a \geq b$ is true or false without revealing the actual values of $a$ and $b$. The Millionaires' Problem is an important problem in cryptography, the solution of which is used in e-commerce and data mining. Commercial applications sometimes have to compare numbers which are confidential and whose security is important. Many solutions have been introduced for the problem, among which the first solution, presented by Yao himself, was exponential in time and space.
4.0K
15 Nov 2022
Topic Review
Graphical Projection
Graphical projection is a protocol, used in technical drawing, by which an image of a three-dimensional object is projected onto a planar surface without the aid of numerical calculation.
3.9K
07 Nov 2022
Topic Review
Criticism of Facebook
Facebook (and parent company Meta Platforms) has been the subject of criticism and legal action. Criticisms include the outsize influence Facebook has on the lives and health of its users and employees, as well as Facebook's influence on the way media, specifically news, is reported and distributed. Notable issues include Internet privacy, such as the use of a widespread "like" button on third-party websites tracking users, possible indefinite records of user information, automatic facial recognition software, and its role in the workplace, including employer-employee account disclosure. The use of Facebook can have negative psychological effects that include feelings of sexual jealousy and stress, a lack of attention, and social media addiction that in some cases is comparable to drug addiction. Facebook's operations have also received critical coverage. The company's electricity usage, tax avoidance, real-name user requirement policies, censorship policies, handling of user data, and its involvement in the United States PRISM surveillance program have been highlighted by the media and by critics. Facebook has come under scrutiny for 'ignoring' or shirking its responsibility for the content posted on its platform, including copyright and intellectual property infringement, hate speech, incitement of rape, violence against minorities, terrorism, fake news, Facebook murder, crimes, and violent incidents live-streamed through its Facebook Live functionality. The company and its employees have also been subject to litigation over the years, with the most prominent case concerning allegations that CEO Mark Zuckerberg broke an oral contract with Cameron Winklevoss, Tyler Winklevoss, and Divya Narendra to build the then-named "HarvardConnection" social network in 2004, instead allegedly opting to steal the idea and code to launch Facebook months before HarvardConnection began. The original lawsuit was eventually settled in 2009, with Facebook paying approximately $20 million in cash and 1.25 million shares. A new lawsuit in 2011 was dismissed. Some critics point to problems which they say will result in the demise of Facebook. Facebook has been banned for various reasons by several governments, including those of Syria, China, Iran, and Russia.
3.9K
14 Oct 2022
Topic Review
Stuxnet
Stuxnet is a malicious computer worm first uncovered in 2010 and thought to have been in development since at least 2005. Stuxnet targets supervisory control and data acquisition (SCADA) systems and is believed to be responsible for causing substantial damage to the nuclear program of Iran. Although neither country has openly admitted responsibility, the worm is widely understood to be a cyberweapon built jointly by the United States and Israel in a collaborative effort known as Operation Olympic Games. Stuxnet specifically targets programmable logic controllers (PLCs), which allow the automation of electromechanical processes such as those used to control machinery and industrial processes, including gas centrifuges for separating nuclear material. Exploiting four zero-day flaws, Stuxnet functions by targeting machines using the Microsoft Windows operating system and networks, then seeking out Siemens Step7 software. Stuxnet reportedly compromised Iranian PLCs, collecting information on industrial systems and causing the fast-spinning centrifuges to tear themselves apart. Stuxnet's design and architecture are not domain-specific, and it could be tailored as a platform for attacking modern SCADA and PLC systems (e.g., in factory assembly lines or power plants), most of which are in Europe, Japan, and the United States. Stuxnet reportedly ruined almost one-fifth of Iran's nuclear centrifuges. Targeting industrial control systems, the worm infected over 200,000 computers and caused 1,000 machines to physically degrade. Stuxnet has three modules: a worm that executes all routines related to the main payload of the attack; a link file that automatically executes the propagated copies of the worm; and a rootkit component responsible for hiding all malicious files and processes to prevent detection of Stuxnet. It is typically introduced to the target environment via an infected USB flash drive, thus crossing any air gap. The worm then propagates across the network, scanning for Siemens Step7 software on computers controlling a PLC. If either condition is absent, Stuxnet lies dormant inside the computer. If both conditions are fulfilled, Stuxnet introduces the infected rootkit onto the PLC and Step7 software, modifying the code and giving unexpected commands to the PLC while returning a loop of normal operation system values back to the users.
3.9K
30 Nov 2022
Topic Review
Video Games in Education
This page includes some history of video games being used as an additional or alternative method to traditional education. It presents why video games can be beneficial for educational purposes in the classroom, as well as their limitations, and it also discusses how learning from video games outside the classroom is possible.
3.7K
04 Nov 2022
Topic Review
Yan Tan Tethera
Yan Tan Tethera is a sheep-counting rhyme/system traditionally used by shepherds in Northern England and earlier in some other parts of Britain. Until the Industrial Revolution, the use of traditional number systems was common among shepherds, especially in the fells of the Lake District. The Yan Tan Tethera system was also used for counting stitches in knitting. The words derive from a Brythonic Celtic language. Though most of these number systems fell out of use by 1910, some are still in use. The word yan or yen for "one" in some northern English dialects generally represents a regular development in Northern English in which the Old English long vowel /ɑː/ was broken into /ie/, /ia/ and so on. This explains the shift to yan and ane from the Old English ān, which is itself derived from the Proto-Germanic *ainaz. Another example of this development is the Northern English word for "home", hame, which has forms such as hyem, yem and yam all deriving from the Old English hām.
3.6K
20 Nov 2022
Topic Review
Binary Scaling
Binary scaling is a computer programming technique used typically in embedded C, DSP and assembler programs to implement non-integer operations by using the native integer arithmetic of the processor.
3.5K
04 Nov 2022
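A minimal sketch of the technique described in the Binary Scaling entry above, written in Python using a Q16.16-style fixed-point format; the scale factor and helper names are illustrative assumptions (production code would typically be C or assembler on an integer-only processor).

```python
FRAC_BITS = 16          # Q16.16 format: 16 fractional bits, scale factor 2**16
SCALE = 1 << FRAC_BITS

def to_fixed(x):
    """Encode a real value as a binary-scaled integer."""
    return int(round(x * SCALE))

def to_float(x_fixed):
    """Decode a binary-scaled integer back to a real value."""
    return x_fixed / SCALE

def fixed_mul(a, b):
    # The raw product carries 32 fractional bits, so shift back down by 16.
    return (a * b) >> FRAC_BITS

a, b = to_fixed(3.25), to_fixed(-1.5)
print(to_float(a + b))            # addition needs no rescaling: 1.75
print(to_float(fixed_mul(a, b)))  # -4.875
```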
Topic Review
Bitcoin Core
Bitcoin Core is free and open-source software that serves as a bitcoin node (the set of which form the bitcoin network) and provides a bitcoin wallet which fully verifies payments. It is considered to be bitcoin's reference implementation. Initially, the software was published by Satoshi Nakamoto under the name "Bitcoin", and later renamed to "Bitcoin Core" to distinguish it from the network. For this reason, it is also known as the Satoshi client. The MIT Digital Currency Initiative funds some of the development of Bitcoin Core. The project also maintains the cryptography library libsecp256k1.
3.5K
09 Nov 2022
Topic Review
Word Lists by Frequency
Word lists by frequency are lists of a language's words grouped by frequency of occurrence within some given text corpus, either by levels or as a ranked list, serving the purpose of vocabulary acquisition. A word list by frequency "provides a rational basis for making sure that learners get the best return for their vocabulary learning effort" (Nation 1997), but is mainly intended for course writers, not directly for learners. Frequency lists are also made for lexicographical purposes, serving as a sort of checklist to ensure that common words are not left out. Some major pitfalls are the corpus content, the corpus register, and the definition of "word". While word counting is a thousand years old, with gigantic analyses still being done by hand in the mid-20th century, electronic natural language processing of large corpora such as movie subtitles (the SUBTLEX megastudy) has accelerated the research field. In computational linguistics, a frequency list is a sorted list of words (word types) together with their frequency, where frequency here usually means the number of occurrences in a given corpus, from which the rank can be derived as the position in the list.
3.5K
29 Nov 2022
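A minimal sketch of the computational-linguistics definition at the end of the Word Lists by Frequency entry above: a ranked frequency list built from a toy corpus. The tokenizer and the sample text are illustrative assumptions; real lists are built from large corpora such as the subtitle corpora the entry mentions.

```python
import re
from collections import Counter

def frequency_list(text):
    """Return (rank, word_type, count) tuples sorted by descending frequency."""
    tokens = re.findall(r"[a-z']+", text.lower())  # naive tokenizer
    ranked = Counter(tokens).most_common()
    return [(rank, word, n) for rank, (word, n) in enumerate(ranked, start=1)]

corpus = "the cat sat on the mat and the dog sat on the rug"
for rank, word, n in frequency_list(corpus)[:3]:
    print(rank, word, n)  # the most frequent word type is "the", with 4 occurrences
```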
Topic Review
Gauss's Lemma (Polynomial)
In algebra, Gauss's lemma, named after Carl Friedrich Gauss, is a statement about polynomials over the integers, or, more generally, over a unique factorization domain (that is, a ring that has a unique factorization property similar to the fundamental theorem of arithmetic). Gauss's lemma underlies all the theory of factorization and greatest common divisors of such polynomials. Gauss's lemma asserts that the product of two primitive polynomials is primitive (a polynomial with integer coefficients is primitive if it has 1 as a greatest common divisor of its coefficients). A corollary of Gauss's lemma, sometimes also called Gauss's lemma, is that a primitive polynomial is irreducible over the integers if and only if it is irreducible over the rational numbers. More generally, a primitive polynomial has the same complete factorization over the integers and over the rational numbers. In the case of coefficients in a unique factorization domain R, "rational numbers" must be replaced by "field of fractions of R". This implies that, if R is either a field, the ring of integers, or a unique factorization domain, then every polynomial ring (in one or several indeterminates) over R is a unique factorization domain. Another consequence is that factorization and greatest common divisor computation of polynomials with integer or rational coefficients may be reduced to similar computations on integers and primitive polynomials. This is systematically used (explicitly or implicitly) in all implemented algorithms (see Polynomial greatest common divisor and Factorization of polynomials). Gauss's lemma, and all its consequences that do not involve the existence of a complete factorization, remain true over any GCD domain (an integral domain over which greatest common divisors exist). In particular, a polynomial ring over a GCD domain is also a GCD domain. If one calls primitive a polynomial such that the coefficients generate the unit ideal, Gauss's lemma is true over every commutative ring. However, some care must be taken when using this definition of primitive, as, over a unique factorization domain that is not a principal ideal domain, there are polynomials that are primitive in the above sense and not primitive in this new sense.
3.5K
28 Nov 2022
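A small worked instance, not from the Gauss's Lemma entry itself, of the primitivity statement: the coefficient gcds stay equal to 1 under multiplication.

```latex
f(x) = 2x^2 + 3x + 4, \qquad g(x) = 3x + 5,
\qquad \gcd(2,3,4) = \gcd(3,5) = 1 \ \text{(both primitive)};
\\[4pt]
f(x)\, g(x) = 6x^3 + 19x^2 + 27x + 20,
\qquad \gcd(6, 19, 27, 20) = 1,
```

so the product is again primitive, as Gauss's lemma asserts.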