Topic Review
Agent-Based Programming
The study of intelligent and autonomous agents is a subarea of symbolic artificial intelligence in which agents decide, either reactively or proactively, on a course of action by reasoning about the information available about the world (including the environment, the agent itself, and other agents). It encompasses a multitude of techniques, such as negotiation protocols, agent simulation, multi-agent argumentation, multi-agent planning, and many others. In an agent-based programming language, agents are the building blocks, and programs are obtained by programming their behaviours (how an agent reasons), their goals (what an agent aims to achieve), and their interoperation (how agents collaborate to solve a task).
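A minimal sketch of the reactive/proactive distinction, written in plain Python rather than any particular agent-based programming language (the class name, percepts, and decision rules below are hypothetical):

```python
# Illustrative agent sketch: beliefs (information about the world), goals,
# a reactive rule, and proactive goal-directed reasoning. All names are made up.

class Agent:
    def __init__(self, goals):
        self.beliefs = {}          # what the agent currently knows about the world
        self.goals = list(goals)   # what the agent aims to achieve

    def perceive(self, percept):
        # Update beliefs from newly available information about the environment.
        self.beliefs.update(percept)

    def decide(self):
        # Reactive behaviour: respond immediately to a perceived situation.
        if self.beliefs.get("obstacle_ahead"):
            return "avoid_obstacle"
        # Proactive behaviour: otherwise pursue the first goal not yet achieved.
        for goal in self.goals:
            if not self.beliefs.get(f"{goal}_achieved"):
                return f"work_towards_{goal}"
        return "idle"

agent = Agent(goals=["deliver_package"])
agent.perceive({"obstacle_ahead": False})
print(agent.decide())   # -> "work_towards_deliver_package"
```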
  • 2.4K
  • 08 Mar 2021
Topic Review
Dragon King Theory
Dragon king (DK) is a double metaphor for an event that is both extremely large in size or impact (a "king") and of unique origin (a "dragon") relative to its peers (other events from the same system). DK events are generated by mechanisms such as positive feedback, tipping points, bifurcations, and phase transitions, which tend to occur in nonlinear and complex systems and which amplify events to extreme levels. By understanding and monitoring these dynamics, some predictability of such events may be obtained. The theory was developed by Didier Sornette, who hypothesizes that many of the crises we face are in fact dragon kings rather than black swans, i.e., they may be predictable to some degree. Given the importance of crises to the long-term organization of a variety of systems, the DK theory urges that special attention be given to the study and monitoring of extremes, and that a dynamic view be taken. From a scientific viewpoint, such extremes are interesting because they may reveal underlying, often hidden, organizing principles. Practically speaking, one should study extreme risks without forgetting that significant uncertainty will almost always be present and should be rigorously considered in decisions regarding risk management and design. The theory of DK is related to concepts such as black swan theory, outliers, complex systems, nonlinear dynamics, power laws, extreme value theory, prediction, extreme risks, and risk management.
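One simple way to illustrate the idea of a dragon king as an outlier to a power-law tail is to compare each event against a power law fitted to the bulk of the distribution. The sketch below is purely illustrative (synthetic data, an arbitrary threshold factor, and a plain rank-size fit), not Sornette's actual detection methodology:

```python
# Illustrative sketch: flag candidate "dragon kings" as events lying far above
# the power-law (straight-line) fit of the empirical tail on a log-log
# rank-size plot. Data, seed, and the factor 5.0 are assumptions.
import numpy as np

rng = np.random.default_rng(0)
events = rng.pareto(2.0, 1000) + 1.0      # heavy-tailed background events
events = np.append(events, 500.0)         # one anomalously large injected event

sizes = np.sort(events)[::-1]             # rank-size ordering (largest first)
ranks = np.arange(1, sizes.size + 1)

# Fit log(rank) vs log(size) on the bulk of the tail, excluding the top points.
slope, intercept = np.polyfit(np.log(sizes[5:200]), np.log(ranks[5:200]), 1)
expected_size = np.exp((np.log(ranks) - intercept) / slope)

# Flag events whose size greatly exceeds the power-law expectation at their rank.
dragon_kings = sizes[sizes > 5.0 * expected_size]
print(dragon_kings)                        # expected to flag the injected event
```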
  • 2.4K
  • 23 Oct 2022
Topic Review
Applications of VR
Applications of VR (also known as virtual reality) can be found in fields as diverse as entertainment, marketing, education, medicine, and many others. They provide numerous possibilities for users to explore virtual environments for various purposes.
  • 2.4K
  • 01 Dec 2022
Topic Review
Trigonometry
Trigonometry (from Ancient Greek τρίγωνον (trígōnon) 'triangle' and μέτρον (métron) 'measure') is a branch of mathematics that studies relationships between side lengths and angles of triangles. The field emerged in the Hellenistic world during the 3rd century BC from applications of geometry to astronomical studies. The Greeks focused on the calculation of chords, while mathematicians in India created the earliest-known tables of values for trigonometric ratios (also called trigonometric functions) such as the sine. Throughout history, trigonometry has been applied in areas such as geodesy, surveying, celestial mechanics, and navigation. Trigonometry is known for its many identities. These trigonometric identities are commonly used to rewrite trigonometric expressions with the aim of simplifying an expression, finding a more useful form of an expression, or solving an equation.
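As a worked illustration of using an identity to solve an equation (the particular equation is chosen here only for exposition), the Pythagorean identity turns a mixed equation in sine and cosine into a quadratic in the sine:

```latex
% Solve 2cos^2 x + sin x = 1 by rewriting cos^2 x via sin^2 x + cos^2 x = 1.
\begin{align*}
  2\cos^2 x + \sin x = 1
    &\;\Longrightarrow\; 2\bigl(1 - \sin^2 x\bigr) + \sin x = 1 \\
    &\;\Longrightarrow\; 2\sin^2 x - \sin x - 1 = 0
     \;\Longrightarrow\; (2\sin x + 1)(\sin x - 1) = 0 \\
    &\;\Longrightarrow\; \sin x = -\tfrac{1}{2}
     \quad\text{or}\quad \sin x = 1 .
\end{align*}
```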
  • 2.4K
  • 31 Oct 2022
Topic Review
Addiction and Spending in Gacha Games
Gacha games are among the most dominant games on the mobile market. These are free-to-play games with a lottery-like system, where the user pays with in-game currency to enter a draw in order to obtain the character or item they want. If players do not obtain what they hoped for, they have the option of paying with their own money for more draws, and this is the main way a gacha game is monetized.
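The spending dynamic can be made concrete with a small simulation. The drop rate and price per draw below are assumptions chosen for illustration and do not come from any particular game:

```python
# Illustrative sketch of gacha economics under assumed numbers: a 1% drop rate
# and a fixed price per draw. With rate p the expected number of draws is 1/p.
import random

DROP_RATE = 0.01        # assumed probability of obtaining the desired item per draw
PRICE_PER_DRAW = 2.5    # assumed cost of one draw in real currency

def draws_until_success(rng):
    draws = 0
    while True:
        draws += 1
        if rng.random() < DROP_RATE:
            return draws

rng = random.Random(42)
trials = [draws_until_success(rng) for _ in range(10_000)]
avg_draws = sum(trials) / len(trials)
print(f"average draws: {avg_draws:.1f}, "
      f"average spend: {avg_draws * PRICE_PER_DRAW:.2f}")
# At a 1% rate, roughly 100 draws (about 250 currency units) are needed on
# average for a single desired item, which is where spending pressure arises.
```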
  • 2.3K
  • 21 Jul 2023
Topic Review
Video Data Augmentation for Deep Learning Models
In most Computer Vision applications, Deep Learning models achieve state-of-the-art performance. One drawback of Deep Learning is the large amount of data needed to train the models. Unfortunately, in many applications, data are difficult or expensive to collect. Data augmentation can alleviate the problem by generating new data from a smaller initial dataset. Geometric and color-space image augmentation methods can increase the accuracy of Deep Learning models but are often not enough. More advanced solutions are Domain Randomization methods or the use of simulation to artificially generate the missing data. Data augmentation algorithms are usually designed specifically for single images. Most recently, Deep Learning models have been applied to the analysis of video sequences.
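A key difference when moving from images to videos is that a sampled transform should usually be applied consistently to every frame of a clip. The sketch below illustrates this with plain NumPy, assuming clips are stored as arrays of shape (frames, height, width, channels); the specific transforms and parameter ranges are arbitrary examples:

```python
# Minimal sketch: apply the same randomly sampled geometric (horizontal flip)
# and color-space (brightness) transform to every frame of a video clip, so the
# augmentation stays temporally consistent.
import numpy as np

def augment_clip(clip, rng):
    flip = rng.random() < 0.5                    # geometric transform, sampled once per clip
    brightness = rng.uniform(0.7, 1.3)           # color-space transform, sampled once per clip
    out = clip.astype(np.float32)
    if flip:
        out = out[:, :, ::-1, :]                 # flip the width axis of every frame
    out = np.clip(out * brightness, 0, 255)
    return out.astype(np.uint8)

rng = np.random.default_rng(0)
clip = rng.integers(0, 256, size=(16, 64, 64, 3), dtype=np.uint8)  # dummy 16-frame clip
augmented = augment_clip(clip, rng)
print(augmented.shape)                           # (16, 64, 64, 3)
```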
  • 2.3K
  • 25 Mar 2022
Topic Review
Forensic Statistics
Forensic statistics is the application of probability models and statistical techniques to scientific evidence, such as DNA evidence, and the law. In contrast to "everyday" statistics, and in order not to engender bias or draw undue conclusions, forensic statisticians report likelihoods as likelihood ratios (LR). This ratio of probabilities is then used by juries or judges to draw inferences or conclusions and decide legal matters. Jurors and judges rely on the strength of a DNA match, given by statistics, to make conclusions and determine guilt or innocence in legal matters. In forensic science, the DNA evidence received for DNA profiling often contains a mixture of more than one person's DNA. DNA profiles are generated using a set procedure; however, the interpretation of a DNA profile becomes more complicated when the sample contains a mixture of DNA. Regardless of the number of contributors to the forensic sample, statistics and probabilities must be used to provide weight to the evidence and to describe what the results of the DNA evidence mean. In a single-source DNA profile, the statistic used is termed a random match probability (RMP). RMPs can also be used in certain situations to describe the results of the interpretation of a DNA mixture. Other statistical tools to describe DNA mixture profiles include likelihood ratios (LR) and combined probability of inclusion (CPI), also known as random man not excluded (RMNE). Computer programs have been implemented with forensic DNA statistics for assessing the biological relationships between two or more people. Forensic science uses several approaches for DNA statistics with computer programs, such as match probability, exclusion probability, likelihood ratios, Bayesian approaches, and paternity and kinship testing. Although the precise origin of the term remains unclear, it is apparent that it was used in the 1980s and 1990s. Among the first forensic statistics conferences were two held in 1991 and 1993.
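The random match probability calculation can be illustrated with a small, purely hypothetical example. The allele frequencies and number of loci below are made up; the genotype-frequency formulas (2pq for heterozygotes, p² for homozygotes) assume Hardy-Weinberg equilibrium:

```python
# Illustrative sketch: RMP as the product of per-locus genotype frequencies,
# and the corresponding likelihood ratio for a single-source match.
loci = [
    # (allele frequency p, allele frequency q, heterozygous genotype?)
    (0.10, 0.20, True),
    (0.15, 0.15, False),
    (0.05, 0.30, True),
]

rmp = 1.0
for p, q, heterozygous in loci:
    # Genotype frequency under Hardy-Weinberg: 2pq if heterozygous, p^2 if homozygous.
    rmp *= 2 * p * q if heterozygous else p * p

# LR = P(evidence | suspect is the source) / P(evidence | unrelated random person is the source)
likelihood_ratio = 1.0 / rmp
print(f"RMP = {rmp:.3e}, LR = {likelihood_ratio:.3e}")
```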
  • 2.3K
  • 12 Oct 2022
Topic Review
Incompressible Navier-Stokes Equations
A problem closely related to the Clay Mathematics Institute's "Navier-Stokes existence and smoothness" problem, namely the breakdown of smooth solutions on an arbitrary cube subset of three-dimensional space with periodic boundary conditions, is examined. The incompressible Navier-Stokes equations are presented here in a new and conventionally different way, by naturally reducing them to an operator form which is then further analyzed. It is shown that a reduction to a general 2D Navier-Stokes system decoupled from a 1D non-linear partial differential equation can be obtained. This is executed using integration over n-dimensional compact intervals, which allows the decoupling. The measure-zero points in the domain where singularities may occur are extracted, leaving a PDE that exhibits finite-time singularity. The operator form is considered in a physical geometric vorticity case and in a more general case. In the general case, the solutions are revealed to be smooth yet to exhibit finite-time blowup on a fine measure-zero set, and, using the Gagliardo-Nirenberg inequalities, it is shown that for any non-zero-measure set in the form of a cube subset of 3D space there is finite-time blowup.
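For reference, the incompressible Navier-Stokes equations discussed above, written in their standard form on a periodic cube (the notation here is the conventional one, not the operator form used in the entry):

```latex
% Incompressible Navier-Stokes equations on \Omega = [0, L]^3 with periodic
% boundary conditions; u is the velocity field, p the pressure, \rho the
% density, \nu the kinematic viscosity, and f an external forcing term.
\begin{align*}
  \partial_t \mathbf{u} + (\mathbf{u} \cdot \nabla)\mathbf{u}
    &= -\frac{1}{\rho}\,\nabla p + \nu\,\Delta \mathbf{u} + \mathbf{f}, \\
  \nabla \cdot \mathbf{u} &= 0 .
\end{align*}
```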
  • 2.3K
  • 10 Dec 2020
Topic Review
Power Architecture
Power Architecture is a registered trademark for similar reduced instruction set computing (RISC) instruction sets for microprocessors developed and manufactured by such companies as IBM, Freescale/NXP, AppliedMicro, LSI, Teledyne e2v and Synopsys. The governing body is Power.org, comprising over 40 companies and organizations. "Power Architecture" is a broad term including all products based on newer POWER, PowerPC and Cell processors. The term "Power Architecture" should not be confused with IBM's different generations of "POWER Instruction Set Architecture", an earlier instruction set for IBM RISC processors of the 1990s from which the PowerPC instruction set was derived. Power Architecture is a family name describing processor architecture, software, toolchain, community and end-user appliances and not a strict term describing specific products or technologies. More details and documentation on the Power Architecture can be found on the IBM Portal for OpenPOWER.
  • 2.3K
  • 18 Nov 2022
Topic Review
Blockchain Technology: Research and Applications
Blockchain, a leading technology of the 21st century, is revolutionizing every sector of life. Services are being provided and upgraded using its salient features and fruitful characteristics. Businesses are being enhanced by using this technology. Countries are shifting towards digital currencies, an early application of blockchain. Blockchain removes the need for a central authority through its distributed ledger functionality. The distributed ledger is achieved by using a consensus mechanism, so a consensus algorithm plays a core role in the implementation of blockchain, and any application implementing blockchain uses a consensus algorithm to achieve its desired task.
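One widely known consensus mechanism, proof of work, can be sketched in a few lines: nodes search for a nonce that makes the block hash satisfy a difficulty target, and other nodes can verify the result cheaply. This is a toy illustration of the general idea, not the consensus algorithm of any specific blockchain:

```python
# Toy proof-of-work sketch: find a nonce such that SHA-256 of the block
# contents starts with a required number of leading zeros.
import hashlib

def mine(previous_hash: str, transactions: str, difficulty: int = 4) -> tuple[int, str]:
    prefix = "0" * difficulty                   # difficulty target: leading zeros
    nonce = 0
    while True:
        payload = f"{previous_hash}|{transactions}|{nonce}".encode()
        digest = hashlib.sha256(payload).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest                # proof found; verification is a single hash
        nonce += 1

nonce, block_hash = mine("0" * 64, "alice->bob:5;bob->carol:2")
print(nonce, block_hash)
```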
  • 2.3K
  • 09 Dec 2021