Topic Review
Drvspace
DriveSpace (initially known as DoubleSpace) is a disk compression utility supplied with MS-DOS starting from version 6.0 in 1993 and ending in 2000 with the release of Windows Me. The purpose of DriveSpace is to increase the amount of data the user can store on disks by transparently compressing and decompressing data on the fly. It is primarily intended for use with hard drives, but use with floppy disks is also supported. This feature was removed in Windows XP and later.
  • 609
  • 12 Oct 2022
Topic Review
Examples of Differential Equations
Differential equations arise in many problems in physics, engineering, and other sciences. The following examples show how to solve differential equations in a few simple cases when an exact solution exists.
  • 604
  • 11 Nov 2022
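As a minimal illustration of the simplest case described above (an equation with a known exact solution), the separable first-order equation $y' = -ky$ has the exact solution $y(t) = y_0 e^{-kt}$, which can be checked against a basic numerical method. This is a sketch; the function names are illustrative, not from the entry:

```python
import math

def euler(f, y0, t0, t1, n):
    """Fixed-step forward Euler integration of y' = f(t, y) from t0 to t1."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# Exponential decay y' = -k*y has the exact solution y(t) = y0 * exp(-k*t).
k, y0 = 0.5, 1.0
exact = y0 * math.exp(-k * 2.0)
approx = euler(lambda t, y: -k * y, y0, 0.0, 2.0, n=10000)
print(abs(approx - exact))  # small: Euler converges to the exact solution
```

With 10,000 steps the first-order Euler error is far below the solution's magnitude, confirming the closed-form answer.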
Topic Review
Domain Registration
Domain registration is the process of acquiring a domain name from a domain name registrar.
  • 602
  • 04 Nov 2022
Topic Review
Power (Statistics)
The power of a binary hypothesis test is the probability that the test rejects the null hypothesis ($H_0$) when a specific alternative hypothesis ($H_1$) is true. Statistical power ranges from 0 to 1, and as power increases, the probability of making a type II error (wrongly failing to reject the null hypothesis) decreases. For a type II error probability of $\beta$, the corresponding statistical power is $1 - \beta$. For example, if experiment 1 has a statistical power of 0.7 and experiment 2 has a statistical power of 0.95, then experiment 1 is more likely than experiment 2 to commit a type II error, and experiment 2 is the more reliable of the two. Power can equivalently be thought of as the probability of accepting the alternative hypothesis ($H_1$) when it is true, that is, the ability of a test to detect a specific effect if that effect actually exists.
If $H_1$ is not an equality but simply the negation of $H_0$ (so, for example, with $H_0\colon \mu = 0$ for some unobserved population parameter $\mu$, we have simply $H_1\colon \mu \ne 0$), then power cannot be calculated unless probabilities are known for all possible values of the parameter that violate the null hypothesis. One therefore generally refers to a test's power against a specific alternative hypothesis. The type II error probability is also referred to as the false negative rate ($\beta$), since power equals $1 - \beta$; the analogous concept under the null hypothesis is the type I error probability, also referred to as the "false positive rate" or the level of the test.
Power analysis can be used to calculate the minimum sample size required so that one can be reasonably likely to detect an effect of a given size. For example: "how many times do I need to toss a coin to conclude it is rigged by a certain amount?" Power analysis can also be used to calculate the minimum effect size that is likely to be detected in a study using a given sample size, and to compare different statistical testing procedures: for example, a parametric test and a nonparametric test of the same hypothesis. In the context of binary classification, the power of a test is called its statistical sensitivity, its true positive rate, or its probability of detection.
  • 601
  • 28 Oct 2022
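The coin-tossing question above can be sketched numerically with an exact one-sided binomial test: choose the critical head count under $H_0\colon p = 0.5$, then compute the rejection probability under the alternative. This is a hedged sketch; the function names and the particular alternative $p = 0.6$ are illustrative choices, not from the entry:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), summed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def coin_test_power(n, p_alt, alpha=0.05):
    """Power of a one-sided test of H0: p = 0.5 against a biased coin p = p_alt > 0.5."""
    # Critical value: smallest head count whose tail probability under H0 is <= alpha.
    c = next(k for k in range(n + 1) if binom_sf(k, n, 0.5) <= alpha)
    # Power = probability of reaching the critical region when the coin is biased.
    return binom_sf(c, n, p_alt)

# More tosses -> higher power against the same bias: this is exactly the
# sample-size question "how many tosses do I need?" turned around.
p100 = coin_test_power(100, 0.6)
p250 = coin_test_power(250, 0.6)
print(p100, p250)  # power grows with the number of tosses
```

Solving for the smallest `n` with power above a target (conventionally 0.8) answers the minimum-sample-size question posed in the entry.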
Topic Review
Attack Investigation
Attack investigation is an important research field in forensic analysis. Many existing supervised attack investigation methods rely on well-labeled data for effective training. Unsupervised approaches based on BERT can mitigate the reliance on labeled data, but the high degree of similarity between certain real-world attacks and normal behaviors makes it challenging to accurately identify disguised attacks.
  • 598
  • 08 Jan 2024
Topic Review
Resistance Database Initiative
The HIV Resistance Response Database Initiative (RDI) is a not-for-profit organisation established in 2002 with the mission of improving the clinical management of HIV infection through the application of bioinformatics to HIV drug resistance and treatment outcome data. The RDI consists of a small executive group based in the UK, an international advisory group of leading HIV/AIDS scientists and clinicians, and an extensive global network of collaborators and data contributors.
  • 594
  • 04 Nov 2022
Topic Review
Terminate and Stay Resident Program
A terminate-and-stay-resident program (commonly TSR) is a computer program running under DOS that uses a system call to return control to DOS as though it has finished, but remains in computer memory so it can be reactivated later. This technique partially overcame DOS's limitation of executing only one program, or task, at a time. TSRs are used only in DOS, not in Windows. Some TSRs are utility software that a computer user might call up several times a day, while working in another program, using a hotkey. Borland Sidekick was an early and popular example of this type. Others serve as device drivers for hardware that the operating system does not directly support.
  • 585
  • 14 Oct 2022
Topic Review
CSC Version 6.0
The Center for Internet Security Critical Security Controls Version 6.0 was released October 15, 2015.
  • 584
  • 09 Nov 2022
Topic Review
EFx Factory
The EFx Factory (Architectural-Guidance Software Factory) is an Architectural Guidance Software Factory from Microsoft, and one of the first implementations of a software factory to be built. The EFx Factory implements the .NET Distributed Architecture for Service-Oriented applications and services. The factory is based upon an Architectural Application Framework called Enterprise Framework, which describes a physical Microsoft .NET architecture leveraging Microsoft Enterprise Library and other service-oriented patterns from Microsoft patterns & practices. The EFx Factory was designed and built by development consultants within Microsoft Consulting Services in response to customer demand for an implementation of the .NET Architecture, best practices for .NET development, and guidance on the best use of Enterprise Library and a number of other application blocks freely available from Microsoft.
  • 582
  • 30 Nov 2022
Topic Review
Quantum Finite Automata
In quantum computing, quantum finite automata (QFA) or quantum state machines are a quantum analog of probabilistic automata or a Markov decision process. They provide a mathematical abstraction of real-world quantum computers. Several types of automata may be defined, including measure-once and measure-many automata. Quantum finite automata can also be understood as the quantization of subshifts of finite type, or as a quantization of Markov chains. QFAs are, in turn, special cases of geometric finite automata or topological finite automata.
The automata work by receiving a finite-length string $\sigma = (\sigma_0, \sigma_1, \cdots, \sigma_k)$ of letters $\sigma_i$ from a finite alphabet $\Sigma$, and assigning to each such string a probability $\operatorname{Pr}(\sigma)$ of the automaton being in an accept state, that is, of the automaton accepting the string. The languages accepted by QFAs are not the regular languages of deterministic finite automata, nor are they the stochastic languages of probabilistic finite automata. Study of these quantum languages remains an active area of research.
  • 576
  • 20 Oct 2022
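A measure-once QFA over a one-letter alphabet can be sketched with real rotation matrices standing in for general unitaries: each letter applies a unitary to the state vector, and the acceptance probability is the squared amplitude on the accept state. The function names and the choice $\theta = \pi/8$ are illustrative assumptions, not from the entry:

```python
import math

def mat_vec(m, v):
    """Apply a 2x2 matrix to a 2-vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def rotation(theta):
    """A real unitary (rotation) on the 2-dimensional state space."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta),  math.cos(theta)]]

def qfa_accept_prob(word, unitaries, start=(1.0, 0.0), accept_index=0):
    """Measure-once QFA: apply one unitary per letter, then measure once.

    Returns Pr(word) = squared amplitude of the accept basis state."""
    state = list(start)
    for letter in word:
        state = mat_vec(unitaries[letter], state)
    return state[accept_index] ** 2

# Each 'a' rotates the state by theta, so reading a^n gives Pr = cos^2(n*theta).
theta = math.pi / 8
U = {"a": rotation(theta)}
print(qfa_accept_prob("aa", U))    # cos^2(pi/4) = 0.5
print(qfa_accept_prob("aaaa", U))  # cos^2(pi/2) ~ 0: four letters rotate to the reject state
```

With $\theta$ an irrational multiple of $\pi$, the acceptance probabilities $\cos^2(n\theta)$ never repeat, which hints at why QFA-accepted languages differ from the regular languages.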
Topic Review
Polynomial Least Squares
In mathematical statistics, polynomial least squares comprises a broad range of statistical methods for estimating an underlying polynomial that describes observations. These methods include polynomial regression, curve fitting, linear regression, least squares, ordinary least squares, simple linear regression, linear least squares, approximation theory, and the method of moments. Polynomial least squares has applications in radar trackers, estimation theory, signal processing, statistics, and econometrics.
Two common applications of polynomial least squares methods are generating a low-degree polynomial that approximates a complicated function and estimating an assumed underlying polynomial from corrupted (also known as "noisy") observations. The former is commonly used in statistics and econometrics to fit a scatter plot with a first-degree polynomial (that is, a linear expression). The latter is commonly used in target tracking in the form of Kalman filtering, which is effectively a recursive implementation of polynomial least squares. Estimating an assumed underlying deterministic polynomial can be used in econometrics as well. In effect, both applications produce average curves as generalizations of the common average of a set of numbers, which is equivalent to zero-degree polynomial least squares.
In the above applications, the term "approximate" is used when no statistical measurement or observation errors are assumed, as when fitting a scatter plot. The term "estimate", derived from statistical estimation theory, is used when assuming that measurements or observations of a polynomial are corrupted.
  • 571
  • 21 Oct 2022
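The first application mentioned above, fitting a scatter plot with a first-degree polynomial, has a closed-form ordinary-least-squares solution. A minimal sketch (the function name is illustrative, and the data are a made-up noise-free line, so the fit recovers the coefficients exactly):

```python
def linear_least_squares(xs, ys):
    """Fit y ~ a + b*x by ordinary least squares (degree-1 polynomial).

    Uses the closed-form solution of the normal equations for one predictor."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)                     # variance term
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))   # covariance term
    b = sxy / sxx        # slope
    a = my - b * mx      # intercept
    return a, b

# Noise-free points on y = 2 + 3x are recovered exactly (up to rounding).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 + 3.0 * x for x in xs]
a, b = linear_least_squares(xs, ys)
print(a, b)  # ~2.0, ~3.0
```

With noisy observations the same formulas return the least-squares estimate of the underlying line, which is the "estimate" sense distinguished in the entry.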
Topic Review
Infinite-dimensional Vector Function
An infinite–dimensional vector function is a function whose values lie in an infinite-dimensional topological vector space, such as a Hilbert space or a Banach space. Such functions are applied in most sciences including physics.
  • 571
  • 15 Nov 2022
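As an illustration (a standard textbook-style example, not taken from the entry itself), a curve taking values in the Hilbert space $\ell^2$ can be written down explicitly; each value is a square-summable sequence:

```latex
% A curve in the Hilbert space \ell^2 of square-summable sequences:
% f(t) = (t, t/2, t/3, \ldots) lies in \ell^2 for every real t,
% since \sum_n (t/n)^2 = t^2 \sum_n 1/n^2 converges.
\[
  f\colon \mathbb{R} \to \ell^2, \qquad
  f(t) = \Bigl( t,\ \tfrac{t}{2},\ \tfrac{t}{3},\ \ldots \Bigr),
  \qquad
  \lVert f(t) \rVert^2 = t^2 \sum_{n=1}^{\infty} \frac{1}{n^2}
                       = \frac{\pi^2 t^2}{6} < \infty .
\]
```

Each value $f(t)$ is a point of an infinite-dimensional space, yet the function of $t$ can be differentiated coordinate-wise, which is the sense in which such functions appear in physics.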
Topic Review
Microsoft DoubleSpace BIOS Parameter Block
This entry concerns the BIOS parameter block used by DriveSpace (initially known as DoubleSpace), the disk compression utility supplied with MS-DOS 6.0 through Windows Me and described in the DriveSpace entry above.
  • 568
  • 26 Oct 2022
Topic Review
Domain Name Speculation
Domain name speculation is the practice of identifying and registering or acquiring Internet domain names as an investment with the intent of selling them later for a profit. The main targets of domain name speculation are generic words which can be valuable for type-in traffic and for the dominant position they would have in any field due to their descriptive nature. Hence generic words and phrases such as poker, insurance, travel, creditcards, loan and others are attractive targets of domain speculation in any top-level domain. The speculative characteristics of domain names may be linked to news reports or current events. However, the effective period during which such opportunities exist may be limited. Quick turnaround in the resale of domains is often called domain flipping. Domain flipping may also involve the process of buying a valuable domain name and building a related website around it, all this with the objective of selling the domain and newly built website to an interested party.
  • 565
  • 14 Oct 2022
Topic Review
CVF (Microsoft)
This entry concerns the CVF (Compressed Volume File) format used by DriveSpace (initially known as DoubleSpace), the disk compression utility supplied with MS-DOS 6.0 through Windows Me and described in the DriveSpace entry above.
  • 563
  • 19 Oct 2022
Topic Review
Denotational Semantics of the Actor Model
The denotational semantics of the Actor model is the subject of denotational domain theory for Actors. The historical development of this subject is recounted in [Hewitt 2008b].
  • 563
  • 30 Oct 2022
Topic Review
Life Annuity
A life annuity is an annuity, or series of payments at fixed intervals, paid while the purchaser (or annuitant) is alive. The majority of life annuities are insurance products sold or issued by life insurance companies; however, substantial case law indicates that annuity products are not necessarily insurance products. Annuities can be purchased to provide an income during retirement, or may originate from a structured settlement of a personal injury lawsuit. Life annuities may be sold in exchange for the immediate payment of a lump sum (single-payment annuity) or a series of regular payments (flexible-payment annuity), made prior to the onset of the annuity. The payment stream from the issuer to the annuitant has an unknown duration based principally upon the date of death of the annuitant, at which point the contract terminates and the remainder of the accumulated fund is forfeited unless there are other annuitants or beneficiaries in the contract. A life annuity is thus a form of longevity insurance: the uncertainty of an individual's lifespan is transferred from the individual to the insurer, which reduces its own uncertainty by pooling many clients.
  • 560
  • 10 Nov 2022
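The pricing idea behind a life annuity can be sketched with the standard actuarial present value: each year's payment is discounted for interest and weighted by the probability the annuitant is still alive. The survival curve below is entirely made up for illustration, and the function name is an assumption, not an industry API:

```python
def annuity_apv(survival, rate):
    """Actuarial present value of a whole-life annuity-immediate paying 1 per year.

    survival[t-1] is the probability the annuitant is alive to receive the
    payment at the end of year t; rate is the annual interest rate."""
    v = 1.0 / (1.0 + rate)  # one-year discount factor
    return sum((v ** t) * p for t, p in enumerate(survival, start=1))

# Illustrative (made-up) survival curve: 95% chance of surviving each further
# year, truncated at 30 years, priced at 4% annual interest.
survival = [0.95 ** t for t in range(1, 31)]
apv = annuity_apv(survival, 0.04)
print(round(apv, 2))  # cost per unit of annual income under these assumptions
```

The mortality weighting is how the insurer pools longevity risk: payments that would run for an uncertain individual lifetime are priced by their expected discounted value across many clients.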
Topic Review
Comparison of Streaming Media Systems
This is a comparison of streaming media systems. A more complete list of streaming media systems is also available.
  • 557
  • 18 Oct 2022
Topic Review
PureSystems
PureSystems is an IBM product line of factory pre-configured components and servers, also referred to as an "Expert Integrated System". The centrepiece of PureSystems is the IBM Flex System Manager in tandem with the so-called "Patterns of Expertise" for the automated configuration and management of PureSystems. PureSystems can host four different operating systems (AIX, IBM i, Linux, Windows) and five hypervisors (Hyper-V, KVM, PowerVM, VMware, Xen) on two different hardware architectures: Power Architecture and x86. PureSystems is marketed as a converged system, which packages multiple information technology components into a single product.
  • 549
  • 09 Oct 2022
Topic Review
Traditional Deformation Analysis and Octree-Based Deformation Analysis
Convergence and rockmass failure are significant hazards to personnel and physical assets in underground tunnels, caverns, and mines. Mobile Laser Scanning Systems (MLS) can deliver large volumes of point cloud data at a high frequency and on a large scale.
  • 546
  • 03 Jan 2024