Topic Review
Operational Technology
Operational technology (OT) is hardware and software that detects or causes a change through the direct monitoring and/or control of industrial equipment, assets, processes and events. The term has become established to highlight the technological and functional differences between traditional information technology (IT) systems and the industrial control systems environment, sometimes called "IT in the non-carpeted areas".
  • 1.5K
  • 09 Nov 2022
Topic Review
Diaspora
Diaspora is a free personal web server that implements a distributed social networking service. Installations of the software form nodes (termed "pods") which make up the distributed Diaspora social network. The project was founded by Dan Grippi, Maxwell Salzberg, Raphael Sofaer and Ilya Zhitomirskiy, students at New York University's Courant Institute of Mathematical Sciences. The group received crowdfunding in excess of $200,000 via Kickstarter. A consumer alpha version was released on 23 November 2010. Konrad Lawson, blogging for the Chronicle of Higher Education, suggested Diaspora in July 2011 as an alternative to corporately produced software.
  • 1.5K
  • 09 Nov 2022
Topic Review
Personally Identifiable Information
Personal information, described in United States legal fields as either personally identifiable information (PII) or sensitive personal information (SPI), as used in information security and privacy laws, is information that can be used on its own or with other information to identify, contact, or locate a single person, or to identify an individual in context. The abbreviation PII is widely accepted in the U.S. context, but the phrase it abbreviates has four common variants based on personal/personally and identifiable/identifying. Not all are equivalent, and for legal purposes the effective definitions vary depending on the jurisdiction and the purposes for which the term is being used.
In other countries with privacy protection laws derived from the OECD privacy principles, the term used is more often "personal information", which may be somewhat broader: in Australia's Privacy Act 1988 (Cth), "personal information" also includes information from which the person's identity is "reasonably ascertainable", potentially covering some information not covered by PII. Under European and other data protection regimes, which centre primarily on the General Data Protection Regulation, the term "personal data" is significantly broader and determines the scope of the regulatory regime. NIST Special Publication 800-122 defines PII as "any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual's identity, such as name, social security number, date and place of birth, mother's maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information." So, for example, a user's IP address is not classed as PII on its own, but is classified as linked PII.
The concept of PII has become prevalent as information technology and the Internet have made it easier to collect PII, leading to a profitable market in collecting and reselling it. PII can also be exploited by criminals to stalk or steal the identity of a person, or to aid in the planning of criminal acts. In response to these threats, many website privacy policies specifically address the gathering of PII, and lawmakers have enacted a series of laws to limit the distribution and accessibility of PII.
However, PII is a legal concept, not a technical one, and as noted, it is not used in all jurisdictions. Because of the versatility and power of modern re-identification algorithms, the absence of PII does not mean that the remaining data cannot identify individuals. While some attributes may not be uniquely identifying on their own, any attribute can be potentially identifying in combination with others; such attributes have been referred to as quasi-identifiers or pseudo-identifiers (a minimal sketch of such a linkage follows this entry). While such data may not constitute PII in the United States, it is highly likely to remain personal data under European data protection law.
  • 1.5K
  • 03 Nov 2022
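A minimal sketch of the quasi-identifier linkage mentioned above; all names, ZIP codes, and dates are invented for illustration. Joining a "de-identified" dataset with a public roster on the triple (ZIP, birth date, sex) can re-identify a record even though no direct identifier is present.

```python
# Hypothetical illustration: linking "de-identified" records back to names
# using quasi-identifiers (ZIP code, birth date, sex). All data is invented.

# A public roster that includes names (e.g., a voter list).
public_roster = [
    {"name": "A. Example", "zip": "02138", "dob": "1965-07-31", "sex": "F"},
    {"name": "B. Sample",  "zip": "02139", "dob": "1971-01-02", "sex": "M"},
]

# A "de-identified" dataset with direct identifiers removed.
deidentified = [
    {"zip": "02138", "dob": "1965-07-31", "sex": "F", "diagnosis": "X"},
]

# Join on the quasi-identifier triple; a unique match re-identifies the record.
index = {(p["zip"], p["dob"], p["sex"]): p["name"] for p in public_roster}
for row in deidentified:
    key = (row["zip"], row["dob"], row["sex"])
    if key in index:
        print(f"Re-identified: {index[key]} -> diagnosis {row['diagnosis']}")
```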
Topic Review
Hybrid Blockchains
A hybrid blockchain is often advocated where multiple parties, trust, data access management and sharing, friction, regulations, and a combination of centralization and decentralization are involved. A hybrid blockchain is also used by entities that need the benefits of both public and private characteristics, which can be achieved using Interchain, bridges or other interoperability solutions between legacy systems and blockchains, whether public or private. Private, public, consortium and permissioned blockchains all have their own setbacks and benefits. Entities that do not want to expose sensitive business data to the internet are limited to private blockchains, while entities that want no access restrictions at all can leverage public blockchains such as Bitcoin or Ethereum. A hybrid blockchain ensures sensitive business data stays private on business nodes unless sharing is permitted. It also validates the hash of private transactions through consensus algorithms and even public checkpoints on networks such as Bitcoin and Ethereum. Through Interchain, the hash of a private transaction can be placed on the Bitcoin network or any other public blockchain such as Ethereum, creating an immutable record of the event with the benefit of a public blockchain's hash power; a sketch of this anchoring pattern follows this entry.
  • 1.5K
  • 11 Oct 2022
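A minimal sketch of the anchoring idea, assuming nothing about any particular Interchain or bridge API (the publishing step is left as a placeholder): the private transaction stays on the business's own nodes, and only its SHA-256 hash would be placed on a public chain as a checkpoint.

```python
import hashlib
import json

# Hypothetical sketch: a private transaction stays on the business's own
# nodes; only its hash is published to a public chain as a checkpoint.

private_tx = {"from": "acme-ops", "to": "supplier-7", "amount": 1200, "nonce": 42}

def tx_hash(tx: dict) -> str:
    # Canonical serialization so the same transaction always hashes identically.
    payload = json.dumps(tx, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(payload).hexdigest()

anchored_hash = tx_hash(private_tx)       # this value goes on the public chain
# publish_to_public_chain(anchored_hash)  # placeholder for an Interchain/bridge call

# Later, anyone holding the private record can prove it existed unmodified:
assert tx_hash(private_tx) == anchored_hash
print("private record matches the public checkpoint:", anchored_hash[:16], "...")
```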
Topic Review
Distance Geometry Problem
The distance geometry problem is that of characterizing and studying sets of points based only on given values of the distances between member pairs. Distance geometry therefore has immediate relevance wherever distance values are determined or considered, such as in biology, sensor networks, surveying, cartography, and physics. A worked example of recovering coordinates from pairwise distances follows this entry.
  • 1.5K
  • 03 Nov 2022
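A minimal worked example, assuming a complete and exact distance matrix: classical multidimensional scaling recovers point coordinates (up to rotation, translation, and reflection) from pairwise distances, shown here with NumPy and an invented four-point configuration.

```python
import numpy as np

# Minimal sketch: recover coordinates (up to rotation/translation/reflection)
# from a complete matrix of pairwise distances via classical MDS.

pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])  # ground truth
D2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)    # squared distances

n = D2.shape[0]
J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
B = -0.5 * J @ D2 @ J                 # Gram matrix of the centered points

eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]     # largest eigenvalues first
k = 2                                 # target embedding dimension
L = np.sqrt(np.maximum(eigvals[order][:k], 0))
X = eigvecs[:, order[:k]] * L         # recovered coordinates

# The recovered points reproduce the original distances exactly.
D2_rec = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
assert np.allclose(D2, D2_rec, atol=1e-8)
```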
Topic Review
Go Strategy and Tactics
The game of Go has simple rules that can be learned very quickly but, as with chess and similar board games, complex strategies may be deployed by experienced players.
  • 1.5K
  • 27 Oct 2022
Topic Review
Sixth Term Examination Paper
Sixth Term Examination Papers in Mathematics, often referred to as STEP, are university admissions tests for undergraduate Mathematics courses developed by the University of Cambridge. STEP papers are typically taken post-interview, as part of a conditional offer of an undergraduate place. There are also a number of candidates who sit STEP papers as a challenge. The papers are designed to test the ability to answer questions similar in style to undergraduate Mathematics. There are two official users of STEP Mathematics: the University of Cambridge and the University of Warwick. Candidates applying to study Mathematics or Computer Science with Mathematics at the University of Cambridge are required to take STEP papers as part of the terms of their conditional offer. In addition, other courses at Cambridge with a large mathematics component, such as Engineering, occasionally require STEP. Candidates applying to study Mathematics or closely related subjects at the University of Warwick can take STEP as part of their offer. A typical STEP offer for a candidate applying to read Mathematics at the University of Cambridge would be at least a grade 1 in both STEP 2 and STEP 3. From 2019, some colleges may (depending on the individual applicant's circumstances) require a grade 1 in either STEP 1 or STEP 2, or in both. Candidates applying to the University of Warwick to read Mathematics, or closely related subjects, can use a grade 1 from any STEP paper as part of their offer.
  • 1.5K
  • 15 Nov 2022
Topic Review
Augmented Reality in K-12 Education
Augmented Reality (AR) can provide key benefits in education and create a richer user experience by increasing students' motivation and engagement. Initially, AR was used as a science-oriented tool, but after its acceptance by students and teachers, it evolved into a modern pedagogical tool adopted in the classroom to enhance the educational process. AR-based technology has become a popular topic in educational fields and in educational research over the last decade [26]. Considering various modern educational disciplines, technologies such as AR should be included in the learning environment in science education; their absence could negatively affect productivity and learning achievements [27]. However, the educational value of AR in the domain of physical science does not rest exclusively on the AR technologies themselves; it is more closely tied to how AR is designed, implemented and integrated into formal and informal learning settings [28].
  • 1.5K
  • 12 Aug 2022
Topic Review
Electrochemical RAM
Electrochemical Random-Access Memory (ECRAM) is a type of non-volatile memory (NVM) with multiple levels per cell (MLC) designed for deep-learning analog acceleration. An ECRAM cell is a three-terminal device composed of a conductive channel, an insulating electrolyte, an ionic reservoir, and metal contacts. The resistance of the channel is modulated by ionic exchange at the interface between the channel and the electrolyte upon application of an electric field. The charge-transfer process allows both for state retention in the absence of applied power and for programming of multiple distinct levels, both of which differentiate ECRAM operation from that of a field-effect transistor (FET). The write operation is deterministic and can result in symmetrical potentiation and depression, making ECRAM arrays attractive as artificial synaptic weights in physical implementations of artificial neural networks (ANN); a toy model of such symmetric programming follows this entry. The technological challenges include open-circuit potential (OCP) and semiconductor foundry compatibility associated with energy materials. Universities, government laboratories, and corporate research teams have contributed to the development of ECRAM for analog computing. Notably, Sandia National Laboratories designed a lithium-based cell inspired by solid-state battery materials, Stanford University built an organic proton-based cell, and International Business Machines (IBM) demonstrated in-memory selector-free parallel programming for a logistic regression task in an array of metal-oxide ECRAM designed for insertion in the back end of line (BEOL). In 2022, researchers at the Massachusetts Institute of Technology built an inorganic, CMOS-compatible protonic technology that achieved near-ideal modulation characteristics using nanosecond-scale pulses.
  • 1.5K
  • 18 Nov 2022
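A toy numerical model, not a description of any real device: it only illustrates why deterministic, symmetric potentiation and depression make analog weight updates easy to reason about. The level count and step size below are invented.

```python
import numpy as np

# Toy sketch (hypothetical numbers): an ECRAM-like synaptic weight stored as a
# channel conductance that moves in fixed, symmetric steps per write pulse.

G_MIN, G_MAX = 0.0, 1.0   # conductance range (arbitrary units)
STEP = 1.0 / 256          # one of 256 distinct analog levels per cell

def program(g: float, pulses: int) -> float:
    """Apply `pulses` write pulses: positive potentiates, negative depresses.
    Deterministic and symmetric: +n pulses then -n pulses returns to start."""
    return float(np.clip(g + pulses * STEP, G_MIN, G_MAX))

g = 0.5
g = program(g, +8)           # potentiation: 8 pulses up
g = program(g, -8)           # depression: 8 pulses down
assert abs(g - 0.5) < 1e-12  # symmetric updates retrace exactly

# In an array, a gradient-descent weight update maps to a whole pulse count:
weights = np.full((4, 4), 0.5)
grad = np.random.default_rng(0).normal(scale=0.01, size=(4, 4))
pulses = np.round(-grad / STEP)   # quantize the update to whole pulses
weights = np.clip(weights + pulses * STEP, G_MIN, G_MAX)
```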
Topic Review
VBS1
VBS1 (Virtual Battlefield Systems 1) is a military simulator which relies heavily on modern game technology and is therefore generally referred to as a serious game. The platform is derived from the first-person entertainment game Operation Flashpoint and is developed by Bohemia Interactive Australia. The system enables the practice of small unit military tactics in an interactive multiplayer 3D environment. The platform provides real-time scenario management facilities, customized vehicles and equipment, user-definable mission scenarios, and variable environmental conditions. This combination of military simulator functionality and modern gaming technology proved to be a success and resulted in a broad military customer base. VBS2 is the successor of this platform.
  • 1.5K
  • 29 Nov 2022
Topic Review
Exclusion of the Null Hypothesis
In inferential statistics, the null hypothesis (often denoted H0) is a default hypothesis that a quantity to be measured is zero (null). Typically, the quantity of interest is the difference between two situations, for instance when trying to determine whether an effect has occurred or whether samples derive from different batches. The null hypothesis effectively states that the quantity of interest is both greater than or equal to zero and less than or equal to zero; if either requirement can be positively overturned, the null hypothesis is excluded from the realm of possibilities. Otherwise, the null hypothesis is assumed to remain possibly true. Analyses can show that the hypothesis should be rejected (excluded) at a given confidence level, thus demonstrating a statistically significant difference; this is done by showing that zero lies outside the specified confidence interval of the measurement, on either side.
Failure to exclude the null hypothesis (at any confidence level) does not logically confirm or support the (unprovable) null hypothesis. Failing to prove that a quantity is, say, larger than x does not necessarily make it plausible that the quantity is smaller than or equal to x; the measurement may simply have had low accuracy. Positively confirming the two-sided null hypothesis would require proving both that the quantity is greater than or equal to zero and that it is less than or equal to zero, which would demand infinite accuracy as well as an effect of exactly zero, neither of which is normally realistic; measurements will also never indicate a non-zero probability of an exactly zero difference. Failure to exclude a null hypothesis therefore amounts to a "don't know" at the specified confidence level; it does not immediately imply that the quantity is null, as the data may already show a (weaker) indication of a non-null value. The confidence level used does not correspond to the likelihood that the quantity is null when exclusion fails; in fact, a higher confidence level widens the range that remains plausible.
A non-null hypothesis can have the following meanings, depending on the author: (a) a value other than zero is used; (b) some margin other than zero is used; (c) the "alternative" hypothesis is meant. Testing (excluding or failing to exclude) the null hypothesis provides evidence that there are (or are not) statistically sufficient grounds to believe there is a relationship between two phenomena (e.g., that a potential treatment has a non-zero effect, either way). Testing the null hypothesis is a central task in statistical hypothesis testing in the modern practice of science. There are precise criteria for excluding or not excluding a null hypothesis at a certain confidence level; the confidence level should indicate the likelihood that much more and better data would still exclude the null hypothesis on the same side.
The concept of a null hypothesis is used differently in two approaches to statistical inference. In the significance testing approach of Ronald Fisher, a null hypothesis is rejected if the observed data are significantly unlikely to have occurred if the null hypothesis were true; the null hypothesis is then rejected and an alternative hypothesis accepted in its place. If the data are consistent with the null hypothesis being possibly true, the null hypothesis is not rejected. In neither case is the null hypothesis or its alternative proven; with better or more data, the null may still be rejected. This is analogous to the legal principle of presumption of innocence, under which a suspect or defendant is assumed innocent (the null is not rejected) until proven guilty (the null is rejected) beyond a reasonable doubt (to a statistically significant degree). In the hypothesis testing approach of Jerzy Neyman and Egon Pearson, a null hypothesis is contrasted with an alternative hypothesis, and the two hypotheses are distinguished on the basis of data, with certain error rates; this framework is used in formulating answers in research. Statistical inference can also be done without a null hypothesis, by specifying a statistical model for each candidate hypothesis and using model selection techniques to choose the most appropriate model (the most common selection techniques are based on either the Akaike information criterion or the Bayes factor). A minimal worked example follows this entry.
  • 1.5K
  • 17 Oct 2022
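A minimal worked example using SciPy (the data are simulated, so the true effect is known to be non-zero): a two-sided one-sample t-test of H0 "the mean difference is zero", together with the equivalent confidence-interval check.

```python
import numpy as np
from scipy import stats

# Simulated paired differences with a known non-zero true effect (0.4),
# so the test should reject H0 given enough data.
rng = np.random.default_rng(1)
diff = rng.normal(loc=0.4, scale=1.0, size=50)

# Two-sided one-sample t-test of H0: mean difference == 0.
t, p = stats.ttest_1samp(diff, popmean=0.0)

alpha = 0.05
if p < alpha:
    print(f"p = {p:.4f} < {alpha}: reject H0 at the 95% confidence level")
else:
    # Failing to reject is a "don't know", not evidence that the effect is zero.
    print(f"p = {p:.4f} >= {alpha}: fail to reject H0")

# Equivalent confidence-interval view: reject H0 iff 0 lies outside the CI.
n = diff.size
se = diff.std(ddof=1) / np.sqrt(n)
lo, hi = stats.t.interval(0.95, df=n - 1, loc=diff.mean(), scale=se)
print(f"95% CI for the mean difference: ({lo:.3f}, {hi:.3f})")
```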
Topic Review
Head-On Soccer
Head-On Soccer is a soccer video game originally developed and published by U.S. Gold for the Sega Genesis in 1995. Taking an arcade-style approach to soccer compared with other titles released at the time, Head-On Soccer lets players play any of the available game modes against either CPU-controlled opponents or other players, with the team of their choosing. Initially launched for the Genesis, it was released on the Super Nintendo Entertainment System a few months later and was then ported to the Atari Jaguar in December of the same year under the name Fever Pitch Soccer, its title in PAL territories. Head-On Soccer received a generally positive reception from critics upon its release on the Sega Genesis, with praise for the graphics, sound, gameplay and the ability to upgrade team players' abilities by winning matches. The Super NES version was also received positively, with praise for its improved graphics and sound over the Genesis original, while the Jaguar version, though criticized for not taking advantage of the hardware and for its similarity to the 16-bit versions, was generally well received and considered by some reviewers one of the better titles for the system. Critics, however, compared the game with both the FIFA series from Electronic Arts and the original International Superstar Soccer from Konami.
  • 1.5K
  • 29 Sep 2022
Topic Review
IT Baseline Protection Catalogs
The IT Baseline Protection Catalogs, or IT-Grundschutz-Kataloge ("IT Baseline Protection Manual" before 2005), are a collection of documents from the German Federal Office for Information Security (BSI) that provide useful information for detecting weaknesses and combating attacks in the information technology (IT) environment (IT cluster). The collection encompasses over 3,000 pages, including the introduction and catalogs. It serves as the basis for the IT baseline protection certification of an enterprise.
  • 1.5K
  • 31 Oct 2022
Topic Review
DSAdd
As the next version of Windows NT after Windows 2000, as well as the successor to Windows Me, Windows XP introduced many new features but it also removed some others.
  • 1.5K
  • 11 Oct 2022
Topic Review
Strategic Urban Planning
The general objectives of strategic urban planning (SUP) include clarifying which city model is desired and working towards that goal, coordinating public and private efforts, channelling energy, adapting to new circumstances and improving the living conditions of the citizens affected. Strategic planning is a technique that has been applied to many facets of human activity; one need only mention Sun Tzu, Arthur Thomson or Henry Mintzberg. However, the application of strategic planning to urban contexts (cities, regions and other metropolitan areas) is a relatively recent development whose beginnings were eminently practical and artistic: a mixture of thought, techniques and art or expertise. Fifteen years of practice proved to be enough time for the technique to spread and for the first “Meeting of American and European cities for the Exchange of Experiences in Strategic Planning” to be organized. Institutions sponsoring the meeting, held in Barcelona in 1993, included the Inter-American Development Bank, the European Community Commission and the Iberoamerican Cooperation Institute. The cities of Amsterdam, Lisbon, Lille, Barcelona, Toronto and Santiago de Chile participated, among others. The meeting demonstrated, along with other relevant findings, that when cooperative processes are used to carry out strategic planning in large cities, and a reasonable degree of understanding is reached between the administration, businesses and a broad representation of social agents, organizational synergies develop that eventually improve resource management and citizens’ quality of life.
  • 1.5K
  • 24 Oct 2022
Topic Review
Sensor Data Fusion Algorithms
Sensor Data Fusion (SDF) algorithms and methods have been utilised in many applications ranging from automobiles to healthcare systems. They can be used to design redundant, reliable, and complementary systems with the intent of enhancing overall performance. SDF can be multifaceted, involving many representations such as pixels, features, signals, and symbols. A minimal fusion example follows this entry.
  • 1.5K
  • 12 Dec 2023
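A minimal sketch of one classic fusion rule, with invented sensor numbers: inverse-variance weighting, the static special case of a Kalman update, combines two noisy estimates of the same quantity into one estimate with lower variance.

```python
# Minimal sketch: fuse two noisy estimates of the same quantity by
# inverse-variance weighting (the static special case of a Kalman update).

def fuse(x1: float, var1: float, x2: float, var2: float):
    """Return the fused estimate and its (smaller) variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical readings: a lidar and a radar ranging the same obstacle.
lidar, lidar_var = 10.2, 0.04   # metres, variance
radar, radar_var = 10.8, 0.25

estimate, variance = fuse(lidar, lidar_var, radar, radar_var)
print(f"fused range: {estimate:.2f} m, variance {variance:.3f}")
# The fused variance is below both inputs: the sensors are complementary.
```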
Topic Review
Conceptual Interoperability
Conceptual interoperability is a concept in simulation theory, but it is broadly applicable to other model-based information technology domains. From the early ideas of Harkrider and Lunceford, simulation composability has been studied in more detail. Petty and Weisel formulated the current working definition: "Composability is the capability to select and assemble simulation components in various combinations into simulation systems to satisfy specific user requirements. The defining characteristic of composability is the ability to combine and recombine components into different simulation systems for different purposes." A recent RAND study provided a coherent overview of the state of composability for military simulation systems within the U.S. Department of Defense; many of its findings have much broader applicability.
  • 1.5K
  • 25 Oct 2022
Topic Review
Clip (Command)
The clipboard is a buffer that some operating systems provide for short-term storage and transfer within and between application programs. The clipboard is usually temporary and unnamed, and its contents reside in the computer's RAM. The clipboard provides an application programming interface by which programs can specify cut, copy and paste operations. It is left to the program to define methods for the user to invoke these operations, which may include keybindings and menu selections. When an element is copied or cut, the clipboard must store enough information to enable a sensible result no matter where the element is pasted. Application programs may extend the clipboard functions that the operating system provides. A clipboard manager may give the user additional control over the clipboard. Specific clipboard semantics vary among operating systems, can also vary between versions of the same system, and can sometimes be changed by programs and by user preferences. Windows, Linux and macOS support a single clipboard transaction. An example of piping text into the Windows clip command follows this entry.
  • 1.5K
  • 01 Nov 2022
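A brief Windows-only sketch: the clip command reads standard input and places it on the clipboard, so any program can copy text by piping into it, here via Python's subprocess module (equivalent to `echo hello | clip` at the command prompt).

```python
import subprocess
import sys

def copy_to_clipboard(text: str) -> None:
    """Pipe text into clip.exe, which places its standard input on the
    Windows clipboard (equivalent to `echo hello | clip`)."""
    if sys.platform != "win32":
        raise OSError("clip is a Windows command")
    subprocess.run("clip", input=text, text=True, check=True)

copy_to_clipboard("hello from the clipboard")
```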
Topic Review
IBM VisualAge
VisualAge was the name of a family of computer integrated development environments from IBM, which included support for multiple programming languages. VisualAge was first released in October 1993 and was discontinued on 30 April 2007 and its web page removed in September 2011. VisualAge was also marketed as “VisualAge Smalltalk”. IBM has stated that XL C/C++ is the 'follow-on' product to VisualAge.
  • 1.5K
  • 09 Nov 2022
Topic Review
Loot System
In video games, a loot system is a method of distributing in-game items amongst a group of players, after having "looted" them.
  • 1.4K
  • 07 Nov 2022