Topic Review
Network Function Virtualization
Network function virtualization (NFV) is an emerging technology that is becoming increasingly important due to its many advantages. NFV transforms legacy hardware-based network infrastructure into software-based virtualized networks. This transformation increases the flexibility and scalability of networks while reducing the time needed to create new networks. However, it also enlarges the attack surface of the network, which makes it necessary to map out clearly where attacks may occur.
  • 1.3K
  • 30 May 2022
Topic Review
GNU Build System
The GNU Build System, also known as the Autotools, is a suite of programming tools designed to assist in making source code packages portable to many Unix-like systems. It can be difficult to make a software program portable: the C compiler differs from system to system; certain library functions are missing on some systems; header files may have different names. One way to handle this is to write conditional code, with code blocks selected by means of preprocessor directives (#ifdef); but because of the wide variety of build environments this approach quickly becomes unmanageable. Autotools is designed to address this problem more manageably. Autotools is part of the GNU toolchain and is widely used in many free software and open source packages. Its component tools are free software, licensed under the GNU General Public License with special license exceptions permitting its use with proprietary software. The GNU Build System makes it possible to build many programs using a two-step process: configure followed by make.
  • 1.3K
  • 20 Oct 2022
Topic Review
Data Re-Identification
Data re-identification is the practice of matching anonymous data (also known as de-identified data) with publicly available information, or auxiliary data, in order to discover the individual to whom the data belongs. This is a concern because companies with privacy policies, health care providers, and financial institutions may release the data they collect after it has gone through the de-identification process. The de-identification process involves masking, generalizing, or deleting both direct and indirect identifiers; the definition of this process is not universal, however. Information in the public domain, even if seemingly anonymized, may thus be re-identified in combination with other pieces of available data and basic computer science techniques. The Common Rule agencies, a collection of multiple U.S. federal agencies and departments including the U.S. Department of Health and Human Services, speculate that re-identification is becoming gradually easier because of "big data": the abundance and constant collection and analysis of information alongside evolving technologies and advances in algorithms. Others, however, claim that de-identification is a safe and effective data liberation tool and do not view re-identification as a concern. A 2000 study found that 87 percent of the U.S. population can be identified using a combination of their gender, birth date, and ZIP code. Those who call re-identification a "myth" argue that the combination of ZIP code, date of birth, and gender is often unavailable or only partially complete, for example only the year and month of birth without the day, or the county name instead of the specific ZIP code, so the risk of such re-identification is reduced in many instances.
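A minimal sketch of such a linkage attack, using entirely invented records (all names and values are hypothetical): a "de-identified" medical release is joined to a public voter roll on the ZIP-code/birth-date/gender triple, re-attaching names to diagnoses.

```python
# "De-identified" medical release: names removed, quasi-identifiers kept.
medical = [
    {"zip": "02139", "dob": "1965-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "60601", "dob": "1990-01-15", "sex": "M", "diagnosis": "flu"},
]

# Public voter roll: names present alongside the same quasi-identifiers.
voters = [
    {"name": "Alice Smith", "zip": "02139", "dob": "1965-07-31", "sex": "F"},
    {"name": "Bob Jones",   "zip": "60601", "dob": "1990-01-15", "sex": "M"},
]

def reidentify(medical, voters):
    # Index the public data by the quasi-identifier triple, then look up
    # each "anonymous" medical record in that index.
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    index = {key(v): v["name"] for v in voters}
    return [(index.get(key(m)), m["diagnosis"]) for m in medical]

print(reidentify(medical, voters))
```

Each diagnosis is now linked back to a name, which is exactly why generalizing or suppressing quasi-identifiers (e.g., keeping only the birth year, or the county instead of the ZIP code) reduces the risk.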
  • 1.3K
  • 31 Oct 2022
Topic Review
Envy-Free Cake-Cutting
An envy-free cake-cutting is a kind of fair cake-cutting. It is a division of a heterogeneous resource ("cake") that satisfies the envy-free criterion, namely, that every partner feels that their allocated share is at least as good as any other share, according to their own subjective valuation. When there are only two partners, the problem is easy and was solved in Biblical times by the divide and choose protocol. When there are three or more partners, the problem becomes much more challenging. Two major variants of the problem have been studied: connected pieces, e.g. if the cake is a 1-dimensional interval, then each partner must receive a single sub-interval (if there are n partners, only n − 1 cuts are needed); and general pieces, e.g. if the cake is a 1-dimensional interval, then each partner can receive a union of disjoint sub-intervals.
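The two-partner divide and choose protocol can be sketched as follows, modeling the cake as the interval [0, 1]; the valuation functions here are invented examples. The divider cuts at a point that halves the cake under their own valuation, and the chooser takes whichever piece they prefer, so neither envies the other.

```python
def divide_and_choose(value_divider, value_chooser, tol=1e-9):
    """value_*(a, b) gives a partner's subjective value of the piece [a, b]."""
    # Bisect for the divider's halving point (valuations assumed monotone).
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if value_divider(0.0, mid) < value_divider(mid, 1.0):
            lo = mid  # left piece too small for the divider; cut further right
        else:
            hi = mid
    cut = (lo + hi) / 2
    # The chooser picks the piece they prefer; the divider gets the other.
    if value_chooser(0.0, cut) >= value_chooser(cut, 1.0):
        return {"chooser": (0.0, cut), "divider": (cut, 1.0)}
    return {"chooser": (cut, 1.0), "divider": (0.0, cut)}

# Example valuations (hypothetical): the divider values the cake uniformly,
# while the chooser only values the left half.
uniform = lambda a, b: b - a
left_heavy = lambda a, b: max(0.0, min(b, 0.5) - a)

pieces = divide_and_choose(uniform, left_heavy)
```

Here the uniform divider cuts at 0.5 and the left-heavy chooser happily takes the left piece; each partner gets at least half of the cake by their own measure.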
  • 1.3K
  • 02 Dec 2022
Topic Review
DSMod
As the next version of Windows NT after Windows 2000, as well as the successor to Windows Me, Windows XP introduced many new features, but it also removed some others.
  • 1.3K
  • 06 Oct 2022
Topic Review
Magnus Expansion
In mathematics and physics, the Magnus expansion, named after Wilhelm Magnus (1907–1990), provides an exponential representation of the solution of a first-order homogeneous linear differential equation for a linear operator. In particular, it furnishes the fundamental matrix of a system of linear ordinary differential equations of order n with varying coefficients. The exponent is aggregated as an infinite series, whose terms involve multiple integrals and nested commutators.
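Concretely (a standard statement of the expansion, sketched here for orientation): for the initial value problem $Y'(t) = A(t)\,Y(t)$, $Y(0) = I$, the solution is written as $Y(t) = \exp(\Omega(t))$ with $\Omega(t) = \sum_k \Omega_k(t)$, whose first two terms are

```latex
\Omega_1(t) = \int_0^t A(t_1)\,dt_1, \qquad
\Omega_2(t) = \tfrac{1}{2} \int_0^t \! dt_1 \int_0^{t_1} \! dt_2\,
              \bigl[\,A(t_1),\, A(t_2)\,\bigr],
```

where $[\cdot,\cdot]$ denotes the commutator; higher-order terms involve correspondingly deeper nested integrals and commutators.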
  • 1.3K
  • 21 Oct 2022
Topic Review
Iterated Skew Polynomial Rings
Skew polynomial rings were, especially during the 1970s and 1980s, a popular topic in modern abstract algebra, with great theoretical interest. Researchers' attention to them has been renewed recently, due to the important applications that they have found in the study of quantum groups and in cryptography. The present work studies a special class of iterated skew polynomial rings over a ring R, defined with respect to a finite set of pairwise commuting derivations of R.
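For orientation, the standard commutation rule behind this construction (sketched from the usual textbook definition, not taken from the work itself): given a derivation $\delta$ of $R$, the skew polynomial ring $R[x;\delta]$ consists of polynomials in $x$ over $R$ with multiplication twisted so that

```latex
x\,r \;=\; r\,x + \delta(r) \qquad \text{for all } r \in R,
```

and iterating the construction with pairwise commuting derivations $\delta_1, \dots, \delta_n$ of $R$ yields the iterated skew polynomial ring $R[x_1;\delta_1][x_2;\delta_2]\cdots[x_n;\delta_n]$.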
  • 1.2K
  • 17 Dec 2020
Topic Review
ID-Based Encryption
ID-based encryption, or identity-based encryption (IBE), is an important primitive of ID-based cryptography. As such it is a type of public-key encryption in which the public key of a user is some unique information about the identity of the user (e.g. a user's email address). This means that a sender who has access to the public parameters of the system can encrypt a message using e.g. the text value of the receiver's name or email address as a key. The receiver obtains their decryption key from a central authority, which needs to be trusted as it generates secret keys for every user. ID-based encryption was proposed by Adi Shamir in 1984. However, he was only able to give an instantiation of identity-based signatures; identity-based encryption remained an open problem for many years. The pairing-based Boneh–Franklin scheme and Cocks's encryption scheme based on quadratic residues both solved the IBE problem in 2001.
  • 1.2K
  • 25 Oct 2022
Topic Review
Dengue Detection
The dengue virus (DENV) is a vector-borne flavivirus that infects around 390 million individuals each year, with 2.5 billion people at risk. Access to testing is paramount for preventing future infections and receiving adequate treatment. Currently, there are numerous conventional methods for DENV testing, such as NS1-based antigen testing, IgM/IgG antibody testing, and polymerase chain reaction (PCR).
  • 1.2K
  • 05 Jul 2021
Topic Review
Fat Object
In geometry, a fat object is an object in two or more dimensions whose lengths in the different dimensions are similar. For example, a square is fat because its length and width are identical. A 2-by-1 rectangle is thinner than a square, but it is fat relative to a 10-by-1 rectangle. Similarly, a circle is fatter than a 1-by-10 ellipse, and an equilateral triangle is fatter than a very obtuse triangle. Fat objects are especially important in computational geometry: many algorithms in computational geometry can perform much better if their input consists of only fat objects.
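One simple fatness measure for axis-aligned boxes (one of several used in computational geometry) is the aspect ratio, the longest side divided by the shortest; an object counts as fat when this ratio is small. A minimal sketch:

```python
def aspect_ratio(sides):
    """Longest side over shortest side of an axis-aligned box."""
    return max(sides) / min(sides)

assert aspect_ratio([1, 1]) == 1.0     # square: as fat as possible
assert aspect_ratio([2, 1]) == 2.0     # 2-by-1 rectangle: still fairly fat
assert aspect_ratio([10, 1]) == 10.0   # 10-by-1 rectangle: thin
```

An algorithm parameterized by fatness would then promise good performance only when every input object's ratio stays below some constant.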
  • 1.2K
  • 28 Nov 2022
Topic Review
Batch File
A batch file is a script file in DOS, OS/2 and Microsoft Windows. It consists of a series of commands to be executed by the command-line interpreter, stored in a plain text file. A batch file may contain any command the interpreter accepts interactively and use constructs that enable conditional branching and looping within the batch file, such as IF, FOR, and GOTO labels. The term "batch" is from batch processing, meaning "non-interactive execution", though a batch file might not necessarily process a batch of multiple data items. Similar to Job Control Language (JCL), DCL, and other systems on mainframe and minicomputer systems, batch files were added to ease the work required for certain regular tasks by allowing the user to set up a script to automate them. When a batch file is run, the shell program (usually COMMAND.COM or cmd.exe) reads the file and executes its commands, normally line by line. Unix-like operating systems, such as Linux, have a similar but more flexible type of file called a shell script. The filename extension .bat is used in DOS and Windows. Windows NT and OS/2 also added .cmd. Batch files for other environments may have different extensions, e.g., .btm in 4DOS, 4OS2 and 4NT related shells. The detailed handling of batch files has changed significantly between versions. Some of the detail in this article applies to all batch files, while other details apply only to certain versions.
  • 1.2K
  • 19 Oct 2022
Topic Review
Ko Fight
A ko (Japanese: コウ, 劫, kō, from the translation of the Sanskrit term kalpa) fight is a tactical and strategic phase that can arise in the game of go.
  • 1.2K
  • 28 Nov 2022
Topic Review
Systems Theory in Anthropology
Systems theory in anthropology is an interdisciplinary, non-representative, non-referential, and non-Cartesian approach that brings together the natural and social sciences to understand society in its complexity. The basic idea of systems theory in social science is to solve the classical problems of duality: mind-body, subject-object, form-content, signifier-signified, and structure-agency. Systems theory suggests that instead of closing categories into binaries (subject-object), the system should stay open so as to allow the free flow of process and interactions; in this way the binaries are dissolved. Complex systems in nature (for example, ecosystems) involve a dynamic interaction of many variables: animals, plants, insects, and bacteria; predators and prey; climate, the seasons, and the weather. These interactions can adapt to changing conditions while maintaining a balance both between the various parts and as a whole; this balance is maintained through homeostasis. Human societies are complex systems: in effect, human ecosystems. Early humans, as hunter-gatherers, recognized and worked within the parameters of the complex systems in nature, and their lives were circumscribed by the realities of nature, but they could not explain complex systems. Only in recent centuries did the need arise to define complex systems scientifically. Complex systems theories first developed in mathematics in the late 19th century, then in biology in the 1920s to explain ecosystems, then to deal with artificial intelligence (cybernetics), and so on. The anthropologist Gregory Bateson is the earliest and most influential founder of systems theory in the social sciences. In the 1940s, as a result of the Macy conferences, he immediately recognized its application to human societies with their many variables and the flexible but sustainable balance that they maintain. Bateson describes a system as "any unit containing feedback structure and therefore competent to process information."
Thus an open system allows interaction between concepts and materiality, or subject and environment, or abstract and real. In the natural sciences, systems theory has been a widely used approach. The Austrian biologist Karl Ludwig von Bertalanffy developed the idea of general systems theory (GST), a multidisciplinary approach to systems analysis.
  • 1.2K
  • 27 Oct 2022
Topic Review
Sun Ray
The Sun Ray was a stateless thin client computer (and associated software) aimed at corporate environments, originally introduced by Sun Microsystems in September 1999 and discontinued by Oracle Corporation in 2014. It featured a smart card reader and several models featured an integrated flat panel display. The idea of a stateless desktop was a significant shift from, and the eventual successor to, Sun's earlier line of diskless Java-only desktops, the JavaStation.
  • 1.2K
  • 10 Nov 2022
Topic Review
Local Zeta-Function
In number theory, the local zeta function Z(V, u) (sometimes called the congruent zeta function) of a non-singular n-dimensional projective algebraic variety V over the finite field Fq with q elements is defined as the formal power series Z(V, u) = exp( Σ_{m≥1} (N_m / m) u^m ), where N_m is the number of points of V defined over the degree-m field extension F_{q^m} of F_q. By the variable transformation u = q^{−s}, it can equivalently be written as a function Z(V, s) of the variable s. In other words, the local zeta function Z(V, u) is the function whose logarithmic derivative generates the numbers N_m of solutions of the equations defining V in the degree-m extension F_{q^m}: u (d/du) log Z(V, u) = Σ_{m≥1} N_m u^m.
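As a worked example (standard in the literature): for the projective line over Fq, every degree-m extension contributes N_m = q^m + 1 points, so

```latex
Z\bigl(\mathbb{P}^1,\, u\bigr)
= \exp\!\left( \sum_{m=1}^{\infty} \frac{q^m + 1}{m}\, u^m \right)
= \frac{1}{(1-u)(1-qu)},
```

using the series $\sum_{m\ge 1} u^m/m = -\log(1-u)$ for each of the two geometric terms.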
  • 1.2K
  • 21 Oct 2022
Topic Review
Here Be Dragons
Here Be Dragons (formerly known as Vrse.works) is a medium-agnostic creative studio co-founded by Patrick Milling-Smith, Chris Milk and Brian Carmody.
  • 1.2K
  • 08 Oct 2022
Topic Review
IoT and Machine Learning
Machine learning (ML) is a powerful tool that delivers insights hidden in Internet of Things (IoT) data. These hybrid technologies work smartly to improve the decision-making process in different areas such as education, security, business, and the healthcare industry. ML empowers the IoT to demystify hidden patterns in bulk data for optimal prediction and recommendation systems. Healthcare has embraced IoT and ML so that automated machines can make medical records, predict disease diagnoses, and, most importantly, conduct real-time monitoring of patients. Individual ML algorithms perform differently on different datasets, and this variation in predictive results can affect the overall outcome; such variation looms large in the clinical decision-making process. Therefore, it is essential to understand the different ML algorithms used to handle IoT data in the healthcare sector.
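A toy illustration of the point that different algorithms perform differently on the same data (all values invented; not any specific clinical model): a 1-nearest-neighbour classifier versus a majority-class baseline, both evaluated with leave-one-out cross-validation on a small synthetic "vital signs" dataset.

```python
# (heart_rate, temperature) -> 1 if "at risk", else 0; all values invented.
data = [
    ((60, 36.5), 0), ((65, 36.8), 0), ((68, 36.7), 0), ((70, 36.6), 0),
    ((72, 37.0), 0),
    ((110, 38.5), 1), ((120, 39.0), 1), ((115, 38.8), 1),
]

def majority_predict(train, x):
    # Baseline: always predict the most common label in the training set.
    labels = [y for _, y in train]
    return max(set(labels), key=labels.count)

def knn1_predict(train, x):
    # 1-NN: predict the label of the closest training point.
    dist = lambda a: (a[0] - x[0]) ** 2 + (a[1] - x[1]) ** 2
    return min(train, key=lambda pair: dist(pair[0]))[1]

def leave_one_out_accuracy(predict):
    hits = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]
        hits += predict(train, x) == y
    return hits / len(data)

acc_majority = leave_one_out_accuracy(majority_predict)
acc_knn = leave_one_out_accuracy(knn1_predict)
```

On this dataset the nearest-neighbour model separates the two clusters perfectly while the baseline misses every "at risk" case, which is the kind of gap that matters in clinical decision-making; on another dataset the ranking could differ.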
  • 1.2K
  • 16 Jun 2021
Topic Review
Negation and Speculation Corpora in Natural Language Processing
Negation and speculation are universal linguistic phenomena that affect the performance of Natural Language Processing (NLP) applications, such as those for opinion mining and information retrieval, especially in biomedical data. 
  • 1.2K
  • 20 Jun 2022
Topic Review
Latin-1 Supplement (Unicode Block)
The Latin-1 Supplement (also called C1 Controls and Latin-1 Supplement) is the second Unicode block in the Unicode standard. It encodes the upper range of ISO 8859-1: 0x80 (U+0080) to 0xFF (U+00FF). The C1 controls (U+0080–U+009F) are not graphic characters. The block ranges from U+0080 to U+00FF, contains 128 characters, and includes the C1 controls, Latin-1 punctuation and symbols, 30 pairs of majuscule and minuscule accented Latin characters, and 2 mathematical operators. The C1 Controls and Latin-1 Supplement block has been included in its present form, with the same character repertoire, since version 1.0 of the Unicode Standard. Its block name in Unicode 1.0 was simply Latin1.
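The block's contents can be checked directly with Python's standard unicodedata module: the 32 C1 controls carry the general category Cc (control, not graphic), while the remainder of the block holds printable Latin-1 characters.

```python
import unicodedata

# The 128 code points U+0080..U+00FF of the Latin-1 Supplement block.
block = [chr(cp) for cp in range(0x80, 0x100)]

# The C1 controls are exactly the code points with category "Cc".
c1 = [ch for ch in block if unicodedata.category(ch) == "Cc"]

assert len(block) == 128
assert len(c1) == 32  # U+0080..U+009F are all control characters
assert unicodedata.name("\u00e9") == "LATIN SMALL LETTER E WITH ACUTE"
assert unicodedata.category("\u00d7") == "Sm"  # x (multiplication sign)
```

The two mathematical operators mentioned above are the multiplication sign (U+00D7) and the division sign (U+00F7), both of category Sm.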
  • 1.2K
  • 28 Nov 2022
Topic Review
Atari Assembler Editor
Atari Assembler Editor (sometimes written as Atari Assembler/Editor) is a ROM cartridge-based development system released by Atari, Inc. in 1981. It is used to edit, assemble, and debug 6502 programs for the Atari 8-bit family of home computers. It was programmed by Kathleen O'Brien of Shepardson Microsystems, the company which wrote Atari BASIC, and Assembler Editor shares many design concepts with that language. Assembly times are slow, making the cartridge challenging to use for larger programs. In the manual, Atari recommended the Assembler Editor as a tool for writing subroutines to speed up Atari BASIC, which would be much smaller than full applications. The Atari Macro Assembler was offered as an alternative with better performance and more features, such as macros, but it was disk-based, copy-protected, and did not include an editor or debugger. Despite the recommendation, commercial software was written using the Assembler Editor, such as the games Eastern Front (1941), Caverns of Mars, Galahad and the Holy Grail, and Kid Grid. The source code to the original Assembler Editor was licensed to Optimized Systems Software who shipped EASMD based on it.
  • 1.2K
  • 21 Nov 2022