Topic Review
Lempel-Ziv Complexity
The Lempel-Ziv complexity is a measure first presented in the article "On the Complexity of Finite Sequences" (IEEE Transactions on Information Theory, IT-22(1), 1976) by two Israeli computer scientists, Abraham Lempel and Jacob Ziv. This complexity measure is related to Kolmogorov complexity, but the only operation it uses is the recursive copy (i.e., the shallow copy). The underlying mechanism of this complexity measure is the starting point for several lossless data compression algorithms, such as LZ77, LZ78 and LZW. Even though it is based on the elementary principle of copying words, this complexity measure is not too restrictive, in the sense that it satisfies the main properties expected of such a measure: sequences with a certain regularity do not have too large a complexity, and the complexity grows as the sequence grows in length and irregularity. The Lempel-Ziv complexity can be used to measure the repetitiveness of binary sequences and text, such as song lyrics or prose. Fractal dimension estimates of real-world data have also been shown to correlate with Lempel-Ziv complexity.
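As an illustration, the complexity can be computed directly from this copying principle. The following is a minimal Python sketch (the function name and the example string are chosen here for illustration) that counts the phrases of the exhaustive Lempel-Ziv (1976) parsing, where each new phrase is the shortest continuation that cannot be copied from the text produced so far:

```python
def lz76_complexity(s: str) -> int:
    """Number of phrases in the exhaustive Lempel-Ziv (1976) parsing of s."""
    i, n, phrases = 0, len(s), 0
    while i < n:
        k = 1
        # Grow the candidate phrase while it can still be copied from the
        # sequence produced so far (everything before its last character).
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        phrases += 1   # the first non-copyable continuation is a new phrase
        i += k
    return phrases

# The classic example parses as 0|001|10|100|1000|101, giving complexity 6.
print(lz76_complexity("0001101001000101"))   # 6
```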
  • 486
  • 07 Nov 2022
Topic Review
Quantum Machine Learning for Security Assessment in IoMT
Internet of Medical Things (IoMT) is an ecosystem composed of connected electronic items such as small sensors/actuators and other cyber-physical devices (CPDs) in medical services. When these devices are linked together, they can support patients through medical monitoring, analysis, and reporting in more autonomous and intelligent ways.
  • 485
  • 13 Oct 2023
Topic Review
Student Accessibility Services in Higher Education Institutions
The Convention on the Rights of Persons with Disabilities (CRPD) highlights the right of people with disabilities to access education without discrimination and with equal opportunities. In general terms, accessibility in education means that a person with disabilities must be able to "acquire the same information, engage in the same interactions, and enjoy the same services, in an equally effective and equally integrated manner, with substantially equivalent ease of use, as a person without disabilities". In higher education institutions (HEIs), this implies making environments (including virtual ones), processes, access to information, objects, tools, devices, communication, goods, and services accessible as broadly as possible, by considering universal design principles and reasonable adjustments.
  • 484
  • 07 Apr 2022
Topic Review
Distributed Agile Software Development
Distributed Agile Software Development is a research area that considers the effects of applying the principles of Agile software development to software development in a globally distributed development setting. The goal of applying these principles is overcoming challenges in projects which are geographically distributed. The principles of Agile software development provide structures to promote better communication, which is an important factor in successfully working in a distributed setting. However, not having face-to-face interaction takes away one of the core principles of Agile. This makes Distributed Agile Software Development more challenging than Agile Software Development in general.
  • 484
  • 11 Nov 2022
Topic Review
Gatling
Gatling is an open-source load- and performance-testing framework based on Scala, Akka and Netty. The first stable release was published on January 13, 2012. In 2015, Gatling's founder, Stéphane Landelle, created a company named Gatling Corp, dedicated to the development of the open-source project. According to Gatling Corp's official blog, Gatling had been downloaded more than 1,000,000 times as of 2021. In June 2016, Gatling officially presented Gatling FrontLine, Gatling's enterprise version with additional features. The software is designed to be used as a load-testing tool for analyzing and measuring the performance of a variety of services, with a focus on web applications. Gatling was mentioned twice in the ThoughtWorks Technology Radar, in 2013 and 2014, as "a tool worth trying", with an emphasis on "the interesting premise of treating your performance tests as production code". The latest stable release is Gatling 3.8.0, published on July 6, 2022.
  • 483
  • 08 Nov 2022
Topic Review
Handicap (Go)
Within most systems and at most levels in the game of Go, a handicap is given to offset the strength difference between players of different ranks.
  • 483
  • 28 Nov 2022
Topic Review
Ten Rays Model
The ten-ray model is a radio propagation model applied to transmissions in urban areas. To obtain the ten rays, four more rays are typically added to the six-ray model (R3 and R4, each bouncing on both sides of the wall). The model incorporates paths with one to three reflections: the line-of-sight (LOS) ray, the ground-reflected (GR) ray, the single-wall (SW), double-wall (DW) and triple-wall (TW) reflected rays, and the wall-ground (WG) and ground-wall (GW) reflected paths, where each reflected path can bounce off either side of the street. Experimentally, it has been shown that the ten-ray model can represent signal propagation through a dielectric canyon, in which rays traveling from a transmitter to a receiver bounce many times. As an example, assume a rectilinear free space bounded by two walls, one upper and one lower, with a transmitting and a receiving antenna positioned vertically at the two ends so that their heights do not exceed the top wall. The structure then behaves like a dielectric canyon for signal propagation: rays sent from the transmitting antenna strike the upper and lower walls repeatedly (in this example, up to three reflections) before reaching the receiving antenna. Each reflection dissipates part of the signal energy; after the third reflection of a given ray, its resulting retro-reflected component is normally insignificant, carrying negligible energy.
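To make the idea concrete, the received signal can be written as a coherent sum of the contributions of the individual rays, each attenuated by its travelled distance and by the reflection coefficients it picks up along the way. The sketch below is a simplified illustration (the function name, carrier frequency and the example path lengths and coefficients are assumptions, and antenna gains and angle-dependent reflection coefficients are omitted), not the full ten-ray formulation:

```python
import numpy as np

def multiray_received_power(path_lengths_m, reflection_coeffs,
                            pt_watts=1.0, freq_hz=900e6):
    """Coherent narrowband sum over rays (LOS, ground, wall reflections, ...).

    path_lengths_m    : total travelled distance of each ray in metres
    reflection_coeffs : product of reflection coefficients along each ray
                        (1.0 for the line-of-sight ray)
    Returns the received power in watts, assuming unit antenna gains.
    """
    lam = 3e8 / freq_hz                       # wavelength
    d = np.asarray(path_lengths_m, dtype=float)
    gamma = np.asarray(reflection_coeffs, dtype=complex)
    # Each ray contributes an attenuated (1/d) and phase-shifted copy of the signal.
    field = gamma * np.exp(-2j * np.pi * d / lam) / d
    return pt_watts * (lam / (4 * np.pi)) ** 2 * np.abs(field.sum()) ** 2

# Illustrative only: a LOS ray plus two reflected rays.
print(multiray_received_power([200.0, 205.0, 212.0], [1.0, -0.7, 0.5]))
```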
  • 483
  • 01 Dec 2022
Topic Review
IBM I Control Language
The IBM i Control Language (CL) is a scripting language for IBM's IBM i platform (previously called OS/400 when running on AS/400 systems). It bears a resemblance to the IBM Job Control Language and consists of an ever-expanding set of command objects (*CMD) used to invoke traditional AS/400 programs and/or get help on what those programs do. CL can also be used to create CL programs (analogous to shell scripts), where additional commands provide program-like functionality (IF/ELSE, variable declaration, file input, etc.). Although CL is a scripting language for system administration, it is used mainly to create compiled programs. The use of interpreted CL scripts through the SBMDBJOB command is in fact extremely limited. While thousands of commands were written by IBM developers to perform system-level tasks like compiling programs, backing up data, changing system configurations, displaying system object details, or deleting them, commands are not limited to system-level concerns and can be drafted for user applications as well.
  • 482
  • 29 Nov 2022
Topic Review
Domain Name Auction
A domain name auction facilitates the buying and selling of currently registered domain names, enabling individuals to purchase a previously registered domain that suits their needs from an owner wishing to sell. A Drop registrar offers sales of expiring domains; but with a domain auction there is no need to wait until (and if) a current owner allows the registration to lapse before purchasing the domain you most want to own. Domain auction sites allow users to search multiple domain names that are listed for sale by owner, and to place bids on the names they want to purchase. As in any auction, the highest bidder wins. The more desirable a domain name, the higher the winning bid, and auction sites often provide links to escrow agents to facilitate the safe transfer of funds and domain properties between the auctioning parties.
  • 481
  • 21 Nov 2022
Topic Review
Internet-Based Treatments for Trauma Survivors
Internet-based treatments for trauma survivors are a growing class of online treatments that allow an individual who has experienced trauma to seek and receive treatment without needing to attend psychotherapy in person. The progressive movement to online resources and the need for more accessible mental health services have given rise to the creation of online interventions aimed at helping those who have experienced traumatic events. Cognitive behavioral therapy (CBT) has been shown to be particularly effective in the treatment of trauma-related disorders, and adapting CBT to an online format has been shown to be as effective as in-person CBT in the treatment of trauma. Due to these positive outcomes, CBT-based internet treatment options for trauma survivors have become an expanding field in both research and clinical settings.
  • 480
  • 27 Oct 2022
Topic Review
Tensorsketch
In statistics, machine learning and algorithms, a tensor sketch is a type of dimensionality reduction that is particularly efficient when applied to vectors that have tensor structure. Such a sketch can be used to speed up explicit kernel methods and bilinear pooling in neural networks, and it is a cornerstone of many numerical linear algebra algorithms.
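As a rough illustration of the idea, the sketch below implements a count sketch and combines two of them with the FFT convolution trick attributed to Pham and Pagh, so that the outer product x ⊗ y is sketched without ever being formed. The function names, the sketch dimension and the norm-preservation demo are illustrative choices, not a definitive implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_count_sketch(d, m):
    """Random signs and buckets defining a count sketch from R^d to R^m."""
    h = rng.integers(0, m, size=d)        # output bin of each input coordinate
    s = rng.choice([-1.0, 1.0], size=d)   # random sign of each input coordinate
    def cs(x):
        out = np.zeros(m)
        np.add.at(out, h, s * x)          # scatter-add signed coordinates
        return out
    return cs

def make_tensor_sketch(d, m):
    """Sketch of the outer product x (x) y without forming its d*d entries,
    using the fact that the combined count sketch is a circular convolution."""
    cs1, cs2 = make_count_sketch(d, m), make_count_sketch(d, m)
    def ts(x, y):
        fx, fy = np.fft.rfft(cs1(x), m), np.fft.rfft(cs2(y), m)
        return np.fft.irfft(fx * fy, m)
    return ts

# The squared norm of x (x) y, i.e. ||x||^2 * ||y||^2, is approximately preserved.
d, m = 500, 4096
ts = make_tensor_sketch(d, m)
x, y = rng.standard_normal(d), rng.standard_normal(d)
print(np.dot(x, x) * np.dot(y, y), np.dot(ts(x, y), ts(x, y)))
```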
  • 477
  • 30 Sep 2022
Topic Review
Differential Privacy
Differential privacy is a statistical technique that aims to provide means to maximize the accuracy of queries from statistical databases while measuring (and, thereby, hopefully minimizing) the privacy impact on individuals whose information is in the database. Differential privacy was developed by cryptographers and is thus often associated with cryptography, from which it draws much of its language. Although differential privacy is often discussed in the context of identifying individuals whose information may be in a database, identification and re-identification attacks are not included within the original differential privacy framework.
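A common way to achieve differential privacy for a single numeric query is the Laplace mechanism: noise drawn from a Laplace distribution, scaled to the query's sensitivity divided by the privacy parameter ε, is added to the true answer. The minimal Python sketch below assumes a scalar query whose L1 sensitivity is known; it is illustrative rather than a production mechanism (it ignores, for example, budget tracking across multiple queries):

```python
import numpy as np

rng = np.random.default_rng()

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with Laplace noise of scale sensitivity/epsilon,
    which gives epsilon-differential privacy for this single query."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: a counting query ("how many records satisfy P?") has sensitivity 1,
# because adding or removing one individual's record changes the count by at most 1.
true_count = 1234
print(laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5))
```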
  • 477
  • 25 Oct 2022
Topic Review
Dr. DivX
Dr. DivX was an application created by DivX, Inc. that was capable of transcoding many video formats to DivX-encoded video.
  • 477
  • 28 Nov 2022
Topic Review
No Postage Necessary
"No Postage Necessary” is a 2018 American romantic comedy independent film written and directed by Jeremy Culver as his second narrative feature and starring George Blagden, Charleene Closshey, Robbie Kay, Stelio Savante, with Michael Beach and Raymond J. Barry. The film integrates current political happenings—including the Silk Road (online marketplace) (referred to "The Spice Trail" in the film's script), hacktivism, and cyberterrorism—into a dramedy set in Plant City, FL. Principal photography began in Plant City, FL in August 2016 filming on 35mm film. The film initially premiered on October 14, 2017 as an Official Selection of the Heartland Film Festival. The shortened and final version later premiered on June 28, 2018 at the Tampa Theatre in Hillsborough County, Florida where the movie was filmed exclusively. It released in theaters in ten markets across the United States on July 6, 2018 by Two Roads Picture Co. as well as on the decentralized application Vevue, making history as the first film to ever release via blockchain technology and available to stream using cryptocurrency as payment. The film’s script is now part of the Core Collection of the Margaret Herrick Library at the Academy of Motion Picture Arts and Sciences. No Postage Necessary received mixed reviews, with praise for its heartfelt performances, music, production values, and ending that is “satisfying without artifice”, but criticism suggesting it relies too heavily on its unique distribution strategy as its "hook”.
  • 476
  • 30 Sep 2022
Topic Review
Bayesian Inference in Motor Learning
Bayesian inference is a statistical tool that can be applied to motor learning, specifically to adaptation. Adaptation is a short-term learning process involving gradual improvement in performance in response to a change in sensory information. Bayesian inference is used to describe the way the nervous system combines this sensory information with prior knowledge to estimate the position or other characteristics of something in the environment. Bayesian inference can also be used to show how information from multiple senses (e.g., vision and proprioception) can be combined for the same purpose. In either case, Bayesian inference dictates that the estimate is most influenced by whichever information is most certain.
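For the common case in which the prior and the sensory evidence are both modelled as Gaussians, the Bayesian estimate reduces to a precision-weighted average: each source is weighted by the inverse of its variance, so the more certain source dominates. The following sketch (the function name and the numbers in the example are illustrative) shows the computation:

```python
def combine_gaussian_estimates(mu_prior, var_prior, mu_sense, var_sense):
    """Posterior mean and variance when a Gaussian prior is combined with a
    Gaussian sensory measurement; weights are the normalized precisions."""
    w_prior = (1.0 / var_prior) / (1.0 / var_prior + 1.0 / var_sense)
    mu_post = w_prior * mu_prior + (1.0 - w_prior) * mu_sense
    var_post = 1.0 / (1.0 / var_prior + 1.0 / var_sense)
    return mu_post, var_post

# Prior belief: target at 0 cm (sd 2 cm). Visual estimate: 3 cm (sd 1 cm).
# Vision is more certain, so the combined estimate lies closer to 3 cm.
print(combine_gaussian_estimates(0.0, 2.0 ** 2, 3.0, 1.0 ** 2))  # (2.4, 0.8)
```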
  • 474
  • 27 Oct 2022
Topic Review
Unbalanced Oil and Vinegar
In cryptography, the Unbalanced Oil and Vinegar (UOV) scheme is a modified version of the Oil and Vinegar scheme designed by J. Patarin. Both are digital signature schemes, and they belong to the family of multivariate cryptography. The security of this signature scheme is based on an NP-hard mathematical problem: to create and validate signatures, a minimal system of quadratic equations must be solved. Solving m equations in n variables is NP-hard in general, which means the problem is almost certainly difficult to solve efficiently in the worst case, even using a quantum computer. While the problem is easy if m is much larger or much smaller than n, importantly for cryptographic purposes it is thought to be difficult in the average case when m and n are nearly equal, even using a quantum computer. As a result, a number of signature schemes have been devised based on multivariate equations, with the goal of achieving quantum-resistant signatures.
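To illustrate the mechanics (not the security), here is a toy Python sketch of the oil-and-vinegar idea over a tiny field: the central quadratic map has no oil-oil terms, so fixing random vinegar values turns signing into solving a small linear system, which is then hidden behind a random invertible change of variables. All parameters (p = 251, 3 oil and 6 vinegar variables), names, and the use of a random vector in place of a hashed message are assumptions made for this example; a real UOV implementation uses far larger parameters and additional structure.

```python
import numpy as np

rng = np.random.default_rng(1)
p, o, v = 251, 3, 6          # toy field GF(251), 3 oil and 6 vinegar variables
n = o + v

def inv_mod(M, p):
    """Invert a square integer matrix over GF(p) by Gauss-Jordan elimination;
    returns None if the matrix is singular mod p."""
    k = len(M)
    A = np.concatenate([np.asarray(M, dtype=np.int64) % p,
                        np.eye(k, dtype=np.int64)], axis=1)
    for col in range(k):
        piv = next((r for r in range(col, k) if A[r, col] % p), None)
        if piv is None:
            return None
        A[[col, piv]] = A[[piv, col]]
        A[col] = A[col] * pow(int(A[col, col]), -1, p) % p
        for r in range(k):
            if r != col and A[r, col]:
                A[r] = (A[r] - A[r, col] * A[col]) % p
    return A[:, k:]

# Key generation: central map F = o quadratic forms with no oil*oil terms,
# hidden behind a random invertible linear change of variables T.
A = np.zeros((o, n, n), dtype=np.int64)
A[:, :v, :] = rng.integers(0, p, size=(o, v, n))   # vinegar*vinegar and vinegar*oil terms
while True:
    T = rng.integers(0, p, size=(n, n))
    T_inv = inv_mod(T, p)
    if T_inv is not None:
        break
Q = np.array([T.T @ Ak @ T % p for Ak in A])       # public key: P(x)_k = x^T Q_k x mod p

def sign(target):
    """Find sig with P(sig) = target by fixing the vinegar variables at random."""
    while True:
        xv = rng.integers(0, p, size=v)
        # With vinegar fixed, each f_k is affine in the oil variables.
        const = np.array([xv @ Ak[:v, :v] @ xv % p for Ak in A])
        M = np.array([xv @ Ak[:v, v:] % p for Ak in A])   # o x o linear system
        M_inv = inv_mod(M, p)
        if M_inv is None:
            continue                                      # singular: retry with new vinegar
        xo = M_inv @ ((target - const) % p) % p
        return T_inv @ np.concatenate([xv, xo]) % p       # undo the change of variables

def verify(sig, target):
    return all(int(sig @ Qk @ sig) % p == int(t) for Qk, t in zip(Q, target))

digest = rng.integers(0, p, size=o)    # stand-in for the hash of a message
print(verify(sign(digest), digest))    # True
```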
  • 472
  • 17 Nov 2022
Topic Review
DNIX
DNIX (original spelling: D-Nix) is a discontinued Unix-like real-time operating system from the Swedish company Dataindustrier AB (DIAB). A version named ABCenix was developed for the ABC 1600 computer from Luxor. Daisy Systems also had a system named Daisy DNIX on some of their computer-aided design (CAD) workstations. It was unrelated to DIAB's product.
  • 469
  • 01 Nov 2022
Topic Review
Fuzzy Control
Fuzzy logic is derived from fuzzy set theory dealing with reasoning that is approximate rather than precisely deduced from classical predicate logic. It can be thought of as the application side of fuzzy set theory dealing with well thought out real world expert values for a complex problem (Klir 1997). Degrees of truth are often confused with probabilities. However, they are conceptually distinct; fuzzy truth represents membership in vaguely defined sets, not likelihood of some event or condition. To illustrate the difference, consider this scenario: Bob is in a house with two adjacent rooms: the kitchen and the dining room. In many cases, Bob's status within the set of things "in the kitchen" is completely plain: he's either "in the kitchen" or "not in the kitchen". What about when Bob stands in the doorway? He may be considered "partially in the kitchen". Quantifying this partial state yields a fuzzy set membership. With only his big toe in the dining room, we might say Bob is 99% "in the kitchen" and 1% "in the dining room", for instance. No event (like a coin toss) will resolve Bob to being completely "in the kitchen" or "not in the kitchen", as long as he's standing in that doorway. Fuzzy sets are based on vague definitions of sets, not randomness. Fuzzy logic allows for set membership values to range (inclusively) between 0 and 1, and in its linguistic form, imprecise concepts like "slightly", "quite" and "very". Specifically, it allows partial membership in a set. It is related to fuzzy sets and possibility theory. It was introduced in 1965 by Lotfi Zadeh at the University of California, Berkeley. Fuzzy logic is controversial in some circles, despite wide acceptance and a broad track record of successful applications. It is rejected by some control engineers for validation and other reasons, and by some statisticians who hold that probability is the only rigorous mathematical description of uncertainty. Critics also argue that it cannot be a superset of ordinary set theory since membership functions are defined in terms of conventional sets.
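As a concrete illustration of partial membership, the toy Python sketch below defines triangular membership functions for temperature labels and blends rule outputs by their membership degrees (a Sugeno-style weighted average). The labels, breakpoints and fan speeds are made-up values for illustration, not a reference fuzzy controller:

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fan_speed(temp_c):
    """Toy fuzzy controller: each rule maps a fuzzy temperature label to a
    crisp fan speed, and the output is the membership-weighted average."""
    memberships = {
        "cold": triangular(temp_c, -10.0, 5.0, 18.0),
        "warm": triangular(temp_c, 15.0, 22.0, 28.0),
        "hot":  triangular(temp_c, 25.0, 35.0, 60.0),
    }
    rule_outputs = {"cold": 0.0, "warm": 40.0, "hot": 100.0}   # fan speed in %
    total = sum(memberships.values())
    if total == 0.0:
        return 0.0
    return sum(memberships[k] * rule_outputs[k] for k in memberships) / total

# At 26 C the input is partly "warm" and partly "hot", so the controller
# blends the two rule outputs instead of switching abruptly between them.
print(fan_speed(26.0))
```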
  • 467
  • 02 Nov 2022
Topic Review
FabLab
The fast expansion of digital culture has fostered the creation of makerspaces such as fabrication laboratories (FabLabs) that, thanks to their flexibility and their use of open-source tools, strengthen the sense of community and produce true transformations within those communities. FabLabs are an integral part of "communities and spaces, open to the public to varying degrees, with objectives and targets agreed upon by their members, in which learning, production, prototyping, design, and manufacturing of both tangible and intangible assets give rise to complex two-way exchanges of information, knowledge, technology, skills and resources among users, between users and society, and between users and industry". They belong to the new wave of collaborative ecologies into which makerspaces, hackerspaces, living labs and co-working spaces also fall, and they stand out for their origin and their capacity as an international organization. A FabLab is a strong social space offering affordable and accessible manufacturing tools, and it is sometimes conceived as an appropriate platform for beginning the prototyping and development of any object. FabLabs emerged in the 2000s at the Massachusetts Institute of Technology (MIT) from Professor Neil Gershenfeld's course "How to Make (Almost) Anything". Thanks to the creation of this first FabLab, Professor Gershenfeld's students could realize their designs, popularizing digital manufacturing and making the course successful.
  • 464
  • 06 May 2022
Topic Review
Cray-2
The Cray-2 is a supercomputer with four vector processors made by Cray Research starting in 1985. At 1.9 GFLOPS peak performance, it was the fastest machine in the world when it was released, replacing the Cray X-MP in that spot. It was, in turn, replaced in that spot by the Cray Y-MP in 1988. The Cray-2 was the first of Seymour Cray's designs to successfully use multiple CPUs. This had been attempted in the CDC 8600 in the early 1970s, but the emitter-coupled logic (ECL) transistors of the era were too difficult to package into a working machine. The Cray-2 addressed this through the use of ECL integrated circuits, packing them in a novel 3D wiring that greatly increased circuit density. The dense packaging and resulting heat loads were a major problem for the Cray-2. This was solved in a unique fashion by forcing the electrically inert Fluorinert liquid through the circuitry under pressure and then cooling it outside the processor box. The unique "waterfall" cooler system came to represent high-performance computing in the public eye and was found in many informational films and as a movie prop for some time. Unlike the original Cray-1, the Cray-2 had difficulties delivering peak performance. Other machines from the company, like the X-MP and Y-MP, outsold the Cray-2 by a wide margin. When Cray began development of the Cray-3, the company chose to develop the Cray C90 series instead. This is the same sequence of events that occurred when the 8600 was being developed, and as in that case, Cray left the company.
  • 463
  • 04 Nov 2022