Topic Review
Roadrunner (Supercomputer)
Roadrunner was a supercomputer built by IBM for the Los Alamos National Laboratory in New Mexico, USA. The US$100-million Roadrunner was designed for a peak performance of 1.7 petaflops. On May 25, 2008, it achieved 1.026 petaflops, becoming the world's first system to sustain 1.0 petaflops on the TOP500 LINPACK benchmark. In November 2008, it reached a top performance of 1.456 petaflops, retaining its top spot on the TOP500 list. It was also the fourth-most energy-efficient supercomputer in the world on the Supermicro Green500 list, with an operational rate of 444.94 megaflops per watt. The hybrid Roadrunner design was later reused for several other energy-efficient supercomputers. Roadrunner was decommissioned by Los Alamos on March 31, 2013. In its place, Los Alamos commissioned Cielo, a smaller and more energy-efficient supercomputer installed in 2010 at a cost of $54 million.
  • 620
  • 19 Oct 2022
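A quick consistency check of the figures above: at 444.94 megaflops per watt, sustaining 1.026 petaflops implies a power draw of roughly 2.3 megawatts. The short sketch below simply restates that arithmetic; the resulting wattage is an estimate derived from the quoted numbers, not an officially reported figure.

    # Estimate Roadrunner's power draw from the figures quoted in the entry above.
    sustained_flops = 1.026e15             # 1.026 petaflops (LINPACK, May 2008)
    efficiency_flops_per_watt = 444.94e6   # 444.94 megaflops per watt (Green500)

    power_watts = sustained_flops / efficiency_flops_per_watt
    print(f"Implied power draw: {power_watts / 1e6:.2f} MW")  # about 2.31 MW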
Topic Review
Louvain Modularity
The Louvain method for community detection, created by Blondel et al. from the University of Louvain (the affiliation of the authors gave the method its name), extracts communities from large networks. It is a greedy modularity-optimization method that appears to run in time [math]\displaystyle{ O(n \log n) }[/math].
  • 620
  • 10 Nov 2022
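To make the quantity being optimized concrete, the sketch below (an illustration only, not the authors' implementation) computes the modularity [math]\displaystyle{ Q = \frac{1}{2m}\sum_{ij}\left(A_{ij} - \frac{k_i k_j}{2m}\right)\delta(c_i,c_j) }[/math] of a given partition of a small undirected graph; the toy graph and its two-community partition are invented for the example.

    # Modularity of a partition of an undirected graph (illustrative sketch).
    # Q = (1/2m) * sum_ij [A_ij - k_i*k_j/(2m)] * delta(c_i, c_j)

    def modularity(adj, communities):
        """adj: dict node -> set of neighbours; communities: dict node -> community id."""
        m = sum(len(nbrs) for nbrs in adj.values()) / 2  # number of edges
        q = 0.0
        for i in adj:
            for j in adj:
                if communities[i] != communities[j]:
                    continue
                a_ij = 1.0 if j in adj[i] else 0.0
                q += a_ij - len(adj[i]) * len(adj[j]) / (2 * m)
        return q / (2 * m)

    # Two triangles joined by a single edge; each triangle is one community.
    adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
    parts = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
    print(round(modularity(adj, parts), 3))  # about 0.357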
Topic Review
Sensor Data Fusion Algorithms
Sensor data fusion (SDF) algorithms and methods have been utilised in many applications, ranging from automobiles to healthcare systems. They can be used to design a redundant, reliable, and complementary system with the intent of enhancing the system's performance. SDF can be multifaceted, involving many representations such as pixels, features, signals, and symbols.
  • 619
  • 12 Dec 2023
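As one small, hedged illustration of the redundant/complementary idea (a generic weighting technique, not a specific method from this entry), the sketch below fuses two noisy readings of the same quantity by inverse-variance weighting, which yields an estimate with lower variance than either sensor alone; the sensor values and variances are invented.

    # Inverse-variance (weighted-average) fusion of two redundant sensor readings.
    def fuse(z1, var1, z2, var2):
        """Return the fused estimate and its variance."""
        w1, w2 = 1.0 / var1, 1.0 / var2
        fused = (w1 * z1 + w2 * z2) / (w1 + w2)
        fused_var = 1.0 / (w1 + w2)
        return fused, fused_var

    # Example: two temperature sensors observing the same room (made-up numbers).
    estimate, variance = fuse(z1=21.3, var1=0.4, z2=20.8, var2=0.1)
    print(f"fused = {estimate:.2f} C, variance = {variance:.3f}")  # variance < min(0.4, 0.1)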
Topic Review
Famous Photographical Manipulations
Photographic manipulation is the alteration of a photograph. The U.S. National Press Photographers Association (NPPA) Digital Manipulation Code of Ethics states: “As journalists we believe the guiding principle of our profession is accuracy; therefore, we believe it is wrong to alter the content of a photograph in any way that deceives the public. As photojournalists, we have the responsibility to document society and to preserve its images as a matter of historical record. It is clear that the emerging electronic technologies provide new challenges to the integrity of photographic images... in light of this, we the National Press Photographers Association, reaffirm the basis of our ethics: Accurate representation is the benchmark of our profession. We believe photojournalistic guidelines for fair and accurate reporting should be the criteria for judging what may be done electronically to a photograph. Altering the editorial content... is a breach of the ethical standards recognized by the NPPA.”
  • 618
  • 17 Oct 2022
Topic Review
MIMO Wireless Signals
This entry presents a comprehensive, contemporary review of the latest subsystems, architectures, and integrated technologies for backhauling MIMO wireless signals over optical fibre or fibre access networks, such as passive optical networks (PONs).
  • 616
  • 09 Feb 2021
Topic Review
MATE
MATE (/ˈmɑːtɛ/) is a desktop environment composed of free and open-source software that runs on Linux, BSD, and illumos operating systems.
  • 616
  • 23 Nov 2022
Topic Review
History of Programming Languages
History of Programming Languages (HOPL) is an infrequent ACM SIGPLAN conference. Past conferences were held in 1978, 1993, and 2007. The fourth conference was originally intended to take place in June 2020, but was postponed because of the COVID-19 pandemic.
  • 615
  • 27 Oct 2022
Topic Review
Deception Technology
Deception technology is a category of cyber security defense. Deception technology products can detect, analyze, and defend against zero-day and advanced attacks, often in real time. They are automated, accurate, and provide insight into malicious activity within internal networks that may go unseen by other types of cyber defense. Deception technology enables a more proactive security posture by seeking to deceive attackers, detect them, and then defeat them, allowing the enterprise to return to normal operations. Existing defense-in-depth cyber technologies have struggled against the increasing wave of sophisticated and persistent human attackers. These technologies seek primarily to defend a perimeter, but neither firewalls nor endpoint security can defend a perimeter with 100% certainty. Cyber-attackers can penetrate these networks and move unimpeded for months, stealing data and intellectual property. Heuristics may find an attacker within the network, but often generate so many alerts that critical alerts are missed. Since 2014, attacks have accelerated and there is evidence that cyber-attackers are penetrating traditional defenses at a rapidly increasing rate. Deception technology considers the human attacker's point of view and method for exploiting and navigating networks to identify and exfiltrate data. It integrates with existing technologies to provide new visibility into internal networks and to share high-probability alerts and threat intelligence with the existing infrastructure.
  • 611
  • 10 Nov 2022
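As a toy, hedged illustration of the detection idea (not any vendor's product), the sketch below runs a decoy TCP listener on an otherwise unused port; since no legitimate client should ever connect to it, every connection attempt can be logged as a high-confidence alert. The port number and log format are invented.

    # Minimal decoy listener: nothing legitimate should ever connect to this port,
    # so every connection attempt is treated as a high-confidence alert.
    import socket
    from datetime import datetime, timezone

    DECOY_PORT = 2222  # made-up unused port posing as an SSH-like service

    def run_decoy(host="0.0.0.0", port=DECOY_PORT):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((host, port))
            srv.listen()
            while True:
                conn, addr = srv.accept()
                with conn:
                    ts = datetime.now(timezone.utc).isoformat()
                    print(f"[ALERT] {ts} connection from {addr[0]}:{addr[1]} to decoy port {port}")

    if __name__ == "__main__":
        run_decoy()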
Topic Review
X Display Manager (Program Type)
In the X Window System, an X display manager is a graphical login manager which starts a session on an X server from the same or another computer. A display manager presents the user with a login screen; a session starts when a user successfully enters a valid combination of username and password. When the display manager runs on the user's computer, it starts the X server before presenting the login screen, optionally repeating the process when the user logs out. In this configuration, the display manager provides, within the X Window System, the functionality that getty and login provide on character-mode terminals. When the display manager runs on a remote computer, it acts like a telnet server, requesting a username and password and starting a remote session. X11 Release 3 introduced display managers in October 1988 with the aim of supporting standalone X terminals, which were just coming onto the market. Various display managers continue in routine use to provide a graphical login prompt on standalone computer workstations running X. X11R4 introduced the X Display Manager Control Protocol (XDMCP) in December 1989 to fix problems in the X11R3 implementation.
  • 611
  • 29 Sep 2022
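The local display-manager loop described above (start the X server, present a login, run the session, repeat on logout) can be sketched conceptually as follows. This is a hedged illustration only, not a real display manager: authentication and the greeter are omitted, and the binary names, display number, and virtual terminal are assumptions.

    # Conceptual local display-manager loop (greatly simplified; no authentication,
    # no greeter, no XDMCP). Binary names, display number, and vt are assumptions.
    import os
    import subprocess
    import time

    DISPLAY = ":1"

    def run_session_once():
        xserver = subprocess.Popen(["X", DISPLAY, "vt7"])   # start the X server
        time.sleep(2)                                       # crude wait for the server to come up
        env = dict(os.environ, DISPLAY=DISPLAY)
        session = subprocess.run(["xterm"], env=env)        # stand-in for a real user session
        xserver.terminate()                                 # stop the server when the session ends
        xserver.wait()
        return session.returncode

    if __name__ == "__main__":
        while True:   # a real display manager would show a login screen on each pass
            run_session_once()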
Topic Review
Loco2
Loco2, now rebranded as Rail Europe, is an online booking service for train travel in the United Kingdom and Europe. It sells tickets through its website and via its smartphone app, which is available on iOS and Android platforms. It was founded in 2006 by brother and sister Jamie and Kate Andrews, and started trading in 2012 from its headquarters in London. The Loco2 website and apps were rebranded as Rail Europe in November 2019.
  • 609
  • 19 Oct 2022
Topic Review
Bunyakovsky Conjecture
The Bunyakovsky conjecture (or Bouniakowsky conjecture) gives a criterion for a polynomial [math]\displaystyle{ f(x) }[/math] in one variable with integer coefficients to give infinitely many prime values in the sequence [math]\displaystyle{ f(1), f(2), f(3),\ldots. }[/math] It was stated in 1857 by the Russian mathematician Viktor Bunyakovsky. The following three conditions are necessary for [math]\displaystyle{ f(x) }[/math] to have the desired prime-producing property: (1) the leading coefficient is positive, (2) the polynomial is irreducible over the integers, and (3) the values [math]\displaystyle{ f(1), f(2), f(3),\ldots }[/math] have no common factor greater than 1. Bunyakovsky's conjecture is that these conditions are sufficient: if [math]\displaystyle{ f(x) }[/math] satisfies (1)-(3), then [math]\displaystyle{ f(n) }[/math] is prime for infinitely many positive integers [math]\displaystyle{ n }[/math].
  • 608
  • 31 Oct 2022
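For example, [math]\displaystyle{ f(x) = x^2 + 1 }[/math] satisfies all three conditions, and the (still open) conjecture would imply that it takes prime values infinitely often. The sketch below lists the first few prime values and checks condition (3) by taking the gcd of the initial values; it is an illustration, not evidence for the conjecture.

    # Prime values of f(x) = x^2 + 1 and a check of the "no common factor" condition.
    from math import gcd

    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def f(x):
        return x * x + 1

    # Condition (3): gcd of the values f(1), f(2), ... should be 1.
    common = 0
    for x in range(1, 20):
        common = gcd(common, f(x))
    print("gcd of first values:", common)  # 1, so there is no fixed prime divisor

    print("x with f(x) prime:", [x for x in range(1, 30) if is_prime(f(x))])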
Topic Review
Signalling (Economics)
In contract theory, signalling (or signaling; see spelling differences) is the idea that one party (the agent) credibly conveys some information about itself to another party (the principal). Although signalling theory was initially developed by Michael Spence based on observed knowledge gaps between organisations and prospective employees, its intuitive nature led it to be adapted to many other domains, such as human resource management, business, and financial markets. In Spence's job-market signalling model, (potential) employees send a signal about their ability level to the employer by acquiring education credentials. The informational value of the credential comes from the fact that the employer believes the credential is positively correlated with greater ability and is difficult for low-ability employees to obtain. Thus the credential enables the employer to reliably distinguish low-ability workers from high-ability workers. The concept of signalling is also applicable in competitive altruistic interaction, where the capacity of the receiving party is limited.
  • 606
  • 21 Nov 2022
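In the simplest two-type version of the model, a separating equilibrium requires the credential to be worth acquiring for the high-ability type but too costly for the low-ability type to imitate. The following hedged sketch checks those two incentive conditions; all wages, per-unit signalling costs, and the education level are invented numbers.

    # Incentive-compatibility check for a separating equilibrium in a two-type
    # Spence-style signalling model (illustrative numbers only).
    w_high, w_low = 100.0, 60.0     # wages paid to workers believed high/low ability
    c_high, c_low = 5.0, 15.0       # per-unit cost of education: lower for high ability
    e = 3.0                         # education level required to be read as "high ability"

    wage_gain = w_high - w_low
    high_type_signals = wage_gain >= c_high * e   # high type prefers to acquire the credential
    low_type_abstains = wage_gain <= c_low * e    # low type prefers not to imitate

    print("separating equilibrium supported:", high_type_signals and low_type_abstains)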
Topic Review
Opaque Forest Problem
In computational geometry, the opaque forest problem can be stated as follows: "Given a convex polygon C in the plane, determine the minimal forest T of closed, bounded line segments such that every line through C also intersects T". T is said to be the opaque forest, or barrier of C. C is said to be the coverage of T. While any forest that covers C is a barrier of C, we wish to find the one with shortest length. It may be the case that T is constrained to be strictly interior or exterior to C. In this case, we specifically refer to a barrier as interior or exterior. Otherwise, the barrier is assumed to have no constraints on its location.
  • 605
  • 08 Oct 2022
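A concrete illustration with the unit square as [math]\displaystyle{ C }[/math] (a standard example; the two lengths below are elementary facts, not the optimal solution): the full boundary is always a barrier, and for the square three of its four sides already suffice, since any line through the square that is not collinear with the removed side can meet that side in at most one point and must therefore also cross one of the three retained sides. Thus [math]\displaystyle{ |T_{\text{boundary}}| = 4 }[/math] while [math]\displaystyle{ |T_{\text{three sides}}| = 3 }[/math], and still shorter barriers for the square are known.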
Topic Review
Non-Standard Analysis
The history of calculus is fraught with philosophical debates about the meaning and logical validity of fluxions or infinitesimal numbers. The standard way to resolve these debates is to define the operations of calculus using epsilon–delta procedures rather than infinitesimals. Non-standard analysis instead reformulates the calculus using a logically rigorous notion of infinitesimal numbers. Non-standard analysis was originated in the early 1960s by the mathematician Abraham Robinson. He wrote: "... the idea of infinitely small or infinitesimal quantities seems to appeal naturally to our intuition. At any rate, the use of infinitesimals was widespread during the formative stages of the Differential and Integral Calculus. As for the objection ... that the distance between two distinct real numbers cannot be infinitely small, Gottfried Wilhelm Leibniz argued that the theory of infinitesimals implies the introduction of ideal numbers which might be infinitely small or infinitely large compared with the real numbers but which were to possess the same properties as the latter". Robinson argued that this law of continuity of Leibniz's is a precursor of the transfer principle. Robinson continued: "However, neither he nor his disciples and successors were able to give a rational development leading up to a system of this sort. As a result, the theory of infinitesimals gradually fell into disrepute and was replaced eventually by the classical theory of limits." "It is shown in this book that Leibniz's ideas can be fully vindicated and that they lead to a novel and fruitful approach to classical Analysis and to many other branches of mathematics. The key to our method is provided by the detailed analysis of the relation between mathematical languages and mathematical structures which lies at the bottom of contemporary model theory." In 1973, intuitionist Arend Heyting praised non-standard analysis as "a standard model of important mathematical research".
  • 604
  • 17 Nov 2022
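A standard one-line illustration of the reformulation (elementary, and not taken from Robinson's book): to differentiate [math]\displaystyle{ f(x) = x^2 }[/math], pick a nonzero infinitesimal [math]\displaystyle{ \varepsilon }[/math], form the difference quotient, and discard the leftover infinitesimal with the standard-part map [math]\displaystyle{ \operatorname{st} }[/math]: [math]\displaystyle{ \frac{(x+\varepsilon)^2 - x^2}{\varepsilon} = 2x + \varepsilon, \qquad f'(x) = \operatorname{st}(2x + \varepsilon) = 2x. }[/math]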
Topic Review
Propellerhead Software
Reason Studios (formerly known as Propellerhead Software) is a music software company, based in Stockholm, Sweden, and founded in 1994. It produces the studio emulation Reason.
  • 602
  • 04 Nov 2022
Topic Review
Root System of a Semi-simple Lie Algebra
In mathematics, there is a one-to-one correspondence between reduced crystallographic root systems and semisimple Lie algebras. Here the construction of a root system of a semisimple Lie algebra – and, conversely, the construction of a semisimple Lie algebra from a reduced crystallographic root system – are shown.
  • 599
  • 17 Nov 2022
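A concrete instance of the correspondence (a standard textbook example, included here only as an illustration): for [math]\displaystyle{ \mathfrak{g} = \mathfrak{sl}(n,\mathbb{C}) }[/math] with Cartan subalgebra [math]\displaystyle{ \mathfrak{h} }[/math] the diagonal traceless matrices, the roots are the functionals [math]\displaystyle{ e_i - e_j }[/math] with [math]\displaystyle{ i \neq j }[/math], where [math]\displaystyle{ e_i(\operatorname{diag}(t_1,\ldots,t_n)) = t_i }[/math], and the root space for [math]\displaystyle{ e_i - e_j }[/math] is spanned by the matrix unit [math]\displaystyle{ E_{ij} }[/math]; the resulting root system is of type [math]\displaystyle{ A_{n-1} }[/math].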
Topic Review
Heuristic (Computer Science)
In mathematical optimization and computer science, a heuristic (from Greek εὑρίσκω "I find, discover") is a technique designed to solve a problem more quickly when classic methods are too slow, or to find an approximate solution when classic methods fail to find any exact solution. This is achieved by trading optimality, completeness, accuracy, or precision for speed. In a way, it can be considered a shortcut. A heuristic function, also simply called a heuristic, is a function that ranks alternatives in search algorithms at each branching step, based on available information, to decide which branch to follow. For example, it may approximate the exact solution.
  • 598
  • 14 Oct 2022
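As a generic illustration of a heuristic function ranking alternatives in a search (not tied to any particular source), the sketch below runs greedy best-first search on a small invented grid, ordering frontier cells by the Manhattan-distance heuristic; the heuristic estimates, but does not guarantee, the cheapest remaining path.

    # Greedy best-first search on a grid, ranking frontier nodes by the
    # Manhattan-distance heuristic h(n) = |dx| + |dy| (an estimate, not a guarantee).
    import heapq

    def manhattan(a, b):
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    def greedy_best_first(start, goal, walls, width, height):
        frontier = [(manhattan(start, goal), start)]
        came_from = {start: None}
        while frontier:
            _, current = heapq.heappop(frontier)
            if current == goal:
                path = []
                while current is not None:
                    path.append(current)
                    current = came_from[current]
                return path[::-1]
            x, y = current
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                        and nxt not in walls and nxt not in came_from):
                    came_from[nxt] = current
                    heapq.heappush(frontier, (manhattan(nxt, goal), nxt))
        return None

    # Tiny made-up grid with a wall segment between start and goal.
    walls = {(2, 0), (2, 1), (2, 2)}
    print(greedy_best_first((0, 0), (4, 0), walls, width=5, height=4))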
Topic Review
July 2009 Cyberattacks
The July 2009 cyberattacks were a series of coordinated cyberattacks against major government, news media, and financial websites in South Korea and the United States. The attacks involved the activation of a botnet (a large number of hijacked computers) that maliciously accessed the targeted websites with the intention of overloading their servers with traffic, known as a DDoS attack. Most of the hijacked computers were located in South Korea. Estimates of the number of hijacked computers vary widely: around 20,000 according to the South Korean National Intelligence Service, around 50,000 according to Symantec's Security Technology Response group, and more than 166,000 according to a Vietnamese computer security researcher who analyzed the log files of the two servers the attackers controlled. An investigation revealed that at least 39 websites were targets in the attacks, based on files stored on compromised systems. The targeting and timing of the attacks, which started the same day as a North Korean short-range ballistic missile test, have led to suggestions that they may have come from North Korea, although these suggestions have not been substantiated. Researchers would later find links between these cyberattacks, the DarkSeoul attacks in 2013, and other attacks attributed to the Lazarus Group. This attack is considered by some to be the beginning of a series of DDoS attacks carried out by Lazarus, dubbed "Operation Troy."
  • 598
  • 11 Nov 2022
Topic Review
MediaWiki Extension
MediaWiki extensions allow MediaWiki to be made more advanced and useful for various purposes. These extensions vary greatly in complexity. The Wikimedia Foundation operates a Git server where many extensions are hosted, and a directory of them can be found on the MediaWiki website. Other sites known for developing or supporting extensions include MediaWiki.org, which maintains an extension matrix, and Google Code. MediaWiki code review is itself facilitated through a Gerrit instance. Since version 1.16, MediaWiki has also used the jQuery library.
  • 596
  • 09 Nov 2022
Topic Review
Regional Accreditation
Regional accreditation is the educational accreditation of schools, colleges, and universities in the United States by one of seven regional accrediting agencies. Accreditation is a voluntary process by which colleges demonstrate to each other, and sometimes to employers and licensing agencies, that their credits and degrees meet minimum standards. It is the self-regulation of the higher education industry. Each regional accreditor oversees the vast majority of public and private educational institutions, both not-for-profit and for-profit, in its region. Their primary function is accreditation of post-secondary institutions, though there is a limited amount of accreditation of primary and secondary schools. Regional accreditation is older than national accreditation and, with a few exceptions, more rigorous than national accreditation. Additionally, most non-profit institutions are regionally accredited while most for-profit colleges and universities are nationally accredited.
  • 595
  • 27 Oct 2022