Topic Review
Wolfram Alpha
Wolfram Alpha (also styled WolframAlpha, and Wolfram|Alpha) is a computational knowledge engine or answer engine developed by Wolfram Alpha LLC, a subsidiary of Wolfram Research. It is an online service that answers factual queries directly by computing the answer from externally sourced "curated data", rather than providing a list of documents or web pages that might contain the answer, as a search engine might. Wolfram Alpha, which was released on May 18, 2009, is based on Wolfram's earlier flagship product Wolfram Mathematica, a computational platform or toolkit that encompasses computer algebra, symbolic and numerical computation, visualization, and statistics capabilities. Additional data is gathered from both academic and commercial websites such as the CIA's The World Factbook, the United States Geological Survey, the Cornell Lab of Ornithology publication All About Birds, Chambers Biographical Dictionary, Dow Jones, the Catalogue of Life, CrunchBase, Best Buy, the FAA and optionally a user's Facebook account.
  • 1.2K
  • 02 Nov 2022
Topic Review
IEEE 802.11ax-2021
IEEE 802.11ax-2021 or 802.11ax is an IEEE standard for wireless local-area networks (WLANs) and the successor of 802.11ac. It is marketed as Wi-Fi 6 (2.4 GHz and 5 GHz) and Wi-Fi 6E (6 GHz) by the Wi-Fi Alliance. It is also known as High Efficiency Wi-Fi, for its overall improvements to Wi-Fi 6 clients in dense environments. It is designed to operate in license-exempt bands between 1 and 7.125 GHz, including the 2.4 and 5 GHz bands already in common use as well as the much wider 6 GHz band (5.925–7.125 GHz in the US). The main goal of this standard is enhancing throughput-per-area in high-density scenarios, such as corporate offices, shopping malls and dense residential apartments. While the nominal data rate improvement over 802.11ac is only 37%, the overall throughput improvement (over an entire network) is 400% (hence High Efficiency). This also translates to 75% lower latency. The quadrupling of overall throughput is made possible by higher spectral efficiency. The key feature underpinning 802.11ax is orthogonal frequency-division multiple access (OFDMA), which is equivalent to cellular technology applied to Wi-Fi. Other improvements in spectrum utilization are better power-control methods to avoid interference with neighboring networks, higher-order 1024-QAM, MIMO and MU-MIMO in the uplink direction in addition to the downlink to further increase throughput, as well as reliability improvements such as Target Wake Time for power consumption and the WPA3 security protocol. The IEEE 802.11ax-2021 standard was approved on February 9, 2021.
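As a rough illustration of where the nominal-rate gain comes from, the sketch below compares peak per-stream PHY rates on an 80 MHz channel using widely published PHY parameters (data-subcarrier counts, symbol durations, rate-5/6 coding). It is a back-of-the-envelope estimate, not the standard's exact figures; different guard-interval assumptions shift the result by a couple of percentage points around the ~37% quoted above.

```python
# Back-of-the-envelope comparison of per-stream peak PHY rates for an
# 80 MHz channel. Constants are standard published PHY parameters; the
# result only approximates the ~37% figure quoted in the text.
from math import log2

def phy_rate(data_subcarriers, qam_order, coding_rate, symbol_time_s):
    """Peak PHY rate per spatial stream, in Mbit/s."""
    bits_per_symbol = data_subcarriers * log2(qam_order) * coding_rate
    return bits_per_symbol / symbol_time_s / 1e6

# 802.11ac, 80 MHz: 234 data subcarriers, 256-QAM, 3.6 us symbol (short GI)
ac = phy_rate(234, 256, 5/6, 3.6e-6)
# 802.11ax, 80 MHz: 980 data subcarriers, 1024-QAM, 13.6 us symbol (0.8 us GI)
ax = phy_rate(980, 1024, 5/6, 13.6e-6)

print(f"802.11ac: {ac:.0f} Mbit/s, 802.11ax: {ax:.0f} Mbit/s "
      f"(+{100 * (ax / ac - 1):.0f}% nominal per stream)")
```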
  • 1.2K
  • 11 Nov 2022
Topic Review
Office Online
Office Online (known before 2014 as Office Web Apps and as of July 2019 as Office) is an online office suite offered by Microsoft, which allows users to create and edit files using lightweight Microsoft Office web apps: Word, Excel, PowerPoint and OneNote. The offering also includes Outlook.com, People, Calendar and OneDrive, all of which are accessible from a unified app switcher. Users can install the on-premises version of this service, called Office Online Server, in private clouds in conjunction with SharePoint, Microsoft Exchange Server and Microsoft Lync Server.
  • 1.2K
  • 25 Nov 2022
Topic Review
IBM zEnterprise System
IBM zEnterprise System is an IBM mainframe designed to offer both mainframe and distributed server technologies in an integrated system. The zEnterprise System consists of three components. First is a System z server: a choice of the newest enterprise-class server, the IBM zEnterprise EC12, announced August 28, 2012; the smaller business-class server, the IBM zEnterprise 114 (z114), announced July 2011; or the older enterprise-class server, the IBM zEnterprise 196 (z196), introduced July 2010. Second is the IBM zEnterprise BladeCenter Extension (zBX), the infrastructure designed to provide logical integration and host IBM WebSphere DataPower Integrated Appliance XI50 for zEnterprise (DataPower XI50z) or general-purpose x86 or Power ISA blades. Last is the management layer, IBM zEnterprise Unified Resource Manager (zManager), which provides a single management view of zEnterprise resources. In July 2013, IBM introduced an updated version of the z114 called the zBC12, along with a special version of it designed to be a Linux virtualization server, the zBC12 Enterprise Linux Server, which runs only Linux hosts on the underlying z/VM hypervisor. In January 2015, IBM introduced the z13 mainframe, and in February 2016 the z13s was introduced; it is the last z Systems server to support running an operating system in ESA/390 architecture mode. In July 2017, IBM introduced the z14 mainframe. In April 2018, IBM introduced the z14 ZR1 mainframe. In September 2019, IBM introduced the z15 mainframe.
  • 1.2K
  • 20 Oct 2022
Topic Review
Cyc
Cyc (pronounced /ˈsaɪk/ SYKE) is a long-term artificial intelligence project that aims to assemble a comprehensive ontology and knowledge base spanning the basic concepts and rules about how the world works. Hoping to capture common-sense knowledge, Cyc focuses on implicit knowledge that other AI platforms may take for granted, in contrast with facts one might find somewhere on the internet or retrieve via a search engine or Wikipedia. Cyc enables semantic reasoners to perform human-like reasoning and to be less "brittle" when confronted with novel situations. Douglas Lenat began the project in July 1984 at MCC, where he was Principal Scientist from 1984 to 1994; since January 1995, the project has been under active development by the Cycorp company, where Lenat is the CEO.
  • 1.2K
  • 19 Oct 2022
Topic Review
Schwarz Triangle Function
In complex analysis, the Schwarz triangle function or Schwarz s-function is a function that conformally maps the upper half plane to a triangle in the upper half plane having lines or circular arcs for edges. Let πα, πβ, and πγ be the interior angles at the vertices of the triangle. If any of α, β, and γ are greater than zero, then the Schwarz triangle function can be given in terms of hypergeometric functions as

s(z) = z^α ₂F₁(a′, b′; c′; z) / ₂F₁(a, b; c; z),

where a = (1−α−β−γ)/2, b = (1−α+β−γ)/2, c = 1−α, a′ = a − c + 1 = (1+α−β−γ)/2, b′ = b − c + 1 = (1+α+β−γ)/2, and c′ = 2 − c = 1 + α. This mapping has singular points at z = 0, 1, and ∞, corresponding to the vertices of the triangle with angles πα, πγ, and πβ respectively. The formula can be derived using the Schwarzian derivative. This function can be used to map the upper half-plane to a spherical triangle on the Riemann sphere if α + β + γ > 1, or a hyperbolic triangle on the Poincaré disk if α + β + γ < 1. When α + β + γ = 1, the triangle is a Euclidean triangle with straight edges: a = 0, ₂F₁(a, b; c; z) = 1, and the formula reduces to that given by the Schwarz–Christoffel transformation. In the special case of ideal triangles, where all the angles are zero, the triangle function yields the modular lambda function. This function was introduced by H. A. Schwarz as the inverse function of the conformal mapping uniformizing a Schwarz triangle. Applying successive hyperbolic reflections in its sides, such a triangle generates a tessellation of the upper half plane (or the unit disk after composition with the Cayley transform). The conformal mapping of the upper half plane onto the interior of the geodesic triangle generalizes the Schwarz–Christoffel transformation. By the Schwarz reflection principle, the discrete group generated by hyperbolic reflections in the sides of the triangle induces an action on the two-dimensional space of solutions. On the orientation-preserving normal subgroup, this two-dimensional representation corresponds to the monodromy of the hypergeometric ordinary differential equation and induces a group of Möbius transformations on quotients of solutions. Since the triangle function is the inverse function of such a quotient, it is therefore an automorphic function for this discrete group of Möbius transformations. This is a special case of a general method of Henri Poincaré that associates automorphic forms with ordinary differential equations with regular singular points.
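The hypergeometric expression above is straightforward to evaluate numerically. The sketch below is a minimal illustration using SciPy's hyp2f1; the (2, 3, 7) hyperbolic triangle (α, β, γ) = (1/2, 1/3, 1/7) is chosen arbitrarily as an example.

```python
# Minimal numerical sketch of the Schwarz triangle function, built
# directly from the hypergeometric formula in the text.
from scipy.special import hyp2f1

def schwarz_triangle(z, alpha, beta, gamma):
    """s(z) = z^alpha * 2F1(a', b'; c'; z) / 2F1(a, b; c; z)."""
    a = (1 - alpha - beta - gamma) / 2
    b = (1 - alpha + beta - gamma) / 2
    c = 1 - alpha
    ap, bp, cp = a - c + 1, b - c + 1, 2 - c
    return z**alpha * hyp2f1(ap, bp, cp, z) / hyp2f1(a, b, c, z)

# Hyperbolic case (alpha + beta + gamma < 1): the (2, 3, 7) triangle
alpha, beta, gamma = 1/2, 1/3, 1/7
for z in (0.1, 0.5, 0.9):
    print(z, schwarz_triangle(z, alpha, beta, gamma))
```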
  • 1.2K
  • 13 Oct 2022
Topic Review
Geologic Modelling
Geologic modelling, geological modelling or geomodelling is the applied science of creating computerized representations of portions of the Earth's crust based on geophysical and geological observations made on and below the Earth's surface. A geomodel is the numerical equivalent of a three-dimensional geological map complemented by a description of physical quantities in the domain of interest. Geomodelling is related to the concept of the Shared Earth Model, which is a multidisciplinary, interoperable and updatable knowledge base about the subsurface. Geomodelling is commonly used for managing natural resources, identifying natural hazards, and quantifying geological processes, with main applications to oil and gas fields, groundwater aquifers and ore deposits. For example, in the oil and gas industry, realistic geologic models are required as input to reservoir simulator programs, which predict the behavior of the rocks under various hydrocarbon recovery scenarios. A reservoir can only be developed and produced once; therefore, making a mistake by selecting a site with poor conditions for development is tragic and wasteful. Using geological models and reservoir simulation allows reservoir engineers to identify which recovery options offer the safest and most economic, efficient, and effective development plan for a particular reservoir. Geologic modelling is a relatively recent subdiscipline of geology which integrates structural geology, sedimentology, stratigraphy, paleoclimatology, and diagenesis. In 2 dimensions (2D), a geologic formation or unit is represented by a polygon, which can be bounded by faults, unconformities or by its lateral extent, or crop. In geological models a geological unit is bounded by 3-dimensional (3D) triangulated or gridded surfaces; the equivalent of the mapped polygon is the fully enclosed geological unit, using a triangulated mesh. For the purpose of property or fluid modelling these volumes can be separated further into an array of cells, often referred to as voxels (volumetric elements). These 3D grids are the equivalent of the 2D grids used to express properties of single surfaces. Geomodelling generally involves several successive steps, culminating in the cell-based representation sketched below.
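To make the voxel description concrete, here is a minimal sketch (illustrative names, dimensions, and values only) of a geological unit stored as a regular 3-D cell grid with one physical property per cell:

```python
# A minimal sketch of the voxel (cell) representation: a geological
# unit discretized on a regular 3-D grid, with porosity stored per
# cell. All dimensions and values are illustrative only.
import numpy as np

nx, ny, nz = 50, 40, 20          # grid dimensions (cells)
dx, dy, dz = 100.0, 100.0, 5.0   # cell sizes in metres

porosity = np.full((nx, ny, nz), np.nan)   # NaN marks cells outside the unit

# Mark which cells belong to the unit (a flat slab, for simplicity)
inside = np.zeros((nx, ny, nz), dtype=bool)
inside[:, :, 5:15] = True

# Assign a property to the unit's cells: porosity decreasing with
# layer index (a purely illustrative trend)
layer = np.arange(nz)
porosity[inside] = np.broadcast_to(0.25 - 0.005 * layer, (nx, ny, nz))[inside]

bulk_volume = inside.sum() * dx * dy * dz
print(f"unit volume: {bulk_volume:.3e} m^3, "
      f"mean porosity: {np.nanmean(porosity):.3f}")
```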
  • 1.1K
  • 06 Oct 2022
Topic Review
Setspn
Kerberos (/ˈkɜːrbərɒs/) is a computer-network authentication protocol that works on the basis of tickets to allow nodes communicating over a non-secure network to prove their identity to one another in a secure manner. Its designers aimed it primarily at a client–server model, and it provides mutual authentication—both the user and the server verify each other's identity. Kerberos protocol messages are protected against eavesdropping and replay attacks. Kerberos builds on symmetric-key cryptography and requires a trusted third party, and optionally may use public-key cryptography during certain phases of authentication. Kerberos uses UDP port 88 by default. The protocol was named after the character Kerberos (or Cerberus) from Greek mythology, the ferocious three-headed guard dog of Hades.
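The trusted-third-party idea can be illustrated with a toy script: a key-distribution center that shares a long-term symmetric key with each principal issues a fresh session key together with a "ticket" the client can relay but not read. This is a conceptual sketch only, using the Python cryptography package's Fernet primitive; it does not reproduce Kerberos's actual message formats, authenticators, or ticket contents.

```python
# Toy illustration of the trusted-third-party idea behind Kerberos.
# Not the real protocol: just the key relationships it relies on.
from cryptography.fernet import Fernet

# Long-term keys, shared out-of-band with the KDC
client_key = Fernet.generate_key()
service_key = Fernet.generate_key()

# --- KDC: issue a session key and a ticket only the service can read ---
session_key = Fernet.generate_key()
ticket = Fernet(service_key).encrypt(b"session-key:" + session_key)
reply_to_client = Fernet(client_key).encrypt(session_key)

# --- Client: recover the session key; the ticket stays opaque to it ---
session_key_at_client = Fernet(client_key).decrypt(reply_to_client)

# --- Service: decrypt the relayed ticket with its own long-term key ---
session_key_at_service = Fernet(service_key).decrypt(ticket).split(b":", 1)[1]

assert session_key_at_client == session_key_at_service
print("client and service share a session key without ever exchanging "
      "long-term secrets with each other")
```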
  • 1.1K
  • 25 Oct 2022
Topic Review
Geochemical Modeling
Geochemical modeling is the practice of using chemical thermodynamics, chemical kinetics, or both, to analyze the chemical reactions that affect geologic systems, commonly with the aid of a computer. It is used in high-temperature geochemistry to simulate reactions occurring deep in the Earth's interior, in magma, for instance, or to model low-temperature reactions in aqueous solutions near the Earth's surface, the subject of this article.
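As a flavor of the low-temperature aqueous calculations such models perform, the sketch below computes the equilibrium speciation of dissolved carbonate as a function of pH from the standard 25 °C dissociation constants of carbonic acid (pK1 ≈ 6.35, pK2 ≈ 10.33). Real geochemical codes solve far larger coupled systems of mass-action and mass-balance equations; this is a single-system illustration.

```python
# Equilibrium speciation of the carbonate system versus pH, using the
# standard 25 degC dissociation constants of carbonic acid.
pK1, pK2 = 6.35, 10.33
K1, K2 = 10.0**-pK1, 10.0**-pK2

def carbonate_fractions(pH):
    """Fractions of H2CO3*, HCO3-, and CO3^2- in total dissolved carbonate."""
    h = 10.0**-pH
    denom = h*h + K1*h + K1*K2
    return h*h/denom, K1*h/denom, K1*K2/denom

for pH in (4.0, 7.0, 8.3, 11.0):
    f0, f1, f2 = carbonate_fractions(pH)
    print(f"pH {pH}: H2CO3* {f0:.2f}  HCO3- {f1:.2f}  CO3-- {f2:.2f}")
```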
  • 1.1K
  • 26 Oct 2022
Topic Review
Nine-Point Circle
In geometry, the nine-point circle is a circle that can be constructed for any given triangle. It is so named because it passes through nine significant concyclic points defined from the triangle. These nine points are the midpoint of each side of the triangle, the foot of each altitude, and the midpoint of the line segment from each vertex to the orthocenter (the point where the three altitudes meet). The nine-point circle is also known as Feuerbach's circle, Euler's circle, Terquem's circle, the six-points circle, the twelve-points circle, the n-point circle, the medioscribed circle, the mid circle or the circum-midcircle. Its center is the nine-point center of the triangle.
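A quick numerical check of the construction: the sketch below computes the nine points for an arbitrary triangle and verifies they are equidistant from the nine-point center, which lies at the midpoint of the circumcenter O and the orthocenter H.

```python
# Verify the nine-point circle numerically for one triangle.
import numpy as np

A, B, C = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([1.0, 3.0])

def circumcenter(P, Q, R):
    (ax, ay), (bx, by), (cx, cy) = P, Q, R
    d = 2 * (ax*(by - cy) + bx*(cy - ay) + cx*(ay - by))
    ux = ((ax**2 + ay**2)*(by - cy) + (bx**2 + by**2)*(cy - ay)
          + (cx**2 + cy**2)*(ay - by)) / d
    uy = ((ax**2 + ay**2)*(cx - bx) + (bx**2 + by**2)*(ax - cx)
          + (cx**2 + cy**2)*(bx - ax)) / d
    return np.array([ux, uy])

def altitude_foot(P, Q, R):
    """Foot of the perpendicular dropped from P onto line QR."""
    t = np.dot(P - Q, R - Q) / np.dot(R - Q, R - Q)
    return Q + t * (R - Q)

O = circumcenter(A, B, C)
H = A + B + C - 2 * O      # orthocenter, from the identity OH = OA + OB + OC
N = (O + H) / 2            # nine-point center

nine = [(A + B)/2, (B + C)/2, (C + A)/2,                    # side midpoints
        altitude_foot(A, B, C), altitude_foot(B, C, A),
        altitude_foot(C, A, B),                             # feet of altitudes
        (A + H)/2, (B + H)/2, (C + H)/2]                    # vertex-H midpoints

radii = [np.linalg.norm(p - N) for p in nine]
print(np.allclose(radii, radii[0]))   # True: all nine points are concyclic
```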
  • 1.1K
  • 19 Oct 2022
Topic Review
Applications of Multi-Connectivity in 5G Networks and Beyond
To manage a growing number of users and an ever-increasing demand for bandwidth, future 5th Generation (5G) cellular networks will combine different radio access technologies (cellular, satellite, and WiFi, among others) and different types of equipment (pico-cells, femto-cells, small-cells, macro-cells, etc.). Multi-connectivity is an emerging paradigm aiming to leverage this heterogeneous architecture. To achieve that, multi-connectivity proposes to enable each User Equipment to simultaneously use component carriers from different and heterogeneous network nodes: base stations, WiFi Access Points, etc. This could offer many benefits in terms of Quality of Service, energy efficiency, fairness, mobility, spectrum and interference management. That is why this survey aims to present an overview of multi-connectivity in 5G networks and Beyond. To do so, a comprehensive review of existing standards and enabling technologies is proposed. Then, a taxonomy is defined to classify the different elements characterizing multi-connectivity in 5G and future networks. Thereafter, existing research works using multi-connectivity to improve Quality of Service, energy efficiency, fairness, mobility management and spectrum and interference management are analyzed and compared. In addition, lessons common to these different contexts are presented. Finally, open challenges for multi-connectivity in 5G networks and Beyond are discussed.
  • 1.1K
  • 24 Oct 2022
Topic Review
Reduction (Complexity)
In computability theory and computational complexity theory, a reduction is an algorithm for transforming one problem into another problem. A reduction from one problem to another may be used to show that the second problem is at least as difficult as the first. Intuitively, problem A is reducible to problem B if an algorithm for solving problem B efficiently (if it existed) could also be used as a subroutine to solve problem A efficiently. When this is true, solving A cannot be harder than solving B. "Harder" means having a higher estimate of the required computational resources in a given context (e.g., higher time complexity, greater memory requirement, or the need for extra hardware such as additional processor cores for a parallel solution where a single-threaded one would otherwise suffice). We write A ≤m B, usually with a subscript on the ≤ to indicate the type of reduction being used (m: mapping reduction, p: polynomial reduction). The mathematical structure generated on a set of problems by the reductions of a particular type generally forms a preorder, whose equivalence classes may be used to define degrees of unsolvability and complexity classes.
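A classic concrete example is the polynomial-time many-one reduction from Independent Set to Clique: complementing the graph's edges turns independent sets into cliques, so an efficient Clique solver would yield an efficient Independent Set solver. The brute-force "oracle" below stands in for such a solver purely for illustration.

```python
# Independent Set <=p Clique via graph complementation: a set of
# vertices is independent in G exactly when it is a clique in the
# complement of G. Brute force plays the role of the oracle for B.
from itertools import combinations

def complement(n, edges):
    """Map an Independent Set instance (G, k) to a Clique instance."""
    all_pairs = {frozenset(p) for p in combinations(range(n), 2)}
    return all_pairs - {frozenset(e) for e in edges}

def has_clique(n, edges, k):            # "oracle" for problem B
    return any(all(frozenset(p) in edges for p in combinations(s, 2))
               for s in combinations(range(n), k))

def has_independent_set(n, edges, k):   # solve A via the reduction
    return has_clique(n, complement(n, edges), k)

# Path 0-1-2: {0, 2} is independent, so the answer for k = 2 is True
print(has_independent_set(3, [(0, 1), (1, 2)], 2))
```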
  • 1.1K
  • 30 Sep 2022
Topic Review
Hyperbolic Function
In mathematics, hyperbolic functions are analogs of the ordinary trigonometric functions, defined for the hyperbola rather than the circle: just as the points (cos t, sin t) form a circle with unit radius, the points (cosh t, sinh t) form the right half of the equilateral hyperbola. Hyperbolic functions occur in the solutions of many linear differential equations (for example, the equation defining a catenary), of some cubic equations, in calculations of angles and distances in hyperbolic geometry, and of Laplace's equation in Cartesian coordinates. Laplace's equation is important in many areas of physics, including electromagnetic theory, heat transfer, fluid dynamics, and special relativity. The basic hyperbolic functions are the hyperbolic sine (sinh) and the hyperbolic cosine (cosh), from which are derived the hyperbolic tangent (tanh), hyperbolic cotangent (coth), hyperbolic secant (sech) and hyperbolic cosecant (csch), corresponding to the derived trigonometric functions. The inverse hyperbolic functions are the inverse hyperbolic sine (arsinh), inverse hyperbolic cosine (arcosh), inverse hyperbolic tangent (artanh), and so on. The hyperbolic functions take a real argument called a hyperbolic angle. The size of a hyperbolic angle is twice the area of its hyperbolic sector. The hyperbolic functions may be defined in terms of the legs of a right triangle covering this sector. In complex analysis, the hyperbolic functions arise as the imaginary parts of sine and cosine. The hyperbolic sine and the hyperbolic cosine are entire functions. As a result, the other hyperbolic functions are meromorphic in the whole complex plane. By the Lindemann–Weierstrass theorem, the hyperbolic functions have a transcendental value for every non-zero algebraic value of the argument. Hyperbolic functions were introduced in the 1760s independently by Vincenzo Riccati and Johann Heinrich Lambert. Riccati used Sc. and Cc. (sinus/cosinus circulare) to refer to circular functions and Sh. and Ch. (sinus/cosinus hyperbolico) to refer to hyperbolic functions. Lambert adopted the names but altered the abbreviations to those used today. The abbreviations sh, ch, th, cth are also in use, their choice depending more on the personal preference of influential mathematicians than on the local language.
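For reference, the basic functions above have standard exponential definitions, from which the hyperbolic analogue of the Pythagorean identity follows:

```latex
\sinh x = \frac{e^{x} - e^{-x}}{2}, \qquad
\cosh x = \frac{e^{x} + e^{-x}}{2}, \qquad
\tanh x = \frac{\sinh x}{\cosh x}, \qquad
\cosh^{2} x - \sinh^{2} x = 1 .
```

The last identity is why the points (cosh t, sinh t) trace the right branch of the unit hyperbola x² − y² = 1, just as cos²t + sin²t = 1 puts (cos t, sin t) on the unit circle.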
  • 1.1K
  • 10 Nov 2022
Topic Review
Level Design
Level design, or environment design, is a discipline of game development involving creation of video game levels—locales, stages, or missions. This is commonly done using a level editor, a game development software designed for building levels; however, some games feature built-in level editing tools.
  • 1.1K
  • 08 Oct 2022
Topic Review
SCORE
SCORE is a scorewriter program, written in FORTRAN for MS-DOS by Stanford University Professor Leland Smith (1925–2013) with a reputation for producing very high-quality results. It was widely used in engraving during the 1980s and 1990s and continues to have a small, dedicated following of engravers, many of whom regard it as the world's best music-engraving program due to its ability to position symbols precisely on the page. Several publications set using SCORE have earned Paul Revere and German Musikpresse engraving awards.
  • 1.1K
  • 08 Nov 2022
Topic Review
Unique Games Conjecture
In computational complexity theory, the unique games conjecture (often referred to as UGC) is a conjecture made by Subhash Khot in 2002. The conjecture postulates that the problem of determining the approximate value of a certain type of game, known as a unique game, has NP-hard computational complexity. It has broad applications in the theory of hardness of approximation. If the unique games conjecture is true and P ≠ NP, then for many important problems it is not only impossible to get an exact solution in polynomial time (as postulated by the P versus NP problem), but also impossible to get a good polynomial-time approximation. The problems for which such an inapproximability result would hold include constraint satisfaction problems, which crop up in a wide variety of disciplines. The conjecture is unusual in that the academic world seems about evenly divided on whether it is true or not.
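As a concrete special case, systems of linear equations of the form x_i − x_j ≡ c (mod k), with two variables per equation, are standard examples of unique games: fixing one variable's label determines the other's, so each constraint is a permutation of the label set. The toy brute-force evaluator below is for illustration only; the content of the conjecture is precisely that approximating the best satisfiable fraction is hard for large instances.

```python
# A toy unique-game instance: equations x_i - x_j = c (mod k), each a
# permutation constraint between two variables' labels. Brute force is
# feasible only because this example is tiny.
from itertools import product

k = 3
constraints = [(0, 1, 1), (1, 2, 1), (2, 0, 1)]   # x_i - x_j = c (mod k)

def satisfied(assignment):
    return sum((assignment[i] - assignment[j]) % k == c
               for i, j, c in constraints)

best = max(satisfied(a) for a in product(range(k), repeat=3))
print(f"best assignment satisfies {best}/{len(constraints)} constraints")
```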
  • 1.1K
  • 26 Oct 2022
Topic Review
FAO Country Profiles
The FAO Country Profiles is a multilingual web portal that repackages the Food and Agriculture Organization of the United Nations (FAO) information archive on its global activities in agriculture and food security in a single area and catalogues it exclusively by country and thematic area. The organization aims to offer decision-makers, researchers and project formulators around the world a fast and reliable way to access country-specific information on national food security situations without the need to search individual databases and systems. It complements FAO's databases by providing a simple interface containing interactive maps and charts.
  • 1.1K
  • 09 Nov 2022
Topic Review
Firo (Cryptocurrency)
Firo, formerly known as Zcoin, is a cryptocurrency aimed at using cryptography to provide better privacy for its users compared to other cryptocurrencies such as Bitcoin.
  • 1.1K
  • 28 Nov 2022
Topic Review
When 5G Meets Deep Learning
This paper presents a systematic review of how deep learning is being applied to solve some 5G issues. Unlike the current literature, we examine data from the last decade and works that address diverse 5G-specific problems, such as physical medium state estimation, network traffic prediction, user device location prediction, and network self-management, among others.
  • 1.1K
  • 23 Dec 2020
Topic Review
Nurses' Health Study
The Nurses' Health Study is a series of prospective studies that examine epidemiology and the long-term effects of nutrition, hormones, environment, and nurses' work-life on health and disease development. The studies have been among the largest investigations into risk factors for major chronic diseases ever conducted. The Nurses' Health Studies have led to many insights on health and well-being, including cancer prevention, cardiovascular disease, and type 2 diabetes. They have included clinicians, epidemiologists, and statisticians at the Channing Laboratory (of Brigham and Women's Hospital), Harvard Medical School, Harvard School of Public Health, and several Harvard-affiliated hospitals, including Brigham and Women's Hospital, Dana–Farber Cancer Institute, Children's Hospital Boston, and Beth Israel Deaconess Medical Center.
  • 1.1K
  • 30 Oct 2022