Topic Review
Radiance
Radiance is a suite of tools for performing lighting simulation, originally written by Greg Ward. It includes a renderer as well as many other tools for measuring the simulated light levels. It uses ray tracing to perform all lighting calculations, accelerated by the use of an octree data structure. It pioneered the concept of high-dynamic-range imaging, where light levels are (theoretically) open-ended values instead of a decimal proportion of a maximum (e.g. 0.0 to 1.0) or an integer fraction of a maximum (e.g. 0 to 255 out of 255). It also implements global illumination using the Monte Carlo method to sample the light falling on a point. Greg Ward started developing Radiance in 1985 while at Lawrence Berkeley National Laboratory. The source code was distributed under a license forbidding further redistribution. In January 2002, Radiance 3.4 was relicensed under a less restrictive license. One study found Radiance to be the most generally useful software package for architectural lighting simulation, noting that it often serves as the underlying simulation engine for many other packages.
  • 439
  • 25 Nov 2022
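As a minimal sketch of the Monte Carlo global-illumination idea mentioned above, the snippet below estimates the irradiance at a surface point by averaging randomly sampled incoming radiance over the hemisphere. The `sample_radiance` callback stands in for an actual ray trace and is purely illustrative; this is not Radiance's API.

```python
import math
import random

def estimate_irradiance(sample_radiance, n_samples=256):
    """Monte Carlo estimate of irradiance at a surface point.

    sample_radiance(direction) is a hypothetical callback that traces a ray
    in the given (local-frame) direction and returns the incoming radiance as
    an open-ended HDR value, not one clamped to 0..1 or 0..255.
    """
    total = 0.0
    for _ in range(n_samples):
        # Cosine-weighted sampling of the hemisphere above the surface normal.
        u1, u2 = random.random(), random.random()
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        direction = (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))
        total += sample_radiance(direction)
    # With cosine-weighted sampling the estimator simplifies to pi * mean(radiance).
    return math.pi * total / n_samples
```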
Topic Review
IBM i Control Language
The IBM i Control Language (CL) is a scripting language for IBM's IBM i platform (previously called OS/400 when running on AS/400 systems). It bears a resemblance to IBM Job Control Language and consists of an ever-expanding set of command objects (*CMD) used to invoke traditional AS/400 programs and/or get help on what those programs do. CL can also be used to create CL programs (congruent to shell scripts), where additional commands provide program-like functionality (IF/ELSE, variable declaration, file input, etc.). Although CL is a scripting language for system administration, it is used mainly to create compiled programs; the use of interpreted CL scripts through the SBMDBJOB command is in fact extremely limited. While thousands of commands were written by IBM developers to perform system-level tasks such as compiling programs, backing up data, changing system configurations, displaying system object details, or deleting them, commands are not limited to system-level concerns and can be drafted for user applications as well.
  • 439
  • 29 Nov 2022
Topic Review
Environment Modules
The Environment Modules system is a tool to help users manage their Unix or Linux shell environment, by allowing groups of related environment-variable settings to be made or removed dynamically. Modules has been around since the early 1990s and is used at some of the largest computer centers to deploy multiple versions of different software tools to users. The National Energy Research Scientific Computing Center (NERSC) reports that they use Environment Modules to manage nearly all software. Environment Modules is specified as a Baseline Configuration requirement of the DoD High Performance Computing Modernization Program (HPCMP) Project Baseline Configuration team for participating DoD Supercomputing Resource Centers (DSRCs).
  • 439
  • 01 Dec 2022
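Environment Modules itself is driven by Tcl modulefiles and the `module` shell command; the Python sketch below only illustrates the underlying idea of a named group of environment-variable settings that can be applied and reversed dynamically. All paths and names are illustrative assumptions.

```python
import os

# An illustrative "modulefile": a named group of environment-variable edits.
GCC_12_MODULE = {
    "prepend_path": {"PATH": "/opt/gcc-12/bin", "LD_LIBRARY_PATH": "/opt/gcc-12/lib64"},
    "setenv": {"CC": "gcc-12"},
}

def load(module):
    """Apply the module's settings to the current process environment."""
    for var, value in module["prepend_path"].items():
        old = os.environ.get(var, "")
        os.environ[var] = value + (":" + old if old else "")
    os.environ.update(module["setenv"])

def unload(module):
    """Reverse the module's settings (simplified: no conflict handling)."""
    for var, value in module["prepend_path"].items():
        parts = [p for p in os.environ.get(var, "").split(":") if p and p != value]
        os.environ[var] = ":".join(parts)
    for var in module["setenv"]:
        os.environ.pop(var, None)

load(GCC_12_MODULE)    # roughly what `module load gcc/12` does for a shell session
unload(GCC_12_MODULE)  # roughly what `module unload gcc/12` does
```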
Topic Review
Application Profiling System Architecture
Along with the rise of cloud and edge computing has come a plethora of solutions for deploying and operating different types of applications in such environments. Infrastructure as a service (IaaS) providers offer a number of different hardware solutions to meet the needs of the growing number of distributed applications. In this landscape, it is critical to be able to navigate and discover the best-suited infrastructure solution for an application, taking into account not only the cost of operation but also the quality of service (QoS) it requires. The proposed solution has two main research developments: (a) the creation and optimisation of multidimensional vectors that represent the hardware usage profiles of an application, and (b) the integration of a machine learning classification algorithm, so as to build a system that can derive hardware-agnostic profiles of a wide variety of containerised applications, in terms of nature and computational needs, and classify them against known benchmarks. Given that benchmarks are widely used to evaluate a system's hardware capabilities, a system that can help select which benchmarks best correlate to a given application can help an IaaS provider make a more informed decision or recommendation on the hardware solution, not in a broad sense, but based on the needs of that specific application.
  • 439
  • 20 Dec 2022
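A minimal sketch of the classification step described above, assuming illustrative metric names, benchmark profiles, and labels (not the entry's actual implementation): an application's hardware-usage vector is matched to the most similar known benchmark with a nearest-neighbour classifier.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# A profile is a multidimensional vector of hardware-usage metrics collected
# while a containerised application runs; the feature order is illustrative:
# [cpu_util, mem_bandwidth, disk_iops, net_throughput], each normalised to 0..1.
benchmark_profiles = np.array([
    [0.95, 0.30, 0.05, 0.10],   # CPU-bound benchmark
    [0.40, 0.90, 0.10, 0.15],   # memory-bound benchmark
    [0.20, 0.20, 0.85, 0.10],   # storage-bound benchmark
])
benchmark_labels = ["cpu_bench", "mem_bench", "io_bench"]

# Map an application's measured profile to the most similar benchmark.
clf = KNeighborsClassifier(n_neighbors=1).fit(benchmark_profiles, benchmark_labels)
app_profile = np.array([[0.88, 0.35, 0.08, 0.12]])
print(clf.predict(app_profile))  # -> ['cpu_bench']
```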
Topic Review
Bayesian Inference in Motor Learning
Bayesian inference is a statistical tool that can be applied to motor learning, specifically to adaptation. Adaptation is a short-term learning process involving gradual improvement in performance in response to a change in sensory information. Bayesian inference describes the way the nervous system combines this sensory information with prior knowledge to estimate the position or other characteristics of something in the environment. Bayesian inference can also be used to show how information from multiple senses (e.g. vision and proprioception) can be combined for the same purpose. In either case, Bayesian inference dictates that the estimate is most influenced by whichever source of information is most certain.
  • 438
  • 27 Oct 2022
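A minimal sketch of the precision-weighted combination that Bayesian inference prescribes for two Gaussian sources of information; the numbers are illustrative only.

```python
def combine_gaussian_estimates(mu_prior, var_prior, mu_sense, var_sense):
    """Bayesian combination of a prior and a sensory estimate (both Gaussian).

    The same formula describes combining two senses (e.g. vision and
    proprioception): the more certain source (smaller variance) dominates.
    """
    w_prior, w_sense = 1.0 / var_prior, 1.0 / var_sense
    mu_post = (w_prior * mu_prior + w_sense * mu_sense) / (w_prior + w_sense)
    var_post = 1.0 / (w_prior + w_sense)
    return mu_post, var_post

# Example: a broad prior centred at 0 cm and a sharper visual estimate at 2 cm;
# the posterior (1.6 cm) sits closer to the more certain visual estimate.
print(combine_gaussian_estimates(0.0, 4.0, 2.0, 1.0))  # -> (1.6, 0.8)
```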
Topic Review
Federated Learning Algorithms in Healthcare
Federated Learning (FL), an emerging distributed collaborative artificial intelligence (AI) paradigm, is particularly suitable for smart healthcare because it coordinates the training of a shared model across numerous clients (e.g., healthcare institutes) without the exchange of private data.
  • 438
  • 26 Dec 2022
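A minimal sketch of one FedAvg-style aggregation round, under the assumption that each client shares only its locally trained parameters and its local sample count; the values are illustrative and this is not a specific algorithm from the entry.

```python
import numpy as np

def federated_averaging(client_weights, client_sizes):
    """Aggregate locally trained parameter vectors into a global model.

    Raw patient data never leaves a client (e.g. a healthcare institute);
    only the parameter vectors and sample counts are shared.
    """
    sizes = np.asarray(client_sizes, dtype=float)
    return (np.stack(client_weights) * sizes[:, None]).sum(axis=0) / sizes.sum()

# Three hypothetical institutes report locally trained parameters.
updates = [np.array([0.2, 1.0]), np.array([0.4, 0.8]), np.array([0.3, 0.9])]
print(federated_averaging(updates, client_sizes=[100, 300, 600]))  # -> [0.32 0.88]
```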
Topic Review
Applications of Hybrid Nanofluids
In response to the issues of environment, climate, and human health, coupled with the growing demand for energy due to increasing population and technological advancement, the concept of sustainable and renewable energy is presently receiving unprecedented attention. To achieve these feats, energy savings and efficiency are crucial, in terms of the development of energy-efficient devices and thermal fluids. Limitations associated with the use of conventional thermal fluids led to the discovery of energy-efficient fluids called “nanofluids”, which are established to be better than conventional thermal fluids. Research progress on nanofluids has led to the development of advanced nanofluids coined “hybrid nanofluids” (HNFs), found to possess thermal-optical properties superior to those of conventional thermal fluids and nanofluids.
  • 438
  • 08 Feb 2023
Topic Review
Explainable AI (XAI) Explanation Techniques
Interest in artificial intelligence (AI) has been increasing rapidly over the past decade and has expanded to essentially all domains. Along with it grew the need to understand the predictions and suggestions provided by machine learning. Explanation techniques have been researched intensively in the context of explainable AI (XAI), with the goal of boosting confidence, trust, user satisfaction, and transparency.
  • 438
  • 19 Jun 2023
Topic Review
Internet of Everything
The Internet of Everything (IoE) represents a paradigm shift in the world of connectivity. While the Internet of Things (IoT) initiated the era of interconnected devices, the IoE takes this concept to new heights by interlinking objects, individuals, data, and processes. Symmetry in IoE innovation and technology is essential for creating a harmonious and efficient ecosystem to ensure that the benefits are accessible to a broad spectrum of society while minimizing potential drawbacks. 
  • 438
  • 14 Nov 2023
Topic Review
OMOP CDM for Data-Driven Studies for Cancer Prediction
The current generation of sequencing technologies has led to significant advances in identifying novel disease-associated mutations and has generated large amounts of data in a high-throughput manner. Such data, in conjunction with clinical routine data, are proven to be highly useful in deriving population-level and patient-level predictions, especially in the field of cancer precision medicine. However, data harmonization across multiple national and international clinical sites is an essential step for the assessment of events and outcomes associated with patients, and it is currently not adequately addressed. The Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) is an internationally established research data model introduced by the Observational Health Data Sciences and Informatics (OHDSI) community to overcome this issue. To address the needs of cancer research, the genomic vocabulary extension was introduced in 2020 to support the standardization of subsequent data analysis. Studies present multicentric investigations in which the OMOP CDM played an essential role in discovering and optimizing machine learning (ML)-based models. Ultimately, the use of the OMOP CDM leads to standardized data-driven studies across multiple clinical sites and provides a more solid basis for, e.g., ML models that can be reused and combined in early prediction, diagnosis, and improvement of personalized cancer care and biomarker discovery.
  • 437
  • 10 May 2023
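A minimal sketch of the harmonization step: mapping a site-local diagnosis record onto an OMOP CDM `condition_occurrence` row. The local field names and the concept lookup are assumptions; real ETL pipelines rely on standardized vocabularies and populate many more columns.

```python
from datetime import date

def to_condition_occurrence(local_record, concept_lookup):
    """Map a site-local cancer diagnosis record to OMOP CDM condition_occurrence fields."""
    return {
        "person_id": local_record["patient_id"],
        "condition_concept_id": concept_lookup[local_record["icd10_code"]],
        "condition_start_date": local_record["diagnosis_date"],
        "condition_source_value": local_record["icd10_code"],
    }

# Hypothetical local record and ICD-10 -> standard-concept mapping.
record = {"patient_id": 42, "icd10_code": "C50.9", "diagnosis_date": date(2021, 3, 4)}
concepts = {"C50.9": 4112853}  # placeholder concept id, not an authoritative mapping

print(to_condition_occurrence(record, concepts))
```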