Topic Review
Zwitterionic Dental Biomaterials
Biofilms form on surfaces inside the oral cavity that are covered by the acquired pellicle and develop into a complex, dynamic microbial environment. Oral biofilm is a causative factor of dental and periodontal diseases. Accordingly, novel materials that can resist biofilm formation have attracted significant attention. Zwitterionic polymers (ZPs) have unique features that resist protein adhesion and prevent biofilm formation while maintaining biocompatibility. Recent literature reflects a rapid increase in the application of ZPs as coatings and additives, with promising outcomes.
  • 7.2K
  • 29 Dec 2020
Topic Review
Hockey Stick Controversy
In the hockey stick controversy, the data and methods used in reconstructions of the temperature record of the past 1000 years have been disputed. Reconstructions have consistently shown that the rise in the instrumental temperature record of the past 150 years is not matched in earlier centuries, and the name "hockey stick graph" was coined for figures showing a long-term decline followed by an abrupt rise in temperatures. These graphs were publicised to explain the scientific findings of climatology, and in addition to scientific debate over the reconstructions, they have been the topic of political dispute. The issue is part of the global warming controversy and has been one focus of political responses to reports by the Intergovernmental Panel on Climate Change (IPCC). Arguments over the reconstructions have been taken up by fossil fuel industry–funded lobbying groups attempting to cast doubt on climate science.

The use of proxy indicators to get quantitative estimates of the temperature record of past centuries was developed from the 1990s onwards, and found indications that recent warming was exceptional. The Bradley & Jones 1993 reconstruction introduced the "Composite Plus Scaling" (CPS) method used by most later large-scale reconstructions, and its findings were disputed by Patrick Michaels at the United States House Committee on Science. In 1998, Michael E. Mann, Raymond S. Bradley and Malcolm K. Hughes developed new statistical techniques to produce Mann, Bradley & Hughes 1998 (MBH98), the first eigenvector-based climate field reconstruction (CFR). This showed global patterns of annual surface temperature, and included a graph of average hemispheric temperatures back to 1400. In Mann, Bradley & Hughes 1999 (MBH99) the methodology was extended back to 1000.

The term hockey stick was coined by the climatologist Jerry D. Mahlman to describe the pattern this showed, envisaging a graph that is relatively flat to 1900 as forming an ice hockey stick's "shaft", followed by a sharp increase corresponding to the "blade". A version of this graph was featured prominently in the 2001 IPCC Third Assessment Report (TAR), along with four other reconstructions supporting the same conclusion. The graph was publicised, and became a focus of dispute for those opposed to the strengthening scientific consensus that late 20th-century warmth was exceptional.

Those disputing the graph included Pat Michaels, the George C. Marshall Institute and Fred Singer. A paper by Willie Soon and Sallie Baliunas claiming greater medieval warmth was used by the Bush administration chief of staff Philip Cooney to justify altering the first Environmental Protection Agency Report on the Environment. The paper was quickly dismissed by scientists in the Soon and Baliunas controversy, but on July 28, Republican Jim Inhofe spoke in the Senate citing it to claim "that man-made global warming is the greatest hoax ever perpetrated on the American people". Later in 2003, a paper by Steve McIntyre and Ross McKitrick disputing the data used in the MBH98 paper was publicised by the George C. Marshall Institute and the Competitive Enterprise Institute. In 2004, Hans von Storch published criticism of the statistical techniques as tending to underplay variations in earlier parts of the graph, though this was disputed and he later accepted that the effect was very small. In 2005, McIntyre and McKitrick published criticisms of the principal component analysis methodology as used in MBH98 and MBH99. The analysis therein was subsequently disputed by published papers, including Huybers 2005 and Wahl & Ammann 2007, which pointed to errors in the McIntyre and McKitrick methodology.

In June 2005, Rep. Joe Barton launched what Sherwood Boehlert, chairman of the House Science Committee, called a "misguided and illegitimate investigation" into the data, methods and personal information of Mann, Bradley and Hughes. At Boehlert's request, a panel of scientists convened by the National Research Council was set up, which reported in 2006, supporting Mann's findings with some qualifications, including agreeing that there were some statistical failings but that these had little effect on the result. Barton and U.S. Rep. Ed Whitfield requested Edward Wegman to set up a team of statisticians to investigate, and they supported McIntyre and McKitrick's view that there were statistical failings, although they did not quantify whether there was any significant effect. They also produced an extensive network analysis which has been discredited by expert opinion and found to have issues of plagiarism.

Arguments against the MBH studies were reintroduced as part of the Climatic Research Unit email controversy, but dismissed by eight independent investigations. More than two dozen reconstructions, using various statistical methods and combinations of proxy records, have supported the broad consensus shown in the original 1998 hockey-stick graph, with variations in how flat the pre-20th century "shaft" appears. The 2007 IPCC Fourth Assessment Report cited 14 reconstructions, 10 of which covered 1,000 years or longer, to support its strengthened conclusion that it was likely that Northern Hemisphere temperatures during the 20th century were the highest in at least the past 1,300 years. Over a dozen subsequent reconstructions, including Mann et al. 2008 and PAGES 2k Consortium 2013, have supported these general conclusions.
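The "Composite Plus Scaling" (CPS) method mentioned above can be illustrated with a minimal sketch: standardize each proxy series, average them into a composite, and rescale the composite to match the instrumental record over a calibration window. The code below shows only that general idea with made-up data; it is not a reproduction of any published reconstruction.

```python
# Minimal sketch of the general "Composite Plus Scaling" (CPS) idea,
# using hypothetical proxy data and calibration window.
import numpy as np

def cps_reconstruction(proxies, instrumental, calib_idx):
    """proxies: (n_series, n_years); instrumental: (n_calib,) temperatures;
    calib_idx: indices of the calibration years on the proxy time axis."""
    # 1. Standardize each proxy series (zero mean, unit variance).
    z = (proxies - proxies.mean(axis=1, keepdims=True)) / proxies.std(axis=1, keepdims=True)
    # 2. Composite: simple (unweighted) average across proxies.
    composite = z.mean(axis=0)
    # 3. Rescale the composite to the instrumental mean/variance over the calibration period.
    c_cal = composite[calib_idx]
    return (composite - c_cal.mean()) / c_cal.std() * instrumental.std() + instrumental.mean()

# Hypothetical example: 5 proxy series over 1000 years, calibrated to the last 150.
rng = np.random.default_rng(0)
proxies = rng.normal(size=(5, 1000))
instrumental = rng.normal(loc=14.0, scale=0.3, size=150)
recon = cps_reconstruction(proxies, instrumental, np.arange(850, 1000))
```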
  • 7.2K
  • 21 Oct 2022
Topic Review
Deformation (Engineering)
In engineering, deformation refers to the change in size or shape of an object. Displacements are the absolute change in position of a point on the object. Deflection is the relative change in external displacements on an object. Strain is the relative internal change in shape of an infinitesimally small cube of material and can be expressed as a non-dimensional change in length or angle of distortion of the cube. Strains are related to the forces acting on the cube, known as stresses, by a stress-strain curve. The relationship between stress and strain is generally linear and reversible up to the yield point, and the deformation is then elastic. The slope of this linear region is the material's Young's modulus. Above the yield point, some degree of permanent distortion remains after unloading and is termed plastic deformation. The determination of the stress and strain throughout a solid object is the subject of strength of materials and, for a structure, of structural analysis. Engineering stress and engineering strain are approximations to the internal state that may be determined from the external forces and deformations of an object, provided that there is no significant change in size. When there is a significant change in size, the true stress and true strain can be derived from the instantaneous size of the object. As an example, compressive loading of a cylinder deforms the original shape into one with bulging sides. The sides bulge because the material, although strong enough not to crack or otherwise fail, is not strong enough to support the load without change. As a result, the material is forced out laterally. Internal forces (in this case at right angles to the deformation) resist the applied load. The concept of a rigid body can be applied if the deformation is negligible.
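As a small illustration of the engineering versus true stress and strain distinction described above, the sketch below computes both for a uniaxial tension example; the load and specimen dimensions are hypothetical, and the constant-volume relation used for true stress is a common simplifying approximation rather than anything stated in the entry.

```python
# Illustrative sketch (made-up load and specimen dimensions) of engineering
# vs. true stress/strain for a uniaxial tension test.
import math

F = 5000.0            # applied axial force, N (hypothetical)
A0 = 50.0e-6          # original cross-sectional area, m^2 (hypothetical)
L0, L = 0.100, 0.105  # original and current gauge lengths, m (hypothetical)

eng_stress = F / A0             # sigma_e = F / A0
eng_strain = (L - L0) / L0      # epsilon_e = dL / L0
true_strain = math.log(L / L0)  # epsilon_t = ln(L / L0)
# Assuming constant volume (a common plasticity approximation), A = A0 * L0 / L:
true_stress = F / (A0 * L0 / L)  # equivalently sigma_e * (1 + epsilon_e)

print(f"engineering: {eng_stress/1e6:.1f} MPa at strain {eng_strain:.3f}")
print(f"true:        {true_stress/1e6:.1f} MPa at strain {true_strain:.3f}")
```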
  • 7.2K
  • 24 Oct 2022
Topic Review
Vacuum State
In quantum field theory, the quantum vacuum state (also called the quantum vacuum or vacuum state) is the quantum state with the lowest possible energy. Generally, it contains no physical particles. Zero-point field is sometimes used as a synonym for the vacuum state of an individual quantized field. According to present-day understanding of what is called the vacuum state or the quantum vacuum, it is "by no means a simple empty space". According to quantum mechanics, the vacuum state is not truly empty but instead contains fleeting electromagnetic waves and particles that pop into and out of existence. The QED vacuum of quantum electrodynamics (or QED) was the first vacuum of quantum field theory to be developed. QED originated in the 1930s, and in the late 1940s and early 1950s it was reformulated by Feynman, Tomonaga and Schwinger, who jointly received the Nobel prize for this work in 1965. Today the electromagnetic interactions and the weak interactions are unified (at very high energies only) in the theory of the electroweak interaction. The Standard Model is a generalization of the QED work to include all the known elementary particles and their interactions (except gravity). Quantum chromodynamics (or QCD) is the portion of the Standard Model that deals with strong interactions, and QCD vacuum is the vacuum of quantum chromodynamics. It is the object of study in the Large Hadron Collider and the Relativistic Heavy Ion Collider, and is related to the so-called vacuum structure of strong interactions.
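A standard textbook way to make "lowest possible energy" and "zero-point" precise, included here only as an illustrative aside, treats each mode of a quantized field as a harmonic oscillator whose ground state (the vacuum) still carries a nonzero zero-point energy:

```latex
% Standard textbook illustration (not specific to this entry): a single field
% mode of angular frequency \omega behaves as a quantum harmonic oscillator
% with annihilation operator a, and the vacuum |0\rangle satisfies
\[
  a\,|0\rangle = 0, \qquad
  H = \hbar\omega\left(a^{\dagger}a + \tfrac{1}{2}\right), \qquad
  H\,|0\rangle = \tfrac{1}{2}\hbar\omega\,|0\rangle ,
\]
% so even the lowest-energy (vacuum) state carries a zero-point energy of
% \tfrac{1}{2}\hbar\omega per mode.
```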
  • 7.2K
  • 17 Oct 2022
Topic Review Peer Reviewed
Metaverse
The Metaverse is the post-reality universe, a perpetual and persistent multiuser environment merging physical reality with digital virtuality. It is based on the convergence of technologies, such as virtual reality (VR) and augmented reality (AR), that enable multisensory interactions with virtual environments, digital objects and people. Hence, the Metaverse is an interconnected web of social, networked immersive environments in persistent multiuser platforms. It enables seamless embodied user communication in real time and dynamic interactions with digital artifacts. Its first iteration was a web of virtual worlds where avatars were able to teleport among them. The contemporary iteration of the Metaverse features social, immersive VR platforms compatible with massive multiplayer online video games, open game worlds and AR collaborative spaces.
  • 7.2K
  • 13 Apr 2022
Topic Review
History of Solar System Formation and Evolution Hypotheses
The history of scientific thought about the formation and evolution of the Solar System begins with the Copernican Revolution. The first recorded use of the term "Solar System" dates from 1704.
  • 7.2K
  • 22 Nov 2022
Topic Review
Human Body Segments
The knowledge of human body proportions and the segmental properties of the limbs, head and trunk is of fundamental importance in biomechanical research. Given that many methods are employed, it is important to know which ones are currently available, which data on human body segment masses, lengths, center of mass (COM) locations, weights and moments of inertia (MOI) are available, and which methods are most suitable for specific research purposes. Graphical, optical, X-ray and derived techniques, MRI, laser and thermography have been employed for in vivo measurement, while direct measurements involve cadaveric studies with dissection and various methods of acquiring the shape and size of body segments.
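As a minimal illustration of how such segmental parameters are combined in practice, the sketch below derives a whole-body centre of mass and moment of inertia from per-segment values using a mass-weighted average and the parallel-axis theorem; the segment numbers are hypothetical placeholders, not published anthropometric data.

```python
# Minimal sketch with hypothetical segment values: whole-body COM as a
# mass-weighted average, and whole-body MOI via the parallel-axis theorem.
# segment: (mass_kg, com_position_m along one axis, I_about_own_com_kg_m2)
segments = {
    "trunk": (30.0, 1.00, 1.20),
    "thigh": (7.0, 0.60, 0.15),
    "shank": (3.0, 0.30, 0.05),
}

total_mass = sum(m for m, _, _ in segments.values())
body_com = sum(m * x for m, x, _ in segments.values()) / total_mass

# Parallel-axis theorem: I = sum_i [ I_i + m_i * d_i^2 ],
# with d_i the distance from each segment COM to the whole-body COM.
body_moi = sum(I + m * (x - body_com) ** 2 for m, x, I in segments.values())

print(f"mass {total_mass:.1f} kg, COM at {body_com:.3f} m, MOI {body_moi:.3f} kg*m^2")
```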
  • 7.2K
  • 20 Nov 2020
Topic Review
Common Factors Theory
Common factors theory, a theory guiding some research in clinical psychology and counseling psychology, proposes that different approaches and evidence-based practices in psychotherapy and counseling share common factors that account for much of the effectiveness of a psychological treatment. This is in contrast to the view that the effectiveness of psychotherapy and counseling is best explained by specific or unique factors (notably, particular methods or procedures) suited to the treatment of particular problems. According to one review, "it is widely recognized that the debate between common and unique factors in psychotherapy represents a false dichotomy, and these factors must be integrated to maximize effectiveness". In other words, "therapists must engage in specific forms of therapy for common factors to have a medium through which to operate". The common factors approach is one route by which psychotherapy researchers have attempted to integrate psychotherapies.
  • 7.2K
  • 08 Nov 2022
Topic Review
ISO Standards for Trailer Connectors
A number of ISO standards cover trailer connectors, the electrical connectors between vehicles and the trailers they tow that provide a means of control for the trailers. These are listed below, with notes on significant deviations from them that can cause problems.
  • 7.1K
  • 07 Nov 2022
Topic Review
Tetrabutylammonium Bromide
During the last two decades, tetrabutylammonium bromide (TBAB) has gained significant attention as an efficient metal-free homogeneous phase-transfer catalyst. A catalytic amount of TBAB is sufficient to catalyze various alkylation, oxidation, reduction, and esterification processes. It is also employed as an efficient co-catalyst for numerous coupling reactions. It has also served as an efficient solvent for many organic transformations under molten conditions.
  • 7.1K
  • 09 Feb 2021
Topic Review
VARTM Processed Composite Materials
Fiber-reinforced composite structures are used in different applications due to their excellent strength-to-weight ratio. Because of the cost and tool-handling issues of conventional manufacturing processes such as resin transfer molding (RTM) and autoclave curing, vacuum-assisted resin transfer molding (VARTM) is the preferred choice in industry, being highly productive and inexpensive. The VARTM process can produce complex, lightweight, and large structures suitable for cost-effective mass production, but the presence of voids and fiber misalignment in the final processed composite reduces its strength. Voids are the primary defects and cannot be eliminated completely, so a design that does not consider void defects will be unreliable. Many conventional failure theories have been used for composite design but do not consider the effect of void defects, and thus give misleading failure characteristics.
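As an illustrative aside on quantifying the void defects discussed above, a common density-based estimate compares the measured composite density with the theoretical void-free density obtained from the fibre and matrix fractions; the sketch below uses hypothetical material values.

```python
# Illustrative sketch (hypothetical values) of the usual density-based
# estimate of void content: theoretical void-free density from mass
# fractions, then void volume fraction from the measured density.
def void_volume_fraction(rho_measured, w_fibre, rho_fibre, rho_matrix):
    w_matrix = 1.0 - w_fibre
    rho_theoretical = 1.0 / (w_fibre / rho_fibre + w_matrix / rho_matrix)
    return (rho_theoretical - rho_measured) / rho_theoretical

# Hypothetical glass/epoxy laminate: 60 wt% fibre, measured density 1.70 g/cm^3.
vv = void_volume_fraction(rho_measured=1.70, w_fibre=0.60, rho_fibre=2.55, rho_matrix=1.20)
print(f"estimated void content: {vv * 100:.1f}%")
```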
  • 7.1K
  • 07 Apr 2021
Topic Review
Dhunge Dhara
A dhunge dhara (Nepali: ढुङ्गे धारा) or hiti (Newari) is a traditional stone drinking fountain found in Nepal. It is an intricately carved stone waterway through which water flows uninterrupted from underground sources. Dhunge dharas are part of a comprehensive drinking water supply system, commissioned by various rulers of Ancient and Medieval Nepal. The system is supported by numerous ponds and canals that form an elaborate network of water bodies, created as a water resource during the dry season and to help alleviate the water pressure caused by the monsoon rains. After the introduction of modern, piped water systems, starting in the late 19th century, this old system has fallen into disrepair and some parts of it are lost forever. Nevertheless, many people of Nepal still rely on the old hitis on a daily basis.
  • 7.1K
  • 21 Oct 2022
Topic Review
Solubility of Oxygen and Hydrogen in Water
Produced by photosynthesis, oxygen (O2) is a fundamentally important gas in biological systems, playing roles as a terminal electron acceptor in respiration and in host defence through the creation of reactive oxygen species (ROS). Hydrogen (H2) plays a role in metabolism for some organisms, such as those at thermal vents and in the gut environment, and also has roles in controlling growth and development and in disease states, in both plants and animals.
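As a rough illustration of how dissolved gas concentrations in water are estimated, the sketch below applies Henry's law (C = kH·p) to O2 and H2 in air-equilibrated water; the constants are approximate room-temperature values and the partial pressures are nominal, so the numbers are indicative only.

```python
# Rough sketch of estimating dissolved O2 and H2 in water with Henry's law
# (C = kH * p); constants below are approximate ~25 degC values, for
# illustration only.
HENRY_CONSTANTS = {       # mol / (L * atm), approximate
    "O2": 1.3e-3,
    "H2": 7.8e-4,
}
PARTIAL_PRESSURE_AIR = {  # atm, approximate partial pressures in dry air
    "O2": 0.21,
    "H2": 5.5e-7,
}

for gas, kH in HENRY_CONSTANTS.items():
    c = kH * PARTIAL_PRESSURE_AIR[gas]   # mol/L dissolved at equilibrium
    print(f"{gas}: ~{c:.2e} mol/L in air-equilibrated water")
```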
  • 7.1K
  • 28 Feb 2024
Topic Review
Okra
Okra (Abelmoschus esculentus L.) is a popular vegetable crop of good nutritional significance, along with certain therapeutic value, which makes it a potential candidate for use in a variety of nutraceuticals.
  • 7.1K
  • 16 Mar 2021
Topic Review
Developmental Psychology
Developmental psychology is the scientific study of how and why human beings change over the course of their lives. Originally concerned with infants and children, the field has expanded to include adolescence, adult development, aging, and the entire lifespan. Developmental psychologists aim to explain how thinking, feeling, and behavior change throughout life. This field examines change across three major dimensions: physical development, cognitive development, and social-emotional development. Within these three dimensions is a broad range of topics including motor skills, executive functions, moral understanding, language acquisition, social change, personality, emotional development, self-concept, and identity formation. Developmental psychology examines the influences of nature and nurture on the process of human development, and on processes of change in context across time. Many researchers are interested in the interactions among personal characteristics, the individual's behavior, and environmental factors, including the social context and the built environment. Ongoing debates in developmental psychology include biological essentialism vs. neuroplasticity and stages of development vs. dynamic systems of development. Developmental psychology involves a range of fields, such as educational psychology, child psychopathology, forensic developmental psychology, child development, cognitive psychology, ecological psychology, and cultural psychology. Influential developmental psychologists from the 20th century include Urie Bronfenbrenner, Erik Erikson, Sigmund Freud, Jean Piaget, Barbara Rogoff, Esther Thelen, and Lev Vygotsky.
  • 7.1K
  • 30 Oct 2022
Topic Review
Biblical Cosmology
Biblical cosmology is the biblical writers' conception of the cosmos as an organised, structured entity, including its origin, order, meaning and destiny. The Bible was formed over many centuries, involving many authors, and reflects shifting patterns of religious belief; consequently, its cosmology is not always consistent. Nor do the biblical texts necessarily represent the beliefs of all Jews or Christians at the time they were put into writing: the majority of the texts making up the Hebrew Bible or Old Testament in particular represent the beliefs of only a small segment of the ancient Israelite community, the members of a late Judean religious tradition centered in Jerusalem and devoted to the exclusive worship of Yahweh. The ancient Israelites envisaged a universe made up of a flat disc-shaped Earth floating on water, heaven above, underworld below. Humans inhabited Earth during life and the underworld after death; there was no way that mortals could enter heaven, and the underworld was morally neutral; only in Hellenistic times (after c. 330 BCE) did Jews begin to adopt the Greek idea that it would be a place of punishment for misdeeds, and that the righteous would enjoy an afterlife in heaven. In this period too, the older three-level cosmology in large measure gave way to the Greek concept of a spherical Earth suspended in space at the center of a number of concentric heavens. The opening words of the Genesis creation narrative (Genesis 1:1–26) sum up the biblical editors' view of how the cosmos originated: "In the beginning God created the heavens and the earth"; Yahweh, the God of Israel, was solely responsible for creation and had no rivals, implying Israel's superiority over all other nations. Later Jewish thinkers, adopting ideas from Greek philosophy, concluded that God's Wisdom, Word and Spirit penetrated all things and gave them unity. Christianity in turn adopted these ideas and identified Jesus with the Logos (Word): "In the beginning was the Word, and the Word was with God, and the Word was God" (John 1).
  • 7.1K
  • 11 Jan 2024
Topic Review
Mercedes-Benz W114
The Mercedes-Benz W114 and W115 models are a series of executive sedans and coupés introduced in 1968 by Mercedes-Benz, manufactured through model year 1976, and distinguished in the marketplace by names relating to their engine size. W114 models featured six-cylinder engines and were marketed as the 230, 250, and 280, while W115 models featured four-cylinder engines and were marketed as the 200, 220, 230, and 240. All were styled by Paul Bracq, featuring a three-box design. At the time, Mercedes marketed sedans in two size classes, with the W114/W115 positioned below the Mercedes-Benz S-Class. Beginning in 1968, Mercedes marketed their model range as New Generation Models, giving the ID plates the designation '/8' (a reference to the 1968 launch year). Because they were the only truly new cars of the so-called 'New Generation', and because of the '/8' or 'slash eight' designation, W114 and W115 models ultimately received the German nickname Strich Acht, loosely translated into English as 'stroke eight'.
  • 7.1K
  • 04 Nov 2022
Topic Review
Networked Control System
A networked control system (NCS) consists of control loops closed through a communication network, in which both the control signal and the feedback signal are exchanged between the system and the controller over the network.
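As a minimal sketch of this arrangement, the toy simulation below closes a discrete-time proportional control loop over channels that each delay their signal by one sample; the plant model, gain and delay are hypothetical, chosen only to show how network delays enter both the feedback and the actuation paths.

```python
# Minimal sketch (hypothetical plant, gain and delay) of a control loop
# closed over a network: both the measurement and the control signal
# traverse a delayed channel before reaching the other side.
from collections import deque

a, b = 0.9, 0.1          # simple first-order plant: x[k+1] = a*x[k] + b*u[k]
kp = 2.0                 # proportional controller gain (hypothetical)
delay = 1                # network delay, in samples, on each link

x, setpoint = 1.0, 0.0
sensor_link = deque([0.0] * delay)    # plant -> controller channel
actuator_link = deque([0.0] * delay)  # controller -> plant channel

for k in range(20):
    sensor_link.append(x)                 # measurement enters the network
    y_delayed = sensor_link.popleft()     # controller sees a delayed measurement
    u = kp * (setpoint - y_delayed)       # control law acting on delayed feedback
    actuator_link.append(u)
    u_delayed = actuator_link.popleft()   # plant receives a delayed control signal
    x = a * x + b * u_delayed             # plant state update
    print(f"k={k:2d}  x={x:+.4f}")
```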
  • 7.1K
  • 01 Apr 2021
Topic Review
Metallic Alloy Nanoparticles
Metallic alloy nanoparticles are synthesized by combining two or more different metals. Bimetallic or trimetallic nanoparticles are considered more effective than monometallic nanoparticles because of their synergistic characteristics. In this review, we outline the structure, synthesis method, properties, and biological applications of metallic alloy nanoparticles based on their plasmonic, catalytic, and magnetic characteristics.
  • 7.1K
  • 03 Aug 2020
Topic Review
Minkowski Diagram
The Minkowski diagram, also known as a spacetime diagram, was developed in 1908 by Hermann Minkowski and provides an illustration of the properties of space and time in the special theory of relativity. It allows a qualitative understanding of the corresponding phenomena like time dilation and length contraction without mathematical equations. Minkowski diagrams are two-dimensional graphs that depict events as happening in a universe consisting of one space dimension and one time dimension. Unlike a regular distance-time graph, the distance is displayed on the horizontal axis and time on the vertical axis. Additionally, the time and space units of measurement are chosen in such a way that an object moving at the speed of light is depicted as following a 45° angle to the diagram's axes. In this way, each object, like an observer or a vehicle, traces a certain line in the diagram, which is called its world line. Also, each point in the diagram represents a certain position in space and time, and is called an event, regardless of whether anything relevant happens there and then.
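As an illustrative aside, the standard relations behind the diagram can be written down directly: with the time axis scaled by c, light follows the 45° lines x = ±ct, and a moving observer's tilted axes are given by the Lorentz transformation.

```latex
% Standard relations (not specific to this entry): scaling the time axis by c
% puts both axes in length units, so a light signal x = \pm ct appears at 45
% degrees, while the axes of an observer moving with velocity v follow the
% Lorentz transformation
\[
  ct' = \gamma\left(ct - \beta x\right), \qquad
  x'  = \gamma\left(x - \beta\, ct\right), \qquad
  \beta = \frac{v}{c}, \quad \gamma = \frac{1}{\sqrt{1-\beta^{2}}},
\]
% which leaves the interval s^2 = (ct)^2 - x^2 invariant, so the 45-degree
% light lines look the same for every inertial observer.
```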
  • 7.1K
  • 10 Oct 2022