Topic Review
Precursor
Precursors are characteristic wave patterns caused by dispersion of an impulse's frequency components as it propagates through a medium. Classically, precursors precede the main signal, although in certain situations they may also follow it. Precursor phenomena exist for all types of waves, as their appearance is only predicated on the prominence of dispersion effects in a given mode of wave propagation. This non-specificity has been confirmed by the observation of precursor patterns in different types of electromagnetic radiation (microwaves, visible light, and terahertz radiation) as well as in fluid surface waves and seismic waves.
  • 899
  • 02 Dec 2022
Topic Review
Technicolor
Technicolor theories are models of physics beyond the Standard Model that address electroweak gauge symmetry breaking, the mechanism through which the W and Z bosons acquire masses. Early technicolor theories were modelled on quantum chromodynamics (QCD), the "color" theory of the strong nuclear force, which inspired their name. Instead of introducing elementary Higgs bosons to explain observed phenomena, technicolor models dynamically generate masses for the W and Z bosons through new gauge interactions. Although asymptotically free at very high energies, these interactions must become strong and confining (and hence unobservable) at the lower energies that have been experimentally probed. This dynamical approach is natural and avoids the issues of quantum triviality and the hierarchy problem of the Standard Model. However, since the Higgs boson discovery at the CERN LHC in 2012, the original models are largely ruled out. Nonetheless, it remains a possibility that the Higgs boson is a composite state.

In order to produce quark and lepton masses, technicolor or composite Higgs models have to be "extended" by additional gauge interactions. Particularly when modelled on QCD, extended technicolor was challenged by experimental constraints on flavor-changing neutral currents and by precision electroweak measurements. The specific extensions of particle dynamics for technicolor or composite Higgs bosons are unknown. Much technicolor research focuses on exploring strongly interacting gauge theories other than QCD, in order to evade some of these challenges. A particularly active framework is "walking" technicolor, which exhibits nearly conformal behavior caused by an infrared fixed point with strength just above that necessary for spontaneous chiral symmetry breaking. Whether walking can occur and lead to agreement with precision electroweak measurements is being studied through non-perturbative lattice simulations.

Experiments at the Large Hadron Collider have discovered the mechanism responsible for electroweak symmetry breaking, i.e., the Higgs boson, with a mass of approximately 125 GeV/c²; such a particle is not generically predicted by technicolor models. However, the Higgs boson may be a composite state, e.g., built of top and anti-top quarks as in the Bardeen–Hill–Lindner theory. Composite Higgs models are generally governed by the top quark infrared fixed point, and may require new dynamics at extremely high energies, such as topcolor.
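A minimal sketch of the mass-generation relation common to such models: the technipion decay constant F (the analogue of the QCD pion decay constant, scaled up to the electroweak scale) plays the role of the Higgs vacuum expectation value. These are the standard tree-level electroweak formulas, quoted for illustration rather than from any particular model:

```latex
% W and Z masses generated by a technipion decay constant F
% (analogue of f_pi in QCD, scaled up to the electroweak scale)
\begin{align}
  m_W^2 &= \frac{g^2 F^2}{4}, &
  m_Z^2 &= \frac{(g^2 + g'^2)\, F^2}{4}.
\end{align}
% Reproducing the observed masses requires F \approx 246\,\mathrm{GeV},
% i.e. roughly 2600 times the QCD value f_\pi \approx 93\,\mathrm{MeV}.
```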
  • 846
  • 28 Sep 2022
Topic Review
Corium (Nuclear Reactor)
Corium, also called fuel-containing material (FCM) or lava-like fuel-containing material (LFCM), is a material that is created in the core of a nuclear reactor during a meltdown accident. It resembles natural lava in its consistency. It consists of a mixture of nuclear fuel, fission products, control rods, structural materials from the affected parts of the reactor, products of their chemical reaction with air, water and steam, and, in the event that the reactor vessel is breached, molten concrete from the floor of the reactor room.
  • 828
  • 28 Oct 2022
Topic Review
Control and Upgradation of Indoor Air Quality
Due to increasing health and environmental issues, indoor air quality (IAQ) has garnered much research attention with regard to incorporating advanced clean-air technologies. Various physicochemical air treatments have been used to monitor, control, and manage air contaminants, such as monitoring devices (gas sensors and Internet of Things-based systems), filtration (mechanical and electrical), adsorption, UV disinfection, UV photocatalysis, non-thermal plasma approaches, air conditioning systems, and green technologies (green plants and algae).
  • 785
  • 24 Feb 2023
Topic Review
Phase Curve
In astronomy a phase curve describes the brightness of a reflecting body as a function of its phase angle. The brightness usually refers to the object's absolute magnitude, which, in turn, is its apparent magnitude at a distance of one astronomical unit from both the Earth and the Sun. The phase angle equals the arc subtended by the observer and the Sun as measured at the body. The phase curve is useful for characterizing an object's regolith (soil) and atmosphere. It is also the basis for computing the geometrical albedo and the Bond albedo of the body. In ephemeris generation, the phase curve is used in conjunction with the distances from the object to the Sun and the Earth to calculate the apparent magnitude, as in the sketch below.
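A minimal sketch of that calculation, assuming the standard IAU two-parameter H,G magnitude system for an airless body; the function name and the example asteroid values are illustrative:

```python
import math

def apparent_magnitude(H, G, r_au, delta_au, alpha_deg):
    """Apparent magnitude of an asteroid in the IAU H,G system.

    H         -- absolute magnitude (brightness at 1 AU from Sun and Earth, zero phase)
    G         -- slope parameter (~0.15 for a typical asteroid)
    r_au      -- Sun-object distance in AU
    delta_au  -- Earth-object distance in AU
    alpha_deg -- phase angle in degrees
    """
    a = math.radians(alpha_deg)
    phi1 = math.exp(-3.33 * math.tan(a / 2) ** 0.63)
    phi2 = math.exp(-1.87 * math.tan(a / 2) ** 1.22)
    # distance term + phase-curve term
    return H + 5 * math.log10(r_au * delta_au) - 2.5 * math.log10((1 - G) * phi1 + G * phi2)

# e.g. a hypothetical main-belt asteroid with H = 7.0, G = 0.15 near opposition
print(apparent_magnitude(7.0, 0.15, 2.77, 1.78, 5.0))  # ~ 10.9 mag
```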
  • 781
  • 15 Nov 2022
Topic Review
Photoemission Electron Microscopy
Photoemission electron microscopy (PEEM, also called photoelectron microscopy, PEM) is a type of electron microscopy that utilizes local variations in electron emission to generate image contrast. The excitation is usually produced by ultraviolet light, synchrotron radiation or X-ray sources. PEEM measures the absorption coefficient indirectly by collecting the emitted secondary electrons generated in the electron cascade that follows the creation of the primary core hole in the absorption process. PEEM is a surface-sensitive technique because the emitted electrons originate from a shallow layer. In physics, this technique is referred to as PEEM, which goes together naturally with low-energy electron diffraction (LEED) and low-energy electron microscopy (LEEM). In biology, it is called photoelectron microscopy (PEM), which fits with photoelectron spectroscopy (PES), transmission electron microscopy (TEM), and scanning electron microscopy (SEM).
  • 780
  • 19 Oct 2022
Topic Review
Naturalness
In physics, naturalness is the property that the dimensionless ratios between free parameters or physical constants appearing in a physical theory should take values "of order 1" and that free parameters are not fine-tuned. That is, a natural theory would have parameter ratios with values like 2.34 rather than 234000 or 0.000234. The requirement that satisfactory theories should be "natural" in this sense is a current of thought initiated around the 1960s in particle physics. It is a criterion that arises from the seeming non-naturalness of the Standard Model and the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle. However, it does tend to suggest a possible area of weakness or future development for current theories such as the Standard Model, in which some parameters vary by many orders of magnitude and require extensive fine-tuning of their current values. The concern is that it is not yet clear whether these seemingly exact values have arisen by chance (perhaps via the anthropic principle or similar) or whether they follow from a more advanced theory, not yet developed, in which they turn out to be expected and well explained by factors not yet part of particle physics models.

The concept of naturalness is not always compatible with Occam's razor, since many instances of "natural" theories have more parameters than "fine-tuned" theories such as the Standard Model. Naturalness in physics is closely related to the issue of fine-tuning, and over the past decade many scientists have argued that the principle of naturalness is a specific application of Bayesian statistics. In the history of particle physics, the naturalness principle has given correct predictions three times: in the cases of the electron self-energy, the pion mass difference, and the kaon mass difference.
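The hierarchy problem mentioned above is the standard illustration of a naturalness failure; a hedged sketch of the textbook estimate (the cutoff Λ and the numbers are illustrative):

```latex
% One-loop top-quark contribution to the Higgs mass parameter with cutoff \Lambda
% (textbook estimate, quoted only to illustrate the fine-tuning):
\delta m_H^2 \simeq -\frac{3\, y_t^2}{8\pi^2}\,\Lambda^2
% For \Lambda \sim M_{\mathrm{Pl}} \approx 10^{19}\,\mathrm{GeV} this exceeds the
% observed m_H^2 \approx (125\,\mathrm{GeV})^2 by a factor of order 10^{34}, so the
% bare mass must cancel the correction to one part in 10^{34} -- a dimensionless
% ratio far from "order 1", which is exactly what the naturalness criterion disfavors.
```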
  • 760
  • 28 Oct 2022
Topic Review
Radiolabeled Gold Nanoseeds and Glioblastoma Multiforme
Glioblastoma multiforme (GBM), classified as a grade IV brain tumor, represents the most frequent brain tumor, accounting for approximately 12–15% of all intracranial neoplasms. Current therapeutic strategies for GBM rely on open surgery, chemotherapy and radiotherapy. Despite some progress in the past 30 years, the overall survival of patients with glioblastoma remains extremely poor. The average survival time is approximately 15 months after diagnosis, with most patients experiencing tumor relapse and regrowth within 7–10 months of initial radiation therapy.
  • 742
  • 14 Jan 2022
Topic Review
Biological Small-Angle Scattering
Biological small-angle scattering is a small-angle scattering method for the structure analysis of biological materials. Small-angle scattering is used to study the structure of a variety of objects such as solutions of biological macromolecules, nanocomposites, alloys, and synthetic polymers. Small-angle X-ray scattering (SAXS) and small-angle neutron scattering (SANS) are the two complementary techniques known jointly as small-angle scattering (SAS). SAS is a method analogous to X-ray and neutron diffraction, wide-angle X-ray scattering, and static light scattering. In contrast to other X-ray and neutron scattering methods, SAS yields information on the sizes and shapes of both crystalline and non-crystalline particles. When used to study biological materials, which are very often in aqueous solution, the scattering pattern is orientation averaged. SAS patterns are collected at small angles of a few degrees. SAS is capable of delivering structural information in the resolution range between 1 and 25 nm, and of repeat distances in partially ordered systems of up to 150 nm in size. Ultra small-angle scattering (USAS) can resolve even larger dimensions. Grazing-incidence small-angle scattering (GISAS) is a powerful technique for studying layers of biological molecules on surfaces.

In biological applications, SAS is used to determine the structure of a particle in terms of average particle size and shape (see the Guinier sketch below). One can also get information on the surface-to-volume ratio. Typically, the biological macromolecules are dispersed in a liquid. The method is accurate, mostly non-destructive, and usually requires only a minimum of sample preparation. However, biological molecules are always susceptible to radiation damage.

In comparison to other structure determination methods, such as solution NMR or X-ray crystallography, SAS allows one to overcome some restrictions. For example, solution NMR is limited by protein size, whereas SAS can be used for small molecules as well as for large multi-molecular assemblies. Solid-state NMR is still an indispensable tool for determining atomic-level information on macromolecules greater than 40 kDa or non-crystalline samples such as amyloid fibrils. Structure determination by X-ray crystallography may take several weeks or even years, whereas SAS measurements take days. SAS can also be coupled to other analytical techniques, such as size-exclusion chromatography, to study heterogeneous samples. However, with SAS it is not possible to measure the positions of the atoms within the molecule.
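A hedged sketch of the most common size analysis, the Guinier approximation I(q) ≈ I(0)·exp(−q²Rg²/3), valid at low q (roughly q·Rg ≲ 1.3); the function and the synthetic data below are illustrative:

```python
import numpy as np

def guinier_radius(q, intensity, q_rg_max=1.3):
    """Estimate the radius of gyration Rg from the low-q part of a SAS curve.

    Fits ln I(q) = ln I(0) - (Rg**2 / 3) * q**2 (Guinier approximation),
    iterating so that only points with q * Rg <= q_rg_max are used.
    q in 1/nm, intensity in arbitrary units.
    """
    q = np.asarray(q, dtype=float)
    i = np.asarray(intensity, dtype=float)
    mask = np.ones_like(q, dtype=bool)
    for _ in range(10):  # iterate the fit range to self-consistency
        slope, intercept = np.polyfit(q[mask] ** 2, np.log(i[mask]), 1)
        rg = np.sqrt(-3.0 * slope)
        mask = q * rg <= q_rg_max
    return rg, np.exp(intercept)  # (Rg, forward scattering I(0))

# synthetic data for a particle with Rg = 3 nm
q = np.linspace(0.05, 1.0, 100)
i = 100.0 * np.exp(-(q * 3.0) ** 2 / 3.0)
print(guinier_radius(q, i))  # ~ (3.0, 100.0)
```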
  • 725
  • 09 Oct 2022
Topic Review
Personal RF Safety Monitors
Electromagnetic field densitometers measure the exposure to electromagnetic radiation in certain ranges of the electromagnetic spectrum. This article concentrates on densitometers used in the telecommunication industry, which measure exposure to radio-spectrum radiation. Other densitometers also exist, such as extremely-low-frequency densitometers, which measure exposure to radiation from electric power lines. The major difference between a densitometer and a dosimeter is that a dosimeter can measure the absorbed dose, which does not exist for RF monitors. Monitors are further divided into "RF monitors", which simply measure fields, and "RF personal monitors", which are designed to function while mounted on the human body.
  • 713
  • 14 Oct 2022
Topic Review
Cross Section
In physics, the cross section is a measure of the probability that a specific process will take place when some kind of radiant excitation (e.g. a particle beam, sound wave, light, or an X-ray) intersects a localized phenomenon (e.g. a particle or density fluctuation). For example, the Rutherford cross section is a measure of the probability that an alpha particle will be deflected by a given angle during an interaction with an atomic nucleus. Cross section is typically denoted σ (sigma) and is expressed in units of area, more specifically in barns. In a way, it can be thought of as the size of the object that the excitation must hit in order for the process to occur, but more exactly, it is a parameter of a stochastic process. In classical physics, this probability often converges to a deterministic proportion of excitation energy involved in the process, so that, for example, with light scattering off of a particle, the cross section specifies the amount of optical power scattered from light of a given irradiance (power per area). Although the cross section has the same units as area, it may not correspond to the actual physical size of the target given by other forms of measurement. It is not uncommon for the actual cross-sectional area of a scattering object to be much larger or smaller than the cross section relative to some physical process. For example, plasmonic nanoparticles can have light scattering cross sections for particular frequencies that are much larger than their actual cross-sectional areas.

When two discrete particles interact in classical physics, their mutual cross section is the area transverse to their relative motion within which they must meet in order to scatter from each other. If the particles are hard inelastic spheres that interact only upon contact, their scattering cross section is related to their geometric size. If the particles interact through some action-at-a-distance force, such as electromagnetism or gravity, their scattering cross section is generally larger than their geometric size.

When a cross section is specified as the differential limit of a function of some final-state variable, such as particle angle or energy, it is called a differential cross section (see detailed discussion below). When a cross section is integrated over all scattering angles (and possibly other variables), it is called a total cross section or integrated total cross section. For example, in Rayleigh scattering, the intensity scattered at the forward and backward angles is greater than the intensity scattered sideways, so the forward differential scattering cross section is greater than the perpendicular differential cross section, and by adding all of the infinitesimal cross sections over the whole range of angles with integral calculus, we can find the total cross section.

Scattering cross sections may be defined in nuclear, atomic, and particle physics for collisions of accelerated beams of one type of particle with targets (either stationary or moving) of a second type of particle. The probability for any given reaction to occur is in proportion to its cross section. Thus, specifying the cross section for a given reaction is a proxy for stating the probability that a given scattering process will occur.
The measured reaction rate of a given process depends strongly on experimental variables such as the density of the target material, the intensity of the beam, the detection efficiency of the apparatus, or the angle setting of the detection apparatus. However, these quantities can be factored away, allowing measurement of the underlying two-particle collisional cross section. Differential and total scattering cross sections are among the most important measurable quantities in nuclear, atomic, and particle physics.
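A minimal sketch of that factorization for a thin fixed target, where rate = flux × areal target density × σ × efficiency; the function name and the example numbers are illustrative assumptions:

```python
def cross_section(rate, beam_particles_per_s, target_number_density,
                  target_thickness, detection_efficiency):
    """Infer a total cross section (in cm^2) from a measured event rate,
    assuming a thin target so that beam attenuation is negligible:

        rate = flux * (n * l) * sigma * efficiency
    """
    areal_density = target_number_density * target_thickness  # targets per cm^2
    return rate / (beam_particles_per_s * areal_density * detection_efficiency)

# e.g. 1.2e3 events/s, 1e10 beam particles/s, a 1 cm liquid-hydrogen-like target
sigma_cm2 = cross_section(1.2e3, 1e10, 4.2e22, 1.0, 0.8)
print(f"{sigma_cm2:.2e} cm^2 = {sigma_cm2 / 1e-24:.2e} barn")  # ~3.6 microbarn
```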
  • 712
  • 18 Nov 2022
Topic Review
Jpsi Meson
The J/ψ (J/psi) meson /ˈdʒeɪ ˈsaɪ ˈmiːzɒn/ or psion is a subatomic particle, a flavor-neutral meson consisting of a charm quark and a charm antiquark. Mesons formed by a bound state of a charm quark and a charm anti-quark are generally known as "charmonium". The J/ψ is the most common form of charmonium, due to its spin of 1 and its low rest mass. The J/ψ has a rest mass of 3.0969 GeV/c², just above that of the ηc (2.9836 GeV/c²), and a mean lifetime of 7.2×10⁻²¹ s. This lifetime was about a thousand times longer than expected. Its discovery was made independently by two research groups, one at the Stanford Linear Accelerator Center, headed by Burton Richter, and one at the Brookhaven National Laboratory, headed by Samuel Ting of MIT. They discovered they had actually found the same particle, and both announced their discoveries on 11 November 1974. The importance of this discovery is highlighted by the fact that the subsequent, rapid changes in high-energy physics at the time have become collectively known as the "November Revolution". Richter and Ting were awarded the 1976 Nobel Prize in Physics.
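The quoted lifetime corresponds, via Γ = ħ/τ, to an anomalously narrow decay width of roughly 90 keV; a quick check with rounded constants:

```python
HBAR_GEV_S = 6.582e-25  # reduced Planck constant in GeV*s

def width_from_lifetime(tau_s):
    """Total decay width (GeV) from a mean lifetime (s): Gamma = hbar / tau."""
    return HBAR_GEV_S / tau_s

gamma = width_from_lifetime(7.2e-21)
print(f"J/psi width ~ {gamma * 1e6:.0f} keV")  # ~91 keV
# A typical strongly decaying meson has a width of order 100 MeV, i.e. a
# lifetime of order 1e-23 s -- hence "a thousand times longer than expected".
```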
  • 683
  • 29 Nov 2022
Topic Review
DUNE Experiment
The Deep Underground Neutrino Experiment (DUNE) is a neutrino experiment under construction, with a near detector at Fermilab and a far detector at the Sanford Underground Research Facility that will observe neutrinos produced at Fermilab. An intense beam of trillions of neutrinos from the production facility at Fermilab (in Illinois) will be sent over a distance of 1,300 kilometers (810 mi) with the goal of understanding the role of neutrinos in the universe. More than 1,000 collaborators work on the project. The experiment is designed for a 20-year period of data collection. The primary science objectives of DUNE are the study of neutrino oscillations (including the neutrino mass ordering and possible CP violation in the lepton sector), searches for proton decay, and the detection of neutrinos from a core-collapse supernova. These science goals are so compelling that the 2014 Particle Physics Project Prioritization Panel (P5) ranked this as "the highest priority project in its timeframe" (recommendation 13). The importance of these goals has led to proposals for competing projects in other countries, particularly the Hyper-Kamiokande experiment in Japan, scheduled to begin data-taking in 2027. The DUNE project, overseen by Fermilab, has suffered schedule delays and cost growth from less than $2B to $3B, leading to articles in the journals Science and Scientific American that described the project as "troubled." As of 2022, the DUNE experiment has a neutrino-beam start date in the early 2030s, and the project is now phased.
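A back-of-the-envelope check of why a 1,300 km baseline suits few-GeV neutrinos: the two-flavor oscillation phase 1.267·Δm²[eV²]·L[km]/E[GeV] sits near the first oscillation maximum. The splitting and beam energy below are representative textbook values, not DUNE-official numbers:

```python
import math

def oscillation_phase(dm2_ev2, baseline_km, energy_gev):
    """Neutrino oscillation phase 1.267 * dm^2 * L / E, in radians."""
    return 1.267 * dm2_ev2 * baseline_km / energy_gev

# atmospheric mass splitting, Fermilab -> SURF baseline, ~2.5 GeV beam energy
phase = oscillation_phase(2.5e-3, 1300.0, 2.5)
print(phase, math.sin(phase) ** 2)  # phase ~ pi/2 -> near maximal oscillation
```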
  • 646
  • 24 Oct 2022
Topic Review
Isotopes of Osmium
Osmium (₇₆Os) has seven naturally occurring isotopes, five of which are stable: ¹⁸⁷Os, ¹⁸⁸Os, ¹⁸⁹Os, ¹⁹⁰Os, and (most abundant) ¹⁹²Os. The other natural isotopes, ¹⁸⁴Os and ¹⁸⁶Os, have extremely long half-lives (1.12×10¹³ years and 2×10¹⁵ years, respectively) and for practical purposes can be considered stable as well. ¹⁸⁷Os is the daughter of ¹⁸⁷Re (half-life 4.56×10¹⁰ years) and is most often measured as an ¹⁸⁷Os/¹⁸⁸Os ratio. This ratio, as well as the ¹⁸⁷Re/¹⁸⁸Os ratio, has been used extensively in dating terrestrial as well as meteoric rocks (see the isochron relation below). It has also been used to measure the intensity of continental weathering over geologic time and to fix minimum ages for the stabilization of the mantle roots of continental cratons. However, the most notable application of Os in dating has been in conjunction with iridium, to analyze the layer of shocked quartz along the Cretaceous–Paleogene boundary that marks the extinction of the dinosaurs 66 million years ago. There are also 30 artificial radioisotopes, the longest-lived of which is ¹⁹⁴Os with a half-life of six years; all others have half-lives under 94 days. There are also nine known nuclear isomers, the longest-lived of which is ¹⁹¹ᵐOs with a half-life of 13.10 hours. All isotopes and nuclear isomers of osmium are either radioactive or observationally stable, meaning that they are predicted to be radioactive but no actual decay has been observed.
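The dating application rests on the standard isochron relation, with the decay constant taken from the ¹⁸⁷Re half-life quoted above:

```latex
% Re-Os isochron equation: the present-day ratio evolves from the initial
% ratio by in-growth of radiogenic 187-Os from 187-Re decay
\left(\frac{^{187}\mathrm{Os}}{^{188}\mathrm{Os}}\right)
  = \left(\frac{^{187}\mathrm{Os}}{^{188}\mathrm{Os}}\right)_{\!0}
  + \frac{^{187}\mathrm{Re}}{^{188}\mathrm{Os}}\,\bigl(e^{\lambda t}-1\bigr),
\qquad \lambda = \frac{\ln 2}{4.56\times 10^{10}\,\mathrm{yr}}
% Measuring both ratios in several cogenetic samples gives a straight line
% whose slope, e^{\lambda t} - 1, yields the age t.
```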
  • 630
  • 27 Oct 2022
Topic Review
Synchrotron Light Source
A synchrotron light source is a source of electromagnetic (EM) radiation, usually produced by a storage ring, for scientific and technical purposes. First observed in synchrotrons, synchrotron light is now produced by storage rings and other specialized particle accelerators, typically accelerating electrons. Once the high-energy electron beam has been generated, it is directed into auxiliary components such as bending magnets and insertion devices (undulators or wigglers) in storage rings and free-electron lasers. These supply the strong magnetic fields, perpendicular to the beam, that are needed to convert high-energy electrons into photons. The major applications of synchrotron light are in condensed matter physics, materials science, biology and medicine. A large fraction of experiments using synchrotron light involve probing the structure of matter from the sub-nanometer level of electronic structure to the micrometer and millimeter levels important in medical imaging. An example of a practical industrial application is the manufacturing of microstructures by the LIGA process.
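A hedged sketch of the standard undulator equation, which relates the machine parameters to the emitted wavelength; the example ring energy, magnetic period, and deflection parameter K are illustrative:

```python
import math

E_REST_MEV = 0.511  # electron rest energy in MeV

def undulator_wavelength(e_gev, period_m, k, n=1, theta=0.0):
    """Wavelength (m) of the n-th undulator harmonic at observation angle theta:
    lambda = period / (2 n gamma^2) * (1 + K^2/2 + gamma^2 theta^2)."""
    gamma = e_gev * 1e3 / E_REST_MEV  # Lorentz factor of the electrons
    return period_m / (2 * n * gamma ** 2) * (1 + k ** 2 / 2 + (gamma * theta) ** 2)

# e.g. a 3 GeV storage ring, 20 mm period, K = 1, on axis:
lam = undulator_wavelength(3.0, 0.02, 1.0)
print(f"{lam * 1e9:.3f} nm  (~{1239.84e-9 / lam:.0f} eV)")  # ~0.44 nm, ~2.8 keV X-rays
```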
  • 622
  • 11 Oct 2022
Topic Review
Osmium-188
Osmium (₇₆Os) has seven naturally occurring isotopes, five of which are stable: ¹⁸⁷Os, ¹⁸⁸Os, ¹⁸⁹Os, ¹⁹⁰Os, and (most abundant) ¹⁹²Os. The other natural isotopes, ¹⁸⁴Os and ¹⁸⁶Os, have extremely long half-lives (1.12×10¹³ years and 2×10¹⁵ years, respectively) and for practical purposes can be considered stable as well. ¹⁸⁷Os is the daughter of ¹⁸⁷Re (half-life 4.56×10¹⁰ years) and is most often measured as an ¹⁸⁷Os/¹⁸⁸Os ratio. This ratio, as well as the ¹⁸⁷Re/¹⁸⁸Os ratio, has been used extensively in dating terrestrial as well as meteoric rocks. It has also been used to measure the intensity of continental weathering over geologic time and to fix minimum ages for the stabilization of the mantle roots of continental cratons. However, the most notable application of Os in dating has been in conjunction with iridium, to analyze the layer of shocked quartz along the Cretaceous–Paleogene boundary that marks the extinction of the dinosaurs 66 million years ago. There are also 30 artificial radioisotopes, the longest-lived of which is ¹⁹⁴Os with a half-life of six years; all others have half-lives under 94 days. There are also nine known nuclear isomers, the longest-lived of which is ¹⁹¹ᵐOs with a half-life of 13.10 hours. All isotopes and nuclear isomers of osmium are either radioactive or observationally stable, meaning that they are predicted to be radioactive but no actual decay has been observed.
  • 597
  • 30 Sep 2022
Biography
Karl Wirtz
Karl Eugen Julius Wirtz (24 April 1910 – 12 February 1994) was a German nuclear physicist, born in Cologne. He was arrested by the allied British and American Armed Forces and incarcerated at Farm Hall for six months in 1945 under Operation Epsilon. From 1929 to 1934, Wirtz studied physics, chemistry, and mathematics at the University of Bonn, the Albert Ludwigs University of Freiburg, and the University of Breslau.
  • 571
  • 08 Dec 2022
Topic Review
Types of Compton Cameras
A Compton camera is a promising γ-ray detector that operates in the wide energy range of a few tens of keV to MeV. The γ-ray detection method of a Compton camera is based on Compton scattering kinematics, which is used to determine the direction and energy of the γ-rays without using a mechanical collimator. Although the Compton camera was originally designed for astrophysical applications, it was later applied in medical imaging as well. Moreover, its application in environmental radiation measurements is also under study.
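A minimal sketch of the event-by-event kinematics for a two-layer camera: the incident energy is the sum of the two energy deposits, and the Compton formula gives the opening angle of the cone on which the source must lie. The function name and example values are illustrative, and full absorption of the scattered photon is assumed:

```python
import math

M_E_C2_KEV = 511.0  # electron rest energy in keV

def compton_cone_angle(e_scatter_kev, e_absorb_kev):
    """Opening angle (degrees) of the Compton cone for a two-layer camera.

    e_scatter_kev -- energy deposited by Compton scattering in the first layer
    e_absorb_kev  -- energy of the scattered photon absorbed in the second layer
    """
    e0 = e_scatter_kev + e_absorb_kev          # reconstructed incident energy
    cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e_absorb_kev - 1.0 / e0)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("energies inconsistent with Compton kinematics")
    return math.degrees(math.acos(cos_theta))

# e.g. a 662 keV Cs-137 gamma ray depositing 200 keV in the scatterer
print(compton_cone_angle(200.0, 462.0))  # ~48 degrees
```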
  • 553
  • 18 Oct 2022
Topic Review
High-Throughput Screening Methods for Radiosensitivity and Resistance
The biological impact of ionizing radiation (IR) on humans depends not only on the physical properties and absorbed dose of radiation but also on the unique susceptibility of the exposed individual. A critical target of IR is DNA, and the DNA damage response is a safeguard mechanism for maintaining genomic integrity in response to the induced cellular stress. Unrepaired DNA lesions lead to various mutations, contributing to adverse health effects.
  • 547
  • 19 Aug 2022
Topic Review
Large Extra Dimension
In particle physics and string theory (M-theory), the ADD model, also known as the model with large extra dimensions (LED), is a model framework that attempts to solve the hierarchy problem (why is the force of gravity so weak compared to the electromagnetic force and the other fundamental forces?). The model tries to explain this problem by postulating that our universe, with its four dimensions (three spatial ones plus time), exists on a so-called membrane floating in an 11-dimensional space. It is then suggested that the other forces of nature (the electromagnetic force, strong interaction, and weak interaction) operate within this membrane and its four dimensions, while gravity can operate across all 11 dimensions. This would explain why gravity is very weak compared to the other fundamental forces. This is a radical theory given that the other 7 dimensions, which we do not observe, had previously been assumed to be very small (about a Planck length), while this theory asserts that they might be very large. The model was proposed by Nima Arkani-Hamed, Savas Dimopoulos, and Gia Dvali in 1998.

Attempts to test the theory involve smashing together two protons in the Large Hadron Collider so that they disperse and release elementary particles. If a graviton produced in such a collision were to escape into dimensions beyond the observable four, its disappearance, observed as missing energy, would suggest the existence of large extra dimensions. No experiments at the Large Hadron Collider have been decisive thus far. However, the operational range of the LHC (13 TeV collision energy) covers only a small part of the predicted range in which evidence for LED would be recorded (a few TeV to 10¹⁶ TeV). This suggests that the theory might be more thoroughly tested with advanced technology.
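The dilution of gravity into the extra volume can be summarized in one relation; a hedged sketch, with conventions varying by O(1) factors between references:

```latex
% ADD relation between the observed 4D Planck mass M_Pl and the fundamental
% higher-dimensional gravity scale M_* with n compact extra dimensions of size R:
M_{\mathrm{Pl}}^2 \sim M_*^{\,2+n}\, R^{\,n}
% Gravity appears weak because its flux spreads into the extra volume R^n.
% For example, M_* ~ 1 TeV with n = 2 gives R at roughly sub-millimeter scales,
% a regime probed by short-distance tests of Newton's law.
```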
  • 540
  • 01 Dec 2022