Topic Review
Weak Interaction
In nuclear physics and particle physics, the weak interaction, also often called the weak force or weak nuclear force, is one of the four known fundamental interactions, the others being electromagnetism, the strong interaction, and gravitation. It is the mechanism of interaction between subatomic particles that is responsible for the radioactive decay of atoms. The weak interaction participates in nuclear fission, and the theory describing its behaviour and effects is sometimes called quantum flavourdynamics (QFD). However, the term QFD is rarely used, because the weak force is better understood within electroweak theory (EWT). The effective range of the weak force is limited to subatomic distances and is less than the diameter of a proton.
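As a rough illustration of that last claim, the range of a force carried by a massive mediator can be estimated from the mediator's Compton wavelength. The Python sketch below, a back-of-the-envelope calculation using standard constants (not values from the entry above), compares the resulting weak-force range with the proton diameter.

```python
# Back-of-the-envelope estimate of the weak force's effective range
# from the Compton wavelength of its massive mediator (the W boson).

HBAR_C_MEV_FM = 197.327     # hbar * c in MeV * fm
M_W_MEV = 80379.0           # W boson mass in MeV/c^2 (PDG value)
PROTON_DIAMETER_FM = 1.7    # approximate proton diameter in fm

# range ~ hbar / (m_W * c) = (hbar * c) / (m_W * c^2)
range_fm = HBAR_C_MEV_FM / M_W_MEV

print(f"Estimated weak-interaction range: {range_fm:.2e} fm")
print(f"Fraction of proton diameter:      {range_fm / PROTON_DIAMETER_FM:.2e}")
# ~2.5e-3 fm, i.e. roughly a thousandth of the proton's diameter
```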
  • 2.9K
  • 27 Oct 2022
Topic Review
Void Coefficient
In nuclear engineering, the void coefficient (more properly, the void coefficient of reactivity) is a number used to estimate how much the reactivity of a nuclear reactor changes as voids (typically steam bubbles) form in the reactor moderator or coolant. Net reactivity in a reactor is the sum of many such feedback contributions, of which the void coefficient is only one. Reactors in which either the moderator or the coolant is a liquid will typically have a void coefficient that is negative (if the reactor is under-moderated) or positive (if the reactor is over-moderated). Reactors in which neither the moderator nor the coolant is a liquid (e.g., a graphite-moderated, gas-cooled reactor) have a void coefficient of zero. It is unclear how the definition of 'void' coefficient applies to reactors in which the moderator/coolant is neither liquid nor gas, such as the supercritical water reactor.
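To make the sign convention concrete, here is a minimal, hypothetical Python sketch of how a void coefficient is applied: the reactivity change is linearized as the coefficient times the change in void fraction. The coefficient magnitude is illustrative only and does not describe any real reactor.

```python
# Minimal linearized model: reactivity change ~= alpha_v * delta_void.
# The coefficient value below is illustrative, not from any real reactor.

def reactivity_change(alpha_v_pcm_per_pct: float, delta_void_pct: float) -> float:
    """Estimate the reactivity change in pcm for a change in void
    fraction (in percentage points), given a void coefficient alpha_v."""
    return alpha_v_pcm_per_pct * delta_void_pct

# Under-moderated reactor: negative coefficient, so voids reduce reactivity.
print(reactivity_change(-100.0, 5.0))   # -500 pcm for a 5-point void increase
# Over-moderated reactor: positive coefficient, so voids add reactivity.
print(reactivity_change(+100.0, 5.0))   # +500 pcm
```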
  • 2.2K
  • 22 Nov 2022
Topic Review
Types of Compton Cameras
A Compton camera is a promising γ-ray detector that operates over the wide energy range from a few tens of keV to several MeV. Its detection method is based on Compton scattering kinematics, which determine the direction and energy of incident γ-rays without the use of a mechanical collimator. Although the Compton camera was originally designed for astrophysical applications, it was later applied to medical imaging as well, and its application in environmental radiation measurements is also under study.
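The kinematic reconstruction the entry describes can be sketched in a few lines. Assuming the photon is fully absorbed, so that its initial energy is the sum of the two energy deposits, the Compton formula yields the opening angle of the cone on which the source must lie. The Python sketch below implements that relation; the Cs-137 numbers in the usage example are illustrative.

```python
import math

M_E_C2_KEV = 511.0  # electron rest energy in keV

def compton_cone_angle(e_scatter_kev: float, e_absorb_kev: float) -> float:
    """Opening angle (radians) of the Compton cone, given the energy
    deposited in the scatterer and the energy of the scattered photon
    absorbed in the second detector. Assumes full absorption."""
    e_total = e_scatter_kev + e_absorb_kev          # initial photon energy
    cos_theta = 1.0 - M_E_C2_KEV * (1.0 / e_absorb_kev - 1.0 / e_total)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("kinematically inconsistent energy deposits")
    return math.acos(cos_theta)

# Example: a 662 keV photon (Cs-137) deposits 200 keV in the scatterer.
theta = compton_cone_angle(200.0, 462.0)
print(f"Compton cone half-angle: {math.degrees(theta):.1f} deg")  # ~48.3 deg
```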
  • 373
  • 18 Oct 2022
Topic Review
Triple-Alpha Process
The triple-alpha process is a set of nuclear fusion reactions by which three helium-4 nuclei (alpha particles) are transformed into carbon-12, proceeding through the unstable intermediate beryllium-8.
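The net energy release of the process follows directly from atomic mass data, as the short Python sketch below shows; the masses are standard tabulated values.

```python
# Energy released by the triple-alpha process, 3 He-4 -> C-12,
# from atomic mass data. 1 u = 931.494 MeV/c^2.

U_TO_MEV = 931.494
M_HE4 = 4.002602     # helium-4 atomic mass (u)
M_C12 = 12.000000    # carbon-12 atomic mass (u), exact by definition

q_value_mev = (3 * M_HE4 - M_C12) * U_TO_MEV
print(f"Q value of the triple-alpha process: {q_value_mev:.2f} MeV")  # ~7.27 MeV
```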
  • 1.8K
  • 22 Nov 2022
Topic Review
Transmission Electron Microscopy DNA Sequencing
Transmission electron microscopy DNA sequencing is a single-molecule sequencing technology that uses transmission electron microscopy techniques. The method was conceived and developed in the 1960s and 70s, but lost favor when the extent of damage to the sample was recognized. In order for DNA to be clearly visualized under an electron microscope, it must be labeled with heavy atoms. In addition, specialized imaging techniques and aberration-corrected optics are beneficial for obtaining the resolution required to image the labeled DNA molecule. In theory, transmission electron microscopy DNA sequencing could provide extremely long read lengths, but the issue of electron-beam damage may still remain, and the technology has not yet been commercially developed.
  • 800
  • 31 Oct 2022
Topic Review
Transmission Electron Microscopy
Transmission electron microscopy (TEM) is a microscopy technique in which a beam of electrons is transmitted through a specimen to form an image. The specimen is most often an ultrathin section less than 100 nm thick or a suspension on a grid. An image is formed from the interaction of the electrons with the sample as the beam is transmitted through the specimen; it is then magnified and focused onto an imaging device, such as a fluorescent screen, a layer of photographic film, or a sensor such as a scintillator attached to a charge-coupled device.

Transmission electron microscopes are capable of imaging at significantly higher resolution than light microscopes, owing to the smaller de Broglie wavelength of electrons. This enables the instrument to capture fine detail, even as small as a single column of atoms, which is thousands of times smaller than the smallest object resolvable in a light microscope. Transmission electron microscopy is a major analytical method in the physical, chemical and biological sciences. TEMs find application in cancer research, virology, and materials science, as well as in pollution, nanotechnology and semiconductor research, and in other fields such as paleontology and palynology.

TEM instruments have multiple operating modes, including conventional imaging, scanning TEM imaging (STEM), diffraction, spectroscopy, and combinations of these. Even within conventional imaging, there are many fundamentally different ways that contrast is produced, called "image contrast mechanisms". Contrast can arise from position-to-position differences in thickness or density ("mass-thickness contrast"), atomic number ("Z contrast", referring to the common abbreviation Z for atomic number), crystal structure or orientation ("crystallographic contrast" or "diffraction contrast"), the slight quantum-mechanical phase shifts that individual atoms produce in electrons passing through them ("phase contrast"), the energy lost by electrons on passing through the sample ("spectrum imaging"), and more. Each mechanism gives the user a different kind of information, depending not only on the contrast mechanism but on how the microscope is used: the settings of lenses, apertures, and detectors. A TEM is therefore capable of returning an extraordinary variety of nanometer- and atomic-resolution information, in ideal cases revealing not only where all the atoms are but what kinds of atoms they are and how they are bonded to each other. For this reason TEM is regarded as an essential tool for nanoscience in both the biological and materials fields.

The first TEM was demonstrated by Max Knoll and Ernst Ruska in 1931; the same group developed the first TEM with resolution exceeding that of light in 1933 and the first commercial TEM in 1939. In 1986, Ruska was awarded the Nobel Prize in Physics for the development of transmission electron microscopy.
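The resolution claim can be made quantitative: the relativistically corrected de Broglie wavelength of the imaging electrons follows from the accelerating voltage. The Python sketch below evaluates it at common TEM voltages; the constants are CODATA values, and the voltages are merely typical examples.

```python
import math

# Relativistically corrected de Broglie wavelength of electrons
# accelerated through a potential V, which sets the ultimate
# resolution scale of a TEM.

H = 6.62607015e-34       # Planck constant, J*s
M_E = 9.1093837015e-31   # electron rest mass, kg
E = 1.602176634e-19      # elementary charge, C
C = 2.99792458e8         # speed of light, m/s

def electron_wavelength_pm(volts: float) -> float:
    """de Broglie wavelength (picometres) of an electron accelerated
    through `volts`, including the relativistic correction term."""
    momentum = math.sqrt(2 * M_E * E * volts * (1 + E * volts / (2 * M_E * C**2)))
    return H / momentum * 1e12

for kv in (100, 200, 300):
    print(f"{kv} kV: {electron_wavelength_pm(kv * 1e3):.2f} pm")
# ~3.70 pm at 100 kV, ~2.51 pm at 200 kV, ~1.97 pm at 300 kV:
# orders of magnitude below the ~400-700 nm wavelengths of visible light.
```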
  • 2.5K
  • 05 Dec 2022
Topic Review
Transient Reactor Test Facility (TREAT)
Coordinates: 43°41′11″N 112°45′36″W. The Transient Reactor Test Facility (TREAT) is an air-cooled, graphite-moderated, thermal-spectrum test nuclear reactor designed to test reactor fuels and structural materials. Constructed in 1958 and operated from 1959 until 1994, TREAT was built to conduct transient reactor tests in which the test material is subjected to neutron pulses that can simulate conditions ranging from mild transients to reactor accidents. TREAT was designed by Argonne National Laboratory and is located at the Idaho National Laboratory. Since its original construction, the facility has had additions or systems upgrades in 1963, 1972, 1982, and 1988; the 1988 addition was extensive and included upgrades of most of the instrumentation and control systems. The U.S. Department of Energy (DOE) decided to resume a program of transient testing, planning to invest about $75 million to restart the TREAT facility by 2018. The renewed interest in TREAT was sparked by the 2011 Fukushima Daiichi nuclear disaster, which prompted the shutdown of Japan's and Germany's nuclear plants. One planned use for TREAT is the testing of new accident-tolerant fuel for nuclear reactors. TREAT was successfully restarted in November 2017.
  • 270
  • 13 Oct 2022
Topic Review
Transactinide Element
In chemistry, transactinide elements (also transactinides, or super-heavy elements) are the chemical elements with atomic numbers from 104 to 120. Their atomic numbers are immediately greater than those of the actinides, the heaviest of which is lawrencium (atomic number 103). Glenn T. Seaborg first proposed the actinide concept, which led to the acceptance of the actinide series. He also proposed the transactinide series, ranging from element 104 to 121, and the superactinide series, approximately spanning elements 122 to 153. The transactinide seaborgium was named in his honor. By definition, transactinide elements are also transuranic elements, i.e., they have an atomic number greater than that of uranium (92). The transactinide elements all have electrons in the 6d subshell in their ground state. Except for rutherfordium and dubnium, even the longest-lived isotopes of transactinide elements have extremely short half-lives, measured in seconds or smaller units. The element naming controversy involved the first five or six transactinide elements; these elements thus used systematic names for many years after their discovery had been confirmed. (Usually the systematic names are replaced with permanent names proposed by the discoverers relatively soon after a discovery has been confirmed.) Transactinides are radioactive and have only been obtained synthetically in laboratories; none has ever been collected in a macroscopic sample. Transactinide elements are all named after physicists and chemists or important locations involved in the synthesis of the elements. IUPAC defines an element to exist if its lifetime is longer than 10⁻¹⁴ seconds, the time it takes for the nucleus to acquire an electron cloud.
  • 2.7K
  • 01 Dec 2022
Topic Review
Technicolor
Technicolor theories are models of physics beyond the Standard Model that address electroweak gauge symmetry breaking, the mechanism through which the W and Z bosons acquire masses. Early technicolor theories were modelled on quantum chromodynamics (QCD), the "color" theory of the strong nuclear force, which inspired their name. Instead of introducing elementary Higgs bosons to explain observed phenomena, technicolor models generate the W and Z boson masses dynamically through new gauge interactions. Although asymptotically free at very high energies, these interactions must become strong and confining (and hence unobservable) at the lower energies that have been experimentally probed. This dynamical approach is natural and avoids the issues of quantum triviality and the hierarchy problem of the Standard Model. However, since the discovery of the Higgs boson at the CERN LHC in 2012, the original models are largely ruled out. Nonetheless, it remains a possibility that the Higgs boson is a composite state. To produce quark and lepton masses, technicolor or composite Higgs models have to be "extended" by additional gauge interactions. Particularly when modelled on QCD, extended technicolor was challenged by experimental constraints on flavor-changing neutral currents and by precision electroweak measurements. The specific extensions of particle dynamics for technicolor or composite Higgs bosons are unknown. Much technicolor research focuses on exploring strongly interacting gauge theories other than QCD in order to evade some of these challenges. A particularly active framework is "walking" technicolor, which exhibits nearly conformal behavior caused by an infrared fixed point with strength just above that necessary for spontaneous chiral symmetry breaking. Whether walking can occur and lead to agreement with precision electroweak measurements is being studied through non-perturbative lattice simulations. Experiments at the Large Hadron Collider have discovered the mechanism responsible for electroweak symmetry breaking, i.e., the Higgs boson, with a mass of approximately 125 GeV/c²; such a particle is not generically predicted by technicolor models. However, the Higgs boson may be a composite state, e.g., built of top and anti-top quarks as in the Bardeen–Hill–Lindner theory. Composite Higgs models are generally solved by the top quark infrared fixed point, and may require new dynamics at extremely high energies, such as topcolor.
  • 567
  • 28 Sep 2022
Topic Review
Synchrotron Light Source
A synchrotron light source is a source of electromagnetic (EM) radiation, usually produced by a storage ring, for scientific and technical purposes. First observed in synchrotrons, synchrotron light is now produced by storage rings and other specialized particle accelerators, typically accelerating electrons. Once the high-energy electron beam has been generated, it is directed into auxiliary components such as bending magnets and insertion devices (undulators or wigglers) in storage rings and free-electron lasers. These supply the strong magnetic fields, perpendicular to the beam, that are needed to convert high-energy electrons into photons. The major applications of synchrotron light are in condensed matter physics, materials science, biology and medicine. A large fraction of experiments using synchrotron light involve probing the structure of matter, from the sub-nanometer scale of electronic structure to the micrometer and millimeter scales important in medical imaging. An example of a practical industrial application is the manufacturing of microstructures by the LIGA process.
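As a concrete example of how bending-magnet parameters set the emitted spectrum, the widely used practical formula E_c [keV] ≈ 0.665 · E² [GeV²] · B [T] gives the critical photon energy of a bending-magnet source. The Python sketch below applies it to an illustrative 3 GeV ring; the machine parameters are assumptions, not those of any specific facility.

```python
# Critical photon energy of synchrotron radiation from a bending magnet,
# via the standard practical formula E_c [keV] = 0.665 * E^2 [GeV] * B [T].
# The ring energy and field below are illustrative, not facility-specific.

def critical_energy_kev(e_gev: float, b_tesla: float) -> float:
    """Critical energy (keV) of the bending-magnet spectrum: half the
    radiated power lies above this energy and half below."""
    return 0.665 * e_gev**2 * b_tesla

print(f"{critical_energy_kev(3.0, 1.4):.1f} keV")  # ~8.4 keV for 3 GeV, 1.4 T
```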
  • 501
  • 11 Oct 2022