Topic Review
Kikuchi Lines
Kikuchi lines are patterns formed by scattered electrons. They pair up to form bands in electron diffraction from single-crystal specimens, serving as "roads in orientation space" for microscopists uncertain of what they are looking at. In transmission electron microscopes, they are easily seen in diffraction from regions of the specimen thick enough for multiple scattering. Unlike diffraction spots, which blink on and off as one tilts the crystal, Kikuchi bands mark orientation space with well-defined intersections (called zones or poles) as well as paths connecting one intersection to the next. Experimental and theoretical maps of Kikuchi band geometry, as well as their direct-space analogs (e.g. bend contours, electron channeling patterns, and fringe visibility maps), are increasingly useful tools in electron microscopy of crystalline and nanocrystalline materials. Because each Kikuchi line is associated with Bragg diffraction from one side of a single set of lattice planes, these lines can be labeled with the same Miller or reciprocal-lattice indices used to identify individual diffraction spots. Kikuchi band intersections, or zones, on the other hand, are indexed with direct-lattice indices, i.e. indices that represent integer multiples of the lattice basis vectors a, b and c. Kikuchi lines are formed in diffraction patterns by diffusely scattered electrons, e.g. as a result of thermal atomic vibrations. The main features of their geometry can be deduced from a simple elastic mechanism proposed in 1928 by Seishi Kikuchi, although the dynamical theory of diffuse inelastic scattering is needed to understand them quantitatively. In x-ray scattering, the analogous lines are referred to as Kossel lines (named after Walther Kossel).
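The link to Bragg diffraction can be made concrete with a small calculation: each Kikuchi line sits at the Bragg angle θ_B = arcsin(λ/2d) on either side of a lattice-plane trace, so the full angular width of a Kikuchi band is 2θ_B. The sketch below (standard-library Python) computes the relativistically corrected electron wavelength at 200 kV and the resulting band width; the 200 kV voltage and the Si (111) spacing are illustrative assumptions, not values from the text.

```python
import math

# CODATA physical constants (SI)
h = 6.62607015e-34      # Planck constant, J s
m0 = 9.1093837015e-31   # electron rest mass, kg
e = 1.602176634e-19     # elementary charge, C
c = 2.99792458e8        # speed of light, m/s

def electron_wavelength(V):
    """Relativistically corrected de Broglie wavelength for accelerating voltage V (volts)."""
    return h / math.sqrt(2 * m0 * e * V * (1 + e * V / (2 * m0 * c**2)))

V = 200e3                          # 200 kV TEM (assumed example)
lam = electron_wavelength(V)       # ~2.51 pm
d = 0.3136e-9                      # Si (111) interplanar spacing, m (assumed example)
theta_B = math.asin(lam / (2 * d))     # Bragg angle; lines sit at +/- theta_B
band_width_mrad = 2 * theta_B * 1e3    # full angular width of the Kikuchi band

print(f"wavelength = {lam*1e12:.3f} pm, band width = {band_width_mrad:.2f} mrad")
```

The few-milliradian result illustrates why Kikuchi bands are narrow features that nonetheless tile the whole of orientation space as the crystal is tilted.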
  • 1.8K
  • 09 Nov 2022
Topic Review
Photon
A photon (from Ancient Greek φῶς, φωτός  'light') is an elementary particle that is a quantum of the electromagnetic field, including electromagnetic radiation such as light and radio waves, and the force carrier for the electromagnetic force. Photons are massless, so they always move at the speed of light in vacuum, 299,792,458 m/s (or about 186,282 mi/s). The photon belongs to the class of bosons. Like all elementary particles, photons are currently best explained by quantum mechanics, and exhibit wave–particle duality, their behavior featuring properties of both waves and particles. The modern photon concept originated during the first two decades of the 20th century with the work of Albert Einstein, who built upon the research of Max Planck. While trying to explain how matter and electromagnetic radiation could be in thermal equilibrium with one another, Planck proposed that the energy stored within a material object should be regarded as composed of an integer number of discrete, equal-sized parts. To explain the photoelectric effect, Einstein introduced the idea that light itself is made of discrete units of energy. In 1926, Gilbert N. Lewis popularized the term photon for these energy units. Subsequently, many other experiments validated Einstein's approach. In the Standard Model of particle physics, photons and other elementary particles are described as a necessary consequence of physical laws having a certain symmetry at every point in spacetime. The intrinsic properties of particles, such as charge, mass, and spin, are determined by gauge symmetry. The photon concept has led to momentous advances in experimental and theoretical physics, including lasers, Bose–Einstein condensation, quantum field theory, and the probabilistic interpretation of quantum mechanics. It has been applied to photochemistry, high-resolution microscopy, and measurements of molecular distances.
Moreover, photons have been studied as elements of quantum computers, and for applications in optical imaging and optical communication such as quantum cryptography.
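The "discrete units of energy" above are quantified by the Planck–Einstein relation E = hf = hc/λ. A minimal sketch of that arithmetic, using CODATA constant values (the 550 nm example wavelength is an illustrative choice):

```python
# Planck-Einstein relation: energy of a single photon, E = h*c / wavelength
h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light in vacuum, m/s
eV = 1.602176634e-19  # joules per electronvolt

def photon_energy_eV(wavelength_m):
    """Energy of one photon of the given wavelength, in electronvolts."""
    return h * c / wavelength_m / eV

print(f"{photon_energy_eV(550e-9):.2f} eV")  # green visible light, roughly 2.25 eV
```

Scaling the wavelength down to X-rays or up to radio waves changes only this one number, which is why a single relation covers the whole electromagnetic spectrum.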
  • 1.8K
  • 23 Oct 2022
Topic Review
Position and Momentum Space
In physics and geometry, there are two closely related vector spaces, usually three-dimensional but in general of any finite number of dimensions. Position space (also real space or coordinate space) is the set of all position vectors r in space, and has dimensions of length; a position vector defines a point in space. If the position vector of a point particle varies with time, it traces out a path, the trajectory of the particle. Momentum space is the set of all momentum vectors p a physical system can have; the momentum vector of a particle corresponds to its motion, with units of [mass][length][time]−1. Mathematically, the duality between position and momentum is an example of Pontryagin duality. In particular, if a function is given in position space, f(r), then its Fourier transform yields the function in momentum space, φ(p); conversely, the inverse Fourier transform of a momentum-space function is a position-space function. These quantities and ideas transcend all of classical and quantum physics, and a physical system can be described using either the positions of its constituent particles or their momenta; both formulations provide the same information about the system under consideration. Another quantity is useful to define in the context of waves: the wave vector k (or simply "k-vector"), which has dimensions of reciprocal length, making it an analogue of angular frequency ω, which has dimensions of reciprocal time. The set of all wave vectors is k-space. Usually r is more intuitive and simpler than k, though the converse can also be true, as in solid-state physics. Quantum mechanics provides two fundamental examples of the duality between position and momentum: the Heisenberg uncertainty principle ΔxΔp ≥ ħ/2, which states that position and momentum cannot be simultaneously known to arbitrary precision, and the de Broglie relation p = ħk, which states that the momentum and wave vector of a free particle are proportional to each other.
In this context, when it is unambiguous, the terms "momentum" and "wave vector" are used interchangeably. However, the de Broglie relation does not hold in a crystal.
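The Fourier-transform duality and the uncertainty principle above can be checked numerically: a Gaussian wave packet in position space transforms to a Gaussian in k-space, and the product of the two spreads saturates the bound, Δx Δk = 1/2 (i.e. Δx Δp = ħ/2). The sketch below uses NumPy's FFT; the grid size and width σ are arbitrary illustrative choices.

```python
import numpy as np

# Position grid and a normalized Gaussian wavefunction with position spread sigma
N, L = 4096, 80.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
sigma = 1.0
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

# Momentum-space (k-space) wavefunction via FFT; p = hbar * k
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
k = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * 2 * np.pi
dk = k[1] - k[0]
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dk)

def spread(u, w, du):
    """Standard deviation of coordinate u under probability density |w|^2."""
    p = np.abs(w)**2 * du
    mean = np.sum(u * p)
    return np.sqrt(np.sum((u - mean)**2 * p))

dx_spread = spread(x, psi, dx)   # ~ sigma
dk_spread = spread(k, phi, dk)   # ~ 1/(2*sigma)
print(dx_spread * dk_spread)     # ~ 0.5: the minimum-uncertainty product
```

A Gaussian is the unique minimum-uncertainty state; any other profile gives a strictly larger product, which is a direct numerical illustration of ΔxΔp ≥ ħ/2.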
  • 1.8K
  • 24 Nov 2022
Topic Review
CNO Cycle
The CNO cycle (for carbon–nitrogen–oxygen; sometimes called the Bethe–Weizsäcker cycle after Hans Albrecht Bethe and Carl Friedrich von Weizsäcker) is one of the two known sets of fusion reactions by which stars convert hydrogen to helium, the other being the proton–proton chain reaction (p–p chain), which is more efficient at the Sun's core temperature. The CNO cycle is hypothesized to be dominant in stars that are more than 1.3 times as massive as the Sun. Unlike the proton–proton reaction, which consumes all its constituents, the CNO cycle is a catalytic cycle. In the CNO cycle, four protons fuse, using carbon, nitrogen, and oxygen isotopes as catalysts, each of which is consumed at one step of the cycle but regenerated in a later step. The end product is one alpha particle (a stable helium nucleus), two positrons, and two electron neutrinos. There are various alternative paths and catalysts involved in the CNO cycles, but all have the same net result. The positrons almost instantly annihilate with electrons, releasing energy in the form of gamma rays. The neutrinos escape from the star, carrying away some energy. One nucleus cycles through carbon, nitrogen, and oxygen isotopes via a number of transformations in an endless loop. The proton–proton chain is more prominent in stars of the Sun's mass or less. This difference stems from the different temperature dependences of the two reactions; the pp-chain reaction starts at temperatures around 4×106 K (4 megakelvin), making it the dominant energy source in smaller stars. A self-maintaining CNO chain starts at approximately 15×106 K, but its energy output rises much more rapidly with increasing temperature, so that it becomes the dominant source of energy at approximately 17×106 K. The Sun has a core temperature of around 15.7×106 K, and only 1.7% of the 4He nuclei produced in the Sun are born in the CNO cycle.
The CNO-I process was independently proposed by Carl von Weizsäcker and Hans Bethe in the late 1930s. The first reports of the experimental detection of the neutrinos produced by the CNO cycle in the Sun were published in 2020. This was also the first experimental confirmation that the Sun had a CNO cycle, that the proposed magnitude of the cycle was accurate, and that von Weizsäcker and Bethe were correct.
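The claim that CNO energy output "rises much more rapidly" with temperature can be illustrated with toy power laws. Near solar-core conditions the pp-chain rate is often approximated as scaling like T^4 and the CNO rate like roughly T^17; both exponents are assumed illustrative values here (the true exponents vary with temperature), but they show why a modest temperature increase flips which mechanism dominates.

```python
# Assumed illustrative power-law exponents near solar-core temperatures:
# epsilon_pp ~ T^4, epsilon_CNO ~ T^17 (real exponents vary with T)
pp_exponent, cno_exponent = 4, 17

def relative_rate_increase(exponent, temp_ratio):
    """Factor by which a rate ~ T^exponent grows when T is multiplied by temp_ratio."""
    return temp_ratio ** exponent

# A 10% temperature rise raises the pp rate by ~1.5x but the CNO rate by ~5x,
# which is why CNO overtakes pp in stars only slightly more massive than the Sun.
print(relative_rate_increase(pp_exponent, 1.10))   # ~1.46
print(relative_rate_increase(cno_exponent, 1.10))  # ~5.05
```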
  • 1.7K
  • 03 Nov 2022
Topic Review
Kaleidoscope
A kaleidoscope (/kəˈlaɪdəskoʊp/) is an optical instrument with two or more reflecting surfaces (or mirrors) tilted to each other at an angle, so that one or more (parts of) objects on one end of the mirrors are seen as a regular symmetrical pattern when viewed from the other end, due to repeated reflection. The reflectors are usually enclosed in a tube, often containing on one end a cell with loose, colored pieces of glass or other transparent (and/or opaque) materials to be reflected into the viewed pattern. Rotation of the cell causes motion of the materials, resulting in an ever-changing view being presented.
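The "regular symmetrical pattern" produced by two mirrors at an angle follows a classic geometric-optics result: when the mirror angle divides 360° evenly, the number of reflected images is N = 360/θ − 1. A minimal sketch of that rule (the 60° and 45° angles are common illustrative choices):

```python
def kaleidoscope_images(angle_deg):
    """Number of reflected images formed by two plane mirrors meeting at angle_deg.

    Classic result: N = 360/angle - 1, valid when 360/angle is an integer,
    which is what yields a clean, seamless symmetric pattern.
    """
    n = 360 / angle_deg
    if not n.is_integer():
        raise ValueError("angle must divide 360 evenly for a seamless pattern")
    return int(n) - 1

# Mirrors at 60 degrees give 5 images; with the object itself that is
# 6-fold symmetry, the layout used in many toy kaleidoscopes.
print(kaleidoscope_images(60))  # 5
print(kaleidoscope_images(45))  # 7
```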
  • 1.7K
  • 04 Nov 2022
Topic Review
Laser-Induced Breakdown Spectroscopy
Laser-Induced Breakdown Spectroscopy (LIBS) was first proposed for analytical applications almost immediately after the invention of the laser in 1960. Since then it has become a widely used alternative analytical method for numerous applications. The operating principle of LIBS is quite simple: a sufficiently powerful laser beam, usually focused on or in a sample, induces dielectric breakdown of the material, forming a plasma that consists of excited and non-excited atoms and molecules, molecular fragments, electrons, and ions, and emits characteristic radiation whose spectroscopic analysis can in principle provide the elemental-composition fingerprint of the material. The required instrumentation, consisting basically of a laser source and a spectrometer/monochromator equipped with an appropriate light detector (nowadays almost exclusively of CCD or ICCD type), is relatively simple and economically affordable, while significant progress has been made toward small and/or portable equipment, greatly facilitating in situ operation.
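The "elemental-composition fingerprint" step above amounts to matching measured emission-peak wavelengths against tabulated atomic lines. The sketch below is a deliberately simplified illustration: the line list is hypothetical and tiny (real work draws on databases such as the NIST Atomic Spectra Database), and the nearest-line matching within a tolerance stands in for proper peak fitting and calibration.

```python
# Hypothetical, heavily abridged emission-line list (wavelengths in nm);
# real LIBS analysis uses full reference databases, not this toy table.
REFERENCE_LINES = {
    "Fe": [371.99, 404.58, 438.35],
    "Cu": [324.75, 327.40, 521.82],
    "Pb": [405.78, 368.35],
}

def identify_elements(measured_peaks, tolerance=0.1):
    """Assign each measured peak (nm) to the closest reference line within tolerance."""
    found = {}
    for peak in measured_peaks:
        best = None  # (element, wavelength difference)
        for element, lines in REFERENCE_LINES.items():
            for line in lines:
                diff = abs(peak - line)
                if diff <= tolerance and (best is None or diff < best[1]):
                    best = (element, diff)
        if best:
            found.setdefault(best[0], []).append(peak)
    return found

# Example spectrum containing copper and iron emission peaks
print(identify_elements([324.76, 327.41, 371.95]))
```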
  • 1.7K
  • 31 Aug 2021
Topic Review
Dye-Sensitized Solar Cells
Dye-sensitized solar cells (DSSCs) have emerged as promising alternatives to traditional silicon-based solar cells due to their relatively high conversion efficiency, low cost, flexibility, and environmentally benign fabrication processes.
  • 1.7K
  • 25 Oct 2020
Topic Review
Tensile Testing
Tensile testing, also known as tension testing, is a fundamental materials science and engineering test in which a sample is subjected to a controlled tension until failure. Properties that are directly measured via a tensile test are ultimate tensile strength, breaking strength, maximum elongation and reduction in area. From these measurements the following properties can also be determined: Young's modulus, Poisson's ratio, yield strength, and strain-hardening characteristics. Uniaxial tensile testing is the method most commonly used for obtaining the mechanical characteristics of isotropic materials. Some materials require biaxial tensile testing; the main difference between these testing machines is how the load is applied to the material.
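Extracting the derived properties mentioned above from a recorded stress-strain curve is straightforward arithmetic. The sketch below uses made-up illustrative data for a ductile metal: Young's modulus is the slope of the initial linear region, ultimate tensile strength is the peak engineering stress, and maximum elongation is the strain at fracture.

```python
import numpy as np

# Illustrative (made-up) engineering stress-strain data for a ductile metal
strain = np.array([0.000, 0.001, 0.002, 0.004, 0.010, 0.030, 0.060, 0.080])
stress = np.array([0.0,   200.0, 400.0, 560.0, 620.0, 700.0, 690.0, 600.0])  # MPa

# Young's modulus: slope of the initial linear (elastic) region (first 3 points here)
E = np.polyfit(strain[:3], stress[:3], 1)[0]   # MPa; ~200 GPa for this data

uts = stress.max()        # ultimate tensile strength: peak engineering stress, MPa
elongation = strain[-1]   # maximum elongation: strain at fracture

print(f"E = {E/1000:.0f} GPa, UTS = {uts:.0f} MPa, elongation = {elongation:.0%}")
```

Yield strength is usually read off the same curve with the 0.2% offset construction, which requires finer strain resolution than this toy dataset provides.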
  • 1.7K
  • 23 Nov 2022
Topic Review
Plasticity
In physics and materials science, plasticity, also known as plastic deformation, is the ability of a solid material to undergo permanent deformation, a non-reversible change of shape in response to applied forces. For example, a solid piece of metal being bent or pounded into a new shape displays plasticity as permanent changes occur within the material itself. In engineering, the transition from elastic behavior to plastic behavior is known as yielding. Plastic deformation is observed in most materials, particularly metals, soils, rocks, concrete, and foams. However, the physical mechanisms that cause plastic deformation can vary widely. At a crystalline scale, plasticity in metals is usually a consequence of dislocations. Such defects are relatively rare in most crystalline materials, but are numerous in some and part of their crystal structure; in such cases, plastic crystallinity can result. In brittle materials such as rock, concrete and bone, plasticity is caused predominantly by slip at microcracks. In cellular materials such as liquid foams or biological tissues, plasticity is mainly a consequence of bubble or cell rearrangements, notably T1 processes. For many ductile metals, tensile loading applied to a sample will cause it to behave in an elastic manner. Each increment of load is accompanied by a proportional increment in extension. When the load is removed, the piece returns to its original size. However, once the load exceeds a threshold – the yield strength – the extension increases more rapidly than in the elastic region; now when the load is removed, some degree of extension will remain. Elastic deformation, however, is an approximation and its quality depends on the time frame considered and loading speed. If, as indicated in the graph opposite, the deformation includes elastic deformation, it is also often referred to as "elasto-plastic deformation" or "elastic-plastic deformation". 
Perfect plasticity is a property of materials to undergo irreversible deformation without any increase in stresses or loads. Plastic materials that have been hardened by prior deformation, such as cold forming, may need increasingly higher stresses to deform further. Generally, plastic deformation is also dependent on the deformation speed, i.e. higher stresses usually have to be applied to increase the rate of deformation. Such materials are said to deform visco-plastically.
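The perfect-plasticity idealization described above has a simple closed form for monotonic loading: stress grows linearly up to the yield strength and then stays flat, and unloading from beyond yield leaves a permanent strain of ε_max − σ_y/E. A minimal sketch (the steel-like modulus and yield values are illustrative assumptions):

```python
def stress_elastic_perfectly_plastic(strain, E, yield_stress):
    """Elastic-perfectly-plastic model under monotonic loading:
    stress is linear in strain up to yield, then constant at the yield stress."""
    return min(E * strain, yield_stress)

def residual_strain(strain_max, E, yield_stress):
    """Permanent (plastic) strain remaining after elastic unloading from strain_max."""
    stress = stress_elastic_perfectly_plastic(strain_max, E, yield_stress)
    return max(strain_max - stress / E, 0.0)

E = 200e3        # Young's modulus, MPa (steel-like, illustrative)
sigma_y = 250.0  # yield strength, MPa (illustrative)

# Below the yield strain (sigma_y/E = 0.00125): fully elastic, no permanent set
print(residual_strain(0.001, E, sigma_y))   # 0.0
# Beyond yield: unloading leaves permanent strain, 0.005 - 0.00125 = 0.00375
print(residual_strain(0.005, E, sigma_y))
```

Strain hardening or viscoplasticity, as noted above, would replace the flat plateau with a stress that keeps rising with strain or with strain rate.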
  • 1.7K
  • 10 Nov 2022
Topic Review
List of Meteor Air Bursts
Many explosions recorded in Earth's atmosphere are likely to be caused by the air bursts that result from meteors exploding as they hit the thicker part of the atmosphere. These types of meteors are also known as fireballs or bolides with the brightest known as superbolides. Before entering Earth's atmosphere, these larger meteors were originally asteroids and comets of a few to several tens of metres in diameter, contrasting with the much smaller and much more common "shooting stars". The most powerful recorded air burst is the 1908 Tunguska event. Extremely bright fireballs traveling across the sky are often witnessed from a distance, such as the 1947 Sikhote-Alin meteor and the 2013 Chelyabinsk meteor, both in Russia. If the bolide is large enough, fragments may survive such as the Chelyabinsk meteorite. Modern developments in infrasound detection by the Comprehensive Nuclear-Test-Ban Treaty Organization and infrared Defense Support Program satellite technology have increased the likelihood of detecting airbursts.
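The destructive potential of such objects comes almost entirely from kinetic energy, E = ½mv², usually quoted in kilotons of TNT (1 kt = 4.184×10¹² J). The sketch below runs that arithmetic for a Chelyabinsk-like stony body; the diameter, density, and speed are rough illustrative inputs, not measured values from the text.

```python
import math

def airburst_energy_kt(diameter_m, density_kg_m3, speed_m_s):
    """Rough kinetic energy of a meteor, expressed in kilotons of TNT.

    Assumes a spherical body; 1 kt TNT = 4.184e12 J by convention.
    """
    radius = diameter_m / 2
    mass = density_kg_m3 * (4 / 3) * math.pi * radius**3
    energy_j = 0.5 * mass * speed_m_s**2
    return energy_j / 4.184e12

# Chelyabinsk-like object: ~19 m stony asteroid at ~19 km/s (illustrative inputs)
print(f"{airburst_energy_kt(19, 3300, 19e3):.0f} kt TNT")  # order of several hundred kt
```

Only a fraction of this energy reaches the ground as blast, since much is deposited high in the atmosphere during fragmentation.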
  • 1.7K
  • 11 Nov 2022