Encyclopedia
Scholarly Community
Topic Review
Argonaut Class Reactor
The Argonaut class reactor is a design of small nuclear research reactor. Many have been built throughout the world, over a wide range of power levels. It is used to teach nuclear reactor theory and nuclear physics, and for engineering laboratory experiments.
1.3K
15 Nov 2022
Topic Review
Photometry
Photometry, from Greek photo- ("light") and -metry ("measure"), is a technique used in astronomy to measure the flux or intensity of light radiated by astronomical objects. The light is gathered by a telescope and measured with a photometer, often built from electronic devices such as a CCD photometer or a photoelectric photometer, which converts light into an electric current by the photoelectric effect. When calibrated against standard stars (or other light sources) of known intensity and colour, photometers can measure the brightness or apparent magnitude of celestial objects.
The methods used to perform photometry depend on the wavelength region under study. At its most basic, photometry is conducted by gathering light, passing it through specialized photometric optical bandpass filters, and then capturing and recording the light energy with a photosensitive instrument. Standard sets of passbands (called a photometric system) are defined to allow accurate comparison of observations. A more advanced technique is spectrophotometry, measured with a spectrophotometer, which records both the amount of radiation and its detailed spectral distribution.
Photometry is also used in the observation of variable stars, by techniques such as differential photometry, which simultaneously measures the brightness of a target object and of nearby stars in the starfield, or relative photometry, which compares the brightness of the target object to stars with known fixed magnitudes. Using multiple bandpass filters with relative photometry is termed absolute photometry. A plot of magnitude against time produces a light curve, yielding considerable information about the physical process causing the brightness changes. Precision photoelectric photometers can measure starlight to around 0.001 magnitude.
The technique of surface photometry can also be used with extended objects such as planets, comets, nebulae or galaxies; it measures apparent magnitude in terms of magnitudes per square arcsecond. Knowing the area of the object and the average intensity of light across it gives the surface brightness in magnitudes per square arcsecond, while integrating the total light of the extended object yields brightness in terms of total magnitude, energy output or luminosity per unit surface area.
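The magnitude arithmetic behind differential photometry follows Pogson's relation; a minimal sketch (Python assumed, with illustrative flux values):

```python
import math

def delta_mag(flux_target: float, flux_comparison: float) -> float:
    """Pogson's relation: m1 - m2 = -2.5 * log10(F1 / F2)."""
    return -2.5 * math.log10(flux_target / flux_comparison)

# Differential photometry against a comparison star of known magnitude
# 10.0, with the target measuring 40% of its flux (illustrative values):
m_comparison = 10.0
m_target = m_comparison + delta_mag(0.4, 1.0)
print(f"target apparent magnitude: {m_target:.3f}")
```

A fainter target gives a larger (more positive) magnitude, which is why the flux ratio enters with a negative logarithmic factor.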
1.3K
11 Nov 2022
Topic Review
Radiation Portal Monitor
Radiation portal monitors (RPMs) are passive radiation detection devices used to screen individuals, vehicles, cargo and other vectors for illicit radioactive sources, for example at borders or secure facilities. Fear of terrorist attacks with radiological weapons has spurred RPM deployment for cargo scanning since 9/11, particularly in the United States.
1.3K
14 Nov 2022
Topic Review
Electron Cloud Densitometry
Electron cloud densitometry is an interdisciplinary technology based on a quantum-mechanical electron beam shifting effect: an electron beam passing through an electron cloud, in accordance with the general principle of superposition, changes its intensity in proportion to the probability density of the cloud. This gives direct visualization of the individual shapes of atoms, molecules and chemical bonds.
1.2K
18 Nov 2022
Topic Review
Auger Electron Spectroscopy
[Image: A Hanford scientist uses an Auger electron spectrometer to determine the elemental composition of surfaces.] Auger electron spectroscopy (AES; pronounced [oʒe] in French) is a common analytical technique used specifically in the study of surfaces and, more generally, in the area of materials science. The spectroscopic technique rests on the Auger effect, as it has come to be called, which is based on the analysis of energetic electrons emitted from an excited atom after a series of internal relaxation events. The Auger effect was discovered independently by Lise Meitner and Pierre Auger in the 1920s. Though the discovery was made by Meitner and first reported in the journal Zeitschrift für Physik in 1922, most of the scientific community credits Auger. Until the early 1950s Auger transitions were considered nuisance effects by spectroscopists, thought not to contain much relevant material information and studied mainly to explain anomalies in X-ray spectroscopy data. Since 1953, however, AES has become a practical and straightforward characterization technique for probing chemical and compositional surface environments and has found applications in metallurgy, gas-phase chemistry, and throughout the microelectronics industry.
1.2K
01 Nov 2022
Topic Review
Electrostatic Nuclear Accelerator
An electrostatic nuclear accelerator is one of the two main types of particle accelerator, in which charged particles are accelerated through a static high-voltage potential. The static high-voltage method contrasts with the dynamic fields used in oscillating field particle accelerators. Owing to their simpler design, these accelerators were historically developed earlier. They operate at lower energies than many larger oscillating field accelerators, and since cost broadly scales with the energy regime, they are less expensive than higher-energy machines and accordingly far more common. Many universities worldwide have electrostatic accelerators for research purposes.
1.2K
25 Oct 2022
Topic Review
Inorganic Scintillation Crystals
Scintillators play a crucial role as radiation detection materials in various nuclear technologies and radiation applications, such as medical imaging, well logging, homeland security, marine and space exploration, and high energy physics (HEP).
1.2K
25 Jun 2021
Topic Review
Scanning Transmission Electron Microscopy
A scanning transmission electron microscope (STEM) is a type of transmission electron microscope (TEM); the name is pronounced [stɛm] or [ɛsti:i:ɛm]. As with a conventional transmission electron microscope (CTEM), images are formed by electrons passing through a sufficiently thin specimen. However, unlike CTEM, in STEM the electron beam is focused to a fine spot (with a typical spot size of 0.05–0.2 nm) which is then scanned over the sample in a raster illumination system constructed so that the sample is illuminated at each point with the beam parallel to the optical axis. The rastering of the beam across the sample makes STEM suitable for analytical techniques such as Z-contrast annular dark-field imaging and spectroscopic mapping by energy-dispersive X-ray (EDX) spectroscopy or electron energy loss spectroscopy (EELS). These signals can be obtained simultaneously, allowing direct correlation of images and spectroscopic data. A typical STEM is a conventional transmission electron microscope equipped with additional scanning coils, detectors, and the necessary circuitry, which allows it to switch between operating as a STEM or a CTEM; however, dedicated STEMs are also manufactured. High-resolution scanning transmission electron microscopes require exceptionally stable room environments: to obtain atomic-resolution images, the levels of vibration, temperature fluctuation, electromagnetic waves, and acoustic waves must be limited in the room housing the microscope.
1.2K
29 Nov 2022
Topic Review
High-Resolution Transmission Electron Microscopy
High-resolution transmission electron microscopy is an imaging mode of specialized transmission electron microscopes that allows direct imaging of the atomic structure of samples. It is a powerful tool for studying properties of materials on the atomic scale, such as semiconductors, metals, nanoparticles and sp2-bonded carbon (e.g., graphene, carbon nanotubes). While the term is often also used to refer to high-resolution scanning transmission electron microscopy, mostly in high-angle annular dark-field mode, this article mainly describes the imaging of an object by recording the two-dimensional spatial wave amplitude distribution in the image plane, in analogy to a "classic" light microscope. For disambiguation, the technique is also often referred to as phase contrast transmission electron microscopy. At present, the highest point resolution realised in phase contrast transmission electron microscopy is around 0.5 ångströms (0.050 nm). At these small scales, individual atoms of a crystal and its defects can be resolved. For 3-dimensional crystals it may be necessary to combine several views, taken from different angles, into a 3D map; this technique is called electron crystallography. One of the difficulties with high-resolution transmission electron microscopy is that image formation relies on phase contrast. In phase-contrast imaging, contrast is not intuitively interpretable, because the image is influenced by aberrations of the imaging lenses in the microscope. The largest contributions for uncorrected instruments typically come from defocus and astigmatism. The latter can be estimated from the so-called Thon ring pattern appearing in the Fourier transform modulus of an image of a thin amorphous film.
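The influence of defocus and spherical aberration on phase contrast can be illustrated with the standard contrast transfer function; a sketch with illustrative 300 kV values (the numbers are assumptions for illustration, and sign conventions for defocus vary between texts):

```python
import math

def ctf(k: float, wavelength: float, defocus: float, cs: float) -> float:
    """Phase-contrast transfer function sin(chi) with aberration phase
    chi(k) = pi * lambda * defocus * k^2 + (pi/2) * Cs * lambda^3 * k^4.
    All quantities in SI units; k is spatial frequency in 1/m."""
    chi = (math.pi * wavelength * defocus * k ** 2
           + 0.5 * math.pi * cs * wavelength ** 3 * k ** 4)
    return math.sin(chi)

# 300 kV electrons (lambda ~ 1.97 pm), near-Scherzer underfocus, Cs = 1 mm
# (illustrative values). Zeros of ctf(k) for a thin amorphous film mark
# the Thon rings seen in the Fourier transform modulus of the image.
wavelength = 1.97e-12   # m
defocus = -61e-9        # m (underfocus in this sign convention)
cs = 1e-3               # m
print(ctf(2e9, wavelength, defocus, cs))  # transfer at k = 2 nm^-1
```

Fitting the observed Thon ring positions against this model is how defocus and astigmatism are estimated in practice.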
1.2K
15 Nov 2022
Topic Review
Magic Number
In nuclear physics, a magic number is a number of nucleons (either protons or neutrons, separately) such that they are arranged into complete shells within the atomic nucleus. As a result, atomic nuclei with a 'magic' number of protons or neutrons are much more stable than other nuclei. The seven most widely recognized magic numbers as of 2019 are 2, 8, 20, 28, 50, 82, and 126 (sequence A018226 in the OEIS). For protons, this corresponds to the elements helium, oxygen, calcium, nickel, tin, lead and the hypothetical unbihexium, although 126 is so far only known to be a magic number for neutrons. Atomic nuclei consisting of such a magic number of nucleons have a higher average binding energy per nucleon than one would expect based upon predictions such as the semi-empirical mass formula and are hence more stable against nuclear decay. The unusual stability of isotopes having magic numbers means that transuranium elements could theoretically be created with extremely large nuclei and yet not be subject to the extremely rapid radioactive decay normally associated with high atomic numbers. Large isotopes with magic numbers of nucleons are said to exist in an island of stability. Unlike the magic numbers 2–126, which are realized in spherical nuclei, theoretical calculations predict that nuclei in the island of stability are deformed. Before this was realized, higher magic numbers, such as 184, 258, 350, and 462 (sequence A033547 in the OEIS), were predicted based on simple calculations that assumed spherical shapes: these are generated by the formula 2(C(n,1) + C(n,2) + C(n,3)), where C(n,k) is the binomial coefficient. It is now believed that the sequence of spherical magic numbers cannot be extended in this way. Further predicted magic numbers are 114, 122, 124, and 164 for protons as well as 184, 196, 236, and 318 for neutrons. However, more modern calculations predict 228 and 308 for neutrons, along with 184 and 196.
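The binomial formula for the higher spherical magic numbers is easy to check numerically; a minimal sketch (Python assumed):

```python
from math import comb

def predicted_magic(n: int) -> int:
    """Spherical-shell prediction 2 * (C(n,1) + C(n,2) + C(n,3))."""
    return 2 * (comb(n, 1) + comb(n, 2) + comb(n, 3))

# n = 6 and 7 recover the known magic numbers 82 and 126, and
# n = 8..11 give the higher predicted spherical values:
print([predicted_magic(n) for n in range(8, 12)])  # [184, 258, 350, 462]
```

Note that the formula reproduces the spherical sequence only from 28 upward; the smallest magic numbers (2, 8, 20) arise from a different shell filling.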
1.1K
03 Nov 2022
Topic Review
Strangeness Production
Strangeness production is a signature and a diagnostic tool of quark–gluon plasma (or QGP) formation and properties. Unlike up and down quarks, from which everyday matter is made, strange quarks are formed in pair-production processes in collisions between constituents of the plasma. The dominant mechanism of production involves gluons only present when matter has become a quark–gluon plasma. When quark–gluon plasma disassembles into hadrons in a breakup process, the high availability of strange antiquarks helps to produce antimatter containing multiple strange quarks, which is otherwise rarely made. Similar considerations are at present made for the heavier charm flavor, which is made at the beginning of the collision process in the first interactions and is only abundant in the high-energy environments of CERN's Large Hadron Collider.
1.1K
02 Dec 2022
Topic Review
Low-Energy Electron Microscopy
Low-energy electron microscopy, or LEEM, is an analytical surface science technique used to image atomically clean surfaces, atom–surface interactions, and thin (crystalline) films. In LEEM, high-energy electrons (15–20 keV) are emitted from an electron gun, focused using a set of condenser optics, and sent through a magnetic beam deflector (usually 60˚ or 90˚). The "fast" electrons travel through an objective lens and begin decelerating to low energies (1–100 eV) near the sample surface, because the sample is held at a potential near that of the gun. The low-energy electrons are now termed "surface-sensitive", and the near-surface sampling depth can be varied by tuning the energy of the incident electrons (the difference between the sample and gun potentials minus the work functions of the sample and system). The low-energy elastically backscattered electrons travel back through the objective lens, reaccelerate to the gun voltage (because the objective lens is grounded), and pass through the beam separator again. However, the electrons now travel away from the condenser optics and into the projector lenses. Imaging of the back focal plane of the objective lens into the object plane of the projector lens (using an intermediate lens) produces a diffraction pattern (low-energy electron diffraction, LEED) at the imaging plane, which can be recorded in a number of different ways. The intensity distribution of the diffraction pattern depends on the periodicity at the sample surface and is a direct result of the wave nature of the electrons. One can produce individual images of the diffraction pattern spot intensities by turning off the intermediate lens and inserting a contrast aperture in the back focal plane of the objective lens (or, in state-of-the-art instruments, in the center of the separator, as chosen by the excitation of the objective lens), thus allowing for real-time observations of dynamic processes at surfaces.
Such phenomena include (but are not limited to): tomography, phase transitions, adsorption, reaction, segregation, thin film growth, etching, strain relief, sublimation, and magnetic microstructure. These investigations are possible only because of the accessibility of the sample, which allows a wide variety of in situ studies over a wide temperature range. LEEM was invented by Ernst Bauer in 1962 but was not fully developed (by Ernst Bauer and Wolfgang Telieps) until 1985.
1.1K
15 Nov 2022
Topic Review
Isotopes of Unbiunium
Unbiunium, also known as eka-actinium or simply element 121, is the hypothetical chemical element with symbol Ubu and atomic number 121. Unbiunium and Ubu are the temporary systematic IUPAC name and symbol, respectively, until a permanent name is decided upon. In the periodic table of the elements, it is expected to be the first of the superactinides and the third element in the eighth period: analogously to lanthanum and actinium, it could be considered the fifth member of group 3 and the first member of the fifth-row transition metals. It has attracted attention because of predictions that it may be in the island of stability, although newer calculations expect the island to occur at a slightly lower atomic number, closer to copernicium and flerovium. Unbiunium has not yet been synthesized. Nevertheless, because it is only three elements away from the heaviest known element, oganesson (element 118), its synthesis may come in the near future; it is expected to be one of the last few elements reachable with current technology, and the limit may be anywhere between element 120 and 124. It will also likely be far more difficult to synthesize than the elements known so far up to 118, and more difficult still than elements 119 and 120. The team at RIKEN in Japan plans to attempt the synthesis of element 121 after its attempts on elements 119 and 120. The position of unbiunium in the periodic table suggests that it would have similar properties to its lighter congeners scandium, yttrium, lanthanum, and actinium; however, relativistic effects may cause some of its properties to differ from those expected from a straight application of periodic trends.
For example, unbiunium is expected to have an s2p valence electron configuration instead of the s2d configuration of its lighter congeners in group 3. This is not expected to significantly affect its chemistry, which is predicted to be that of a normal group 3 element, but it would significantly lower its first ionisation energy beyond what would be expected from periodic trends.
1.1K
26 Oct 2022
Topic Review
Transmission Electron Microscopy DNA Sequencing
Transmission electron microscopy DNA sequencing is a single-molecule sequencing technology that uses transmission electron microscopy techniques. The method was conceived and developed in the 1960s and 1970s, but lost favor when the extent of damage to the sample was recognized. For DNA to be clearly visualized under an electron microscope, it must be labeled with heavy atoms. In addition, specialized imaging techniques and aberration-corrected optics are beneficial for obtaining the resolution required to image the labeled DNA molecule. In theory, transmission electron microscopy DNA sequencing could provide extremely long read lengths, but the issue of electron beam damage may remain, and the technology has not yet been commercially developed.
1.1K
31 Oct 2022
Topic Review
Deep Underground Neutrino Experiment
The Deep Underground Neutrino Experiment (DUNE) is a neutrino experiment under construction, with a near detector at Fermilab and a far detector at the Sanford Underground Research Facility that will observe neutrinos produced at Fermilab. An intense beam of trillions of neutrinos from the production facility at Fermilab (in Illinois) will be sent over a distance of 1,300 kilometers (810 mi) with the goal of understanding the role of neutrinos in the universe. More than 1,000 collaborators work on the project, and the experiment is designed for a 20-year period of data collection. The science goals are so compelling that the 2014 Particle Physics Project Prioritization Panel (P5) ranked DUNE as "the highest priority project in its timeframe" (recommendation 13).[10] The importance of these goals has led to proposals for competing projects in other countries, particularly the Hyper-Kamiokande experiment in Japan, scheduled to begin data-taking in 2027. The DUNE project, overseen by Fermilab, has suffered schedule delays and cost growth from less than $2B to $3B, leading the journals Science and Scientific American to describe the project as "troubled".[11][12] As of 2022, the DUNE experiment has a neutrino-beam start date in the early 2030s, and the project is now phased.[11][12]
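The significance of the 1,300 km baseline can be illustrated with the standard two-flavor oscillation approximation; a sketch with illustrative parameter values (these are textbook numbers, not DUNE specifications):

```python
import math

def survival_probability(l_km: float, e_gev: float,
                         dm2_ev2: float = 2.5e-3,
                         sin2_2theta: float = 1.0) -> float:
    """Two-flavor muon-neutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.267 * dm^2[eV^2] * L[km] / E[GeV])."""
    return 1.0 - sin2_2theta * math.sin(1.267 * dm2_ev2 * l_km / e_gev) ** 2

# Energy of the first oscillation maximum over a 1,300 km baseline,
# where the phase 1.267 * dm^2 * L / E equals pi/2
# (illustrative dm^2 = 2.5e-3 eV^2):
e_max = 1.267 * 2.5e-3 * 1300 / (math.pi / 2)
print(f"first oscillation maximum near {e_max:.2f} GeV")
print(f"survival probability there: {survival_probability(1300, e_max):.3f}")
```

The first maximum falls at a few GeV, which is why the beam energy and the long baseline are matched to each other.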
1.0K
02 Oct 2022
Topic Review
Fundamental Interaction
In physics, the fundamental interactions, also known as fundamental forces, are the interactions that do not appear to be reducible to more basic interactions. There are four fundamental interactions known to exist: the gravitational and electromagnetic interactions, which produce significant long-range forces whose effects can be seen directly in everyday life, and the strong and weak interactions, which produce forces at minuscule, subatomic distances and govern nuclear interactions. Some scientists hypothesize that a fifth force might exist, but these hypotheses remain speculative. Each of the known fundamental interactions can be described mathematically as a field. The gravitational force is attributed to the curvature of spacetime, described by Einstein's general theory of relativity. The other three are discrete quantum fields, and their interactions are mediated by elementary particles described by the Standard Model of particle physics. Within the Standard Model, the strong interaction is carried by a particle called the gluon, and is responsible for quarks binding together to form hadrons, such as protons and neutrons. As a residual effect, it creates the nuclear force that binds the latter particles to form atomic nuclei. The weak interaction is carried by particles called W and Z bosons, and also acts on the nucleus of atoms, mediating radioactive decay. The electromagnetic force, carried by the photon, creates electric and magnetic fields, which are responsible for the attraction between orbital electrons and atomic nuclei which holds atoms together, as well as chemical bonding and electromagnetic waves, including visible light, and forms the basis for electrical technology. 
Although the electromagnetic force is far stronger than gravity, it tends to cancel itself out within large objects, so over large (astronomical) distances gravity tends to be the dominant force, and is responsible for holding together the large scale structures in the universe, such as planets, stars, and galaxies. Many theoretical physicists believe these fundamental forces to be related and to become unified into a single force at very high energies on a minuscule scale, the Planck scale, but particle accelerators cannot produce the enormous energies required to experimentally probe this. Devising a common theoretical framework that would explain the relation between the forces in a single theory is perhaps the greatest goal of today's theoretical physicists. The weak and electromagnetic forces have already been unified with the electroweak theory of Sheldon Glashow, Abdus Salam, and Steven Weinberg for which they received the 1979 Nobel Prize in physics. Some physicists seek to unite the electroweak and strong fields within what is called a Grand Unified Theory (GUT). An even bigger challenge is to find a way to quantize the gravitational field, resulting in a theory of quantum gravity (QG) which would unite gravity in a common theoretical framework with the other three forces. Some theories, notably string theory, seek both QG and GUT within one framework, unifying all four fundamental interactions along with mass generation within a theory of everything (ToE).
1.0K
15 Nov 2022
Topic Review
Hyperpolarization
Hyperpolarization is the nuclear spin polarization of a material in a magnetic field far beyond the thermal equilibrium conditions determined by the Boltzmann distribution. It can be applied to gases such as 129Xe and 3He, and to small molecules, where polarization levels can be enhanced by a factor of 10^4–10^5 above thermal equilibrium. Hyperpolarized noble gases are typically used in magnetic resonance imaging (MRI) of the lungs. Hyperpolarized small molecules are typically used for in vivo metabolic imaging; for example, a hyperpolarized metabolite can be injected into animals or patients and its metabolic conversion tracked in real time. Other applications include determining the neutron spin structure by scattering polarized electrons from a highly polarized target (3He), surface interaction studies, and neutron polarizing experiments.
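The thermal-equilibrium baseline that hyperpolarization improves on follows from the Boltzmann distribution: for a spin-1/2 nucleus, P = tanh(γħB / 2kBT). A sketch with illustrative values for 3He at 1.5 T and room temperature:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
K_B = 1.380649e-23       # Boltzmann constant, J/K

def thermal_polarization(gamma: float, b_field: float, temp_k: float) -> float:
    """Thermal-equilibrium polarization of a spin-1/2 nucleus,
    P = tanh(gamma * hbar * B / (2 * kB * T))."""
    return math.tanh(gamma * HBAR * b_field / (2 * K_B * temp_k))

# 3He gyromagnetic ratio magnitude (rad s^-1 T^-1), 1.5 T, room temperature
p_thermal = thermal_polarization(2.038e8, 1.5, 300.0)
enhancement = 0.5 / p_thermal  # reaching P = 0.5 by hyperpolarization
print(f"P_thermal = {p_thermal:.2e}, enhancement ~ {enhancement:.0f}")
```

The thermal polarization comes out at a few parts per million, which is why reaching order-unity polarization corresponds to the 10^4–10^5 enhancement quoted above.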
1.0K
03 Nov 2022
Topic Review
Osmium-191
Osmium (76Os) has seven naturally occurring isotopes, five of which are stable: 187Os, 188Os, 189Os, 190Os, and (most abundant) 192Os. The other natural isotopes, 184Os and 186Os, have extremely long half-lives (1.12×10^13 years and 2×10^15 years, respectively) and for practical purposes can be considered stable as well. 187Os is the daughter of 187Re (half-life 4.56×10^10 years) and is most often measured as an 187Os/188Os ratio. This ratio, as well as the 187Re/188Os ratio, has been used extensively in dating terrestrial as well as meteoric rocks. It has also been used to measure the intensity of continental weathering over geologic time and to fix minimum ages for stabilization of the mantle roots of continental cratons. However, the most notable application of Os in dating has been in conjunction with iridium, to analyze the layer of shocked quartz along the Cretaceous–Paleogene boundary that marks the extinction of the dinosaurs 66 million years ago. There are also 30 artificial radioisotopes, the longest-lived of which is 194Os with a half-life of six years; all others have half-lives under 94 days. There are also nine known nuclear isomers, the longest-lived of which is 191mOs with a half-life of 13.10 hours. All isotopes and nuclear isomers of osmium are either radioactive or observationally stable, meaning that they are predicted to be radioactive but no actual decay has been observed.
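The Re–Os dating arithmetic follows directly from the decay law and the 4.56×10^10-year half-life of 187Re quoted above; a sketch with an illustrative radiogenic ratio (the 7% figure is an assumption for illustration, not a measured value):

```python
import math

HALF_LIFE_RE187_YR = 4.56e10  # 187Re half-life, years
DECAY_CONST = math.log(2) / HALF_LIFE_RE187_YR

def re_os_model_age(os187_radiogenic_per_re187: float) -> float:
    """Model age in years from the decay law:
    t = ln(1 + 187Os*/187Re) / lambda."""
    return math.log(1.0 + os187_radiogenic_per_re187) / DECAY_CONST

# If radiogenic 187Os has grown to 7% of the remaining 187Re,
# the model age is roughly 4.45 Gyr:
print(f"{re_os_model_age(0.07):.2e} years")
```

In practice, ages are obtained from isochrons of 187Os/188Os versus 187Re/188Os across several samples rather than from a single ratio.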
994
08 Nov 2022
Topic Review
Fouling Prevention in Membranes by Radiation-Induced Graft Copolymerization
The application of membrane processes in various fields has undergone accelerated development, despite some hurdles that impact process efficiency. Fouling is arguably the main hindrance to wider implementation of polymeric membranes, particularly in pressure-driven membrane processes, causing higher costs of energy, operation, and maintenance. Radiation-induced graft copolymerization (RIGC) is a powerful and versatile technique for covalently imparting selected chemical functionalities to membrane surfaces, providing a potential solution to fouling problems.
975
17 Jan 2022
Topic Review
Radiological Investigation of French Fossils
The accurate investigation of uranium-containing fossils in a museum of the City of Basel showed the following crucial points: (1) Storage of uranium-containing minerals or fossils may lead to elevated radon exposures for the staff. Monitoring of the room air is recommended, and measures taken in response (permanent air ventilation, airtight storage of contaminated objects) will reduce the exposure to radon. (2) The handling and preparation of such contaminated objects may lead to significant doses by inhalation of radioactive dust. Such work should not be undertaken without special precautions, such as wearing a protective mask and gloves and working at a clean bench.
942
04 Jul 2021