Topic Review
Multispectral Image
Multispectral imaging captures image data within specific wavelength ranges across the electromagnetic spectrum. The wavelengths may be separated by filters or detected with instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible range, i.e. infrared and ultraviolet. Spectral imaging can allow extraction of additional information that the human eye fails to capture with its visible receptors for red, green and blue. It was originally developed for military target identification and reconnaissance. Early space-based imaging platforms incorporated multispectral imaging technology to map details of the Earth related to coastal boundaries, vegetation, and landforms. Multispectral imaging has also found use in document and painting analysis. Multispectral imaging measures light in a small number (typically 3 to 15) of spectral bands. Hyperspectral imaging is a special case of spectral imaging in which often hundreds of contiguous spectral bands are available.
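A common way to exploit few-band data is to combine bands into an index. As a minimal sketch (the band ordering, indices, and reflectance values below are hypothetical, not tied to any particular sensor), the normalized difference vegetation index (NDVI) can be computed from the red and near-infrared bands of a multispectral image held as a NumPy array:

```python
import numpy as np

# Hypothetical 4-band multispectral image: blue, green, red, near-infrared (NIR).
# Shape: (bands, height, width); values are reflectances in [0, 1].
image = np.random.rand(4, 512, 512)

red = image[2]  # assumed red band index
nir = image[3]  # assumed NIR band index

# NDVI contrasts NIR and red reflectance: healthy vegetation reflects
# strongly in the NIR and absorbs red, pushing values toward +1.
ndvi = (nir - red) / (nir + red + 1e-10)  # epsilon guards against division by zero

print(ndvi.min(), ndvi.max())
```

Indices of this kind illustrate why even 3 to 15 well-chosen bands can reveal information invisible to the eye's red, green, and blue receptors.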
  • 1.1K
  • 21 Oct 2022
Topic Review
European Green Deal
The European Green Deal, approved in 2020, is a set of policy initiatives by the European Commission with the overarching aim of making the European Union (EU) climate neutral by 2050. An impact-assessed plan will also be presented to increase the EU's greenhouse gas emission reduction target for 2030 to at least 50%, and towards 55%, compared with 1990 levels. The plan is to review each existing law on its climate merits, and also to introduce new legislation on the circular economy, building renovation, biodiversity, farming and innovation. The president of the European Commission, Ursula von der Leyen, stated that the European Green Deal would be Europe's "man on the moon moment". Von der Leyen appointed Frans Timmermans as Executive Vice President of the European Commission for the European Green Deal. On 13 December 2019, the European Council decided to press ahead with the plan, with an opt-out for Poland. On 15 January 2020, the European Parliament voted to support the deal as well, with requests for higher ambition. The European Commission's climate change strategy, launched in 2020, is focused on a promise to make Europe a net-zero emitter of greenhouse gases by 2050 and to demonstrate that economies can develop without increasing resource usage. At the same time, the Green Deal includes measures to ensure that nations that are currently reliant on fossil fuels are not left behind in the transition to renewable energy.
  • 748
  • 21 Oct 2022
Topic Review
Last Glacial Period
The last glacial period occurred from the end of the Eemian interglacial to the end of the Younger Dryas, encompassing the period c. 115,000 – c. 11,700 years ago. This most recent glacial period is part of a larger pattern of glacial and interglacial periods known as the Quaternary glaciation, extending from c. 2,588,000 years ago to present. The definition of the Quaternary as beginning 2.58 Ma is based on the formation of the Arctic ice cap. The Antarctic ice sheet began to form earlier, at about 34 Ma, in the mid-Cenozoic (Eocene–Oligocene extinction event); the term Late Cenozoic Ice Age is used to include this early phase. During the last glacial period there were alternating episodes of glacier advance and retreat, with the Last Glacial Maximum occurring approximately 22,000 years ago. While the general pattern of global cooling and glacier advance was similar worldwide, local differences in the development of glacier advance and retreat make it difficult to compare the details from continent to continent. Approximately 13,000 years ago, the Late Glacial Maximum began. The end of the Younger Dryas about 11,700 years ago marked the beginning of the Holocene geological epoch, which includes the Holocene glacial retreat. From the point of view of human archaeology, the last glacial period falls in the Paleolithic and early Mesolithic periods. When the glaciation event started, Homo sapiens were confined to lower latitudes and used tools comparable to those used by Neanderthals in western and central Eurasia and by Homo erectus in Asia. Near the end of the event, Homo sapiens migrated into Eurasia and Australia. Archaeological and genetic data suggest that the source populations of Paleolithic humans survived the last glacial period in sparsely wooded areas and dispersed through areas of high primary productivity while avoiding dense forest cover. The retreat of the glaciers about 15,000 years ago allowed groups of humans from Asia to migrate to the Americas.
  • 5.2K
  • 21 Oct 2022
Topic Review
Polymeric Biodiesel
The biodiesel industry is expanding rapidly in response to high energy demand and the environmental deterioration associated with the combustion of fossil fuels. However, the poor physicochemical properties and underperformance of biodiesel fuel still concern researchers. In this context, polymers have been introduced into the biodiesel industry to overcome such drawbacks. This article introduces polymeric biodiesel, namely hydroxyalkanoate methyl ester (HAME) and hydroxybutyrate methyl ester (HBME), which are sourced from carbon-enriched polymers with the help of microbial activity. The composition, production techniques, characteristics, and limitations of polymeric biodiesel are explored.
  • 837
  • 21 Oct 2022
Topic Review
Techno-Economic and Life Cycle Cost Analysis
The techno-economic analysis (TEA) and the life cycle cost analysis (LCCA) are the most widely used approaches for modeling and calculating the economic impacts of processes. A simulation-based TEA is a cost-benefit analysis that simultaneously considers technical and economic factors. In addition, the method facilitates the development of the entire project and provides a systematic approach for examining the interrelationships between economic and technological aspects. Economic studies are intimately bound up with uncertainty, which has numerous sources and can be classified in various ways. Uncertainty reflects “an inability to determine the precise value of one or more parameters affecting a system.” Variability refers to the different values a given parameter may take; this implies that a probability density function (PDF), for instance, can be employed to estimate and quantify the variability of a given parameter. Bias refers to “assumptions that skew an analysis in a certain direction while ignoring other legitimate alternatives, factors, or data.”
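To make the role of a PDF concrete, the sketch below propagates parameter variability through a toy net-present-value calculation by Monte Carlo sampling. The cost model, the choice of distributions, and every numerical value are illustrative assumptions, not figures from any particular TEA:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 100_000

# Illustrative PDFs for uncertain TEA inputs (all values hypothetical):
capex = rng.normal(loc=1.0e6, scale=1.0e5, size=n)           # capital cost, $
opex = rng.uniform(low=5.0e4, high=9.0e4, size=n)            # annual operating cost, $
revenue = rng.triangular(1.0e5, 1.5e5, 2.2e5, size=n)        # annual revenue, $

years, discount_rate = 20, 0.08
# Present-value factor for a constant annual cash flow (ordinary annuity).
annuity = (1 - (1 + discount_rate) ** -years) / discount_rate

npv = -capex + (revenue - opex) * annuity

# The spread of the resulting NPV distribution quantifies how input
# variability translates into economic uncertainty.
print(f"mean NPV: {npv.mean():,.0f}")
print(f"5th-95th percentile: {np.percentile(npv, [5, 95])}")
```

Reporting a percentile band rather than a single NPV figure is one way such an analysis keeps the distinction between a point estimate and parameter variability explicit.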
  • 1.9K
  • 21 Oct 2022
Topic Review
Frame of Reference
In physics, a frame of reference (or reference frame) consists of an abstract coordinate system and the set of physical reference points that uniquely fix (locate and orient) the coordinate system and standardize measurements within that frame. For n dimensions, n + 1 reference points are sufficient to fully define a reference frame. Using rectangular (Cartesian) coordinates, a reference frame may be defined with a reference point at the origin and a reference point at one unit distance along each of the n coordinate axes. In Einsteinian relativity, reference frames are used to specify the relationship between a moving observer and the phenomenon or phenomena under observation. In this context, the phrase often becomes "observational frame of reference" (or "observational reference frame"), which implies that the observer is at rest in the frame, although not necessarily located at its origin. A relativistic reference frame includes (or implies) the coordinate time, which does not correspond across different frames moving relative to each other. The situation thus differs from Galilean relativity, where all possible coordinate times are essentially equivalent.
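A minimal numerical sketch of the n + 1 point construction (the coordinates and the 0.3 rad orientation below are hypothetical) builds a three-dimensional Cartesian frame from an origin plus one point at unit distance along each axis, then expresses an event's position in that frame:

```python
import numpy as np

# Frame fixed by n + 1 = 4 reference points, expressed in some ambient
# coordinate system: an origin plus a unit-distance point along each axis.
origin = np.array([1.0, 2.0, 0.5])
axis_points = np.array([
    [1.0 + np.cos(0.3), 2.0 + np.sin(0.3), 0.5],  # unit point along 1st axis
    [1.0 - np.sin(0.3), 2.0 + np.cos(0.3), 0.5],  # unit point along 2nd axis
    [1.0, 2.0, 1.5],                              # unit point along 3rd axis
])

# The basis vectors of the frame are the displacements from the origin.
basis = (axis_points - origin).T  # columns are the axis directions

# Coordinates of an event x in this frame solve: basis @ coords = x - origin.
x = np.array([2.0, 3.0, 1.0])
coords = np.linalg.solve(basis, x - origin)
print(coords)
```

The same four points thus both locate the frame (via the origin) and orient it (via the axis directions), which is what "uniquely fix" means above.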
  • 2.9K
  • 21 Oct 2022
Topic Review
Floating Island
A floating island is a mass of floating aquatic plants, mud, and peat ranging in thickness from several centimeters to a few meters. Floating islands are a common natural phenomenon found in many parts of the world; they exist less commonly as an artificial phenomenon. Floating islands are generally found on marshlands, lakes, and similar wetland locations, and can be many hectares in size.
  • 3.4K
  • 21 Oct 2022
Topic Review
Hockey Stick Controversy
In the hockey stick controversy, the data and methods used in reconstructions of the temperature record of the past 1000 years have been disputed. Reconstructions have consistently shown that the rise in the instrumental temperature record of the past 150 years is not matched in earlier centuries, and the name "hockey stick graph" was coined for figures showing a long-term decline followed by an abrupt rise in temperatures. These graphs were publicised to explain the scientific findings of climatology, and in addition to scientific debate over the reconstructions, they have been the topic of political dispute. The issue is part of the global warming controversy and has been one focus of political responses to reports by the Intergovernmental Panel on Climate Change (IPCC). Arguments over the reconstructions have been taken up by fossil fuel industry–funded lobbying groups attempting to cast doubt on climate science.

The use of proxy indicators to obtain quantitative estimates of the temperature record of past centuries was developed from the 1990s onwards, and found indications that recent warming was exceptional. The Bradley & Jones 1993 reconstruction introduced the "Composite Plus Scaling" (CPS) method used by most later large-scale reconstructions, and its findings were disputed by Patrick Michaels at the United States House Committee on Science. In 1998, Michael E. Mann, Raymond S. Bradley and Malcolm K. Hughes developed new statistical techniques to produce Mann, Bradley & Hughes 1998 (MBH98), the first eigenvector-based climate field reconstruction (CFR). This showed global patterns of annual surface temperature, and included a graph of average hemispheric temperatures back to 1400; in Mann, Bradley & Hughes 1999 (MBH99) the methodology was extended back to 1000. The term hockey stick was coined by the climatologist Jerry D. Mahlman to describe the pattern this showed, envisaging a graph that is relatively flat to 1900 as forming an ice hockey stick's "shaft", followed by a sharp increase corresponding to the "blade". A version of this graph was featured prominently in the 2001 IPCC Third Assessment Report (TAR), along with four other reconstructions supporting the same conclusion.

The graph was publicised, and became a focus of dispute for those opposed to the strengthening scientific consensus that late 20th-century warmth was exceptional. Those disputing the graph included Pat Michaels, the George C. Marshall Institute and Fred Singer. A paper by Willie Soon and Sallie Baliunas claiming greater medieval warmth was used by the Bush administration chief of staff Philip Cooney to justify altering the first Environmental Protection Agency Report on the Environment. The paper was quickly dismissed by scientists in the Soon and Baliunas controversy, but on July 28, 2003, Republican Jim Inhofe spoke in the Senate, citing it to claim "that man-made global warming is the greatest hoax ever perpetrated on the American people". Later in 2003, a paper by Steve McIntyre and Ross McKitrick disputing the data used in the MBH98 paper was publicised by the George C. Marshall Institute and the Competitive Enterprise Institute. In 2004, Hans von Storch published criticism of the statistical techniques as tending to underplay variations in earlier parts of the graph, though this was disputed and he later accepted that the effect was very small. In 2005, McIntyre and McKitrick published criticisms of the principal component analysis methodology as used in MBH98 and MBH99. Their analysis was subsequently disputed by published papers, including Huybers 2005 and Wahl & Ammann 2007, which pointed to errors in the McIntyre and McKitrick methodology.

In June 2005, Rep. Joe Barton launched what Sherwood Boehlert, chairman of the House Science Committee, called a "misguided and illegitimate investigation" into the data, methods and personal information of Mann, Bradley and Hughes. At Boehlert's request, a panel of scientists convened by the National Research Council reported in 2006, supporting Mann's findings with some qualifications: it agreed that there were some statistical failings, but found these had little effect on the result. Barton and U.S. Rep. Ed Whitfield requested Edward Wegman to set up a team of statisticians to investigate; they supported McIntyre and McKitrick's view that there were statistical failings, although they did not quantify whether there was any significant effect. They also produced an extensive network analysis which has been discredited by expert opinion and found to have issues of plagiarism. Arguments against the MBH studies were reintroduced as part of the Climatic Research Unit email controversy, but were dismissed by eight independent investigations.

More than two dozen reconstructions, using various statistical methods and combinations of proxy records, have supported the broad consensus shown in the original 1998 hockey-stick graph, with variations in how flat the pre-20th-century "shaft" appears. The 2007 IPCC Fourth Assessment Report cited 14 reconstructions, 10 of which covered 1,000 years or longer, to support its strengthened conclusion that it was likely that Northern Hemisphere temperatures during the 20th century were the highest in at least the past 1,300 years. Over a dozen subsequent reconstructions, including Mann et al. 2008 and PAGES 2k Consortium 2013, have supported these general conclusions.
  • 3.5K
  • 21 Oct 2022
Topic Review
National Emissions Standards Act
The National Emissions Standards Act, officially known as the Motor Vehicle Air Pollution Control Act (Pub.L. 89–272), is a 1965 amendment to the U.S. Clean Air Act of 1963. The amendment set the first federal vehicle emissions standards, beginning with the 1968 models. These standards were reductions from the 1963 emission levels: a 72% reduction for hydrocarbons, a 56% reduction for carbon monoxide, and a 100% reduction for crankcase hydrocarbons. Roadway air dispersion models can be used to analyze the impact the regulatory standards will have on future air quality, as well as the potential characteristics of the vehicle fleet. The U.S. Environmental Protection Agency (EPA) is the federal agency responsible for administering the Clean Air Act; its purpose is to ensure that the amount of air pollution emitted stays within the standards set by the U.S. Each state is required to have a state implementation plan (SIP) that clearly indicates how it will enforce the regulations of the Clean Air Act. The states have to create regulations of their own that also adhere to the federal guidelines; in order to do so, they must hold hearings so the public can contribute ideas and provide feedback.
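As a rough illustration of what a roadway air dispersion model computes, the sketch below evaluates a Gaussian infinite-line-source approximation for a road with wind blowing perpendicular to it. The functional form is a textbook simplification, and the emission rate, wind speed, and dispersion coefficients are hypothetical, not values drawn from the Act or from any regulatory model:

```python
import numpy as np

def line_source_concentration(q, u, x, z=0.0, a=0.06, b=0.9):
    """Ground-level Gaussian model for an infinite line source (a roadway)
    with wind perpendicular to the road.

    q : emission rate per unit road length (g/m/s)
    u : wind speed (m/s)
    x : downwind distance from the road (m)
    z : receptor height (m)
    a, b : illustrative coefficients for the vertical spread
           sigma_z = a * x**b (real models fit these to stability class)
    """
    sigma_z = a * np.asarray(x, dtype=float) ** b
    # The factor 2 accounts for reflection of the plume at the ground.
    return (2.0 * q / (np.sqrt(2.0 * np.pi) * sigma_z * u)
            * np.exp(-z**2 / (2.0 * sigma_z**2)))

# Hypothetical scenario: CO emitted at 0.005 g/m/s, 3 m/s crosswind.
distances = np.array([20.0, 50.0, 100.0, 200.0, 500.0])
for d, c in zip(distances, line_source_concentration(0.005, 3.0, distances)):
    print(f"{d:6.0f} m downwind: {c * 1e6:8.1f} ug/m^3")
```

Running a model of this kind over projected fleet sizes and per-vehicle emission factors is how the future air-quality impact of an emissions standard can be estimated.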
  • 288
  • 21 Oct 2022
Topic Review
“Every Earthquake a Precursor According to Scale” Model
The observation that major earthquakes are generally preceded by an increase in the seismicity rate on a timescale of months to decades was embedded in the “Every Earthquake a Precursor According to Scale” (EEPAS) model. EEPAS has since been successfully applied to regional real-world and synthetic earthquake catalogues to forecast future earthquake occurrence rates with time horizons of up to a few decades. When combined with aftershock models, its forecasting performance is improved for short time horizons. As a result, EEPAS has been included as the medium-term component in public earthquake forecasts in New Zealand. EEPAS has been modified to improve its forecasting performance despite data limitations. One modification compensates for missing precursory earthquakes, which can be missing because of the time lag between the end of a catalogue and the time at which a forecast applies, or because of the limited lead time from the start of the catalogue to a target earthquake. An observed space-time trade-off in precursory seismicity, which affects the EEPAS scaling parameters for area and time, can also be used to improve forecasting performance. Systematic analysis of EEPAS performance on synthetic catalogues suggests that regional variations in EEPAS parameters can be explained by regional variations in the long-term earthquake rate.
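The core idea, that each past earthquake contributes a forecast density whose time, magnitude, and spatial scales grow with the magnitude of the precursor, can be sketched in a toy form. The functional shapes below (lognormal in time, normal in magnitude, Gaussian in space) follow the general EEPAS construction, but every parameter value is an illustrative assumption, not the published calibration:

```python
import numpy as np

# Illustrative scaling parameters (assumptions, not published EEPAS values):
A_T, B_T = 1.5, 0.4    # log10 median precursor time (days) vs. precursor magnitude
SIGMA_T = 0.3          # spread of log10 precursor time
A_M, B_M = 1.0, 1.0    # mean target magnitude vs. precursor magnitude
SIGMA_M = 0.3          # spread of target magnitude
A_R, B_R = -1.0, 0.35  # log10 precursory length scale (km) vs. precursor magnitude

def eepas_rate(t, m, xy, catalog):
    """Toy EEPAS-style rate density at time t (days), magnitude m, location xy (km).

    catalog: iterable of (t_i, m_i, (x_i, y_i)) past earthquakes. Each event
    contributes a lognormal density in elapsed time, a normal density in
    magnitude, and a 2-D Gaussian density in space, all scaled by m_i.
    """
    rate = 0.0
    for t_i, m_i, xy_i in catalog:
        dt = t - t_i
        if dt <= 0 or m_i >= m:  # only earlier, smaller events act as precursors here
            continue
        mu_t = A_T + B_T * m_i   # larger precursors imply longer precursor times...
        f_t = (np.exp(-0.5 * ((np.log10(dt) - mu_t) / SIGMA_T) ** 2)
               / (np.sqrt(2 * np.pi) * SIGMA_T * dt * np.log(10)))
        mu_m = A_M + B_M * m_i   # ...larger expected target magnitudes...
        g_m = (np.exp(-0.5 * ((m - mu_m) / SIGMA_M) ** 2)
               / (np.sqrt(2 * np.pi) * SIGMA_M))
        r = 10.0 ** (A_R + B_R * m_i)  # ...and wider precursory areas
        d2 = np.sum((np.asarray(xy) - np.asarray(xy_i)) ** 2)
        h_xy = np.exp(-0.5 * d2 / r ** 2) / (2 * np.pi * r ** 2)
        rate += f_t * g_m * h_xy
    return rate

# Hypothetical two-event catalogue: (time in days, magnitude, (x, y) in km).
catalog = [(0.0, 5.0, (0.0, 0.0)), (200.0, 5.5, (3.0, 2.0))]
print(eepas_rate(t=2000.0, m=6.5, xy=(1.0, 1.0), catalog=catalog))
```

The "according to scale" principle corresponds to the B_T, B_M, and B_R coefficients here: larger precursors stretch the precursor time window, raise the expected target magnitude, and widen the precursory area.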
  • 815
  • 21 Oct 2022