Topic Review
CCR Model (DEA)
The CCR model is the first Data Envelopment Analysis (DEA) model, developed by Charnes, Cooper and Rhodes (1978) under the assumption of a constant returns to scale production technology, i.e., one in which an increase in production resources results in a proportional increase in output.
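A minimal sketch of the input-oriented CCR envelopment program (min θ subject to Xλ ≤ θx_k, Yλ ≥ y_k, λ ≥ 0), assuming scipy as the solver; the three-DMU input/output data are hypothetical and for illustration only.

```python
# Input-oriented CCR (constant returns to scale) efficiency via linear
# programming. A minimal sketch with hypothetical data, not a reference
# implementation of the original 1978 ratio formulation.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0],   # inputs:  rows = inputs,  cols = DMUs
              [3.0, 1.0, 4.0]])
Y = np.array([[1.0, 1.0, 2.0]])  # outputs: rows = outputs, cols = DMUs

def ccr_efficiency(k, X, Y):
    """min theta  s.t.  X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)                          # decision vector: [theta, lam]
    c[0] = 1.0                                   # minimise theta
    A_in = np.hstack([-X[:, [k]], X])            # X @ lam - theta * x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y @ lam <= -y_k
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, k]]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

for k in range(X.shape[1]):
    print(f"DMU {k}: CCR efficiency = {ccr_efficiency(k, X, Y):.3f}")
```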
  • 16.3K
  • 30 May 2021
Topic Review
Data Envelopment Analysis (DEA)
Data Envelopment Analysis (DEA) is a non-parametric methodology for measuring the efficiency of Decision Making Units (DMUs) with multiple-input, multiple-output configurations. It is the most commonly used tool for frontier estimation in assessments of productivity and efficiency, applied across all fields of economic activity.
  • 11.4K
  • 28 Jan 2022
Topic Review
Transmission Dynamics of COVID-19
COVID-19 is an emerging infectious disease, a pneumonia caused by a novel coronavirus, with outbreaks in more than 200 countries around the world. Consequently, the spread mechanisms of COVID-19 and its prevention and control measures have become a global problem to be solved. Here, we pose a series of dynamical models to reveal the transmission mechanisms of COVID-19. Based on these mathematical models, data fitting and the spread trend of COVID-19 are explored to show the propagation law between human populations. We hope that our work may provide useful insights for the effective control of COVID-19.
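As a minimal illustration of the kind of dynamical model used in such transmission studies, the sketch below integrates the classic SIR (susceptible-infected-recovered) system; the parameter values are hypothetical and are not fitted to any data from this work.

```python
# Minimal SIR model sketch with hypothetical, unfitted parameters.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    S, I, R = y
    dS = -beta * S * I               # susceptibles becoming infected
    dI = beta * S * I - gamma * I    # net change in infected
    dR = gamma * I                   # recoveries
    return [dS, dI, dR]

beta, gamma = 0.35, 0.10             # hypothetical transmission/recovery rates
y0 = [0.999, 0.001, 0.0]             # initial fractions: S, I, R
t = np.linspace(0, 180, 181)         # days
S, I, R = odeint(sir, y0, t, args=(beta, gamma)).T
print(f"Peak infected fraction {I.max():.3f} on day {t[I.argmax()]:.0f}")
```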
  • 5.1K
  • 28 Oct 2020
Topic Review
Pan-Tompkins Algorithm
The Pan-Tompkins algorithm is commonly used to detect QRS complexes in electrocardiographic (ECG) signals. The QRS complex represents ventricular depolarization and is the main spike visible in an ECG signal. This feature makes it particularly suitable for measuring heart rate, the first way to assess the health state of the heart. In Einthoven's first lead of a physiological heart, the QRS complex is composed of a downward deflection (Q wave), a high upward deflection (R wave), and a final downward deflection (S wave). The Pan-Tompkins algorithm applies a series of filters to highlight the frequency content of this rapid heart depolarization and to remove the background noise. It then squares the signal to amplify the QRS contribution. Finally, it applies adaptive thresholds to detect the peaks of the filtered signal. The algorithm was proposed by Jiapu Pan and Willis J. Tompkins in 1985 in the journal IEEE Transactions on Biomedical Engineering. The performance of the method was tested on an annotated arrhythmia database (MIT/BIH) and was also evaluated in the presence of noise. Pan and Tompkins reported that 99.3 percent of QRS complexes were correctly detected.
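A minimal sketch of the algorithm's filtering pipeline, assuming an ECG array `ecg` sampled at `fs` Hz; the fixed threshold and the 150 ms integration window below are simplifications, as the original paper uses adaptive dual thresholds.

```python
# Pan-Tompkins-style QRS detection sketch: band-pass, derivative, squaring,
# moving-window integration, peak picking. Simplified thresholding; not the
# paper's adaptive-threshold implementation.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def qrs_peaks(ecg, fs):
    # 1. Band-pass around 5-15 Hz to emphasise QRS energy over noise.
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # 2. Differentiate to highlight the steep QRS slopes.
    diff = np.diff(filtered)
    # 3. Square to rectify and amplify the QRS contribution.
    squared = diff ** 2
    # 4. Moving-window integration (~150 ms) merges each QRS into one lobe.
    win = int(0.15 * fs)
    integrated = np.convolve(squared, np.ones(win) / win, mode="same")
    # 5. Fixed threshold plus a 200 ms refractory period (simplified).
    peaks, _ = find_peaks(integrated,
                          height=0.5 * integrated.max(),
                          distance=int(0.2 * fs))
    return peaks
```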
  • 4.3K
  • 24 Oct 2022
Topic Review
Satellite-Based Active Fire Detection
Detection of an active wildfire in a satellite image scene relies on an accurate estimation of the background temperature of the scene, which is compared to the observed temperature to decide on the presence of fire. The expected background temperature of a pixel is commonly derived from spatial-contextual information; multi-temporal and multi-spectral information have also been exploited. This review discusses different approaches to estimating the background temperature and highlights the potential of multi-temporal estimation for early fire detection and real-time fire monitoring. The perspectives of a proposed multi-temporal approach are also outlined.
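A minimal sketch of the spatial-contextual idea, assuming a 2-D brightness-temperature array in kelvin; the window size, the k-sigma rule, and the absolute floor are illustrative assumptions, not the parameters of any operational product.

```python
# Contextual fire-flag sketch: a pixel is flagged when it is much hotter
# than its local background. All thresholds are illustrative assumptions.
import numpy as np

def contextual_fire_mask(bt, win=10, k=3.0, abs_min=320.0):
    """bt: 2-D brightness temperature (K); returns a boolean fire mask."""
    rows, cols = bt.shape
    mask = np.zeros(bt.shape, dtype=bool)
    for i in range(rows):
        for j in range(cols):
            r0, r1 = max(0, i - win), min(rows, i + win + 1)
            c0, c1 = max(0, j - win), min(cols, j + win + 1)
            window = bt[r0:r1, c0:c1]        # local background estimate
            # Flag if the pixel exceeds the background mean by k sigma
            # and a hypothetical absolute floor.
            mask[i, j] = bt[i, j] > max(window.mean() + k * window.std(),
                                        abs_min)
    return mask
```

Operational contextual algorithms additionally exclude candidate fire pixels from the background window; the sketch keeps them in for brevity.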
  • 3.5K
  • 06 Dec 2020
Topic Review
Deep Reinforcement Learning in Economics
The popularity of deep reinforcement learning (DRL) applications in economics has increased exponentially. DRL, through a wide range of capabilities from reinforcement learning (RL) to deep learning (DL), offers vast opportunities for handling sophisticated, dynamic economic systems. DRL is characterized by scalability, with the potential to be applied to high-dimensional problems in conjunction with noisy and nonlinear patterns of economic data. In this paper, we first present a brief review of DL, RL, and deep RL methods in diverse economic applications, providing an in-depth insight into the state of the art. Furthermore, the architecture of DRL applied to economic applications is investigated in order to highlight complexity, robustness, accuracy, performance, computational tasks, risk constraints, and profitability. The survey results indicate that DRL can provide better performance and higher efficiency than traditional algorithms when facing real economic problems in the presence of risk parameters and ever-increasing uncertainties.
  • 3.4K
  • 08 Apr 2021
Topic Review
Incompressible Navier Stokes Equations
A problem closely related to the Clay Mathematics Institute "Navier-Stokes" problem, the breakdown of smooth solutions on an arbitrary cube subset of three-dimensional space with periodic boundary conditions, is examined. The incompressible Navier-Stokes equations are presented here in a new and conventionally different way, by naturally reducing them to an operator form which is then further analyzed. It is shown that a reduction to a general 2D Navier-Stokes system decoupled from a 1D non-linear partial differential equation can be obtained. This is executed using integration over n-dimensional compact intervals, which allows the decoupling. Here we extract the measure-zero points in the domain where singularities may occur and are left with a PDE that exhibits a finite-time singularity. The operator form is considered in a physical, geometric vorticity case, and in a more general case. In the general case, the equations are revealed to have smooth solutions which exhibit finite-time blowup on a fine measure-zero set, and using the Gagliardo-Nirenberg inequalities it is shown that for any non-zero-measure set in the form of a cube subset of 3D space there is finite-time blowup.
  • 2.5K
  • 10 Dec 2020
Topic Review
Mathematical Economics
Mathematical economics is the application of mathematical methods to represent theories and analyze problems in economics. Often, these applied methods are beyond simple geometry, and may include differential and integral calculus, difference and differential equations, matrix algebra, mathematical programming, or other computational methods. Proponents of this approach claim that it allows the formulation of theoretical relationships with rigor, generality, and simplicity. Mathematics allows economists to form meaningful, testable propositions about wide-ranging and complex subjects which could less easily be expressed informally. Further, the language of mathematics allows economists to make specific, positive claims about controversial or contentious subjects that would be impossible without mathematics. Much of economic theory is currently presented in terms of mathematical economic models, a set of stylized and simplified mathematical relationships asserted to clarify assumptions and implications. Formal economic modeling began in the 19th century with the use of differential calculus to represent and explain economic behavior, such as utility maximization, an early economic application of mathematical optimization. Economics became more mathematical as a discipline throughout the first half of the 20th century, but the introduction of new and generalized techniques in the period around the Second World War, as in game theory, would greatly broaden the use of mathematical formulations in economics. This rapid systematizing of economics alarmed critics of the discipline as well as some noted economists. John Maynard Keynes, Robert Heilbroner, Friedrich Hayek and others have criticized the broad use of mathematical models for human behavior, arguing that some human choices are irreducible to mathematics.
  • 2.2K
  • 31 Oct 2022
Topic Review
Maze Solving Algorithm
There are a number of different maze solving algorithms, that is, automated methods for the solving of mazes. The random mouse, wall follower, Pledge, and Trémaux's algorithms are designed to be used inside the maze by a traveler with no prior knowledge of the maze, whereas the dead-end filling and shortest path algorithms are designed to be used by a person or computer program that can see the whole maze at once. Mazes containing no loops are known as "simply connected", or "perfect" mazes, and are equivalent to a tree in graph theory. Thus many maze solving algorithms are closely related to graph theory. Intuitively, if one pulled and stretched out the paths in the maze in the proper way, the result could be made to resemble a tree.
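As an example of the second category, where the whole maze is visible at once, the sketch below computes a shortest path by breadth-first search; the grid encoding (0 = open cell, 1 = wall) is an assumption for illustration.

```python
# Shortest-path maze solving by breadth-first search on a grid maze.
from collections import deque

def bfs_shortest_path(maze, start, goal):
    rows, cols = len(maze), len(maze[0])
    prev = {start: None}                 # visited set + back-pointers
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:               # walk the back-pointers to the start
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and maze[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None                          # goal unreachable

maze = [[0, 1, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0]]
print(bfs_shortest_path(maze, (0, 0), (3, 3)))
```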
  • 2.0K
  • 08 Nov 2022
Topic Review
Sustainable Digital Innovation
Digital innovation refers to a product, process or business model that is new or requires significant changes, and that is enabled by IT. Sustainable digital innovation supports the digitalization of the economy in a green, long-lasting and organic way, and thus serves the need for a sustainable future. Regular digital innovation addresses performance, cost, technology and attractiveness to customers and business, while sustainable digital innovation also addresses environmental and social factors. Sustainable digital innovation tries to create value for all the stakeholders involved in the production and distribution process; it is inspired by nature, not only by technology, reduces resource waste, and targets societal goals, not only commercial and business goals.
  • 1.9K
  • 17 Jun 2021
Topic Review
Multiple Traveling Salesperson Problems
Multiple traveling salesperson problems (mTSP) are a collection of problems that generalize the classical traveling salesperson problem (TSP). In a nutshell, an mTSP variant seeks a minimum-cost collection of m paths that visit all vertices of a given weighted complete graph. Conceptually, mTSP lies between TSP and vehicle routing problems (VRP).
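A minimal greedy sketch of the objective, assuming all m paths start from a shared depot vertex (one common mTSP variant); this nearest-neighbour heuristic only illustrates the problem and is far from an optimal solver.

```python
# Greedy mTSP sketch: m salespeople take turns extending their path with
# the nearest unvisited vertex. Illustrative heuristic, not exact.
import math

def greedy_mtsp(points, m, depot=0):
    unvisited = set(range(len(points))) - {depot}
    paths = [[depot] for _ in range(m)]
    while unvisited:
        for path in paths:
            if not unvisited:
                break
            # Extend this path with the closest remaining vertex.
            nxt = min(unvisited,
                      key=lambda v: math.dist(points[path[-1]], points[v]))
            path.append(nxt)
            unvisited.remove(nxt)
    return paths

points = [(0, 0), (1, 2), (3, 1), (4, 4), (2, 5), (5, 0)]
print(greedy_mtsp(points, m=2))   # two paths covering all vertices, sharing the depot
```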
  • 1.6K
  • 20 Jul 2023
Topic Review
Mathematical Modeling to Estimate Photosynthesis
Photosynthesis is a process that indicates the productivity of crops. The estimation of this variable can be achieved through methods based on mathematical models.
  • 1.3K
  • 22 Jun 2022
Topic Review
Journal Axioms
Axioms (ISSN 2075-1680) is an international, peer-reviewed, open access journal of mathematics, mathematical logic and mathematical physics, published quarterly online by MDPI. It is indexed in SCIE (Web of Science), Scopus, dblp, and other databases.
  • 1.3K
  • 26 Sep 2021
Topic Review
Random Number Generation
Since antiquity, random number generation has played an important role both in common everyday activities, such as leisure games, and in the advancement of science. Means such as dice and coins have been employed since ancient times to generate random numbers used for gambling, dispute resolution, leisure games, and perhaps even fortune-telling. The theory behind the generation of random numbers, as well as the ability to potentially predict the outcome of the process, has been heavily studied and exploited in mathematics, in attempts either to ensure the randomness of the process, to gain an advantage in correctly predicting its future outcomes, or to approximate the results of rather complicated computations. In cryptography especially, random numbers are used because of these properties, so that attackers have no option but to guess the secret. This fact, in conjunction with the ongoing digitalisation of our world, has led to an interest in random number generation within the framework of computer science. In this context, random number generation systems are classified into two main categories: pseudorandom number generators and true random number generators. The former generate sequences of numbers that appear to be random but are in fact completely predictable when the initial value (referred to as the seed) and the conditions of the generation process are known; the latter generate truly random sequences of numbers that can be correctly predicted only with negligible probability, even if the initial value and conditions are known.
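The two categories can be illustrated with Python's standard library: a seeded pseudorandom generator reproduces its output exactly, whereas the `secrets` module draws on the operating system's entropy pool (a practical stand-in for true randomness, though strictly it is a cryptographically secure generator seeded from hardware events).

```python
# Pseudorandom vs. OS-entropy randomness: a seeded PRNG is fully
# reproducible; secrets.randbelow() is not.
import random
import secrets

random.seed(42)
run1 = [random.randint(0, 9) for _ in range(5)]
random.seed(42)
run2 = [random.randint(0, 9) for _ in range(5)]
print(run1 == run2)            # True: same seed, identical "random" stream

print(secrets.randbelow(10))   # unpredictable: backed by OS entropy
```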
  • 1.2K
  • 24 Mar 2023
Topic Review
Expenditures and Oscillation Analysis
The main purpose is to identify, among the variables that constitute water resource consumption at public schools, the link between consumption and expenditure oscillations. A theoretical model was obtained of how oscillation patterns originate and of the important role time lengths play in the ergodicity and non-ergodicity of expenditure oscillations.
  • 1.2K
  • 01 Nov 2020
Topic Review
Conditional Frontier Analysis (DEA)
Conditional frontier analysis is part of the family of nonparametric robust estimators proposed to overcome some drawbacks of the traditional Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH) measures of technical efficiency. In particular, this methodology extends the nonparametric input/output production technology to account robustly for extreme values or outliers in the data, and allows measuring the effect of external environmental variables on the efficiency of Decision Making Units (DMUs).
  • 1.1K
  • 01 May 2021
Topic Review
Convolutional Neural Network
Convolutional neural network (CNN)-based deep learning (DL) has a wide variety of applications in the geospatial and remote sensing (RS) sciences, and consequently has been a focus of many recent studies. 
  • 1.0K
  • 29 Jan 2022
Topic Review Video Peer Reviewed
Geometry-Based Deep Learning in the Natural Sciences
Nature is composed of elements at various spatial scales, ranging from the atomic to the astronomical level. In general, human sensory experience is limited to the mid-range of these spatial scales, in that the scales representing the world of the very small or the very large lie generally apart from our sensory experience. Furthermore, the complexities of Nature and its underlying elements are neither tractable nor easily recognized by the traditional forms of human reasoning. Instead, the natural and mathematical sciences have emerged to model the complexities of Nature, leading to knowledge of the physical world. This level of predictiveness far exceeds any mere visual representation as naively formed in the Mind. In particular, geometry has served an outsized role in the mathematical representations of Nature, such as in the explanation of the movement of planets across the night sky. Geometry not only provides a framework for knowledge of the myriad of natural processes, but also a mechanism for the theoretical understanding of those natural processes not yet observed, leading to visualization, abstraction, and models with insight and explanatory power. Without these tools, human experience would be limited to sensory feedback, which reflects a very small fraction of the properties of objects that exist in the natural world. As a consequence, as taught during the times of antiquity, geometry is essential for forming knowledge and differentiating opinion from true belief. It provides a framework for understanding not only astronomy, classical mechanics, and relativistic physics, but also the morphological evolution of living organisms, along with the complexities of cognitive systems. Geometry also has a role in the information sciences, where it has explanatory power in visualizing the flow, structure, and organization of information in a system. This role further impacts the explanations of the internals of deep learning systems as developed in the fields of computer science and engineering.
  • 979
  • 21 Jun 2023
Topic Review
Complexity of Needs Model (DEA)
Data Envelopment Analysis (DEA) is a powerful non-parametric engineering tool for estimating technical efficiency and the production capacity of service units. The Complex-of-Needs Allocation Model proposed by Nepomuceno et al. (2020) is a two-step methodology for prioritizing hospital bed vacancy and reallocation during the COVID-19 pandemic. The framework determines the production capacity of hospitals through Data Envelopment Analysis and incorporates the complexity of needs in two categories for the reallocation of beds across medical specialties. As a result, we obtain a set of inefficient health-care units presenting less complex bed slacks to be reduced, i.e., to be allocated to patients presenting more severe conditions.
  • 964
  • 08 Apr 2021
Topic Review
COVID-19 Pandemic Prediction
Several epidemiological models are being used around the world to project the number of infected individuals and the mortality rates of the COVID-19 outbreak. Advancing accurate prediction models is of utmost importance for taking proper actions. Owing to the lack of essential data and to uncertainty, epidemiological models have been challenged in delivering high accuracy for long-term prediction. As an alternative to susceptible-infected-resistant (SIR)-based models, this study proposes a hybrid machine learning approach to predict COVID-19, and we exemplify its potential using data from Hungary. The hybrid machine learning methods of the adaptive network-based fuzzy inference system (ANFIS) and the multi-layered perceptron-imperialist competitive algorithm (MLP-ICA) are proposed to predict the time series of infected individuals and the mortality rate. The models predict that by late May the outbreak and the total mortality will drop substantially. The validation is performed for 9 days with promising results, which confirms the model's accuracy. The model is expected to maintain its accuracy as long as no significant interruption occurs. This paper provides an initial benchmark to demonstrate the potential of machine learning for future research.
  • 899
  • 02 Feb 2021