Topic Review
Kübler-Ross Model
The Kübler-Ross model, or the Five Stages of Grief, postulates a series of emotions experienced by terminally ill patients prior to death, or by people who have lost a loved one; the five stages are denial, anger, bargaining, depression, and acceptance. Although commonly referenced in popular media, the existence of these stages has not been empirically demonstrated, and the model is not considered helpful in explaining the grieving process. It is considered to be of historical value but outdated in scientific terms and in clinical practice. The model was first introduced by Swiss-American psychiatrist Elisabeth Kübler-Ross in her 1969 book On Death and Dying, and was inspired by her work with terminally ill patients. Motivated by the lack of instruction in medical schools on the subject of death and dying, Kübler-Ross examined death and those faced with it at the University of Chicago's medical school. Kübler-Ross's project evolved into a series of seminars which, along with patient interviews and previous research, became the foundation for her book. Although Kübler-Ross is commonly credited with creating stage models, earlier bereavement theorists and clinicians such as Erich Lindemann, Colin Murray Parkes, and John Bowlby used similar stage or phase models as early as the 1940s. Later in her life, Kübler-Ross noted that the stages are not a linear and predictable progression and that she regretted writing them in a way that was misunderstood. "Kübler-Ross originally saw these stages as reflecting how people cope with illness and dying," observed grief researcher Kenneth J. Doka, "not as reflections of how people grieve."
  • 2.3K
  • 03 Nov 2022
Topic Review
Prediction of Water Quality Classification using Machine Learning
Machine Learning (ML) has been in use for a long time and has gained wide attention over the last several years. It can handle large amounts of data and can model non-linear structures through complex mathematical computations. However, traditional ML models suffer from problems such as high bias and overfitting. This has driven the advancement and improvement of ML techniques, such as the bagging and boosting approaches, which address these problems.
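As a concrete illustration of the two ensemble approaches the abstract names, here is a minimal sketch (on synthetic, hypothetical data, using scikit-learn) contrasting bagging, which averages trees trained on bootstrap resamples to reduce variance, with boosting, which trains trees sequentially and reweights misclassified samples to reduce bias:

```python
# Bagging vs. boosting on a synthetic classification task (illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Bagging: many trees on bootstrap resamples, combined by voting (lowers variance).
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)

# Boosting: trees trained sequentially, upweighting earlier mistakes (lowers bias).
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {score:.3f}")
```

The dataset, estimator counts, and hyperparameters here are arbitrary choices for illustration, not values from the reviewed work.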
  • 2.3K
  • 21 Jun 2022
Topic Review
Guccifer 2.0
"Guccifer 2.0" is a persona which claimed to be the hacker(s) that hacked into the Democratic National Committee (DNC) computer network and then leaked its documents to the media, the website WikiLeaks, and a conference event. Some of the documents "Guccifer 2.0" released to the media appear to be forgeries cobbled together from public information and previous hacks, which had been mixed with disinformation. According to indictments in February 2018, the persona is operated by Russian military intelligence agency GRU. On July 13, 2018, Special Counsel Robert Mueller indicted 12 GRU agents for allegedly perpetrating the cyberattacks. The U.S. Intelligence Community concluded that some of the genuine leaks from "Guccifer 2.0" were part of a series of cyberattacks on the DNC committed by two Russian military intelligence groups, and that "Guccifer 2.0" is actually a persona created by Russian intelligence services to cover for their interference in the 2016 U.S. presidential election. This conclusion is based on analyses conducted by various private sector cybersecurity individuals and firms, including CrowdStrike, Fidelis Cybersecurity, FireEye's Mandiant, SecureWorks, ThreatConnect, Trend Micro, and the security editor for Ars Technica. The Russian government denies involvement in the theft, and "Guccifer 2.0" denied links to Russia. WikiLeaks founder Julian Assange said multiple parties had access to DNC emails and that there was "no proof" Russia was behind the attack. In March 2018, Special Counsel Robert Mueller took over investigation of Guccifer 2.0 from the FBI while it was reported that forensic determination had found the Guccifer 2.0 persona to be a "particular military intelligence directorate (GRU) officer working out of the agency's headquarters on Grizodubovoy Street in Moscow".
  • 2.3K
  • 10 Oct 2022
Topic Review
Cyclic Quadrilateral
In Euclidean geometry, a cyclic quadrilateral or inscribed quadrilateral is a quadrilateral whose vertices all lie on a single circle. This circle is called the circumcircle or circumscribed circle, and the vertices are said to be concyclic. The center of the circle and its radius are called the circumcenter and the circumradius respectively. Other names for these quadrilaterals are concyclic quadrilateral and chordal quadrilateral, the latter because the sides of the quadrilateral are chords of the circumcircle. Usually the quadrilateral is assumed to be convex, but there are also crossed cyclic quadrilaterals; the formulas and properties given below are valid in the convex case. The word cyclic is from the Ancient Greek κύκλος (kuklos), meaning "circle" or "wheel". All triangles have a circumcircle, but not all quadrilaterals do. An example of a quadrilateral that cannot be cyclic is a non-square rhombus. The characterizations below state the necessary and sufficient conditions a quadrilateral must satisfy to have a circumcircle.
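Two standard characterizations, stated here for the convex case, make the necessary and sufficient conditions concrete:

```latex
% A convex quadrilateral ABCD with interior angles
% \alpha, \beta, \gamma, \delta is cyclic if and only if
% its opposite angles are supplementary:
\[
\alpha + \gamma \;=\; \beta + \delta \;=\; 180^{\circ}.
\]
% Equivalently (Ptolemy's theorem), with side lengths a, b, c, d
% taken in order and diagonals e, f, the quadrilateral is cyclic
% if and only if the product of the diagonals equals the sum of
% the products of opposite sides:
\[
ef \;=\; ac + bd.
\]
```

Both conditions fail for a non-square rhombus: its opposite angles are equal rather than supplementary, which is why such a rhombus has no circumcircle.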
  • 2.3K
  • 27 Oct 2022
Topic Review
Markov Chain Applications to Education
The theory of Markov chains combines Linear Algebra and Probability theory, offering ideal conditions for modelling situations that depend on random variables. Markov chains have found important applications in many sectors of human activity. In this work, a finite Markov chain is introduced that mathematically represents a teaching process based on the constructivist view of learning. Interesting conclusions are derived, and a measure of teaching effectiveness is obtained. An example on teaching the derivative to first-year university students illustrates the results.
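The basic machinery behind such a model can be sketched in a few lines. The state names and transition probabilities below are hypothetical, chosen only to show how a finite Markov chain's long-run (stationary) distribution is computed; they are not taken from the reviewed work:

```python
# A finite Markov chain over three hypothetical stages of a learning process.
import numpy as np

# Row i gives the probabilities of moving from state i to each state.
# States: 0 = orientation, 1 = exploration, 2 = formalization.
P = np.array([
    [0.2, 0.6, 0.2],
    [0.1, 0.5, 0.4],
    [0.0, 0.3, 0.7],
])

# The stationary distribution pi satisfies pi P = pi with sum(pi) = 1.
# For an irreducible, aperiodic chain, iterating from any start converges to it.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    pi = pi @ P

print(pi)  # long-run fraction of time spent in each state
```

The stationary distribution is one natural source of summary measures for such a model, since it describes where the process settles regardless of its starting state.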
  • 2.3K
  • 03 Dec 2020
Topic Review
Cybersecurity
Cyberspace has become an indispensable factor in all areas of the modern world. The world is becoming more and more dependent on the internet for everyday living, and this increasing dependency has also widened the risks of malicious threats. On account of growing cybersecurity risks, cybersecurity has become the most pivotal element in the cyber world for battling cyber threats, attacks, and fraud. The expanding cyberspace is highly exposed to the intensifying possibility of attack by interminable cyber threats. The objective of this survey is to provide a brief review of different machine learning (ML) techniques and the developments made in detection methods for potential cybersecurity risks. These detection methods mainly comprise fraud detection, intrusion detection, spam detection, and malware detection. In this review paper, we build upon the existing literature on applications of ML models in cybersecurity and provide a comprehensive review of ML techniques in cybersecurity. To the best of our knowledge, we have made the first attempt to compare the time complexity of commonly used ML models in cybersecurity. We have comprehensively compared each classifier's performance based on frequently used datasets and sub-domains of cyber threats. This work also provides a brief introduction to machine learning models and commonly used security datasets. Despite all this precedence, cybersecurity has its constraints, compromises, and challenges. This work also expounds on the major current challenges and limitations faced when applying machine learning techniques in cybersecurity.
  • 2.3K
  • 10 Feb 2021
Topic Review
Agent-Based Programming
Intelligent and autonomous agents form a subarea of symbolic artificial intelligence in which agents decide, either reactively or proactively, upon a course of action by reasoning about the information available about the world (including the environment, the agent itself, and other agents). It encompasses a multitude of techniques, such as negotiation protocols, agent simulation, multi-agent argumentation, multi-agent planning, and many others. In an agent-based programming language, agents are the building blocks, and programs are obtained by programming their behaviours (how an agent reasons), their goals (what an agent aims to achieve), and their interoperation (how agents collaborate to solve a task).
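A minimal, hypothetical sketch can make the three building blocks concrete: an agent holds beliefs about the world (updated reactively from percepts), holds goals, and deliberates proactively over them. The class and method names below are illustrative only, not part of any particular agent-based language:

```python
# Toy agent: beliefs (world knowledge), goals, and a deliberation rule.
class Agent:
    def __init__(self, name, goals):
        self.name = name
        self.beliefs = {}          # what the agent currently holds true
        self.goals = list(goals)   # what the agent aims to achieve

    def perceive(self, percept):
        # Reactive behaviour: fold new information into the belief base.
        self.beliefs.update(percept)

    def deliberate(self):
        # Proactive behaviour: pursue the first goal not yet believed achieved.
        for goal in self.goals:
            if not self.beliefs.get(goal, False):
                return f"pursue:{goal}"
        return "idle"

agent = Agent("a1", goals=["door_open"])
agent.perceive({"door_open": False})
print(agent.deliberate())  # → pursue:door_open
agent.perceive({"door_open": True})
print(agent.deliberate())  # → idle
```

Real agent-based languages add richer constructs on top of this skeleton (plans, events, speech-act messaging between agents), but the behaviour/goal split is the common core.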
  • 2.3K
  • 08 Mar 2021
Topic Review
Forensic Statistics
Forensic statistics is the application of probability models and statistical techniques to scientific evidence, such as DNA evidence, and the law. In contrast to "everyday" statistics, to avoid bias or unwarranted conclusions, forensic statisticians report evidential strength as likelihood ratios (LR). This ratio of probabilities is then used by juries or judges to draw inferences or conclusions and decide legal matters. Jurors and judges rely on the strength of a DNA match, given by statistics, to draw conclusions and determine guilt or innocence in legal matters. In forensic science, the DNA evidence received for DNA profiling often contains a mixture of more than one person's DNA. DNA profiles are generated using a set procedure; however, the interpretation of a DNA profile becomes more complicated when the sample contains a mixture of DNA. Regardless of the number of contributors to the forensic sample, statistics and probabilities must be used to give weight to the evidence and to describe what the results of the DNA evidence mean. In a single-source DNA profile, the statistic used is termed a random match probability (RMP). RMPs can also be used in certain situations to describe the results of the interpretation of a DNA mixture. Other statistical tools to describe DNA mixture profiles include likelihood ratios (LR) and the combined probability of inclusion (CPI), also known as random man not excluded (RMNE). Computer programs implementing forensic DNA statistics have been used to assess the biological relationships between two or more people. Forensic science uses several approaches for DNA statistics in such programs, including match probability, exclusion probability, likelihood ratios, Bayesian approaches, and paternity and kinship testing. Although the precise origin of the term remains unclear, it is apparent that it was in use in the 1980s and 1990s. Among the first forensic statistics conferences were two held in 1991 and 1993.
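The relationship between the two single-source statistics named above can be sketched numerically. Under the standard product rule, the RMP for a profile is the product of the (independent) genotype frequencies at each locus, and the corresponding single-source LR is its reciprocal. The frequencies below are hypothetical, for illustration only:

```python
# Random match probability (RMP) via the product rule, and the implied
# single-source likelihood ratio (LR). Frequencies are hypothetical.
locus_freqs = [0.08, 0.05, 0.11]  # genotype frequency at each of 3 loci

# RMP: probability a random, unrelated person shares the full profile,
# assuming independence across loci.
rmp = 1.0
for f in locus_freqs:
    rmp *= f

# LR = P(evidence | suspect is the source) / P(evidence | random person)
# = 1 / RMP for a matching single-source profile.
lr = 1.0 / rmp

print(f"RMP = {rmp:.6f}")   # 0.000440
print(f"LR  = {lr:,.0f}")   # ~2,273: the match is ~2,273 times more likely
                            # if the suspect is the source
```

Mixture statistics such as CPI/RMNE and mixture LRs are substantially more involved, since they must sum over the possible genotype combinations of multiple contributors.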
  • 2.3K
  • 12 Oct 2022
Topic Review
List of International Mathematical Olympiad Participants
The International Mathematical Olympiad (IMO) is an annual international high school mathematics competition focused primarily on pre-collegiate mathematics, and is the oldest of the international science olympiads. The awards for exceptional performance include medals for roughly the top half of participants, and honorable mentions for participants who solve at least one problem perfectly. This is a list of participants who have achieved notability, including participants who went on to become notable mathematicians, participants who won medals at an exceptionally young age, and participants who scored highly.
  • 2.3K
  • 26 Oct 2022
Topic Review
Videos Data Augmentation for Deep Learning Models
In most Computer Vision applications, Deep Learning models achieve state-of-the-art performance. One drawback of Deep Learning is the large amount of data needed to train the models; unfortunately, in many applications, data are difficult or expensive to collect. Data augmentation can alleviate the problem by generating new data from a smaller initial dataset. Geometric and color-space image augmentation methods can increase the accuracy of Deep Learning models but are often not enough, and more advanced solutions include Domain Randomization methods or the use of simulation to artificially generate the missing data. Data augmentation algorithms are usually designed specifically for single images. Most recently, Deep Learning models have been applied to the analysis of video sequences.
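The geometric and color-space methods mentioned above can be sketched for the video case, where the key difference from single-image augmentation is that the same transform must be applied consistently to every frame of a clip. The function and parameters below are hypothetical, illustrative only:

```python
# Frame-consistent augmentation of a video clip stored as a NumPy array
# of shape (frames, height, width, channels).
import numpy as np

def augment_clip(clip, flip=True, brightness=1.2):
    out = clip.astype(np.float32)
    if flip:
        # Geometric: horizontal flip, applied identically to all frames.
        out = out[:, :, ::-1, :]
    # Color-space: brightness scaling, clipped back to the valid pixel range.
    out = np.clip(out * brightness, 0, 255)
    return out.astype(np.uint8)

rng = np.random.default_rng(0)
clip = rng.integers(0, 256, size=(8, 32, 32, 3), dtype=np.uint8)  # fake 8-frame clip
aug = augment_clip(clip)
print(clip.shape, aug.shape)  # shapes match: (8, 32, 32, 3) (8, 32, 32, 3)
```

Applying one sampled transform to the whole clip, rather than resampling per frame, preserves temporal consistency, which is what distinguishes video augmentation from naively reusing single-image pipelines.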
  • 2.3K
  • 25 Mar 2022