1. Introduction
In real estate markets, accuracy in property valuations is a fundamental element for making informed decisions and effective investment strategies. The complex dynamics that characterize real estate markets, together with the high differentiation of properties, make the adoption of advanced approaches crucial to obtaining accurate valuations [1,2].
However, this is countered by a frequent scarcity of real estate data and by the opaqueness of the related markets, problems found in various territorial contexts. The causes of these phenomena can be traced back to a series of factors, including resistance to change in the real estate sector, the lack of standardization in registration practices, the absence of regulatory requirements mandating complete disclosure of information, the limitations or incompleteness of public data, and the reticence of private individuals in disclosing transaction prices. These factors generate information asymmetries whose negative impacts are evident in the understanding of real estate markets, in property valuations, and in investment decisions within the real estate sector [2].
In this framework, the Principle of Maximum Entropy emerges as a powerful tool, offering a new paradigm to address the challenges of real estate valuations.
Entropy is a fundamental concept in information theory and is closely associated with the idea of measuring uncertainty or randomness in a system [3]. The Maximum Entropy Principle proposes to select the probability distribution that reflects the maximum uncertainty, given a set of observed constraints. In other words, it involves choosing a distribution that is as neutral as possible with respect to the known information. Applying this principle to the field of real estate valuations entails balancing the complexity and variety of data, allowing statistical models to adapt naturally, guided by the maximum possible entropy [4].
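In compact form, and using generic constraint functions f_k as placeholders for whatever averages are actually observed (a standard restatement of the principle, not a formulation taken verbatim from the cited works), the selection problem can be written as:

$$
\max_{p_1,\dots,p_n}\; H(p) = -\sum_{i=1}^{n} p_i \ln p_i
\quad \text{subject to} \quad
\sum_{i=1}^{n} p_i = 1, \qquad
\sum_{i=1}^{n} p_i f_k(x_i) = F_k,\; k = 1,\dots,m.
$$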
The Maximum Entropy approach avoids assuming additional information not supported by the observed data, providing valuations that are, by definition, the result of an inference process based on maximum uncertainty. In the real estate field, this approach makes it possible to flexibly integrate different sources of information, reflecting the variety of variables that can influence the value of a property.
When dealing with constrained optimization problems, as in the general formulation of the Maximum Entropy Principle, Lagrange multipliers are often used to incorporate the constraints into the objective function. The goal is to find the maximum of the objective function subject to the given constraints. Thus, the integration of the Maximum Entropy Principle with Lagrange multipliers enables the handling of constraints in the appraisal of probabilities. This approach makes it possible to find the probability distribution that maximizes the entropy given the constrained knowledge of the system, ensuring consistency with the available information [5,6].
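The mechanics are the standard textbook ones, sketched here for completeness rather than as the specific derivation of the cited papers: introducing a multiplier for each constraint gives the Lagrangian

$$
\mathcal{L}(p,\lambda) = -\sum_{i} p_i \ln p_i
+ \lambda_0\Big(1-\sum_{i} p_i\Big)
+ \sum_{k=1}^{m} \lambda_k\Big(F_k-\sum_{i} p_i f_k(x_i)\Big),
$$

and setting \(\partial\mathcal{L}/\partial p_i = 0\) yields the exponential-family solution

$$
p_i = \frac{\exp\!\big(-\sum_{k}\lambda_k f_k(x_i)\big)}{\sum_{j}\exp\!\big(-\sum_{k}\lambda_k f_k(x_j)\big)},
$$

where the normalization absorbs \(\lambda_0\) and the remaining multipliers \(\lambda_k\) are chosen so that the observed constraints are satisfied.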
From a logical point of view, the proposed methodological approach, not unlike other procedures, leads to determining the market value or income of a property through a comparison with the prices of properties that have characteristics similar to the one being estimated. A prerequisite is that the comparative real estate data refer to transactions that occurred recently relative to the time of the valuation. While it is logical to assume that a greater number of comparison data leads to a better estimate, the conducted experimentation considers a small sample of real estate sales sufficient to arrive at a reliable estimated value. In this respect, the approach effectively addresses the challenge posed by the scarcity of data that characterizes real estate markets. The method estimates the value of a property by comparing its characteristics with those of comparable properties, in accordance with the “similia similibus aestimentur” criterion [7].
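As a purely illustrative sketch of this comparison logic (the data, the choice of characteristics, and the numerical solver below are hypothetical and are not taken from the cited studies), the following Python fragment chooses maximum-entropy weights for a small set of comparable sales, constrained so that the weighted characteristics match those of the subject property, and then derives a value estimate:

```python
# Hypothetical example: maximum-entropy weighting of a small set of comparable sales.
# Data and solver choices are illustrative only, not the authors' exact model.
import numpy as np
from scipy.optimize import minimize

# Comparable sales: [floor area (m^2), number of rooms] and observed prices (EUR)
X = np.array([[90.0, 3.0], [105.0, 4.0], [120.0, 5.0], [85.0, 4.0]])
prices = np.array([175_000.0, 210_000.0, 255_000.0, 172_000.0])

# Characteristics of the subject property to be valued
x_subject = np.array([100.0, 4.0])
n = len(prices)

def neg_entropy(w, eps=1e-12):
    """Negative Shannon entropy of the weights (minimized, so entropy is maximized)."""
    return float(np.sum(w * np.log(w + eps)))

constraints = [
    {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},    # weights form a probability distribution
    {"type": "eq", "fun": lambda w: w @ X - x_subject},  # weighted characteristics match the subject
]

res = minimize(
    neg_entropy,
    x0=np.full(n, 1.0 / n),        # start from the uniform (maximum-entropy) weights
    bounds=[(0.0, 1.0)] * n,
    constraints=constraints,
    method="SLSQP",
)

weights = res.x
estimated_value = float(weights @ prices)
print("weights:", np.round(weights, 3))
print("estimated market value (EUR):", round(estimated_value))
```

The entropy term keeps the weights as uniform as the constraints allow, so no single comparable dominates the estimate unless the data force it to; this is one simple way the principle can be operationalized with very few observations.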
2. Real Estate Valuations with Small Dataset
The word “entropy” first appeared in 1864 in a thermodynamics treatise by Rudolf Clausius, where it represents a state function that quantifies the unavailability of a system to produce work (in variational form, it is equal to the ratio between the amount of heat absorbed or released reversibly and isothermally by the system and the temperature at which the exchange takes place). In accordance with its original definition, entropy therefore indicates which processes can occur spontaneously: the evolution of a system always proceeds in the direction of increasing entropy [8].
In 1870, with the development of statistical mechanics, J.W. Gibbs gave a new meaning to entropy, linked to the possible molecular arrangements of a system of particles. The Gibbs entropy (S) is defined as [9]:

$$
S = -k_B \sum_{i} p_i \ln p_i \qquad (1)
$$

where k_B is the Boltzmann constant and p_i is the probability that the system is in the i-th microstate. Maximizing the entropy function (S), the system reaches its equilibrium state. Equation (1) can be regarded as the fundamental definition of entropy, as all other expressions of S can be derived from (1) but not vice versa.
Subsequently, Boltzmann reworked Gibbs’s concept, defining entropy as the measure of the number of possible microstates of a system, given its macroscopic thermodynamic properties [10].
In 1948, Shannon introduced the concept of information entropy, demonstrating how it was possible to quantify the information contained in a message emitted by a source. He completely disregarded the semantic content of the message, considering the quantity of information solely in probabilistic terms. The information is quantified through a function that measures the uncertainty of X, namely entropy, defined as [3]:

$$
H(X) = -K \sum_{i=1}^{n} p_i \log p_i \qquad (2)
$$

where K is a positive and arbitrary constant that depends on the logarithmic base, and (p_1, …, p_n) are the probabilities of a set of possible events. In this case, entropy measures the amount of uncertainty or information present in a random signal.
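For concreteness, a minimal Python sketch of this definition (with K = 1 and natural logarithms, an arbitrary but common convention):

```python
import numpy as np

def shannon_entropy(p, K=1.0):
    """Shannon entropy H = -K * sum(p_i * log(p_i)), using the 0 * log 0 = 0 convention."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability events
    return -K * float(np.sum(p * np.log(p)))

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # uniform: maximum uncertainty, ln(4) ~ 1.386
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # concentrated: low uncertainty, ~ 0.168
```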
Starting in 1957, Jaynes dedicated himself to demonstrating the connection between the physical concept of entropy and that of information theory, developing the Principle of Maximum Entropy. Through this principle, Jaynes showed how it was possible to determine probability distributions of a configuration from partial information. The basic idea is to leverage the available information and impose that the sought distribution is the one that maximizes Shannon’s entropy, as a measure of uncertainty and information quantity [4].
Over time, the concept of entropy has been employed in applications across various scientific domains [11], including econometrics [12,13], decision-making under uncertainty [14,15], market behaviors [16], the performance of stakeholders connected to informational advantages [17,18], investments, asset and portfolio returns, financial time-series forecasting, and valuation of manufacturing yields [19,20,21].
From these studies, it is inferred that the reduction of entropy can represent a concept of fundamental importance in the economic domain. Indeed, low entropy can govern economic values [22,23] or measure the scarcity and value of economic goods [24]. Similarly, the economic value of a good, incorporating complex, indeterminate, and anthropic features, derives from the law of entropy [25].
International studies specifically focused on the application of the concept of entropy in the real estate field are very limited.
Brown [26] first investigated the effectiveness of entropy in explaining the inefficiency of the real estate market, followed by Chen et al. [27].
The paper by Ge and Du (2007) investigates the main variables that influence residential property values in the Auckland property market (New Zealand) and ranks the variables using the Entropy method [28].
Lam et al. proposed in 2008 a mathematical model for predicting housing prices in Hong Kong based on the integration of entropy and artificial neural networks [29]. Subsequently, in 2009, the same authors combined artificial neural networks with support vector machines to enhance the accuracy of real estate assessments in Hong Kong and mainland China. The identification of key real estate variables that could influence property prices was addressed through an entropy-based rating and weighting method aimed at providing objective and reasonable weights [30].
In 2009, Zhou et al. dealt with a complex multi-objective decision-making problem in the real estate venture capital sector, where weights were assigned based on base points and maximum entropy [31].
Salois and Moss developed a dynamic information measure in 2011 to examine the informational content of farmland values and farm income in explaining the distribution of farmland values over time [32].
The primary goal of Gnat’s 2019 study was a proposal to modify the classical entropy measure, enhancing its ability to accurately reflect the specificity of assessing the homogeneity of valued areas in the context of property market analysis [33].
In 2020, Kostic and Jevremovic addressed the topic of property attractiveness, where property image features are used to describe specific attributes and examine the influence of visual factors on the price or duration of real estate listings. They considered a set of techniques for extracting visual features for efficient numerical inclusion in modern predictive algorithms, including Shannon’s entropy, center of gravity calculation, image segmentation, and the use of Convolutional Neural Networks. They concluded that the employed techniques can effectively describe visible features, thus introducing perceived attractiveness as a quantitative measure in predictive modeling of housing [34].
The study by Basse et al. (2020) utilizes the concept of transfer entropy to examine the relationship between the US National Association of Home Builders Index and the S&P CoreLogic Case-Shiller 20-City Composite Home Price Index. The empirical evidence suggests that the survey data can contribute to predicting US house prices [35].
The most recent work in chronological order is Özdilek’s 2023 study, which addresses the incorporation of entropy measurements into real estate valuation by modifying and integrating triadic estimates of price, cost, and income; his results significantly improved the precision of value measurement [11].
The above studies, where entropy is applied to various aspects or issues of the real estate markets, all highlight a common theme: a significant improvement in the predictive accuracy of the measured values.
This entry is adapted from the peer-reviewed paper 10.3390/realestate1010003