Real Estate Valuations with Small Dataset: Comparison
Please note this is a comparison between Version 2 by Camila Xu and Version 1 by Pierfrancesco De Paola.

In real estate markets, accuracy in property valuations is a fundamental element for making informed decisions and effective investment strategies. The complex dynamics that characterize real estate markets, together with the high differentiation of properties, make the adoption of advanced approaches crucial to obtaining accurate valuations.

  • real estate valuation
  • property valuation methods
  • small real estate sample
  • entropy

1. Introduction

In real estate markets, accuracy in property valuations is a fundamental element for making informed decisions and effective investment strategies. The complex dynamics that characterize real estate markets, together with the high differentiation of properties, make the adoption of advanced approaches crucial to obtaining accurate valuations [1][2].
However, this is countered by a frequent scarcity of real estate data and by the opaqueness of the related markets, problems found in various territorial contexts. The causes of these phenomena can be traced back to a series of factors, including resistance to change in the real estate sector, the lack of standardization in registration practices, the absence of regulatory requirements mandating complete disclosure of information, the limitations or incompleteness of public data, and the reticence of private individuals in disclosing transaction prices. Through the information asymmetries they generate, these factors have evident negative impacts on the understanding of real estate markets, on property valuations, and on investment decisions within the real estate sector [2].
In this framework, the Principle of Maximum Entropy emerges as a powerful tool, offering a new paradigm to address the challenges of real estate valuations.
Entropy is a fundamental concept in information theory and is closely associated with the idea of measuring uncertainty or randomness in a system [3]. The Maximum Entropy Principle proposes to select the probability distribution that reflects the maximum uncertainty, given a set of observed constraints. In other words, it involves choosing a distribution that is as neutral as possible with respect to the known information. Applying this principle to the field of real estate valuations entails balancing the complexity and variety of data, allowing statistical models to adapt naturally, guided by the maximum possible entropy [4].
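As a concrete illustration of entropy as a measure of uncertainty, the following minimal Python sketch (an illustrative addition, not part of the original study) computes Shannon’s entropy for a few discrete distributions and shows that the uniform distribution, being maximally non-committal, attains the highest entropy:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p); terms with p = 0 contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Among distributions over four outcomes, the uniform one is maximally
# uncertain: its entropy reaches the upper bound log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.70, 0.10, 0.10, 0.10]
certain = [1.00, 0.00, 0.00, 0.00]

print(shannon_entropy(uniform))  # 2.0 bits (maximum)
print(shannon_entropy(skewed))   # ≈ 1.36 bits
print(shannon_entropy(certain))  # 0.0 bits (no uncertainty)
```

Any deviation from uniformity, i.e., any additional commitment not forced by the data, lowers the entropy; this is the sense in which the Maximum Entropy Principle selects the most neutral distribution compatible with the known constraints.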
The Maximum Entropy approach refrains from assuming additional information not supported by observed data, providing valuations that are, by definition, the result of an inference process based on maximum uncertainty. In the real estate field, this approach makes it possible to flexibly integrate different sources of information, reflecting the variety of variables that can influence the value of a property.
When dealing with constrained optimization problems, as in the general formulation of the Maximum Entropy Principle, Lagrange multipliers are often used to incorporate the constraints into the objective function. The goal is to find the maximum of the objective function subject to the given constraints. Thus, the integration of the Maximum Entropy Principle with Lagrange multipliers enables the handling of constraints in probability appraisal. This approach allows one to find the probability distribution that maximizes the entropy given the constrained knowledge of the system, ensuring consistency with the available information [5][6].
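The mechanics can be sketched with Jaynes’s classic dice example: maximizing entropy subject to normalization and a fixed mean yields, via the Lagrangian stationarity conditions, an exponential-family distribution p_i ∝ exp(−λx_i), where the multiplier λ is then solved numerically so that the mean constraint holds. The code below is an illustrative sketch of this general recipe, not the specific formulation used in the study:

```python
import math

def maxent_distribution(support, target_mean, tol=1e-10):
    """Maximum-entropy probabilities over `support` subject to sum(p) = 1
    and E[X] = target_mean. The Lagrangian conditions give the exponential
    family p_i = exp(-lam * x_i) / Z; lam is found by bisection."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    lo, hi = -50.0, 50.0  # bracket for the multiplier lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # mean_for is decreasing in lam: shrink the bracket toward the target
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes's dice problem: faces 1..6 with observed mean 4.5 instead of 3.5.
# The maxent solution shifts probability monotonically toward higher faces.
p = maxent_distribution([1, 2, 3, 4, 5, 6], 4.5)
```

A mean of 3.5 would recover the uniform distribution (λ = 0); any other observed mean forces a non-zero multiplier, and the resulting distribution is the least committal one consistent with that single constraint.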
From a logical point of view, the proposed methodological approach, not unlike other procedures, determines the market value or income of a property through a comparison with the prices of properties that have characteristics similar to the one being estimated. A prerequisite is that the comparative real estate data refer to transactions that occurred recently relative to the time of the valuation. While it is logical to assume that a greater number of comparison data leads to a better estimate, the experimentation conducted considers a small sample of real estate sales sufficient for arriving at a reliable estimated value. In this respect, the approach effectively addresses the challenge posed by the scarcity of data that characterizes real estate markets. The method estimates the value of a property by comparing its characteristics with those of comparable properties, in accordance with the “similia similibus aestimentur” criterion [7].
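The comparison logic can be illustrated with a deliberately simple sketch: the subject’s value is taken as a weighted mean of comparable sale prices, with weights decaying as the comparables’ characteristics diverge from the subject’s. The feature names, the dissimilarity measure, and the weighting kernel below are all illustrative assumptions, not the entropy-based method described in the text:

```python
def estimate_value(subject, comparables):
    """Hypothetical comparison-based valuation ("similia similibus
    aestimentur"): subject is a dict of characteristics; comparables is a
    list of (characteristics, sale_price) pairs. Returns a weighted mean."""
    weights, prices = [], []
    for feats, price in comparables:
        # Dissimilarity: sum of absolute differences over shared features.
        d = sum(abs(subject[k] - feats[k]) for k in subject)
        weights.append(1.0 / (1.0 + d))  # closer comparables weigh more
        prices.append(price)
    total = sum(weights)
    return sum(w * p for w, p in zip(weights, prices)) / total

# Illustrative data: a 95 m2 subject and three recent comparable sales.
subject = {"area_m2": 95, "rooms": 4, "floor": 2}
comps = [
    ({"area_m2": 90, "rooms": 4, "floor": 1}, 210_000),
    ({"area_m2": 100, "rooms": 4, "floor": 3}, 235_000),
    ({"area_m2": 80, "rooms": 3, "floor": 2}, 180_000),
]
value = estimate_value(subject, comps)
```

By construction the estimate always lies within the range of the comparable prices, pulled toward the most similar sales; the approach described in the text replaces this ad hoc weighting with weights derived from the Maximum Entropy Principle.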

2. Real Estate Valuations with Small Dataset

The word “entropy” first appeared in 1864 in the context of a thermodynamics treatise by Rudolf Clausius, where it represents a state function that quantifies the unavailability of a system to produce work (in variational form, it is equal to the ratio between the amount of heat absorbed or released reversibly and isothermally by the system and the temperature considered). In accordance with its original definition, entropy, therefore, indicates which processes can occur spontaneously: the evolution of a system always proceeds in the direction of increasing entropy [8]. In 1870, with the development of statistical mechanics, J.W. Gibbs gave a new meaning to entropy, linked to the possible molecular arrangements of a system of particles. The Gibbs entropy (S) is defined as [9]:

S = −k_B ∑_i p_i ln p_i (1)
where k_B is the Boltzmann constant and p_i is the probability that the system is in the i-th microstate. By maximizing the entropy function (S), the system reaches its equilibrium state. Equation (1) can be regarded as the fundamental definition of entropy, as all other expressions of S can be derived from (1) but not vice versa. Subsequently, Boltzmann reworked Gibbs’s concept, defining entropy as the measure of the number of possible microstates of a system, given its macroscopic thermodynamic properties [10]. In 1948, Shannon introduced the concept of information entropy, demonstrating how it was possible to quantify the information contained in a message emitted by a source. He completely disregarded the semantic content of the term entropy, considering the quantity of information solely in probabilistic terms. The information is quantified through a function that measures the uncertainty of a random variable X, namely entropy, defined as [3]:

H = −K ∑_i p_i log p_i (2)
where K is a positive, arbitrary constant that depends on the logarithmic base, and (p_1, …, p_n) are the probabilities of a set of possible events. In this case, entropy measures the amount of uncertainty or information present in a random signal.
Starting in 1957, Jaynes dedicated himself to demonstrating the connection between the physical concept of entropy and that of information theory, developing the Principle of Maximum Entropy. Through this principle, Jaynes showed how it was possible to determine the probability distribution of a configuration from partial information. The basic idea is to leverage the available information and to impose that the sought distribution is the one that maximizes Shannon’s entropy, as a measure of uncertainty and information quantity [4].
Over time, the concept of entropy has been employed in applications across various scientific domains [11], including econometrics [12][13], decision-making under uncertainty [14][15], market behaviors [16], the performance of stakeholders connected to informational advantages [17][18], investments, asset and portfolio returns, financial time-series forecasting, and valuation of manufacturing yields [19][20][21]. From these studies, it is inferred that the reduction of entropy can represent a concept of fundamental importance in the economic domain. Indeed, low entropy can govern economic values [22][23] or measure the scarcity and value of economic goods [24]. Similarly, the economic value of a good, incorporating complex, indeterminate, and anthropic features, derives from the law of entropy [25]. International studies specifically focused on the application of the concept of entropy in the real estate field are very limited. Brown [26] first investigated the effectiveness of entropy in explaining the inefficiency of the real estate market, followed by Chen et al. [27].
The paper by Ge and Du (2007) investigates the main variables that influence residential property values in the Auckland property market (New Zealand) and ranks the variables using the entropy method [29]. Lam et al. proposed in 2008 a mathematical model for predicting housing prices in Hong Kong based on the integration of entropy and artificial neural networks [28]. Subsequently, in 2009, the same authors combined artificial neural networks with support vector machines to enhance the accuracy of real estate assessments in Hong Kong and mainland China; the identification of key real estate variables that could influence property prices was addressed through an entropy-based rating and weighting method aimed at providing objective and reasonable weights [30]. In 2009, Zhou et al. dealt with a complex problem of multi-objective decision making in the real estate venture capital sector, where the weights were assigned based on base points and maximum entropy [31]. Salois and Moss developed a dynamic information measure in 2011 to examine the informational content of farmland values and farm income in explaining the distribution of farmland values over time [32]. The primary goal of Gnat’s 2019 study was a proposal to modify the classical entropy measure, enhancing its ability to accurately reflect the specificity of assessing the homogeneity of valued areas in the context of property market analysis [33]. In 2020, Kostic and Jevremovic addressed the topic of property attractiveness, in which property image features are used to describe specific attributes and to examine the influence of visual factors on the price or duration of real estate listings. They considered a set of techniques for extracting visual features for efficient numerical inclusion in modern predictive algorithms, including Shannon’s entropy, center-of-gravity calculation, image segmentation, and the use of convolutional neural networks.
They concluded that the employed techniques can effectively describe visible features, thus introducing perceived attractiveness as a quantitative measure in predictive modeling of housing [34]. The study by Basse et al. (2023) utilizes the concept of transfer entropy to examine the relationship between the US National Association of Home Builders Index and the S&P CoreLogic Case-Shiller 20-City Composite Home Price Index; the empirical evidence suggests that the survey data can contribute to predicting US house prices [35]. The most recent work is by Özdilek (2023), who addresses the incorporation of entropy measurements into real estate valuation by modifying and integrating triadic estimates of price, cost, and income; his results significantly improved the precision of value measurement [11]. The above studies, in which entropy is applied to various aspects or issues of real estate markets, all highlight a common theme: a significant improvement in the predictive accuracy of the measured values.

References

  1. Del Giudice, V.; Manganelli, B.; De Paola, P. Hedonic Analysis of Housing Sales Prices with Semiparametric Methods. Int. J. Agric. Environ. Inf. Syst. 2017, 8, 65–77.
  2. De Paola, P.; Previtera, S.; Manganelli, B.; Forte, F.; Del Giudice, F.P. Interpreting Housing Prices with a Multidisciplinary Approach Based on Nature-Inspired Algorithms and Quantum Computing. Buildings 2023, 13, 1603.
  3. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  4. Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106, 620.
  5. Jaynes, E.T. Where do we stand on maximum entropy? In The Maximum Entropy Formalism; Levine, R.D., Tribus, M., Eds.; MIT Press: Cambridge, MA, USA, 1978; pp. 15–118.
  6. Jaynes, E.T. On the rationale of maximum-entropy methods. Proc. IEEE 1982, 70, 939–952.
  7. Smith, H.C.; Belloit, J.D. Real Estate Appraisal; Century VII Publishing Company: Los Angeles, CA, USA, 1987; ISBN 9780939787012.
  8. Clausius, R. Abhandlungen uber die Mechanische Warmetheorie; F. Vieweg und Sohn: Braunschweig, Germany, 1864.
  9. Gibbs, J.W. On the equilibrium of heterogeneous substances. Trans. Conn. Acad. Arts Sci. 1879, 3, 108–248.
  10. Boltzmann, L. Über die Beziehung zwischen dem zweiten Hauptsatze der mechanischen Wärmetheorie und der Wahrscheinlichkeitsrechnung respektive den Sätzen über das Wärmegleichgewicht. Wien. Ber. 1877, 76, 373–435.
  11. Özdilek, Ü. The Role of Thermodynamic and Informational Entropy in Improving Real Estate Valuation Methods. Entropy 2023, 25, 907.
  12. Golan, A. Information and entropy econometrics: A review and synthesis. Found. Trends Econ. 2006, 2, 1–145.
  13. Bretó, C.; Espinosa, P.; Hernández, P.; Pavía, J.M. An Entropy-Based Machine Learning Algorithm for Combining Macroeconomic Forecasts. Entropy 2019, 21, 1015.
  14. Dyer, J.A.; Sarin, R.K. Measurable multi-attribute value functions. Oper. Res. 1979, 27, 810–822.
  15. Zanakis, S.H.; Solomon, A.; Wishart, N.; Dublish, S. Multi-attribute decision making: A simulation comparison of select methods. Eur. J. Oper. Res. 1998, 107, 507–529.
  16. Grossman, S.; Stiglitz, J. On the impossibility of informationally efficient markets. Am. Econ. Rev. 1980, 70, 393–408.
  17. Fama, E.F. The Behavior of Stock-Market Prices. J. Bus. 1965, 38, 34–105.
  18. Ivkovic, Z.; Weisbenner, S. Local does as local is: Information content of the geography of individual investors’ common stock investments. J. Financ. 2005, 60, 267–306.
  19. Molgedey, L.; Ebeling, W. Local order, entropy and predictability of financial time series. Eur. Phys. J. B 2000, 107, 733–737.
  20. Bentes, S.; Menezes, R.; Mendes, D.A. Long memory and volatility clustering: Is the empirical evidence consistent across stock markets? Phys. A 2008, 387, 3826–3830.
  21. Zhou, R.; Cai, R.; Tong, G. Applications of Entropy in Finance: A Review. Entropy 2013, 15, 4909–4931.
  22. Schrödinger, E. What Is Life? And Mind and Matter; Cambridge University Press: Cambridge, UK, 1967.
  23. Jeffery, K.; Pollack, R.; Rovelli, C. On the Statistical Mechanics of Life: Schrödinger Revisited. Entropy 2019, 21, 1211.
  24. Applebaum, D. Probability and Information, an Integrated Approach; Cambridge University Press: Cambridge, UK, 1996.
  25. Georgescu-Roegen, N. The Entropy Law and the Economic Process; Harvard University Press: Cambridge, MA, USA, 1971.
  26. Brown, R.J. Entropy–what kind of bet is real estate–really? J. Prop. Invest. Financ. 2017, 35, 341–351.
  27. Chen, Y.; Cai, Y.; Zheng, C. Efficiency of Chinese Real Estate Market Based on Complexity-Entropy Binary Causal Plane Method. Complexity 2020, 2020, 2791352.
  28. Lam, K.C.; Yu, C.Y.; Lam, K.Y. An Artificial Neural Network and Entropy Model for Residential Property Price Forecasting in Hong Kong. J. Prop. Res. 2008, 25, 321–342.
  29. Ge, X.J.; Du, Y. Main variables influencing residential property values using the entropy method–The case of Auckland. In Proceedings of the 5th International Structural Engineering and Construction Conference; Springer: Berlin/Heidelberg, Germany, 2007.
  30. Lam, K.C.; Yu, C.Y.; Lam, K.Y. Support vector machine and entropy based decision support system for property valuation. J. Prop. Res. 2009, 26, 213–233.
  31. Zhou, S.; Wang, F.; Zhang, Z. Evaluation of Real Estate Venture Capital Based on Entropy Double Base Points Method. In Proceedings of the 2009 International Conference on Electronic Commerce and Business Intelligence, Beijing, China, 6–7 June 2009.
  32. Salois, M.J.; Moss, C.B. An Information Approach to the Dynamics in Farm Income: Implications for Farmland Markets. Entropy 2011, 13, 38–52.
  33. Gnat, S. Measurement of entropy in the assessment of homogeneity of areas valued with the Szczecin Algorithm of Real Estate Mass Appraisal. J. Econ. Manag. 2019, 38, 89–106.
  34. Kostic, Z.; Jevremovic, A. What Image Features Boost Housing Market Predictions? IEEE Trans. Multimed. 2020, 22, 1904–1916.
  35. Basse, T.; Desmyter, S.; Saft, D.; Wegener, C. Leading indicators for the US housing market: New empirical evidence and thoughts about implications for risk managers and ESG investors. Int. Rev. Financ. Anal. 2023, 89, 102765.