Dark Matter and the information content of the Universe

Landauer’s principle, formulated in 1961, states that logical irreversibility in computation implies physical irreversibility, and demonstrated that information is physical. Here we formulate a new hypothesis proposing that a bit of information is not only physical, as already demonstrated, but that it also carries a finite, non-zero mass while it stores information. In this framework, it is shown that all the informational content of the baryonic matter in the Universe could be stored in bits of information, each having a mass of 2.91 x 10^-40 kg at the average 2.73 K temperature of the Universe. It is estimated that around 52 x 10^93 bits would be sufficient to account for all the missing dark matter in the visible Universe. To test the hypothesis, we propose a simple experiment predicting that the mass of a digital data storage device would increase by a small amount when it is full of digital information, relative to its mass in the erased state.

1. Introduction

 One of the greatest curiosities of modern physics is understanding the nature of the mysterious substance known as “dark matter”. Dark matter was first suggested in the 1920s to explain observed anomalies in stellar velocities [1], and again in the 1930s, when some unseen matter was required to explain the dynamics and stability of clusters of galaxies [2-4]. However, the strongest case for dark matter came in the 1970s from studies of galaxy rotation curves, which are diagrams representing the orbital velocity of gas and stars in a galaxy as a function of distance from the galactic center [5,6]. The orbital velocity of a rotating disk of gas and stars is expected to obey Keplerian dynamics, so that rotation velocities decline with distance from the center. Observations indicate, however, that the rotation curves of galaxies remain flat as distance from the center increases [7,8]. Since there is more gravitational pull than can be accounted for by the observed light / baryonic matter of a galaxy (figure 1), the flat rotation curves are a strong argument that dark matter should exist. Although the existence of dark matter is generally accepted, a significant community of scientists is working on alternative explanations that do not require it. There are various theoretical approaches, but they usually involve modifications of established theories, such as modified Newtonian dynamics, modified general relativity, entropic gravity, tensor-vector-scalar gravity, and so on [9-14]. Today many physicists are trying to identify the nature of dark matter by a variety of means, and the consensus is that dark matter is composed primarily of a not-yet-discovered subatomic particle [15]. Unfortunately, all efforts to isolate or detect dark matter have failed so far. In this article, we propose a radically new idea that could explain the nature of the dark matter in the Universe.
This involves the assumption that information is physical, and a bit of information has a finite non-zero mass, as demonstrated in detail in the following sections.

 2. Information entropy

 Shannon gave the mathematical formulation of the amount of information extracted from observing the occurrence of an event in his seminal 1948 paper [16]. Ignoring any particular features of the event, the observer, or the observation method, Shannon developed his theory using an axiomatic approach, in which he defined the information (I) extracted from observing an event as a function of the probability (p) of the event occurring, I(p). The second axiomatic property is that the information measure is a continuous positive function of the probability, I(p) ≥ 0. An event that is certain, i.e. p = 1, therefore gives no information from its occurrence, so I(1) = 0. Assuming that for n independent events of individual probabilities p_i the joint probability p is the product of the individual probabilities, the information obtained from observing the set of n events is the sum of the individual events’ information, I(p) = I(∏ p_i) = ∑ I(p_i). Shannon identified that the only function satisfying these axiomatic properties is the logarithm and, for an event whose probability of occurring is p, the information extracted from observing the event is:

 

$$ I(p) = \log_b\!\left(\frac{1}{p}\right) = -\log_b(p) \qquad (1) $$

 

where b is an arbitrary base, which sets the units of information. For binary bits of information, b = 2. Let us assume a set of n independent and distinctive events X = {x_1, x_2, …, x_n}, having a probability distribution on X, so that each event x_j has a probability p_j of occurring, where p_j ≥ 0 and ∑_j p_j = 1. According to Shannon [16], the average information per event, or the number of bits of information per event, that one can extract when observing the set of events X once is:

 

$$ H(X) = -\sum_{j=1}^{n} p_j \log_b(p_j) \qquad (2) $$

 

The function H(X) is an information entropy function, and it is maximal when the events x_j have equal probabilities of occurring, p_j = 1/n, so that H(X) = log_b(n). When observing N sets of events X, or equivalently observing the set of events X N times, the number of bits of information extracted from the observation is N·H(X). The number of possible states, also known as distinct messages in Shannon’s original formalism, is equivalent to the number of information-bearing microstates, Ω, compatible with the macro-state:

  

$$ \Omega = 2^{N \cdot H(X)} \qquad (3) $$

 

This allows the introduction of the entropy of the information-bearing states, using the Boltzmann thermodynamic entropy formula:

 

$$ S_{info} = k_b \ln(\Omega) = k_b \, N \, H(X) \ln(2) \qquad (4) $$

 

where k_b = 1.38064 x 10^-23 J/K is the Boltzmann constant. Let us examine the specific case of digital information, implying b = 2, and two possible distinctive events / states occurring, so n = 2 and X = {x_1, x_2}. If we assume no biasing or external work on the system, then the two events / states have equal probabilities of occurring, p_1 = p_2 = 1/2, and, using (2), it can be shown that H(X) = 1. The meaning of this result is that 1 bit of information is required to encode a one-letter message or, conversely, that observing the above event generates 1 bit of information. Using this result in (4), we obtain the information entropy of one bit, S_info^bit = k_b ln(2).
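The unbiased two-state case above is easy to check numerically. The following is a minimal sketch (the function name `shannon_entropy` is ours, not from the text) evaluating eq. (2) for a fair bit and then eq. (4) with N = 1:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(probs, b=2):
    """Eq. (2): average information per event, H(X) = -sum_j p_j * log_b(p_j)."""
    return -sum(p * math.log(p, b) for p in probs if p > 0)

# Two equiprobable states (an unbiased bit) carry exactly 1 bit per observation.
h = shannon_entropy([0.5, 0.5])
print(h)  # 1.0

# Eq. (4) with N = 1 and H(X) = 1: the information entropy of a single bit, k_b * ln(2).
s_bit = K_B * 1 * h * math.log(2)
print(s_bit)  # ~9.57e-24 J/K
```

Note how any biasing (unequal p_j) would yield H(X) < 1, i.e. less than one bit of information per observation.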

A computational process creates digital information via some physical process, which obeys physical laws, including thermodynamics. Hence, there must be a direct connection between the processes of creating, manipulating or erasing information and thermodynamics. In 1961, Landauer first proposed a link between thermodynamics and information by postulating that logical irreversibility of a computational process implies physical irreversibility [17]. Since irreversible processes are dissipative, it follows that logical irreversibility is also dissipative and, by extrapolation, that information is physical [18]. An example of a logically irreversible process is the “erase” operation of a memory device. A memory device is a distinct finite array of N binary elements, which can hold information without dissipation. Let us consider an isolated physical system that works as a digital memory device consisting of an array of N bits. Using (3), we can calculate that there are 2^N possible microstates, and the initial information entropy of the system is S_i^info = k_b N ln(2). The total entropy of the system consists of the physical entropy, S_i^phys, related to the non-information-bearing states, and the information entropy, characteristic of the information-bearing states. Performing an irreversible logical operation such as “erase” brings the system into one of three equivalent erased states, as exemplified in fig. 2 for an array of 8 bits, also known as a byte. The erased state defined by Landauer is in fact a reset operation, with all bits in the 1 (fig. 2.b) or 0 (fig. 2.d) state, but both are equivalent to a true “erased” state that is neither 0 nor 1, as in fig. 2.c. An example of a true erased state would be an array of bits in a magnetic data storage memory, in which the erase operation does not reset all bits to an identical magnetized state but totally demagnetizes each bit, so that neither 1 nor 0 can be identified in any of the bits.
This implies that the erased system has only one possible information state, n = 1, so using (2) we get H(X) = 0 and S_f^info = S_info(erased) = 0. Hence, the “erase” operation decreases the information entropy of the system, ΔS_info = S_f^info − S_i^info = −k_b N ln(2). Since the second law of thermodynamics states that the total entropy cannot decrease over time, ΔS_total ≥ 0, the irreversible computation must compensate the reduction in the information entropy of the information-bearing states by increasing the entropy of the non-information-bearing states via a thermal dissipation of energy Q = k_b T N ln(2). For one bit of information lost irreversibly, the entropy of the system must therefore increase by k_b ln(2), with an absolute value of heat released per bit lost of Q_bit = k_b T ln(2), known as Landauer’s principle [17,18]. Although Landauer’s principle has been the subject of some controversy, today the scientific community widely accepts its validity, and we refer the reader to recent experimental confirmations of Landauer’s principle [19-22], as well as various theoretical arguments in its support [23].
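As a numerical sanity check, the Landauer bound Q_bit = k_b T ln(2) comes out to roughly 3 x 10^-21 J per erased bit at room temperature. A minimal sketch (the helper name is ours):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat_per_bit(T):
    """Minimum heat released when one bit is erased irreversibly: Q_bit = k_B * T * ln(2)."""
    return K_B * T * math.log(2)

q_room = landauer_heat_per_bit(300.0)  # room temperature, K
print(q_room)  # ~2.87e-21 J per erased bit
```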

 3. The mass of a bit of information

 We have already seen that the process of creating information requires external work applied to modify the physical system and to create a bit of information, while the process of erasing a bit of information releases k_b T ln(2) of heat energy to the environment, as already confirmed experimentally [21,22]. However, once a bit of information is created, in the absence of any external perturbations it can remain in that state indefinitely, without any energy dissipation. In this paper a radical idea is proposed, in which the ability to hold information indefinitely without energy dissipation is explained by the fact that a bit of information has a finite mass, m_bit, which emerges in the process of the information bit’s creation. This is the equivalent mass of the excess energy created in the process of lowering the information entropy when a bit of information is erased. Using the mass-energy equivalence principle, the mass of a bit of information is:

 

$$ m_{bit} = \frac{k_b \, T \ln(2)}{c^2} \qquad (5) $$

where c is the speed of light and T is the temperature at which the bit of information is stored. Having the information content stored as physical mass allows the information to be kept indefinitely without energy dissipation. Erasing the information requires external input work, and the mass m_bit is converted back into energy / heat. The implication of this rationale is that the mass-energy equivalence principle inferred from special relativity can be extrapolated to a mass-energy-information equivalence principle, as depicted in figure 3. Furthermore, the mass of a bit of information depends on the temperature at which the bit exists. From (5), m_bit = 0 at T = 0 K, so, as expected, no information can exist at absolute zero.
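Eq. (5) is straightforward to evaluate. A minimal sketch (function name ours) reproducing the figures used later in the text:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
C = 2.99792458e8    # speed of light in vacuum, m/s

def bit_mass(T):
    """Eq. (5): mass equivalent of one stored bit, m_bit = k_B * T * ln(2) / c^2."""
    return K_B * T * math.log(2) / C**2

m_cmb = bit_mass(2.73)    # at the average temperature of the Universe
m_room = bit_mass(300.0)  # at room temperature
print(m_cmb)          # ~2.91e-40 kg
print(m_room)         # ~3.19e-38 kg
print(bit_mass(0.0))  # 0.0 -- no information can exist at absolute zero
```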

Within the concept of a digital Universe [24], each degree of freedom required to define an elementary particle must have a bit of information associated with it, so every particle of baryonic matter carries an associated amount of information. Hence, all the baryonic matter in the visible Universe would require a huge amount of associated information. The information itself is a very abstract concept that could be seen as the blueprint of an informational digital Universe [24], containing a large amount of “hidden” energy / mass in the form of information (see figure 4). This mass manifests itself via gravitational interactions, like baryonic matter, but it is impossible to detect, as a bit of information would not interact with electromagnetic radiation. These are in fact the characteristics of the elusive “dark matter”, whose presence is inferred from the rotational dynamics of galaxies [25,26], but which has never been directly observed or detected.

 4. Dark Matter estimations

 Assuming constant temperature, and without making any consideration of where this information mass is localized in space-time, a rough estimate can be made. Taking the average temperature of the Universe, T = 2.73 K, and using (5), the estimated mass of a bit of information is m_bit = 2.91 x 10^-40 kg. It is well accepted that the matter distribution in the Universe is ~5% ordinary baryonic matter, ~27% dark matter and ~68% dark energy [27]. This implies that there is about 5.4 times more dark matter than visible ordinary matter. Taking the estimated mass of our Milky Way galaxy as ~7 x 10^11 solar masses [28], and using the mass of the Sun, M_Sun ~ 2 x 10^30 kg, the estimated dark matter mass in our galaxy is M_Dark_Matter ~ 3.78 x 10^12 M_Sun = 7.56 x 10^42 kg. Assuming that all the missing dark matter is made up of bits of information, the entire Milky Way galaxy contains N_bits(Milky Way) = M_Dark_Matter / m_bit(T = 2.73 K) = 2.59 x 10^82 bits. The estimated number of galaxies in the visible Universe is ~2 x 10^12 [29], so the estimated total number of bits of information in the visible Universe is ~52 x 10^93 bits. Remarkably, this number is reasonably close to another estimate of the Universe’s information bit content, ~10^87, given by Gough in 2008 using Landauer’s principle via a different approach [30].
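The arithmetic above can be reproduced with a short script (a sketch using the rounded inputs quoted in the text; variable names are ours):

```python
M_SUN = 2e30                  # kg, solar mass
M_MILKY_WAY = 7e11 * M_SUN    # kg, estimated Milky Way mass [28]
DM_TO_BARYONIC = 27 / 5       # ~5.4x more dark matter than baryonic matter [27]
M_BIT = 2.91e-40              # kg, mass of one bit at T = 2.73 K, from eq. (5)
N_GALAXIES = 2e12             # estimated galaxies in the visible Universe [29]

m_dark_mw = DM_TO_BARYONIC * M_MILKY_WAY   # ~7.56e42 kg of dark matter in the Milky Way
n_bits_mw = m_dark_mw / M_BIT              # ~2.59e82 bits per Milky-Way-like galaxy
n_bits_universe = n_bits_mw * N_GALAXIES   # ~52e93 bits in the visible Universe
print(f"{m_dark_mw:.3g} kg, {n_bits_mw:.3g} bits, {n_bits_universe:.3g} bits")
```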

 5. Proposed experiment and conclusions

 In what follows, we propose a simple experiment capable of testing this theory by physically measuring the mass of digital information. It consists of an ultra-accurate mass measurement of a data storage device when all its memory bits are in the fully erased state. This is followed by recording digital data on all of its memory bits until it is at full capacity, and then by another accurate mass measurement. If the proposed hypothesis is correct, the data storage device should be heavier when information is stored on it than when it is in the fully erased state. One can easily estimate the mass difference, Δm. Using relation (5) at room temperature (T = 300 K), the estimated mass of a bit is ~3.19 x 10^-38 kg. Assuming a memory device of 1 TB storage capacity, the total number of memory bits is 8 x 10^12, as 1 TB = 10^12 bytes and 1 byte = 8 bits. Hence the predicted mass change in this experiment is Δm ~ 2.5 x 10^-25 kg. The proposed experiment is simple in terms of physical complexity, but very challenging overall, as its success depends on the ability to measure mass changes of the order of ~10^-25 kg accurately. The required measurement sensitivity could be relaxed by a factor f if the amount of data storage under test were increased from 1 TB to f x 1 TB. However, a successful experiment would offer direct experimental confirmation of the mass-energy-information equivalence principle, with far-reaching implications in physics, cosmology, computation and technology. In fact, one could argue that information is a distinct form of matter, or a fifth state, alongside the four observable states of matter: solid, liquid, gas and plasma. Although the proposed theory has speculative aspects and gaps, it has the virtue of being verifiable in a laboratory environment, and we have proposed such an experiment here.
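The predicted mass change scales linearly with storage capacity, so the 1 TB estimate generalizes directly (a sketch; the helper name is ours):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
C = 2.99792458e8    # speed of light, m/s

def predicted_mass_gain(capacity_bytes, T=300.0):
    """Predicted mass increase of a device written to full capacity: bit count times eq. (5)."""
    n_bits = capacity_bytes * 8
    return n_bits * K_B * T * math.log(2) / C**2

dm_1tb = predicted_mass_gain(1e12)  # 1 TB device at room temperature
print(dm_1tb)  # ~2.6e-25 kg
# A 1000x larger test volume (1 PB) relaxes the required sensitivity by the same factor:
print(predicted_mass_gain(1e15))  # ~2.6e-22 kg
```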
It is expected that this work will stimulate further theoretical and experimental research, bringing the scientific community one step closer to understanding the abstract nature of matter, dark matter, energy and information in the Universe.