The concept of information has been skirting around the borders of physics for over a century, trying to get in. It first started causing trouble when Maxwell (1867) devised a thought experiment with a rectangular box separated into two by a partition, with molecules in it moving around at various speeds (see the red dots, Fig.1). The partition has a door at the bottom, beside which stands a 'Demon' (in blue). Every time a faster-moving molecule approaches the door going right, the demon opens it and lets it through, so that eventually all the fast molecules are on the right hand side of the partition. This implies that if you have detailed information about the molecules, then you can violate the second law of thermodynamics because the entropy of the box has decreased: where the temperature was uniform, there is now a gradient. This was a big problem, because the 2nd law is a pretty big law to violate.

Leo Szilard (1929), with a more practical bent, then showed that you should be able to get energy out of information in this way (Szilard's engine). Fig.2 shows a cylinder with a partition and one molecule bouncing around inside it. If you have a bit of information telling you which end of the cylinder the molecule is in, say the right hand side, then you can put down the partition trapping the molecule there, advance the left hand piston (frictionlessly and without resistance from the molecule), remove the partition and allow the molecule to push the left hand piston back. Thus you have generated energy to move the piston solely from the information you had about the molecule's position. It turns out that from one bit of information you can get kT ln2 joules of energy out, where k is Boltzmann's constant and T is the ambient temperature. Szilard's engine has recently been realised experimentally (see Toyabe et al., 2010).
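The kT ln2 figure is easy to check numerically. A minimal sketch (the function name and the choice of 300 K for room temperature are my own):

```python
import math

def szilard_energy(T, bits=1):
    """Maximum work (joules) extractable from `bits` bits of
    information at temperature T (kelvin): W = N * k * T * ln 2."""
    k = 1.380649e-23  # Boltzmann's constant, J/K
    return bits * k * T * math.log(2)

# One bit at room temperature (T = 300 K) yields roughly 3e-21 J.
print(szilard_energy(300))
```

So a single bit buys you only about three zeptojoules at room temperature, which is why Szilard's engine took until 2010 to realise in the lab.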

As an aside, I have an amusing (to me) version of this that I thought of when watching a comedy routine by Dudley Moore and Peter Cook (One Leg Too Few, 1964). If you happen to have a one-legged chicken, and you have information about which leg is missing, then you can attach a string to the appropriate side and generate energy as it falls down.. apologies to chickens everywhere.

The huge problem of the violation of the 2nd law in these scenarios was finally resolved by Landauer (1961), a physicist at IBM who was very familiar with computing and information. He realised that the memory system of a computer is also a physical system, so the 2nd law should apply. A computer uses binary digits, e.g. 010011, but the 0s and 1s correspond to real physical attributes in the solid state memory, so when computer memory is erased (changing it from, say, 010011 to 000000) this represents a very real elimination of physical patterns and therefore a reduction of entropy, violating the second law of thermodynamics. To preserve the 2nd law, Landauer proposed that enough heat must be released to increase the entropy of the cosmos by more than enough to offset the decrease in entropy of the memory storage device. The implication that any deletion of information must release heat is now called Landauer's principle.
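The same arithmetic gives the floor that Landauer's principle puts on erasure heat. A minimal sketch (the 300 K temperature and the gigabyte example are my own choices; real memory hardware dissipates vastly more than this lower bound):

```python
import math

def landauer_heat(bits_erased, T=300.0):
    """Minimum heat (joules) released when erasing `bits_erased` bits
    at temperature T (kelvin): Q >= N * k * T * ln 2."""
    k = 1.380649e-23  # Boltzmann's constant, J/K
    return bits_erased * k * T * math.log(2)

# Erasing the 6-bit register 010011 -> 000000 from the text above:
print(landauer_heat(6))    # ~1.7e-20 J
# Even erasing a full gigabyte (8e9 bits) need only release ~2.3e-11 J.
print(landauer_heat(8e9))
```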

All this makes clearer something I've been trying to do for a long time: to show that another way of thinking about MiHsC is to regard it as a conversion of information to energy, and that what is being conserved in nature is not mass-energy, but EMI (energy+mass+information). I've recently managed to show that when an object accelerates, information of a particular kind is deleted and the amount of energy released looks very much like MiHsC. I'm unwilling to say more now because I've just submitted a paper on this (McCulloch, 2015), but this is an exciting new development, bringing information theory into the mix, and it's nice to be able to derive MiHsC in two different ways: 1) from the fitting of Unruh waves into horizons and 2) from information loss.

**References**

Landauer, R., 1961. Irreversibility and heat generation in the computing process. IBM Journal of Research and Development, 5 (3): 183–191.

Toyabe, S., T. Sagawa, M. Ueda, E. Muneyuki, M. Sano, 2010. Information heat engine: converting information to energy by feedback control. Nature Physics 6 (12): 988–992. arXiv:1009.5287

McCulloch, M.E., 2015. Inertial mass from information loss. Submitted to *EPL*.

Moore, D., P. Cook, 1964. One Leg Too Few. https://www.youtube.com/watch?v=lbnkY1tBvMU

## 16 comments:

I always wondered this about energy: energy can be transformed from one 'form' to another 'form'. If we take 'information' into consideration as a fundamental physical element, then I wonder why it shouldn't be possible to, for instance, directly convert electrical energy into kinetic energy (with some extra energy required to enable that 'information conversion').

To my knowledge, there is not a single law in the whole of physics that forbids any such conversion in principle. I hence tend to think that we are just too blind to construct a device that enables such conversion. I have a small hope that the EM-drive type construct might tap into a new energy conversion mechanism, going directly from EM energy to kinetic energy.

On a side note, I believe that technology is the art of constructing topological configurations of matter and energy, that are, in comparison to 'naturally'/statistically forming objects in 'free nature', extremely unlikely to exist in the first place. By creating these highly improbable objects, we can in turn manifest things that are normally invisible to us. Just think about Bose-Einstein condensates or particle colliders producing particles that decay so fast that they simply don't stably exist in our 'normal' spacetime. Technology enables access to physical processes, that are otherwise cloaked. It would be very exciting if it turned out that a simple metallic frustum could be used for (EM energy)->(kinetic energy) conversion. Let's see what the DIYers and Eagle Works boys can come up with.


@ Mike:

Are you sure it is

Energy + Mass + Information = constant

and not

Energy^2 + Mass^2 + Information^2 = constant^2

Just wondering, because I can also interpret this constant as a norm over a 3D vector in a 3D parameter space (^^) .

Best regards

I can see your point, but with a simpler linear approach the maths comes out quite easily with something like MiHsC, but not (so far) with a squared approach (I just tried it). There are some other reasons for preferring a linear approach that I'd rather not mention until publication, but I'm not so sure at this stage so I'd encourage you to try it yourself.. Thanks for the suggestion.

Information is in the eye of the beholder.

Consider the text: MQLPZA

This could be a random collection of letters or the password to my Swiss bank account or a substitution code for a 6 letter word in various languages or an acronym or references to various scenes in the plays of Shakespeare. Each of these has differing amounts of information.

The energy changes cited by Landauer are not due to "loss of information" but rather the sum of the energy required to change the state of discrete storage units.

Dear Roy: The letters MQLPZA do not in themselves have the information about being a bank account or elements of Shakespeare. To be useful in that way would need extra information to be given, such as: 'It's my bank account' or 'P means The Tempest, Scene 2'. I need to think about your 2nd point, make sure I've got it right.. Thanks.

Just to be totally clear..

- Data is anything that can be measured and stored

- Information is the context-specific interpretation of any recorded data

Mike,

The point is that there is no objective entity that can be labeled as "information". The information is strictly what is perceived by a sentient being.

Roy: I would say that there is the usual type of information we are familiar with, which requires context, but there is also a deeper type of information that is objective, and that Shannon knew about. For example, the sequence 010101 contains less information than, say, the sequence 011010 because the former is easier to predict (repeat 01 three times). This is automatable, so it must be objective.
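The predictability argument can be automated with an ordinary compressor: a repetitive sequence compresses far better than an incompressible one, with no human interpreter involved. A rough sketch using zlib as a stand-in for true algorithmic complexity (which is uncomputable); the sequences and lengths here are my own illustrative choices:

```python
import os
import zlib

repetitive = b"01" * 500       # the '010101...' pattern, 1000 bytes long
random_ish = os.urandom(1000)  # 1000 incompressible-looking bytes

# The repetitive string shrinks dramatically; the random one barely at all.
print(len(zlib.compress(repetitive)))   # a few dozen bytes
print(len(zlib.compress(random_ish)))   # close to (or above) 1000 bytes
```

Any observer running the same compressor gets the same numbers, which is the sense in which this kind of information is objective.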

Mike,

Does 75724709 contain more information than 010101?

Aha, some digits from e :) I get your point but you still have to write an algorithm to calculate e and then cut out the digits 38-45, so yes, more information.

So, what would the clerk at the corner store say? Different answer entirely. If different observers do not agree on what they see then it is doubtful that the thing is objectively real.

It might be better to think in terms of differences of state, instead of information. Like the gravitation between two bodies could be interpreted as a pulling force between the bodies, or as an outside pushing force. It can't be both at the same time, so at least one interpretation is wrong or inaccurate. I personally think it's the quantum vacuum pushing from outside the two bodies (a denser virtual spectrum than in between the bodies), but what do I know.. :-)

"The point is that there is no objective entity that can be labeled as "information". The information is strictly what is perceived by a sentient being."

@Roy

interesting.

It reminds me of the question of the observer in the Copenhagen interpretation.

Note that information theory, and complexity theory, try to propose a path to objectivity for information...

It seems that, as with quantum measurement, or with private data as the French CNIL describes it, information is deeply dependent on the post-processing, on the interpretation.

If you reject the idea that a sentient being, or an observer, is well defined, and if you take into account recent experiments with "Schrödinger kittens", it seems that information is more about the eigenvalues of the final measurement.

Information, like an "observer", cannot be measured except in the context of some material experiment.

It is hard for me to share my intuition, but for me the paradox of twin particles and FTL decoherence is not a paradox.

For me the twin particles don't exist independently, and their only existence is in the instruments that say whether the particles are in the "same" state or not.

The decoherence is in the XOR gate, in the control room, not far away where the equations say there are photons.

In a way my vision matches parallel worlds...

The information content of a bitstring can (theoretically) be quantified irrespective of what context it is used in, by considering what is the minimum quantity of information required to enable exact reconstruction of the string. The catch is, *everything* required to perform the reconstruction is included in the information tally.

For example, with the 8 digits of 'e' written above, 75724709, it is probably impossible to write a program to generate these in less space than the numbers themselves. However if you wrote out 10000 digits of e, then one could write such a program - and the length of the smallest possible program would be the information content of the 10000 digit sequence.
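A program far shorter than 10000 digits can indeed regenerate them, which is the point. A minimal spigot-style sketch for the decimal digits of e (the function name, guard size, and use of the factorial-base expansion e = 2 + Σ 1/k! are my own illustrative choices):

```python
def e_digits(n, guard=10):
    """Return the first 1+n decimal digits of e as a string, using the
    factorial-base expansion e = 2 + sum_{k>=2} 1/k!.
    `guard` extra terms protect the trailing digits from truncation."""
    m = n + guard
    a = {i: 1 for i in range(2, m + 2)}  # a[i] is the coefficient of 1/i!
    digits = [2]                         # integer part of e
    for _ in range(n):
        carry = 0
        for i in range(m + 1, 1, -1):    # multiply the fraction by 10,
            x = 10 * a[i] + carry        # propagating carries leftwards;
            a[i] = x % i                 # position i has radix i
            carry = x // i
        digits.append(carry)             # the overflow is the next digit
    return "".join(map(str, digits))

# The run 75724709 quoted above appears in the first 50 fractional digits.
print(e_digits(50))
```

The source above is a few hundred bytes yet generates arbitrarily many digits, so the information content of a long stretch of e is bounded by the program length, not the output length.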

Thanks Matthew. This is a more comprehensive way of putting it. As I tried to argue above, there is objective information in any pattern.
