
The Information Universe (Part 3)

Written by Jeff Drake
2 · 10 · 24


Can Information Exist Without a Mind?

Remember this age-old riddle: “If a tree falls in the forest and no one is around to hear it, does it make a sound?” It’s a great riddle, on a par with “What is the sound of one hand clapping?” While the answer to this second riddle seems somewhat obvious, i.e., “the sound of silence,” the answer to the first is not as easy.

A similar question could be posed about a book lying unopened on a shelf: the words are printed on its pages, and epic adventures and poignant romances lie untouched within, but do they only become information when a mind engages with the book by reading it?

I will preface this discussion by reviewing snippets of the three different definitions of information I described in my post, “The Information Universe (Part 2), Claude Shannon: The Father of the Information Age.”

Definition of information used by mathematics:

In a nutshell, the definition of information for mathematics is exactly that defined by Shannon himself: Information is a measure of uncertainty, or put another way, information is anything that reduces the uncertainty we have about something.

This mathematics definition can be a little confusing. It says that the information offered by something is equivalent to the amount of uncertainty it removes about the thing itself. For example, if you use a pick and shovel to harvest a bunch of random rocks because you know that gold can be found in rocks, simply knowing gold exists somewhere doesn’t tell you much about the pile of rocks in front of you. All you have right now is a general data point about rocks containing gold. This, plus $3.85, will get you a Venti Latte at Starbucks. But if someone points you to a mine known for certain to contain gold, that data point suddenly transforms into important “information” that dramatically reduces the amount of uncertainty you had about any rocks you find in the mine.
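To make that idea concrete, here is a minimal Python sketch, with invented probabilities purely for illustration, that treats information as the reduction in Shannon entropy: how uncertain you are about a rock containing gold before and after learning it came from a known gold-bearing mine.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: a measure of the uncertainty in a distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Invented numbers, purely for illustration.
# Before: you honestly don't know whether this rock contains gold.
prior = [0.5, 0.5]        # [contains gold, does not]

# After: someone points you to a mine known for certain to be gold-bearing.
posterior = [0.95, 0.05]  # [contains gold, does not]

gained = entropy(prior) - entropy(posterior)
print(f"Uncertainty before: {entropy(prior):.3f} bits")
print(f"Uncertainty after:  {entropy(posterior):.3f} bits")
print(f"Information gained: {gained:.3f} bits")  # ~0.71 bits
```

The information gained is simply the uncertainty you started with minus the uncertainty you are left with.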

Definition of information used by physics:

The higher the disorder in a system, the more information it contains. That’s because the more disordered something is, the more information is required to describe it.

Another definition of information used by physicists says that information is a measure of complexity. In this definition, the more complex something is, the more information it contains.
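Both of these physics definitions, disorder and complexity, connect information to how much it takes to describe something. Compression is only a rough stand-in for that idea, but a small Python sketch can make the intuition tangible: a highly ordered string can be described in a few bytes, while a disordered string of the same length cannot.

```python
import random
import zlib

def description_size(data: bytes) -> int:
    """Rough proxy for information content: how many bytes it takes to describe the data."""
    return len(zlib.compress(data))

random.seed(0)

ordered = b"AB" * 500                                           # 1000 bytes, highly ordered
disordered = bytes(random.randrange(256) for _ in range(1000))  # 1000 bytes, highly disordered

print("Bytes to describe the ordered string:   ", description_size(ordered))
print("Bytes to describe the disordered string:", description_size(disordered))
```

The ordered string compresses down to a handful of bytes; the disordered one barely compresses at all, because nearly every byte has to be spelled out.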

Physicists also define information as a measure of meaning. In other words, information is a measure of how relevant something is to us. Thus, a news article about the city shutting down the road to your house is more relevant to you (and carries more information) than a news article about a basketball game in another state.

Definition of information used by philosophers:

On the one hand, philosophers describe information as a pattern that can be found in data.

Before I go further, please be aware that data and information are different. Put simply, data pertains to raw facts or observations, whereas information is data that has been processed and is considered meaningful. For example, the number 503 is, by itself, just data. But 503 becomes information when you organize it into the statement, “503 is an area code in Oregon.”

Not afraid of the deep end of the pool, philosophers also define information as a way of conveying “meaning.” The meaning that is conveyed is what makes the information valuable.
And philosophers go even farther, defining information as a representation of reality itself. An example of this would be a map: it contains information about the world.

Clearly, what constitutes ‘information’ depends heavily on the lens we use. But does its fullest power, the kind that shapes stories and triggers change, require the final level of meaning-making only a mind can provide?

For an everyday spin on this, consider a pile of paper scraps with different food items written on them, e.g., eggs, milk, bread. On their own, the scraps are fairly meaningless: they contain pure data, facts without inherent organization or purpose.

Shannon, from his mathematical perspective, might say that even the pile of paper scraps offers some information. For instance, if you knew that someone always buys the same brand of milk or a specific type of bread, that knowledge reduces the randomness a tiny amount, though not by enough: you would still struggle to act on the random scraps of paper.

Now, apply your mind to this example and see what happens:

You ask, “Is this for a single breakfast, a week’s nutrition, or a recipe for a party?” The answer dictates the quantities needed, brand choices, and so on.

You remember what you already have in the fridge, which prevents you from double-buying. Or maybe you remember that there is a big sale on certain items or that a holiday is coming up this weekend. This information can influence your product selections.

Or perhaps you are going to a store you are very familiar with. Knowing the layout of the products in the store allows you to order the grocery list in a way that will shorten the time required to collect the items.
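That last step is simple enough to sketch in a few lines of Python. The aisle numbers below are made up for illustration, but the idea is the same: knowing the store layout lets a mind (or a program standing in for one) reorder the raw scraps into an efficient walking order.

```python
# Hypothetical aisle numbers for a familiar store; the layout is invented for illustration.
AISLE = {"apples": 1, "bread": 2, "coffee": 4, "eggs": 7, "milk": 8}

def order_by_layout(items, layout):
    """Sort a grocery list by aisle so you walk the store once, front to back."""
    return sorted(items, key=lambda item: layout.get(item, float("inf")))

scraps = ["milk", "eggs", "bread", "coffee", "apples"]
print(order_by_layout(scraps, AISLE))
# -> ['apples', 'bread', 'coffee', 'eggs', 'milk']
```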

It should be apparent by now that even simple tasks show how minds can unlock data’s potential and transform it into useful information. Our mental actions appear to change the very nature of the list: is it a tool, a record of a fleeting thought, a promise of a meal shared? The meaning is not inherent in the scraps of paper themselves; rather, it is born from our engagement with them.

It seems to me that, whether we consider our everyday actions or the very foundations of how we understand the world, minds appear to be not just interpreters of information, but creators of it in the deepest sense. While a scrap of paper or a book contains potential, the power to actualize that potential into something meaningful, something that guides choices, sparks emotion, or reshapes reality, might well be inseparable from the act of being “read” by a mind.

Admittedly, systems exist in nature that appear to use information with minimal involvement from a conscious mind. For instance, the “code” contained in DNA holds instructions for building an organism, hinting at a form of information-processing that is distinct from deliberate meaning-making. Even lowly bees, through their dances, can communicate the location of a pollen-rich flowerbed to the rest of the hive. Could these examples involve less complex responses hardwired by instinct, rather than the flexible decision-making a human mind brings to even a simple grocery list?

Does every act of understanding, no matter how small or grand, subtly reshape the information that makes up the universe itself?

All of which raises a bigger question: If information isn’t just “out there” waiting to be found, but is partially created through understanding, might every mind be like a beacon, reducing the inherent uncertainty of the universe by imposing meaningful patterns upon it?

Hmm.

