Can art exist in a vacuum?
Not a literal vacuum, like interstellar space, but a vacuum of connection and interaction.
I propose that art is a form of communication, like the more familiar written and spoken word.
And communication requires both a sender and a receiver to mean anything.
But before we get into what it means to mean something, let’s think a bit about communication.
Communication is getting information from one place to another, and is ultimately about transmitting intent.
In our last blog post, we discussed the difference between random and non-random patterns, and the difference has to do with awareness of underlying process.
Since “random” is code for not knowing what is producing a pattern, any knowledge of just who or what is behind the curtain, à la the Wizard of Oz, is going to inject a bias into our understanding.
If we know the dice are weighted instead of fair, we can adjust our expectations accordingly and win big, at least until we are thrown out of the casino.
As Goethe said:
Mathematicians translate everything into their own language, at which point it becomes incomprehensible.
I take this as a challenge.
Randomness & The Principle Of Maximum Entropy
Mathematicians use the concept of randomness a great deal. They do this even when the underlying processes are understood in principle but are too difficult to exploit in practice, or when a precise accounting of every detail wouldn’t generate much added insight.
The relation of randomness to ignorance is encoded in the Principle of Maximum Entropy, put forth by the physicist E.T. Jaynes in 1957. It says that a system will be found in its most likely state unless it is biased by some unaccounted-for influence.
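The idea can be made concrete with a toy calculation. Shannon’s entropy is largest when every outcome is equally likely; any bias, like the weighted dice at the casino, lowers it, and that deficit is exactly the knowledge a cheater can exploit. A minimal sketch in Python, with the die weights invented purely for illustration:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# A fair six-sided die: nothing biases it, so every face is equally likely.
fair = [1 / 6] * 6

# A weighted die (hypothetical weights), biased toward one face.
weighted = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]

print(f"fair die:     {entropy(fair):.3f} bits")      # the maximum possible for 6 outcomes
print(f"weighted die: {entropy(weighted):.3f} bits")  # less: the bias is usable knowledge
```

No distribution over six faces can beat the fair die’s entropy of log₂(6) ≈ 2.585 bits, which is the Maximum Entropy Principle in miniature.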
This principle connected Ludwig Boltzmann’s 19th-century work on the behavior of gases with the much younger science of information theory.
The connecting concept is that of inference.
Even though we don’t know where the trillions of trillions of atoms in a container of gas are at any given moment, we can *infer* properties about the whole collection of them if we know something about the individual parts and the system constraints.
The concept of inference is remarkably general.
I believe we are inferring all the time, sometimes in concise and expressible ways (“the traffic light is about to turn green”) and sometimes in ways that are harder to articulate (“that piece of art really moves me”).
But in any case, inference is driven by information.
Information theory was first codified by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication,” republished the following year as a slim book.
In it, Dr. Shannon thought about the nature of communication and in particular its limits.
Legend has it that when Shannon asked John von Neumann about what he should call this new concept of a measure of information, von Neumann said,
Call it entropy. Nobody understands it.
-John von Neumann
John von Neumann wasn’t just joking.
I suspect he knew what was going to become of the concept, and that it would turn out to be related to the 19th century concept of physical entropy as a measure of disorder in a system.
After all, it was said of von Neumann, “We have evidence that super intelligent aliens exist and are living among us, disguised as Hungarians.” (Neumann’s birth name was Neumann János Lajos)
A Little History
In the late 1830s, telegraph wires were first strung between cities. Communication was at first very primitive, just using the “on” and “off” of electric current.
This was improved by audification, turning the ons and offs into distinct sounds, a kind of synesthesia.
Samuel Morse, The Morse Code & Art
But the big leap came when the artist Samuel Morse got involved. He invented what is now called Morse code, in conjunction with a physicist and a mechanical engineer.
But he was first an artist.
His coding method evolved between 1844 and 1865, and is still in use today.
Samuel Morse had the clever idea of statistical compression of information.
He made the representations for the most common letters (like “e”) short and those for the most uncommon letters (like “z”) long.
He had an intuitive feel for information theory (just how far a message can be compressed without losing anything) a century before it was a thing, and managed to do a terrifically efficient job.
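Morse’s trick can be sketched in a few lines of Python. A naive fixed-length code would charge every letter the same five on/off symbols (since 2⁵ = 32 covers 26 letters), while Morse charges by frequency. The table below is a small subset of real International Morse code:

```python
# A subset of International Morse code: common letters get short codes,
# rare letters get long ones.
MORSE = {
    "e": ".",   "t": "-",   "a": ".-",   "i": "..",
    "s": "...", "o": "---", "z": "--..", "q": "--.-",
}

def morse_symbols(word):
    """Total dots and dashes needed to send a word."""
    return sum(len(MORSE[letter]) for letter in word)

# With 26 letters, a fixed-length binary code needs 5 symbols per letter.
FIXED = 5

for word in ("see", "zoo"):
    print(word, morse_symbols(word), "Morse symbols vs", FIXED * len(word), "fixed-length")
```

A common word like “see” costs only 5 dots and dashes against 15 for the fixed-length code; the rarer letters of “zoo” cost more, but on typical English text the frequent short codes win by a wide margin.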
Even when Morse code was no longer required by technology, it still made appearances.
Someone musically inclined noticed that the “V for victory” employed by the Allies in World War II was a dot-dot-dot-dash that mapped perfectly onto the opening of Beethoven’s 5th Symphony.
We live in an age of such information overload that it is hard to imagine being limited to dots and dashes, but at the root of all communication is the tension between something and not-something: a dot or a dash, a zero or a one.
It’s about making distinctions because that’s what information means. Inference is what we do with that information.
Information neutralizes entropy, which itself is a measure of unknowing. Information is constraint.
With enough information, our understanding of a system can go from entirely statistical to entirely descriptive and predictive, and the measure of entropy tells us just how much information we need to make that transition.
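A small sketch of that transition, with the numbers invented for illustration: a system with N equally likely states has log₂(N) bits of entropy, and that is exactly the number of yes/no answers needed to go from purely statistical knowledge to a complete description.

```python
import math

# Suppose a system could be in any of 2**20 equally likely states.
n_states = 2**20
entropy_bits = math.log2(n_states)  # 20 bits of uncertainty to overcome

# Each yes/no answer (one bit of information) halves the possibilities.
remaining = n_states
answers = 0
while remaining > 1:
    remaining //= 2
    answers += 1

print(answers)  # 20 answers: exactly the entropy we started with
```

Every answer is a constraint that rules out half of what remained, which is why information and entropy are measured in the same unit.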
As you might imagine, it can be a lot sometimes—to specify the position and velocity of every particle in a gas is hard to do!
But information needs more than dots and dashes to have meaning. It needs to have context.
Dots and dashes represent letters, which live in a sociocultural context, and that makes them useful: the words they spell out have relevance to both the party creating them and the party receiving them.
A piece of information can mean anything from “turn off the light” to “launch a missile” depending on a mutually understood context.
A Story About Art
This brings to mind a story about art.
Nancy and I were in Rockland, Maine for an art workshop in 2016. There is a small but sophisticated art center there in the old town.
During the summer tourist season, they put on talks and exhibitions for the local community, which is full of artists and art aficionados.
Midcoast Maine has a long history of artistic residents, including their favorite son Andrew Wyeth.
A New York art critic who shall remain nameless was talking about the work of a particular contemporary artist.
According to the critic, a key aspect of this artist’s work was that it had no meaning. Nancy was quick to point out that if it was so important that the work had no meaning, then that itself was meaningful.
We claim that art is material meeting process, constraint militating against randomness, to communicate from the creator to the experiencer.
What exactly is being transmitted?
The answer, for the most part, is some variation on “I’ll know it when I see it.”
It’s meaningful but it is often hard to say why, because what is being transmitted is highly complex and personal.
It’s an alphabet with a staggering number of symbols, of expressive tropes. Where the written word has few symbols and many words, art has many symbols and relatively few words.
And what are the words made out of those letters?
They’re hanging on the gallery walls and living in artists’ studios.
With gratitude from our studio to yours,
Nancy & Bruce
My friend Dana Dion got her copy of the new book! Dana is one of the featured artists. It’s a bestseller in Art Study & Teaching, Painting, and Creativity Self Help. It makes a great gift with over 200 photographs and stories from 25 artists.
Get your copy HERE.
P.S. Holiday Special. Create the art of your dreams. Give yourself or the artist in your life the gift of art.
Get started HERE.
Go to https://www.artistsjourney.com/online-courses to choose your course.