Information gently but relentlessly drizzles down on us in an invisible, impalpable electric rain.

As every bookie knows instinctively, a number such as reliability - a quantitative rather than a qualitative measure - is needed to make the valuation of information practically useful.

The smell of subjectivity clings to the mechanical definition of complexity as stubbornly as it sticks to the definition of information.

To put it one way, a collection of Shakespeare's plays is richer than a phone book that uses the same number of letters; to put it another, the essence of information lies in the relationships among bits, not their sheer number.
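One rough way to see that texts of equal length can carry very different amounts of structure is lossless compression: redundant text squeezes down, richly patterned text does not. A minimal sketch using Python's zlib (the two sample strings are invented stand-ins for the phone book and the plays):

```python
import zlib

def compressed_size(text: str) -> int:
    """Bytes after lossless compression -- a crude stand-in for how much
    irreducible structure the text carries."""
    return len(zlib.compress(text.encode("utf-8")))

phone_book = "smith 555-0100 " * 20  # 300 characters, highly repetitive
soliloquy = ("to be or not to be that is the question "
             "whether tis nobler in the mind to suffer "
             "the slings and arrows of outrageous fortune "
             "or to take arms against a sea of troubles "
             "and by opposing end them to die to sleep ")
play = (soliloquy * 2)[:300]  # also exactly 300 characters

print(compressed_size(phone_book))  # small: the repetition compresses away
print(compressed_size(play))        # larger: more distinct relationships to record
```

Both strings use the same number of letters, yet the compressor must keep far more of the prose, echoing the point that information lives in the relationships among bits rather than their count.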

Both induction and deduction, reasoning from the particular to the general, and back again from the universal to the specific, form the essence of scientific thinking.

Time has been called God's way of making sure that everything doesn't happen at once. In the same spirit, noise is Nature's way of making sure that we don't find out everything that happens. Noise, in short, is the protector of information.

For generations, field guides to plants and animals have sharpened the pleasure of seeing by opening our minds to understanding. Now John Adam has filled a gap in that venerable genre with his painstaking but simple mathematical descriptions of familiar, mundane physical phenomena. This is nothing less than a mathematical field guide to inanimate nature.

Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information.

If the intensity of the material world is plotted along the horizontal axis, and the response of the human mind is on the vertical, the relation between the two is represented by the logarithmic curve. Could this rule provide a clue to the relationship between the objective measure of information, and our subjective perception of it?
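The logarithmic rule described here is usually known as the Weber-Fechner law. A minimal sketch (the threshold constant and units are illustrative, not from the source):

```python
from math import log2

def perceived(intensity: float, threshold: float = 1.0) -> float:
    """Weber-Fechner-style response: perception grows with the logarithm
    of physical intensity (threshold and scale chosen for illustration)."""
    return log2(intensity / threshold)

# Each doubling of the stimulus adds the same perceived increment --
# one 'bit', the same logarithm that sits at the heart of information theory.
for i in (1, 2, 4, 8, 16):
    print(i, perceived(i))
```

The base-2 logarithm is chosen deliberately: if perception really is logarithmic, then equal subjective steps correspond to equal numbers of bits.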

We don't know what energy is, any more than we know what information is, but as a now robust scientific concept we can describe it in precise mathematical terms, and as a commodity we can measure, market, regulate and tax it.

Claude Shannon, the founder of information theory, invented a way to measure 'the amount of information' in a message without defining the word 'information' itself, nor even addressing the question of the meaning of the message.
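Shannon's measure can be sketched in a few lines: it depends only on the frequencies of symbols, never on what the message means. A minimal illustration (the example strings are invented):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits, computed from symbol
    frequencies alone -- nothing here touches the message's meaning."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("abcd"))  # 2.0 bits: four equally likely symbols
print(shannon_entropy("aaab"))  # ~0.81 bits: skewed frequencies carry less
print(shannon_entropy("aaaa"))  # 0.0 bits: a certain outcome tells us nothing
```

Note that a profound sonnet and gibberish with the same letter frequencies score identically, which is exactly the point of the quote.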

The solution of the Monty Hall problem hinges on the concept of information, and more specifically, on the relationship between added information and probability.
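The relationship between added information and probability can be checked directly by simulation: the host's opened door is not a neutral act but an injection of information that shifts the odds. A minimal Monty Carlo sketch in Python:

```python
import random

def monty_hall(trials: int = 100_000) -> tuple[float, float]:
    """Estimate win rates for staying with the first pick versus switching
    after the host opens a losing door."""
    stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # The host opens a door that hides no car and is not the pick.
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        switched = next(d for d in range(3) if d != pick and d != opened)
        stay_wins += (pick == car)
        switch_wins += (switched == car)
    return stay_wins / trials, switch_wins / trials

stay, switch = monty_hall()
print(stay, switch)  # near 1/3 and 2/3: the host's choice added information
```

Staying wins about one time in three, switching about two in three: the host's constrained choice of door has transferred information about where the car is.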

If quantum communication and quantum computation are to flourish, a new information theory will have to be developed.

Paradox is the sharpest scalpel in the satchel of science. Nothing concentrates the mind as effectively, regardless of whether it pits two competing theories against each other, or theory against observation, or a compelling mathematical deduction against ordinary common sense.

This is not what I thought physics was about when I started out: I learned that the idea is to explain nature in terms of clearly understood mathematical laws; but perhaps comparisons are the best we can hope for.

An electron is real; a probability is not.

The switch from 'steam engines' to 'heat engines' signals the transition from engineering practice to theoretical science.

In order to understand information, we must define it; but in order to define it, we must first understand it. Where to start?

The problem of defining exactly what is meant by the signal velocity, which cropped up as long ago as 1907, has not been solved.

Science has taught us that what we see and touch is not what is really there.

Numbers instill a feeling for the lie of the land, and furnish grist for the mathematical mill that is the physicist's principal tool.

If you don't understand something, break it apart; reduce it to its components. Since they are simpler than the whole, you have a much better chance of understanding them; and when you have succeeded in doing that, put the whole thing back together again.

Underneath the shifting appearances of the world as perceived by our unreliable senses, is there, or is there not, a bedrock of objective reality?

In fact, an information theory that leaves out the issue of noise turns out to have no content.
