This paper will discuss some aspects of digital information versus analog information. The author has worked in the electronics field for forty years, and understands these formats at the bone level.
First, some definitions. Analog is continuously varying content. It is “real life,” since all natural sounds and physical phenomena are analog: when a tree falls in the forest, the waves it generates in the air, whether they make a sound or not, are analog. They vary in amplitude and frequency to exactly reproduce the sound the tree made as it crashed its way through the underbrush and smashed into the ground.
Digital is nearly always man-created. Digital electronics encompass the related areas of binary logic and math. This paper has no space to go into all of that, but you need to know that binary is two-state only. Unlike decimal, which has 10 discrete digits, or hexadecimal, which has 16, binary is 1/0, on/off, open/closed, set/reset, and so on. This forms the basis for all computing. The very CPU I am using now to type this uses those simple yes/no, pass/fail kinds of decisions to handle the work I am doing. It keeps up with my two-finger typing and at the same time it handles various housekeeping like monitoring the I/O ports and keeping the time of day. The only advantage a computer has over a human is that the computer is so damn fast: it can easily do a million things while a human (me) is mumblescratching around looking for a key on a keyboard. Of course, computers can’t think for shit.
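The two-state business is easy to see if you write the same number out in each base. Here is a quick Python sketch (nothing to do with any particular CPU, just the number bases):

```python
# The same value written in three bases. Binary gets by with just two
# symbols, decimal uses ten, hexadecimal uses sixteen.
value = 42
print(bin(value))  # 0b101010
print(value)       # 42
print(hex(value))  # 0x2a
```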
If you look at the internal workings of a running logic system using an oscilloscope, you see this digital signal switching taking place. It is very quick, but luckily for us, the same electronic advances that gave us the giant CPUs we have now also gave us much improved test equipment. No human can be expected to see what a CPU is doing, but the scope charts time vs. amplitude and puts it on a screen so you can see the microsecond and even nanosecond switching that is taking place. This is important because you need a scope to understand the process of converting analog to digital and then back to analog again. It lets you see what is happening.
When you want to convert, say, Frank Sinatra crooning (which is, being natural, all analog) into a nice clean digital form, you need something that takes an analog signal and converts it into a digital representation. This converted data is typically stored in the form of a table of values, with each entry describing one sample of the original signal. As you might infer, the device used to convert these is called an “Analog-to-Digital” converter, or A/D, as it is usually written.
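Here is a toy sketch of that idea in Python. This is a simulation, not real converter hardware, and the 8 kHz rate and 16-bit width are just numbers I picked for illustration:

```python
import math

SAMPLE_RATE = 8000        # samples per second (assumed for illustration)
FULL_SCALE = 2 ** 15 - 1  # largest value a signed 16-bit sample can hold

def sample_tone(freq_hz, seconds):
    """Pretend A/D converter: measure a sine wave at regular intervals
    and store each measurement as a signed 16-bit integer in a table."""
    table = []
    for n in range(int(SAMPLE_RATE * seconds)):
        t = n / SAMPLE_RATE
        analog = math.sin(2 * math.pi * freq_hz * t)  # the "natural" signal
        table.append(round(analog * FULL_SCALE))      # one digital sample
    return table

samples = sample_tone(440, 0.01)  # 10 milliseconds of an A-440 tone
print(len(samples))               # 80 entries in the table
```

Every entry in that table is one snapshot of Frank's voltage at one instant. That is all a digital recording is: a long, long list of numbers.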
The specs of this A/D converter (and the reverse, D/A, that we will cover later) are what determine the quality of the conversion. When I first started in electronics, we were just barely out of the vacuum tube days, and the first transistor circuits were quickly followed by various stages of digital logic integrated circuitry. The density of transistors on a chip advanced in a very few years from Small-Scale-Integration (SSI), which might have only four NAND gates on a chip, through Large-Scale-Integration (LSI) and on to Very-Large-Scale-Integration (VLSI), which packs millions of transistors on the same size chip: thousands upon thousands of NAND gate equivalents. This allowed very complex electronic circuits to be designed in a single package.
In the case of A/D converters, the accompanying speed increases, partly due to the shorter connection lengths between chips and closer integration with associated logic, made it so you can convert with nearly true quality. The main specs for an A/D converter are sample rate and sample width. If you sample only ten times a second, your sound quality is going to suck plenty. And if you only take an 8-bit sample, your resolution is going to suck. The fact is that the faster and wider the conversion, the closer you will get to Frank’s original intent. Of course, the trade-off is storage space, file size, and the word size of the memory where you are storing the converted values. If you sample a million times a second, and you have a 16-bit sample width, you are using up 2 MB (megabytes) of memory for every second of croon. That means for a typical 3-minute song, you will use 360 MB. You could not even fit three whole songs in a GB (gigabyte). This is not enough for most people. Some compromise between quality and file size is usually reached.
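The arithmetic is simple enough to check in a few lines of Python:

```python
SAMPLE_RATE = 1_000_000   # one million samples per second
BYTES_PER_SAMPLE = 2      # a 16-bit sample is two bytes
SONG_SECONDS = 3 * 60     # a typical 3-minute song

bytes_per_second = SAMPLE_RATE * BYTES_PER_SAMPLE
song_bytes = bytes_per_second * SONG_SECONDS
print(bytes_per_second)              # 2000000 -> 2 MB of croon per second
print(song_bytes // 1_000_000)       # 360 MB per song
print(1_000_000_000 // song_bytes)   # only 2 whole songs fit in a GB
```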
This is the meat of what I was getting at. Sorry I had to shovel all that tech stuff into your maw. Audiophiles claim they can tell the difference between Frank’s original croon and the one that is converted to digital, then back to analog to get to your ear. They say it loses something. That is why they avoid digital music like CDs, MP3s and such. They go with vinyl LP records or audiotape, which are never converted. I call bullshit on this.
It is true that no analog-to-digital-to-analog conversion will ever be an exact replica of the original; but I maintain that no human ear can tell the difference at the high end of conversion. I wager that a blind taste test, where the audio expert is blindfolded and has their nose pinched off, would show that they cannot tell the difference between a bite of CD and a bite of vinyl record. Hahahahah. No, I mean a blind ear test.
I believe their claim is based on the unavoidable stair-step shape of the re-created analog. Between samples, the re-created signal holds the last converted value, so it stays flat until the next sample arrives. Even at the highest rates of conversion, this is true. If you zoom in close enough with your oscilloscope, you will see teensy little steps between samples. But a human ear cannot detect steps this small.
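You can fake that scope picture in Python. This is a hold-the-last-value sketch (what the hardware folks call a zero-order hold); the sample values and hold length are made up for illustration:

```python
def stair_step(samples, hold=4):
    """Repeat each stored sample 'hold' times: between samples the
    re-created signal just sits on the last value, which is exactly
    the stair-step shape you see when you zoom in on a scope."""
    stepped = []
    for s in samples:
        stepped.extend([s] * hold)
    return stepped

table = [0, 50, 87, 100, 87, 50, 0, -50]  # made-up sample values
trace = stair_step(table)
print(len(trace))   # 32 points: 8 samples, each held for 4 ticks
print(trace[:4])    # [0, 0, 0, 0] -- flat until the next sample arrives
```

The faster you sample, the narrower each step gets, which is why cranking the rate makes the staircase harder and harder to hear.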
I think what they miss is the noise, the hiss and clang that is unavoidable in analog signals. When you digitize, you clean up. So, you may get what they claim is Frank’s cracking instead of crooning. You lose the oon and get the ack. And you lose Frank scratching his ass and the trumpet player shuffling her feet. Maybe that is what they miss.
Of course, when you convert Frank to a list of digital values, you have to convert him back to analog before you can hear him. Your ear doesn’t know digital from a hole in the ground. So you have a D/A converter, usually inside the device, like your MP3 player. The specs of the D/A are also important, but less so. If you did not get the music digitized well, you cannot get it turned back to analog well: data lost during the A/D conversion is gone for good. The D/A process reverses the operation and changes your table of binary values back into a stepped approximation of the original waveform that can feed speakers or earphones and vibrate to produce sound.
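A matching Python sketch of the reverse trip (again a simulation; a real D/A chip does this with resistor ladders and such, not lists):

```python
FULL_SCALE = 2 ** 15 - 1  # signed 16-bit full scale, matching the A/D side

def d_to_a(table):
    """Pretend D/A converter: scale each stored 16-bit sample back into
    the -1.0 .. +1.0 range that would drive a speaker or earphone."""
    return [s / FULL_SCALE for s in table]

voltages = d_to_a([0, 16384, 32767, -32767])
print(voltages[0])  # 0.0
print(voltages[2])  # 1.0  (full positive swing)
print(voltages[3])  # -1.0 (full negative swing)
```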
That’s the lesson for today.