How to set bias on a cassette deck
Here is a question to fellow tapeheads: do you know what a bias adjust knob is? Do you use it? Do you think it helps? Let’s find out!
First, what is bias? It is a high-frequency current that shakes up the particles of magnetic tape just before they are magnetized. It is similar to shaking a sheet of iron filings when plotting the field lines around a magnet: without the shaking, friction makes it harder for the filings to orient themselves, but if the shaking is too violent, the filings are swept off the sheet.
The same applies to tape bias: with too little bias the tape is not properly magnetized and distortion increases; with too much bias the high-frequency output is reduced.
Magnetic tape is made to certain standards, which define different characteristics for Type I (ferric), Type II (chrome or ferro-cobalt) and Type IV (metal) tapes. Tape recorders ensure the correct bias for each type of tape. But even within each type, tape from different manufacturers can have slightly different properties, so some cassette decks have a knob to adjust bias within a relatively narrow window.
This adjustment, also known as tape deck calibration, is commonly used to either flatten frequency response, or to tailor it to one’s liking, for example by slightly emphasizing the upper or lower end.
There are many ways to adjust bias: using white noise, pink noise, a pair of test tones, or a frequency sweep. These signals may be recorded at different levels, and the setup can be purely instrumental, relying on measurements, or subjective, relying on one’s ears.
Calibrating a three-head deck with white or pink noise is relatively simple: start recording the noise onto tape, then switch the monitor between the source and the tape while adjusting the bias knob. The goal is to make the input noise and the recorded noise sound the same. If you hear a loss of treble on the recording, reduce the bias; if the recorded sound is too bright, increase it.
Things get more complicated on a two-head deck: you need to record a series of test signals first, then play the tape back and assess the recorded sound at each bias setting. This is harder to do by ear, which is why I was looking for a simple instrumental approach to this adjustment.
I searched the Net, and one method used by professionals relies on two test frequencies, 1 kHz and 10 kHz: first set the recording level to 0 VU using the 1 kHz tone, then switch to the 10 kHz tone and adjust the bias until its recorded level matches.
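The level comparison itself is easy to automate if you can digitize the deck’s output. Here is a minimal sketch, assuming the captured tone is available as a NumPy array; the function name and the 44.1 kHz default are my own choices:

```python
import numpy as np

def tone_level_db(samples, freq, rate=44100):
    """Measure the level of a single test tone in dBFS.

    Picks the FFT bin closest to `freq`; for a clean reading,
    capture an integer number of tone periods (e.g. one second).
    """
    spectrum = 2 * np.abs(np.fft.rfft(samples)) / len(samples)
    k = int(round(freq * len(samples) / rate))
    return 20 * np.log10(spectrum[k])

# Hypothetical usage: `reference` holds digitized playback of the
# 1 kHz tone, `treble` the 10 kHz tone recorded at the same level.
# Bias is close to correct when the difference approaches 0 dB:
# delta = tone_level_db(treble, 10_000) - tone_level_db(reference, 1_000)
```

On a three-head deck this could run against the live tape monitor; on a two-head deck you would record first, then digitize the playback and compare.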
Consumer-grade cassette machines call for a lower recording level and probably lower test frequencies.
Some 1990s Sony decks have a built-in test tone generator that produces 400 Hz and 8 kHz tones, and I thought I could do something similar with my two-head machine.
My plan is to record a test signal with 400 Hz on one channel and 8 kHz on the other. I can generate such a tone in Audacity, save it to a file, drop the file onto a smartphone and play it into my cassette deck.
I can do even better with the Keuwlsoft Function Generator app for Android, which can output a different frequency on each channel.
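If you would rather script the tone file than use Audacity or an app, a short sketch like this will do; the sample rate, duration, file name and the -12 dBFS amplitude are my own choices (the deck’s input control sets the actual recording level anyway):

```python
import wave
import numpy as np

RATE = 44100      # CD-quality sample rate
SECONDS = 10      # length of the test tone
AMP = 10 ** (-12 / 20)  # -12 dBFS as a linear factor, leaves headroom

t = np.arange(RATE * SECONDS) / RATE
left = AMP * np.sin(2 * np.pi * 400 * t)    # 400 Hz on the left channel
right = AMP * np.sin(2 * np.pi * 8000 * t)  # 8 kHz on the right channel

# Interleave the channels and convert to 16-bit PCM
stereo = np.empty(2 * len(t), dtype=np.int16)
stereo[0::2] = np.int16(left * 32767)
stereo[1::2] = np.int16(right * 32767)

with wave.open("bias_test_tone.wav", "wb") as f:
    f.setnchannels(2)
    f.setsampwidth(2)   # 2 bytes = 16 bits per sample
    f.setframerate(RATE)
    f.writeframes(stereo.tobytes())
```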
Now for the tricky issue of recording level. Clearly, 0 dB is too hot: output level compression would skew the results. -20 dB is too low for my deck, as it is the lowest mark on the meter. Some people recommend -10 dB for cassette equipment. I chose -3 dB because it sits smack in the middle of my deck’s meter and is 5 dB below Dolby level, so I hope this will not sway the results too much.
I am going to record the test signal, changing the bias setting from the lowest to the highest value every 3 points on the tape counter. I will test six tapes, three Type II and three Type I, and compare the results. If nothing else, I shall be able to rank these tapes against each other.
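The evaluation step can be reduced to a one-liner: the best bias setting is the one where the 8 kHz playback level best matches the 400 Hz level. A sketch, with purely hypothetical meter readings just to illustrate the idea:

```python
def best_bias(readings):
    """Pick the bias setting whose 8 kHz playback level best
    matches the 400 Hz level (i.e. the flattest treble response).
    readings: list of (bias_setting, level_400_db, level_8k_db)."""
    return min(readings, key=lambda r: abs(r[2] - r[1]))[0]

# Hypothetical readings, not measured values:
readings = [
    (0, -3.0, -1.5),   # low bias: treble boosted
    (3, -3.0, -2.8),
    (6, -3.0, -3.1),   # near-flat response
    (9, -3.0, -4.5),   # high bias: treble rolled off
]
print(best_bias(readings))  # → 6
```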
So, what does this little experiment show? You can see pretty tables and graphs in the video linked above! The textual summary follows.
True chrome tape loses sensitivity over time, a result that matches tests by other people. The 1976 Sony Chrome cannot be used for a Dolbyized recording: Dolby tracking will be incorrect on playback and the recording will sound too dull, unless the deck has recording level compensation. It is still usable for recordings without Dolby.
The 1979 TDK SA Type II tape is in perfectly usable condition forty years after manufacture. It works just fine with reduced bias. This may have been an intended characteristic: many 1970s decks did not have a Type II selector, so I suppose this tape was designed to work even in a deck that supported only Type I tape.
The 1994 Maxell XLII is from an era when Type II had long since become mainstream. It can take extra bias, which presumably helps reduce distortion. It is interesting to compare the Maxell to the TDK; they are on opposite ends of the spectrum, so to speak. This result matches the measurements from the Audiochrome blog. In fact, most of the Type II tapes tested by Audiochrome require less bias than the Maxell XLII.
Switching to Type I tapes: the 1994 Maxell UR is a very compatible tape that can benefit from a tiny bias bump but works just fine at the neutral setting. Another perfectly compatible tape is the 1982 TDK D. And finally, the 1977 Maxell UD XL shows almost textbook behavior, with treble response steadily decreasing as bias is increased.
I think this just shows that the industry had ferric tape perfected by the end of the 1970s. It is about the same performance that the best of the current crop of tapes can deliver. No Type II or Type IV tapes are currently being manufactured, and the only half-decent new cassette deck is expensive and does not include Dolby noise reduction. Does anyone really want to talk about a cassette comeback?