Understanding Sample Rates, Bit Depth, and Bit Rates in Audacity

Audacity Bootcamp

Jun 12 2022 • 16 mins

Computers can't do anything with analog audio. Before your computer can work with the audio you record, that audio has to be converted into a digital format. Some device has to sit between your microphone (if that's what you're using to record) and your computer to convert the analog sound waves from your voice into a digital version of those same waves. If you're recording with a USB mic, that conversion takes place inside the mic: the USB mic has analog-to-digital conversion electronics built into it that convert your voice to a digital signal and send it to your computer, where Audacity (or any DAW) recognizes it and knows what to do with it.

If you're using an analog microphone (any XLR-connected mic is an analog mic), you'll need to plug it into an audio interface (like a Scarlett Solo or something similar). The interface converts the analog audio from your XLR mic into a digital equivalent and sends that sampled audio to your computer, where Audacity displays a digital representation of your recording. From there, you can manipulate it in post production to add effects, etc.
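If you want to see roughly what that conversion amounts to, here's a minimal Python sketch of the two steps an ADC performs: sampling (measuring the signal at evenly spaced instants) and quantizing (rounding each measurement to an integer step). The 440 Hz tone, 44.1 kHz sample rate, and 16-bit depth are illustrative values I've chosen, not settings pulled from this episode or from any particular piece of gear.

```python
import math

SAMPLE_RATE = 44_100   # samples per second (44.1 kHz, the CD standard)
BIT_DEPTH = 16         # bits per sample
FREQ_HZ = 440.0        # a stand-in "analog" input: a 440 Hz sine tone

def analog_signal(t: float) -> float:
    """Stand-in for the continuous voltage coming off a microphone."""
    return math.sin(2 * math.pi * FREQ_HZ * t)

def quantize(value: float, bits: int) -> int:
    """Round a -1.0..1.0 voltage to the nearest signed integer step."""
    max_level = 2 ** (bits - 1) - 1   # 32767 steps each way for 16-bit
    return round(value * max_level)

# One millisecond of digital audio: measure the "voltage" at evenly
# spaced instants, then round each measurement to an integer step.
samples = [
    quantize(analog_signal(n / SAMPLE_RATE), BIT_DEPTH)
    for n in range(SAMPLE_RATE // 1000)
]
print(samples[:8])  # first 8 of the 44 integer samples in that millisecond
```

The sample rate sets how often the measurements happen, and the bit depth sets how finely each measurement can be rounded, which is exactly why those two numbers come up in the episode.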

In this podcast, I talk about three things related to digital audio (see the quick sketch after this list):

  1. Sample Rates
  2. Bit Depth
  3. Bit Rates
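For uncompressed (PCM) audio, the three numbers tie together with simple arithmetic: bit rate = sample rate × bit depth × number of channels. Here's a short sketch using CD-quality stereo as the worked example; the figures are common defaults, not settings quoted from this episode.

```python
def pcm_bit_rate(sample_rate_hz: int, bit_depth: int, channels: int) -> int:
    """Bits per second for uncompressed PCM audio."""
    return sample_rate_hz * bit_depth * channels

# CD-quality stereo: 44,100 samples/s x 16 bits/sample x 2 channels
rate = pcm_bit_rate(44_100, 16, 2)
print(f"{rate} bits/s = {rate / 1000:.1f} kbps")  # 1411200 bits/s = 1411.2 kbps
```

Lossy formats like MP3 quote much lower figures (128 kbps is common) because they discard data during encoding, so their bit rate isn't derived from this formula.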

Here's the link to the video on this topic that I mention in this episode:

My setup for this episode:

  • Audacity version 3.1.3
  • 2017 MacBook Pro
  • Zoom H6 Audio Recorder/Interface
  • SYNCO D2 Hyper Cardioid Directional Condenser Shotgun Mic

Find me online at https://learnaudacity.com/