And things get worse for the wireless standard on Apple's devices: the current iPhone Bluetooth implementation supports a number of codecs, but curiously not the highly regarded aptX codec. This means the best the iPhone can manage over Bluetooth is 256kbps AAC, a lossy codec that can't handle current high-end formats.
So even if you start with a high quality track, it'll end up being compressed. It's unclear what codecs Lightning might use - but given the much greater bandwidth, sending lossless audio should be less of a problem.
Bits and Samples
This data wrangling doesn't give the full picture, though, as bandwidth also limits the amount of digital information that can be used to reconstruct sounds in your ears.
Audio quality is measured in terms of sample rate and bit-depth. Sample rate is the number of audio samples taken every second - CD quality is 44,100 samples per second, for instance, or 44.1kHz.
Bit-depth is the amount of data stored in each sample. CDs use 16-bit audio, which works out at over 65,000 possible values per sample.
CD quality is all that Bluetooth is really equipped to deal with - but Apple's Lightning specification supports much more: 24-bit audio, meaning over 16 million possible values per sample - and more than double the sample rate, at 96kHz. In other words, Lightning is equipped to handle vastly more information.
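The arithmetic behind those figures can be sketched in a few lines. This is an illustrative back-of-the-envelope calculation, assuming uncompressed stereo PCM audio:

```python
def levels(bit_depth):
    """Number of distinct amplitude values a single sample can hold."""
    return 2 ** bit_depth

def raw_bitrate_kbps(sample_rate_hz, bit_depth, channels=2):
    """Raw (uncompressed) PCM data rate in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

print(levels(16))                    # 65536 values per 16-bit sample
print(levels(24))                    # 16777216 values per 24-bit sample
print(raw_bitrate_kbps(44_100, 16))  # 1411.2 kbps for CD-quality stereo
print(raw_bitrate_kbps(96_000, 24))  # 4608.0 kbps for 24-bit/96kHz stereo
```

Note how even CD-quality stereo comes to roughly 1,411kbps uncompressed - over five times what Bluetooth's 256kbps AAC stream carries - while 24-bit/96kHz audio more than triples that again.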
The reason this matters is that we're about to enter a new world of Hi-Res audio tracks, where the bit rates and sample rates available for our phones are rapidly increasing. The new LG V20 handset, for instance, has a 32-bit quad DAC, massively improving the quality of sound that can come out of your phone - and as services like Tidal offer increasingly high quality tunes, these phones can make use of them.
The rumors that Apple is about to buy Tidal suddenly make a lot more sense.
So assuming your source audio is high enough quality, Lightning is far better positioned to deliver a superior listening experience.
There's also the problem of Bluetooth being wireless - and thus liable to interference. Bluetooth uses the same 2.4GHz spectrum that WiFi and a host of other wireless technologies use.
If you're in a "noisy" environment, there's a chance that the signal could drop or degrade - which could make listening a frustrating experience.
The Lightning/Bluetooth comparison isn't an easy one - it depends on which factors matter most to you. One thing we've yet to mention is the convenience of wireless headphones - for many, the freedom of being untethered from your phone outweighs sound quality.
But the biggest problem with Lightning is that it only works with Apple products. If you ever want to listen to audio from anything but an iPhone or iPad, you're going to need to start lugging around an extra pair of headphones or a fiddly adaptor… if you can afford one.
And it'll be interesting to see if Apple comes up with some sort of solution to the fact that Lightning headphones can't be used while the phone is charging (without a weird split converter) - as this could be a deal breaker for many users.
Bluetooth also has its challenges: because wireless headphones don't draw power from your phone, you'll need to remember to charge them regularly. And depending on where you are, interference could become a real pain.
But ultimately one thing is clear: if you're an audiophile and you want the highest quality listening experience, there is only one viable winner: the Lightning cable.