Old 09-12-2014, 10:56 PM   #104
clepsydrae

Quote:
Originally Posted by yep View Post
here are some of the places that you could achieve clipping/distortion on, say, a DI bass track, without lighting up the "clip" LED:
Yes -- I'm not disputing that any of those places could be very relevant to the sound at hot levels, clipping or not, with the possible exception of the one you mentioned: the ADC breaking up or clipping without the light going off. If you have an example of an ADC being driven within its stated parameters, without tripping the clip light (where provided), yet still sounding different at hot levels, that would be very interesting. But don't take time with the other stages -- I at least am not addressing those. I know you didn't have time to read the thread (don't blame you), but the scope here is only the ADC analog stage and whether there is intrinsic value in -18, aside from the other clear benefits it has for level matching, workflow, etc.

(I'm a little suspicious that the -18 recommendation is a tad overblown for some of those other pre-ADC stages, e.g. modern "clean" preamps, but I don't have the experience to form an actual opinion on that.)

Incidentally, MOTU has gotten back as well (same email as RME et al.). Same story again:

Quote:
Originally Posted by MOTU
With MOTU audio interfaces, the only difference in audio quality at different input levels is dynamic range (noise floor). Distortion and frequency response are the same at any level.

Yes, level matching between analog gear can make a difference to sonics, so the MOTU interface is only half the question. The other half is: is there an optimum output level from the device that is feeding the MOTU interface?
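
To put MOTU's reply in numbers: recording at a lower level doesn't change distortion or frequency response, it only shifts the signal relative to the converter's fixed noise floor. A minimal sketch of that arithmetic, assuming a hypothetical converter with 110 dB of dynamic range (the function and the figure are mine for illustration, not MOTU's specs):

```python
# Hedged sketch: how peak recording level trades against noise floor.
# Assumes a hypothetical converter with 110 dB of dynamic range,
# i.e. a noise floor sitting at -110 dBFS.

def snr_at_level(peak_dbfs, converter_dynamic_range_db=110.0):
    """Signal-to-noise ratio (dB) of a signal peaking at peak_dbfs,
    relative to the converter's fixed noise floor.
    peak_dbfs is zero or negative (0.0 = full scale)."""
    return converter_dynamic_range_db + peak_dbfs

print(snr_at_level(0.0))    # peaking at full scale: 110.0 dB SNR
print(snr_at_level(-18.0))  # peaking at -18 dBFS:    92.0 dB SNR
```

So peaking 18 dB below full scale costs 18 dB of SNR and nothing else, on this idealized model -- still far more than enough for tracking, which is why the -18 recommendation is usually about workflow and the analog gear upstream, not the converter itself.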