Quantum-intensity-correlated twin beams of light can be used to measure absorption with precision beyond the classical shot-noise limit. The precision achievable with a given estimator is set by the quality of the twin-beam intensity correlations, which is quantified by the noise reduction factor. We derive an analytical model of twin-beam experiments that incorporates experimental parameters such as the relative detection efficiency of the beams, uncorrelated optical noise, and uncorrelated detector noise. We show that for twin beams without excess noise, the measured correlations can be improved by increasing the detection efficiency of each beam, even if doing so unbalances the detection efficiencies. However, for beams with excess intensity noise or other experimental noise, one should instead balance the detection efficiencies, even at the cost of reducing the overall detection efficiency. We define these noise regimes explicitly and verify our results with statistical simulation. These results have application in the design and optimization of absorption spectroscopy and imaging experiments.
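The trade-off described above can be illustrated with a minimal Monte Carlo sketch. This is not the analytical model of the paper: it assumes perfectly correlated Poissonian photon pairs, models loss as independent binomial thinning with efficiencies `eta1` and `eta2`, adds correlated classical intensity noise of fractional RMS `excess_sigma`, and omits detector noise. The noise reduction factor is computed in its standard form, NRF = Var(N1 - N2) / (<N1> + <N2>); the function name `simulate_nrf` and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_nrf(mu, eta1, eta2, excess_sigma=0.0, shots=200_000):
    """Monte Carlo NRF = Var(N1 - N2) / (<N1> + <N2>) for simulated twin beams.

    Each shot draws a common photon-pair number (Poissonian, optionally
    modulated by correlated classical noise of fractional RMS excess_sigma);
    each detector then applies an independent binomial loss eta1, eta2.
    """
    mean = mu * (1.0 + excess_sigma * rng.standard_normal(shots))
    pairs = rng.poisson(np.clip(mean, 0.0, None))
    n1 = rng.binomial(pairs, eta1)
    n2 = rng.binomial(pairs, eta2)
    return np.var(n1 - n2) / (n1.mean() + n2.mean())

mu = 10_000  # mean photon pairs per shot (illustrative)

# No excess noise: raising one arm's efficiency improves the NRF even
# though it unbalances detection (theory: NRF = 1 - 2*eta1*eta2/(eta1+eta2)).
print(simulate_nrf(mu, 0.70, 0.70))  # ~0.30 (balanced)
print(simulate_nrf(mu, 0.90, 0.70))  # ~0.21, better despite imbalance

# Correlated excess intensity noise (2% RMS, excess variance ~4x the
# Poissonian level): the same efficiency increase now degrades the NRF,
# because the imbalance term (eta1-eta2)^2 couples in the classical noise.
print(simulate_nrf(mu, 0.70, 0.70, excess_sigma=0.02))  # ~0.30
print(simulate_nrf(mu, 0.90, 0.70, excess_sigma=0.02))  # ~0.31, worse
```

In this toy model the excess-noise penalty enters only through the efficiency imbalance, which is why balancing the arms removes it while a unilateral efficiency increase does not.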