We present a simple and robust framework for quantifying the effective sensor depth of cosmic-ray soil moisture neutron probes, so that reliable water fluxes can be computed from a time series of cosmic-ray soil moisture. In particular, we describe how the neutron signal depends on three near-surface hydrogen sources: surface water, soil moisture, and lattice water (water bound in the minerals of soil solids), as well as on their vertical variations. Through a combined modeling study of one-dimensional water flow in soil and neutron transport in the atmosphere and subsurface, we compare the average water content of the simulated soil moisture profiles with that given by the universal calibration equation, which is used to estimate water content from neutron counts. Using a linear sensitivity weighting function, we find that during evaporation and drainage periods the RMSE between the two average water contents is 0.0070 m³ m⁻³, with a maximum deviation of 0.010 m³ m⁻³, across a range of soil types. During infiltration, when piston-like flow conditions exist in the homogeneous isotropic media, the RMSE is 0.011 m³ m⁻³ with a maximum deviation of 0.020 m³ m⁻³. Because piston flow is unlikely under natural conditions at the horizontal scale of hundreds of meters sensed by the cosmic-ray probe, this modeled deviation of 0.020 m³ m⁻³ represents a worst-case scenario for cosmic-ray sensing of soil moisture. Comparison of cosmic-ray soil moisture data with a distributed soil moisture sensor network in southern Arizona indicates an RMSE of 0.011 m³ m⁻³ over a 6-month study period.
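To illustrate the linear sensitivity weighting mentioned above, the sketch below computes a depth-weighted average water content from a discrete soil moisture profile. The functional form (weights tapering linearly from the surface to zero at an effective sensor depth `z_star`) and the function name are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def weighted_soil_moisture(theta, z, z_star):
    """Depth-weighted average volumetric water content (m^3 m^-3).

    theta  : water content at each measurement depth (m^3 m^-3)
    z      : measurement depths (m, positive downward)
    z_star : assumed effective sensor depth (m); layers below it
             receive zero weight in this linear scheme.
    """
    theta = np.asarray(theta, dtype=float)
    z = np.asarray(z, dtype=float)
    # Linear taper: weight 1 at the surface, 0 at and below z_star.
    w = np.clip(1.0 - z / z_star, 0.0, None)
    if w.sum() == 0.0:
        raise ValueError("all depths lie below the effective sensor depth")
    return float(np.sum(w * theta) / w.sum())
```

Because shallow layers carry more weight, a profile that is wetter near the surface yields a weighted average above its simple mean, consistent with the neutron signal's stronger sensitivity to near-surface hydrogen.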