Ratios of carefully selected line depths are sensitive to stellar effective temperature ($T_{\mathrm{eff}}$). Relations established between line-depth ratio (LDR) and $T_{\mathrm{eff}}$ allow one to determine $T_{\mathrm{eff}}$ precisely. However, LDRs can also depend on metallicity and abundance ratios, which can limit the accuracy of the LDR method unless such effects are properly taken into account. We investigate the metallicity effect using $H$-band spectra and stellar parameters published by the APOGEE project. We clearly detect the effects of metallicity and abundance ratios: the $T_{\mathrm{eff}}$ derived from a given LDR depends on the metallicity at 100--800 K dex$^{-1}$, and a dependence on the abundance ratios, 150--1000 K dex$^{-1}$, also exists when the LDR involves absorption lines of two different elements. For the 11 line pairs in the $H$ band that we investigated, the LDR--$T_{\mathrm{eff}}$ relations with abundance-related terms added show scatters as small as 30--90 K within the ranges $3700 < T_{\mathrm{eff}} < 5000$ K and $-0.7 < \mathrm{[Fe/H]} < +0.4$ dex. By comparing the observed spectra with synthetic ones, we find that saturation of the absorption lines at least partly explains the metallicity effect.
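The idea of an LDR--$T_{\mathrm{eff}}$ calibration with an abundance-related term can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual calibration: the functional form ($T_{\mathrm{eff}} = a + b\,\log\mathrm{LDR} + c\,\mathrm{[Fe/H]}$), the synthetic calibration sample, and all coefficients are invented for demonstration; only the parameter ranges (3700--5000 K, $-0.7$ to $+0.4$ dex) follow the abstract.

```python
import numpy as np

# Hypothetical linear calibration: Teff = a + b*log(LDR) + c*[Fe/H].
# All numbers below are illustrative, not values from the paper.
rng = np.random.default_rng(0)

# Synthetic calibration sample spanning the ranges quoted in the abstract.
teff_true = rng.uniform(3700.0, 5000.0, 200)   # effective temperature [K]
feh = rng.uniform(-0.7, 0.4, 200)              # metallicity [Fe/H] [dex]

# Invent a line-depth ratio that falls with Teff and shifts with [Fe/H]
# (the metallicity effect), plus small measurement noise.
log_ldr = 2.0 - 4e-4 * teff_true + 0.1 * feh + rng.normal(0.0, 0.01, 200)

# Least-squares fit of Teff = a + b*log_ldr + c*feh.
A = np.column_stack([np.ones_like(log_ldr), log_ldr, feh])
coeffs, *_ = np.linalg.lstsq(A, teff_true, rcond=None)

teff_pred = A @ coeffs
scatter = np.std(teff_pred - teff_true)
print(f"fit coefficients (a, b, c): {coeffs}")
print(f"residual scatter: {scatter:.1f} K")
```

The fitted coefficient on [Fe/H] plays the role of the metallicity dependence quoted in K dex$^{-1}$; omitting that term from the design matrix would fold the metallicity effect into the temperature residuals, which is exactly the bias the abundance-related terms are meant to remove.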