We investigate the impact of dust-induced gas fragmentation on the formation of the first low-mass, metal-poor stars ($<1\,M_{\odot}$) in the early universe. Previous work has shown the existence of a critical dust-to-gas ratio, below which dust thermal cooling cannot cause gas fragmentation. Assuming the first dust is silicon-based, we compute critical dust-to-gas ratios and the associated critical silicon abundances ($\mbox{[Si/H]}_{\text{crit}}$). At the densities and temperatures associated with protostellar disks, we find that a standard Milky Way grain size distribution gives $\mbox{[Si/H]}_{\text{crit}} = -4.5 \pm 0.1$, while the smaller grain sizes produced in a supernova reverse shock give $\mbox{[Si/H]}_{\text{crit}} = -5.3 \pm 0.1$. Other environments are not dense enough for dust cooling to induce fragmentation. We test the silicate dust cooling theory by comparing these thresholds to the silicon abundances observed in the most iron-poor stars ($\mbox{[Fe/H]} < -4.0$). Several stars have silicon abundances low enough to rule out dust-induced gas fragmentation with a standard grain size distribution. Moreover, two of these stars have such low silicon abundances that even dust with a shocked grain size distribution cannot explain their formation. Adding small amounts of carbon dust does not significantly change these conclusions. Additionally, we find that these stars exhibit either high carbon with low silicon abundances or the reverse. A silicate dust scenario thus suggests that the earliest low-mass star formation in the most metal-poor regime may have proceeded through two distinct cooling pathways: fine-structure line cooling and dust cooling. This naturally explains both the carbon-rich and carbon-normal stars at extremely low $\mbox{[Fe/H]}$.
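For concreteness, the mapping between a critical dust-to-gas mass ratio $\mathcal{D}_{\rm crit}$ and $\mbox{[Si/H]}_{\text{crit}}$ can be sketched under simple assumptions; the enstatite (MgSiO$_3$) grain composition, the depletion fraction $f_{\rm dep}$, and the adopted solar silicon abundance below are illustrative choices, not values taken from this work. If a fraction $f_{\rm dep}$ of the silicon is condensed into MgSiO$_3$ grains (one Si atom per formula unit), then
\[
\mathcal{D} \simeq f_{\rm dep}\,\frac{m_{\rm MgSiO_3}}{\mu_{\rm H}\,m_{\rm H}}\,\frac{n_{\rm Si}}{n_{\rm H}}
\quad\Longrightarrow\quad
\mbox{[Si/H]}_{\text{crit}} = \log_{10}\!\left[\frac{\mu_{\rm H}\,\mathcal{D}_{\rm crit}}{f_{\rm dep}\,(m_{\rm MgSiO_3}/m_{\rm H})}\right] - \log_{10}\!\left(\frac{n_{\rm Si}}{n_{\rm H}}\right)_{\odot},
\]
where $\mu_{\rm H}\approx1.4$ is the gas mass per hydrogen nucleus in units of $m_{\rm H}$ (accounting for helium), $m_{\rm MgSiO_3}\approx100\,m_{\rm H}$, and $(n_{\rm Si}/n_{\rm H})_{\odot}\approx3.2\times10^{-5}$ corresponds to $A({\rm Si})=7.51$. As an illustration, $\mathcal{D}_{\rm crit}\sim10^{-8}$ with $f_{\rm dep}=1$ yields $\mbox{[Si/H]}_{\text{crit}}\approx-5.4$, of the same order as the thresholds quoted above.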