We have studied the sensitivity of the production of $^{26}$Al, $^{44}$Ti, and $^{60}$Fe in core-collapse supernovae to variations in the triple-$\alpha$ and $^{12}\mathrm{C}(\alpha,\gamma)^{16}\mathrm{O}$ reaction rates. We used the KEPLER code to evolve $15\,\mathrm{M}_{\odot}$, $20\,\mathrm{M}_{\odot}$, and $25\,\mathrm{M}_{\odot}$ stars to the onset of core collapse and simulated the ensuing supernova explosion with a piston model and an explosion energy of $1.2\times10^{51}\,\mathrm{erg}$. Calculations were performed for both the Anders and Grevesse (1989) and Lodders (2003) solar abundances. When each helium-burning rate was varied over a range of twice its experimental uncertainty, $\sigma$, the production of $^{26}$Al, $^{60}$Fe, and their ratio varied by factors of five or more; for some species, comparable variations occurred for much smaller rate changes of $0.5\sigma$ or less. The production of $^{44}$Ti was less sensitive to changes in the helium-burning rates. The production of all three isotopes also depended on which solar abundance set was used for the initial stellar composition.
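For concreteness, one way to read the factor-of-five statement above is the following sketch, which assumes an additive rate parameterization (the text specifies only that each rate was varied over a range of $2\sigma$). Let $Y(X;\,n_{3\alpha},\,n_{\alpha\gamma})$ denote the ejected mass of isotope $X$ from a model in which each helium-burning rate $R_i$ is shifted to $R_{i,\mathrm{std}} + n_i\,\sigma_i$ with $-2 \le n_i \le 2$. The quoted sensitivity then corresponds to
\[
\frac{\displaystyle\max_{n_{3\alpha},\,n_{\alpha\gamma}} Y(^{26}\mathrm{Al})}{\displaystyle\min_{n_{3\alpha},\,n_{\alpha\gamma}} Y(^{26}\mathrm{Al})} \gtrsim 5 ,
\]
and similarly for $^{60}$Fe and the $^{60}\mathrm{Fe}/^{26}\mathrm{Al}$ ratio, whereas the corresponding spread for $^{44}$Ti is smaller. The additive form $R_{i,\mathrm{std}} + n_i\,\sigma_i$ and the notation $Y$, $n_{3\alpha}$, $n_{\alpha\gamma}$ are introduced here only for illustration and are not taken from the original text.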