The rate of Type Ia supernovae (SNe Ia) in a galaxy depends not only on stellar mass, but also on star formation history. Here we show that two simple observational quantities ($g-r$ or $u-r$ host galaxy color, and $r$-band luminosity), coupled with an assumed delay time distribution (DTD; the rate of SNe Ia as a function of time after an instantaneous burst of star formation), are sufficient to accurately determine a galaxy's SN Ia rate, with very little sensitivity to the precise details of the star formation history. Using this result, we compare observed and predicted color distributions of SN Ia hosts for the MENeaCS cluster supernova survey and for the SDSS Stripe 82 supernova survey. The observations are consistent with a continuous DTD, without any cutoff. For old progenitor systems the power-law slope of the DTD is $-1.50^{+0.19}_{-0.15}$. This result favours the double degenerate scenario for SNe Ia, though other interpretations are possible. We find that the late-time slopes of the DTD differ at the $1\sigma$ level for low- and high-stretch supernovae, which suggests a single degenerate scenario for the latter. However, due to ambiguity in the current models' DTD predictions, single degenerate progenitors can neither be confirmed as the cause of high-stretch supernovae nor ruled out from contributing to the overall sample.
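For reference, a minimal sketch of the rate calculation implied here, assuming the standard convolution of the star formation history with the DTD (the symbols $\mathrm{SFR}$, $\Psi$, $\tau$, and $s$ are illustrative and are not defined in the abstract itself):
\begin{equation}
  R_{\mathrm{Ia}}(t) \;=\; \int_{0}^{t} \mathrm{SFR}(t')\,\Psi(t - t')\,\mathrm{d}t',
  \qquad
  \Psi(\tau) \;\propto\; \tau^{\,s}, \quad s = -1.50^{+0.19}_{-0.15} \ \text{at long delay times,}
\end{equation}
where the host galaxy color and $r$-band luminosity stand in for the detailed star formation history $\mathrm{SFR}(t')$, and the measured late-time slope $s$ fixes the power-law form of the DTD $\Psi$.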