The star formation rate (SFR) of the Milky Way remains poorly known, with often-quoted values ranging from 1 to 10~$M_{\odot}$~yr$^{-1}$. This situation persists despite the potential for the Milky Way to serve as the ultimate SFR calibrator for external galaxies. We show that various estimates for the Galactic SFR are consistent with one another once they have been normalized to the same initial mass function (IMF) and massive star models, converging to $1.9 \pm 0.4~M_{\odot}$~yr$^{-1}$. However, standard SFR diagnostics are vulnerable to systematics rooted in the use of indirect observational tracers sensitive only to high-mass stars. We find that absolute SFRs measured using resolved low- and intermediate-mass stellar populations in Galactic H~\textsc{ii} regions are systematically higher, by factors of ${\sim}2$--$3$, than calibrations for SFRs measured from mid-IR and radio emission. We discuss some potential explanations for this discrepancy and conclude that it could be resolved if (1) the power-law slope of the IMF for intermediate-mass ($1.5~M_{\odot} < m < 5~M_{\odot}$) stars were steeper than the Salpeter slope, or (2) a correction factor were applied to the extragalactic 24~$\mu$m SFR calibrations to account for the duration of star formation in individual mid-IR-bright H~\textsc{ii} regions relative to the lifetimes of O stars. Finally, we present some approaches for testing whether a Galactic SFR of ${\sim}2~M_{\odot}$~yr$^{-1}$ is consistent with what we would measure if we could view the Milky Way as external observers. Using luminous radio supernova remnants and X-ray point sources, we find that the Milky Way deviates from expectations at the $1$--$3\sigma$ level, hinting that either the Galactic SFR is overestimated or extragalactic SFRs need to be revised upwards.
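As an illustrative sketch of the second explanation only (the symbols $f_{\rm corr}$, $t_{\rm SF}$, and $t_{\rm O}$ are placeholders introduced here, not notation from this work), a timescale-ratio correction to the 24~$\mu$m calibration might take the schematic form
\begin{equation}
f_{\rm corr} \sim \frac{t_{\rm SF}}{t_{\rm O}}, \qquad
{\rm SFR}_{\rm corrected} \approx f_{\rm corr}\,{\rm SFR}_{24\,\mu{\rm m}},
\end{equation}
where $t_{\rm SF}$ denotes the duration of star formation in an individual mid-IR-bright H~\textsc{ii} region and $t_{\rm O}$ the characteristic O-star lifetime. Under this assumption, $t_{\rm SF} \sim 2$--$3\,t_{\rm O}$ would scale the 24~$\mu$m-based SFRs upward by roughly the factor of ${\sim}2$--$3$ needed to match the resolved stellar population measurements.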