The empirical reddening function for starburst galaxies derived by Calzetti and her co-workers has proven very successful, and is now widely used in the observational literature. Despite its success, however, the physical basis for this extinction law, or more correctly, attenuation law, remains weak. Here we provide a physical explanation for the Calzetti Law based on a turbulent interstellar medium. In essence, turbulence produces a log-normal distribution of column densities in the dusty foreground screen, so that extended sources such as starburst regions or H II regions seen through it suffer point-to-point stochastic extinction and reddening. Regions of high column density are "black" in the UV but translucent in the IR, which leads to a flatter effective extinction law and a larger ratio of total to selective extinction, $R_V$. We fit the Calzetti Law and infer that the standard deviation $\sigma$ of the log-normal distribution lies in the range $0.6 \leq \sigma \leq 2.2$. The ratio of total to selective extinction $R_V$ is found to lie in the range 4.3 to 5.2, consistent with the value $R_V = 4.05 \pm 0.80$ of the Calzetti Law.
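The mechanism can be made explicit with a minimal sketch; the symbols $p(N)$, $N_0$, and $\tau_\lambda(N)$ are introduced here for illustration and are not defined in the abstract itself. If the turbulent screen has a log-normal column density distribution,
\[
p(N)\,dN = \frac{1}{\sqrt{2\pi}\,\sigma N}\,\exp\!\left[-\frac{\ln^{2}(N/N_{0})}{2\sigma^{2}}\right] dN,
\]
where $N_0$ is the median column density and $\sigma$ the logarithmic standard deviation, then the attenuation of an extended source is governed by the mean transmitted flux rather than the mean optical depth,
\[
A_{\mathrm{eff}}(\lambda) = -2.5\,\log_{10}\int_{0}^{\infty} p(N)\,e^{-\tau_{\lambda}(N)}\,dN,
\]
with $\tau_\lambda(N)$ the optical depth of a sightline of column $N$. Because high-$\tau_\lambda$ sightlines saturate first in the UV, $A_{\mathrm{eff}}$ rises more slowly toward short wavelengths than a uniform screen would predict, flattening the effective law and raising $R_V$.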