We present the most complete study to date of the X-ray emission from star formation in high-redshift (median $z = 0.7$; $z < 1.5$), IR-luminous ($L_{\rm IR} = 10^{10}$--$10^{13}\,L_{\odot}$) galaxies detected by Herschel's PACS and SPIRE instruments. For this purpose we take advantage of the deepest X-ray data available, the Chandra deep fields (North and South). Sources hosting AGN are removed from our analysis by means of multiple AGN indicators. We find an AGN fraction of $18 \pm 2$ per cent in our sample and note that AGN entirely dominate at $\log(L_{\rm X}/L_{\rm IR}) > -3$ in both the hard and soft X-ray bands. Of the star-formation-dominated sources, only a small fraction are individually X-ray detected, so for the bulk of the sample we calculate average X-ray luminosities through stacking. We find an average soft X-ray to infrared ratio of $\log\langle L_{\rm SX}/L_{\rm IR}\rangle = -4.3$ and an average hard X-ray to infrared ratio of $\log\langle L_{\rm HX}/L_{\rm IR}\rangle = -3.8$. We report that the X-ray/IR correlation is approximately linear over the entire range of $L_{\rm IR}$ and $z$ probed and, although broadly consistent with the local ($z < 0.1$) relation, displays some discrepancies. We argue that these discrepancies are unlikely to be physical, i.e. due to an intrinsic change in the X-ray properties of star-forming galaxies with cosmic time, as there is no significant evidence for evolution of the $L_{\rm X}/L_{\rm IR}$ ratio with redshift; instead, they are possibly due to selection effects and residual AGN contamination. We also examine whether dust obscuration in the host galaxy attenuates the X-rays from star formation, by investigating changes in the $L_{\rm X}/L_{\rm IR}$ ratio as a function of the average dust temperature. We conclude that X-rays from star formation do not suffer any measurable attenuation in the host galaxy.
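The stacking step lends itself to a short illustration. The sketch below shows, in Python, one common way such a measurement can be made: average the background-subtracted counts in small apertures at the positions of the individually undetected galaxies, convert the mean count rate to a flux, and scale to a luminosity at the sample's median redshift. This is not the paper's pipeline; the function name stack_luminosity, the conversion factor CONV_SOFT, the exposure time, and the toy inputs are all hypothetical placeholders, and real analyses would add K-corrections and per-source exposure maps.

\begin{verbatim}
# Minimal X-ray stacking sketch (illustrative, not the authors' method).
import numpy as np
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)

CONV_SOFT = 1e-11   # erg cm^-2 per count; hypothetical counts-to-flux factor
EXPOSURE = 2e6      # s; hypothetical effective exposure per source

def stack_luminosity(src_counts, bkg_counts, redshifts):
    """Mean luminosity from stacked aperture photometry.

    src_counts, bkg_counts : aperture and area-scaled background counts
    redshifts              : per-source redshifts
    Simplified: no K-correction, single exposure for all sources.
    """
    net = np.asarray(src_counts, float) - np.asarray(bkg_counts, float)
    mean_rate = net.sum() / (EXPOSURE * len(net))   # counts s^-1 per source
    mean_flux = mean_rate * CONV_SOFT               # erg s^-1 cm^-2
    d_l = cosmo.luminosity_distance(np.median(redshifts)).to("cm").value
    return 4.0 * np.pi * d_l**2 * mean_flux         # erg s^-1

# Toy usage: 500 undetected sources around the sample median z = 0.7.
rng = np.random.default_rng(0)
lx = stack_luminosity(rng.poisson(5.2, 500), rng.poisson(5.0, 500),
                      rng.normal(0.7, 0.2, 500))
print(f"stacked <L_SX> ~ {lx:.2e} erg/s")
\end{verbatim}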