Single-use radio-attenuating gloves protect the operator's hands by reducing exposure to scattered radiation during radiologically guided interventions. Although expensive, these devices are important because they help reduce the annual dose to which personnel are exposed. However, inconsistencies between theoretical and clinically observed protection values have been reported. In this study, we aim to highlight the normative bias present in the standards used for radioprotective assessment. Using thermoluminescent dosimeters positioned inside and outside the glove, we designed a bench test inspired by the EN1331-1 standard and applied it to five brands of radio-attenuating gloves. At 100 keV and 56 cm from the source, we measured indirect-beam and attenuated doses in a clinical setting, derived attenuation rates from the measured attenuation index, and compared our results with the manufacturers' technical data sheets. The measured attenuation rates of the tested references were 20%, 30%, 32%, 15%, and 25%, whereas the theoretical rates were, respectively, 38%, 43%, 30%, 26%, and 35%. Only one reference showed no significant difference between measured and advertised attenuation values (p = 0.12). The discrepancies between observed and theoretical attenuation values underline how strongly the assessment conditions influence the stated protection. Without complete information, the choice of a protective device as simple as a glove is biased, which may lead to inappropriate protection of the operator; this should raise concern, as interventional radiology is now widespread in routine practice.
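For clarity, the attenuation rate compared above can be expressed with the usual definition (the notation below is ours, not taken from the study): with \(D_{\text{out}}\) the dose measured outside the glove and \(D_{\text{in}}\) the dose measured underneath it,

\[
R = \left(1 - \frac{D_{\text{in}}}{D_{\text{out}}}\right) \times 100\,\%,
\]

so that, for example, a measured ratio \(D_{\text{in}}/D_{\text{out}} = 0.80\) corresponds to the 20% attenuation reported for the first reference.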