A Comparison Of The Methodologies Of Intake Measurement And Bioassay For Assessing Exposure To Personnel In Uranium Milling Operations

Society for Mining, Metallurgy & Exploration
J. Kruger, P. J. Kruger, A. H. Leuschner
Organization: Society for Mining, Metallurgy & Exploration
Pages: 6
File Size: 272 KB
Publication Date: Jan 1, 1981

Abstract

INTRODUCTION

This paper deals with some practical aspects of the use and interpretation of dosimetric methods for assessing the exposure of workers to natural uranium. Consideration must be given not only to the scientific value of a dosimetric method, but also to its practicality in the particular working environment.

It is generally accepted (IAEA, 1976) that uranium does not present a radiological hazard during the mining and extraction processes up to the final concentration stage, that is, precipitation as ADU (ammonium diuranate) or calcining to the oxide. Further processing (e.g. conversion to oxides, fluorides, etc.) is generally associated with the nuclear industry. It is customary for uranium processing facilities to be managed according to industrial norms rather than those applied for radiation protection. This is mainly because of the large quantity of material to be processed and the fact that uranium is considered to be of low radiotoxicity. Although ventilation is used to some extent, provision is not made for the same level of protection as would be required in a radiochemical facility. It is typical in such a plant to find that the process itself may be enclosed and ventilated, but that the ventilation is inadequate to cope with an accidental release of material. The working environment is poorly ventilated, sometimes only by natural ventilation, and one frequently finds areas with high airborne dust levels designated as mandatory respirator areas. Protection of personnel then depends on personal protective equipment. Radiation protection is not of prime concern, and personnel are not specifically trained in this subject. It is in this type of environment that acceptable monitoring procedures and personnel dosimetry must be established, and the effectiveness and practical application of the dosimetry must be judged against it.
DOSIMETRIC TECHNIQUES

The methods available for assessing personnel exposure, whether in terms of chemical or radiological hazard, include urinalysis, faecal analysis and in vivo monitoring. Faecal analysis does not lend itself easily to routine use. Techniques for in vivo monitoring have been developed only recently and, as expensive instrumentation is required, they are not generally available. That leaves urinalysis as a dosimetric technique for routine use, and it is probably for this reason that urinalysis is still widely used. Guidelines for the interpretation of urinalysis results, as originally provided by Neuman (Neuman, 1950), are still used in practice, even though it has become clear that this method has severe limitations as regards assessing the dose resulting from the intake of class W or class Y compounds (Alexander, 1974). Typically, a level of 300 µg U per litre of urine can serve as an indicator of an acute exposure (above which chemical damage to the kidneys may occur) and a level of 100 µg U per litre of urine can serve as an indicator for an investigation. Such levels are used in conjunction with several factors: the mode of intake (ingestion or inhalation), the solubility class of the compound (D, W or Y), the pattern of intake (acute or chronic), environmental monitoring results, and the frequency of urine sampling.

Since the ICRP concept for the limitation of internal radiation changed from that of the critical organ, i.e. the single organ of greatest significance under the circumstances, to that of effective dose equivalent, i.e. account being taken of the total risk due to the exposure of all tissues, the maximum permissible organ burden (MPOB) was replaced as secondary dose limit by the annual limit on intake (ALI) (ICRP, 1977). The ALI values are calculated from the committed dose equivalents of the various organs, and are used to determine organ burdens, which are then used to interpret the dosimetric results.
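The interpretation logic above can be sketched in code. This is a minimal illustration, not the paper's method: it classifies a urine uranium concentration against the two action levels quoted in the text (300 µg U/ℓ acute indicator, 100 µg U/ℓ investigation level), and expresses an estimated intake as a fraction of an ALI. The function names and the example ALI value are assumptions introduced here for illustration only.

```python
# Action levels for urine uranium concentration, as quoted in the text.
ACUTE_LEVEL_UG_PER_L = 300.0          # possible chemical damage to kidneys
INVESTIGATION_LEVEL_UG_PER_L = 100.0  # trigger for a follow-up investigation


def classify_urine_result(conc_ug_per_l: float) -> str:
    """Map a measured urine uranium concentration (µg U per litre)
    to the action category implied by the two guideline levels."""
    if conc_ug_per_l >= ACUTE_LEVEL_UG_PER_L:
        return "acute exposure indicator"
    if conc_ug_per_l >= INVESTIGATION_LEVEL_UG_PER_L:
        return "investigation level"
    return "below investigation level"


def committed_dose_fraction(intake_bq: float, ali_bq: float) -> float:
    """Express an estimated intake as a fraction of the annual limit
    on intake (ALI). Under the ICRP (1977) scheme, intake of one ALI
    corresponds to the annual secondary dose limit, so this fraction
    scales directly to committed effective dose equivalent."""
    return intake_bq / ali_bq


print(classify_urine_result(120.0))        # prints "investigation level"
print(committed_dose_fraction(50.0, 500.0))  # prints 0.1
```

Note that a concentration result on its own is ambiguous; as the text stresses, the same reading must still be weighed against mode of intake, solubility class and sampling frequency before any dose is assigned.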
Using the ICRP model (ICRP, 1979), a comprehensive calculation was made by Johnson (Johnson, 1980), giving organ burdens and excretion data in terms of intake for acute and chronic exposures for different categories of uranium compounds. The question arises as to what extent the direct measurement of intake can be utilised as a method of dosimetry. This will require a method of personal intake measurement for each individual worker, with both ingestion and inhalation being taken into account. A measurement of
Citation

APA: Kruger, J., Kruger, P. J., & Leuschner, A. H. (1981). A Comparison Of The Methodologies Of Intake Measurement And Bioassay For Assessing Exposure To Personnel In Uranium Milling Operations. Society for Mining, Metallurgy & Exploration.

MLA: Kruger, J., P. J. Kruger, and A. H. Leuschner. "A Comparison Of The Methodologies Of Intake Measurement And Bioassay For Assessing Exposure To Personnel In Uranium Milling Operations." Society for Mining, Metallurgy & Exploration, 1981.
