Astronomy

What Types of Radiation Emanate in the Future and are Perceived in the Present?




Years ago I came upon a scientific text which mentioned different types of radiation and how they are perceived. One such form of radiation was described (from our perspective and understanding of space-time) to originate from the future and travel backwards through time.

I cannot for the life of me remember what this type of radiation was called and I would love to know it.


One concept of antiparticles (see Feynman) is that they're regular particles travelling backward in time. If you accept that picture, and extend the meaning of "radiation" to include physical particles -- which is common usage, e.g. $\alpha$ and $\beta$ particles -- then antiparticles such as positrons can be said to come from the future.


Nothing that is known to exist

Relativity does not allow a massive particle to be accelerated to the speed of light, but it does not by itself forbid a particle that always travels faster than light. Such a hypothetical particle has been called a tachyon. No such particle has ever been observed, and there are good reasons for believing that tachyons don't exist.

Such a particle would be extremely strange: its mass would not be real but imaginary. Even so, you could not use such a particle to send a message to the past. Tachyons cannot be localised, which means that you cannot detect one as being "at" a particular place at a particular time.

In conclusion, such particles have never been detected, probably don't exist, and couldn't be used to send messages.


read more at http://math.ucr.edu/home/baez/physics/ParticleAndNuclear/tachyons.html


Long-term health effects of Hiroshima and Nagasaki atomic bombs not as dire as perceived

The detonation of atomic bombs over the Japanese cities of Hiroshima and Nagasaki in August 1945 resulted in horrific casualties and devastation. The long-term effects of radiation exposure also increased cancer rates in the survivors. But public perception of the rates of cancer and birth defects among survivors and their children is in fact greatly exaggerated when compared to the reality revealed by comprehensive follow-up studies. The reasons for this mismatch and its implications are discussed in a Perspectives review of the Hiroshima/Nagasaki survivor studies published in the August issue of the journal GENETICS, a publication of the Genetics Society of America.

"Most people, including many scientists, are under the impression that the survivors faced debilitating health effects and very high rates of cancer, and that their children had high rates of genetic disease," says Bertrand Jordan, an author and a molecular biologist at UMR 7268 ADÉS, Aix-Marseille Université/EFS/CNRS, in France. "There's an enormous gap between that belief and what has actually been found by researchers."

Dr. Jordan's article contains no new data, but summarizes over 60 years of medical research on the Hiroshima/Nagasaki survivors and their children and discusses reasons for the persistent misconceptions. The studies have clearly demonstrated that radiation exposure increases cancer risk, but also show that the average lifespan of survivors was reduced by only a few months compared to those not exposed to radiation. No health effects of any sort have so far been detected in children of the survivors.

Approximately 200,000 people died in the bombings and their immediate aftermath, mainly from the explosive blast, the firestorm it sparked, and acute radiation poisoning. Around half of those who survived subsequently took part in studies tracking their health over their entire lifespan. These studies began in 1947 and are now conducted by a dedicated agency, the Radiation Effects Research Foundation (RERF), with funding from the Japanese and U.S. governments. The project has followed approximately 100,000 survivors, 77,000 of their children, and 20,000 people who were not exposed to radiation.

This massive data set has been uniquely useful for quantifying the risks of radiation because the bombs served as a single, well-defined exposure source, and because the relative exposure of each individual can be reliably estimated using the person's distance from the detonation site. The data has been particularly invaluable in setting acceptable radiation exposure limits for nuclear industry workers and the general public.

Cancer rates among survivors were higher than the rates in those who had been out of town at the time. The relative risk increased according to how close the person was to the detonation site, their age (younger people faced a greater lifetime risk), and their sex (greater risk for women than men). However, most survivors did not develop cancer. The incidence of solid cancers between 1958 and 1998 among the survivors was 10% higher than expected, which corresponds to approximately 848 additional cases among the 44,635 survivors in this part of the study. However, most of the survivors received a relatively modest dose of radiation. In contrast, those exposed to a higher radiation dose of 1 Gray (approximately 1,000 times higher than current safety limits for the general public) bore a 44% greater risk of cancer over the same time span (1958-1998). Taking into consideration all causes of death, this relatively high dose reduced average lifespan by approximately 1.3 years.
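As a quick sanity check on those figures, the short sketch below (Python) shows how the 848 additional cases and the 10% excess relate to the 44,635-person cohort; the baseline case count is inferred from those quoted numbers rather than taken from the RERF data.

```python
# Back-of-the-envelope check of the figures quoted above. The baseline
# (expected) case count is inferred from the quoted 10% excess; it is not
# taken directly from the RERF data.

excess_cases = 848            # additional solid cancers, 1958-1998
relative_increase = 0.10      # incidence ~10% higher among survivors
cohort_size = 44_635          # survivors in this part of the study

baseline_cases = excess_cases / relative_increase       # expected without exposure
observed_cases = baseline_cases + excess_cases

print(f"Expected cases without exposure: {baseline_cases:.0f}")                # ~8480
print(f"Observed cases among survivors:  {observed_cases:.0f}")                # ~9330
print(f"Fraction of the cohort affected: {observed_cases / cohort_size:.1%}")  # ~21%
```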

Although no differences in health or mutation rates have yet been detected among children of survivors, Jordan suggests that subtle effects might one day become evident, perhaps through more detailed sequencing analysis of their genomes. But it is now clear that even if the children of survivors do in fact face additional health risks, those risks must be very small.

Jordan attributes the difference between the results of these studies and public perception of the long-term effects of the bombs to a variety of possible factors, including historical context.

"People are always more afraid of new dangers than familiar ones," says Jordan. "For example, people tend to disregard the dangers of coal, both to people who mine it, and to the public exposed to atmospheric pollution. Radiation is also much easier to detect than many chemical hazards. With a hand-held geiger counter, you can sensitively detect tiny amounts of radiation that pose no health risk at all."

Jordan cautions that the results should not be used to foster complacency about the effects of nuclear accidents or the threat of nuclear war. "I used to support nuclear power until Fukushima happened," he says. "Fukushima showed disasters can occur even in a country like Japan that has strict regulations. However, I think it's important that the debate be rational, and I would prefer that people look at the scientific data, rather than gross exaggerations of the danger."


Contents

Since the earliest days of radio communications, the negative effects of interference from both intentional and unintentional transmissions have been felt, and the need to manage the radio-frequency spectrum became apparent.

In 1933, a meeting of the International Electrotechnical Commission (IEC) in Paris recommended the International Special Committee on Radio Interference (CISPR) be set up to deal with the emerging problem of EMI. CISPR subsequently produced technical publications covering measurement and test techniques and recommended emission and immunity limits. These have evolved over the decades and form the basis of much of the world's EMC regulations today.

In 1979, legal limits were imposed on electromagnetic emissions from all digital equipment by the FCC in the US in response to the increased number of digital systems that were interfering with wired and radio communications. Test methods and limits were based on CISPR publications, although similar limits were already enforced in parts of Europe.

In the mid 1980s, the European Union member states adopted a number of "new approach" directives with the intention of standardizing technical requirements for products so that they do not become a barrier to trade within the EC. One of these was the EMC Directive (89/336/EC) [3] and it applies to all equipment placed on the market or taken into service. Its scope covers all apparatus "liable to cause electromagnetic disturbance or the performance of which is liable to be affected by such disturbance".

This was the first time that immunity, as well as emissions, was legally required of apparatus intended for the general population. Although giving some products a known level of immunity may involve additional cost, it increases their perceived quality, since they can coexist with other apparatus in today's active electromagnetic environment with fewer problems.

Many countries now have similar requirements for products to meet some level of electromagnetic compatibility (EMC) regulation.

Electromagnetic interference can be categorized as follows:

  • EMI or RFI that typically emanates from intended transmissions, such as radio and TV stations or mobile phones.
  • EMI or RFI that is unintentional radiation from sources such as electric power transmission lines. [4][5][6]

Conducted electromagnetic interference is caused by physical contact between conductors, as opposed to radiated EMI, which is caused by induction (without physical contact of the conductors). In the radiated case, electromagnetic disturbances in the field of a conductor are no longer confined to the surface of the conductor and radiate away from it. This occurs in all conductors, and mutual inductance between two radiating conductors results in EMI.

Interference, in the sense of electromagnetic interference (EMI) or radio-frequency interference (RFI), is defined in Article 1.166 of the International Telecommunication Union's (ITU) Radio Regulations (RR) [7] as "The effect of unwanted energy due to one or a combination of emissions, radiations, or inductions upon reception in a radiocommunication system, manifested by any performance degradation, misinterpretation, or loss of information which could be extracted in the absence of such unwanted energy".

This definition is also used by frequency administrations when making frequency assignments and assigning frequency channels to radio stations or systems, as well as when analyzing electromagnetic compatibility between radiocommunication services.

In accordance with the ITU RR (Article 1), interference is further classified as permissible interference, accepted interference, or harmful interference.


For lower frequencies, EMI is caused by conduction and, for higher frequencies, by radiation.

EMI through the ground wire is also very common in an electrical facility.

Interference tends to be more troublesome with older radio technologies such as analogue amplitude modulation, which have no way of distinguishing unwanted in-band signals from the intended signal, and the omnidirectional antennas used with broadcast systems. Newer radio systems incorporate several improvements that enhance the selectivity. In digital radio systems, such as Wi-Fi, error-correction techniques can be used. Spread-spectrum and frequency-hopping techniques can be used with both analogue and digital signalling to improve resistance to interference. A highly directional receiver, such as a parabolic antenna or a diversity receiver, can be used to select one signal in space to the exclusion of others.

The most extreme example of digital spread-spectrum signalling to date is ultra-wideband (UWB), which proposes the use of large sections of the radio spectrum at low amplitudes to transmit high-bandwidth digital data. UWB, if used exclusively, would enable very efficient use of the spectrum, but users of non-UWB technology are not yet prepared to share the spectrum with the new system because of the interference it would cause to their receivers (the regulatory implications of UWB are discussed in the ultra-wideband article).

In the United States, the 1982 Public Law 97-259 allowed the Federal Communications Commission (FCC) to regulate the susceptibility of consumer electronic equipment. [8] [9]

Potential sources of RFI and EMI include: [10] various types of transmitters, doorbell transformers, toaster ovens, electric blankets, ultrasonic pest control devices, electric bug zappers, heating pads, and touch controlled lamps. Multiple CRT computer monitors or televisions sitting too close to one another can sometimes cause a "shimmy" effect in each other, due to the electromagnetic nature of their picture tubes, especially when one of their de-gaussing coils is activated.

Switching loads (inductive, capacitive, and resistive), such as electric motors, transformers, heaters, lamps, ballast, power supplies, etc., all cause electromagnetic interference especially at currents above 2 A. The usual method used for suppressing EMI is by connecting a snubber network, a resistor in series with a capacitor, across a pair of contacts. While this may offer modest EMI reduction at very low currents, snubbers do not work at currents over 2 A with electromechanical contacts. [11] [12]

Another method for suppressing EMI is the use of ferrite core noise suppressors (or ferrite beads), which are inexpensive and which clip on to the power lead of the offending device or the compromised device.

Switched-mode power supplies can be a source of EMI, but have become less of a problem as design techniques have improved, such as integrated power factor correction.

Most countries have legal requirements that mandate electromagnetic compatibility: electronic and electrical hardware must still work correctly when subjected to certain amounts of EMI, and should not emit EMI, which could interfere with other equipment (such as radios).

Radio frequency signal quality has declined throughout the 21st century by roughly one decibel per year as the spectrum becomes increasingly crowded. This has inflicted a Red Queen's race on the mobile phone industry as companies have been forced to put up more cellular towers (at new frequencies) that then cause more interference, thereby requiring more investment by the providers and frequent upgrades of mobile phones to match. [13]

The International Special Committee for Radio Interference or CISPR (French acronym for "Comité International Spécial des Perturbations Radioélectriques"), which is a committee of the International Electrotechnical Commission (IEC) sets international standards for radiated and conducted electromagnetic interference. These are civilian standards for domestic, commercial, industrial and automotive sectors. These standards form the basis of other national or regional standards, most notably the European Norms (EN) written by CENELEC (European committee for electrotechnical standardisation). US organizations include the Institute of Electrical and Electronics Engineers (IEEE), the American National Standards Institute (ANSI), and the US Military (MILSTD).

Integrated circuits are often a source of EMI, but they must usually couple their energy to larger objects such as heatsinks, circuit board planes and cables to radiate significantly. [14]

On integrated circuits, important means of reducing EMI are: the use of bypass or decoupling capacitors on each active device (connected across the power supply, as close to the device as possible), rise time control of high-speed signals using series resistors, [15] and IC power supply pin filtering. Shielding is usually a last resort after other techniques have failed, because of the added expense of shielding components such as conductive gaskets.
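As a rough illustration of rise-time control with a series resistor, the sketch below treats the resistor and the capacitance it drives as a single-pole RC network; the 2.2·RC rise-time rule and the component values are standard textbook assumptions, not figures from this text.

```python
# Rough model of rise-time control with a series resistor: treat the driver's
# series resistor R and the trace/input capacitance C as a single-pole RC
# network. The 10%-90% rise time of such a network is ln(9)*R*C (about
# 2.2*R*C). Component values below are illustrative, not from the text.

import math

def rise_time_10_90(r_ohms: float, c_farads: float) -> float:
    """10%-90% rise time of a first-order RC network, in seconds."""
    return math.log(9) * r_ohms * c_farads

# Example: a 33-ohm series resistor driving ~15 pF of trace plus input capacitance
tr = rise_time_10_90(33, 15e-12)
print(f"Rise time: {tr * 1e9:.2f} ns")  # ~1.09 ns; slower edges radiate less high-frequency energy
```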

The efficiency of the radiation depends on the height above the ground plane or power plane (at RF, one is as good as the other) and the length of the conductor in relation to the wavelength of the signal component (fundamental frequency, harmonic, or transient such as overshoot, undershoot or ringing). At lower frequencies, such as 133 MHz, radiation is almost exclusively via I/O cables; RF noise gets onto the power planes and is coupled to the line drivers via the VCC and GND pins. The RF is then coupled to the cable through the line driver as common-mode noise. Since the noise is common-mode, shielding has very little effect, even with differential pairs. The RF energy is capacitively coupled from the signal pair to the shield, and the shield itself does the radiating. One cure for this is to use a braid-breaker or choke to reduce the common-mode signal.

At higher frequencies, usually above 500 MHz, traces get electrically longer and higher above the plane. Two techniques are used at these frequencies: wave shaping with series resistors and embedding the traces between the two planes. If all these measures still leave too much EMI, shielding such as RF gaskets and copper tape can be used. Most digital equipment is designed with metal or conductive-coated plastic cases.

RF immunity and testing

Any unshielded semiconductor (e.g. an integrated circuit) will tend to act as a detector for the radio signals commonly found in the domestic environment (e.g. from mobile phones). [16] Such a detector can demodulate the high-frequency mobile phone carrier (e.g., GSM850 and GSM1900, GSM900 and GSM1800) and produce low-frequency (e.g., 217 Hz) demodulated signals. [17] This demodulation manifests itself as an unwanted audible buzz in audio appliances such as microphone amplifiers, speaker amplifiers, car radios, telephones, etc. Adding onboard EMI filters or using special layout techniques can help in bypassing EMI or improving RF immunity. [18] Some ICs are designed (e.g., LMV831-LMV834, [19] MAX9724 [20] ) to have integrated RF filters or a special design that helps reduce demodulation of the high-frequency carrier.
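For context on the 217 Hz figure, the sketch below derives it from the GSM TDMA frame timing (one transmitted burst per frame of about 4.615 ms); the frame duration is taken from the GSM specifications, not from the references cited here.

```python
# Where the ~217 Hz buzz comes from: a GSM handset transmits one burst per
# TDMA frame, so the RF envelope pulses at the frame rate and a rectifying
# junction demodulates that envelope into an audible tone. The frame duration
# (120/26 ms ≈ 4.615 ms) comes from the GSM specifications, not from this text.

frame_duration_s = (120 / 26) / 1000   # GSM TDMA frame ≈ 4.615 ms
burst_rate_hz = 1 / frame_duration_s

print(f"Burst repetition rate: {burst_rate_hz:.1f} Hz")   # ≈ 216.7 Hz, heard as a ~217 Hz buzz
```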

Designers often need to carry out special tests for the RF immunity of parts to be used in a system. These tests are often done in an anechoic chamber with a controlled RF environment, where the test vectors produce an RF field similar to that produced in an actual environment. [17]

Interference in radio astronomy, where it is commonly referred to as radio-frequency interference (RFI), is any source of transmission that is within the observed frequency band other than the celestial sources themselves. Because transmitters on and around the Earth can be many times stronger than the astronomical signal of interest, RFI is a major concern for performing radio astronomy. Natural sources of interference, such as lightning and the Sun, are also often referred to as RFI.

Some of the frequency bands that are very important for radio astronomy, such as the 21-cm HI line at 1420 MHz, are protected by regulation. This is called spectrum management. However, modern radio-astronomical observatories such as VLA, LOFAR, and ALMA have a very large bandwidth over which they can observe. Because of the limited spectral space at radio frequencies, these frequency bands cannot be completely allocated to radio astronomy. Therefore, observatories need to deal with RFI in their observations.

Techniques to deal with RFI range from filters in hardware to advanced algorithms in software. One way to deal with strong transmitters is to filter out the frequency of the source completely. This is for example the case for the LOFAR observatory, which filters out the FM radio stations between 90-110 MHz. It is important to remove such strong sources of interference as soon as possible, because they might "saturate" the highly sensitive receivers (amplifiers and analog-to-digital converters), which means that the received signal is stronger than the receiver can handle. However, filtering out a frequency band implies that these frequencies can never be observed with the instrument.

A common technique to deal with RFI within the observed frequency bandwidth is to employ RFI detection in software. Such software can find samples in time, frequency, or time-frequency space that are contaminated by an interfering source. These samples are subsequently ignored in further analysis of the observed data. This process is often referred to as data flagging. Because most transmitters have a small bandwidth and are not continuously present (such as lightning or citizens' band (CB) radio devices), most of the data remains available for the astronomical analysis. However, data flagging cannot solve issues with continuous broad-band transmitters, such as windmills, digital video, or digital audio transmitters.
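A minimal sketch of what such flagging can look like in software is shown below; it uses synthetic data and a simple robust threshold, whereas production flaggers (for example AOFlagger, used with LOFAR) apply far more sophisticated detectors.

```python
# Minimal threshold-based flagging on a synthetic time-frequency "waterfall".
# Real pipelines (e.g. AOFlagger) use much more sophisticated detectors, but
# the idea is the same: find contaminated samples and mask them.

import numpy as np

rng = np.random.default_rng(0)
data = rng.rayleigh(1.0, size=(1000, 256))   # noise amplitudes: time x frequency channel
data[300:305, 40] += 50                      # short burst in a single channel (e.g. CB radio)
data[::7, 120] += 20                         # intermittent narrow-band transmitter

# Flag samples far above a robust per-channel baseline (median + ~6 sigma via MAD).
median = np.median(data, axis=0)
mad = np.median(np.abs(data - median), axis=0)
flags = data > median + 6 * 1.4826 * mad

print(f"Flagged fraction of samples: {flags.mean():.3%}")
clean = np.ma.masked_array(data, mask=flags)  # flagged samples are ignored downstream
```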

Another way to manage RFI is to establish a radio quiet zone (RQZ). An RQZ is a well-defined area surrounding receivers that has special regulations to reduce RFI in favor of radio astronomy observations within the zone. The regulations may include special management of spectrum and limitations on power flux or power flux density. The controls within the zone may cover elements other than radio transmitters or radio devices, including aircraft controls and control of unintentional radiators such as industrial, scientific and medical devices, vehicles, and power lines. The first RQZ for radio astronomy is the United States National Radio Quiet Zone (NRQZ), established in 1958. [21]

Prior to the introduction of Wi-Fi, one of the biggest applications of the 5 GHz band was Terminal Doppler Weather Radar. [22] [23] The decision to use 5 GHz spectrum for Wi-Fi was finalized at the World Radiocommunication Conference in 2003; however, the meteorological community was not involved in the process. [24] [25] The subsequent lax implementation and misconfiguration of DFS caused significant disruption to weather radar operations in a number of countries around the world. In Hungary, the weather radar system was declared non-operational for more than a month. Due to the severity of the interference, the South African weather services ended up abandoning C-band operation, switching their radar network to S band. [23] [26]

Transmissions on bands adjacent to those used by passive remote sensing, such as weather satellites, have caused interference, sometimes significant. [27] There is concern that insufficiently regulated adoption of 5G could produce major interference issues. Significant interference can markedly impair numerical weather prediction performance and incur substantial negative economic and public safety impacts. [28] [29] [30] These concerns led US Secretary of Commerce Wilbur Ross and NASA Administrator Jim Bridenstine in February 2019 to urge the FCC to cancel a proposed spectrum auction; the request was rejected. [31]




Present

Refinements to the definition of the candela came with advances in radiometry (the measurement of optical radiation) and to accommodate easier methods of creating light that did not rely on a platinum “artifact,” but they continued to reference the amount of light a traditional candle would generate. In 1979, the General Conference on Weights and Measures (CGPM) adopted a new definition: “The candela (cd) is the luminous intensity, in a given direction, of a source that emits monochromatic radiation of frequency 540 × 10^12 hertz and that has a radiant intensity in that direction of 1/683 watt per steradian.” Technical though this language may be, it remains an attempt to match our modern understanding of the human eye with what we still perceive to be a candle’s brightness.

The 1979 definition narrowed the previous formulation in one notable respect: It specified one particular frequency of light, not the entire visible spectrum. This change acknowledges a particularity of human vision. Though we can see colors from red to violet, our eyes—in the daytime at least—are most sensitive to greenish-yellow light. The frequency 540 × 10^12 hertz is a hue of green.

Most lightbulbs are not green, however. For light of another color, or a broad spectrum of colors such as the light from a white LED bulb, the International Commission on Illumination (CIE) has created a method of accounting for the eye’s greater sensitivity to certain hues. Akin to the idea of placing weights on a scale, the method grants more influence to those colors the eye registers more strongly. Taking into account each color’s respective influence gives you the light source’s overall luminous intensity.

Speaking of intensity, there are also two distinct references to it in the definition: luminous intensity and radiant intensity. This distinction relates to human perception as well. While “radiant intensity” is the intensity of light without adjustments for human vision, luminous intensity is a “photometric” unit that compensates for human visual sensitivity.

To convey the idea of light radiating into three-dimensional space, scientists employ the idea of a solid angle. Where a two-dimensional flat angle carves out a piece of a circle looking like a slice of pie, a three-dimensional solid angle is a section of a sphere shaped like a cone. This three-dimensional angle stretching out from the center to the surface is measured in steradians, the unit of solid angle. If you had a sphere whose radius was one meter, a one-steradian section’s base would be a circle marking out an area of one square meter of its surface. (Regardless of its size, the solid angle of a sphere is 4π steradians.)

Using solid angles as part of the candela definition specifies light that shines in a particular direction. When you purchase a lightbulb, its brightness is given not in candelas but in lumens, which tells you how much light it puts out in total—telling you its brightness in all directions. (This makes sense because you want to figure out how well it will illuminate your entire room.) The candela, on the other hand, is a measure of how brightly the light source will appear to your eye if you look directly at it.

This understanding of the human vision has led to the creation of one of the seven defining constants for the SI base units. This constant, Kcd, is equal to 683 lumens per watt—a value that makes the modern candela roughly equivalent to the previously defined candela. Roughly speaking, 1/683 watt per steradian is the amount of power needed to generate a candle’s brightness. (This value of 1/683 watt per steradian is not tied to any particular fundamental physical effect separate from human perception, any more than the previous definition’s 1/600,000 square meter of a black body was. In both cases, scientists were literally eyeballing it.)
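To make the relations between watts, candelas, and lumens concrete, here is a small illustrative calculation for an idealised source of 540 THz light radiating uniformly in all directions; the numbers are for illustration only.

```python
# Illustrative numbers tying the quantities above together, assuming an
# idealised source of monochromatic 540 THz (green) light radiating uniformly
# in every direction. All values are for illustration only.

import math

K_cd = 683.0                                    # lm/W: luminous efficacy at 540e12 Hz (the constant Kcd)

radiant_intensity_w_per_sr = 1 / 683            # roughly "one candle's worth" in one direction
luminous_intensity_cd = K_cd * radiant_intensity_w_per_sr   # = 1 candela

# Total luminous flux if that intensity were uniform over a full sphere
# (solid angle of a sphere = 4*pi steradians):
total_flux_lm = luminous_intensity_cd * 4 * math.pi

print(f"Luminous intensity: {luminous_intensity_cd:.2f} cd")
print(f"Total flux of a uniform 1 cd source: {total_flux_lm:.1f} lm")   # about 12.6 lm
```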

All this attention on human sense is striking when Kcd is compared with the other fundamental constants. Many of the others—such as the speed of light in a vacuum or the charge of an electron—are properties of the universe. Kcd is universal in that it is green everywhere in the cosmos, but to an alien—or a creature from our planet that has different visual sensitivity—its selection as a definitive constant would seem highly arbitrary. It is simply convenient for human perception.

As of May 20, 2019, the definition of the candela has changed to a far more complicated wording than the 1979 definition. The actual value of the candela will not appreciably change, though because it is tied to the other SI units, which in some cases have been redefined themselves, there will be an imperceptible difference. The language of the candela’s definition is now primarily aimed at technical experts:

“The candela is defined by taking the fixed numerical value of the luminous efficacy of monochromatic radiation of frequency 540 × 10^12 Hz, Kcd, to be 683 when expressed in the unit lm⋅W^−1, which is equal to cd⋅sr⋅W^−1, or cd⋅sr⋅kg^−1⋅m^−2⋅s^3, where the kilogram, meter and second are defined in terms of h, c and ΔνCs.”

The candela is now linked closely to the other units and constants of the new SI. A one-candela light source, though, will still appear to the human eye to be as bright as a wax candle of yesteryear.


Indices of environmental temperatures for primates in open habitats

Studies of thermoregulation in primates are under-represented in the literature, although there is sufficient evidence to suggest that temperature represents an important ecological constraint. One of the problems in examining thermoregulation in primates, however, is the difficulty in quantifying the thermal environment, since shade temperatures, solar radiation, humidity and wind speed all serve to alter an animal's 'perceived' temperature. Since animals respond to their perceived temperature, we need methods to account for each of these factors, both individually and collectively, if we are to understand the integrated impact of the thermal environment on primates. Here, we present a review of some thermal indices currently available. Black bulb temperatures can account for the effect of solar radiation, with wind chill equivalent temperatures and the heat index providing quantifiable estimates of the relative impact of wind speed and humidity, respectively. We present three potential indices of the 'perceived environmental temperature' (PET) that account for the combined impact of solar radiation, humidity and wind speed on temperature, and perform a preliminary test of all of the climatic indices against behavioural data from a field study of chacma baboons ( Papio cynocephalus ursinus) at De Hoop Nature Reserve, South Africa. One measure of the perceived environmental temperature, PET2, is an effective thermal index, since it enters the models for feeding and resting behaviour, and also accounts for levels of allogrooming. Solar radiation intensity is an important factor underlying these relationships, although the wind chill equivalent temperature and humidity enter the models for other behaviours. Future studies should thus be mindful of the impact of each of these elements of the thermal environment. A detailed understanding of primate thermoregulation will only come with the development of biophysical models of the thermal characteristics of the species and its environment. Until such developments, however, the indices presented here should permit a more detailed examination of the thermal environment, allowing thermoregulation to be given greater precedence in future studies of primate behaviour.


Particles of information

Just as radiation scientists can conduct experiments to examine the effects of exposing a living cell to radioactive particles, communication scientists can conduct experiments to study the effects of exposing the human mind to “particles of information.” In one study, college students were given a pie chart depicting a person’s degree of exposure to radiation from eight sources (MacGregor et al., 2002a). The students found radon to be a larger source of exposure than they had expected, and industrial sources and nuclear medicine to be smaller exposures than expected. The students had different ideas about the meaning of “natural background radiation,” but after being tutored about diverse sources of radiation exposure and their relative contributions to a personal radiation “budget,” the students perceived less risk of radiation-induced harm in the form of cancer and birth defects. However, they still believed that human-caused exposures were much more likely to cause harm than natural background exposures.

This experiment was part of a larger pilot study that tutored participants in the basics of radiation science to test whether greater knowledge might lessen the gap between expert and lay perceptions. The study produced mixed results (MacGregor, 2002), but pre- and post-tutorial testing showed that educated laypersons could significantly increase their knowledge of radiation science. Increased knowledge led to increased concerns about radiation exposure from x-rays and other medical applications, air travel, cosmic radiation, natural background radiation, and radon. Risk perception decreased for hospital waste, nuclear waste, and nuclear power plants. Attitudes toward the adequacy of radiation risk-management policies were slightly more favorable after exposure to the tutorial. More studies of this nature should be conducted to determine the effects of education on radiation risk perceptions and attitudes.


Calculated Risks: How Radiation Rules Manned Mars Exploration

Nearly everything we know about the radiation exposure on a trip to Mars we have learned in the past 200 days.

For much longer, we have known that space is a risky place to be, radiation being one of many reasons. We believed that once our explorers safely landed on the surface of Mars, the planet would provide shielding from the ravages of radiation. We didn't know how much, or how little, until very recently. Radiation and its variations impact not only the planning of human and robotic missions, but also the search for life taking place right now.

The first-ever radiation readings from the surface of another planet were published last month in the journal Science. The take-home lesson, as well as the getting-there lesson and the staying-there lesson, is this: don’t forget to pack your shielding. [Mars Radiation Threat to Astronauts Explained (Infographic)]

"Radiation is the one environmental characteristic that we don’t have a lot of experience with on Earth because we’re protected by our magnetosphere and relatively thick atmosphere. But it’s a daily fact of life on Mars," said Don Hassler, the lead author on the paper, "Mars’ Surface Radiation Environment Measured with the Mars Science Laboratory’s Curiosity Rover."

Measuring radiation on Mars

On Earth, we often associate radiation exposure with fallout from catastrophes such as Chernobyl and Fukushima. We sometimes worry over CAT scans, chest X-rays or transcontinental flights. However, according to the Health Physics Society, the biggest source of radiation for most of us, by far, is inhaled radon. The sky above our heads and the earth beneath our feet are typically the least of our worries.

In open space, human beings continuously contend with intense solar and cosmic background radiation. Solar energetic particles (SEPs) and galactic cosmic rays (GCRs) turn a trip to Mars into a six-month radiation shower.

The Mars rover Curiosity has allowed us to finally calculate an average dose over the 180-day journey. It is approximately 300 mSv, the equivalent of 24 CAT scans. In just getting to Mars, an explorer would be exposed to more than 15 times the annual radiation limit for a worker in a nuclear power plant.
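A quick consistency check on those quoted figures is sketched below; the 20 mSv annual occupational limit used for comparison is the commonly cited ICRP value, assumed here rather than stated in the article.

```python
# Consistency check on the quoted figures. The 20 mSv/year occupational limit
# is the commonly cited ICRP figure and is assumed here; it is not stated in
# the article itself.

transit_dose_msv = 300.0          # approximate dose over the trip to Mars
transit_days = 180
worker_annual_limit_msv = 20.0    # assumed ICRP occupational dose limit

print(f"Average dose rate in transit: {transit_dose_msv / transit_days:.2f} mSv/day")            # ~1.67
print(f"Multiple of the annual worker limit: {transit_dose_msv / worker_annual_limit_msv:.0f}x")  # 15x
print(f"Implied dose per CAT scan: {transit_dose_msv / 24:.1f} mSv")                              # ~12.5
```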

Data from Curiosity also demonstrated that landing only partially solves the problem. Once on the Martian surface, cosmic radiation coming from the far side of the planet is blocked. This cuts down detected GCRs by half. The protection from strong solar particles, though, is shoddy and inconsistent. Substantial variations in SEPs occur as the meager Martian atmosphere is tussled by solar wind.

"The variability [in radiation levels] was much larger than expected," Hassler said. "[This creates] variability in weekly and monthly dose rates. There are also seasonal variations in radiation."

Study co-author Jennifer Eigenbrode, from the Goddard Institute of Space Studies, described how fluxes in radiation are critical in determining the possibility of life on the Red Planet.

"Radiation is probably the key parameter in determining how much alteration organics are experiencing in the rocks on the surface," Eigenbrode said.

Eigenbrode said this is because the most powerful particles in the air also penetrate the Martian soil. On impacting the surface, the GCRs and strong SEPs from space produce gamma rays and neutrons easily capable of breaking molecular bonds in the soil.

These events may have obliterated all evidence of life close to the surface. The new study estimates that finding intact organic molecules means digging deeper, down a meter or so, and digging for newer evidence, near impact sites where rock has spent less time exposed to the elements. [The Search for Life on Mars (A Photo Timeline)]

"If we find organics on Mars, the circumstance in which we find them [the context of the rocks], the history of the rocks, and the chemistry that we find, will help guide our mission strategy," Eigenbrode said.

Radiation levels measured by Curiosity have given us a better guide on how and where to look for former or current life. Future life, specifically the lives of our astronauts, also hinges upon these radiation measurements.

Fundamentally, "situational awareness is the strategy we have to use going forward," Hassler said. "We can design shelters on the surface to protect the astronauts."

Deep space, the place of greatest exposure, remains an issue.

"Perhaps one of the areas they would be most vulnerable would be during a spacewalk [on the way] to Mars."

Predicting space weather

In transit and on the planet, surviving space means predicting space weather. Space weather forecasting is a relatively new field, but one that's proving to be critical to all space missions.

Space weather prediction involves forecasting solar flares, coronal mass ejections, and geomagnetic storms. These highly energetic events emanate from the sun. When they cross the orbit of a planet, the same SEPs attacking organics can spell disaster for satellites, space stations, astronauts and the communication systems they all depend upon.

"To protect our satellites is becoming more and more important here on Earth," Hassler said.

Protecting satellites and people around Earth and Mars likely involves setting up two separate systems. Using Earth-based technologies to predict the radiation levels on Mars isn't the best choice. The distance and opposition of the planets compound the problem. When Mars is on the far side of the sun, it isn't even an option.

"When we send astronauts to Mars, we will have to do our own space weather monitoring from [Mars]," Hassler said.

From beneath the shelter of Earth’s ample atmosphere, we continue to receive daily updates from Curiosity. Its 3-pound Radiation Assessment Detector (RAD) instrument informs us about surface radiation events, particle type and relative frequencies. For now, RAD is the only way that we can study Martian radiation and make plans for the future.

In the future, what we’ve learned from RAD will be used to better look for life on the surface, to design suits and habitats, to plan extravehicular activities. Because of what we have learned, we can begin to establish weather prediction systems. We can tell explorers that there is an increased risk of cancer associated with a trip to Mars (approximately 5 percent over a lifetime).

In these ways, radiation rules the past, present and future of effective planetary exploration. Thanks to RAD measurements and the resulting analysis, we can begin to write a survival guide for life on Mars.


People In The U.S. And The U.K. Show Strong Similarities In Their Attitudes Toward Nanotechnologies

The results of a new U.S.–U.K. study published in the journal Nature Nanotechnology show that ordinary people in both countries hold very positive views of nanotechnologies and what the future of these technologies might bring. Participants in both countries indicated a significantly higher comfort level with energy applications of nanotechnologies than with applications used in health treatments.

Nanotechnology – the science and technology of exceptionally small materials and processes – is among the latest new technologies to raise public concerns about health and environmental risks.

The article reports on the first study of its kind. It involved four workshops, held at the same time in Santa Barbara and Cardiff, Wales. Workshop participants deliberated about two broad types of nanotechnology applications – energy and health.

The study was carried out in the United States by the NSF Center for Nanotechnology in Society at the University of California, Santa Barbara, and in the United Kingdom by a collaborating research team from the School of Psychology at Cardiff University.

Barbara Herr Harthorn, director of the UCSB Center, led the interdisciplinary, international research team. She noted that one of the unexpectedly strong findings of the study was that the type of nanotechnology mattered greatly to the participants. She said participants in both countries viewed energy applications of nanotechnology more positively than health technologies, in terms of risks and benefits.

"Much of the public perception research on nanotechnology in the U.S. and abroad has focused on a generic 'nanotechnology' risk object," said Harthorn. "This work moves to a higher level of specificity and in doing so finds striking differences in views of benefit depending on application context.

"More specifically, perceived urgency of need for new energy technologies is strongly associated with high perceived benefit and lower risk perception, regardless of what materials, processes, or environmental risks are associated," she said.

Nick Pidgeon, who led the research team at the School of Psychology at Cardiff University, explained, "The Royal Society's 2004 report on nanotechnologies recommended public engagement and deliberation on nanotechnology risks and benefits. This study represents the first ever such public engagement exercise to be simultaneously conducted in two different countries."

The results include the following key findings:

  • Overall participants in both countries focused on the benefits rather than the risks of nanotechnologies, and also exhibited a high degree of optimism regarding the future contribution of new technologies to society. This pattern was very similar in the workshops in both the United States and Britain.
  • Some small cross-country differences were present. U.K. participants were generally more aware of recent technological controversies and risk governance failures (examples include genetically modified organisms, bovine spongiform encephalopathy (BSE), and foot and mouth disease), leading some to voice specific concerns about future nanotechnology risks.
  • Greater differences were observed when participants (irrespective of their country) discussed the different applications. In particular, new technology developments for energy applications were seen as unproblematic, while questions of human health were felt to raise moral and ethical dilemmas. As was found by the U.K. Royal Society in 2004 for Britain, in the current study participants in both the U.K. and U.S. questioned whether those responsible (governments, industry, scientists) could be fully trusted to control nanotechnologies in the future.

The research was funded primarily by the National Science Foundation with additional support to Cardiff University provided by the Leverhulme Trust.

