Refractors use the length of the telescope only once, reflectors twice, and catadioptric telescopes like those of the Schmidt-Cassegrain design three times. Have telescopes been built that reflect the incoming light at least once more from one end of the telescope to the other?
Say, by adding a tertiary mirror which reflects the light forward again, towards a small quaternary mirror in front of the secondary, which ultimately directs the light through a small hole in the tertiary (and primary) mirror towards the eyepiece?
Note: The above description makes this question different from the similar-sounding Do triple or more times reflecting telescopes exist?
If you're willing to accept more than 2 discontinuous mirrors, the Three Mirror Anastigmat has 4 passes along some or most of the overall tube length. An early working prototype example (which I've actually seen in person many years ago) was built at the University of Cambridge by Dr. Roderick Willstrop. The Institute of Astronomy has a page on the Three Mirror Telescope which includes the following optical diagram:
The 0.5 meter 3 Mirror Telescope (3MT) has a focal length of 0.8 meters, in an overall tube length of 1.2 meters producing a very compact telescope with a large field of view (5 degrees in diameter) with good image quality (<0.33" across the whole field of view). The quality of the site at Cambridge was not ideal for building on the progress of the prototype and a proposal for building a larger version at a better site was declined by the UK science funding agency of the time (SERC, which became PPARC and then STFC) in favor of buying into the Gemini collaboration of two 8 meter telescopes.
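As a rough sanity check on the quoted 3MT numbers, the focal ratio and the approximate linear size of the 5-degree field at the focal surface follow directly from the aperture and focal length. This is only a sketch: the 3MT's focal surface is actually curved, so the flat-field figure computed here is approximate.

```python
import math

# Quoted 3MT parameters from the text above
aperture_m = 0.5       # primary aperture
focal_length_m = 0.8   # stated focal length
field_deg = 5.0        # field-of-view diameter

focal_ratio = focal_length_m / aperture_m  # a very "fast" f/1.6 system

# Approximate linear diameter of the 5-degree field at the focal surface
# (treated as flat here; the real 3MT focal surface is curved)
field_mm = 2 * focal_length_m * 1000 * math.tan(math.radians(field_deg / 2))

print(f"f/{focal_ratio:.1f}, field ~{field_mm:.0f} mm across")
```

The f/1.6 focal ratio is what makes the telescope so compact relative to its aperture.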
Some completed or under-construction examples of three mirror anastigmats include:
- James Webb Space Telescope (see Figure 2 of this page on the JWST Telescope)
- The Simonyi Survey Telescope of the Vera C. Rubin Observatory which will carry out the Legacy Survey of Space and Time (LSST) has M1 and M3 ground into the same piece of glass (see Figure 7 of the LSST overview paper) with a separate secondary M2 mirror and prime focus camera
- The planned ESO Extremely Large Telescope will use 5 mirrors in a folded three mirror anastigmat design along with 2 flat mirrors, giving 5 bounces along all or part of the overall "tube" structure before the light exits to a Nasmyth focus; see the (not very clear) diagram from the ESO ELT optical diagram page, and a somewhat better diagram in Figure 4 of Hippler 2018.
A telescopic sight, commonly called a scope for short, is an optical sighting device based on a refracting telescope.  It is equipped with some form of a referencing pattern – known as a reticle – mounted in a focally appropriate position in its optical system to provide an accurate point of aim. Telescopic sights are used with all types of systems that require magnification in addition to reliable visual aiming, as opposed to non-magnifying iron sights, reflector (reflex) sights, holographic sights or laser sights, and are most commonly found on firearms, particularly rifles, usually via a scope mount. The optical components may be combined with optoelectronics to form a digital night scope or a "smart scope".
Operation of a telescope
The primary function of a telescope is radiation gathering, in many cases light gathering. As will be seen below, resolution limits alone would not call for an aperture much larger than about 30 in (76 cm). However, there are many telescopes around the world with diameters several times this value. The reason is that larger telescopes can detect more distant objects because they collect more light. The 200 in (508 cm) diameter reflecting telescope at Mt. Palomar, California, for instance, can gather 25 times more light than the 40 in (102 cm) Yerkes telescope at Williams Bay, Wisconsin, the largest refracting telescope in the world. The light-gathering power grows as the area of the objective increases, or as the square of its diameter if it is circular. The more light a telescope can gather, the more distant the objects it can detect, and therefore larger telescopes increase the size of the observable universe.
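The 25× figure quoted above follows directly from the square-of-diameter rule; a minimal illustration:

```python
def light_gathering_ratio(d_large, d_small):
    """Ratio of light-gathering power of two circular apertures.

    Power scales with collecting area, which scales with diameter squared.
    """
    return (d_large / d_small) ** 2

# Palomar's 200-inch reflector vs the 40-inch Yerkes refractor
print(light_gathering_ratio(200, 40))  # -> 25.0
```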
Spitzer telescope reveals the precise timing of a black hole dance
Black holes aren't stationary in space; in fact, they can be quite active in their movements. But because they are completely dark and can't be observed directly, they're not easy to study. Scientists have finally figured out the precise timing of a complicated dance between two enormous black holes, revealing hidden details about the physical characteristics of these mysterious cosmic objects.
The OJ 287 galaxy hosts one of the largest black holes ever found, with over 18 billion times the mass of our Sun. Orbiting this behemoth is another black hole with about 150 million times the Sun's mass. Twice every 12 years, the smaller black hole crashes through the enormous disk of gas surrounding its larger companion, creating a flash of light brighter than a trillion stars -- brighter, even, than the entire Milky Way galaxy. The light takes 3.5 billion years to reach Earth.
But the smaller black hole's orbit is oblong, not circular, and it's irregular: It shifts position with each loop around the bigger black hole and is tilted relative to the disk of gas. When the smaller black hole crashes through the disk, it creates two expanding bubbles of hot gas that move away from the disk in opposite directions, and in less than 48 hours the system appears to quadruple in brightness.
Because of the irregular orbit, the black hole collides with the disk at different times during each 12-year orbit. Sometimes the flares appear as little as one year apart; other times, as much as 10 years apart. Attempts to model the orbit and predict when the flares would occur took decades, but in 2010, scientists created a model that could predict their occurrence to within about one to three weeks. They demonstrated that their model was correct by predicting the appearance of a flare in December 2015 to within three weeks.
Then, in 2018, a group of scientists led by Lankeswar Dey, a graduate student at the Tata Institute of Fundamental Research in Mumbai, India, published a paper with an even more detailed model they claimed would be able to predict the timing of future flares to within four hours. In a new study published in the Astrophysical Journal Letters, those scientists report that their accurate prediction of a flare that occurred on July 31, 2019, confirms the model is correct.
The observation of that flare almost didn't happen. Because OJ 287 was on the opposite side of the Sun from Earth, out of view of all telescopes on the ground and in Earth orbit, the black hole wouldn't come back into view of those telescopes until early September, long after the flare had faded. But the system was within view of NASA's Spitzer Space Telescope, which the agency retired in January 2020.
After 16 years of operations, the spacecraft's orbit had placed it 158 million miles (254 million kilometers) from Earth, or more than 600 times the distance between Earth and the Moon. From this vantage point, Spitzer could observe the system from July 31 (the same day the flare was expected to appear) to early September, when OJ 287 would become observable to telescopes on Earth.
"When I first checked the visibility of OJ 287, I was shocked to find that it became visible to Spitzer right on the day when the next flare was predicted to occur," said Seppo Laine, an associate staff scientist at Caltech/IPAC in Pasadena, California, who oversaw Spitzer's observations of the system. "It was extremely fortunate that we would be able to capture the peak of this flare with Spitzer, because no other human-made instruments were capable of achieving this feat at that specific point in time."
Ripples in Space
Scientists regularly model the orbits of small objects in our solar system, like a comet looping around the Sun, taking into account the factors that will most significantly influence their motion. For that comet, the Sun's gravity is usually the dominant force, but the gravitational pull of nearby planets can change its path, too.
Determining the motion of two enormous black holes is much more complex. Scientists must account for factors that might not noticeably impact smaller objects; chief among them is something called gravitational waves. Einstein's theory of general relativity describes gravity as the warping of space by an object's mass. When an object moves through space, the distortions propagate outward as waves. Einstein predicted the existence of gravitational waves in 1916, but they weren't observed directly until 2015, by the Laser Interferometer Gravitational-Wave Observatory (LIGO).
The larger an object's mass, the larger and more energetic the gravitational waves it creates. In the OJ 287 system, scientists expect the gravitational waves to be so large that they can carry enough energy away from the system to measurably alter the smaller black hole's orbit -- and therefore timing of the flares.
While previous studies of OJ 287 have accounted for gravitational waves, the 2018 model is the most detailed yet. By incorporating information gathered from LIGO's detections of gravitational waves, it refines the window in which a flare is expected to occur to just 1 1/2 days.
To further refine the prediction of the flares to just four hours, the scientists folded in details about the larger black hole's physical characteristics. Specifically, the new model incorporates something called the "no-hair" theorem of black holes.
Published in the 1960s by a group of physicists that included Stephen Hawking, the theorem makes a prediction about the nature of black hole "surfaces." While black holes don't have true surfaces, scientists know there is a boundary around them beyond which nothing -- not even light -- can escape. Some ideas posit that the outer edge, called the event horizon, could be bumpy or irregular, but the no-hair theorem posits that the "surface" has no such features, not even hair (the theorem's name was a joke).
In other words, if one were to cut the black hole down the middle along its rotational axis, the surface would be symmetric. (The Earth's rotational axis is almost perfectly aligned with its North and South Poles. If you cut the planet in half along that axis and compared the two halves, you would find that our planet is mostly symmetric, though features like oceans and mountains create some small variations between the halves.)
In the 1970s, Caltech professor emeritus Kip Thorne described how this scenario -- a satellite orbiting a massive black hole -- could potentially reveal whether the black hole's surface was smooth or bumpy. By correctly anticipating the smaller black hole's orbit with such precision, the new model supports the no-hair theorem, meaning our basic understanding of these incredibly strange cosmic objects is correct. The OJ 287 system, in other words, supports the idea that black hole surfaces are symmetric along their rotational axes.
So how does the smoothness of the massive black hole's surface impact the timing of the smaller black hole's orbit? That orbit is determined mostly by the mass of the larger black hole. If it grew more massive or shed some of its heft, that would change the size of the smaller black hole's orbit. But the distribution of mass matters as well. A massive bulge on one side of the larger black hole would distort the space around it differently than if the black hole were symmetric. That would then alter the smaller black hole's path as it orbits its companion and measurably change the timing of the black hole's collision with the disk on that particular orbit.
"It is important to black hole scientists that we prove or disprove the no-hair theorem. Without it, we cannot trust that black holes as envisaged by Hawking and others exist at all," said Mauri Valtonen, an astrophysicist at University of Turku in Finland and a coauthor on the paper.
Spitzer science data continues to be analyzed by the science community via the Spitzer data archive located at the Infrared Science Archive housed at IPAC at Caltech in Pasadena. JPL managed Spitzer mission operations for NASA's Science Mission Directorate in Washington. Science operations were conducted at the Spitzer Science Center at IPAC at Caltech. Spacecraft operations were based at Lockheed Martin Space in Littleton, Colorado. Caltech manages JPL for NASA.
An instrument used to collect, measure, or analyze electromagnetic radiation from distant objects. A telescope overcomes the limitations of the eye by increasing the ability to see faint objects and discern fine details. In addition, when used in conjunction with modern detectors, a telescope can “see” light that is otherwise invisible. The wavelength of the light of interest can have a profound effect on the design of a telescope. See Electromagnetic radiation, Light
For many applications, the Earth's atmosphere limits the effectiveness of larger telescopes. The most obvious deleterious effect is image scintillation and motion, collectively known as poor seeing. Atmospheric turbulence produces an extremely rapid motion of the image, resulting in smearing. On the very best nights at ideal observing sites, the image of a star will be spread out over a 0.25-arcsecond seeing disk; on an average night, the seeing disk may be between 0.5 and 2.0 arcseconds.
The upper atmosphere glows faintly because of the constant influx of charged particles from the Sun. The combination of the finite size of the seeing disk of stars and the presence of airglow limits the telescope's ability to see faint objects. One solution is placing a large telescope in orbit above the atmosphere. In practice, the effects of air and light pollution outweigh those of airglow at most observatories in the United States.
There are basically three types of optical systems in use in astronomical telescopes: refracting systems, whose main optical elements are lenses that focus light by refraction; reflecting systems, whose main imaging elements are mirrors that focus light by reflection; and catadioptric systems, whose main elements are a combination of a lens and a mirror. The most notable example of the last type is the Schmidt camera.
Astronomers seldom use large telescopes for visual observations. Instead, they record their data for future study. Modern developments in photoelectric imaging devices are supplanting photographic techniques for many applications. The great advantages of detectors such as charge-coupled devices are their high sensitivity and the fact that their images can be read out in a computer-compatible format for immediate analysis.
Light received from most astronomical objects is made up of radiation of all wavelengths. The spectral characteristics of this radiation may be extracted by special instruments called spectrographs.
As collectors of radiation from a specific direction, telescopes may be classified as focusing and nonfocusing. Nonfocusing telescopes are used for radiation with energies of x-rays and above (x-ray, gamma-ray, cosmic-ray, and neutrino telescopes). Focusing telescopes, intended for nonvisible wavelengths, are similar to optical ones (solar, radio, infrared, and ultraviolet telescopes), but they differ in the details of construction. See Cerenkov radiation
The 5-m (200-in.) Hale telescope at Palomar Mountain, California, was completed in 1950. The primary mirror is 5 m in diameter with a 1.02-m (40-in.) hole in the center.
The 4-m (158-in.) Mayall reflector at the Kitt Peak National Observatory was dedicated in 1973. The prime focus has a field of view six times greater than that of the Hale reflector. An identical telescope was subsequently installed at Cerro Tololo Inter-American Observatory, in Chile.
The mirrors for these traditional large telescopes were all produced using the same general methodology. A large, thick glass mirror blank was first cast; then the top surface of the mirror was laboriously ground and polished to the requisite shape. The practical and economical limit to the size of traditional mirror designs was nearly reached by the 6-m (236-in.) telescope in the Caucasus Mountains, Russia. Newer telescopes have been designed and built that use either a number of mirrors mounted such that the light collected by them is brought to a common focus, or lightweight mirrors in computer-controlled mounts.
The Keck Telescope on Mauna Kea, Hawaii, completed in 1993, is the largest of the segmented mirror telescopes to be put into operation. The telescope itself is a fairly traditional design. However, its primary mirror is made up of 36 individual hexagonal segments mosaiced together to form a single 10-m (386-in.) mirror. Electronic sensors built into the edges of the segments monitor the relative positions of the segments, and feed the results to a computer-controlled actuator system.
In 1989, the European Southern Observatory put into operation their New Technology Telescope. The 3.58-m (141-in.) mirror was produced by a technique known as spin-casting, where molten glass is poured into a rotating mold.
Worldwide efforts are under way on a new generation of large, ground-based telescopes, using both the spin-casting method and the segmented method to produce large mirrors. The Gemini project of the National Optical Astronomy Observatories included twin 8.1-m (319-in.) telescopes, Gemini North (see illustration) on Mauna Kea, Hawaii, and Gemini South on Cerro Pachon in Chile.
The Very Large Telescope (VLT), operated by the European Southern Observatory on Cerro Paranal, Chile, consists of four 8-m (315-in.) “unit” telescopes with spin-cast mirrors. The light from the four telescopes can be combined to give the equivalent light-gathering power of a 16-m (630-in.) telescope. The last of the four telescopes began collecting scientific data in September 2000.
The ability of large telescopes to resolve fine detail is limited by a number of factors. Distortion due to the mirror's own weight causes problems in addition to those of atmospheric seeing. The Earth-orbiting Hubble Space Telescope (HST), with an aperture of 2.4 m (94 in.), was designed to eliminate these problems. The telescope operates in ultraviolet as well as visible light, resulting in a great improvement in resolution not only by the elimination of the aforementioned terrestrial effects but by the reduced blurring by diffraction in the ultraviolet. See Diffraction, Resolving power (optics)
Soon after the telescope was launched in 1990, it was discovered that the optical system was plagued with spherical aberration, which severely limited its spatial resolution. After space-shuttle astronauts serviced and repaired the telescope in 1993, adding what amounted to eyeglasses for the scientific instruments, the telescope exceeded its prelaunch specifications for spatial resolution. Subsequent servicing missions replaced instruments with newer technology.
There are three basic components of a modern system for measuring radiation from astronomical sources. First, there is a telescope, which serves as a “bucket” for collecting visible light (or radiation at other wavelengths, as shown in Figure 1). Just as you can catch more rain with a garbage can than with a coffee cup, large telescopes gather much more light than your eye can. Second, there is an instrument attached to the telescope that sorts the incoming radiation by wavelength. Sometimes the sorting is fairly crude. For example, we might simply want to separate blue light from red light so that we can determine the temperature of a star. But at other times, we want to see individual spectral lines to determine what an object is made of, or to measure its speed (as explained in the Radiation and Spectra chapter). Third, we need some type of detector, a device that senses the radiation in the wavelength regions we have chosen and permanently records the observations.
Orion Region at Different Wavelengths. Figure 1. The same part of the sky looks different when observed with instruments that are sensitive to different bands of the spectrum. (a) Visible light: this shows part of the Orion region as the human eye sees it, with dotted lines added to show the figure of the mythical hunter, Orion. (b) X-rays: here, the view emphasizes the point-like X-ray sources nearby. The colors are artificial, changing from yellow to white to blue with increasing energy of the X-rays. The bright, hot stars in Orion are still seen in this image, but so are many other objects located at very different distances, including other stars, star corpses, and galaxies at the edge of the observable universe. (c) Infrared radiation: here, we mainly see the glowing dust in this region. (credit a: modification of work by Howard McCallon/NASA/IRAS; credit b: modification of work by Howard McCallon/NASA/IRAS; credit c: modification of work by Michael F. Corcoran)
The history of the development of astronomical telescopes is about how new technologies have been applied to improve the efficiency of these three basic components: the telescopes, the wavelength-sorting device, and the detectors. Let’s first look at the development of the telescope.
Many ancient cultures built special sites for observing the sky (Figure 2). At these ancient observatories, they could measure the positions of celestial objects, mostly to keep track of time and date. Many of these ancient observatories had religious and ritual functions as well. The eye was the only device available to gather light, all of the colors in the light were observed at once, and the only permanent record of the observations was made by human beings writing down or sketching what they saw.
Two Pre-Telescopic Observatories. Figure 2. (a) Machu Picchu is a fifteenth century Incan site located in Peru. (b) Stonehenge, a prehistoric site (3000–2000 BCE), is located in England. (credit a: modification of work by Allard Schmidt)
While Hans Lippershey, Zaccharias Janssen, and Jacob Metius are all credited with the invention of the telescope around 1608—applying for patents within weeks of each other—it was Galileo who, in 1610, used this simple tube with lenses (which he called a spyglass) to observe the sky and gather more light than his eyes alone could. Even his small telescope—used over many nights—revolutionized ideas about the nature of the planets and the position of Earth.
How Telescopes Work
Telescopes have come a long way since Galileo’s time. Now they tend to be huge devices; the most expensive cost hundreds of millions to billions of dollars. (To provide some reference point, however, keep in mind that just renovating college football stadiums typically costs hundreds of millions of dollars—with the most expensive recent renovation, at Texas A&M University’s Kyle Field, costing $450 million.) The reason astronomers keep building bigger and bigger telescopes is that celestial objects—such as planets, stars, and galaxies—send much more light to Earth than any human eye (with its tiny opening) can catch, and bigger telescopes can detect fainter objects. If you have ever watched the stars with a group of friends, you know that there’s plenty of starlight to go around; each of you can see each of the stars. If a thousand more people were watching, each of them would also catch a bit of each star’s light. Yet, as far as you are concerned, the light not shining into your eye is wasted. It would be great if some of this “wasted” light could also be captured and brought to your eye. This is precisely what a telescope does.
The most important functions of a telescope are (1) to collect the faint light from an astronomical source and (2) to focus all the light into a point or an image. Most objects of interest to astronomers are extremely faint: the more light we can collect, the better we can study such objects. (And remember, even though we are focusing on visible light first, there are many telescopes that collect other kinds of electromagnetic radiation.)
Telescopes that collect visible radiation use a lens or mirror to gather the light. Other types of telescopes may use collecting devices that look very different from the lenses and mirrors with which we are familiar, but they serve the same function. In all types of telescopes, the light-gathering ability is determined by the area of the device acting as the light-gathering “bucket.” Since most telescopes have mirrors or lenses, we can compare their light-gathering power by comparing the apertures, or diameters, of the opening through which light travels or reflects.
The amount of light a telescope can collect increases with the size of the aperture. A telescope with a mirror that is 4 meters in diameter can collect 16 times as much light as a telescope that is 1 meter in diameter. (The diameter is squared because the area of a circle equals πd²/4, where d is the diameter of the circle.)
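The same arithmetic in a couple of lines, using the πd²/4 area formula:

```python
import math

def aperture_area(diameter):
    """Area of a circular aperture of the given diameter: pi * d^2 / 4."""
    return math.pi * diameter ** 2 / 4

# 4 m mirror vs 1 m mirror: ratio of collecting areas
ratio = aperture_area(4.0) / aperture_area(1.0)
print(ratio)  # -> 16.0
```

The π/4 factor cancels in the ratio, which is why only the squared diameters matter.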
Proposals and precursors
In 1923, Hermann Oberth — considered a father of modern rocketry, along with Robert H. Goddard and Konstantin Tsiolkovsky — published Die Rakete zu den Planetenräumen ("The Rocket into Planetary Space"), which mentioned how a telescope could be propelled into Earth orbit by a rocket. 
The history of the Hubble Space Telescope can be traced back as far as 1946, to astronomer Lyman Spitzer's paper entitled "Astronomical advantages of an extraterrestrial observatory".  In it, he discussed the two main advantages that a space-based observatory would have over ground-based telescopes. First, the angular resolution (the smallest separation at which objects can be clearly distinguished) would be limited only by diffraction, rather than by the turbulence in the atmosphere, which causes stars to twinkle, known to astronomers as seeing. At that time ground-based telescopes were limited to resolutions of 0.5–1.0 arcseconds, compared to a theoretical diffraction-limited resolution of about 0.05 arcsec for an optical telescope with a mirror 2.5 m (8 ft 2 in) in diameter. Second, a space-based telescope could observe infrared and ultraviolet light, which are strongly absorbed by the atmosphere of Earth.
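The quoted figure of about 0.05 arcsec can be reproduced from the Rayleigh criterion, θ = 1.22 λ/D. The 550 nm wavelength below is an assumed representative visible-light value, not one stated in the text:

```python
import math

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion: theta = 1.22 * lambda / D, converted to arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

# 2.5 m mirror observing at 550 nm (assumed mid-visible wavelength)
print(round(diffraction_limit_arcsec(550e-9, 2.5), 3))  # ~0.055 arcsec
```

Compare this with the 0.5–1.0 arcsecond seeing limit of ground-based telescopes mentioned above: a ten-fold or better improvement.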
Spitzer devoted much of his career to pushing for the development of a space telescope. In 1962, a report by the U.S. National Academy of Sciences recommended development of a space telescope as part of the space program, and in 1965 Spitzer was appointed as head of a committee given the task of defining scientific objectives for a large space telescope. 
Space-based astronomy had begun on a very small scale following World War II, as scientists made use of developments that had taken place in rocket technology. The first ultraviolet spectrum of the Sun was obtained in 1946,  and the National Aeronautics and Space Administration (NASA) launched the Orbiting Solar Observatory (OSO) to obtain UV, X-ray, and gamma-ray spectra in 1962.  An orbiting solar telescope was launched in 1962 by the United Kingdom as part of the Ariel space program, and in 1966 NASA launched the first Orbiting Astronomical Observatory (OAO) mission. OAO-1's battery failed after three days, terminating the mission. It was followed by Orbiting Astronomical Observatory 2 (OAO-2), which carried out ultraviolet observations of stars and galaxies from its launch in 1968 until 1972, well beyond its original planned lifetime of one year. 
The OSO and OAO missions demonstrated the important role space-based observations could play in astronomy. In 1968, NASA developed firm plans for a space-based reflecting telescope with a mirror 3 m (9.8 ft) in diameter, known provisionally as the Large Orbiting Telescope or Large Space Telescope (LST), with a launch slated for 1979. These plans emphasized the need for crewed maintenance missions to the telescope to ensure such a costly program had a lengthy working life, and the concurrent development of plans for the reusable Space Shuttle indicated that the technology to allow this was soon to become available. 
Quest for funding
The continuing success of the OAO program encouraged increasingly strong consensus within the astronomical community that the LST should be a major goal. In 1970, NASA established two committees, one to plan the engineering side of the space telescope project, and the other to determine the scientific goals of the mission. Once these had been established, the next hurdle for NASA was to obtain funding for the instrument, which would be far more costly than any Earth-based telescope. The U.S. Congress questioned many aspects of the proposed budget for the telescope and forced cuts in the budget for the planning stages, which at the time consisted of very detailed studies of potential instruments and hardware for the telescope. In 1974, public spending cuts led to Congress deleting all funding for the telescope project. 
In response a nationwide lobbying effort was coordinated among astronomers. Many astronomers met congressmen and senators in person, and large scale letter-writing campaigns were organized. The National Academy of Sciences published a report emphasizing the need for a space telescope, and eventually the Senate agreed to half the budget that had originally been approved by Congress. 
The funding issues led to something of a reduction in the scale of the project, with the proposed mirror diameter reduced from 3 m to 2.4 m, both to cut costs  and to allow a more compact and effective configuration for the telescope hardware. A proposed precursor 1.5 m (4 ft 11 in) space telescope to test the systems to be used on the main satellite was dropped, and budgetary concerns also prompted collaboration with the European Space Agency (ESA). ESA agreed to provide funding and supply one of the first generation instruments for the telescope, as well as the solar cells that would power it, and staff to work on the telescope in the United States, in return for European astronomers being guaranteed at least 15% of the observing time on the telescope.  Congress eventually approved funding of US$36 million for 1978, and the design of the LST began in earnest, aiming for a launch date of 1983.  In 1983, the telescope was named after Edwin Hubble,  who confirmed one of the greatest scientific discoveries of the 20th century, made by Georges Lemaître, that the universe is expanding. 
Construction and engineering
Once the Space Telescope project had been given the go-ahead, work on the program was divided among many institutions. Marshall Space Flight Center (MSFC) was given responsibility for the design, development, and construction of the telescope, while Goddard Space Flight Center was given overall control of the scientific instruments and ground-control center for the mission.  MSFC commissioned the optics company Perkin-Elmer to design and build the Optical Telescope Assembly (OTA) and Fine Guidance Sensors for the space telescope. Lockheed was commissioned to construct and integrate the spacecraft in which the telescope would be housed. 
Optical Telescope Assembly
Optically, the HST is a Cassegrain reflector of Ritchey–Chrétien design, as are most large professional telescopes. This design, with two hyperbolic mirrors, is known for good imaging performance over a wide field of view, with the disadvantage that the mirrors have shapes that are hard to fabricate and test. The mirror and optical systems of the telescope determine the final performance, and they were designed to exacting specifications. Optical telescopes typically have mirrors polished to an accuracy of about a tenth of the wavelength of visible light, but the Space Telescope was to be used for observations from the visible through the ultraviolet (shorter wavelengths) and was specified to be diffraction limited to take full advantage of the space environment. Therefore, its mirror needed to be polished to an accuracy of 10 nanometers, or about 1/65 of the wavelength of red light.  On the long wavelength end, the OTA was not designed with optimum IR performance in mind—for example, the mirrors are kept at stable (and warm, about 15 °C) temperatures by heaters. This limits Hubble's performance as an infrared telescope. 
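The "1/65 of the wavelength of red light" figure is simple arithmetic; 650 nm is an assumed value for red light here, chosen so the numbers work out as the text states:

```python
red_wavelength_nm = 650.0   # assumed wavelength of red light
hubble_error_nm = 10.0      # Hubble's specified surface accuracy

hubble_fraction = red_wavelength_nm / hubble_error_nm  # -> lambda/65
conventional_fraction = 10.0                           # typical mirrors: ~lambda/10

print(f"Hubble: lambda/{hubble_fraction:.0f}, conventional: lambda/{conventional_fraction:.0f}")
```

In other words, the specification was roughly six times tighter than for a conventional large telescope mirror.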
Perkin-Elmer intended to use custom-built and extremely sophisticated computer-controlled polishing machines to grind the mirror to the required shape.  However, in case their cutting-edge technology ran into difficulties, NASA demanded that PE sub-contract to Kodak to construct a back-up mirror using traditional mirror-polishing techniques.  (The team of Kodak and Itek also bid on the original mirror polishing work. Their bid called for the two companies to double-check each other's work, which would have almost certainly caught the polishing error that later caused such problems.)  The Kodak mirror is now on permanent display at the National Air and Space Museum.   An Itek mirror built as part of the effort is now used in the 2.4 m telescope at the Magdalena Ridge Observatory. 
Construction of the Perkin-Elmer mirror began in 1979, starting with a blank manufactured by Corning from their ultra-low expansion glass. To keep the mirror's weight to a minimum it consisted of top and bottom plates, each 25 mm (0.98 in) thick, sandwiching a honeycomb lattice. Perkin-Elmer simulated microgravity by supporting the mirror from the back with 130 rods that exerted varying amounts of force. This ensured the mirror's final shape would be correct and to specification when finally deployed. Mirror polishing continued until May 1981. NASA reports at the time questioned Perkin-Elmer's managerial structure, and the polishing began to slip behind schedule and over budget. To save money, NASA halted work on the back-up mirror and put the launch date of the telescope back to October 1984. The mirror was completed by the end of 1981; it was washed using 9,100 L (2,000 imp gal; 2,400 US gal) of hot, deionized water and then received a reflective coating of 65 nm-thick aluminum and a protective coating of 25 nm-thick magnesium fluoride.
Doubts continued to be expressed about Perkin-Elmer's competence on a project of this importance, as their budget and timescale for producing the rest of the OTA continued to inflate. In response to a schedule described as "unsettled and changing daily", NASA postponed the launch date of the telescope until April 1985. Perkin-Elmer's schedules continued to slip at a rate of about one month per quarter, and at times delays reached one day for each day of work. NASA was forced to postpone the launch date until March and then September 1986. By this time, the total project budget had risen to US$1.175 billion. 
Spacecraft systems
The spacecraft in which the telescope and instruments were to be housed was another major engineering challenge. It would have to withstand frequent passages from direct sunlight into the darkness of Earth's shadow, which would cause major changes in temperature, while being stable enough to allow extremely accurate pointing of the telescope. A shroud of multi-layer insulation keeps the temperature within the telescope stable and surrounds a light aluminum shell in which the telescope and instruments sit. Within the shell, a graphite-epoxy frame keeps the working parts of the telescope firmly aligned.  Because graphite composites are hygroscopic, there was a risk that water vapor absorbed by the truss while in Lockheed's clean room would later be expressed in the vacuum of space resulting in the telescope's instruments being covered by ice. To reduce that risk, a nitrogen gas purge was performed before launching the telescope into space. 
While construction of the spacecraft in which the telescope and instruments would be housed proceeded somewhat more smoothly than the construction of the OTA, Lockheed still experienced some budget and schedule slippage, and by the summer of 1985, construction of the spacecraft was 30% over budget and three months behind schedule. An MSFC report said Lockheed tended to rely on NASA directions rather than take their own initiative in the construction. 
Computer systems and data processing
The two initial, primary computers on the HST were the 1.25 MHz DF-224 system, built by Rockwell Autonetics, which contained three redundant CPUs, and two redundant NSSC-1 (NASA Standard Spacecraft Computer, Model 1) systems, developed by Westinghouse and GSFC using diode–transistor logic (DTL). A co-processor for the DF-224 was added during Servicing Mission 1 in 1993, which consisted of two redundant strings of an Intel-based 80386 processor with an 80387 math co-processor.  The DF-224 and its 386 co-processor were replaced by a 25 MHz Intel-based 80486 processor system during Servicing Mission 3A in 1999.  The new computer is 20 times faster, with six times more memory, than the DF-224 it replaced. It increases throughput by moving some computing tasks from the ground to the spacecraft and saves money by allowing the use of modern programming languages. 
Additionally, some of the science instruments and components had their own embedded microprocessor-based control systems. The MATs (Multiple Access Transponder) components, MAT-1 and MAT-2, utilize Hughes Aircraft CDP1802CD microprocessors.  The Wide Field and Planetary Camera (WFPC) also utilized an RCA 1802 microprocessor (or possibly the older 1801 version).  The WFPC-1 was replaced by the WFPC-2 during Servicing Mission 1 in 1993, which was then replaced by the Wide Field Camera 3 (WFC3) during Servicing Mission 4 in 2009.
Initial instruments
When launched, the HST carried five scientific instruments: the Wide Field and Planetary Camera (WF/PC), Goddard High Resolution Spectrograph (GHRS), High Speed Photometer (HSP), Faint Object Camera (FOC) and the Faint Object Spectrograph (FOS). WF/PC was a high-resolution imaging device primarily intended for optical observations. It was built by NASA's Jet Propulsion Laboratory, and incorporated a set of 48 filters isolating spectral lines of particular astrophysical interest. The instrument contained eight charge-coupled device (CCD) chips divided between two cameras, each using four CCDs. Each CCD has a resolution of 0.64 megapixels.  The wide field camera (WFC) covered a large angular field at the expense of resolution, while the planetary camera (PC) took images at a longer effective focal length than the WF chips, giving it a greater magnification. 
The Goddard High Resolution Spectrograph (GHRS) was a spectrograph designed to operate in the ultraviolet. It was built by the Goddard Space Flight Center and could achieve a spectral resolution of 90,000. Also optimized for ultraviolet observations were the FOC and FOS, which were capable of the highest spatial resolution of any instruments on Hubble. Rather than CCDs, these three instruments (the GHRS, FOC, and FOS) used photon-counting digicons as their detectors. The FOC was constructed by ESA, while the University of California, San Diego, and Martin Marietta Corporation built the FOS.
The final instrument was the HSP, designed and built at the University of Wisconsin–Madison. It was optimized for visible and ultraviolet light observations of variable stars and other astronomical objects varying in brightness. It could take up to 100,000 measurements per second with a photometric accuracy of about 2% or better. 
HST's guidance system can also be used as a scientific instrument. Its three Fine Guidance Sensors (FGS) are primarily used to keep the telescope accurately pointed during an observation, but can also be used to carry out extremely accurate astrometry; measurements accurate to within 0.0003 arcseconds have been achieved.
Ground support
The Space Telescope Science Institute (STScI) is responsible for the scientific operation of the telescope and the delivery of data products to astronomers. STScI is operated by the Association of Universities for Research in Astronomy (AURA) and is physically located in Baltimore, Maryland on the Homewood campus of Johns Hopkins University, one of the 39 U.S. universities and seven international affiliates that make up the AURA consortium. STScI was established in 1981   after something of a power struggle between NASA and the scientific community at large. NASA had wanted to keep this function in-house, but scientists wanted it to be based in an academic establishment.   The Space Telescope European Coordinating Facility (ST-ECF), established at Garching bei München near Munich in 1984, provided similar support for European astronomers until 2011, when these activities were moved to the European Space Astronomy Centre.
One rather complex task that falls to STScI is scheduling observations for the telescope.  Hubble is in a low-Earth orbit to enable servicing missions, but this means most astronomical targets are occulted by the Earth for slightly less than half of each orbit. Observations cannot take place when the telescope passes through the South Atlantic Anomaly due to elevated radiation levels, and there are also sizable exclusion zones around the Sun (precluding observations of Mercury), Moon and Earth. The solar avoidance angle is about 50°, to keep sunlight from illuminating any part of the OTA. Earth and Moon avoidance keeps bright light out of the FGSs, and keeps scattered light from entering the instruments. If the FGSs are turned off, the Moon and Earth can be observed. Earth observations were used very early in the program to generate flat-fields for the WFPC1 instrument. There is a so-called continuous viewing zone (CVZ), at roughly 90° to the plane of Hubble's orbit, in which targets are not occulted for long periods.
Due to the precession of the orbit, the location of the CVZ moves slowly over a period of eight weeks. Because the limb of the Earth is always within about 30° of regions within the CVZ, the brightness of scattered earthshine may be elevated for long periods during CVZ observations. Hubble orbits in low Earth orbit at an altitude of approximately 540 kilometers (340 mi) and an inclination of 28.5°.  The position along its orbit changes over time in a way that is not accurately predictable. The density of the upper atmosphere varies according to many factors, and this means Hubble's predicted position for six weeks' time could be in error by up to 4,000 km (2,500 mi). Observation schedules are typically finalized only a few days in advance, as a longer lead time would mean there was a chance the target would be unobservable by the time it was due to be observed.  Engineering support for HST is provided by NASA and contractor personnel at the Goddard Space Flight Center in Greenbelt, Maryland, 48 km (30 mi) south of the STScI. Hubble's operation is monitored 24 hours per day by four teams of flight controllers who make up Hubble's Flight Operations Team. 
Challenger disaster, delays, and eventual launch
By January 1986, the planned launch date of October looked feasible, but the Challenger explosion brought the U.S. space program to a halt, grounding the Shuttle fleet and forcing the launch of Hubble to be postponed for several years. The telescope had to be kept in a clean room, powered up and purged with nitrogen, until a launch could be rescheduled. This costly situation (about US$6 million per month) pushed the overall costs of the project even higher. This delay did allow time for engineers to perform extensive tests, swap out a possibly failure-prone battery, and make other improvements.  Furthermore, the ground software needed to control Hubble was not ready in 1986, and was barely ready by the 1990 launch. 
Eventually, following the resumption of shuttle flights in 1988, the launch of the telescope was scheduled for 1990. On April 24, 1990, Space Shuttle Discovery successfully launched it during the STS-31 mission. 
From its original total cost estimate of about US$400 million, the telescope cost about US$4.7 billion by the time of its launch. Hubble's cumulative costs were estimated to be about US$10 billion in 2010, twenty years after launch.
Hubble accommodates five science instruments at a given time, plus the Fine Guidance Sensors, which are mainly used for aiming the telescope but are occasionally used for scientific astrometry measurements. Early instruments were replaced with more advanced ones during the Shuttle servicing missions. COSTAR was a corrective optics device rather than a science instrument, but occupied one of the five instrument bays.
Since the final servicing mission in 2009, the four active instruments have been ACS, COS, STIS and WFC3. NICMOS is kept in hibernation, but may be revived if WFC3 were to fail in the future.
- ACS (2002–present)
- COS (2009–present)
- COSTAR (1993–2009)
- FOC (1990–2002)
- FOS (1990–1997)
- FGS (1990–present)
- GHRS/HRS (1990–1997)
- HSP (1990–1993)
- NICMOS (1997–present, hibernating since 2008)
- STIS (1997–present; non-operative 2004–2009)
- WFPC (1990–1993)
- WFPC2 (1993–2009)
- WFC3 (2009–present)
Of the former instruments, three (COSTAR, FOS and WFPC2) are displayed in the Smithsonian National Air and Space Museum. The FOC is in the Dornier museum, Germany. The HSP is in the Space Place at the University of Wisconsin–Madison. The first WFPC was dismantled, and some components were then re-used in WFC3.
Within weeks of the launch of the telescope, the returned images indicated a serious problem with the optical system. Although the first images appeared to be sharper than those of ground-based telescopes, Hubble failed to achieve a final sharp focus and the best image quality obtained was drastically lower than expected. Images of point sources spread out over a radius of more than one arcsecond, instead of having a point spread function (PSF) concentrated within a circle 0.1 arcseconds (485 nrad) in diameter, as had been specified in the design criteria.  
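The 485 nrad figure is simply the 0.1 arcsecond design requirement expressed in radians; a quick conversion check:

```python
import math

ARCSEC_TO_RAD = math.pi / (180 * 3600)   # one arcsecond in radians
spec_nrad = 0.1 * ARCSEC_TO_RAD * 1e9    # 0.1 arcsec spec in nanoradians
print(f"0.1 arcsec = {spec_nrad:.0f} nrad")
```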
The effect of the mirror flaw on scientific observations depended on the particular observation—the core of the aberrated PSF was sharp enough to permit high-resolution observations of bright objects, and spectroscopy of point sources was affected only through a sensitivity loss. However, the loss of light to the large, out-of-focus halo severely reduced the usefulness of the telescope for faint objects or high-contrast imaging. This meant nearly all the cosmological programs were essentially impossible, since they required observation of exceptionally faint objects. This led politicians to question NASA's competence, scientists to rue the cost which could have gone to more productive endeavors, and comedians to make jokes about NASA and the telescope; in the 1991 comedy The Naked Gun 2½: The Smell of Fear, in a scene where historical disasters are displayed, Hubble is pictured with RMS Titanic and LZ 129 Hindenburg. Nonetheless, during the first three years of the Hubble mission, before the optical corrections, the telescope still carried out a large number of productive observations of less demanding targets. The error was well characterized and stable, enabling astronomers to partially compensate for the defective mirror by using sophisticated image processing techniques such as deconvolution.
Origin of the problem
A commission headed by Lew Allen, director of the Jet Propulsion Laboratory, was established to determine how the error could have arisen. The Allen Commission found that a reflective null corrector, a testing device used to achieve a properly shaped non-spherical mirror, had been incorrectly assembled—one lens was out of position by 1.3 mm (0.051 in).  During the initial grinding and polishing of the mirror, Perkin-Elmer analyzed its surface with two conventional refractive null correctors. However, for the final manufacturing step (figuring), they switched to the custom-built reflective null corrector, designed explicitly to meet very strict tolerances. The incorrect assembly of this device resulted in the mirror being ground very precisely but to the wrong shape. A few final tests, using the conventional null correctors, correctly reported spherical aberration. But these results were dismissed, thus missing the opportunity to catch the error, because the reflective null corrector was considered more accurate. 
The commission blamed the failings primarily on Perkin-Elmer. Relations between NASA and the optics company had been severely strained during the telescope construction, due to frequent schedule slippage and cost overruns. NASA found that Perkin-Elmer did not review or supervise the mirror construction adequately, did not assign its best optical scientists to the project (as it had for the prototype), and in particular did not involve the optical designers in the construction and verification of the mirror. While the commission heavily criticized Perkin-Elmer for these managerial failings, NASA was also criticized for not picking up on the quality control shortcomings, such as relying totally on test results from a single instrument. 
Design of a solution
Many feared that Hubble would be abandoned.  The design of the telescope had always incorporated servicing missions, and astronomers immediately began to seek potential solutions to the problem that could be applied at the first servicing mission, scheduled for 1993. While Kodak had ground a back-up mirror for Hubble, it would have been impossible to replace the mirror in orbit, and too expensive and time-consuming to bring the telescope back to Earth for a refit. Instead, the fact that the mirror had been ground so precisely to the wrong shape led to the design of new optical components with exactly the same error but in the opposite sense, to be added to the telescope at the servicing mission, effectively acting as "spectacles" to correct the spherical aberration.  
The first step was a precise characterization of the error in the main mirror. Working backwards from images of point sources, astronomers determined that the conic constant of the mirror as built was −1.01390 ± 0.0002, instead of the intended −1.00230. The same number was also derived by analyzing the null corrector used by Perkin-Elmer to figure the mirror, as well as by analyzing interferograms obtained during ground testing of the mirror.
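The physical size of the error implied by those two conic constants can be sketched from the standard conic-section sag formula. This is an illustrative calculation, not from the article: only the conic constants come from the text, while the radius of curvature R ≈ 11.04 m (twice an assumed ~5.52 m primary focal length) is a stand-in value.

```python
import math

# Conic-section sag: z(r) = c*r^2 / (1 + sqrt(1 - (1+K)*c^2*r^2)), c = 1/R.
R = 11.04              # assumed radius of curvature of the primary, meters
c = 1.0 / R
K_BUILT = -1.01390     # conic constant as manufactured
K_INTENDED = -1.00230  # conic constant as designed

def sag(r_m, K):
    """Depth of the mirror surface at radial distance r_m from the axis."""
    return c * r_m**2 / (1 + math.sqrt(1 - (1 + K) * c**2 * r_m**2))

r_edge = 1.2   # edge of the 2.4 m mirror, meters
err_um = (sag(r_edge, K_INTENDED) - sag(r_edge, K_BUILT)) * 1e6
print(f"figure error at the mirror edge: ~{err_um:.1f} micrometers")
```

With these assumptions the surface comes out roughly 2 μm too low at the edge: minuscule in absolute terms, yet around 200 times the 10 nm polishing tolerance quoted earlier, which is why the aberration was so damaging despite the mirror being the most precisely figured ever made.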
Because of the way the HST's instruments were designed, two different sets of correctors were required. The design of the Wide Field and Planetary Camera 2, already planned to replace the existing WF/PC, included relay mirrors to direct light onto the four separate charge-coupled device (CCD) chips making up its two cameras. An inverse error built into their surfaces could completely cancel the aberration of the primary. However, the other instruments lacked any intermediate surfaces that could be figured in this way, and so required an external correction device. 
The Corrective Optics Space Telescope Axial Replacement (COSTAR) system was designed to correct the spherical aberration for light focused at the FOC, FOS, and GHRS. It consists of two mirrors in the light path with one ground to correct the aberration.  To fit the COSTAR system onto the telescope, one of the other instruments had to be removed, and astronomers selected the High Speed Photometer to be sacrificed.  By 2002, all the original instruments requiring COSTAR had been replaced by instruments with their own corrective optics.  COSTAR was removed and returned to Earth in 2009 where it is exhibited at the National Air and Space Museum. The area previously used by COSTAR is now occupied by the Cosmic Origins Spectrograph. 
Hubble was designed to accommodate regular servicing and equipment upgrades while in orbit. Instruments and limited life items were designed as orbital replacement units.  Five servicing missions (SM 1, 2, 3A, 3B, and 4) were flown by NASA space shuttles, the first in December 1993 and the last in May 2009.  Servicing missions were delicate operations that began with maneuvering to intercept the telescope in orbit and carefully retrieving it with the shuttle's mechanical arm. The necessary work was then carried out in multiple tethered spacewalks over a period of four to five days. After a visual inspection of the telescope, astronauts conducted repairs, replaced failed or degraded components, upgraded equipment, and installed new instruments. Once work was completed, the telescope was redeployed, typically after boosting to a higher orbit to address the orbital decay caused by atmospheric drag. 
Servicing Mission 1
The first Hubble servicing mission was scheduled for 1993, before the mirror problem was discovered. It assumed greater importance, as the astronauts would need to do extensive work to install corrective optics; failure would have resulted in either abandoning Hubble or accepting its permanent disability. Other components failed before the mission, causing the repair cost to rise to $500 million (not including the cost of the shuttle flight). A successful repair would help demonstrate the viability of building Space Station Alpha.
STS-49 in 1992 demonstrated the difficulty of space work. While its rescue of Intelsat 603 received praise, the astronauts had taken possibly reckless risks in doing so. Neither the rescue nor the unrelated assembly of prototype space station components occurred as the astronauts had trained, causing NASA to reassess planning and training, including for the Hubble repair. The agency assigned Story Musgrave—who had worked on satellite repair procedures since 1976—and six other experienced astronauts, including two from STS-49, to the mission. The first mission director since Project Apollo would coordinate a crew with 16 previous shuttle flights. The astronauts were trained to use about a hundred specialized tools.
Heat had been a problem on prior spacewalks, which occurred in sunlight. Hubble needed to be repaired out of sunlight. Musgrave discovered during vacuum training, seven months before the mission, that spacesuit gloves did not sufficiently protect against the cold of space. After STS-57 confirmed the issue in orbit, NASA quickly changed equipment, procedures, and flight plan. Seven total mission simulations occurred before launch, the most thorough preparation in shuttle history. No complete Hubble mockup existed, so the astronauts studied many separate models (including one at the Smithsonian) and mentally combined their varying and contradictory details. Servicing Mission 1 flew aboard Endeavour in December 1993, and involved installation of several instruments and other equipment over ten days.
Most importantly, the High Speed Photometer was replaced with the COSTAR corrective optics package, and WF/PC was replaced with the Wide Field and Planetary Camera 2 (WFPC2) with an internal optical correction system. The solar arrays and their drive electronics were also replaced, as well as four gyroscopes in the telescope pointing system, two electrical control units and other electrical components, and two magnetometers. The onboard computers were upgraded with added coprocessors, and Hubble's orbit was boosted. 
On January 13, 1994, NASA declared the mission a complete success and showed the first sharper images.  The mission was one of the most complex performed up until that date, involving five long extra-vehicular activity periods. Its success was a boon for NASA, as well as for the astronomers who now had a more capable space telescope.
Servicing Mission 2
Servicing Mission 2, flown by Discovery in February 1997, replaced the GHRS and the FOS with the Space Telescope Imaging Spectrograph (STIS) and the Near Infrared Camera and Multi-Object Spectrometer (NICMOS), replaced an Engineering and Science Tape Recorder with a new Solid State Recorder, and repaired thermal insulation.  NICMOS contained a heat sink of solid nitrogen to reduce the thermal noise from the instrument, but shortly after it was installed, an unexpected thermal expansion resulted in part of the heat sink coming into contact with an optical baffle. This led to an increased warming rate for the instrument and reduced its original expected lifetime of 4.5 years to about two years. 
Servicing Mission 3A
Servicing Mission 3A, flown by Discovery, took place in December 1999, and was a split-off from Servicing Mission 3 after three of the six onboard gyroscopes had failed. The fourth failed a few weeks before the mission, rendering the telescope incapable of performing scientific observations. The mission replaced all six gyroscopes, replaced a Fine Guidance Sensor and the computer, installed a Voltage/temperature Improvement Kit (VIK) to prevent battery overcharging, and replaced thermal insulation blankets. 
Servicing Mission 3B
Servicing Mission 3B flown by Columbia in March 2002 saw the installation of a new instrument, with the FOC (which, except for the Fine Guidance Sensors when used for astrometry, was the last of the original instruments) being replaced by the Advanced Camera for Surveys (ACS). This meant COSTAR was no longer required, since all new instruments had built-in correction for the main mirror aberration.  The mission also revived NICMOS by installing a closed-cycle cooler  and replaced the solar arrays for the second time, providing 30 percent more power. 
Servicing Mission 4
Plans called for Hubble to be serviced in February 2005, but the Columbia disaster in 2003, in which the orbiter disintegrated on re-entry into the atmosphere, had wide-ranging effects on the Hubble program and other NASA missions. NASA Administrator Sean O'Keefe decided all future shuttle missions had to be able to reach the safe haven of the International Space Station should in-flight problems develop. As no shuttles were capable of reaching both HST and the space station during the same mission, future crewed service missions were canceled. This decision was criticized by numerous astronomers who felt Hubble was valuable enough to merit the human risk. HST's planned successor, the James Webb Space Telescope (JWST), as of 2004 was not expected to launch until at least 2011. A gap in space-observing capabilities between a decommissioning of Hubble and the commissioning of a successor was of major concern to many astronomers, given the significant scientific impact of HST. The consideration that JWST will not be located in low Earth orbit, and therefore cannot be easily upgraded or repaired in the event of an early failure, only made concerns more acute. On the other hand, many astronomers felt strongly that servicing Hubble should not take place if the expense were to come from the JWST budget.
In January 2004, O'Keefe said he would review his decision to cancel the final servicing mission to HST, due to public outcry and requests from Congress for NASA to look for a way to save it. The National Academy of Sciences convened an official panel, which recommended in July 2004 that the HST should be preserved despite the apparent risks. Their report urged "NASA should take no actions that would preclude a space shuttle servicing mission to the Hubble Space Telescope".  In August 2004, O'Keefe asked Goddard Space Flight Center to prepare a detailed proposal for a robotic service mission. These plans were later canceled, the robotic mission being described as "not feasible".  In late 2004, several Congressional members, led by Senator Barbara Mikulski, held public hearings and carried on a fight with much public support (including thousands of letters from school children across the U.S.) to get the Bush Administration and NASA to reconsider the decision to drop plans for a Hubble rescue mission. 
The nomination in April 2005 of a new NASA Administrator, Michael D. Griffin, changed the situation, as Griffin stated he would consider a crewed servicing mission.  Soon after his appointment Griffin authorized Goddard to proceed with preparations for a crewed Hubble maintenance flight, saying he would make the final decision after the next two shuttle missions. In October 2006 Griffin gave the final go-ahead, and the 11-day mission by Atlantis was scheduled for October 2008. Hubble's main data-handling unit failed in September 2008,  halting all reporting of scientific data until its back-up was brought online on October 25, 2008.  Since a failure of the backup unit would leave the HST helpless, the service mission was postponed to incorporate a replacement for the primary unit. 
Servicing Mission 4 (SM4), flown by Atlantis in May 2009, was the last scheduled shuttle mission for HST.   SM4 installed the replacement data-handling unit, repaired the ACS and STIS systems, installed improved nickel hydrogen batteries, and replaced other components including all six gyroscopes. SM4 also installed two new observation instruments—Wide Field Camera 3 (WFC3) and the Cosmic Origins Spectrograph (COS)  —and the Soft Capture and Rendezvous System, which will enable the future rendezvous, capture, and safe disposal of Hubble by either a crewed or robotic mission.  Except for the ACS's High Resolution Channel, which could not be repaired and was disabled,    the work accomplished during SM4 rendered the telescope fully functional. 
Since the start of the program, a number of research projects have been carried out, some of them almost solely with Hubble, others in coordination with other facilities such as the Chandra X-ray Observatory and ESO's Very Large Telescope. Although the Hubble observatory is nearing the end of its life, there are still major projects scheduled for it. One example is the upcoming Frontier Fields program, inspired by the results of Hubble's deep observation of the galaxy cluster Abell 1689.
Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey
In an August 2013 press release, CANDELS was referred to as "the largest project in the history of Hubble". The survey "aims to explore galactic evolution in the early Universe, and the very first seeds of cosmic structure at less than one billion years after the Big Bang."  The CANDELS project site describes the survey's goals as the following: 
The Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey is designed to document the first third of galactic evolution from z = 8 to 1.5 via deep imaging of more than 250,000 galaxies with WFC3/IR and ACS. It will also find the first Type Ia SNe beyond z > 1.5 and establish their accuracy as standard candles for cosmology. Five premier multi-wavelength sky regions are selected; each has multi-wavelength data from Spitzer and other facilities, and has extensive spectroscopy of the brighter galaxies. The use of five widely separated fields mitigates cosmic variance and yields statistically robust and complete samples of galaxies down to 10⁹ solar masses out to z
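The quoted span "from z = 8 to 1.5" can be tied to "the first third of galactic evolution" with a flat-ΛCDM age integral. The sketch below is illustrative, not from the survey description: the cosmological parameters H0 = 70 km/s/Mpc and Ωm = 0.3 are assumed values.

```python
import math

H0 = 70 * 1000 / 3.0857e22   # Hubble constant, km/s/Mpc converted to 1/s
OM, OL = 0.3, 0.7            # assumed matter and dark-energy densities
GYR = 3.156e16               # seconds per gigayear

def age_at_z(z, steps=20000):
    """Cosmic age at redshift z: Simpson integration of da / (a * H(a))."""
    a_max = 1.0 / (1.0 + z)
    h = a_max / steps
    total = 0.0
    for i in range(steps + 1):
        a = i * h
        # Integrand 1 / sqrt(OM/a + OL*a^2) vanishes at a = 0.
        f = 0.0 if a == 0 else 1.0 / math.sqrt(OM / a + OL * a * a)
        w = 1 if i in (0, steps) else (4 if i % 2 else 2)
        total += w * f
    return (h / 3) * total / H0 / GYR   # age in Gyr

age_now = age_at_z(0.0)   # ~13.5 Gyr for these parameters
age_z15 = age_at_z(1.5)   # ~4.3 Gyr
age_z8 = age_at_z(8.0)    # ~0.6 Gyr
print(f"now: {age_now:.1f} Gyr, z=1.5: {age_z15:.1f} Gyr, z=8: {age_z8:.2f} Gyr")
```

The universe at z = 1.5 is about 4.3 Gyr old, roughly a third of today's ~13.5 Gyr, so the z = 8 to 1.5 window does indeed cover approximately the first third of cosmic history.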
Frontier Fields program
The program, officially named "Hubble Deep Fields Initiative 2012", aims to advance knowledge of early galaxy formation by studying high-redshift galaxies in blank fields with the help of gravitational lensing to see the "faintest galaxies in the distant universe". The Frontier Fields web page describes the goals of the program as being:
- to reveal hitherto inaccessible populations of z = 5–10 galaxies that are ten to fifty times fainter intrinsically than any presently known
- to solidify our understanding of the stellar masses and star formation histories of sub-L* galaxies at the earliest times
- to provide the first statistically meaningful morphological characterization of star forming galaxies at z > 5
- to find z > 8 galaxies stretched out enough by cluster lensing to discern internal structure and/or magnified enough by cluster lensing for spectroscopic follow-up. 
Cosmic Evolution Survey (COSMOS)
The Cosmic Evolution Survey (COSMOS) is an astronomical survey designed to probe the formation and evolution of galaxies as a function of both cosmic time (redshift) and the local galaxy environment. The survey covers a two square degree equatorial field with spectroscopy and X-ray to radio imaging by most of the major space-based telescopes and a number of large ground-based telescopes, making it a key focus region of extragalactic astrophysics. COSMOS was launched in 2006 as the largest project pursued by the Hubble Space Telescope at the time, and still is the largest continuous area of sky covered for the purposes of mapping deep space in blank fields, 2.5 times the area of the Moon on the sky and 17 times larger than the largest of the CANDELS regions. The COSMOS scientific collaboration that was forged from the initial COSMOS survey is the largest and longest-running extragalactic collaboration, known for its collegiality and openness. The study of galaxies in their environment can be done only with large areas of the sky, larger than a half square degree. More than two million galaxies are detected, spanning 90% of the age of the Universe. The COSMOS collaboration is led by Caitlin Casey, Jeyhan Kartaltepe, and Vernesa Smolcic and involves more than 200 scientists in a dozen countries.
Anyone can apply for time on the telescope; there are no restrictions on nationality or academic affiliation, but funding for analysis is available only to U.S. institutions. Competition for time on the telescope is intense, with about one-fifth of the proposals submitted in each cycle earning time on the schedule.
Calls for proposals are issued roughly annually, with time allocated for a cycle lasting about one year. Proposals are divided into several categories: "general observer" proposals are the most common, covering routine observations. "Snapshot observations" are those in which targets require only 45 minutes or less of telescope time, including overheads such as acquiring the target. Snapshot observations are used to fill gaps in the telescope schedule that cannot be filled by regular general observer programs.
Astronomers may make "Target of Opportunity" proposals, in which observations are scheduled if a transient event covered by the proposal occurs during the scheduling cycle. In addition, up to 10% of the telescope time is designated "director's discretionary" (DD) time. Astronomers can apply to use DD time at any time of year, and it is typically awarded for study of unexpected transient phenomena such as supernovae. 
Other uses of DD time have included the observations that led to views of the Hubble Deep Field and Hubble Ultra Deep Field, and in the first four cycles of telescope time, observations that were carried out by amateur astronomers.
Public image processing of Hubble data is encouraged as most of the data in the archives has not been processed into color imagery. 
Use by amateur astronomers
The first director of STScI, Riccardo Giacconi, announced in 1986 that he intended to devote some of his director's discretionary time to allowing amateur astronomers to use the telescope. The total time to be allocated was only a few hours per cycle, but the offer excited great interest among amateur astronomers.
Proposals for amateur time were stringently reviewed by a committee of amateur astronomers, and time was awarded only to proposals that were deemed to have genuine scientific merit, did not duplicate proposals made by professionals, and required the unique capabilities of the space telescope. Thirteen amateur astronomers were awarded time on the telescope, with observations being carried out between 1990 and 1997.  One such study was "Transition Comets—UV Search for OH". The first proposal, "A Hubble Space Telescope Study of Posteclipse Brightening and Albedo Changes on Io", was published in Icarus,  a journal devoted to solar system studies. A second study from another group of amateurs was also published in Icarus.  After that time, however, budget reductions at STScI made the support of work by amateur astronomers untenable, and no additional amateur programs have been carried out.  
Regular Hubble proposals still include findings or objects discovered by amateurs and citizen scientists, often in collaboration with professional astronomers. One of the earliest such observations was the Great White Spot of 1990 on Saturn, discovered by amateur astronomer S. Wilber and observed by HST under a proposal by J. Westphal (Caltech). Later professional-amateur observations by Hubble include discoveries by the Galaxy Zoo project, such as Voorwerpjes and Green Pea galaxies. The "Gems of the Galaxies" program is based on a list of objects compiled by Galaxy Zoo volunteers and shortened with the help of an online vote. There are also observations of objects discovered by amateur astronomers, such as the interstellar comet 2I/Borisov, and of changes in the atmospheres of the gas giants Jupiter and Saturn and the ice giants Uranus and Neptune. In the pro-am collaboration Backyard Worlds, HST was used to observe a planetary-mass object called WISE J0830+2837; the non-detection by HST helped to classify this peculiar object.
Key projects
In the early 1980s, NASA and STScI convened four panels to discuss key projects. These were projects that were both scientifically important and would require significant telescope time, which would be explicitly dedicated to each project. This guaranteed that these particular projects would be completed early, in case the telescope failed sooner than expected. The panels identified three such projects: 1) a study of the nearby intergalactic medium using quasar absorption lines to determine the properties of the intergalactic medium and the gaseous content of galaxies and groups of galaxies; 2) a medium-deep survey using the Wide Field Camera to take data whenever one of the other instruments was being used; and 3) a project to determine the Hubble constant to within ten percent by reducing the errors, both external and internal, in the calibration of the distance scale.
Important discoveries
Hubble has helped resolve some long-standing problems in astronomy, while also raising new questions. Some results have required new theories to explain them.
Age of the universe
One of its primary mission goals was to measure distances to Cepheid variable stars more accurately than ever before, and thus constrain the value of the Hubble constant, the measure of the rate at which the universe is expanding, which is also related to its age. Before the launch of HST, estimates of the Hubble constant typically had errors of up to 50%, but Hubble measurements of Cepheid variables in the Virgo Cluster and other distant galaxy clusters provided a measured value with an accuracy of ±10%, which is consistent with other, more accurate measurements made since Hubble's launch using other techniques. The estimated age is now about 13.7 billion years; before the Hubble Telescope, scientists' estimates ranged from 10 to 20 billion years.
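The link between the Hubble constant and the age of the universe can be illustrated with a back-of-envelope calculation: the "Hubble time" 1/H0 sets the characteristic age scale (the true age also depends on the cosmological model). A minimal sketch, using only the unit conversion:

```python
# Back-of-envelope: convert a Hubble constant in km/s/Mpc into a
# characteristic age 1/H0 in gigayears. This is an order-of-magnitude
# illustration, not a full cosmological age calculation.
KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7  # seconds in one year

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Return the Hubble time 1/H0 in gigayears."""
    seconds = KM_PER_MPC / h0_km_s_mpc       # 1/H0 in seconds
    return seconds / SECONDS_PER_YEAR / 1e9  # convert to Gyr

# H0 near 70 km/s/Mpc gives roughly 14 Gyr, consistent with the
# modern age estimate; the pre-HST 50% uncertainty in H0 is what
# produced age estimates spanning roughly 10 to 20 billion years.
print(round(hubble_time_gyr(70), 1))  # ~14.0
print(round(hubble_time_gyr(50), 1))  # ~19.6
```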
Expansion of the universe
While Hubble helped to refine estimates of the age of the universe, it also cast doubt on theories about its future. Astronomers from the High-z Supernova Search Team and the Supernova Cosmology Project used ground-based telescopes and HST to observe distant supernovae and uncovered evidence that, far from decelerating under the influence of gravity, the expansion of the universe may in fact be accelerating. Three members of these two groups have subsequently been awarded Nobel Prizes for their discovery. The cause of this acceleration remains poorly understood; the most commonly proposed explanation is dark energy.
Black holes
The high-resolution spectra and images provided by the HST have been especially well-suited to establishing the prevalence of black holes in the center of nearby galaxies. While it had been hypothesized in the early 1960s that black holes would be found at the centers of some galaxies, and astronomers in the 1980s identified a number of good black hole candidates, work conducted with Hubble shows that black holes are probably common to the centers of all galaxies.    The Hubble programs further established that the masses of the nuclear black holes and properties of the galaxies are closely related. The legacy of the Hubble programs on black holes in galaxies is thus to demonstrate a deep connection between galaxies and their central black holes.
Extending visible wavelength images
A unique window on the Universe enabled by Hubble is the set of deep-field images: the Hubble Deep Field, Hubble Ultra-Deep Field, and Hubble Extreme Deep Field, which used Hubble's unmatched sensitivity at visible wavelengths to create images of small patches of sky that are the deepest ever obtained at optical wavelengths. The images reveal galaxies billions of light years away, and have generated a wealth of scientific papers, providing a new window on the early Universe. The Wide Field Camera 3 improved the view of these fields in the infrared and ultraviolet, supporting the discovery of some of the most distant objects yet discovered, such as MACS0647-JD.
The non-standard object SCP 06F6 was discovered by the Hubble Space Telescope in February 2006.  
On March 3, 2016, researchers using Hubble data announced the discovery of the farthest known galaxy to date: GN-z11. The Hubble observations occurred on February 11, 2015, and April 3, 2015, as part of the CANDELS/GOODS-North surveys.  
Solar System discoveries
HST has also been used to study objects in the outer reaches of the Solar System, including the dwarf planets Pluto  and Eris. 
The collision of Comet Shoemaker-Levy 9 with Jupiter in 1994 was fortuitously timed for astronomers, coming just a few months after Servicing Mission 1 had restored Hubble's optical performance. Hubble images of the planet were sharper than any taken since the passage of Voyager 2 in 1979, and were crucial in studying the dynamics of the collision of a comet with Jupiter, an event believed to occur once every few centuries.
During June and July 2012, U.S. astronomers using Hubble discovered Styx, a tiny fifth moon orbiting Pluto. 
In March 2015, researchers announced that measurements of aurorae around Ganymede, one of Jupiter's moons, revealed that it has a subsurface ocean. Using Hubble to study the motion of its aurorae, the researchers determined that a large saltwater ocean was helping to suppress the interaction between Jupiter's magnetic field and that of Ganymede. The ocean is estimated to be 100 km (60 mi) deep, trapped beneath a 150 km (90 mi) ice crust.  
From June to August 2015, Hubble was used to search for a Kuiper belt object (KBO) target for the New Horizons Kuiper Belt Extended Mission (KEM) when similar searches with ground telescopes failed to find a suitable target.  This resulted in the discovery of at least five new KBOs, including the eventual KEM target, 486958 Arrokoth, that New Horizons performed a close fly-by of on January 1, 2019.   
In August 2020, taking advantage of a total lunar eclipse, astronomers using NASA's Hubble Space Telescope detected Earth's own brand of sunscreen, ozone, in our atmosphere. This method simulates how astronomers and astrobiology researchers will search for evidence of life beyond Earth by observing potential "biosignatures" on exoplanets (planets around other stars).
Supernova reappearance
On December 11, 2015, Hubble captured an image of the first-ever predicted reappearance of a supernova, dubbed "Refsdal", which was calculated using different mass models of a galaxy cluster whose gravity is warping the supernova's light. The supernova was previously seen in November 2014 behind galaxy cluster MACS J1149.5+2223 as part of Hubble's Frontier Fields program. Astronomers spotted four separate images of the supernova in an arrangement known as an Einstein Cross. The light from the cluster has taken about five billion years to reach Earth, though the supernova exploded some 10 billion years ago. Based on early lens models, a fifth image was predicted to reappear by the end of 2015.  The detection of Refsdal's reappearance in December 2015 served as a unique opportunity for astronomers to test their models of how mass, especially dark matter, is distributed within this galaxy cluster. 
Mass and size of Milky Way
In March 2019, observations from Hubble and data from the European Space Agency's Gaia space observatory were combined to determine that the Milky Way Galaxy weighs approximately 1.5 trillion solar masses and has a radius of about 129,000 light-years.
Other discoveries
Other discoveries made with Hubble data include proto-planetary disks (proplyds) in the Orion Nebula, evidence for the presence of extrasolar planets around Sun-like stars, and the optical counterparts of the still-mysterious gamma-ray bursts.
Impact on astronomy
Many objective measures show the positive impact of Hubble data on astronomy. Over 15,000 papers based on Hubble data have been published in peer-reviewed journals,  and countless more have appeared in conference proceedings. Looking at papers several years after their publication, about one-third of all astronomy papers have no citations, while only two percent of papers based on Hubble data have no citations. On average, a paper based on Hubble data receives about twice as many citations as papers based on non-Hubble data. Of the 200 papers published each year that receive the most citations, about 10% are based on Hubble data. 
Although the HST has clearly helped astronomical research, its financial cost has been large. A study on the relative astronomical benefits of different sizes of telescopes found that while papers based on HST data generate 15 times as many citations as a 4 m (13 ft) ground-based telescope such as the William Herschel Telescope, the HST costs about 100 times as much to build and maintain. 
Deciding between building ground- versus space-based telescopes is complex. Even before Hubble was launched, specialized ground-based techniques such as aperture masking interferometry had obtained higher-resolution optical and infrared images than Hubble would achieve, though restricted to targets about 10⁸ times brighter than the faintest targets observed by Hubble. Since then, advances in adaptive optics have extended the high-resolution imaging capabilities of ground-based telescopes to the infrared imaging of faint objects. The usefulness of adaptive optics versus HST observations depends strongly on the particular details of the research questions being asked. In the visible bands, adaptive optics can correct only a relatively small field of view, whereas HST can conduct high-resolution optical imaging over a wide field. Only a small fraction of astronomical objects are accessible to high-resolution ground-based imaging; in contrast, Hubble can perform high-resolution observations of any part of the night sky, and on objects that are extremely faint.
Impact on aerospace engineering
In addition to its scientific results, Hubble has also made significant contributions to aerospace engineering, in particular the performance of systems in low Earth orbit (LEO). These insights result from Hubble's long lifetime on orbit, extensive instrumentation, and return of assemblies to the Earth where they can be studied in detail. In particular, Hubble has contributed to studies of the behavior of graphite composite structures in vacuum, optical contamination from residual gas and human servicing, radiation damage to electronics and sensors, and the long-term behavior of multi-layer insulation. One lesson learned was that gyroscopes assembled using pressurized oxygen to deliver suspension fluid were prone to failure due to electric wire corrosion. Gyroscopes are now assembled using pressurized nitrogen. Another is that optical surfaces in LEO can have surprisingly long lifetimes; Hubble was only expected to last 15 years before the mirror became unusable, but after 14 years there was no measurable degradation. Finally, Hubble servicing missions, particularly those that serviced components not designed for in-space maintenance, have contributed towards the development of new tools and techniques for on-orbit repair.
Transmission to Earth
Hubble data was initially stored on the spacecraft. When launched, the storage facilities were old-fashioned reel-to-reel tape recorders, but these were replaced by solid state data storage facilities during servicing missions 2 and 3A. About twice daily, the Hubble Space Telescope radios data to a satellite in the geosynchronous Tracking and Data Relay Satellite System (TDRSS), which then downlinks the science data to one of two 60-foot (18-meter) diameter high-gain microwave antennas located at the White Sands Test Facility in White Sands, New Mexico.  From there they are sent to the Space Telescope Operations Control Center at Goddard Space Flight Center, and finally to the Space Telescope Science Institute for archiving.  Each week, HST downlinks approximately 140 gigabits of data. 
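To put the quoted volume in perspective, a quick bit of illustrative arithmetic (mine, not the source's) converts 140 gigabits per week into an average downlink rate; the actual TDRSS contacts are short bursts at much higher instantaneous rates:

```python
# Illustrative arithmetic: 140 gigabits of science data per week,
# averaged over the whole week, is only a few hundred kbit/s.
GIGABIT = 1e9
SECONDS_PER_WEEK = 7 * 24 * 3600  # 604,800 s

avg_rate_kbit_s = 140 * GIGABIT / SECONDS_PER_WEEK / 1e3
print(f"{avg_rate_kbit_s:.0f} kbit/s")  # ~231 kbit/s averaged

# The same weekly volume expressed in gigabytes:
weekly_gb = 140 * GIGABIT / 8 / 1e9
print(f"{weekly_gb:.1f} GB")  # 17.5 GB per week
```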
Color images
All images from Hubble are monochromatic grayscale, taken through a variety of filters, each passing specific wavelengths of light, and incorporated in each camera. Color images are created by combining separate monochrome images taken through different filters. This process can also create false-color versions of images including infrared and ultraviolet channels, where infrared is typically rendered as a deep red and ultraviolet is rendered as a deep blue.   
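The combination step can be sketched in a few lines of NumPy. This is a minimal illustration of the principle, not Hubble's actual processing, which also aligns frames and applies carefully chosen intensity stretches; the function and variable names are mine:

```python
import numpy as np

# Sketch: build an RGB composite from three monochrome filter frames.
# Real pipelines also register the frames and apply nonlinear stretches.
def color_composite(red_frame, green_frame, blue_frame):
    """Stack three grayscale frames into an RGB array with values in [0, 1]."""
    channels = []
    for frame in (red_frame, green_frame, blue_frame):
        frame = np.asarray(frame, dtype=float)
        lo, hi = frame.min(), frame.max()
        # Normalize each channel independently to [0, 1].
        if hi > lo:
            channels.append((frame - lo) / (hi - lo))
        else:
            channels.append(np.zeros_like(frame))
    return np.dstack(channels)  # shape (ny, nx, 3)

# False-color example: map an infrared frame to red and an
# ultraviolet frame to blue, as described above.
rng = np.random.default_rng(0)
infrared, visible, ultraviolet = (rng.random((64, 64)) for _ in range(3))
rgb = color_composite(infrared, visible, ultraviolet)
print(rgb.shape)  # (64, 64, 3)
```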
All Hubble data is eventually made available via the Mikulski Archive for Space Telescopes at STScI,  CADC  and ESA/ESAC.  Data is usually proprietary—available only to the principal investigator (PI) and astronomers designated by the PI—for twelve months after being taken. The PI can apply to the director of the STScI to extend or reduce the proprietary period in some circumstances. 
Observations made on Director's Discretionary Time are exempt from the proprietary period, and are released to the public immediately. Calibration data such as flat fields and dark frames are also publicly available straight away. All data in the archive is in the FITS format, which is suitable for astronomical analysis but not for public use.  The Hubble Heritage Project processes and releases to the public a small selection of the most striking images in JPEG and TIFF formats. 
Pipeline reduction
Astronomical data taken with CCDs must undergo several calibration steps before they are suitable for astronomical analysis. STScI has developed sophisticated software that automatically calibrates data when they are requested from the archive using the best calibration files available. This 'on-the-fly' processing means large data requests can take a day or more to be processed and returned. The process by which data is calibrated automatically is known as 'pipeline reduction', and is increasingly common at major observatories. Astronomers may if they wish retrieve the calibration files themselves and run the pipeline reduction software locally. This may be desirable when calibration files other than those selected automatically need to be used. 
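The core of such CCD calibration can be sketched as follows. This is a generic, simplified illustration of the standard bias/dark/flat corrections, not STScI's actual pipeline code, and the names are illustrative:

```python
import numpy as np

# Schematic CCD calibration: remove the electronic offset (bias),
# subtract the thermal signal accumulated over the exposure (dark),
# and divide out pixel-to-pixel sensitivity variations (flat field).
# Real pipelines add cosmic-ray rejection, flux calibration, etc.
def calibrate(raw, bias, dark_rate, flat, exptime):
    """Return a calibrated science frame from a raw CCD frame."""
    science = raw.astype(float) - bias   # remove electronic offset
    science -= dark_rate * exptime       # remove dark current
    return science / flat                # correct pixel sensitivity

# Toy frames: uniform 4x4 images with simple calibration values.
raw = np.full((4, 4), 1100.0)
bias = np.full((4, 4), 100.0)
dark_rate = np.full((4, 4), 0.5)  # counts per second per pixel
flat = np.full((4, 4), 1.0)

cal = calibrate(raw, bias, dark_rate, flat, exptime=200.0)
print(cal[0, 0])  # 1100 - 100 - 0.5*200 = 900.0
```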
Data analysis
Hubble data can be analyzed using many different packages. STScI maintains the custom-made Space Telescope Science Data Analysis System (STSDAS) software, which contains all the programs needed to run pipeline reduction on raw data files, as well as many other astronomical image processing tools, tailored to the requirements of Hubble data. The software runs as a module of IRAF, a popular astronomical data reduction program. 
It has always been important for the Space Telescope to capture the public's imagination, given the considerable contribution of taxpayers to its construction and operational costs.  After the difficult early years when the faulty mirror severely dented Hubble's reputation with the public, the first servicing mission allowed its rehabilitation as the corrected optics produced numerous remarkable images.
Several initiatives have helped to keep the public informed about Hubble activities. In the United States, outreach efforts are coordinated by the Space Telescope Science Institute (STScI) Office for Public Outreach, which was established in 2000 to ensure that U.S. taxpayers saw the benefits of their investment in the space telescope program. To that end, STScI operates the HubbleSite.org website. The Hubble Heritage Project, operating out of the STScI, provides the public with high-quality images of the most interesting and striking objects observed. The Heritage team is composed of amateur and professional astronomers, as well as people with backgrounds outside astronomy, and emphasizes the aesthetic nature of Hubble images. The Heritage Project is granted a small amount of time to observe objects which, for scientific reasons, may not have images taken at enough wavelengths to construct a full-color image. 
Since 1999, the leading Hubble outreach group in Europe has been the Hubble European Space Agency Information Centre (HEIC).  This office was established at the Space Telescope European Coordinating Facility in Munich, Germany. HEIC's mission is to fulfill HST outreach and education tasks for the European Space Agency. The work is centered on the production of news and photo releases that highlight interesting Hubble results and images. These are often European in origin, and so increase awareness of both ESA's Hubble share (15%) and the contribution of European scientists to the observatory. ESA produces educational material, including a videocast series called Hubblecast designed to share world-class scientific news with the public. 
The Hubble Space Telescope has won two Space Achievement Awards from the Space Foundation, for its outreach activities, in 2001 and 2010. 
A replica of the Hubble Space Telescope is on the courthouse lawn in Marshfield, Missouri, the hometown of namesake Edwin P. Hubble.
Celebration images
The Hubble Space Telescope celebrated its 20th anniversary in space on April 24, 2010. To commemorate the occasion, NASA, ESA, and the Space Telescope Science Institute (STScI) released an image from the Carina Nebula. 
To commemorate Hubble's 25th anniversary in space on April 24, 2015, STScI released images of the Westerlund 2 cluster, located about 20,000 light-years (6,100 pc) away in the constellation Carina, through its Hubble 25 website.  The European Space Agency created a dedicated 25th anniversary page on its website.  In April 2016, a special celebratory image of the Bubble Nebula was released for Hubble's 26th "birthday". 
Gyroscope rotation sensors
HST uses gyroscopes to detect and measure any rotations so it can stabilize itself in orbit and point accurately and steadily at astronomical targets. Three gyroscopes are normally required for operation; observations are still possible with two or one, but the area of sky that can be viewed would be somewhat restricted, and observations requiring very accurate pointing are more difficult. In 2018, the plan was to drop into one-gyroscope mode if fewer than three gyroscopes remained operational. The gyroscopes are part of the Pointing Control System, which uses five types of sensors (including magnetic sensors, optical sensors, and the gyroscopes) and two types of actuators (reaction wheels and magnetic torquers). Hubble carries six gyroscopes in total.
After the Columbia disaster in 2003, it was unclear whether another servicing mission would be possible, and gyroscope life became a concern again, so engineers developed new software for two-gyroscope and one-gyroscope modes to maximize the potential lifetime. The development was successful, and in 2005, it was decided to switch to two-gyroscope mode for regular telescope operations as a means of extending the lifetime of the mission. The switch to this mode was made in August 2005, leaving Hubble with two gyroscopes in use, two on backup, and two inoperable.  One more gyroscope failed in 2007. 
By the time of the final repair mission in May 2009, during which all six gyroscopes were replaced (with two new pairs and one refurbished pair), only three were still working. Engineers determined that the gyroscope failures were caused by corrosion of electric wires powering the motor that was initiated by oxygen-pressurized air used to deliver the thick suspending fluid.  The new gyroscope models were assembled using pressurized nitrogen  and were expected to be much more reliable.  In the 2009 servicing mission all six gyroscopes were replaced, and after almost ten years only three gyroscopes failed, and only after exceeding the average expected run time for the design. 
Of the six gyroscopes replaced in 2009, three were of the old design susceptible to flex-lead failure, and three were of the new design with a longer expected lifetime. The first of the old-style gyroscopes failed in March 2014, and the second in April 2018. On October 5, 2018, the last of the old-style gyroscopes failed, and one of the new-style gyroscopes was powered up from standby. However, that reserve gyroscope did not immediately perform within operational limits, so the observatory was placed into "safe" mode while scientists attempted to fix the problem. NASA tweeted on October 22, 2018, that the "rotation rates produced by the backup gyro have reduced and are now within a normal range. Additional tests [are] to be performed to ensure Hubble can return to science operations with this gyro."
The solution that restored the backup new-style gyroscope to operational range was widely reported as "turning it off and on again".  A "running restart" of the gyroscope was performed, but this had no effect, and the final resolution to the failure was more complex. The failure was attributed to an inconsistency in the fluid surrounding the float within the gyroscope (e.g., an air bubble). On October 18, 2018, the Hubble Operations Team directed the spacecraft into a series of maneuvers—moving the spacecraft in opposite directions—in order to mitigate the inconsistency. Only after the maneuvers, and a subsequent set of maneuvers on October 19, did the gyroscope truly operate within its normal range. 
Instruments and electronics
Past servicing missions have exchanged old instruments for new ones, avoiding failure and making new types of science possible. Without servicing missions, all the instruments will eventually fail. In August 2004, the power system of the Space Telescope Imaging Spectrograph (STIS) failed, rendering the instrument inoperable. The electronics had originally been fully redundant, but the first set of electronics failed in May 2001.  This power supply was fixed during Servicing Mission 4 in May 2009.
Similarly, the Advanced Camera for Surveys (ACS) main camera primary electronics failed in June 2006, and the power supply for the backup electronics failed on January 27, 2007. Only the instrument's Solar Blind Channel (SBC) was operable using the side-1 electronics. A new power supply for the wide angle channel was added during SM 4, but quick tests revealed this did not help the high resolution channel. The Wide Field Channel (WFC) was returned to service by STS-125 in May 2009, but the High Resolution Channel (HRC) remains offline.
On January 8, 2019, Hubble entered a partial safe mode following suspected hardware problems in its most advanced instrument, the Wide Field Camera 3 instrument. NASA later reported that the cause of the safe mode within the instrument was a detection of voltage levels out of a defined range. On January 15, 2019, NASA said the cause of the failure was a software problem. Engineering data within the telemetry circuits were not accurate. In addition, all other telemetry within those circuits also contained erroneous values indicating that this was a telemetry issue and not a power supply issue. After resetting the telemetry circuits and associated boards the instrument began functioning again. On January 17, 2019, the device was returned to normal operation and on the same day it completed its first science observations.  
On June 13, 2021, Hubble's payload computer halted due to a suspected issue with a memory module. An attempt to restart the computer on June 14 failed. Further attempts to switch to one of three other backup memory modules onboard the spacecraft failed on June 18. As of June 19, scientific operations have been suspended while NASA continues to diagnose and resolve the issue.  
Orbital decay and controlled reentry
Hubble orbits the Earth in the extremely tenuous upper atmosphere, and over time its orbit decays due to drag. If not reboosted, it will re-enter the Earth's atmosphere within some decades, with the exact date depending on how active the Sun is and its impact on the upper atmosphere. If Hubble were to descend in a completely uncontrolled re-entry, parts of the main mirror and its support structure would probably survive, leaving the potential for damage or even human fatalities.  In 2013, deputy project manager James Jeletic projected that Hubble could survive into the 2020s.  Based on solar activity and atmospheric drag, or lack thereof, a natural atmospheric reentry for Hubble will occur between 2028 and 2040.   In June 2016, NASA extended the service contract for Hubble until June 2021. 
NASA's original plan for safely de-orbiting Hubble was to retrieve it using a Space Shuttle. Hubble would then have most likely been displayed in the Smithsonian Institution. This is no longer possible since the Space Shuttle fleet has been retired, and would have been unlikely in any case due to the cost of the mission and risk to the crew. Instead, NASA considered adding an external propulsion module to allow controlled re-entry.  Ultimately, in 2009, as part of Servicing Mission 4, the last servicing mission by the Space Shuttle, NASA installed the Soft Capture Mechanism (SCM), to enable deorbit by either a crewed or robotic mission. The SCM, together with the Relative Navigation System (RNS), mounted on the Shuttle to collect data to "enable NASA to pursue numerous options for the safe de-orbit of Hubble", constitute the Soft Capture and Rendezvous System (SCRS).  
Possible service missions
As of 2017, the Trump Administration was considering a proposal by the Sierra Nevada Corporation to use a crewed version of its Dream Chaser spacecraft to service Hubble some time in the 2020s, both as a continuation of its scientific capabilities and as insurance against any malfunctions in the to-be-launched James Webb Space Telescope. In 2020, John Grunsfeld said that SpaceX Crew Dragon or Orion could perform another repair mission within ten years. While robotic technology is not yet sophisticated enough, he said, with another crewed visit "We could keep Hubble going for another few decades" with new gyroscopes and instruments.
There is no direct replacement to Hubble as an ultraviolet and visible light space telescope, because near-term space telescopes do not duplicate Hubble's wavelength coverage (near-ultraviolet to near-infrared wavelengths), instead concentrating on the further infrared bands. These bands are preferred for studying high redshift and low-temperature objects, objects generally older and farther away in the universe. These wavelengths are also difficult or impossible to study from the ground, justifying the expense of a space-based telescope. Large ground-based telescopes can image some of the same wavelengths as Hubble, sometimes challenge HST in terms of resolution by using adaptive optics (AO), have much larger light-gathering power, and can be upgraded more easily, but cannot yet match Hubble's excellent resolution over a wide field of view with the very dark background of space.
Plans for a Hubble successor materialized as the Next Generation Space Telescope project, which culminated in plans for the James Webb Space Telescope (JWST), the formal successor of Hubble.  Very different from a scaled-up Hubble, it is designed to operate colder and farther away from the Earth at the L2 Lagrangian point, where thermal and optical interference from the Earth and Moon are lessened. It is not engineered to be fully serviceable (such as replaceable instruments), but the design includes a docking ring to enable visits from other spacecraft.  A main scientific goal of JWST is to observe the most distant objects in the universe, beyond the reach of existing instruments. It is expected to detect stars in the early Universe approximately 280 million years older than stars HST now detects.  The telescope is an international collaboration between NASA, the European Space Agency, and the Canadian Space Agency since 1996,  and is planned for launch on an Ariane 5 rocket.  Although JWST is primarily an infrared instrument, its coverage extends down to 600 nm wavelength light, or roughly orange in the visible spectrum. A typical human eye can see to about 750 nm wavelength light, so there is some overlap with the longest visible wavelength bands, including orange and red light.
A complementary telescope, looking at even longer wavelengths than Hubble or JWST, was the European Space Agency's Herschel Space Observatory, launched on May 14, 2009. Like JWST, Herschel was not designed to be serviced after launch, and had a mirror substantially larger than Hubble's, but observed only in the far infrared and submillimeter. It needed helium coolant, of which it ran out on April 29, 2013.
Selected space telescopes and instruments

| Telescope / instrument | Launch | Wavelength range | Aperture |
|---|---|---|---|
| Human eye | — | 0.39–0.75 μm | 0.01 m |
| Spitzer | 2003 | 3–180 μm | 0.85 m |
| Hubble STIS | 1997 | 0.115–1.03 μm | 2.4 m |
| Hubble WFC3 | 2009 | 0.2–1.7 μm | 2.4 m |
| Herschel | 2009 | 55–672 μm | 3.5 m |
| JWST | Planned | 0.6–28.5 μm | 6.5 m |
Further concepts for advanced 21st-century space telescopes include the Large Ultraviolet Optical Infrared Surveyor (LUVOIR),  a conceptualized 8 to 16.8 meters (310 to 660 inches) optical space telescope that if realized could be a more direct successor to HST, with the ability to observe and photograph astronomical objects in the visible, ultraviolet, and infrared wavelengths, with substantially better resolution than Hubble or the Spitzer Space Telescope. This effort is being planned for the 2025–2035 time frame.
Existing ground-based telescopes, and various proposed Extremely Large Telescopes, can exceed the HST in terms of sheer light-gathering power and diffraction limit due to larger mirrors, but other factors affect telescopes. In some cases, they may be able to match or exceed Hubble in resolution by using adaptive optics (AO). However, AO on large ground-based reflectors will not make Hubble and other space telescopes obsolete. Most AO systems sharpen the view over a very narrow field—Lucky Cam, for example, produces crisp images just 10 to 20 arcseconds wide, whereas Hubble's cameras produce crisp images across a 150 arcsecond (2½ arcminutes) field. Furthermore, space telescopes can study the universe across the entire electromagnetic spectrum, most of which is blocked by Earth's atmosphere. Finally, the background sky is darker in space than on the ground, because air absorbs solar energy during the day and then releases it at night, producing a faint—but nevertheless discernible—airglow that washes out low-contrast astronomical objects. 
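The resolution comparison above rests on the diffraction limit, θ ≈ 1.22 λ/D. A minimal sketch of that formula follows; Hubble's 2.4 m aperture is from the table above, while the 10 m Keck-class and 39 m ELT-class apertures are assumptions added for comparison:

```python
import math

def diffraction_limit_arcsec(wavelength_m: float, aperture_m: float) -> float:
    """Rayleigh criterion theta = 1.22 * lambda / D, converted to arcseconds."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

# Compare apertures at a representative visible wavelength of 550 nm.
for name, aperture_m in [("Hubble (2.4 m)", 2.4),
                         ("Keck-class (10 m)", 10.0),
                         ("ELT-class (39 m)", 39.0)]:
    print(f"{name:18s} {diffraction_limit_arcsec(550e-9, aperture_m):.4f} arcsec")
```

Hubble's theoretical limit works out to about 0.06 arcseconds at 550 nm; larger ground-based mirrors have smaller limits, which is why AO (not aperture) is the bottleneck for them in practice.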
New nanomanufacturing technique advances imaging, biosensing technology
More than a decade ago, theorists predicted the possibility of a nanolens -- a chain of three nanoscale spheres that would focus incoming light into a spot much smaller than possible with conventional microscopy. Such a device would make possible extremely high-resolution imaging or biological sensing. But scientists had been unable to build and arrange many nanolenses over a large area.
"That's where we came in," said Xiaoying Liu, senior research scientist at the University of Chicago's Institute for Molecular Engineering. Liu and Paul Nealey, the Dougan Professor in Molecular Engineering, teamed with experts in nanophotonics at the Air Force Research Laboratory and Florida State University to invent a novel way to build nanolenses in large arrays using a combination of chemical and lithographic techniques.
They aligned three spherical gold nanoparticles of graduated sizes in the string-of-pearls arrangement predicted to produce the focusing effect. The key, said Liu, was control: "We placed each individual nanoparticle building block into exactly the position we wanted it to go. That's the essence of our fabrication technique."
The team described its technique in the latest edition of Advanced Materials. The first step employs the lithographic methods used in making printed circuits to create a chemical mask. Liu and Nealey's mask leaves exposed a pattern of three spots of decreasing size on a substrate such as silicon or glass that won't absorb the gold nanoparticles.
Lithography allows for extremely precise and delicate patterns, but it can't produce three-dimensional structures. So the scientists used chemistry to build atop the patterned substrate in three dimensions. They treated the spots with polymer chains that were then tethered to the substrate through chemical bonds.
"The chemical contrast between the three spots and the background makes the gold particles go only to the spots," said Liu. To get each of the three sizes of nanospheres to adhere only to its own designated spot, the scientists played with the strength of the chemical interaction between spot and sphere. "We control the size of the different areas in the chemical pattern and we control the interaction potential of the chemistry of those areas with the nanoparticles," said Nealey.
Only the largest spot exerts enough force to attract and hold the largest particle; the interaction of that particle with the medium and small spots is too weak.
When the big spheres are adsorbed, the scientists use the same trick to put the medium-sized spheres onto the medium-sized spots, and finally move on to the smallest.
"It's like the Three Bears story," said Nealey. "We can put big ones on the big spots, but they won't stick to the smaller spots; then we put the next-sized one on the medium spot, but it won't stick to the small spot. By this sequential manufacturing we're able to arrive at these precise assemblies of three different-sized particles in close proximity to one another."
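The sequential selectivity Nealey describes can be caricatured as a toy model in which a particle adheres only where the spot's binding strength meets that particle's adhesion threshold; every name and number below is illustrative, not from the paper:

```python
# Toy model of the sequential "Three Bears" assembly. Spots have a relative
# binding strength; particles are deposited largest-first and stick to the
# first unoccupied spot strong enough to hold them. Values are made up.
SPOTS = {"large": 3.0, "medium": 2.0, "small": 1.0}        # binding strength
PARTICLES = [("large", 2.5), ("medium", 1.5), ("small", 0.5)]  # (size, threshold)

occupied = {}
for size, threshold in PARTICLES:            # largest spheres go in first
    for spot, strength in SPOTS.items():     # spots checked strongest-first
        if spot not in occupied and strength >= threshold:
            occupied[spot] = size            # particle sticks and stays put
            break

print(occupied)  # each sphere ends up on its matching spot
```

Because the large particle cannot stick to the weaker spots, and the weaker spots are still free when the smaller particles arrive, the ordering alone sorts the three sizes onto the three spots.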
The spheres are separated by only a few nanometers. It is this tiny separation, coupled with the sequential ordering of the different-sized spheres, that produces the nanolensing effect.
"You get this concentration in the intensity of the light between the small- and the medium-sized nanoparticles," said Nealey.
The scientists are already exploring using this "hot spot" for high-resolution sensing using spectroscopy. "If you put a molecule there, it will interact with the focused light," said Liu. "The enhanced field at these hot spots will help you to get orders of magnitude stronger signals. And that gives us the opportunity to get ultra-sensitive sensing. Maybe ultimately we can detect single molecules."
The researchers also foresee applying their manufacturing technique to nanoparticles of other shapes, such as rods and stars. "The physics of particles shaped differently than spheres enables even a wider spectrum of applications," said Nealey.
"There's a large range of properties that you could realize by putting particles with asymmetric shapes next to each other." The method will have broad application for any process that requires precision placement of materials in proximity to the same or different types of materials. It will, Nealey predicts, "be part of the way that nanomanufacturing is done."
Astronomy Across the Electromagnetic Spectrum
While all light across the electromagnetic spectrum is fundamentally the same thing, the way that astronomers observe light depends on the portion of the spectrum they wish to study.
For example, different detectors are sensitive to different wavelengths of light. In addition, not all light can get through the Earth's atmosphere, so for some wavelengths we have to use telescopes aboard satellites. Even the way we collect the light can change depending on the wavelength. Astronomers must have a number of different telescopes and detectors to study the light from celestial objects across the electromagnetic spectrum.
A sample of telescopes operating (as of February 2013) at wavelengths across the electromagnetic spectrum. Observatories are placed above or below the portion of the EM spectrum that their primary instrument(s) observe.
The represented observatories are: HESS, Fermi and Swift for gamma-ray, NuSTAR and Chandra for X-ray, GALEX for ultraviolet, Kepler, Hubble, Keck (I and II), SALT, and Gemini (South) for visible, Spitzer, Herschel, and SOFIA for infrared, Planck and CARMA for microwave, Spektr-R, Greenbank, and VLA for radio.
(Credit: Observatory images from NASA, ESA (Herschel and Planck), Lavochkin Association (Spektr-R), HESS Collaboration (HESS), SALT Foundation (SALT), Rick Peterson/WMKO (Keck), Gemini Observatory/AURA (Gemini), CARMA team (CARMA), and NRAO/AUI (Greenbank and VLA); background image from NASA)
Almost all we know about the universe derives from the observation of photons. Radio waves (and radar), infrared (used by night-vision goggles and heat-seeking missiles), visible light, ultraviolet waves like those that give you a suntan, X-rays, and the powerful and deadly gamma rays such as those now seen coming from neutron stars are all electromagnetic waves composed of photons. We are learning some further things about the cosmos beyond the solar system by observing cosmic rays, which are mostly made up of either atomic nuclei minus their orbiting electrons, or of one of their basic components, protons. But these positively charged particles do not point back to their place of origin, because the magnetic fields of our galaxy affect their flight paths as a magnet affects iron filings.
What is needed for deep, sharply focused examination of the universe is a telescope that can see a particle that is not much affected by the gas, dust, and swirling magnetic fields it passes on its journey. Neutrinos are a candidate. They constitute much of the total number of elementary particles in the universe, and these neutral weakly interacting particles (see section NEUTRINO) come to us almost without any disruption straight from their sources, traveling at very close to the speed of light. A (low energy) neutrino in flight would not notice a barrier of lead fifty light years thick. When we are able to see outwards in neutrino light we will doubtless get a wondrous new view of the universe.
Neutrinos in the Universe
The fundamental building blocks of the universe, of which all matter is composed, are the fermions: the quarks (up, down, charmed, strange, top and bottom) and the leptons (electron, muon and tau-on, plus a neutral partner for each: the electron-neutrino, muon-neutrino and tau-neutrino). As a result of data from the Super-Kamiokande experiment presented in 1998, we now know with high probability that some neutrinos have mass, and thus so do all the fermions. One of the greatest challenges to elementary particle physics is now to explain the great gap between neutrino masses and those of the electrically charged fundamental fermions (a factor of more than a hundred billion).
Neutrinos were made in staggering numbers at the time of the Big Bang. Like the cosmic background radiation (see section on this), these neutrinos now possess little kinetic energy (the energy of motion, like that of an incoming meteor) due to the expansion of the universe. There are expected to be at least 114 neutrinos per cubic centimeter, averaged over all space. There could be many more at earth because of condensation of neutrinos now moving slowly under the gravitational pull of our galaxy. As of now we have only a lower limit on the total mass in this free-floating, ghostly gas of neutrinos, but even so it is roughly equivalent to the total mass of all the visible stars in the universe.
These relic neutrinos would be wonderful to observe, and much thought has gone into seeking ways to do so. The problem is that the probability of neutrinos interacting within a detector decreases with the square of the neutrino's energy, for low energies. And even in the rare case when a neutrino does react in the detector, the resulting signal is frustratingly minuscule. Nobody has been able to detect these lowest energy neutrinos as yet. Prospects are not good for a low-energy neutrino telescope, at least in the near future.
Next best are neutrinos from the nuclear burning of stars. Here we are more fortunate, as we have the sun close by producing a huge flux (number per unit area per unit time) of neutrinos, which have now been detected in five experiments (see SOLAR NEUTRINOS). A thirty-year mystery persists in the deficit of about a half in the number of neutrinos detected compared to expectations, the so-called "Solar Neutrino Problem". This deficit is now thought most likely to be due to neutrino oscillations. Indeed, the fact that the expected neutrino flux from our sun can be calculated and found to be close to observations represents a great triumph for our understanding of stellar burning and evolution. So, in this sense, we are already doing neutrino astronomy. However, we are limited to the sun. Just as the sky is dark at night despite all the stars, the sun outshines all the rest of the cosmos in the number of neutrinos we detect.
A marvelous event occurred at 07:35:41 GMT on 23 February 1987, when two detectors in deep mines in the US (the IMB experiment) and Japan (the Kamiokande experiment) recorded a total of 19 neutrino interactions over a span of 13 seconds. Two and a half hours later (but reported sooner) astronomers in the Southern Hemisphere saw the first supernova to be visible with the unaided eye since the time of Kepler, nearly 400 years earlier; it occurred in the Large Magellanic Cloud at a distance of some 50 kiloparsecs (roughly 160,000 light years). From this spectacular beginning to neutrino astronomy followed many deductions about the nature of neutrinos, such as limits on mass, charge, gravitational attraction, magnetic moment, and so on, and several hundred publications. Never had so much science and astronomy been extracted from so few events. Supernovae of the gravitational-collapse type occur when elderly stars run out of nuclear fusion energy and can no longer resist the force of gravity. The neutrinos wind up carrying off most of the in-fall energy, some 10% of the total mass-energy of the inner part of the star, about 1.4 solar masses. Approximately 3 x 10^53 ergs are released with about 10^58 neutrinos over a few seconds. This is a staggering thousand times the solar energy release over its whole lifetime! The awesome visible fireworks account for a mere one thousandth of the energy released in neutrinos.
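As a sanity check on the round numbers quoted above, dividing the total energy by the number of neutrinos gives the mean energy per neutrino, which indeed lands in the tens-of-MeV range typical of supernova neutrinos. A minimal sketch (only the conversion constant is added; the inputs are the article's figures):

```python
ERG_PER_MEV = 1.602e-6       # 1 MeV expressed in ergs

total_energy_erg = 3e53      # energy carried off by neutrinos (from the text)
n_neutrinos = 1e58           # number of neutrinos emitted (from the text)

mean_energy_mev = total_energy_erg / n_neutrinos / ERG_PER_MEV
print(f"mean neutrino energy ~ {mean_energy_mev:.0f} MeV")
```

This works out to roughly 19 MeV per neutrino, consistent with the energies of the events IMB and Kamiokande actually recorded.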
Much can yet be learned from the death throes of stars, not only about the process of stellar collapse to a neutron star or black hole (the latter if the progenitor is very massive), but also about the properties of neutrinos. For example, heavier neutrinos travel more slowly, and by studying the structure of the neutrino wave passing by earth we can perhaps extract the relative masses of the three types of neutrinos in a direct way, independent of the phenomenon of neutrino oscillations. As of this time (1999), four underground detectors (Super-Kamiokande in Japan, SNO in Canada, and LVD and MACRO in Italy) have significant capability for supernova detection from our galaxy. The rate of visible supernovae in our galaxy is only about one per 200 years from historical information, but many cannot be seen optically due to the obscuration of the galactic plane. From historical records and from observations of distant spiral galaxies we expect the rate of supernovae in our galaxy to be between one per twenty and one per hundred years. Thus experimentalists may have to wait a long time before the next observation, and we have no way of predicting when it will occur.
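The "heavier neutrinos travel more slowly" effect can be made quantitative: for a relativistic particle with mc² ≪ E, the arrival delay relative to light is approximately Δt ≈ (L/2c)(mc²/E)². The sketch below uses illustrative inputs (a 1 eV mass, 10 MeV energy, 10 kpc distance), none of which come from the text:

```python
C = 2.998e8            # speed of light, m/s
KPC_M = 3.086e19       # meters per kiloparsec

def delay_seconds(distance_kpc: float, mass_ev: float, energy_mev: float) -> float:
    """Arrival delay of a massive neutrino relative to light:
    dt ~ (L / 2c) * (m c^2 / E)^2, valid for m c^2 << E."""
    ratio = mass_ev / (energy_mev * 1e6)    # both expressed in eV
    return distance_kpc * KPC_M / C * 0.5 * ratio**2

# A 1 eV neutrino at 10 MeV from 10 kpc lags light by a few milliseconds;
# higher-energy neutrinos from the same burst lag less, spreading the signal.
print(f"{delay_seconds(10.0, 1.0, 10.0) * 1e3:.2f} ms")
```

Because the delay scales as m², comparing arrival-time structure across energies is what would let a galactic supernova constrain the masses directly.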
High Energy Cosmic Neutrinos
Moving up in energy, physicists have realized for many years that higher energy neutrinos would inevitably be made in many of the most luminous and energetic objects in the universe. The most powerful objects seen are active galactic nuclei, which are known to produce particles with energies much higher than the most powerful human-constructed particle accelerators. There also exist enigmatic objects such as Gamma Ray Bursters, which may be the most energetic explosions observed and which are mostly at cosmological distances. These produce gamma rays up to great energies as well, and may be bountiful neutrino sources or not, depending upon the mechanism for the radiation, at present a mystery. Seemingly disallowed cosmic rays have been observed in recent years, with energies more than 100 million times greater than those of terrestrial accelerators (more than 10^20 eV). These mysterious particles apparently do not come from our galaxy, and indeed remain of unknown origin. In fact, after nearly a century of study, we do not know the origin of cosmic rays generally, particularly above about 1 PeV (10^15 eV), though many models have been proposed. Whatever the source, the machinery which accelerates particles to the highest energies will inevitably also produce neutrinos. At the highest energies many speculative models have been proposed as neutrino sources, including decays of Planck-mass objects left over from the Big Bang and radiation from around superconducting cosmic strings and the like; these would be exciting findings if verified, and of fundamental importance to particle physics and cosmology.
Thus we know that high energy neutrinos surely arrive to us from the cosmos, and may teach us much as we study their directions, energies, types and variation with time. The burning question for would-be neutrino astronomers, however, is: are there enough neutrinos to detect? Two things make near-term prospects brighter for higher energy neutrino astronomy than for lower energies.
First, the interaction probability for neutrinos goes up with energy. For the largest present underground detector, Super-Kamiokande, only about one in a trillion neutrinos of the typical energy (about 1 GeV, roughly the equivalent of the proton rest mass) interact when passing through the detector and can be studied. This probability goes up almost in proportion to the neutrino's energy, however. In fact, above about 1 PeV the earth is opaque to neutrinos and one must look only for neutrinos coming downwards. At lower energies one does neutrino astronomy backwards from optical astronomy: looking downwards, using the earth to filter out everything but neutrinos. It is this region between 1 TeV (10^12 eV) and 1 PeV, roughly, that is the favorite hunting ground for attempts to begin regular neutrino astronomy.
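The scaling just described can be sketched numerically. The one-in-a-trillion figure at 1 GeV and the roughly linear growth are taken from the text; extrapolating them as a straight line is a deliberate simplification (the real cross section flattens at high energy):

```python
def interaction_probability(energy_gev: float) -> float:
    """Crude scaling from the text: ~1e-12 per detector crossing at 1 GeV,
    rising roughly in proportion to energy (only valid well below ~1 PeV)."""
    return 1e-12 * energy_gev

# 1 GeV, 1 TeV, and 1 PeV in GeV units.
for energy_gev in (1.0, 1e3, 1e6):
    print(f"{energy_gev:>10.0f} GeV -> P ~ {interaction_probability(energy_gev):.0e}")
```

By ~1 PeV the per-crossing probability has grown a million-fold, which is also why the much longer path through the whole earth becomes opaque at those energies.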
The second virtue of seeking higher rather than lower energy neutrinos is that the consequences of a neutrino interaction with a target (earth or detector) become more detectable as the energy release grows. The favored method is to detect muons produced by neutrinos. These muons (unlike electrons or tau-ons) fly a long distance (in closely the same direction as the neutrino) through the earth before stopping, for example about one kilometer at an energy of a few TeV. These charged particles produce Cherenkov radiation, a short flash of light detectable at tens of meters distance by photomultipliers in clear water or ice. Cherenkov radiation occurs when particles exceed the velocity of light in the medium (75% of c in this case), and is rather like an electromagnetic version of a sonic boom, or the wake of a ship. Thus a detector can effectively collect the results of neutrino interactions from a target volume much greater than the detector volume itself.
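The Cherenkov condition mentioned above (v > c/n) fixes both the threshold speed and the angle of the light cone, via cos θ = 1/(nβ). A small sketch, assuming n ≈ 1.33 for water or ice:

```python
import math

N_WATER = 1.33   # refractive index of water/ice (approximate)

# Minimum speed (as a fraction of c) for Cherenkov emission: beta > 1/n.
threshold_beta = 1.0 / N_WATER

# Emission half-angle for an ultra-relativistic particle (beta ~ 1):
# cos(theta) = 1 / (n * beta).
angle_deg = math.degrees(math.acos(1.0 / N_WATER))

print(f"threshold: v > {threshold_beta:.2f} c")        # ~0.75 c, as in the text
print(f"Cherenkov cone half-angle: {angle_deg:.1f} deg")
```

The fixed ~41° cone angle is what lets arrays of photomultipliers reconstruct the muon's direction, and hence the neutrino's, from the arrival times of the light.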
High Energy Neutrino Telescopes
Neutrino detectors must generally be placed deep underground or underwater to escape the backgrounds caused by the inescapable rain of cosmic rays upon the atmosphere. The cosmic rays produce many muons which penetrate deeply into the earth, even into the deepest mines, though in ever decreasing numbers with depth. Hence the first attempts at high energy neutrino astronomy have been initiated underwater and under ice. The lead project, called DUMAND, was canceled in 1995 on account of slow progress and budget difficulties, but it managed to make great headway in pioneering techniques, studying backgrounds, exploring detector designs, and perhaps most importantly stimulating the community to consider neutrinos in astrophysics. Another long running project exists in Lake Baikal, the largest and deepest lake in the world, in Siberia. That instrument consists of large light detectors (0.4 m diameter) lowered on cables from the winter ice and connected to shore by cable. The Baikal project has reached a level of producing some modest physics results, including atmospheric neutrino detections, but is still a few years from significant neutrino astronomy since the present area amounts to only a few hundred square meters.
Two projects similar to DUMAND are underway in the Mediterranean, the more developed NESTOR Project located off Pylos in the Southwest of Greece, and the new ANTARES Project located offshore from Marseilles, France. Another project is being talked about for southern Italy as well. These projects differ in the method of supporting photo-detectors and array geometry, but basically employ the same method of bottom anchored cables with photomultipliers protected in spherical glass pressure housings, as developed for DUMAND. Both projects aim at prototype neutrino detectors in the near future (several years, with NESTOR a bit ahead). The prototype instruments will have effective areas for muon collection in the range of 20-50 thousand m^2 area. This may be compared with the largest present underground instruments which are about 1000 m^2 area, and the desired size for real astronomy of about one million m^2 (a square kilometer).
The deep ocean water is amazingly clear with optical attenuation lengths of 40-50 meters. Instruments can be spaced a few tens of meters apart to detect most muons passing nearby. An array of vertical strings of such detectors can cover a whole cubic kilometer employing roughly the same number of detectors as in the existing Super-Kamiokande deep mine instrument. Of course, placing these photo-detectors in the deep ocean is much more tricky and costly than in a tank in a mine, but the point is that such detectors are now well within the realm of technical feasibility and costs not large compared to equivalent scientific endeavors.
A different type of neutrino telescope is under construction at the South Pole, in ice, the AMANDA Project. It turns out that ice below about 1.4 km depth is quite clear (100 m attenuation length) and bubble free, though optical scattering is still somewhat of a problem (25 m effective scattering length). The experimenters have worked out a method to use hot water to drill 2 km deep holes, down which they lower strings of photomultipliers. The instruments become permanently frozen-in after about a day, but the cables can be accessed at the surface, so no complex and expensive electronics need be placed in the inaccessible holes. This array is topologically rather like the underwater arrays, turned bottom-side up. The AMANDA group has reported the detection of a few upcoming neutrino events, a demonstration of feasibility.
There have been many discussions about the relative virtues of the deep lake, ocean and ice approaches, and each has attractions and liabilities (for example, while the underwater arrays can be retrieved for service or reconfiguration, the local background light is worse and the access is not easy). At this time all four (possibly five) projects are making progress and working towards detectors of a few tens of thousands of m^2 in a few years. Hence, it seems likely that real high energy neutrino astronomy with kilometer scale projects is still about a decade away. Meanwhile, calculations go on, and underground detectors wait patiently for the next galactic supernova. In the very long run, as has been the case with every venture into new parts of the electromagnetic spectrum, one can be sure that neutrino astronomy will teach us many new and unexpected wonders as we open a new window upon the universe.
KCRM – The Keck Cosmic Reionization Mapper will complete the Keck Cosmic Web Imager (KCWI), the world’s most capable spectroscopic imager. The design for KCWI includes two separate channels to detect light in the blue and the red portions of the visible wavelength spectrum. KCWI-Blue was commissioned and started routine science observations in September 2017. The red channel of KCWI is KCRM, a powerful addition that will open a window for new discoveries at high redshifts.
KPF – The Keck Planet Finder (KPF) will be the most advanced spectrometer of its kind in the world. The instrument is a fiber-fed high-resolution, two-channel cross-dispersed echelle spectrometer for the visible wavelengths and is designed for the Keck II telescope. KPF allows precise measurements of the mass-density relationship in Earth-like exoplanets, which will help astronomers identify planets around other stars that are capable of supporting life.