How are image credits expressed in Astronomy presentations?

Sorry if this is a stupid question. I'm preparing an astronomy presentation, and I want to make sure I give credits where credits are due. But I see credits appear in different formats, for example:

  • ESA/Hubble & NASA, V. Antoniou; Acknowledgment: Judy Schmidt - a slash and an ampersand
  • ESA/Hubble & NASA, B. Nisini - a slash and an ampersand
  • NASA, ESA, and R. Humphreys (University of Minnesota), and J. Olmsted (STScI) - commas
  • ESA/Hubble & NASA, L. Stanghellini - commas and ampersand

If possible, I want all of the credits in my presentation to have the same format. So essentially, my question is: do a comma and a slash mean different things in image credits? Or can I write everything using the same format, e.g. for the four photos above:

  • ESA / Hubble / NASA / V. Antoniou / Judy Schmidt
  • ESA / Hubble / NASA / B. Nisini
  • NASA / ESA / R. Humphreys (University of Minnesota) / J. Olmsted (STScI)
  • ESA / Hubble / NASA / L. Stanghellini

There is no conventional format, really. For publication purposes, if one uses data from a telescope, an image, code created by someone else, or similar, the source usually requests being cited or acknowledged in a form specific to its own taste. Some examples include what you quoted, or “This paper makes use of the following ALMA data: ADS/JAO.ALMA#******. ALMA is a partnership of… ”.

To answer your question:

  • For ESA and NASA there is probably a difference between "/" and "&".
  • For a publication in a journal you must follow the exact wording requested by the source.
  • If you are just giving a presentation, no one minds if you use your favourite formatting, as long as you acknowledge all parties involved.

I'll note that the only use of "/" in your list seems to be in "ESA/Hubble", and I suspect this is supposed to mean something like "the part of ESA [European Space Agency] devoted to Hubble". It shows up on the website a lot, for example.

So the best way to think about this may be: "Only use '/' when it's part of the name of an entity or organization -- e.g. 'ESA/Hubble'; don't use it as a general separator." Otherwise, it's a standard list format: two entities/persons should be combined with "and" or an ampersand, three or more combined with commas and a final "and" or ampersand. (I would use Oxford commas for this, but that is, alas, really a matter of taste ;-)

And, as @MeL and I and others have pointed out, there really isn't a standard format for presentations. Ampersands save a little space and look a little less formal than "and", but it's up to you which to use.

Data Access

All articles, presentations, and other publications using data obtained using NSO facilities must acknowledge the facility with which the data were collected, the program under which it falls (NISP or DKIST), and the sponsorship of these facilities by the National Science Foundation. Please use this template:

“This work utilizes __[instrument]__ data obtained by the __[program]__, managed by the National Solar Observatory, which is operated by the Association of Universities for Research in Astronomy (AURA), Inc. under a cooperative agreement with the National Science Foundation.”

Abbreviated version: “Data were acquired by __[instrument]/[program]__ operated by NSO/AURA/NSF.”

Data obtained with NSO telescopes and instrumentation is required to be placed in the public domain. The data shall remain available for the exclusive use of the original investigators for a period of eighteen months following the completion of the observing program. After this interval, the data will be available to any qualified investigator who submits a proposal to use the data.

For data that reside at an NSO facility, such a proposal for access to this data will be subject to review and the investigator may be subject to charges for any costs incurred in the copying or transfer of the data. NSO will inform the original investigator of such requests and encourages those who wish to use the data to communicate with the original investigator concerning the details of the data and their suitability for the proposed investigation.

For NSO data that does not reside at an NSO facility, requests should be made directly to the principal investigator(s) (PI) after the eighteen-month period. The PI is responsible for sharing the data and shall inform NSO of the additional utilization, or of any difficulties in complying with the request.

Data taken with an instrument for engineering or test purposes may have scientific value. The availability of such data to outside investigators will be considered on a case-by-case basis and is at the discretion of the Director.

Synoptic data that are taken routinely by any NSO facility are available to qualified investigators without a waiting period.

In all instances, the investigator can apply to the Director for exceptions to this policy.

A short introduction to astronomical image processing

An image is an array, or a matrix, of square pixels (picture elements) arranged in columns and rows.

In an 8-bit greyscale image each picture element has an assigned intensity that ranges from 0 to 255. A greyscale image is what people normally call a black and white image, but the name emphasizes that such an image will also include many shades of grey.

A normal greyscale image has 8-bit colour depth = 256 greyscales. A "true colour" image has 24-bit colour depth = 8 + 8 + 8 bits = 256 x 256 x 256 = 16,777,216 colours.

Some greyscale images have more greyscales, for instance 16 bit = 65,536 greyscales. In principle three 16-bit greyscale images can be combined to form a colour image with 65,536 x 65,536 x 65,536 = 281,474,976,710,656 colours.
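The arithmetic above can be checked directly. As a minimal sketch (using NumPy arrays, which is how such images are commonly held in Python):

```python
import numpy as np

# An 8-bit greyscale image is a 2-D array of intensities 0-255.
grey = np.zeros((4, 6), dtype=np.uint8)   # 4 rows x 6 columns, all black
grey[1, 2] = 255                          # one white pixel

# A 24-bit "true colour" image adds a third axis: three 8-bit channels.
rgb = np.zeros((4, 6, 3), dtype=np.uint8)

# The counts quoted in the text:
print(2 ** 8)          # 256 greyscales in an 8-bit image
print((2 ** 8) ** 3)   # 16777216 colours in a 24-bit image
print((2 ** 16) ** 3)  # 281474976710656 colours from three 16-bit images
```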

There are two general groups of 'images': vector graphics (or line art) and bitmaps (pixel-based or 'images'). Some of the most common file formats are:

  • GIF - an 8-bit (256 colour), non-destructively compressed bitmap format. Mostly used for web. Has several sub-standards one of which is the animated GIF.
  • JPEG - a very efficient (i.e. much information per byte) destructively compressed 24 bit (16 million colours) bitmap format. Widely used, especially for web and Internet (bandwidth-limited).
  • TIFF - the standard 24 bit publication bitmap format. Compresses non-destructively with, for instance, Lempel-Ziv-Welch (LZW) compression.
  • PS - Postscript, a standard vector format. Has numerous sub-standards and can be difficult to transport across platforms and operating systems.
  • PSD - a dedicated Photoshop format that keeps all the information in an image including all the layers.

For science communication, the two main colour spaces are RGB and CMYK.

The RGB colour model relates very closely to the way we perceive colour with the r, g and b receptors in our retinas. RGB uses additive colour mixing and is the basic colour model used in television or any other medium that projects colour with light. It is the basic colour model used in computers and for web graphics, but it cannot be used for print production.

The secondary colours of RGB – cyan, magenta, and yellow – are formed by mixing two of the primary colours (red, green or blue) and excluding the third colour. Red and green combine to make yellow, green and blue to make cyan, and blue and red form magenta. The combination of red, green, and blue in full intensity makes white.

In Photoshop using the “screen” mode for the different layers in an image will make the intensities mix together according to the additive colour mixing model. This is analogous to stacking slide images on top of each other and shining light through them.
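The "screen" blend described above can be sketched in a few lines; the formula below is the standard screen-mode definition on intensities scaled to 0..1 (NumPy used for convenience):

```python
import numpy as np

def screen(a, b):
    # Screen mode: invert both layers, multiply, invert again.
    # Screening with black (0) leaves the other layer unchanged;
    # screening with white (1) always gives white.
    return 1.0 - (1.0 - a) * (1.0 - b)

a = np.array([0.0, 0.5, 1.0])
b = np.array([0.0, 0.5, 0.5])
print(screen(a, b))  # values 0.0, 0.75, 1.0
```

Like stacked slides, the result is always at least as bright as either layer alone.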

The 4-colour CMYK model used in printing lays down overlapping layers of varying percentages of transparent cyan (C), magenta (M) and yellow (Y) inks. In addition a layer of black (K) ink can be added. The CMYK model uses the subtractive colour model.

The range, or gamut, of human colour perception is quite large. The two colour spaces discussed here span only a fraction of the colours we can see. Furthermore the two spaces do not have the same gamut, meaning that converting from one colour space to the other may cause problems for colours in the outer regions of the gamuts.
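As a rough illustration of subtractive CMYK versus additive RGB, here is the simplest textbook RGB-to-CMYK conversion. Note that this naive formula ignores exactly the gamut and profile issues the paragraph above warns about; real print workflows use ICC colour profiles instead.

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB (0..1) to CMYK (0..1) conversion: the black (K) component
    is pulled out first, then the remaining inks are rescaled."""
    k = 1.0 - max(r, g, b)
    if k == 1.0:
        return 0.0, 0.0, 0.0, 1.0   # pure black: only the K plate prints
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

print(rgb_to_cmyk(1.0, 0.0, 0.0))  # (0.0, 1.0, 1.0, 0.0): red = magenta + yellow
```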

Astronomical Images

Images of astronomical objects are usually taken with electronic detectors such as a CCD (Charge Coupled Device). Similar detectors are found in normal digital cameras. Telescope images are nearly always greyscale, but nevertheless contain some colour information. An astronomical image may be taken through a colour filter. Different detectors and telescopes also usually have different sensitivities to different colours (wavelengths).

A telescope such as the NASA/ESA Hubble Space Telescope typically has a fixed number of well-defined filters. A filter list for Hubble’s WFPC2 (Wide Field and Planetary Camera 2) camera is seen to the right.

Filters can either be broad-band (Wide) or narrow-band (Narrow). A broad-band filter lets a wide range of colours through, for instance the entire green or red area of the spectrum. A narrow-band filter typically only lets a small wavelength span through, thus effectively restricting the transmitted radiation to that coming from a given atomic transition, allowing astronomers to investigate individual atomic processes in the object.

A filename such as 502nmos.fits indicates that the filter used has a peak at 502 nm. In the table below, you can see that this filter is a narrow bandwidth filter, i.e. it only lets radiation with wavelengths within a few nm of 502 nm through.
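Pulling the peak wavelength out of such a filename can be sketched as below; `filter_peak_nm` is a hypothetical helper for names in exactly this style, not part of any FITS library:

```python
import re

def filter_peak_nm(filename):
    # Hypothetical helper: read the leading digits of a Hubble-style
    # exposure name such as "502nmos.fits" as a wavelength in nm.
    m = re.match(r"(\d+)", filename)
    return int(m.group(1)) if m else None

print(filter_peak_nm("502nmos.fits"))  # 502
```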

Below is an example of an image composed from narrow-band exposures. This results in very sharply defined wisps of nebulosity since each exposure separates light from only some very specific physical processes and locations in the nebula.

Galaxies are often studied through broad-band filters as they allow more light to get through. Also, the processes in a galaxy are more 'mixed' or complicated, resulting from the outputs of billions of stars, and so narrow-band filters give less 'specific' information about the processes there.

A figure illustrating the process of stacking together different colour exposures is seen in figure 10.

A figure of the process of stacking together different colour exposures is seen in figure 11 to the right.

Assigning colours to different filter exposures
The astronomical images we see on the web and in the media are usually 'refined' or 'processed' compared to the raw data that the astronomers work on with their computers. In 'pretty pictures', for instance, all artefacts coming from the telescope or the detectors are removed, as they do not say anything about the objects themselves. It is very rare that images are taken with the sole intention of producing a 'pretty' colour picture. Most 'pretty pictures' are constructed from data that was acquired to study some physical process, and the astronomer herself probably never bothered to assemble the greyscale images into a colour image.

Natural colour images
It is possible to create colour images that are close to “true-colour” if three wide band exposures exist, and if the filters are close to the r, g and b receptors in our eyes. Images that approximate what a fictitious space traveller would see if he or she actually travelled to the object are called “natural colour” images.

To make a natural colour image the order of the colours assigned to the different exposures should be in “chromatic order”, i.e. the lowest wavelength should be given a blue hue, the middle wavelength a green hue and the highest wavelength should be red.
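A minimal sketch of that chromatic-order assignment, with three toy greyscale exposures standing in for real ones (NumPy assumed):

```python
import numpy as np

# Toy 2x2 greyscale exposures through filters at increasing wavelength,
# already scaled to 0..255.
exp_short = np.full((2, 2),  40, dtype=np.uint8)   # shortest wavelength
exp_mid   = np.full((2, 2), 120, dtype=np.uint8)   # middle wavelength
exp_long  = np.full((2, 2), 200, dtype=np.uint8)   # longest wavelength

# Chromatic order: shortest -> blue, middle -> green, longest -> red.
colour = np.dstack([exp_long, exp_mid, exp_short])  # RGB channel order
print(colour.shape)  # (2, 2, 3)
```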

Representative colour images
If one or more of the images in a data set is taken through a filter that passes radiation outside the span of human vision (i.e. it records radiation invisible to us), it is of course not possible to make a natural colour image. But it is still possible to make a colour image that shows important information about the object. This type of image is called a representative colour image. Normally one would assign colours to these exposures in chromatic order, with blue assigned to the shortest wavelength and red to the longest. In this way it is possible to make colour images from electromagnetic radiation far outside the range of human vision, for example X-rays. Most often it is either infrared or ultraviolet radiation that is used.

Enhanced colour images
Sometimes there are reasons to not use a chromatic order for an image. Often these reasons are purely aesthetic, as is seen in the example below. This type of colour image is an enhanced colour image.

You are the judge
When processing raw science images one of the biggest problems is that, to a large degree, you are ‘creating’ the image and this means a colossal freedom within a huge parameter space. There are literally thousands of sliders, numbers, dials, curves etc. to twist and turn.

Speaking of right and wrong, there really are no wrong or right images. There are some fundamental scientific principles that should normally be observed, but the rest is a matter of aesthetics — taste. Chromatic ordering of the exposures is one of the important scientific principles.

Stretch function
One particularly important aspect of image processing is the choice of the best stretch function. You choose which “stretch function” or representation to use in the Fits Liberator window.
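What a stretch function does can be sketched as follows; the linear, log and asinh variants below are common choices, though FITS Liberator's own implementations may differ in detail:

```python
import numpy as np

def stretch(data, mode="linear"):
    # Normalise raw pixel values to 0..1, then optionally compress the
    # bright end so faint detail becomes visible on screen.
    d = (data - data.min()) / (data.max() - data.min())
    if mode == "log":
        return np.log1p(1000.0 * d) / np.log1p(1000.0)
    if mode == "asinh":
        return np.arcsinh(10.0 * d) / np.arcsinh(10.0)
    return d  # plain linear stretch

raw = np.array([0.0, 1.0, 10.0, 100.0])
print(stretch(raw, "log"))  # faint values are lifted relative to linear
```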

Astronomy in Africa: The Past, The Present and The Future

The history of science and technology in Africa has received little attention compared to that of other regions of the world, despite notable developments in various fields. Africa is home to one of the world's oldest technological achievements, with evidence found in northeastern Africa: a 7,000-year-old stone circle known as Nabta Playa, located approximately 100 kilometres west of Abu Simbel in southern Egypt. It is older than Stonehenge in England, the world's most famous prehistoric monument, which was erected some 5,000 years ago, around 2,500 BC.

The site stands 700 miles from the Great Pyramid of Giza in Egypt and, by the time of its construction, is the oldest known stone circle and possibly Earth's oldest astronomical site. According to an article published by Astronomy, a leading resource for columns and articles on sky viewing, astronomy and astrophysics, the 7,000-year-old stone circle tracked the summer solstice and the arrival of the annual monsoon season.

The Nabta Playa Site

It is said to have been constructed by a cattle-worshipping cult of nomadic people. In a statement to Astronomy, McKim Malville, a professor emeritus at the University of Colorado and archaeoastronomy expert, stated that "the existence of the site is human beings' first attempt to make a serious spiritual connection with the heavens". He further added that "the awakening of the people to begin to construct the Nabta Playa was the dawn of observational astronomy".

As a new millennium begins, scholars still consider the study of astronomical practices in African societies an open field. Astronomical studies exist in Arabic, Ge'ez, Hausa and Swahili, covering celestial symbols such as the sun, moon, stars and comets. This material, embedded in African history, shows new evidence of African involvement in astronomy, so it is no surprise that Muusa Galaal's work on Somali ethnoastronomy is derived from oral literature passed down over generations (Galaal 1992).

An illustration of the Nabta Playa being used [Credits: The Human Project]

For many years, societies in Africa arranged stones to align with the stars and sun to mark seasons, to know when to harvest crops and to mark celebrations.

The rest of the world has, however, surpassed Africa in architecture, technology, science and, especially, astronomy, even though Africans were pioneers. In a study on the development of astronomy in Africa by Govinder K., published in June 2011, a significant challenge identified is the absence of public understanding of modern scientific knowledge, leading to misconceptions. Because different cultures have different relationships with the sky and its workings, the introduction of contemporary science that clashes with existing beliefs may lead the public to write it off as a mere tale or perceive it as evil.

African researchers have expressed the challenges they face in pushing their research forward. According to Edward Jurua, a physicist and founder of the astronomy programme at Mbarara University of Science and Technology in Uganda, "it was an issue recruiting members for a new astronomy programme, as astronomy was not offered in any of the universities in Uganda, coupled with the lack of financial resources".

“In a country where over 70% of the people cannot afford three square meals per day, how would they afford to fund astronomy? The lucky ones who are able to further their dreams of studying astronomy often meet mentors from outside the continent who motivate them.”

Pankyes Datok, a PhD student in hydrology and biogeochemistry at Paul Sabatier University, Toulouse, France, highlights as a challenge the lack of awareness of modern science and technology on the African continent: “when the people are made to understand, inspired and motivated, more people would spring up to study and collaborate on astronomy”.

Salma Sylla Mbaye, the first PhD student in astronomy in Senegal, at Cheikh Anta Diop University, Dakar, mentions the “unavailability of resources such as telescopes and computers and the lack of lecturers in that field of study” as another challenge. Students in Africa wish to pursue astronomy out of interest, but there are no scholars to teach them.

Various technologies have grown out of the development of astronomy, and the nations that have taken them up have benefited. According to researchers, the camera in the iPhone, a device created and developed by Apple, uses a charge-coupled device, a sensor that converts the movement of electrical charge into a digital value. This invention, rooted in astronomy, now benefits the societies involved, boosting their technological advancement.

The computer language Forth, initially developed for the 36-foot telescope on Kitt Peak, is now used by FedEx to track packages. Spin-offs like Forth are examples of what Africa has missed through the slow development of astronomy while the rest of the world has forged ahead.

Until recently, South Africa, Namibia, Morocco, Egypt and Algeria, with their optical observatories, were the only astronomy reference points in Africa. As of 2019, twelve African countries had launched forty-one satellites, thirty-eight of them by eleven of those countries individually, while the other three programmes involved several African countries in multilateral projects.

The future of astronomy in Africa now shines brightly as countries take extra steps, beginning with a groundbreaking African achievement: the launch of the 64-dish MeerKAT array in South Africa. The project, launched in 2018 by Deputy President David Mabuza, serves as an inspiration for the aspirations of the people. The South African MeerKAT radio telescope, currently the largest in Africa, is the first element of the SKA series and will be integrated into the mid-frequency component of the SKA. It will be critical to building the largest and most sensitive radio telescope in the world, with sites in South Africa and Australia.

Photo: South African Astronomical Observatory. Source: TravelGround

With several astronomical observatories already in Africa, realizing the continent's full astronomical potential is vital for its development. According to the African Astronomical Society, the Southern African Large Telescope (SALT) is, at 11 metres, the largest single optical telescope in the southern hemisphere. It is funded by India, South Africa, Germany, the United States, the United Kingdom, Poland and New Zealand, and is located in Sutherland.

The second largest, the 1.9 m SAAO telescope, was built for the Radcliffe Observatory in Pretoria but is now located in Sutherland. The largest telescope in North and Central Africa and the Middle East (1.88 m in diameter) is at the Kottamia Observatory, northeast of Helwan in Egypt.

The UFS-Boyden Rockefeller telescope (1.5 m), located at Boyden Observatory in South Africa, has been used extensively since the early 1930s. The Dall-Kirkham reflector and the Parks telescope, with apertures of 0.45 m and 0.41 m, were constructed in 1955 and 1994 respectively and are located at the SAAO in Cape Town.

In 2020, the University of Namibia and Radboud University Nijmegen, through a partnership signed in 2016, were reported to have added to the string of mega astronomy projects: the Africa Millimetre Telescope, a 15 m single-dish radio telescope to be positioned on the Gamsberg Mountain in Namibia, as reported in April 2020. Its purpose is to provide a link to a grid of telescopes located around the world known as the Event Horizon Telescope (EHT).

Aside from providing that link, the Africa Millimetre Telescope will be the only millimetre-wave radio telescope in Africa. It will be an excellent opportunity for scholars, researchers, science enthusiasts and Namibia as a whole.


Technology Innovation

Astronomy is undergoing a revolution in the way we probe the universe and the way we answer fundamental questions. New technology enables this: novel detectors are opening new windows on the Universe, creating unprecedented volumes of high-quality data, and computing technology is keeping up with this explosion, driving a shift in the way science is produced in astronomy and astrophysics. Huge surveys of the sky over many wavelengths of light can be analyzed statistically for hidden correlations, and explanations for puzzling observational data can be found using the technique of statistical inversion, aiding a better understanding of the underlying physics. Rubin Observatory is the lighthouse project in this revolution, and solutions to Rubin Observatory's challenges are already having spin-off effects in broader areas of technology and “big data” science.

Realizing the vision of Rubin Observatory requires facing and solving extraordinary engineering and technological challenges: the fabrication of large, high-precision non-spherical optics; construction of a huge, highly integrated array of sensitive, wide-band imaging sensors; and the operation of a data management facility handling tens of terabytes of data each day. The design and development effort includes structural, thermal, and optical analyses of all key hardware subsystems, prototyping and development of data management systems, and extensive systems engineering studies. To validate system performance, full end-to-end simulations are being done, with more than 100 technical personnel at a range of institutions currently engaged in this program.

Rubin Observatory R&D has led to a new-generation imaging CCD which is highly segmented, low noise, and sensitive from the UV to the near IR. The speed with which Rubin Observatory can cover half the sky will produce about 15 terabytes (TB) per night, leading to a total database over the ten years of operations of order 50 petabytes (PB) for the raw data, and 15 PB for the catalog database. The total data volume after processing will be over 100 PB, processed using 250 trillion floating-point operations per second of computing power. Processing such a large volume of data, converting the raw images into a faithful representation of the universe, automating data quality assessment, and archiving the results in useful form for a broad community of users is a major challenge.
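A quick back-of-envelope check of those volumes (the 15 TB/night rate is from the text; observing every night of the year is my simplifying assumption, so this is an upper bound):

```python
tb_per_night = 15          # raw data rate quoted in the text
nights_per_year = 365      # assumption; weather and maintenance reduce this
years = 10

raw_pb = tb_per_night * nights_per_year * years / 1000  # TB -> PB
print(raw_pb)  # 54.75, consistent with "of order 50 PB" of raw data
```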

The acquisition of scientific data in all disciplines is now accelerating, causing a nearly insurmountable data avalanche. It is no longer possible for humans to look at any representative fraction of the data. Instead, we may be looking over the shoulders of assisted learning machines at innovative visualizations of metadata. Discoveries will be made via searches for correlations. The role of the experimental scientist increasingly is as inventor of ambitious new searches and new algorithms. Novel theories of nature are tested through searching for predicted statistical relationships across big databases.

In this era of big data, we will require novel, increasingly automated, and increasingly more effective ways to fish scientific knowledge from oceans of bytes.

The scientific database will include:
  • A source catalog with 7 trillion rows
  • An object catalog with 37 billion rows, each with 200+ attributes
  • A moving object catalog with 6 million rows
  • An alerts database, with alerts issued worldwide within 60 seconds, and
  • Calibration, configuration, processing, and provenance metadata.

The science archive will consist of 400,000 sixteen-megapixel images per night (for 10 years), comprising 60 PB of pixel data. This enormous Rubin Observatory data archive and object database enables a diverse multidisciplinary research program: astronomy and astrophysics; machine learning (data mining); exploratory data analysis; extremely large databases; scientific visualization; computational science and distributed computing; and inquiry-based science education (using data in the classroom). The advances in these technology areas will be exported to other big data science applications (biology, remote sensing, etc.) and will drive innovations in industry.

Birth of Space-based Astronomy

Space Science Board 31 March 1961 Letter Report to NASA, “Man's Role in the National Space Program”: The Space Science Board was appointed in 1958 and charged to survey the scientific aspects of the human exploration of space through the use of rockets and satellites. This is the cover letter of a report outlining the Board's policy recommendations to James Webb, the NASA administrator at the time.
Credit: National Academy of Sciences

October 4, 1957, changed everything. That is when the Soviet Union proved that Earth-orbiting satellites were technologically possible by launching Sputnik.

After the launch of Sputnik, the US government was eager to send its own satellites into orbit around the Earth. I was immediately contacted by the Air Force about Princeton working out some of the problems of sending an intermediate-sized telescope into space orbit.

As a result of this new fervor, NASA was formed in 1958 and began forming a community of scientists to think about the possibilities of doing astronomy from space in a very real way. In the early 1960s working groups were organized by the Space Science Board of the National Academy of Sciences at NASA's request to discuss the future of space research and it was at these meetings that conversations about a large space telescope began in earnest. I was part of all of these groups. During this time, programs were under way to send medium-sized telescopes into Earth orbit. These were called the Orbiting Solar Observatory and Orbiting Astronomical Observatory programs. It was seeming more and more likely that a large, general purpose space-based telescope was becoming a reality.

Immediately after it was formed in 1958, NASA began planning the launch of three Orbiting Astronomical Observatories (OAOs), each designed to carry out astronomical research at ultraviolet wavelengths, using telescopes up to a meter in diameter.

For me, perhaps the highpoint of my career was that day in August 1972 when our ultraviolet spectrometer aboard the Copernicus satellite (OAO-3) was turned on from the Goddard Space Flight Center, several days after launch, and was found to be operating properly. Copernicus was the fourth attempted launch of an OAO satellite, and the second successful mission. It remained in operation for 9 years.

Copernicus Satellite Control Room: The control room at Goddard Space Flight Center from which Lyman Spitzer’s Princeton group controlled the instruments on the Copernicus satellite. PEP on the foreground console stands for Princeton Experimental Package.
Credit: Image courtesy Edward Jenkins

AstroCom NYC program

AstroCom NYC scholars have entered graduate programs at Berkeley, Brandeis, Cornell, Harvard, Heidelberg, Rice, Yale, Miami, UConn, and others, as well as industry, and have received prestigious fellowships from the NSF, the American Museum of Natural History, and Fulbright awards. We have years of experience mentoring undergraduates in research on galaxy properties and evolution, nearby and low mass stars, evolved stars, numerical simulations and modeling, and observational astronomy from the radio to gamma-rays. Our faculty work at our home CUNY campuses, the American Museum of Natural History, and the Flatiron Institute Center for Computational Astrophysics.

Students should be enrolled in a CUNY 2-yr or 4-yr college, and have an interest in pursuing a career in physics and astronomy as evidenced by their application essays, and choice of courses and major. Students from groups underrepresented in the sciences – e.g., African Americans, Hispanics – are particularly encouraged to apply.

For further information email AstroComNYC at , or contact any astrophysics faculty at your CUNY campus.

AstroCom NYC is supported in part by a National Science Foundation Partnerships in Astronomy and Astrophysics Research and Education grant. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

This entry is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.

February 25, 2008

Lunar Eclipse Pictures!

Photo Credit: Sjoerd Witteveen

You didn't need a telescope to see last week's lunar eclipse. YRMG Staff Photographer Sjoerd Witteveen took this series of pictures showing the progression of the eclipse over the evening and these images are similar to what you could see with the naked eye. My favourite part is seeing the early stages when the moon appears to have a bite taken out of it. As things progress and the moon enters the full shadow of the earth, you see an orangey red hue cast upon the lunar surface. The colour is due to refracted sunlight bending around the edges of the earth and reaching the moon. Just imagine, that's all the sunsets and sunrises on earth at that moment painting our Luna in a beautiful light!

This weekend from Friday February 29th to Sunday March 2nd, I'll be at the Markham Spring Home Show . If you would like to learn more about astronomy I'll be at the Starlight Learning booth. In addition, I'll be making a presentation on "Backyard Astronomy" during two public lectures at the Home Show stage at 6:45 PM Friday and 5:45 PM Saturday.

PH-200 Series Courses

PH-201 General Physics I (1C & 2E)*

3 class hours 1 recitation hour 2 laboratory hours 4 credits
Prerequisite: MA-114 or MA-119 and MA-121 or the equivalent, or satisfactory score on the Mathematics Placement Test, Level II.

A beginning course for technology students. Topics include units, vectors, equilibrium, linear motion, Newton’s laws, circular motion, angular motion, momentum, and fluid motion. Emphasis is on applications. A working knowledge of simple algebra is assumed.

* Course qualifies as Pathways Common Core 1C–Life & Physical Sciences and 2E–Scientific World STEM Variant.

PH-202 General Physics II (1C & 2E)*

3 class hours 1 recitation hour 2 laboratory hours 4 credits
Prerequisite: PH-201 (with a grade of C or better)

Second semester of PH-201, 202 sequence. Topics include vibration and wave motion, electrostatics, electric and magnetic fields, electromagnetic waves, optics and topics in modern physics.

* Course qualifies as Pathways Common Core 1C–Life & Physical Sciences and 2E–Scientific World STEM Variant.

PH-229 Introduction to Photonics

2 class hours 1 recitation hour 3 laboratory hours 4 credits
Corequisite: MA-114

Topics in optics related to lasers and optical fiber and devices for modulating and directing signals from such devices. Students will study geometrical optics with emphasis on ray tracing and the application to lenses (thick and thin), mirrors, prisms and other passive optical elements and systems. Students will study the propagation of light in materials and dispersion and its effects. Additional topics will include an introduction to lasers and fiber optics, including an introduction to the propagation of light through fibers. Laboratory exercises complement class work.

PH-230 Matrix Optics and Aberrations

1 lecture hour 1 recitation/lab hour 1 credit
Prerequisite: PH-229

Topics in matrix optics applied to geometric (ray) optics including beam propagation, thin and thick lenses and lens systems. Introduction to aberrations in optical systems, how they are formed and controlled.
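The ray-transfer ("ABCD") matrix method named in this course description can be sketched in a few lines. Below is a minimal illustrative example in Python (not course material): a collimated ray passing through a thin lens converges to the axis one focal length later.

```python
import numpy as np

def free_space(d):
    """Ray-transfer (ABCD) matrix for propagation over a distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """ABCD matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A ray is a column vector [height (m), angle (rad)].
# Here: a collimated ray (angle 0) entering 0.1 m above the axis.
ray_in = np.array([0.1, 0.0])

# Thin lens (f = 0.5 m) followed by 0.5 m of free space.
# Matrices compose right-to-left: the lens acts first.
system = free_space(0.5) @ thin_lens(0.5)
ray_out = system @ ray_in

print(ray_out)  # height ~ 0: the parallel ray crosses the axis at the focal point
```

The design choice that makes this method powerful is exactly what the matrix product shows: any sequence of lenses, mirrors and gaps collapses into a single 2x2 system matrix.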

PH-231 Fundamentals of Lasers and Fiber Optics

3 class hours 3 laboratory hours 4 credits
Corequisite: MA-114

Topics in optics related to lasers and optical fiber and devices for modulating and directing signals from such devices. Geometrical optics with emphasis on ray tracing. Matrix methods in optics. Lenses thick and thin, mirrors, prisms and other passive optical elements and systems. Propagation of light in materials. Dispersion and its effects. Special topics in geometric and wave optics. Laboratory complements class work.

PH-232 Laser and Electro-Optics Technology*

3 class hours 2 recitation hours 3 laboratory hours 5 credits
Prerequisite: PH-231

Wave optics, interference, coherence, polarization, birefringence, diffraction, gratings in two and three dimensions, power and energy measurements, basics of laser safety, ultra-fast pulse techniques, electro-optic and acousto-optic switches, optical materials, non-linear optics. Laboratory complements class work.

* Students registering in PH-232 are required to pay a special services charge of $10.00.

PH-233 Laser Electro-Optics Devices, Measurements and Applications

3 class hours 3 laboratory hours 4 credits
Prerequisite: PH-231

Laser as a device, principle of operation, cavity modes and their control (tuning elements, Q switching, mode-locking) and detection, laser design, types of lasers, includes discussion of laser types for medical, ranging and tracking, material processing, pollution monitoring, and optical memory applications, semiconductor laser. Laboratory complements class work.

PH-234 Fiber Optics Devices, Measurements and Applications*

3 class hours 3 laboratory hours 4 credits
Prerequisites: PH-231, or ET-220 and PH-202.

Propagation of light in optical fiber, including analysis of the behavior of different modes. Dispersion and distortion. Specialized light sources and their characterization. Fiber optic sensors. All-optical fiber amplifiers. Optical switches and logic gates. Optical isolators. Techniques for joining fibers. Instruments for characterizing fiber and fiber links. Optical communications systems and protocols. Wavelength division multiplexing. Medical applications including fiber optics-diagnostic and surgical. Optical data processing and optical memories. Laboratory complements class work.

* Students registering in PH-234 are required to pay a special services charge of $40.00.

PH-235 Laser/Electro-Optics Projects

2 class hours 3 laboratory hours 3 credits
Prerequisite: PH-231
Corequisite: ET-910 or permission of the Department

Construction and testing of a laser, optical or electro-optic device such as a helium-neon laser, optical power meter, or fiber optics communication link; oral presentations and computerized literature searches.

PH-236 Introduction to Computers in Electro-Optics

1 class hour 3 laboratory hours 2 credits
Prerequisite: PH-231
Corequisite: MA-128 or the equivalent.

Elements of a computer system and an introduction to computer languages. Scientific programming using BASIC/FORTRAN with applications in optics. Use of commercial optics programs. Digital techniques including number systems, logic gates, Karnaugh mapping, Boolean algebra, combinational logic design, sequential logic design.
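To give a flavor of the combinational-logic portion of this course, here is a minimal sketch of one classic circuit, the half adder (written in Python purely for illustration, rather than the BASIC/FORTRAN the description names):

```python
def half_adder(a, b):
    """Combinational logic: the sum bit is XOR of the inputs, the carry is AND."""
    return a ^ b, a & b  # (sum, carry)

# Print the full truth table
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"a={a} b={b} -> sum={s} carry={c}")
```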

PH-240 Computerized Physical Measurement Using Graphical Programming (2E)*

2 lecture hours 3 laboratory hours 3 credits
Prerequisites: Permission of the department, based on one laboratory course in science or technology; MA-114, or MA-119 and MA-121, or the equivalent; and ET-501, PH-303, or BU-500, or the equivalent.

Students will design applications with a graphical programming language such as LabVIEW™ and use the computer for measurement and automation. Topics include: theory of measurement, physical principles of transducers and their use in measurement, instrument control, data acquisition, virtual instrumentation, signal/data conditioning and analysis.

* Course qualifies as Pathways Common Core 2E–Scientific World STEM Variant.

What Is The Future Of Gravitational Wave Astronomy?

After turning on in September of 2015, the twin Laser Interferometer Gravitational-wave Observatories -- the LIGO detectors in Hanford, WA and Livingston, LA -- simultaneously detected not just one but two definitive black hole-black hole mergers during their first run, despite having reached only 30% of their design sensitivity. These two events, one the merger of a 36 and a 29 solar mass black hole on September 14, 2015, and the other the merger of a 14 and an 8 solar mass black hole on December 26, 2015, provided the first definitive, direct detections of gravitational waves. It's a remarkable fact, in and of itself, that it took a full century after their prediction for technology to catch up to the theory and actually catch them.
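The component masses quoted above fix each event's "chirp mass," the mass combination that governs how the signal's frequency sweeps upward during the inspiral. A minimal sketch using the standard formula M_chirp = (m1*m2)^(3/5) / (m1+m2)^(1/5), with the rounded masses from the text:

```python
def chirp_mass(m1, m2):
    """Chirp mass of a compact binary, in the same units as the inputs."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

# Component masses in solar masses, as quoted in the text
gw150914 = chirp_mass(36, 29)  # September 14, 2015 event
gw151226 = chirp_mass(14, 8)   # December 26, 2015 event

print(f"GW150914 chirp mass ~ {gw150914:.1f} solar masses")
print(f"GW151226 chirp mass ~ {gw151226:.1f} solar masses")
```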

The first gravitational wave event ever directly detected. Image credit: "Observation of Gravitational Waves from a Binary Black Hole Merger," B. P. Abbott et al. (LIGO Scientific Collaboration and Virgo Collaboration), Physical Review Letters 116, 061102 (2016).

But detecting these waves is just the beginning, as a new era in astronomy is now dawning. 101 years ago, Einstein put forth a new theory of gravitation: General Relativity. Instead of distant masses instantaneously attracting one another across the Universe, the presence of matter and energy deforms the fabric of spacetime. This entirely new picture of gravity brought with it a slew of unexpected consequences, including gravitational lensing, an expanding Universe, gravitational time dilation and -- perhaps most elusively -- the existence of a new type of radiation: gravitational waves. As masses move or accelerate relative to one another through space, the fabric of space itself ripples in response. These ripples travel through space at the speed of light, and when they pass through our detectors after a journey across the Universe, we can detect these disturbances as gravitational waves.

The spacetime in our local neighborhood, which can be ever so slightly perturbed by passing gravitational waves. Image credit: T. Pyle/Caltech/MIT/LIGO Lab.

The easiest things to detect are the things that emit the largest signals, which are:

  • large masses,
  • with small distances between them,
  • orbiting quickly,
  • where the orbital changes are severe and significant.

This means collapsed objects, like black holes and neutron stars, are the prime candidates. We also need to consider the frequency at which we can detect these objects, which will be roughly the speed of light divided by the effective path length of the detector (the arm length multiplied by the number of reflections).
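As a rough sanity check of that rule of thumb, here is the arithmetic with round LIGO-like numbers (illustrative values, not a design calculation):

```python
c = 299_792_458.0  # speed of light, m/s

def characteristic_frequency(arm_length_m, n_reflections):
    """Rough detectable frequency: speed of light over the effective path length."""
    path = arm_length_m * n_reflections
    return c / path

# 4 km arms and ~1000 reflections, as described in the text
f = characteristic_frequency(4_000, 1_000)
print(f"~{f:.0f} Hz, i.e. a period of ~{1e3 / f:.0f} ms")
```

The result lands at tens of hertz, i.e. millisecond periods, which is exactly the band where coalescing stellar-mass black holes and neutron stars emit in their final orbits.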

A simplified illustration of LIGO's laser interferometer system. Image credit: LIGO collaboration.

For LIGO, with its 4 km arms and roughly a thousand reflections of the light before the interference pattern is created, this means signals with periods in the millisecond range. This includes coalescing black holes and neutron stars in the final stages of a merger, along with exotic events like black holes or neutron stars that absorb a large chunk of matter and undergo a "quake" to become more spherical. A highly asymmetric supernova could create a gravitational wave as well; a core-collapse event is unlikely to make detectable gravitational waves, but perhaps nearby merging white dwarf stars could do it!

Image credit: Bohn et al. 2015, SXS team, of two merging black holes and how they alter the appearance of the background spacetime in General Relativity.

We've seen black hole-black hole mergers already, and as LIGO continues to improve, we can reasonably expect to make the first population estimates of stellar-mass black holes (from a few to maybe 100 solar masses) over the next few years. LIGO also highly anticipates finding neutron star-neutron star mergers when it reaches its design sensitivity; it may see up to three or four of these events each month, if our estimates of their merger rates and LIGO's sensitivity are correct. This could teach us the origin of short-duration gamma-ray bursts, which are suspected to be merging neutron stars, though this has never been confirmed.

Illustration of a starquake occurring on the surface of a neutron star, one cause of a pulsar "glitch." Image credit: NASA.

Asymmetric supernovae and exotic neutron star quakes are fun, if perhaps rare, phenomena, and it's exciting to have a shot at studying these in a new way. But the biggest near-term advances will come when more detectors are built. When the Virgo detector in Italy comes online, it will finally be possible to do true position triangulation: to locate exactly where in space these gravitational wave events originate, making follow-up optical measurements possible for the first time. With additional gravitational wave interferometers scheduled to be built in Japan and India, our coverage of the gravitational wave sky is slated to improve rapidly in the next few years.

Artist’s impression of eLISA. Image credit: AEI/MM/exozet.

But the biggest advances will come from taking our gravitational wave ambitions into space. In space, you're not limited by seismic noise, rumbling trucks or plate tectonics; you have the quiet vacuum of space as your backdrop. You're not limited by the curvature of the Earth for how long you can build your gravitational wave observatory's arms; you can put it in orbit behind the Earth, or even in orbit around the Sun! Instead of milliseconds, we can measure objects with periods of seconds, days, weeks or even longer. We'll be able to detect the gravitational waves from supermassive black holes, including from some of the largest known objects in the entire Universe.
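The same path-length rule of thumb shows why a space-borne detector probes much slower signals. A sketch with an assumed LISA-like arm of about 2.5 million km (that figure is an illustrative assumption, not from the text):

```python
c = 299_792_458.0  # speed of light, m/s

# A single-pass space-borne arm has no mirror-folded path, so the
# characteristic frequency is roughly c over the arm length itself.
arm_length_m = 2.5e9  # assumed LISA-like arm length, ~2.5 million km

f = c / arm_length_m
print(f"~{f:.2f} Hz, i.e. a period of ~{1 / f:.0f} seconds")
```

A band of roughly a tenth of a hertz, i.e. periods of seconds and longer, is exactly where supermassive black hole binaries emit, which is why they are out of reach for ground-based arms.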

Image credits: Ramon Naves of Observatorio Montcabrer (main); Tuorla Observatory / University of Turku (inset).

And finally, if we build a large enough, sensitive enough space observatory, we could see the leftover gravitational waves from before the Big Bang itself. We could directly detect the gravitational perturbations from cosmic inflation, and not only confirm our cosmic origin in a whole new way, but simultaneously prove that gravitation itself is a quantum force in nature. After all, these inflationary gravitational waves can't be generated unless gravitation itself is a quantum field. The success of LISA Pathfinder more than proves this is possible; all it takes is the right investment.

Illustration of the density (scalar) and gravitational wave (tensor) fluctuations arising from the . [+] end of inflation. Image credit: National Science Foundation (NASA, JPL, Keck Foundation, Moore Foundation, related) – Funded BICEP2 Program.

There's currently a hotly contested race as to what will be chosen as the flagship NASA mission of the 2030s. Although many groups are proposing good missions, the biggest dream is a space-based gravitational wave observatory in orbit around the Sun. A series of these could make our wildest gravitational wave dreams come true. We have the technology; we've proved the concept; we know the waves are there. The future of gravitational wave astronomy is limited only by what the Universe itself gives us, and how much we choose to invest in it. But this new era has already dawned. The only question is how bright this new field in astronomy is going to be. And that part of it is completely up to us.