Last week the Wall Street Journal ran a story on light pollution. The article is a good summary of what is known about light pollution and of recent developments. The observatory got some nice coverage, including this great quote:
"We convert that starlight into knowledge," says Dan McKenna, superintendent of the Palomar Observatory.

On the subject of light pollution, the Pauma tribe is poised to make a major expansion to its casino. Thankfully, they have been working with their neighbors, including the observatory. They will follow San Diego County's light-pollution ordinance and have agreed to other terms to lessen the impact on the area. Read about it in this story from the San Diego Union-Tribune.
Also, some of the software used on the Hale Telescope's adaptive optics system was recently named a co-winner of NASA's 2007 Software of the Year Award. The winning software was the Jet Propulsion Laboratory's Adaptive Modified Gerchberg-Saxton Phase Retrieval program, which analyzes data from a telescope's science camera to detect errors that limit its imaging performance.
Finally, the Palomar Transient Factory, set to debut this fall, got a mention in this article on data storage for NASA applications. The article mentions the Hale Telescope, but doesn't get things quite right. The Palomar Transient Factory will use the 48-inch Samuel Oschin Telescope to hunt for unknown or variable objects. Data will be analyzed in real time, with follow-up observations performed on the Palomar 60-inch telescope and others, including the Hale. A better description can be found here. From the article:
NASA's IPAC (Infrared Processing and Analysis Center) is setting up a 4-year Palomar Transient Factory (PTF) project which will capture night sky images from the Oschin 48-inch telescope at Palomar in an attempt to detect and follow supernova events by registering changes in their spectroscopic data over time. This will involve something like 30,000 images of up to 30 million astronomic objects per night (on clear nights), meaning around 40GB of data every 24 hours. The project team estimates there could be 42 billion images stored over the life of the project.
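Those figures are worth a quick back-of-envelope check. Here's a rough sketch in Python, using only the per-night rates from the quote and assuming every night of the 4-year project is clear (real observing loses nights to weather, so these are upper bounds). Notably, multiplying the nightly numbers out suggests the "42 billion" figure lines up with object detections rather than images, which would total only tens of millions:

```python
# Back-of-envelope check of the PTF numbers quoted above.
# Assumption (not from the article): 4 years * 365 = 1460 usable nights.
nights = 4 * 365

images_per_night = 30_000        # "30,000 images ... per night"
objects_per_night = 30_000_000   # "up to 30 million astronomic objects per night"
gb_per_night = 40                # "around 40GB of data every 24 hours"

total_images = images_per_night * nights          # total images over the project
total_objects = objects_per_night * nights        # total object detections
total_storage_tb = gb_per_night * nights / 1000   # total raw data volume in TB

print(f"images:     {total_images:,}")        # ~44 million
print(f"detections: {total_objects:,}")       # ~44 billion
print(f"storage:    {total_storage_tb} TB")
```

At roughly 44 billion, the detection count is close to the quoted 42 billion, while the image count comes out around 44 million and the raw data volume under 60 TB.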