Original author: 
Michael D. Lemonick

Amateur astronomers call it the Penguin, and no wonder. Even through a good-size backyard telescope, that’s exactly what seems to be out there, hanging in distant space 326 million light years from Earth. With the clear-eyed vision of the Hubble Space Telescope, the resemblance is even more striking: it’s as though some cosmic artist has captured a bright-eyed, sharp-beaked bird leaning protectively over a reddish egg, with two stars — one shooting — in the skies above.

Both bird and egg are fully certified galaxies, though, lying in the constellation Hydra. The bird is a spiral galaxy, officially known as NGC 2936, and it would normally look like the Milky Way — a great, majestically spinning pinwheel made up of hundreds of billions of stars.

But the egg has changed all that. It’s a blob-shaped elliptical galaxy, NGC 2937, and its gravity has pulled and elongated the spiral, stretching one side into a sharp, beak-shaped projection and smearing the other side into the penguin’s body. (The reddish streaks are clouds of interstellar dust that formerly permeated the galaxy’s spiral arms). The two bright spots hovering above the penguin’s head are plain old stars within the Milky Way that just happen to lie in the same direction as the two galaxies — and the streak that seems to be flying away from the right-hand star is yet another galaxy, far in the background.

(MORE: Meet the Itsy-Bitsy, Teeny-Weeny Galaxy)

Back in the 1960s, astronomer Halton Arp included this weird configuration in his Atlas of Peculiar Galaxies, but as telescopes have gotten more powerful, scientists now know that such distorted shapes are usually caused when two or more galaxies venture too close to each other, “exchanging matter and causing havoc” as a press release puts it.

The explanation is prosaic, but the image, taken recently in both infrared and visible light by Hubble’s Wide Field Camera 3, is anything but. It’s just one more in a long list of space objects that look at least passingly biological — the Horsehead Nebula, the Crab Nebula, the Cat’s Eye Nebula, Jupiter’s moon Europa (which looks something like a bloodshot eyeball), and the infamous Face on Mars are just a few examples. Even the structure of the universe itself resembles the structure of the human brain, according to some scientists.

It’s no surprise, though: humans are hard-wired to see patterns in nature. That’s why we see all manner of creatures, not just in the heavens, but also in clouds. It’s a consequence of evolution — but it also transforms the world around us into a sort of living poetry.

(PHOTOS: The Solar-Powered Plane Soars Across America)

Michael D. Lemonick is a regular contributor to TIME, writing on science, space and technology. 

Original author: 
WSJ Staff

NASA released a false-color image of one of the first close-up views of a massive hurricane churning above Saturn's north pole today.

Original author: 
Arik Hesseldahl

Here’s a name I haven’t heard in a while: Anso Labs.

This was the cloud computing startup that originated at NASA, where the original ideas for OpenStack, the open source cloud computing platform, were born. Anso Labs was acquired by Rackspace a little more than two years ago.

It was a small team. But now a lot of the people who ran Anso Labs are back with a new outfit, still devoted to cloud computing, and still devoted to OpenStack. It’s called Nebula. And it builds a turnkey computer that will turn an ordinary rack of servers into a cloud-ready system, running — you guessed it — OpenStack.

Based in Mountain View, Calif., Nebula claims to have an answer for any company that has ever wanted to build its own private cloud system and not rely on outside vendors like Amazon or Hewlett-Packard or Rackspace to run it for them.

It’s called the Nebula One. And the setup is pretty simple, said Nebula CEO and founder Chris Kemp: Plug the servers into the Nebula One, then you “turn it on and it boots up cloud.” All of the provisioning and management that a service provider would normally charge you for has been built into a hardware device. There are no services to buy, no consultants to pay to set it up. “Turn on the power switch, and an hour later you have a petascale cloud running on your premise,” Kemp told me.

The Nebula One sits at the top of a rack of servers; on its back are 48 Ethernet ports. It runs an operating system called Cosmos that grabs all the memory and storage and CPU capacity from every server in the rack and makes them part of the cloud. It doesn’t matter who made them — Dell, Hewlett-Packard or IBM.
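
For a sense of what that looks like to a user once the cloud is up, here is a rough, hypothetical sketch of querying any running OpenStack deployment with the openstacksdk Python library. This is generic OpenStack usage, not Nebula’s own tooling; the cloud name is a placeholder that would come from a local clouds.yaml file.

    # Hypothetical sketch: generic OpenStack usage via openstacksdk,
    # not Nebula-specific tooling. The cloud name is a placeholder
    # defined in a local clouds.yaml file.
    import openstack

    # Read credentials for the named cloud from clouds.yaml.
    conn = openstack.connect(cloud="my-private-cloud")

    # List the compute instances the cloud is currently running.
    for server in conn.compute.servers():
        print(server.name, server.status)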

Kemp named two customers: Genentech and Xerox’s research lab, PARC. There are more customer names coming, he says, and Nebula already boasts investments from Kleiner Perkins, Highland Capital and Comcast Ventures. Nebula is also the only startup company that is a platinum member of the OpenStack Foundation. Others include IBM, HP, Rackspace, Red Hat and AT&T.

If OpenStack becomes as easy to deploy as Kemp says it can be, a lot of companies — those that can afford to have their own data centers, anyway — are going to have their own clouds. And that is sort of the point.


MarkWhittington writes "NASA engineers at the Marshall Space Flight Center in Huntsville, Ala., are building a mockup of what appears to be a deep space habitat, though it could also be part of an interplanetary spacecraft. The purpose is to do human factors studies to find out how to sustain astronauts on lengthy deep space missions."




The startling majesty – and deceptive complexity – of Michael Benson’s space art can be traced back through a process he dubs “true color.” A multimedia artist, Benson is a man utterly fascinated with outer space (he points to 2001: A Space Odyssey as an inspiration for his interstellar works — works that so impressed 2001 author Arthur C. Clarke that the sci-fi titan agreed to write the foreword to one of Benson’s books), and he has fixed his talents on creating visions that break free of the confines of Earth, enabling viewers to behold the unseen wonders of the universe.

To encounter a Benson landscape is to be in awe of not only how he sees the universe, but also the ways in which he composes the never-ending celestial ballet. From the spidery volcanic fractures that scar the surface of Venus to the time-lapse flight path of a stray asteroid, the dizzying close-ups of the swirling “red spot” of Jupiter, the x-ray-filtered view of the sun’s surface and the rippling red dunes of Mars, Benson is a visual stylist with a gift for framing and focus. Apart from cutting-edge high-definition renderings of our solar system’s most familiar objects, he also routinely converts extra-terrestrial terrain into thrilling, abstract landscapes that seem positioned somewhere between the scientific and the avant-garde.

The cover of Planetfall: New Solar System Visions

Some of his greatest achievements skew towards the hyperrealistic; I have been following Benson’s work for years and still the image I remember most is a massive, intricately detailed view of the surface of Io, one of Jupiter’s moons (slide 13 in the gallery above). The image looms large in a print that renders Io’s surface in a yellow-brownish hue, and by delineating the moon’s different terrains, Benson’s color scheme accentuates the dark volcanic calderas that dot the satellite’s surface. The final result is sharp, meticulous and magnificent. At first glance it’s a simple planetary object, but the closer your eye scans the terrain, the more you realize that Benson has somehow taken this imagery captured 400 million miles away and given us a front-row seat to consider the turbulent topography of this alien orb. Benson’s visions demand more than a single look; the longer one spends with his vast landscapes, considering the scale and scope, the more they facilitate a state of meditation.

Behind every one of these images, however, lies an intricate and involved photo editing process (watch the video of Benson’s method above). Benson typically begins each work by filtering through hundreds or thousands of raw images from space, made available to the public by NASA and the European Space Agency – photographs that have been taken by unmanned space probes flying throughout the solar system, rovers on Mars or humans aboard the International Space Station. Many of these photos come back to Earth as black and white composites, or as images created with only a few active color filters. Benson then sorts through the images in a hunt for something surprising, revealing or noteworthy. Once he’s found a subject of interest, he starts stitching together individual snapshots to create larger landscapes, and filtering these landscapes through his own color corrections to create a spectrum that approximates how these interstellar vistas would appear to the human eye.

In his latest published photo collection Planetfall: New Solar System Visions, now available from Abrams, Benson details the fine points of his processing techniques:

“The process of creating full-color images from black-and-white raw frames—and mosaic composites in which many such images are stitched together—can be quite complicated,” Benson writes. “In order for a full-color image to be created, the spacecraft needs to have taken at minimum two, but preferably three, individual photographs of a given subject, with each exposed through a different filter… ideally, those filters are red, green, and blue, in which case a composite color image can usually be created without too much trouble. But in practice, such spacecraft as the Cassini Orbiter or the Mars Exploration Rovers … have many different filters, which they use to record wavelengths of light well outside of the relatively narrow red, green and blue (RGB) zone of the electromagnetic spectrum that human eyes can see.”

Benson goes on to explain that he will often start working with images that are missing an essential filter — that ultraviolet and infrared filters have been used instead of color filters, meaning the composite image is lacking necessary information.
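
For readers curious about the mechanics, here is a minimal, hypothetical sketch of the basic idea behind such composites: stack three filtered black-and-white exposures as the red, green and blue channels of one image. The file names are placeholders, and Benson’s real workflow involves far more registration, calibration and color correction than this.

    # Minimal sketch: build a color composite from three filtered
    # black-and-white exposures. File names are placeholders; real
    # probe data needs registration, calibration and color balancing.
    import numpy as np
    from PIL import Image

    def load_frame(path):
        """Load one filtered exposure as a float array scaled to [0, 1]."""
        return np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0

    red = load_frame("filter_red.png")      # hypothetical red-filter frame
    green = load_frame("filter_green.png")  # hypothetical green-filter frame
    blue = load_frame("filter_blue.png")    # hypothetical blue-filter frame

    # If one filter is missing (say green), a crude stand-in is to average
    # the other two; actual reconstruction is far more involved.
    # green = (red + blue) / 2.0

    # Stack the channels and rescale to an 8-bit RGB image.
    rgb = (np.clip(np.dstack([red, green, blue]), 0.0, 1.0) * 255).astype(np.uint8)
    Image.fromarray(rgb).save("composite_rgb.png")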

It is here that Benson has carved out an area of expertise, filling in that missing image information to add shape, scale and color to the planetary bodies he hopes to explore. The resulting visuals, as you can see above, are pristine and powerful glimpses of the furthest reaches of our solar system (and, in some of Benson’s other works, the very edges of the universe). With the landing of the Curiosity rover on Mars in August, and its subsequent photographs of what appear to be Martian riverbeds, the world was once again reminded of the power of a single image transmitted back to Earth across millions of miles of open space. It’s a dizzying thing, to behold an alien world, and scanning through the portfolio of Michael Benson — a true “space odyssey” — is to experience this rush of discovery again and again.

Michael Benson’s new book Planetfall: New Solar System Visions, is now available from Abrams. Also featured above are images from Beyond: Visions of the Interplanetary Probes (Abrams, 2008). Images from Planetfall will be on display at New York’s Hasted Kraeutler Gallery in December 2012. To see more of Benson’s work, visit his web site.

Steven James Snyder is an Assistant Managing Editor at TIME.com.


Since astronauts must learn to adjust to dark, isolated and confined spaces, with scant equipment, privacy and supplies, where better to practice for the deprivations of life in space than deep underground? Navigating safely in a cave requires tethering and 3D orientation, with no-touch areas (stalactites, stalagmites) and treacherous no-go zones, so it can provide many of the same technical challenges as a spacewalk.

Image Courtesy NASA/JPL-Caltech


Taking pictures on another world has never been just point and click. For decades, unmanned probes from Earth have been venturing to distant planets, moons and other bodies—and for just as many decades, the images they have sent home have been composed and transmitted in a decidedly painstaking way. That is especially so in the case of the 360-degree panorama NASA is now releasing of the Curiosity rover’s landing site in Mars’s Gale Crater.

Even on Earth, you have to be selective when you photograph a landscape. After all, no matter how glorious your picture of one part of the Grand Canyon is, it by definition leaves out countless other, equally glorious parts. The only way to capture the whole sweep of the place is to take many small images and, bit by bit, piece them all together. That’s hard enough when the camera is in your hand. Now imagine doing it when all of your hardware is 154 million miles away and the data has to be streamed back to you in a comparative trickle that, even moving at light speed, takes 17 minutes to get here.

(See more: Inside Look at the Mars Curiosity Rover)

But NASA did just that to produce its full pirouette picture of the Marscape that surrounds Curiosity. The panorama was built from 30 smaller images shot by the rover’s Navcams—or navigation cameras—on Aug. 7 and Aug. 18. Each picture has a resolution of 1,024 pixels by 1,024 pixels, and all of them have been combined in such a way that the seams connecting them disappear. The lighter-colored strip at the top right of the image is the rim of Gale Crater—chosen as the landing site because it was once a deep sea. Also visible is the peak of nearby Mount Sharp, which rises 3.4 mi. (5.5 km) into the rust-red sky. The portions of the picture in the Martian sky that appear gray are parts of the mosaic that have not yet been added, but will be the next time NASA updates the image.
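
As a rough illustration of the stitching step alone (and not NASA’s actual pipeline, which also relies on spacecraft pointing data, photometric correction and map projection), a set of overlapping frames can be merged with OpenCV’s high-level stitcher; the file names below are placeholders.

    # Rough sketch: merge overlapping frames into one mosaic with
    # OpenCV's high-level stitcher. File names are placeholders.
    import glob
    import cv2

    frames = [cv2.imread(path) for path in sorted(glob.glob("navcam_*.png"))]

    stitcher = cv2.Stitcher_create()   # OpenCV 4.x factory function
    status, panorama = stitcher.stitch(frames)

    if status == 0:                    # 0 corresponds to Stitcher::OK
        cv2.imwrite("panorama.png", panorama)
    else:
        print("Stitching failed with status code", status)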

As their name implies, the Navcams are used mostly for reconnaissance purposes—scouting out where the rover will drive and mapping the best route to get there. They were thus not designed with beauty in mind—and that means they shoot only in black and white. The cameras mounted atop Curiosity’s mast capture the full range of desert-like colors that define the brutally beautiful Gale Crater environment. The entire suite of on-board cameras will have a lot of work to do in the two years ahead—and every picture they take will be one worth saving. Once the rover starts rolling, after all, it will never be staying in any one place for long.

(Related: Window on Infinity: Pictures from Space)
