This picture is a still shot from a movie, and the little parade of galaxies marching diagonally across it is a section of one filament in a vast network of galaxies. Before I get to the point, let us pause a moment and reflect: these are fucking galaxies. They’re all Milky Ways of 10 billion solar systems each, maybe more. And they’re real, this is real data. I breathe deeply and move on. I wanted you to see this still picture before you see the movie.
The movie is a simulation — that is, the point of view is only God’s, nobody anywhere sees this. It’s God flying through the universe, swooping through the lace and foam of galaxies, slipping past black voids, swerving in to examine the knots in the filaments. But the data is from the Sloan Digital Sky Survey and every galaxy you see is really where you see it. The simulators fudged a little on the galaxies themselves: as Miguel Angel Aragon Calvo, one of the simulators, says, you can’t just load half a million galaxies into memory and if you did, the rendering would be slow, “so I do a trick.” They took 300 real images of real galaxies, made templates of them, and fitted the rest of the Sloan galaxies to the template galaxies with the closest properties.
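The post doesn't spell out Aragon Calvo's actual pipeline, but the trick he describes — assigning each catalog galaxy the template galaxy with the closest properties — is essentially nearest-neighbor matching. Here's a minimal sketch in Python; the template names and the property values (a color index and an apparent size) are made up for illustration:

```python
import math

# Hypothetical templates: (color index, apparent size) for each template image.
# In the real pipeline there were ~300 of these, built from real galaxy images.
templates = {
    "spiral":     (0.4, 1.2),
    "elliptical": (0.9, 0.8),
    "irregular":  (0.2, 0.5),
}

def closest_template(galaxy, templates):
    """Return the name of the template whose properties are nearest
    (Euclidean distance) to this galaxy's properties."""
    return min(templates, key=lambda name: math.dist(galaxy, templates[name]))

# Render every catalog galaxy using its best-matching template image,
# instead of loading half a million individual images into memory.
catalog = [(0.38, 1.1), (0.85, 0.9), (0.25, 0.45)]
assignments = [closest_template(g, templates) for g in catalog]
print(assignments)  # ['spiral', 'elliptical', 'irregular']
```

The payoff is memory: you keep a few hundred images resident instead of half a million, at the cost of every galaxy being a stand-in rather than itself.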
Cosmologists have used simulations for decades now to check their theories: the universe happened only once and observations are limited to the latest technology, so put your theory into a computer and see whether the universe that comes out looks like the one we have. But in the last few years, observers’ surveys like the Sloan have been turning out more data, faster, and the problem now is to make sense of it. One way is called data visualization, a catch-all practice that ranges from fancy graphs to the kind of simulation in the movie above. Miguel Angel Aragon Calvo (I just like writing out his full name), research scientist at Johns Hopkins, and Mark Subbarao, astronomer at the Adler Planetarium and the University of Chicago, both out of Alex Szalay’s shop, have done visualizations before; this current one took about a year.
Data visualization isn’t astronomical science the way numbers and graphs are, and until astronomers started generating too many numbers, they didn’t take it seriously. But data visualizations clarify the difference between knowing something and actually seeing it. Subbarao says the visualization gives him a new sense of the three-dimensional cartography — the structure — of the universe. Miguel Angel Aragon Calvo says he’s heard from astronomers that until they saw the visualization, they hadn’t “seen” the hierarchy implicit in the voids. Visualizations use the brain’s ability to identify complex patterns, he says; looking at them makes new patterns “click.”
In fact, mental visualization has always been one way astronomers get at this other-worldly subject: in a book of interviews, one astronomer after another says things like, I close my eyes until I can see what happens and go on to the equations later; I picture the big bang as something I’m inside of; I think of the universe’s structure as waves on a choppy pond; I imagine great big structures with flexible, stretchable girders. With actual physical visualizations, Miguel Angel Aragon Calvo thinks, “in the near future we will be able to interact with complex data in real time and using natural interfaces to identify ‘regions of interest’ in the data.” I’m not sure exactly what he means by that — maybe that visualizations can mediate between observations and theory?
Here’s what “clicks” for me: in two dimensions, the Sloan survey covers 14,555 square degrees of the sky; the whole sky is about 40,000 square degrees, so the Sloan covers maybe a third of it. In the third dimension, time, the Sloan goes back to a median redshift of 0.52, about 3 billion years back in time, in a universe that’s 13.6 billion years old, so maybe a quarter of the way back. In this quarter/third of the observable universe, the Sloan catalogs 932,891,133 objects. These numbers have important caveats, so don’t necessarily believe them. My point is, to see how the numbers feel, go back to the movie and zoom around with God.
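For what it’s worth, those fractions check out arithmetically. A quick back-of-the-envelope sketch, using only the numbers quoted above (caveats and all):

```python
# Sky coverage: Sloan's footprint over the (roughly) 40,000 sq deg of full sky.
sky_fraction = 14_555 / 40_000

# Time coverage: ~3 billion years of lookback in a ~13.6-billion-year universe.
time_fraction = 3 / 13.6

print(f"sky coverage:  {sky_fraction:.2f}")   # ~0.36 -- "maybe a third"
print(f"lookback span: {time_fraction:.2f}")  # ~0.22 -- "maybe a quarter"
```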
Absolutely fantastic but un-nerving that I have identical dreams to this simulation.
Good grief, Rosie! You should hire yourself out to these guys — they took a year, you could do it overnight.
Very cool!
Made me think though, about some of the choices that God made with his flight path.
For instance, why did he go left at 23 secs instead of right, and then what would the Universe have looked like if he had chosen to go right instead? Why did he zig and not zag?
So we get one man’s viewpoint of God’s path and of the Universe, while the rest are left to interpretation. It’s a bit like religion, really. 🙂
You know someone’s excited when they drop a few F-bombs into a post 🙂 Absolutely justified, this is super cool!
Hang on. You mention the time dimension at the end, and the Sloane data is recording where each galaxy ‘was’, relative to us, when it emitted the light we now see, not where it is ‘now’. So it’s not a God’s eye view, is it?
Ignore me – pretty f’ing amazing little movie!
That’s your English Sloane, Tim. Ours is the American Sloan. I correct your spelling, you question my understanding of space/time and light. I think you win.
I suppose I was thinking Sloane Rangers, seemed appropriate …? Let’s just agree that Jim Gunn is God and call it quits, heh?
Whoa. How many light years did we just travel?
Tim: yes, we’ll agree on spelling and then on God.
AliB: Oh don’t ask me questions like that. I mean, a lot but not as many as we might have. These guys deal in lots of millions, sometimes billions.
Stunning!
OK, so this is an area I actually know something about – data visualization. Do a Google Images search for data visualization to see many striking images. And, of course, my favorite data geek, Hans Rosling – go to http://www.gapminder.org/videos and browse through the videos, my favorite being “200 Countries, 200 Years, 4 Minutes”.
But wait, there’s more! Check out data sonification, the representation of data through sound; the most familiar and simplest examples are the Geiger counter and sonar. It has also been used in medical devices, vehicle displays and controls, and geological applications. For physics examples of solar data and of the Higgs boson respectively, check out phys.org/news186418364.html (solar data) and news.discovery.com/space/listen-to-the-higgs-boson-120710.html. While I find this fascinating, and probably quite useful in things like back-up safety for your car (and Geiger counters), it is not clear to me how much incremental value it brings to data analysis.
That’s so interesting, Bruz. I hadn’t thought of sonification as a way of visualizing something, “visualizing” meaning “imagining,” oops, meaning “mentally picturing,” oops, can’t get away from the visual.
I should think “imagining” works, if you use the definition of that word “to form a mental representation”. Doesn’t have to be visual – any of the senses will do. I am waiting for the invention of data smellification, or maybe data tasting parties.