Makes IT Look Easy


by Tim Moran

In a tunnel 17 miles in circumference, some 500 feet beneath the Franco-Swiss border, lies the LHC--the Large Hadron Collider, "the world's most powerful particle accelerator. High-energy protons in two counter-rotating beams [are] smashed together in a search for signatures of supersymmetry, dark matter and the origins of mass. . . . The detectors could see up to 600 million collision events per second, with the experiments scouring the data for signs of extremely rare events, such as the creation of the much-sought Higgs boson."

And do you know what has to sift through all of these events? Software, of course.

A recent article on arstechnica.com, "The software brains behind the particle colliders," attempts to explain the role of software in this dizzying search for the origins of the universe. The article quotes Srini Rajagopalan, a Brookhaven National Lab employee working at CERN (the European Organization for Nuclear Research, the builder and overseer of the LHC), as saying that the detector's software includes "event filters."

"Basically, the software can determine the extent to which the particles and energy that come out of a collision match a pattern that we'd expect to be produced by a given particle. . . . Right now, the software already has 300 event filters, but it can apparently handle up to 8,000, and prioritize each of them--so, for example, we're likely to try to capture more potential Higgs events than top quarks." Naturally.

The breadth, depth, and scope of the data coming out of the LHC that must be processed are staggering. And so are the networking issues. The article goes on to say that Brookhaven is the primary U.S. interface for this data, and its main role "will simply be storing any data that makes it through the event filters as it arrives from CERN, and distributing it to various Tier 2 and 3 locations across the country (Brookhaven also houses a 10,000-core grid computing facility that will perform some analysis)."
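
As a rough mental model (and nothing more), the Tier 1 role amounts to fanning filtered datasets out to downstream sites. The site names and round-robin policy below are placeholders, not the real grid middleware.

```python
# Toy fan-out of filtered datasets from a Tier 1 center to Tier 2/3 sites.
# Site names and the round-robin policy are placeholders for illustration.
from collections import defaultdict
from typing import Dict, List

TIER_SITES = ["tier2_midwest", "tier2_southwest", "tier3_university"]  # hypothetical


def plan_distribution(datasets: List[str],
                      sites: List[str] = TIER_SITES) -> Dict[str, List[str]]:
    """Assign each dataset to a downstream site, round-robin."""
    plan: Dict[str, List[str]] = defaultdict(list)
    for i, dataset in enumerate(datasets):
        plan[sites[i % len(sites)]].append(dataset)
    return dict(plan)


print(plan_distribution(["run101_filtered", "run102_filtered", "run103_filtered"]))
```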

Ofer Rind, a physicist at Brookhaven, describes all this, in computer science terms, as an "embarrassingly parallel problem." Since the events are essentially independent of one another, they can all be analyzed separately. As a result, the high-energy physics community has a great deal of experience with grid computing. "We've been doing this for a while, and with a lot less money than the cloud folks," Rind said. Take that, IT boys and girls!
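
In code terms, "embarrassingly parallel" just means each event can be handed to any worker with no coordination. Here is a minimal sketch, assuming a made-up analyze() step and synthetic events rather than anything Brookhaven actually runs:

```python
# Minimal sketch of embarrassingly parallel event analysis: every event is
# independent, so workers need no shared state or communication.
from multiprocessing import Pool


def analyze(event):
    """Stand-in per-event analysis: flag events above an arbitrary energy cut."""
    return {"id": event["id"], "interesting": event["energy_gev"] > 100.0}


if __name__ == "__main__":
    events = [{"id": i, "energy_gev": 10.0 * i} for i in range(1000)]  # synthetic data
    with Pool() as pool:                       # more workers = more throughput
        results = pool.map(analyze, events)    # each event analyzed in isolation
    print(sum(r["interesting"] for r in results), "events kept")
```

The same property is what let the physics community spread this work across grid sites long before "cloud" became the buzzword.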

So, when you're having some trouble with building an enterprise architecture blueprint or CRM's got you down or the supply-chain links are rusty--buck up. You could be dealing with an environment such as that created by the LHC, in which the data produced will hit a total output of 15 petabytes per year (approx. 500 MB/s) and the end result of your work would be "the answer [to] many of the most fundamental questions in physics [and] the deep structure of space and time, especially regarding the intersection of quantum mechanics and general relativity, where current theories and knowledge are unclear or break down altogether." Really, ERP ain't so bad.
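
For the curious, that parenthetical rate is easy to sanity-check: a back-of-the-envelope conversion (using the decimal definition of a petabyte) gives roughly 475 MB/s, i.e. about 500 megabytes every second, all year long.

```python
# Back-of-the-envelope check: 15 PB/year as a sustained byte rate.
PETABYTE = 10**15                        # bytes, decimal convention
SECONDS_PER_YEAR = 365.25 * 24 * 3600

rate_mb_per_s = 15 * PETABYTE / SECONDS_PER_YEAR / 10**6
print(f"{rate_mb_per_s:.0f} MB/s")       # ~475 MB/s, call it 500
```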

2 Comments for "Makes IT Look Easy"

  • Mike Flaherty April 07, 2010 3:04 pm

    Hi Tim, Great points!! Grid was here before the "cloud", it was cheaper than some cloud solutions, and worked great! Colocation and managed hosting firms could take some nice lessons from your grid vs cloud story.
    Mike Flaherty, Online Tech, www.onlinetech.com

  • Mike Flaherty April 07, 2010 2:57 pm

    Hi Ed, Nice article. We don't have too many customers in our data centers looking for 15 petabytes per year! That would be a challenge! And you make a great point, the buzz word today is "cloud", and it's a bit messy in the colocation and managed hosting / storage world, everyone has their own version of the cloud. I remember some of the past SETI and genetics grid applications consumers could download...they worked fine, and cost much less than current cloud initiatives today.
    Mike Flaherty, Online Tech, www.onlinetech.com
