Connectionists: Brain-like computing fanfare and big data fanfare

Balázs Kégl balazskegl at gmail.com
Sat Jan 25 04:57:39 EST 2014


> Forbes magazine estimated that finding the Higgs boson cost over $13BB, conservatively. The Higgs experiment was absolutely the opposite of a Big Data experiment; in fact, can you imagine the amount of money and time that would have been required if one had simply decided to collect all data at all possible energy levels? The Higgs experiment is all the more remarkable because it had the nearly unified support of the high energy physics community; not that there weren't and aren't skeptics, but still, it is remarkable that the large majority could agree on the undertaking and effort. The reason is, of course, that there was a theory that dealt with the particulars and the details, not generalities.

I agree with your argument that you need a model to decide what data to collect. At the same time, the LHC is probably also a good example of how, even with a model, you end up with huge data sets. The LHC generates petabytes of data per year, and this is after real-time filtering that discards most of the uninteresting collision events (a cut of roughly six orders of magnitude). Ironically (for this discussion), the analysis of these petabytes makes good use of ML technologies developed in the 90s: mostly boosted decision trees, though neural networks are also popular.
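
Since boosted decision trees come up here, the following is a minimal, self-contained sketch (Python with scikit-learn, on purely synthetic data) of the kind of signal-versus-background classification these 90s-era techniques are used for. It is an illustration of the technique only, not the actual LHC analysis or trigger code, and all feature names and numbers are made up.

    # Sketch: separating simulated "signal" events from "background"
    # events with boosted decision trees. Everything here is synthetic.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.RandomState(0)

    # Fake event-level features: background drawn from one Gaussian
    # blob, signal from a slightly shifted one (5 features per event).
    n_events = 10000
    background = rng.normal(loc=0.0, scale=1.0, size=(n_events, 5))
    signal = rng.normal(loc=0.5, scale=1.0, size=(n_events, 5))
    X = np.vstack([background, signal])
    y = np.concatenate([np.zeros(n_events), np.ones(n_events)])

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    # Boosted decision trees: many shallow trees combined by boosting.
    bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3,
                                     learning_rate=0.1)
    bdt.fit(X_train, y_train)
    print("test accuracy:", bdt.score(X_test, y_test))

In a real analysis the features would be reconstructed physics quantities (invariant masses, transverse momenta, and so on), and the classifier output would typically be turned into a selection cut rather than reported as an accuracy score.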

Balázs



—
Balazs Kegl
Research Scientist (DR2)
Linear Accelerator Laboratory
CNRS / University of Paris Sud
http://users.web.lal.in2p3.fr/kegl
