Fermilab Computing Division

CS Document 5903-v1

CHEP 2016 - "Big Data" in HEP: A comprehensive use case study

Document #: 5903-v1
Document type:
Submitted by: Oliver Gutsche
Updated by: Oliver Gutsche
Document Created: 30 Jan 2017, 11:02
Contents Revised: 30 Jan 2017, 11:02
Metadata Revised: 30 Jan 2017, 11:02
Viewable by:
  • Public document
Modifiable by:

Other Versions: 31 Jan 2017, 03:42
Abstract:
Experimental particle physics has been at the forefront of analyzing the world's largest datasets for decades, and the HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called "Big Data" technologies, have emerged in industry to support the analysis of petabyte- and exabyte-scale datasets. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at the analysis of very large datasets; they could potentially reduce the time-to-physics through increased interactivity.
Files in Document:
  • Oral presentation (161010 - CHEP2016 - Big Data in HEP.pdf, 5.7 MB)
  • Proceedings (170130 - CHEP 2016 Proceedings - CMS Big Data.pdf, 231.8 kB)
Topics: CHEP2016, big data
Associated with Events:
CHEP 2016 held from 10 Oct 2016 to 14 Oct 2016 in San Francisco
Publication Information: