Abstract:
CMS expects to manage many tens of petabytes of data distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process, and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience gained in using the system for Monte Carlo production, initial detector commissioning activities, and data analysis will be summarized.
  • CMS
  • data management
  • programming
  • performance
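
A minimal sketch of the replica-placement bookkeeping implied by distributing datasets over computing centers, as described in the abstract. All names here (Site, Dataset, place_replica, the sample dataset path) are invented for illustration, and the greedy most-free-space policy is an assumed toy heuristic; this is not the actual CMS data management implementation.

    # Hypothetical illustration of dataset placement across computing centers.
    # Names and the placement policy are invented; real systems also account
    # for network topology, site availability, and subscription priorities.
    from dataclasses import dataclass, field

    @dataclass
    class Site:
        name: str
        capacity_tb: float   # total storage pledged at the site, in TB
        used_tb: float = 0.0

        def free_tb(self) -> float:
            return self.capacity_tb - self.used_tb

    @dataclass
    class Dataset:
        name: str
        size_tb: float
        replicas: list = field(default_factory=list)  # site names holding a copy

    def place_replica(dataset: Dataset, sites: list) -> Site:
        """Greedy toy policy: copy the dataset to the site with the most free space."""
        candidates = [s for s in sites
                      if s.free_tb() >= dataset.size_tb
                      and s.name not in dataset.replicas]
        if not candidates:
            raise RuntimeError(f"no site can host {dataset.name}")
        target = max(candidates, key=Site.free_tb)
        target.used_tb += dataset.size_tb
        dataset.replicas.append(target.name)
        return target

    sites = [Site("T1_Example_A", capacity_tb=500.0),
             Site("T2_Example_B", capacity_tb=200.0)]
    ds = Dataset("/ExampleEvents/Run2008/RECO", size_tb=30.0)
    print(place_replica(ds, sites).name)  # -> T1_Example_A

A real placement decision would be driven by subscriptions and transfer-queue state rather than free space alone; the sketch only shows the kind of replica bookkeeping such a system must maintain.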