Three Case Studies of Large-Scale Data Flows
Abstract
We survey three examples of large-scale scientific workflows that we are working with at Cornell: the Arecibo sky survey, the CLEO high-energy particle physics experiment, and the Web Lab project for enabling social science studies of the Internet. All three projects face the same general challenges: massive amounts of raw data, expensive processing steps, and the requirement to make raw data or data products available to users nation- or world-wide. However, several differences prevent a one-size-fits-all approach to handling their data flows. Instead, current implementations are heavily tuned by domain and data management experts. We describe the three projects, and we outline research issues and opportunities for integrating Grid technology into these workflows.