This document summarizes a presentation on the Pacific Wave exchange and the Pacific Research Platform (PRP). It gives an overview of Pacific Wave, including its history and connectivity across the Pacific and the western US, then explains how the PRP will build on existing infrastructure projects to create a high-speed "big data freeway" for science across California universities. This will allow researchers to more easily share and analyze large datasets for projects in areas such as climate modeling, cancer genomics, astronomy, and particle physics. Details are provided on specific science applications and datasets that will benefit from the enhanced connectivity of the PRP.
3. Pacific Wave
• Began as the first geographically distributed exchange in 2004
• Pacific Wave is an open exchange supporting both
commercial and R&E peers
• Currently serves 29 countries peering across the Pacific
and Western United States
• With PNWGP and TransPac, announced the first
100Gbps Trans-Pacific link from Tokyo to Seattle in
2015
5. R&E Exchanges within R&E
• Pacific Wave (Western US)
– CENIC and PNWGP
• StarLight (Chicago, IL)
– StarLight Consortium/MREN
• MANLAN (New York, NY)
– NYSERnet
• WIX (Washington, DC)
– University of Maryland/MAX GigaPOP
• AmLight (Miami, Florida)
– Florida International University/Florida LambdaRail
6. National/Global Activities
• NSF provides support of the R&E exchange points
through the competitive IRNC (International Research
Network Connections) program with funding for
backbone, infrastructure and innovation
• The Global Lambda Integrated Facility
– The GLIF brings together some of the world's premier networking engineers, who are working together to develop an international infrastructure
8. Pacific Wave and NSF/IRNC
• Pacific Wave has been partially supported
through three separate five-year National
Science Foundation grants supporting growth,
connectivity and innovation
• Current award promotes 100G expansion and
implementation of SDX capabilities within
Pacific Wave (ACI-1451050)
9. SDX = SDN + IXP
[Diagram: routers for AS A, AS B, and AS C maintain BGP sessions through an SDN switch; an SDX controller programs the switch.]
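To make the "SDX = SDN + IXP" equation concrete, here is a minimal sketch, not Pacific Wave's actual implementation: all names (`RIB`, `compile_policy`, the policy tuples) are hypothetical. The idea is that an SDX lets a participant AS layer fine-grained forwarding policy on top of ordinary BGP reachability, with the controller compiling policy into match-action rules for the SDN switch.

```python
# Hypothetical SDX policy compiler: BGP-learned reachability plus
# per-participant policy produces SDN match-action flow rules.

# Prefixes announced at the exchange: prefix -> set of ASes announcing it
RIB = {
    "10.1.0.0/16": {"B", "C"},   # both AS B and AS C announce this prefix
    "10.2.0.0/16": {"C"},
}

def compile_policy(participant, policy, rib):
    """Translate one participant's outbound policy into flow rules.
    Each policy entry picks a next-hop AS for traffic matching
    (prefix, tcp_dst); a rule is installed only if BGP says that
    AS actually announces the prefix."""
    rules = []
    for prefix, dst_port, next_hop_as in policy:
        if next_hop_as in rib.get(prefix, set()):
            rules.append({
                "match": {"in_as": participant, "dst_prefix": prefix,
                          "tcp_dst": dst_port},
                "action": {"fwd_to_as": next_hop_as},
            })
    return rules

# AS A prefers AS B for HTTPS traffic to 10.1.0.0/16 and AS C otherwise.
policy_A = [
    ("10.1.0.0/16", 443, "B"),
    ("10.1.0.0/16", None, "C"),
    ("10.2.0.0/16", None, "B"),   # dropped: AS B does not announce this prefix
]

flow_table = compile_policy("A", policy_A, RIB)
for rule in flow_table:
    print(rule["match"]["dst_prefix"], "->", rule["action"]["fwd_to_as"])
```

The key property the sketch illustrates: policy can never forward traffic to an AS that has not announced the destination prefix, which is what keeps SDX policies consistent with BGP.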
10. Pacific Wave and WRN
• Pacific Wave and the Western Region Network provide a 100Gbps network spanning the Western United States, serving PNWGP, CENIC, FRGP, ABQGP, and UH.
• Pacific Wave and NSF IRNC awardee PIREN (Univ of
Hawaii) work together supporting AARNet links to
California and Washington and expansion of high-
speed service through the Pacific Islands Region
www.pnw-gigapop.net
11. Pacific Wave and Collaboration
• Pacific Wave and StarLight are working
together on the SDX implementation to
provide seamless cross-domain support for
participants of both exchanges.
• The exchanges have direct interconnectivity via two 100G links: Seattle-Chicago and Denver-Chicago
13. The Pacific Research Platform (PRP)
• NSF CC-NIE and similar projects represent significant investments in campus infrastructure, including SDN and Science DMZs (~130 projects)
• But scientists are still struggling with the complexity of using the network and with interoperability between different Science DMZ implementations
• PRP focuses on enabling the science communities to make effective use of the high-performance infrastructure that is available
• The idea was hatched in December 2014: take advantage of the existing infrastructure, including a perfSONAR grid for measurement, plus DTNs and a common software suite, to demonstrate a proof of concept for the PRP
• Demonstrated at the CENIC Spring meeting (March 2015)
14. CENIC/PRP Backbone Sets Stage for 2016 Wireless Expansion
of HPWREN into Orange and Riverside Counties
• CENIC/PRP Will Connect
UCSD and SDSU
– Data Redundancy
– Disaster Recovery
– High Availability
• CENIC Extension to UCI & UCR
– Data Replication Sites
UCR
UCI
UCSD
SDSU
Source: Frank Vernon,
Greg Hidley, UCSD
16. “Pacific Wave & Pacific Research Platform Update:
Big News for Big Data”
Invited Presentation with David Reese
Annual CENIC Conference 2016
UC Davis
March 21, 2016
Dr. Larry Smarr
Director, California Institute for Telecommunications and Information Technology
Harry E. Gruber Professor,
Dept. of Computer Science and Engineering
Jacobs School of Engineering, UCSD
http://lsmarr.calit2.net
17. For Big Data Science, One Needs Bandwidths Orders of Magnitude Higher
Than the Shared Internet
[Chart: bandwidth from my office in Calit2's Qualcomm Institute vs. bandwidth on the Pacific Research Platform: 500 times the bandwidth of the shared Internet!]
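To make the bandwidth gap concrete, a back-of-envelope transfer-time sketch; the 200 Mbps "shared Internet" rate is an illustrative assumption, not a figure from the talk, while the 100 Gbps figure matches the PRP-class links described earlier.

```python
# Back-of-envelope: hours to move a 1 TB dataset at a typical shared
# path rate (assumed ~200 Mbps effective) vs. a 100 Gbps PRP-class link.

def transfer_hours(size_bytes, rate_bits_per_sec):
    """Hours to move size_bytes at a sustained rate_bits_per_sec."""
    return (size_bytes * 8) / rate_bits_per_sec / 3600

TB = 10**12
shared = transfer_hours(1 * TB, 200e6)   # shared path: ~11 hours
prp = transfer_hours(1 * TB, 100e9)      # PRP 100 Gbps: ~80 seconds

print(f"shared Internet: {shared:.1f} h")
print(f"PRP 100 Gbps:    {prp * 60:.1f} min")
```

Note that 100 Gbps / 200 Mbps is exactly the factor of 500 quoted on the slide.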
18. How Prism@UCSD Transforms Big Data Microbiome Science:
Preparing for Knight/Smarr 1 Million Core-Hour Analysis
[Diagram: the Knight Lab connects at 10Gbps to Prism@UCSD (nodes with 12 cores/GPU, 128 GB RAM, 3.5 TB SSD, 48TB disk, 10Gbps NIC), alongside Gordon, Data Oasis (7.5PB, 200GB/s), the Knight 1024 cluster in the SDSC co-lo, CHERuB (100Gbps), and Emperor and other vis tools with a 64Mpixel data analysis wall; link speeds shown include 120Gbps, 40Gbps, and 1.3Tbps.]
19. Next Step: The Pacific Research Platform Creates
a Regional End-to-End Science-Driven “Big Data Freeway System”
NSF CC*DNI Grant
$5M 10/2015-10/2020
PI: Larry Smarr, UC San Diego Calit2
Co-PIs:
• Camille Crittenden, UC Berkeley CITRIS
• Tom DeFanti, UC San Diego Calit2
• Philip Papadopoulos, UC San Diego SDSC
• Frank Wuerthwein, UC San Diego Physics and SDSC
20. UCSD Campus Climate Researchers Need to Download Results from NCAR Remote Supercomputer Simulations to Make Regional Climate Change Forecasts
• Dan Cayan, USGS Water Resources Discipline, Scripps Institution of Oceanography, UC San Diego, with much support from Mary Tyree, Mike Dettinger, Guido Franco, and other colleagues
• Planning for climate change in California: substantial shifts on top of already high climate variability
• NCAR upgrading to a 10Gbps link from Wyoming and Boulder to CENIC/PRP
• Sponsors: California Energy Commission, NOAA RISA program, California DWR, DOE, NSF
21. Downscaling Supercomputer Climate Simulations to Provide High-Res Predictions for California Over the Next 50 Years
[Figure: two panels of average summer afternoon temperature]
Source: Hugo Hidalgo, Tapash Das, Mike Dettinger
22. Cancer Genomics Hub (UCSC) is Housed in SDSC: Large Data Flows to End Users at UCSC, UCB, UCSF, …
[Chart: traffic growth from 1G to 8G to 15G by Jan 2016]
• Cancer Genomics Hub users are downloading 30,000 TB per year
• GE's Industrial Internet generates 10,000 TB per day!
Data Source: David Haussler, Brad Smith, UCSC
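A quick arithmetic check puts the two rates quoted on the slide on the same footing:

```python
# CGHub users download 30,000 TB per year; GE's Industrial Internet
# generates 10,000 TB per day. Compare the two on an annual basis.

cghub_tb_per_year = 30_000
ge_tb_per_day = 10_000

ge_tb_per_year = ge_tb_per_day * 365
ratio = ge_tb_per_year / cghub_tb_per_year          # ~122x annually
days_to_match = cghub_tb_per_year / ge_tb_per_day   # 3 days

print(f"GE generates ~{ratio:.0f}x CGHub's annual volume")
print(f"GE produces a CGHub-year of data every {days_to_match:.0f} days")
```

In other words, GE's industrial sensors produce an entire CGHub-year of data every three days, underscoring why "big data" networking is not only a science problem.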
23. Two Automated Telescope Surveys Creating Huge Datasets Will Drive PRP
• Survey 1: 300 images per night at 100MB per raw image: 30GB per night raw, 120GB per night when processed at NERSC (increased by 4x)
• Survey 2: 250 images per night at 530MB per raw image: 150GB per night raw, 800GB per night when processed
• Precursors to LSST and NCSA
• PRP allows researchers to bring datasets from NERSC to their local clusters for in-depth science analysis
Source: Peter Nugent, Division Deputy for Scientific Engagement, LBL; Professor of Astronomy, UC Berkeley
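The nightly volumes follow from simple multiplication; a quick check of the first survey's numbers (the 4x processed-to-raw growth is the figure quoted on the slide):

```python
# Nightly data volume implied by the first survey's numbers:
# 300 raw images of ~100 MB each, with processed output ~4x raw.

MB, GB = 10**6, 10**9

def nightly_volume(images_per_night, bytes_per_image):
    """Raw bytes produced per night of observing."""
    return images_per_night * bytes_per_image

raw = nightly_volume(300, 100 * MB)   # 30 GB of raw frames per night
processed = 4 * raw                    # ~120 GB after processing at NERSC

print(raw // GB, "GB raw,", processed // GB, "GB processed per night")
```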
24. OSG Federates Clusters in 40/50 States: Creating a Scientific Compute and Storage "Cloud"
• The OSG facility depends on a range of common services, support activities, software, and operational principles that coordinate the production of scientific knowledge through the DHTC model
• In April 2012, the OSG project was extended until 2017; it is jointly funded by the Department of Energy and the National Science Foundation
Source: Miron Livny, Frank Wuerthwein, OSG
25. We are Experimenting with the PRP for Large Hadron Collider (ATLAS and CMS) Data Analysis Using the West Coast Open Science Grid on 10-100Gbps Optical Networks
• Crossed 100 million core-hours/month in Dec 2015
• Over 1 billion data transfers moved 200 petabytes in 2015
• Supported over 200 million jobs in 2015
Source: Miron Livny, Frank Wuerthwein, OSG
26. PRP Will Support the Computation and Data Analysis in the Search for Sources of Gravitational Radiation
• Connecting Caltech at 10Gb/s to the SDSC Comet PetaFLOP supercomputer, enabling LIGO computations to enter via the same PRP "job cache" as for the LHC
• 800,000 core-hours of LIGO data analysis
• 7 million core-hours computing waveforms
27. Forty Years of Computing Gravitational Waves From Colliding Black Holes
• 1977: L. Smarr and K. Eppley, gravitational radiation computed from an axisymmetric black hole collision (MegaFLOPS)
• 2016: LIGO Consortium, spiral black hole collision (PetaFLOPS)
29. PRP is NOT Just for Big Data Science and Engineering: Linking Cultural Heritage and Archaeology Datasets, Building on CENIC's Expansion to Libraries, Museums, and Cultural Sites
[Map: PRP sites including UCD, UCSF, Stanford, NASA AMES/NREN, UCSC, UCSB, Caltech, USC, UCLA, UCI, UCSD, SDSU, UCR, UCM, Berkeley, Los Nettos, ESnet/DoE Labs, UW/PNWGP (Seattle), and Internet2 (Seattle); asterisks mark institutions with active archaeology programs. Note: this diagram represents a subset of sites and connections.]
"In an ideal world – extremely high bandwidth to move large cultural heritage datasets around the PRP cloud for processing & viewing in CAVEs around PRP, with unlimited storage for permanent archiving." -Tom Levy, UCSD
30. Next Step: Global Research Platform
Building on CENIC/Pacific Wave and GLIF
[Map: current international GRP partners]
31. Learn More About PRP on Wednesday
PRP Science Driver PI Workshop Description
[Flyers will be available at the CENIC registration desk]
Date and Time: Wed., March 23 from 9 - 11 am
Location: Executive Meeting Room, Hyatt Place UC Davis
Description: Panel discussion on capacity and promise of the
Pacific Research Platform, a high-speed, expanded
cyberinfrastructure that will move data 1,000 times faster than
today's inter-campus shared Internet.