The document summarizes updates and enhancements to earthquake analysis processing in Hazus version 3.2. Key points include:
- Hazard factor processing was improved by using spatial SQL instead of iterative row processing for faster updates of soil, liquefaction, landslide, and water depth values.
- Probability and ShakeMap analysis were enhanced by directly updating values in spatial tables using stored procedures instead of iterative feature processing.
- Performance testing showed the new methods led to up to a 10x speed increase in processing times for study regions.
- Data importing was simplified by directly projecting spatial files instead of recreating geometry, allowing faster and simpler imports.
3. Goals of Hazus 3.2 release
- Migrate ArcObjects calls to Geoprocessing Tools
- Eliminate pGDBs still needed for Earthquake
- Leverage Spatial SQL capabilities
- Improve Processing Speed
- Simplify processing where possible
- Make internal processing more accurate
4. Hazard Factors Processing
Soil, Liquefaction, Landslide, WaterDepth
- New spatial tables created in the study region database: eqSrSoil, eqSrLQF, eqSrLND, and eqSrWaterDepth
- Values are updated when a default value is updated, when a user data map is loaded, or as the first step of analysis
- ArcObjects uses ICursor, an iterative row-by-row processing method
- Spatial SQL uses a set-based approach, updating all values in a single step
5. Hazard Factors Processing
Soil, Liquefaction, Landslide, WaterDepth
v3.1
- Create fields dynamically in pGDB (if not existing)
- Loop over all features and load the default value
- Loop over all features 1 by 1 and update values by performing spatial lookup
- Push updated values from pGDB to SQL via stored procedure
v3.2
- Use spatial SQL via stored procedure to update feature values in a set-based fashion
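The shift from an ICursor-style loop to one set-based UPDATE can be sketched in miniature. This is a hypothetical illustration only: SQLite stands in for SQL Server, the table and column names are made up, and a plain key join stands in for the spatial STIntersects lookup.

```python
import sqlite3

# Hypothetical schema: "features" needs a soil value, "soil_lookup" provides it.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE features (id INTEGER PRIMARY KEY, cell INTEGER, soil TEXT)")
cur.execute("CREATE TABLE soil_lookup (cell INTEGER PRIMARY KEY, soil TEXT)")
cur.executemany("INSERT INTO features (id, cell) VALUES (?, ?)",
                [(1, 10), (2, 10), (3, 20)])
cur.executemany("INSERT INTO soil_lookup VALUES (?, ?)", [(10, "D"), (20, "C")])

def update_row_by_row(cur):
    """v3.1 style: iterate features one at a time, one lookup plus one UPDATE each."""
    for fid, cell in cur.execute("SELECT id, cell FROM features").fetchall():
        soil = cur.execute("SELECT soil FROM soil_lookup WHERE cell = ?",
                           (cell,)).fetchone()[0]
        cur.execute("UPDATE features SET soil = ? WHERE id = ?", (soil, fid))

def update_set_based(cur):
    """v3.2 style: a single set-based UPDATE resolves every feature at once."""
    cur.execute("""
        UPDATE features
        SET soil = (SELECT soil FROM soil_lookup
                    WHERE soil_lookup.cell = features.cell)
    """)

update_set_based(cur)
print(cur.execute("SELECT id, soil FROM features ORDER BY id").fetchall())
# -> [(1, 'D'), (2, 'D'), (3, 'C')]
```

Both functions produce identical results; the set-based form simply lets the database engine do the join once instead of issuing one round trip per feature.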
7. Probability Analysis
v3.1
- Processed central USGS grids into USGS grids of just the study region at aggregation
- Loop over all pGDB tables and add fields (PGA, PGV, SA03, SA10)
- Loop over all features 1 by 1 and update values by performing spatial lookup
- Loop over all features and compute their soil amplification value in code
- Push updated values from pGDB to SQL via stored procedure
v3.2
- Stored procedure updates values directly using the central USGS grids
8. ShakeMap Analysis
v3.1
- Loop over all pGDB tables and add fields (PGA, PGV, SA03, SA10)
- Loop over all features 1 by 1 and update values by performing spatial lookup
- Push updated values from pGDB to SQL via stored procedure
v3.2
- Stored procedure updates values directly using spatial SQL against the new spatial tables
9. Performance Comparison

Study Region             | Hazus 3.1                 | Hazus 3.2                 | Performance increase
Charleston, SC           | 1 hr 43 min               | 10 min (Win10, 32 GB)     | 10.3x faster
Greater Los Angeles area | 1 hr 58 min               | 58 min (Win10, 32 GB)     | 2.03x faster
Kenai, Alaska            | Could not import 50K AEBM | 1 hr 2 min (Win10, 32 GB) | n/a

Aggregation times reduced to less than 5 minutes*
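As a quick sanity check of the speedup factors quoted above, the arithmetic on the wall-clock times works out as follows:

```python
# Speedup = old time / new time, in minutes.
charleston = (1 * 60 + 43) / 10   # 1 hr 43 min -> 10 min
la = (1 * 60 + 58) / 58           # 1 hr 58 min -> 58 min
print(round(charleston, 1), round(la, 2))  # -> 10.3 2.03
```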
10. Importing (UDF & AEBM)
v3.1
- Tabular data was supplied as input, with lat/long values as fields
- Geometry was created for each row as it was imported
- Temporary tables were created, then joined, and records were added to SQL through a convoluted process
- For some processes, 3.x users needed an elevated ESRI license level to complete
v3.2
- Views for each feature type were created as templates of the data structure
- pGDB feature classes are the input; their geometry is imported directly by projecting to Hazus WKID 4326
- Import is done in a single step via parameterized SQL
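The single-step parameterized import can be sketched as below. This is a hypothetical illustration: SQLite stands in for SQL Server, and the table and column names are invented for the example rather than taken from the actual Hazus schema.

```python
import sqlite3

# Illustrative target table for user-defined facilities; coordinates are
# assumed to already be projected to WGS 84 (WKID 4326) lon/lat.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE user_facilities
               (id INTEGER PRIMARY KEY, name TEXT, lon REAL, lat REAL)""")

records = [
    (1, "Fire Station 3", -80.01, 32.79),
    (2, "City Hall",      -79.93, 32.78),
]
# One parameterized INSERT covers the whole record set in a single step,
# replacing the temporary-table-and-join pipeline of v3.1.
con.executemany(
    "INSERT INTO user_facilities (id, name, lon, lat) VALUES (?, ?, ?, ?)",
    records,
)
print(con.execute("SELECT COUNT(*) FROM user_facilities").fetchone()[0])  # -> 2
```

Parameterized statements also avoid string-building per row, which is part of why the import is both simpler and faster.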
11. Impacts of Import Changes
- Spatial data is imported directly and not recreated
- Paves the way for importing line and shape data
- Process is greatly simplified for maintainability and future enhancement
- Speed is increased dramatically: approx. 49,000 records in 5 minutes
Troy Schmidt is with Factor Inc., which specializes in developing software that supports managing risk in both the private and public sectors.
<Importance of Hazus>
On the Hazus project team for nearly 3 years; worked on Hazus for 2 years on the Modernization efforts.
Worked on all three hazards in various capacities:
- Improve build steps
- Upgrade VB6 code to C#
- Remove pGDB dependency
- Lessen ArcObjects dependency
- Streamline code / holistic view
Released just last week.
These improvements are part of the final stage of the Modernization efforts.
Goals were predicated on migrating away from ArcObjects and on eliminating the study region pGDBs, which were still around only for Earthquake.
Over the course of development, additional goals were targeted.
Four variables get loaded for every feature, from default values and/or user-supplied data maps.
At 3.2, four new spatial tables exist. They are used in the new stored procedure to do set-based SQL updates.
Previously, Hazus used ArcObjects and cursors to loop over each feature one at a time and update values. Now it uses a set-based approach.
Before, every feature was processed twice and the data was duplicated.
Because the values are discrete, updates are still point based.
At 3.2, everything is performed on a set basis, once, in the stored procedure using spatial SQL.
The first thing to discuss in the analysis is how the processing of the spatial lookup has changed.
For example, a PGA displayed at the centroid would indicate 1.24, but it computes out to 1.022.
In 3.1 this was point based; in 3.2 it is shape based.
SQL performance was monitored; GetPrimaryKeyColumn is now 15x faster.
Can we improve things that are called a lot? Can we improve things that take a long time? If we can, analysis runs faster.
    UPDATE E
    SET E.PGA = D.AVERAGE
    FROM (SELECT A.SchoolID, AVG(C.PGA100) AS AVERAGE
          FROM [syHazus].[dbo].[USGS] C,
               [dbo].[eqSchool] A
          INNER JOIN [dbo].[hzSchool] B ON A.SchoolId = B.SchoolId
          WHERE C.Shape.STIntersects(B.Shape) = 1
          GROUP BY A.SchoolID) D
    INNER JOIN eqSchool E ON E.SchoolID = D.SchoolID
USGS grids are the National Seismic Hazard Map data.
Soil amplification is part of the new process, using special views and pivot tables to achieve the same answer, just set based.
Aggregation went from 20 minutes to 8 or less.
Very similar to Probabilistic, with the exception of the soil amplification math.
Each study region impacts a different area of Earthquake analysis and results
Charleston, SC, because of Census pipeline data (largest EQ in the Eastern US, 1886).
Greater Los Angeles is perhaps the most important Earthquake study region in the world
Kenai Alaska has 49,000 AEBM points
Aggregation for Greater LA went from 30 minutes down to 1 min 44 sec.
Limitation of the ESRI Basic license level: such users can only view Microsoft SQL data, so import functionality was broken and a custom solution had to be used.