wgrib2
Arulalan T
Scientist
India Meteorological Department
Introduction
• conda install wgrib
• conda install wgrib2
• It's free and open-source scientific software for managing grib1 and grib2
(weather and climate data) files
• wgrib -v infile.grib1
• wgrib2 -v infile.grib2 (also try -V and -v2)
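Most of the tricks below work from the inventory that wgrib2 prints by default: one line per GRIB message, of the form record:byte-offset:date:variable:level:forecast. For illustration (the values below are made up):
$ wgrib2 infile.grib2
1:0:d=2023010512:TMP:500 mb:anl:
2:101234:d=2023010512:HGT:500 mb:anl: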
(1) How do I convert the entire file to binary?
$ wgrib2 grib_file -bin out.bin
(2) How do I get a verbose listing of the 20th message only?
$ wgrib2 -V -d 20 grib_file
If grib message 20 has submessages, then -d 20 will get the first field of
the grib message, i.e. the same as -d 20.1.
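A specific submessage can also be requested directly with the N.M notation, e.g. the second submessage of message 20 (assuming it exists):
$ wgrib2 -V -d 20.2 grib_file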
(3) How do I extract the 20th and 30th grib message?
$ wgrib2 grib_file -match "(^20:|^30:)" -bin binary_file
(4) How do I extract the first 10 records as a grib file?
$ wgrib2 grib_file -for_n 1:10 -grib new_grib_file
(5) How do I extract records 34-66 from a file as a grib file?
$ wgrib2 grib_file -for 34:66 -grib new_grib_file
(6) How do I extract grib messages 10, 12 and 19 from a grib file?
$ wgrib2 old_grib -match '^(10|12|19):' -grib new_grib
(7) How do I extract a range of dates from a file?
Example 1. Analysis file
$ wgrib2 z500.gdas.199704.f06
1:0:d=1997040100:HGT:500 mb:6 hour fcst:
2:228571:d=1997040106:HGT:500 mb:6 hour fcst: ...
120:27173104:d=1997043018:HGT:500 mb:6 hour fcst:
The file contains the 500 mb geopotential height for April 1997. To obtain
the data for a range of dates:
$ wgrib2 z500.gdas.199704.f06 | awk '{d=substr($3,3); if (d >=1997042400 &&
d <=1997042512) print $0}' FS=':' | wgrib2 z500.gdas.199704.f06 -i -grib
z500subset.grb
Example 2. Extract a range of forecast times
$ wgrib2 t250.03.2015080900.daily.grb2
1:0:d=2015080900:TMP:250 mb:6 hour fcst:
2:20236:d=2015080900:TMP:250 mb:12 hour fcst: ..
456:8689473:d=2015080900:TMP:250 mb:2736 hour fcst:
To extract a range of forecasts, you can put the verification time into the inventory
$ wgrib2 t250.03.2015080900.daily.grb2 -vt -s
1:0:vt=2015080906:d=2015080900:TMP:250 mb:6 hour fcst:
2:20236:vt=2015080912:d=2015080900:TMP:250 mb:12 hour fcst: ...
Modifying Example 1:
$ wgrib2 t250.03.2015080900.daily.grb2 -vt -s | awk '{d=substr($3,4); if (d
>= 2015081000 && d <= 2015081012) print $0}' FS=':' | wgrib2
t250.03.2015080900.daily.grb2 -i -grib subset.grb
4:60273:d=2015080900:TMP:250 mb:24 hour fcst:
5:80131:d=2015080900:TMP:250 mb:30 hour fcst:
6:99799:d=2015080900:TMP:250 mb:36 hour fcst:
(8) CMC files have some variables which don't have a name. wgrib2 doesn't
include the local extensions from CMC. AFAIK, the CMC grib tables follow the
NCEP grib tables except where they don't. (How does that help me?) So you
may be able to get most of the fields right by using the NCEP tables. This can
be done by changing the center flag to NCEP (7).
$wgrib2 -set center 7 infile.grib2 (changes the center in memory only; the listing shows NCEP names)
$wgrib2 -set center 7 infile.grib2 -grib_out outfile.grib2 (writes the change to outfile)
(9) Use the -append option to append to an existing grib2 file
$wgrib2 -set center 7 infile.grib2 -append -grib_out outfile.grib2
(10) I am generating a text dump (CSV) and the size of the file is really large. If
you set the grid point values to "undefined", they won't print out.
$wgrib2 grib_file -undefine out-box 30:40 -10:10 -csv out.txt
The grid points outside of the box are set to undefined. The -csv option tries to
print the entire grid but finds that only the values between 30N-40N and 10W-10E
are defined.
(11) I am generating a text dump (CSV) and the size of the file is really large, and I
am only interested in variable XYZ when it has values larger than 10.
$wgrib2 grib_file -match ':XYZ:' -undefine_val -9e20:10 -csv output.txt
(12) How do I extract a small region from a big domain in a grib2 file?
Usage: $wgrib2 infile -small_grib LonW:LonE LatS:LatN outfile
$wgrib2 fcst.grb2 -small_grib 10:20 -20:20 small_out.grb
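To confirm the new domain, the -grid option prints the grid definition of the subset (a quick check, not part of the original example):
$ wgrib2 small_out.grb -d 1 -grid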
(13) I tried using the -small_grib option but the files are bigger than the original.
By default, the -small_grib option writes the regional subset with simple packing.
This requires less CPU but produces larger files than jpeg or complex
packing.
To use the same packing as the original file, use the "-set_grib_type same"
option, or choose -set_grib_type jpeg, -set_grib_type complex1, -set_grib_type
complex2 or -set_grib_type complex3.
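For example, keeping the original packing while subsetting (a sketch combining the two options above):
$wgrib2 fcst.grb2 -set_grib_type same -small_grib 10:20 -20:20 small_out.grb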
(14) How do I extract a few variables and write them out into a new grib2 file?
$wgrib2 infile.grib2 -match "(VIS:surface:60 min fcst:|HGT:cloud ceiling:60 min fcst:)" -grib out.grib2
$wgrib2 infile.grib2 -match "(VIS:surface:60 min fcst:|HGT:cloud ceiling:60 min fcst:)" -grib_out out.grib2
$wgrib2 infile.grib2 -match "(VIS:surface:60 min fcst:|HGT:cloud ceiling:60 min fcst:)" -append -grib_out out.grib2
(-grib copies the selected messages as-is, while -grib_out decodes and re-encodes the data; -append adds to an existing output file.)
(15) I want to modify some existing grib files. For example, I want to make sure
the humidity (RH, SPFH) is positive.
$wgrib2 test.grb -rpn "0:max" -set_grib_type c2 -grib_out new.grb
-rpn "0:max" replaces each grid point value with max(0, grid_point_value)
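To clamp only the humidity fields and leave everything else untouched, the -rpn can be wrapped in an if block (a sketch; it assumes the fields appear as RH and SPFH in the inventory):
$wgrib2 test.grb -if ':(RH|SPFH):' -rpn '0:max' -fi -set_grib_type c2 -grib_out new.grb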
(16) My software won't handle the level "30-0 mb above ground". How do I
change the level to "15 mb above ground"?
Using an if structure:
$wgrib2 in.grb -if ":30-0 mb above ground" -set_lev "15 mb above ground" -endif -grib out.grb
(17) How do I combine multiple grib2 files?
$ cat infile1.g2 infile2.g2 infile3.g2 > outfile.grib2
(18) Masking data: I want to set all oceanic values to undefined. Suppose LAND is a
grib file with the land mask (land=1, water=0) on the same grid as the data file
(data.old). Make a new file with the land mask as the first record.
$ cat LAND data.old > data.tmp
$ wgrib2 data.tmp \
  -if '^1:' -rpn '1:==:sto_1' -fi \
  -not_if '^1:' -rpn 'rcl_1:mask' -set_grib_type same -set_scaling same same \
  -grib_out data.new
(19) My grib file uses the special value of -999.9 as the undefined value. (Grid
points with a value of -999.9 are considered to be undefined.) Using special
values as undefined values is outside of the grib standard and should be
avoided. To convert a file that uses -999.9 to the grib standard:
$wgrib2 IN.grb -undefine_val -999.9 -set_grib_type same -grib_out OUT.grb
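The -stats option gives a quick check that the -999.9 points are now counted as undefined (the exact listing format may vary):
$ wgrib2 OUT.grb -d 1 -stats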
(20) The -merge_fcst option will try to combine adjacent messages by
increasing the time period for forecast average, accumulation, minimum
and maximum. For example,
$ wgrib2 prate.201404.ts -for 1:4
1:0:d=2014040100:PRATE:surface:0-1 hour ave fcst:
2:13083:d=2014040100:PRATE:surface:1-2 hour ave fcst:
3:26587:d=2014040100:PRATE:surface:2-3 hour ave fcst:
4:40437:d=2014040100:PRATE:surface:3-4 hour ave fcst:
$ wgrib2 prate.201404.ts -for 1:4 -merge_fcst 4 all.grb (prints the inventory of the input while writing all.grb)
$ wgrib2 all.grb
1:0:d=2014040100:PRATE:surface:0-4 hour ave fcst:
In the same way, -merge_fcst automatically computes accumulations, minima and maxima
based on the metadata available in the input grib2 file.
(20, continued) Using -merge_fcst to make a daily accumulated rainfall from four 6-hourly accumulated
rain files:
$ cat fhr06 fhr12 fhr18 fhr24 > IN.grb
$ wgrib2 IN.grb | sort -t: -k4,4 -k5,5 -k6,6n | wgrib2 -i IN.grb -grib OUT1.grb
$ wgrib2 OUT1.grb -merge_fcst 4 OUT2.grb
or, restricted to the APCP records only:
$ wgrib2 OUT1.grb -if ":APCP:" -merge_fcst 4 OUT2.grb
(21) How do I change the date code in a grib2 file? E.g. change the analysis/initial time to
12Z on 5 Jan 2023:
$wgrib2 -match "^[0-9]*(:|.1)" IN.grb -set_date 2023010512 -GRIB OUT.grb
(22) How do I extract values at particular lat/lon points? Get the values at (30N, 90W) and (40N, 80W).
$wgrib2 IN.grb -lon -90 30 > location1.txt
$wgrib2 IN.grb -lon -80 40 > location2.txt
$wgrib2 IN.grb -lon -90 30 -lon -80 40 > locations.txt
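Each output line is the normal inventory line with the nearest grid point and its value appended, roughly like this (illustrative values):
1:0:d=2017041800:TMP:2 m above ground:anl:lon=270.000000,lat=30.000000,val=288.4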
(23) The GFS model has output every 3 hours in the files: gfs.t00z.pgrb2.1p00.fHHH HHH=000, 003, 006, 009, etc
The APCP (accumulated precipitation) fields have the following format:
gfs.t00z.pgrb2.1p00.f003: has APCP:surface:0-3 hour acc fcst:
gfs.t00z.pgrb2.1p00.f006: has APCP:surface:0-6 hour acc fcst:
gfs.t00z.pgrb2.1p00.f009: has APCP:surface:6-9 hour acc fcst:
gfs.t00z.pgrb2.1p00.f012: has APCP:surface:6-12 hour acc fcst:
gfs.t00z.pgrb2.1p00.f015: has APCP:surface:12-15 hour acc fcst: ..
I want a file with the 3 hourly accumulations like
1:0:d=2017041800:APCP:surface:0-3 hour acc fcst:
2:97943:d=2017041800:APCP:surface:3-6 hour acc fcst:
3:195886:d=2017041800:APCP:surface:6-9 hour acc fcst:
4:301974:d=2017041800:APCP:surface:9-12 hour acc fcst:
5:383627:d=2017041800:APCP:surface:12-15 hour acc fcst: ..
You can use the -ncep_norm option to make the 3 hourly accumulations of the APCP:
$ cat gfs.t00z.pgrb2.1p00.f0?? | wgrib2 - -match ':APCP:' -if ':APCP:' -ncep_norm apcp.grb2
The result:
$ wgrib2 apcp.grb2
1:0:d=2017041800:APCP:surface:0-3 hour acc fcst:
2:97943:d=2017041800:APCP:surface:3-6 hour acc fcst:
3:195886:d=2017041800:APCP:surface:6-9 hour acc fcst:
4:301974:d=2017041800:APCP:surface:9-12 hour acc fcst:
(24) In the operational CFSv2 time series, V does not follow U and the forecasts are not in order. When using
-new_grid, the vector fields must be adjacent (u-component then v-component).
$wgrib2 IN.grb | sed -e 's/:UGRD:/:UGRDa:/' -e 's/:VGRD:/:UGRDb:/' | sort -t: -k3,3 -k6n,6 -k5,5 -k4,4 | wgrib2 -i IN.grb -grib OUT.grb
(25) Compute wind speed and direction and append them to the same input file. We need to make sure the u-wind
and v-wind records are paired properly.
$wgrib2 -match "(UGRD|VGRD)" infile.grb2 -wind_speed tmpfile -wind_dir tmpfile
$cat tmpfile >> infile.grb2
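The derived records are written as WIND (speed) and WDIR (direction), so the append can be verified with a quick listing:
$wgrib2 infile.grb2 -match ':(WIND|WDIR):'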
(26) How do I interpolate/extrapolate winds to a different grid resolution?
$wgrib2 IN.grb | sed -e 's/:d=\([0-9]*\):\([A-Z]\)\([^:]*\)/:d=\1:\3_\2:/' | sort -t: -k3,3 -k5,5 -k6,6 -k4,4 | wgrib2
-i IN.grb -new_grid_winds earth -new_grid latlon 0:144:2.5 -90:73:2.5 OUT.grb
(27) I have grib2 files in WE:NS order, how do I put them into WE:SN order?
Use -new_grid to interpolate to a new grid. Just make the new grid the same as the old grid except with WE:SN order; see the sketch below.
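For example, if the input is a global 1-degree WE:NS grid (an assumption; adjust the grid spec to match the actual file), the flip to WE:SN order would look like:
$wgrib2 IN.grb -new_grid_winds earth -new_grid latlon 0:360:1 -90:181:1 OUT.grb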
(28) I wanted to find the min TMP2M for a given year from CFSR. IN.grb = time series of TMP2M (only) for the
given year.
# create an inventory where record 1 is last
$wgrib2 IN.grb -for 2:: > junk.inv
$ wgrib2 IN.grb -d 1 >> junk.inv
# process the file with record 1 last
$wgrib2 -i <junk.inv IN.grb -if "^2:" -rpn "sto_1" -fi -rpn "rcl_1:min:sto_1" -if "^1:" -set_ave "0-1 year min anl" -grib_out OUT.grb
(29) How do I convert grib2 to netcdf format ?
$ wgrib2 infile.grib2 -netcdf outfile.nc
$ wgrib2 infile.grib2 -nc_nlev 4 -netcdf outfile.nc # to get a proper 4-dimensional (time, level, lat, lon) axis
(30) The -lola option with nlat=nlon=1 and dlat=dlon=1 (or any non-zero value) extracts an interpolated value at a single point (here lon=10, lat=20).
binary: $ wgrib2 IN.grb -no_header -lola "10:1:1" "20:1:1" grid.bin bin
text: $ wgrib2 IN.grb -lola "10:1:1" "20:1:1" grid.txt text
grib: $ wgrib2 IN.grb -lola "10:1:1" "20:1:1" grid.grb grib (converting to grib)
CSV: $ wgrib2 grid.grb -csv grid.csv (converting the grib to CSV)
(31) I have a file with var246 that I want to change to the standard UGRD at 100 m above
ground.
$ wgrib2 IN.grb -if ":var246:" -set_var UGRD -set_lev "100 m above ground" -fi -grib OUT.grb
(32) I have 80 ensemble members that have WIND (wind speed) at 10 m. I want to find the
probability that the WIND is greater than or equal to 15 m/s, based on the percentage of ensemble
members >= 15 m/s.
The ensemble data is at $input/flx.memNNN.YYMMDDHH, where NNN=001..080. WIND is only
defined at 10 meters above ground.
$ cat flx.mem??? | wgrib2 - -match ":WIND:" -rpn "15:>=" -ens_processing ensstat.grb x
$wgrib2 ensstat.grb -match ":ens mean" -set_grib_type c3 -set_prob 1 1 1 15 15 -grib
wind_ge_15.grb
g2ctl.pl
For analyses:
$ g2ctl -0 grib2_file > grib2_file.ctl
$ gribmap -0 -i grib2_file.ctl
$ grads
ga-> open grib2_file.ctl
For forecasts (end of averaging period):
$ g2ctl grib2_file > grib2_file.ctl
$ gribmap -i grib2_file.ctl
$ grads
ga-> open grib2_file.ctl
For forecasts (start of averaging period):
$ g2ctl -b grib2_file > grib2_file.ctl
$ gribmap -b -i grib2_file.ctl
$ grads
ga-> open grib2_file.ctl
g2ctl options:
-verf .. use end of ave/acc period or fcst time (default); run gribmap with no option
-0 .. use analysis times; run gribmap with gribmap -0
-b .. use start of averaging/accumulation period; run gribmap with gribmap -b
-365 .. use a 365-day calendar
-no_profile .. no z coordinate
-raw .. use a raw grid
-ens "e1,..,en" .. a list of quoted ensemble names
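For example, a control file for an ensemble set with hypothetical member names, using a 365-day calendar (a sketch, not from the original slides):
$ g2ctl -365 -ens "c00,p01,p02" ens_grib2_file > ens_grib2_file.ctl
$ gribmap -i ens_grib2_file.ctl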
References
https://www.cpc.ncep.noaa.gov/products/wesley/wgrib2/
https://www.ftp.cpc.ncep.noaa.gov/wd51we/wgrib2/tricks.wgrib2
https://www.ftp.cpc.ncep.noaa.gov/wd51we/wgrib2/tricks.cheap
https://www.cpc.ncep.noaa.gov/products/wesley/g2ctl.html