- Workflow details:
  - Use: `run_model raster_met scenario/date location/dataset [temp_dir=auto]`
  - Note: We should either split the met flow into a separate MM OR change its arg order to `dataset, date`, because `dataset` is more like `scenario` than `date` is.
  - run_steps = download, process, import, analyze
    - segment/data id: date
    - scenario/conf set: data source (nldas2, PRISM, etc.)
  - modules = met, geo, amalgamate
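The arg order and per-step invocation above can be sketched as a small driver that prints one `run_model` command per step. This is illustration only, not the actual `run_model` internals; the binary path matches the examples later in this doc, and the step list follows `run_steps` above.

```shell
# Print (rather than execute) the per-step run_model commands for one
# date/dataset pair; the real binary path is environment-specific.
run_met_steps() {
  date_id="$1"    # segment/data id, e.g. "2021-01-03"
  dataset="$2"    # scenario/conf set, e.g. nldas2
  for step in download process import analyze; do
    echo /opt/model/meta_model/run_model raster_met "$date_id" "$dataset" auto met "$step"
  done
}
run_met_steps "2021-01-03" nldas2
```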
- Meta Model modules
  - met: obtains and formats the base meteorology data (see Workflow met, model_meteorology#64)
    - Output/Intermediate Files:
      - geotiff clipped to coverage, using the original file name: `[original file].CBP_ext.gtiff`
      - Imports into the database with PostGIS
  - geo: extracts met data for a given coverage, analyzes it against a USGS gage, and produces an accuracy assessment
  - amalgamate: see Workflow amalgamate, model_meteorology#66
    - Assembles a set of rasters from an assortment of candidate sources based on accuracy rank for individual time periods
  - wdm: creates an hourly precip WDM (TBD: include other met vars)
    - Handles temporal disaggregation for storage efficiency
    - see Workflow wdm, model_meteorology#72
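As one naive stand-in for the temporal disaggregation the wdm step handles, a daily precip total can be spread evenly across 24 hourly values. The real method is not specified in this doc; this sketch only shows the shape of the transformation.

```shell
# Evenly disaggregate a daily total into 24 hourly values (HHMM value).
# Uniform splitting is an assumption for illustration, not the wdm method.
disagg_even() {
  awk -v total="$1" 'BEGIN {
    for (h = 0; h < 24; h++) printf "%02d00 %.3f\n", h, total / 24
  }'
}
disagg_even 24.0
```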
- model_config details:
  - get source file names
    - nldas2: for each date there are 24 files (script `nldas2_file_names YEAR JDAY`)
    - PRISM: 1 file for each date
  - from `MODEL_ROOT/config/control/met/[scenario].con`: `precip_method`, `precip_data_sources` (default: nldas2, prism, daymet)
  - get julian day
  - tmp directory (for intermediate data products and a problem file in case of an error that needs to stop the workflow)
  - tiff storage directory path
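A hypothetical sketch of what `nldas2_file_names YEAR JDAY` might emit: 24 hourly file names for the given date. The `NLDAS_FORA0125_H` naming pattern is an assumption about the upstream source files, and the julian-day conversion uses GNU `date`; the real script may differ on both counts.

```shell
# Emit 24 hourly NLDAS-2 source file names for a year + julian day.
# File-name pattern and GNU date usage are assumptions for illustration.
nldas2_file_names() {
  year="$1"; jday="$2"
  # convert julian day to calendar date (requires GNU date)
  ymd=$(date -d "${year}-01-01 +$((jday - 1)) days" +%Y%m%d)
  for h in $(seq 0 23); do
    printf 'NLDAS_FORA0125_H.A%s.%02d00.002.grb\n' "$ymd" "$h"
  done
}
nldas2_file_names 2021 2
```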
- How to make persistent data?
  - stash as json or text file things like `src_files.txt`, `import_files.txt`
  - part of `model_config` can obtain dates and file matches
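A minimal sketch of the stash idea: write the resolved source file list to the tmp dir once, then have later steps re-read it instead of recomputing. The `src_files.txt` name comes from the note above; the file contents and tmp-dir handling here are placeholders.

```shell
# Stash the source file list so later steps (import, process) can reuse it.
TMP_DIR=$(mktemp -d)
printf '%s\n' file1.grb file2.grb > "$TMP_DIR/src_files.txt"
# a later step reads the stash back instead of re-resolving file names
while read -r f; do
  echo "importing $f"
done < "$TMP_DIR/src_files.txt"
```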
- Testing:
  - Import: `/opt/model/meta_model/run_model raster_met "2021-01-01" nldas2 auto met import`
  - Process after import: `/opt/model/meta_model/run_model raster_met "2021-01-02" nldas2 auto met process`
Configuration (before any runs):

```
MODEL_ROOT=/backup/meteorology/
MODEL_BIN=$MODEL_ROOT
SCRIPT_DIR=/opt/model/model_meteorology/sh
export MODEL_ROOT MODEL_BIN SCRIPT_DIR
```
Run examples:

```
# run all steps for a given date and dataset (nldas2, PRISM, ...)
/opt/model/meta_model/run_model raster_met "2021-01-03" nldas2 auto
# just the import step
/opt/model/meta_model/run_model raster_met "2021-01-03" nldas2 auto met import
# just the timestamp-setting portion of the post-download processing workflow
/opt/model/meta_model/run_model raster_met "2021-01-03" nldas2 auto met process 3
```
Draft: still working out the details of where and when to do each step.
- TBD: how do we tell the system what things to amalgamate?
  - Select options/preferences in the `config/met/[scenario].con` file
  - Could have auto-generation of a best fit, but this is less likely than selecting from pre-determined options in a config file.
- Generate an amalgamated dataset: repeat for each USGS gage coverage that has data available for 1 or more of the requested candidate data sources and methods
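A `[scenario].con` fragment selecting amalgamation preferences might look like the following. The key names match `precip_method` and `precip_data_sources` mentioned earlier; the key=value format and the `ranked` value are assumptions, since the config syntax is not specified here.

```
precip_method=ranked
precip_data_sources=nldas2,prism,daymet
```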