
Design Notes for python

water-e edited this page Oct 10, 2024 · 1 revision

Currently there is some code to take individual variable time series and break them up into Inputs with dimension (batch, time). There is also a function to concatenate these into (batch, time, feature).

You should create Inputs for individual variables. Lags can be calculated offline using create_antecedent_inputs, which produces the oddly lagged CalSIM style, or they can be introduced using the simpler LSTM style. I recommend doing this one variable at a time. stack_inputs will then combine multiple individual lagged inputs, each of shape (nbatch, ntime), into a single (nbatch, ntime, nfeature) array.
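The lag-then-stack pipeline above can be sketched as follows. This is a minimal stand-in, not the repo's actual create_antecedent_inputs/stack_inputs: the names `antecedent_lags` and the choice of contiguous lags are assumptions for illustration.

```python
import numpy as np
import pandas as pd

def antecedent_lags(series, lags):
    """Build an (nbatch, nlags) array whose columns are the series shifted
    by each lag in `lags` (0 = current value). Rows without complete lag
    history are dropped so every batch row is fully populated."""
    frame = pd.concat({lag: series.shift(lag) for lag in lags}, axis=1)
    return frame.dropna().to_numpy()

def stack_inputs(*lagged):
    """Combine several (nbatch, ntime) lagged blocks, one per variable,
    into a single (nbatch, ntime, nfeature) array."""
    return np.stack(lagged, axis=-1)
```

Usage: with `flow = pd.Series(np.arange(10.0))`, `antecedent_lags(flow, [0, 1, 2])` yields an (8, 3) block (two rows lost to missing history), and stacking two such blocks yields shape (8, 3, 2).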

Scaling should be embedded in the model. You could do it after the lag structure is created but before concatenation. I recommend not allowing different lags of the same variable to have different scaling.
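One way to enforce that invariant is to fit a single mean/std per variable from the raw (unlagged) series and apply it to every lag of that variable. The sketch below shows the arithmetic; in the model itself this would live as a fixed, non-trainable scaling layer (function names here are assumptions, not repo API).

```python
import numpy as np

def fit_uniform_scale(raw):
    """Fit one mean/std per variable from the raw unlagged series,
    so all lags of that variable share the same scaling."""
    return raw.mean(), raw.std()

def apply_scale(lagged, mean, std):
    """Scale an (nbatch, ntime) lagged block with the variable's
    single mean/std; every lag column is scaled identically."""
    return (lagged - mean) / std
```

Fitting from the raw series, rather than column-by-column on the lagged block, is what prevents each lag from acquiring its own scale.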

If you want linear combinations, another thing you can do is create a custom layer that just does the tensor arithmetic to combine variables. For instance, NDO could be constructed from Sac, exports, consumptive use (CU) and SJR flows using weights of +1 Sac, -1 exports, -1 CU, +1 SJR. Consider waiting until after that combination to scale. Note you could leave those linear combinations fixed and allow others to be fit.
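The NDO arithmetic reduces to a dot product along the feature axis with a fixed weight vector. A plain-numpy sketch of what such a custom layer would compute (the feature ordering and function name are assumptions):

```python
import numpy as np

# Fixed weights: +1*Sac, -1*exports, -1*CU, +1*SJR
NDO_WEIGHTS = np.array([1.0, -1.0, -1.0, 1.0])

def ndo_from_features(x):
    """Given x of shape (nbatch, ntime, 4) with features ordered
    [Sac, exports, CU, SJR], return NDO of shape (nbatch, ntime).
    In a Keras model this would be a custom layer holding
    NDO_WEIGHTS as a non-trainable weight, so the combination stays
    fixed while other weights are fit."""
    return x @ NDO_WEIGHTS
```

Keeping the weight vector non-trainable is what "leave those linear combinations fixed" amounts to; making it trainable would let the model fit its own combination instead.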

In models.py I stuck some code from the LSTM and MLPs we've been working on. The LSTM is multivariate. The idea is to build models using functions that take "inputs" as arguments, since all the lagging and scaling has to be done first. I haven't thought the model code out too much, but we could start testing it.
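The "builder takes prepared inputs" pattern could look like the following. This is a toy numpy forward pass, not the models.py code: it only illustrates that the builder receives already-lagged-and-scaled data, infers the feature size from it, and returns something callable. A real version would return a Keras model built on an Input of matching shape.

```python
import numpy as np

def build_mlp(x, hidden=8, seed=0):
    """Sketch of a model-builder that takes prepared inputs: infer the
    feature dimension from x (already lagged and scaled) and return a
    forward function with randomly initialized weights."""
    rng = np.random.default_rng(seed)
    nfeat = x.shape[-1]
    w1 = rng.normal(size=(nfeat, hidden))
    b1 = np.zeros(hidden)
    w2 = rng.normal(size=(hidden, 1))
    b2 = np.zeros(1)

    def forward(inp):
        # One hidden tanh layer applied along the feature axis.
        h = np.tanh(inp @ w1 + b1)
        return h @ w2 + b2

    return forward
```

The point of the pattern is that shape decisions live with the data preparation, so swapping the MLP builder for an LSTM builder doesn't change the calling code.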

When I get back, I'd like to wrap the model fitting into leave-one-out cross-validation like Ryan did. I'd like to collect the data and selectively drop half of each of the cases. If we follow the standards, the cases will be in a column.
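With cases in a column, both pieces are short groupby operations. A sketch, assuming the column is named "case" (the name and the keep-first-half rule are assumptions):

```python
import pandas as pd

def drop_half_each_case(df, case_col="case"):
    """Keep the first half of each case's rows; cases are identified
    by a column per the data standards."""
    pos = df.groupby(case_col).cumcount()
    size = df.groupby(case_col)[case_col].transform("size")
    return df[pos < size // 2]

def leave_one_out(df, case_col="case"):
    """Yield (train, holdout) pairs, holding out one case at a time."""
    for case in df[case_col].unique():
        mask = df[case_col] == case
        yield df[~mask], df[mask]
```

Wrapping the fit means calling the model builder once per (train, holdout) pair and collecting the holdout predictions.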

One other idea I'd like to record for my own (Eli's) reference: I'd like to see whether including the altered scenario in a residual actually helps the baseline-condition model settle on an internal state that is good for transfer.
