[ENH] Add Tide to test framework of ptf-v2 #1889
Open
phoeenniixx wants to merge 134 commits into sktime:main from phoeenniixx:tide
Commits (134)
b3644a6 test suite (fkiraly)
a1d64c6 Merge branch 'main' into test-suite (fkiraly)
4b2486e skeleton (fkiraly)
02b0ce6 skeleton (fkiraly)
41cbf66 Update test_all_estimators.py (fkiraly)
cef62d3 Update _base_object.py (fkiraly)
bc2e93b Update _lookup.py (fkiraly)
eee1c86 Update _lookup.py (fkiraly)
164fe0d base metadatda (fkiraly)
20e88d0 registry (fkiraly)
318c1fb fix private name (fkiraly)
012ab3d Update _base_object.py (fkiraly)
86365a0 test failure (fkiraly)
f6dee46 Update test_all_estimators.py (fkiraly)
9b0e4ec Update test_all_estimators.py (fkiraly)
7de5285 Update test_all_estimators.py (fkiraly)
57dfe3a test folders (fkiraly)
c9f12db Update test.yml (fkiraly)
fa8144e test integration (fkiraly)
232a510 fixes (fkiraly)
1c8d4b5 Update _conftest.py (fkiraly)
f632e32 try scenarios (fkiraly)
252598d D1, D2 layer commit (phoeenniixx)
d0d1c3e remove one comment (phoeenniixx)
80e64d2 model layer commit (phoeenniixx)
6364780 update docstring (phoeenniixx)
82b3dc7 Merge branch 'refactor-d1-d2' into refactor-model (phoeenniixx)
257183c update data_module.py (phoeenniixx)
9cdcb19 update data_module.py (phoeenniixx)
a83bf32 Merge branch 'refactor-d1-d2' into refactor-model (phoeenniixx)
ac56d4f Add disclaimer (phoeenniixx)
0e7e36f Merge branch 'refactor-d1-d2' into refactor-model (phoeenniixx)
4bfff21 update docstring (phoeenniixx)
ef98273 Merge branch 'refactor-d1-d2' into refactor-model (phoeenniixx)
8a53ed6 Add tests for D1,D2 layer (phoeenniixx)
9f9df31 Merge branch 'main' into refactor-d1-d2 (phoeenniixx)
cdecb77 Code quality (phoeenniixx)
86360fd Merge branch 'refactor-d1-d2' into refactor-model (phoeenniixx)
20aafb7 refactor file (fkiraly)
043820d warning (fkiraly)
1720a15 linting (fkiraly)
af44474 move coercion to utils (fkiraly)
a3cb8b7 linting (fkiraly)
75d7fb5 Update _timeseries_v2.py (fkiraly)
1b946e6 Update __init__.py (fkiraly)
3edb08b Update __init__.py (fkiraly)
a4bc9d8 Merge branch 'main' into pr/1811 (fkiraly)
4c0d570 Merge branch 'pr/1811' into pr/1812 (fkiraly)
ef37f55 Merge branch 'main' into test-suite (fkiraly)
a669134 Update _lookup.py (fkiraly)
d78bf5d Update _lookup.py (fkiraly)
e350291 update tests (phoeenniixx)
f90c94f Merge branch 'refactor-d1-d2' into refactor-model (phoeenniixx)
3099691 update tft_v2 (phoeenniixx)
77cb979 warnings and init attr handling (fkiraly)
28df3c3 Merge branch 'refactor-d1-d2' of https://github.com/phoeenniixx/pytor… (fkiraly)
f8c94e6 simplify TimeSeries.__getitem__ (fkiraly)
c289255 Update _timeseries_v2.py (fkiraly)
9467f38 Update data_module.py (fkiraly)
c3b40ad backwards compat of private/public attrs (fkiraly)
c007310 Merge branch 'refactor-d1-d2' into refactor-model (phoeenniixx)
2e25052 Merge branch 'main' into refactor-model (phoeenniixx)
38c28dc add tests (phoeenniixx)
9d80eb8 add tests (phoeenniixx)
a8ccfe3 add tests (phoeenniixx)
f900ba5 add more docstrings (phoeenniixx)
ed1b799 add note about the commented out tests (phoeenniixx)
c947910 Merge branch 'main' into refactor-model (phoeenniixx)
c0ceb8a add the commented out tests (phoeenniixx)
3828c26 remove note (phoeenniixx)
6d6d18e Merge branch 'main' into refactor-model (phoeenniixx)
3144865 Merge branch 'test-suite' of https://github.com/sktime/pytorch-foreca… (phoeenniixx)
30b541b make the modules private (phoeenniixx)
3f1e11f Merge remote-tracking branch 'origin/refactor-model' into refactor-model (phoeenniixx)
5cc3ff1 initial commit (phoeenniixx)
1bcf181 Merge branch 'refactor-model' into test-framework (phoeenniixx)
f18e09d add TFTMetadata class (phoeenniixx)
e1e360e add TFTMetadata class (phoeenniixx)
168e16a Merge branch 'main' into test-framework (phoeenniixx)
92c12bf add TFT tests (phoeenniixx)
1d478d5 remove refactored TFT (phoeenniixx)
f9992f2 Merge branch 'main' into test-framework (phoeenniixx)
d049019 update test_all_estimators (phoeenniixx)
e72486b linting (phoeenniixx)
7443b0b Merge branch 'main' into test-framework (phoeenniixx)
a734f26 refactor (phoeenniixx)
7f466b2 Add more test_params (phoeenniixx)
0968452 Add metadata tests (phoeenniixx)
525bbb9 Merge branch 'main' into test-framework (phoeenniixx)
4267da6 Merge branch 'main' into test-framework (phoeenniixx)
4e8f863 add object-filter to ptf-v1 (phoeenniixx)
c117092 Merge branch 'main' into test-framework (phoeenniixx)
f6d39fe Merge branch 'main' into test-framework (phoeenniixx)
2c518ee add new base classes (phoeenniixx)
7a5c58f remove try block (phoeenniixx)
cb3e944 Merge branch 'main' into test-framework (phoeenniixx)
3b9de6d add support for multiple datamodules (phoeenniixx)
032a7b0 typo (phoeenniixx)
4d9a19a Merge branch 'main' into test-framework (phoeenniixx)
03c06e8 Merge branch 'main' into test-framework (phoeenniixx)
33ae311 add Tide (phoeenniixx)
8b0087e linting (phoeenniixx)
0e1debd Merge branch 'test-framework' into tide (phoeenniixx)
63f1eb7 softdep (phoeenniixx)
d328fae Merge branch 'main' into test-framework (phoeenniixx)
62c3f83 Merge branch 'test-framework' into tide (phoeenniixx)
7dfba67 softdep (phoeenniixx)
f020229 add the error causing param (phoeenniixx)
43a837e remove embs from params (phoeenniixx)
68df4b6 merge main (phoeenniixx)
57d635b add pkg name to v2 (phoeenniixx)
9798ff1 Merge branch 'test-framework' into tide (phoeenniixx)
1c88de0 add pkg name to v2 (phoeenniixx)
8436793 Merge branch 'main' into tide (phoeenniixx)
6096b90 Merge branch 'main' into pr/1889 (fkiraly)
0fbbf00 Update _tide_pkg.py (fkiraly)
ab94060 revert (fkiraly)
f4d4f37 Delete tft_v2_metadata.py (fkiraly)
d0b8677 revert (fkiraly)
3f89a45 revert (fkiraly)
ad00566 rename (fkiraly)
53131e4 Update tide_v2_pkg.py (fkiraly)
52fefc3 update tide.py (phoeenniixx)
2d20eb3 refactor (phoeenniixx)
5968af7 merge main (phoeenniixx)
8d94371 refactor code (phoeenniixx)
6cbb6aa remove unused base class (phoeenniixx)
58f0b60 remove beautify string util (phoeenniixx)
89939e8 remove unused imports (phoeenniixx)
92e88ad add docstrings (phoeenniixx)
c11fb4d update docstrings (phoeenniixx)
d7eeeec Merge branch 'main' into tide (phoeenniixx)
8f4a831 Merge branch 'main' into tide (phoeenniixx)
7619147 Merge branch 'main' into tide (phoeenniixx)
pytorch_forecasting/layers/_dsipts/__init__.py (new file, 4 additions):

from pytorch_forecasting.layers._dsipts._residual_block_dsipts import ResidualBlock
from pytorch_forecasting.layers._dsipts._sub_nn import embedding_cat_variables

__all__ = ["ResidualBlock", "embedding_cat_variables"]
pytorch_forecasting/layers/_dsipts/_residual_block_dsipts.py (new file, 50 additions):

import torch.nn as nn


class ResidualBlock(nn.Module):
    def __init__(
        self, in_size: int, out_size: int, dropout_rate: float, activation_fun: str = ""
    ):
        """Residual block used as the basic layer of the architecture.

        MLP with one hidden layer, activation, and a skip connection.
        Typically both sizes equal d_model, but explicit in_size and out_size
        allow different dimensions at different stages of the network.

        Parameters
        ----------
        in_size : int
            input size
        out_size : int
            output size
        dropout_rate : float
            dropout rate
        activation_fun : str, optional
            name of the torch.nn activation class to use (e.g. "GELU").
            If empty, nn.ReLU is used.
        """
        super().__init__()

        self.direct_linear = nn.Linear(in_size, out_size, bias=False)

        if activation_fun == "":
            self.act = nn.ReLU()
        else:
            # resolve the activation class from its name, e.g. "GELU" -> nn.GELU
            self.act = getattr(nn, activation_fun)()
        self.lin = nn.Linear(in_size, out_size)
        self.dropout = nn.Dropout(dropout_rate)

        self.final_norm = nn.LayerNorm(out_size)

    def forward(self, x, apply_final_norm=True):
        # skip connection: linear projection without bias
        direct_x = self.direct_linear(x)

        # hidden path: activation -> linear -> dropout
        x = self.dropout(self.lin(self.act(x)))

        out = x + direct_x
        if apply_final_norm:
            return self.final_norm(out)
        return out
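For context, a minimal usage sketch of the block above; the sizes here are hypothetical and only illustrate the expected shapes:

import torch

from pytorch_forecasting.layers._dsipts import ResidualBlock

# hypothetical sizes, chosen only for illustration
block = ResidualBlock(in_size=16, out_size=8, dropout_rate=0.1)

x = torch.randn(32, 16)                   # (batch, in_size)
y = block(x)                              # final LayerNorm applied by default
y_raw = block(x, apply_final_norm=False)  # skip the final LayerNorm
print(y.shape, y_raw.shape)               # torch.Size([32, 8]) torch.Size([32, 8])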
pytorch_forecasting/layers/_dsipts/_sub_nn.py (new file, 101 additions):

from typing import Union

import torch
import torch.nn as nn


class embedding_cat_variables(nn.Module):
    # at the moment cat_past and cat_fut are handled together
    def __init__(self, seq_len: int, lag: int, d_model: int, emb_dims: list, device):
        """Embed categorical variables, adding 3 positional variables during forward.

        Parameters
        ----------
        seq_len : int
            length of the sequence (sum of past and future steps)
        lag : int
            number of future steps to be predicted
        d_model : int
            dimension of all variables after they are embedded
        emb_dims : list
            sizes of the embedding dictionaries, one per categorical variable
        device : torch.device
        """
        super().__init__()
        self.seq_len = seq_len
        self.lag = lag
        self.device = device
        # append dictionary sizes for the 3 generated variables:
        # pos_seq (seq_len), pos_fut (lag + 1), is_fut (2)
        self.cat_embeds = emb_dims + [seq_len, lag + 1, 2]
        self.cat_n_embd = nn.ModuleList(
            [nn.Embedding(emb_dim, d_model) for emb_dim in self.cat_embeds]
        )

    def forward(
        self, x: Union[torch.Tensor, int], device: torch.device
    ) -> torch.Tensor:
        """Concatenate all components of x with 3 generated variables, in order:

        - pos_seq: assigns each step its time position
        - pos_fut: assigns each step its future position, 0 for past steps
        - is_fut: marks each step as future (1) or past (0)

        Parameters
        ----------
        x : torch.Tensor
            `[bs, seq_len, num_vars]`; if an int, it is interpreted as the
            batch size and only the 3 generated variables are embedded

        Returns
        -------
        torch.Tensor
            `[bs, seq_len, num_vars + 3, d_model]`
        """
        if isinstance(x, int):
            no_emb = True
            B = x
        else:
            no_emb = False
            B, _, _ = x.shape

        pos_seq = self.get_pos_seq(bs=B).to(device)
        pos_fut = self.get_pos_fut(bs=B).to(device)
        is_fut = self.get_is_fut(bs=B).to(device)

        if no_emb:
            cat_vars = torch.cat((pos_seq, pos_fut, is_fut), dim=2)
        else:
            cat_vars = torch.cat((x, pos_seq, pos_fut, is_fut), dim=2)
            cat_vars = cat_vars.long()
        cat_n_embd = self.get_cat_n_embd(cat_vars)
        return cat_n_embd

    def get_pos_seq(self, bs):
        pos_seq = torch.arange(0, self.seq_len)
        pos_seq = pos_seq.repeat(bs, 1).unsqueeze(2).to(self.device)
        return pos_seq

    def get_pos_fut(self, bs):
        pos_fut = torch.cat(
            (
                torch.zeros((self.seq_len - self.lag), dtype=torch.long),
                torch.arange(1, self.lag + 1),
            )
        )
        pos_fut = pos_fut.repeat(bs, 1).unsqueeze(2).to(self.device)
        return pos_fut

    def get_is_fut(self, bs):
        is_fut = torch.cat(
            (
                torch.zeros((self.seq_len - self.lag), dtype=torch.long),
                torch.ones((self.lag), dtype=torch.long),
            )
        )
        is_fut = is_fut.repeat(bs, 1).unsqueeze(2).to(self.device)
        return is_fut

    def get_cat_n_embd(self, cat_vars):
        # embed each categorical column separately and stack along a new dim
        cat_n_embd = torch.Tensor().to(cat_vars.device)
        for index, layer in enumerate(self.cat_n_embd):
            emb = layer(cat_vars[:, :, index])
            cat_n_embd = torch.cat((cat_n_embd, emb.unsqueeze(2)), dim=2)
        return cat_n_embd
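A shape-level sketch of how the module above behaves; the sequence lengths and vocabulary sizes are hypothetical:

import torch

from pytorch_forecasting.layers._dsipts import embedding_cat_variables

device = torch.device("cpu")
# hypothetical setup: 10 past + 5 future steps, one categorical variable
# with a vocabulary of 7, embedded into d_model = 4
emb = embedding_cat_variables(seq_len=15, lag=5, d_model=4, emb_dims=[7], device=device)

cats = torch.randint(0, 7, (2, 15, 1))  # (bs, seq_len, num_vars)
out = emb(cats, device)
print(out.shape)  # torch.Size([2, 15, 4, 4]) -> num_vars + 3 generated variables

# with emb_dims=[] only the 3 generated positional variables are embedded,
# and forward accepts an int batch size instead of a tensor
emb_pos = embedding_cat_variables(seq_len=15, lag=5, d_model=4, emb_dims=[], device=device)
print(emb_pos(2, device).shape)  # torch.Size([2, 15, 3, 4])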
pytorch_forecasting/models/tide/tide_dsipts/__init__.py (new file, 6 additions):

"""DSIPTS Tide implementation for v2."""

from pytorch_forecasting.models.tide.tide_dsipts._tide_v2 import TIDE
from pytorch_forecasting.models.tide.tide_dsipts._tide_v2_pkg import TIDE_pkg_v2

__all__ = ["TIDE", "TIDE_pkg_v2"]
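The surface exposed by this __init__ can be smoke-checked with a plain import, assuming a dev install of this branch with the optional DSIPTS dependencies:

from pytorch_forecasting.models.tide.tide_dsipts import TIDE, TIDE_pkg_v2

print(TIDE.__name__, TIDE_pkg_v2.__name__)  # TIDE TIDE_pkg_v2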
Why are we doing this? This seems like a significant change to the internal contract of TimeSeriesDataSet.
We discussed it some months back: there was no target_past in EncoderDecoderDataModule, and some models do require it. TSLibDataModule already implements it as history_target (see here). Tide also needs it, and we were already calculating target_past for the target_scale, so I just renamed the variable and added it to the return (if you see line 509, we are still returning target_scale as well).
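To make the described change concrete, a hedged sketch of the pattern; _build_sample and the scaling details are hypothetical, and only target_past, target_scale, and history_target come from the discussion above:

import torch

def _build_sample(target: torch.Tensor, enc_len: int) -> dict:
    # the past slice of the target was already computed to derive target_scale
    target_past = target[:enc_len]
    target_scale = torch.stack(
        (target_past.mean(), target_past.std().clamp_min(1e-8))
    )
    return {
        # newly exposed, analogous to TSLibDataModule's history_target
        "target_past": target_past,
        # still returned as before
        "target_scale": target_scale,
    }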