Hi @ChFrenkel!
When I run main.py with the parameters:
--dataset MNIST --train-mode FA --optimizer Adam --loss CE --epochs 100 --batch-size 100 --lr 5e-3 --topology FC_4096_FC_4096_FC_10
to test the FA method (by the way, DFA and BP both work), the loss keeps increasing and the accuracy keeps dropping:
Summary of epoch 1:
[Training set] Loss: 1.635768, Accuracy: 80.70%
[ Testing set] Loss: 1.636767, Accuracy: 80.57%
Summary of epoch 2:
[Training set] Loss: 1.617745, Accuracy: 80.44%
[ Testing set] Loss: 1.615939, Accuracy: 80.95%
Summary of epoch 3:
[Training set] Loss: 1.601169, Accuracy: 84.36%
[ Testing set] Loss: 1.602587, Accuracy: 84.58%
Summary of epoch 4:
[Training set] Loss: 1.658640, Accuracy: 78.71%
[ Testing set] Loss: 1.655648, Accuracy: 79.05%
Summary of epoch 5:
[Training set] Loss: 1.613513, Accuracy: 76.86%
[ Testing set] Loss: 1.611184, Accuracy: 77.55%
Summary of epoch 6:
[Training set] Loss: 1.682992, Accuracy: 73.08%
[ Testing set] Loss: 1.680608, Accuracy: 73.28%
Summary of epoch 7:
[Training set] Loss: 1.703431, Accuracy: 70.98%
[ Testing set] Loss: 1.700103, Accuracy: 70.82%
Summary of epoch 8:
[Training set] Loss: 1.701123, Accuracy: 73.86%
[ Testing set] Loss: 1.698275, Accuracy: 73.92%
Summary of epoch 9:
[Training set] Loss: 1.742773, Accuracy: 68.16%
[ Testing set] Loss: 1.739864, Accuracy: 68.04%
Summary of epoch 10:
[Training set] Loss: 1.781167, Accuracy: 65.89%
[ Testing set] Loss: 1.776957, Accuracy: 66.57%
Summary of epoch 11:
[Training set] Loss: 1.794598, Accuracy: 65.41%
[ Testing set] Loss: 1.792383, Accuracy: 65.61%
Summary of epoch 12:
[Training set] Loss: 1.860354, Accuracy: 64.36%
[ Testing set] Loss: 1.859995, Accuracy: 64.94%
Summary of epoch 13:
[Training set] Loss: 1.886954, Accuracy: 59.08%
[ Testing set] Loss: 1.893155, Accuracy: 58.46%
Summary of epoch 14:
[Training set] Loss: 1.829343, Accuracy: 62.78%
[ Testing set] Loss: 1.828762, Accuracy: 63.10%
Summary of epoch 15:
[Training set] Loss: 1.956390, Accuracy: 46.87%
[ Testing set] Loss: 1.948871, Accuracy: 47.33%
Summary of epoch 16:
[Training set] Loss: 2.111546, Accuracy: 30.09%
[ Testing set] Loss: 2.108156, Accuracy: 30.43%
Summary of epoch 17:
[Training set] Loss: 2.083381, Accuracy: 30.34%
[ Testing set] Loss: 2.081901, Accuracy: 30.70%
Summary of epoch 18:
[Training set] Loss: 2.151237, Accuracy: 23.26%
[ Testing set] Loss: 2.146791, Accuracy: 23.35%
Summary of epoch 19:
[Training set] Loss: 2.161384, Accuracy: 21.71%
[ Testing set] Loss: 2.161011, Accuracy: 21.65%
Summary of epoch 20:
[Training set] Loss: 2.268198, Accuracy: 11.27%
[ Testing set] Loss: 2.267482, Accuracy: 11.40%
It seems like the model is forgetting what it has learned. Why does this happen? Did I miss anything important?
Looking forward to your reply!
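In case it helps pin down where my run goes wrong, this is my understanding of what FA does: the error is propagated to hidden layers through fixed random feedback matrices instead of the transposed forward weights. A minimal NumPy toy sketch of that idea (my own code, not the repository's implementation; all names, sizes, and the squared-error loss are my choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer net trained with feedback alignment (FA):
# the backward pass routes the error through a FIXED random
# matrix B2 instead of W2.T as plain backprop would.
n_in, n_hid, n_out = 4, 8, 3
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))
B2 = rng.normal(0.0, 0.5, (n_out, n_hid))  # fixed feedback weights


def fa_step(x, y, lr=0.05):
    """One FA update on a single sample; returns the squared-error loss."""
    global W1, W2
    # Forward pass
    h = np.maximum(W1 @ x, 0.0)    # ReLU hidden layer
    o = W2 @ h                      # linear output
    e = o - y                       # output error
    # Backward pass: hidden error uses B2.T, not W2.T
    dh = (B2.T @ e) * (h > 0)
    W2 -= lr * np.outer(e, h)
    W1 -= lr * np.outer(dh, x)
    return 0.5 * float(e @ e)


x = rng.normal(size=n_in)
y = np.array([1.0, 0.0, 0.0])
losses = [fa_step(x, y) for _ in range(300)]
print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

On this toy single-sample problem the loss shrinks even with random feedback, so I would expect FA to at least not collapse the way my MNIST run does, which is why I suspect something in my setup (maybe the Adam + lr 5e-3 combination?).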