Could you show your code on Tiny-ImageNet dataset? #7
Hello, I was pleasantly surprised by your work, which is simple but effective. However, when I implemented your method on the Tiny-ImageNet dataset, I ran into trouble: the robust accuracy and natural accuracy are always 0.01. So could you show your code for the Tiny-ImageNet dataset? Thanks a lot.

Comments
Hi, thanks for your question. You may need to set a smaller learning rate for Tiny-ImageNet.
Thank you! By the way, I set lr = 0.1, batch size m = 128, and 100 training epochs with transition epochs {75, 90} on the training dataset. And here are my transforms:
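The transforms snippet itself was not preserved in this thread. The sketch below shows one common way to wire up the quoted hyper-parameters for 64x64 Tiny-ImageNet images; the dataset path, the crop/flip augmentation, the placeholder model, and the momentum/weight-decay values are assumptions, not the poster's actual code.

```python
import torch
import torchvision.transforms as transforms
from torchvision import datasets

# Training-side sketch for the hyper-parameters quoted above
# (lr = 0.1, batch size 128, 100 epochs, LR drops at epochs 75 and 90).
train_transform = transforms.Compose([
    transforms.RandomCrop(64, padding=4),   # Tiny-ImageNet images are 64x64
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

# Assumed ImageFolder layout under tiny-imagenet-200/train.
train_set = datasets.ImageFolder("tiny-imagenet-200/train", transform=train_transform)
train_loader = torch.utils.data.DataLoader(
    train_set, batch_size=128, shuffle=True, num_workers=4, pin_memory=True)

model = torch.nn.Linear(3 * 64 * 64, 200)  # placeholder; the thread uses WideResNet-34-10
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=5e-4)
# "Transition epochs {75, 90}" read here as a step decay of the learning rate.
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[75, 90], gamma=0.1)

for epoch in range(100):
    # ... one epoch of (adversarial) training on train_loader goes here ...
    scheduler.step()
```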
Hi, (1) for the transformations, test_loader is built with torch.utils.data.DataLoader; (2) we use WideResNet-34-10 as the student and WideResNet-34-10 as the teacher model. The other hyper-parameters are kept the same as for CIFAR. You can have a try with this setting, and we can talk about it further if it still does not work well.
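The maintainer's transformation snippet was truncated in the thread; only the use of torch.utils.data.DataLoader for test_loader survives. The block below is a minimal, hypothetical evaluation-loader sketch consistent with that fragment; the directory layout and transform choices are assumptions.

```python
import torch
import torchvision.transforms as transforms
from torchvision import datasets

# Hypothetical Tiny-ImageNet evaluation loader; only the DataLoader call
# is taken from the thread, the rest is assumed.
test_transform = transforms.Compose([
    transforms.ToTensor(),   # validation images are already 64x64, so no resize/crop
])

# Assumed layout: tiny-imagenet-200/val arranged as class sub-folders.
test_set = datasets.ImageFolder("tiny-imagenet-200/val", transform=test_transform)
test_loader = torch.utils.data.DataLoader(
    test_set, batch_size=128, shuffle=False, num_workers=4, pin_memory=True)
```

One general point worth keeping in mind with such a setup: if normalization is applied at training time, the same normalization has to appear in the test transforms, since a train/test transform mismatch is a common cause of near-random accuracy.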
@jiequancui Thanks for your help, but I have followed your transformations and teacher-model setting, and the robust/clean accuracy is still 0.01 at epoch 21. My torch version is 1.9.0; is there anything else I can do?
Hi, let me check it.