I am quite interested in your paper, and thanks a lot for sharing the code. Regarding Table 3 (VGG, CIFAR-10): I am trying to reproduce the experiment, but I cannot reach the reported 80.38 top-1 accuracy with your algorithm.
The model you provide on Google Drive ("VGG/vgg_pretrain/prune_precfg_epoch160_variance1") appears to be VGG-16 with cfg = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 512]. However, running "sh VGG_cifar/scripts/pruning_vgg_my_method.sh" seems to build a model with cfg = [32, 64, 128, 128, 256, 256, 256, 256, 256, 256, 256, 256, 256].
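To make the mismatch concrete, here is a small standalone sketch (my own code, not from the repo) that counts the conv parameters implied by each cfg, assuming standard VGG-style 3x3 convolutions with bias and 3 input channels:

```python
# My own sanity check, not code from the repository: count the conv-layer
# parameters implied by a VGG cfg list. 'M' entries are max-pool layers
# and contribute no parameters.

def conv_params(cfg, in_ch=3):
    """Total parameters of the conv layers described by a VGG cfg list."""
    total = 0
    for v in cfg:
        if v == 'M':                      # max-pooling: no weights
            continue
        total += 3 * 3 * in_ch * v + v    # 3x3 kernel weights + bias
        in_ch = v
    return total

cfg_pretrained = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M',
                  512, 512, 512, 'M', 512, 512, 512]
cfg_script = [32, 64, 128, 128, 256, 256, 256, 256, 256,
              256, 256, 256, 256]

print(conv_params(cfg_pretrained))  # ~14.7M, the usual VGG-16 conv count
print(conv_params(cfg_script))      # a much smaller network
```

So the two cfgs really do describe different architectures, which is why I suspect the script and the checkpoint do not match.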
Regarding the "accuracy without fine-tuning", I am not sure I understand it correctly. Is it the accuracy obtained after masking the original pretrained model once (line 197 in "VGG_cifar/pruning_cifar_vgg.py")? If so, with the Google Drive model mentioned above I only get about 51%.
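To be explicit about what I mean by one-time masking, here is a toy illustration (entirely my own sketch; `magnitude_mask` is a hypothetical helper, not from `pruning_cifar_vgg.py`): zero out the pruned weights once, then evaluate directly with no further training.

```python
# Toy sketch of one-shot magnitude masking (my own illustration, not the
# repo's code): build a 0/1 mask that drops the smallest-magnitude weights,
# apply it once, and evaluate the masked model as-is.

def magnitude_mask(weights, sparsity):
    """Return a 0/1 mask dropping the smallest-magnitude fraction of weights."""
    k = int(len(weights) * sparsity)  # number of weights to zero out
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    dropped = set(order[:k])          # indices of the k smallest magnitudes
    return [0.0 if i in dropped else 1.0 for i in range(len(weights))]

def apply_mask(weights, mask):
    """One-time masking: element-wise product, no retraining afterwards."""
    return [w * m for w, m in zip(weights, mask)]

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
mask = magnitude_mask(w, 0.5)   # drop the 3 smallest-magnitude weights
pruned = apply_mask(w, mask)
print(pruned)
```

If this is what line 197 does, then my ~51% is the accuracy of exactly this masked-but-untrained model.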
Could you please tell me which VGG-16 variant is being used? It would be great if you could explain how to reproduce the 80.38 accuracy of the pruned model without training. :-)
Sincerely thank you!!!
Mia