

Accuracy of pruned vgg without pruning #67

Open
robinmiali opened this issue Sep 7, 2020 · 0 comments


Hi He,

I am quite interested in your paper, and thanks a lot for sharing the code. Regarding Table 3 (VGG, CIFAR-10): I am trying to re-implement the experiment, but I cannot reproduce the 80.38 top-1 accuracy for your algorithm.
The model you provide on Google Drive ("VGG/vgg_pretrain/prune_precfg_epoch160_variance1") appears to be VGG-16 with 'cfg': [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M', 512, 512, 512, 'M', 512, 512, 512]. However, running "sh VGG_cifar/scripts/pruning_vgg_my_method.sh" seems to build a model with cfg = [32, 64, 128, 128, 256, 256, 256, 256, 256, 256, 256, 256, 256].
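For reference, here is how I compared the two configurations. This is a minimal plain-Python sketch (not code from the repo), assuming the usual torchvision-style cfg convention where each integer is a conv layer's output channels and 'M' is a 2x2 max-pool:

```python
def expand_cfg(cfg, in_channels=3):
    """Expand a VGG-style cfg list into (in_ch, out_ch) conv pairs.
    'M' entries denote max-pool layers and contribute no conv."""
    convs = []
    for v in cfg:
        if v == 'M':
            continue
        convs.append((in_channels, v))
        in_channels = v
    return convs

# cfg from the Google Drive checkpoint (VGG-16 style)
cfg_a = [64, 64, 'M', 128, 128, 'M', 256, 256, 256, 'M',
         512, 512, 512, 'M', 512, 512, 512]
# cfg apparently produced by pruning_vgg_my_method.sh
cfg_b = [32, 64, 128, 128, 256, 256, 256, 256, 256, 256, 256, 256, 256]

print(len(expand_cfg(cfg_a)), len(expand_cfg(cfg_b)))  # 13 13
print(expand_cfg(cfg_a)[0], expand_cfg(cfg_b)[0])      # (3, 64) (3, 32)
```

Both expand to 13 conv layers, but the channel widths diverge from the very first layer, so the checkpoint and the script cannot describe the same network.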

Regarding the "accuracy without finetuning", I am not sure I am interpreting it correctly. Is it the accuracy after a one-shot mask is applied to the original pretrained model (line 197 in "VGG_cifar/pruning_cifar_vgg.py")? If so, for the Google Drive model mentioned above, I only get about 51%.
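To make sure we mean the same thing by "masking", here is the kind of one-shot filter masking I assumed that line performs. This is a hypothetical numpy sketch (not the repo's actual code), assuming filters are zeroed by smallest L1 norm, which may differ from your criterion:

```python
import numpy as np

def mask_filters(weight, prune_ratio):
    """Zero out the conv filters (output channels) with the smallest
    L1 norm. weight has shape (out_ch, in_ch, kH, kW)."""
    out_ch = weight.shape[0]
    n_prune = int(out_ch * prune_ratio)
    l1 = np.abs(weight).reshape(out_ch, -1).sum(axis=1)
    pruned_idx = np.argsort(l1)[:n_prune]          # weakest filters
    mask = np.ones(out_ch, dtype=weight.dtype)
    mask[pruned_idx] = 0.0
    # broadcast the per-filter mask over in_ch, kH, kW
    return weight * mask[:, None, None, None], mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 3, 3, 3))
w_masked, mask = mask_filters(w, 0.5)
print(int(mask.sum()))  # 32 filters kept
```

If the repo instead ranks filters by a different criterion (e.g. the variance-based one from the paper), that alone could explain part of the gap, so please correct me if this reading is wrong.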

Could you please tell me which VGG-16 configuration is being used? It would be great if you could explain how to reproduce the 80.38 accuracy for the pruned model without fine-tuning :-)

Sincere thanks!

Mia
