This repository has been archived by the owner on Mar 17, 2022. It is now read-only.

I found why it doesn't work! #16

Open
Golbstein opened this issue Feb 12, 2019 · 6 comments

Comments

@Golbstein

`view` in PyTorch and `Flatten` in Keras work differently: PyTorch tensors are channels-first (NCHW), while Keras tensors are channels-last (NHWC) by default, so flattening the same data produces a different element order. To fix this problem, add this layer before `Flatten`:

from keras.layers import Lambda
from keras import backend as K

x = Lambda(lambda x: K.permute_dimensions(x, (0, 3, 1, 2)))(x)
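The mismatch, and why the permute fixes it, can be sketched with plain NumPy (the shapes here are illustrative, not taken from the original models):

```python
import numpy as np

# PyTorch tensors are NCHW and .view(N, -1) flattens in that order;
# Keras tensors are NHWC by default, so Flatten() on the same data
# yields a different element order.
nchw = np.arange(2 * 3 * 4 * 4).reshape(2, 3, 4, 4)  # PyTorch layout
nhwc = nchw.transpose(0, 2, 3, 1)                    # same data, Keras layout

pytorch_flat = nchw.reshape(2, -1)                   # what .view(N, -1) does
keras_flat = nhwc.reshape(2, -1)                     # what Flatten() does

# The two flattened orders differ, which breaks any Dense layer
# whose weights were exported from PyTorch.
assert not np.array_equal(pytorch_flat, keras_flat)

# The fix from this issue: permute NHWC back to NCHW before flattening.
fixed_flat = nhwc.transpose(0, 3, 1, 2).reshape(2, -1)
assert np.array_equal(pytorch_flat, fixed_flat)
```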

@gzuidhof
Owner

Thanks! I would love to see a PR for this. Let me know if I can help.

@Golbstein
Author

https://github.com/Golbstein/pytorch_to_keras

@rjarbour

rjarbour commented Feb 13, 2019

Great work!

Despite your solution, I don't seem to get accurate results. I haven't done a direct comparison yet; I've just scanned over the results. I'm going to tinker with it a bit more before I post my model and results. I'm not sure it will be useful though, since it's a pretty deep model.

I can say now that the model uses ReLU activation layers, MaxPooling2D, Conv2D, and BatchNormalization.

@Golbstein
Author

In my project I used exactly these layers: Conv2D, MaxPooling2D, BatchNormalization, and ReLU activation, and I could get the same output.

@rjarbour

I'm not really sure if I've made a mistake. Do you think MaxPooling2D works properly? I'll try to step through it later to see if it produces the right output.
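For what it's worth, max pooling itself is layout-agnostic, so it is an unlikely source of the mismatch; the flatten ordering is. A quick NumPy sketch (hand-rolled 2×2 pooling functions standing in for both frameworks; not the converter's actual code):

```python
import numpy as np

# 2x2 max pooling acts per channel, so it produces the same values
# whether the tensor is NCHW (PyTorch) or NHWC (Keras); only the
# layout of the output differs.
def maxpool2x2_nchw(x):
    n, c, h, w = x.shape
    return x.reshape(n, c, h // 2, 2, w // 2, 2).max(axis=(3, 5))

def maxpool2x2_nhwc(x):
    n, h, w, c = x.shape
    return x.reshape(n, h // 2, 2, w // 2, 2, c).max(axis=(2, 4))

rng = np.random.default_rng(0)
x_nchw = rng.random((1, 3, 4, 4))        # PyTorch layout
x_nhwc = x_nchw.transpose(0, 2, 3, 1)    # same data, Keras layout

out_nchw = maxpool2x2_nchw(x_nchw)
out_nhwc = maxpool2x2_nhwc(x_nhwc)

# Identical results up to layout: MaxPool is not where the two
# frameworks diverge.
assert np.allclose(out_nchw, out_nhwc.transpose(0, 3, 1, 2))
```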

@Golbstein
Author

It worked for me
