Fix running MeloTTS models on GPU. (#1379)
We need to use opset 18 to export the model to ONNX.
csukuangfj authored Sep 26, 2024
1 parent 69c8e7b commit 12d04ce
Showing 1 changed file with 1 addition and 1 deletion.
scripts/melo-tts/export-onnx.py: 1 addition & 1 deletion
@@ -229,7 +229,7 @@ def main():
 
     torch_model = ModelWrapper(model)
 
-    opset_version = 13
+    opset_version = 18
     x = torch.randint(low=0, high=10, size=(60,), dtype=torch.int64)
     print(x.shape)
     x_lengths = torch.tensor([x.size(0)], dtype=torch.int64)
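For context, opset_version is the value the export script hands to torch.onnx.export; per the commit message, opset 18 is needed for the exported MeloTTS model to run on GPU. A minimal sketch of such an export follows; TinyModel, the input shape, and the output file name are placeholders, not the actual ModelWrapper and inputs used in export-onnx.py.

import torch


# Stand-in module used only for illustration; the real script wraps the
# MeloTTS model in a ModelWrapper and feeds token ids, lengths, etc.
class TinyModel(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2.0


torch_model = TinyModel()
x = torch.randn(60)

# The change in this commit: export with opset_version=18 instead of 13.
torch.onnx.export(
    torch_model,
    (x,),
    "model.onnx",
    opset_version=18,
    input_names=["x"],
    output_names=["y"],
)

The resulting model.onnx can then be loaded with onnxruntime (for example with a CUDA execution provider) in the usual way.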
