Error during inference: Kernel size can't be greater than actual input size #1260
Comments
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
On certain files, in sliding-window mode, inference crashes because the input is smaller than the kernel.
These "certain files" appear to be files where the last window to be computed is too short for the model. For example, with duration == step, this UEM crashes:
3b79017c-4d42-40fc-a1bb-4a20bc8ebca7 1 0.000 300.002
(last window will be 0.002 seconds long)
but this one does not:
fe0eab73-f908-400a-a25b-fdcc9b86a029 1 0.000 300.000
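A minimal sketch of why one file crashes and the other does not: when the window duration equals the step, window starts tile the file exactly, so a file duration of 300.002 s leaves a tiny 0.002 s leftover chunk while 300.000 s divides evenly. The 5.0 s window/step value below is hypothetical, chosen only to illustrate the arithmetic; the file durations are the ones from the UEM lines above.

```python
def last_chunk_duration(file_duration, window, step):
    """Duration of the final (possibly partial) sliding-window chunk.

    Advance the window start by `step` as long as a full window still
    fits strictly inside the file; whatever remains is the last chunk.
    """
    start = 0.0
    while start + window < file_duration:
        start += step
    return file_duration - start

# duration == step, as in the crashing configuration
print(last_chunk_duration(300.002, 5.0, 5.0))  # tiny ~0.002 s leftover, shorter than the kernel
print(last_chunk_duration(300.000, 5.0, 5.0))  # exact fit: a full 5.0 s final window
```

A chunk this short, once converted to samples, can easily be smaller than the model's first convolution kernel, which is consistent with the "Kernel size can't be greater than actual input size" error.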
Full error log