Error in prediction classes array for inference method in evaluator.py #5397

Open
MurielleMardenli200 opened this issue Nov 22, 2024 · 1 comment

Comments

MurielleMardenli200 commented Nov 22, 2024

Issue with evaluation:

I am using Detectron2 to make predictions for a single class only (with a class id of 0). When I try to use the inference_on_dataset() method, I get the following error:

[11/22 12:05:36 d2.evaluation.coco_evaluation]: Preparing results for COCO format ...
Validation run stopped due to:A prediction has class=62, but the dataset only has 1 classes and predicted class id should be in [0, 0].
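
For reference, my evaluation follows the standard Detectron2 pattern, roughly as sketched below; the dataset name, output directory, and model variable are placeholders rather than my exact setup:

from detectron2.data import build_detection_test_loader
from detectron2.evaluation import COCOEvaluator, inference_on_dataset

# Placeholder names: a registered single-class validation dataset and a trained model.
evaluator = COCOEvaluator("my_dataset_val", output_dir="./output")
val_loader = build_detection_test_loader(cfg, "my_dataset_val")
results = inference_on_dataset(model, val_loader, evaluator)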

When I inspect the outputs variable containing the pred_classes tensor in the inference_on_dataset() method (line 165), I see that it holds mostly 0 values (the correct class prediction), but also some seemingly random values ranging from 0 to 100; the first out-of-range value is 62, the number reported in the error. The runs of zeros were shortened for readability:

tensor([ 0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,
         0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,  0,
         0,  0,  0,  0,  0,  0,  0,  0, 62,  0, 60,  0,  0,  0,  0,  0,  0,  0,
         0,  0,  0,  0,  0,  0,  0, 59,  0,  0,  0,  0,  0,  0,  0,  0,  8,  0,
         0,  0,  0, 60,  0, 53,  0,  0,  0, 25,  0, 60, 73, 58, 63,  0,  4,  0,
        60, 69,  0,  0,  0,  6,  0, 37,  0,  0,  0,  0, 60, 74, 55, 60,  0,  0,
        71,  0, 13,  0,  0, 57, 56,  2, 60, 73, 72, 13,  0, 60, 72, 39, 32, 37,
        60, 37, 59, 13,  5, 50, 53, 73, 60, 60, 59])

The pred_classes tensor has the same length as the pred_boxes tensor, so the predictions themselves appear to be correctly aligned. When I replace pred_classes with a tensor of the same size filled with zeros (the expected values for a single-class dataset), the method no longer throws an error.
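
The zero-fill workaround I tried looks roughly like this (a debugging sketch only, not a proposed fix; outputs is assumed to be the per-image prediction dict produced inside inference_on_dataset()):

import torch

# Overwrite pred_classes with zeros so every prediction maps to the single
# class id 0; with this in place the evaluator no longer raises the error.
instances = outputs["instances"]
instances.pred_classes = torch.zeros_like(instances.pred_classes)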

How can I fix this issue?

@github-actions github-actions bot added the needs-more-info (More info is needed to complete the issue) label Nov 22, 2024

github-actions commented Nov 22, 2024
You've chosen to report an unexpected problem or bug. Unless you already know the root cause of it, please include details about it by filling in the issue template.
The following information is missing: "Instructions To Reproduce the Issue and Full Logs"; "Your Environment".

@github-actions github-actions bot removed the needs-more-info (More info is needed to complete the issue) label Nov 22, 2024