Best practices for QA of annotations created in Hasty.ai

Hi everyone,

I’ve had some help creating labels in Hasty, but I’m not entirely happy with the quality of the work. Now I have to re-check some of the images, so I was wondering: how do you do QA, and can you share some best practices?

Thanks,
Kevin

Hi Kevin,

I also have an intern doing the labelling work to save me some time. We do QA using the image status and the manual review tool.

To be more precise: whenever he finishes labelling an image, he sets its status to “To review”. Then I open the manual review feature from the project settings. This makes checking the quality of the labels super easy. The Hasty team did a great job here.

Let me know if you find a better workflow.


@simon.dreisig: I couldn’t have explained how to use the manual review feature better myself! Thanks for jumping in.

At Hasty, we’re also working on another feature to automate the QA process for you: the error finder. Our AI assistants will check your annotations and flag any that look badly done, saving you the time you’re currently spending hunting down those errors in the first place.

We’ll launch the feature quite soon, so stay tuned!

Best,
Tobias


Really looking forward to trying the error finder! It sounds great.
