It would be very useful to be able to download a JSON file of the annotations generated by the ML model over the TEST set.
Currently it is only possible to download such a JSON file for the annotations done by hand.
Getting the model's annotations over the TEST set would make it possible to compare what human annotators have done with what the model has produced. It would then be feasible to identify the specific issues that must be fixed in order to improve performance.
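As a rough illustration of the comparison this would enable, the sketch below diffs two annotation exports. The record schema (`document`, `text`, `label` fields) is an assumption for illustration only; the actual export format of the portal may differ.

```python
import json

# Hypothetical schema: both exports are lists of
# {"document": ..., "text": ..., "label": ...} records.
# The real export format may differ; this is only a sketch.
human = json.loads("""[
  {"document": "doc1", "text": "acme corp", "label": "ORG"},
  {"document": "doc1", "text": "paris", "label": "LOC"}
]""")

model = json.loads("""[
  {"document": "doc1", "text": "acme corp", "label": "ORG"},
  {"document": "doc1", "text": "paris", "label": "ORG"}
]""")

def to_set(annotations):
    """Represent each annotation as a hashable (document, text, label) triple."""
    return {(a["document"], a["text"], a["label"]) for a in annotations}

h, m = to_set(human), to_set(model)
matches = h & m       # annotations where model and human agree
missed = h - m        # annotated by hand but not produced by the model
spurious = m - h      # produced by the model but absent by hand

print(f"matches={len(matches)} missed={len(missed)} spurious={len(spurious)}")
# → matches=1 missed=1 spurious=1
```

With both JSON files available, the `missed` and `spurious` sets point directly at the cases where the model needs improvement.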
|Who would benefit from this IDEA?||I will be able to compare, in a practical way, what was annotated by hand and what was generated by the model, so fine-tuning will be possible|