Annotation Quality #2

@dakshvar22

Hi,

Thank you for sharing this dataset. I've read the paper introducing it, and the paper seems to suggest that the data was collected and annotated by domain experts. However, going through the dataset manually, I observed many incorrect annotations. For example, the intent label CHAT_WITH_AN_AGENT in the Powerplay11 dataset overlaps heavily with other intents that express a user problem, e.g. WINNINGS.
Could you please clarify the extent of manual supervision applied during the collection of the dataset? It would also be useful to know about any inter-annotator agreement scheme that may have been used.
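
For context, the kind of agreement reporting I have in mind is something like Cohen's kappa computed over a doubly-annotated sample. A minimal sketch (the label lists below are made-up placeholders, not real examples from the dataset):

```python
# Illustrative only: Cohen's kappa over a doubly-annotated subset.
# The utterance labels below are hypothetical placeholders.
from sklearn.metrics import cohen_kappa_score

# Intent labels assigned by two independent annotators to the same utterances.
annotator_a = ["WINNINGS", "CHAT_WITH_AN_AGENT", "WINNINGS", "CHAT_WITH_AN_AGENT"]
annotator_b = ["WINNINGS", "WINNINGS", "WINNINGS", "CHAT_WITH_AN_AGENT"]

kappa = cohen_kappa_score(annotator_a, annotator_b)
print(f"Cohen's kappa: {kappa:.2f}")
```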

Regards,
Daksh
