Description
Let's be honest: Most (comparative) product review websites are utter rubbish (low effort, low quality, deceptive, fake, you name it), and commercial search engines fail to deal with them.
To fix this mess, we want to build a tool that rates the quality of review websites and helps users make better buying decisions. As a prerequisite for such a tool, we first need a large number of website quality annotations. We have already developed a questionnaire to assess a website's quality and collected screenshots of more than 200,000 potential review websites. We now want to develop a crowdsourcing task on Amazon Mechanical Turk (MTurk) to let paid workers create the annotations for us. Creating this task involves (1) UX design to guide the untrained workers, (2) data-driven optimization of the questionnaire to streamline the annotation process, and (3) developing evaluation methods to weed out faulty annotations (see the sketch below).
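For part (3), one simple starting point could be a majority-vote agreement check: workers whose answers rarely match the per-item majority are candidates for closer review or rejection. The following is a minimal sketch, assuming the MTurk results have been parsed into (worker, item, answer) triples; the function name, data format, and threshold are illustrative, not part of the actual task design.

```python
from collections import Counter, defaultdict

def flag_suspicious_workers(annotations, min_agreement=0.6):
    """Flag workers whose answers rarely match the per-item majority vote.

    annotations: iterable of (worker_id, item_id, answer) triples,
    e.g. parsed from the MTurk results export (format is an assumption).
    """
    # Collect all answers per item to determine the majority label.
    answers_by_item = defaultdict(list)
    for worker, item, answer in annotations:
        answers_by_item[item].append(answer)
    majority = {item: Counter(ans).most_common(1)[0][0]
                for item, ans in answers_by_item.items()}

    # Count how often each worker agrees with the majority answer.
    hits, totals = Counter(), Counter()
    for worker, item, answer in annotations:
        totals[worker] += 1
        hits[worker] += (answer == majority[item])

    # Return workers whose agreement rate falls below the threshold.
    return {w for w in totals if hits[w] / totals[w] < min_agreement}
```

In practice one would likely combine such an agreement score with gold-standard check questions and chance-corrected measures (e.g. Krippendorff's alpha), since a plain majority vote can be skewed when only a few workers annotate each item.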