The dangers of Webcrawled datasets

Authors

  • Graeme Baxter Bell, Murdoch University

DOI:

https://doi.org/10.5210/fm.v15i2.2739

Keywords:

internet, webcrawler, webcrawling, data gathering, image-processing, information forensics

Abstract

This article highlights legal, ethical and scientific problems arising from the use of large experimental datasets gathered from the Internet, in particular image datasets. Such datasets are currently used in research on topics such as information forensics and image-processing. The paper strongly recommends against webcrawling as a means of generating experimental datasets, and proposes safer alternatives.

Author Biography

Graeme Baxter Bell, Murdoch University

Graeme Bell is a Lecturer in the Faculty of Creative Technologies and Media at Murdoch University in Perth, Western Australia. He holds a Ph.D. in computer science from the University of St Andrews, U.K. He was the top Science graduate from the University of St Andrews in 2001 and also the winner of the 2001 Scottish Young Software Engineer of the Year award. His research interests include artificial intelligence, bioinformatics, robotics, image-processing, steganography, and the Internet.

Published

2010-02-06

How to Cite

Bell, G. B. (2010). The dangers of Webcrawled datasets. First Monday, 15(2). https://doi.org/10.5210/fm.v15i2.2739