
Implementing State-of-the-Art Deep Learning Approaches for Archaeological Object Detection in Remotely-Sensed Data: The Results of Cross-Domain Collaboration

Open Access | Dec 2021

Abstract

The ever-increasing amount of remotely-sensed data pertaining to archaeology renders human-based analysis unfeasible, especially considering the expert knowledge required to correctly identify structures and objects in this type of data. Therefore, robust and reliable computer-based object detectors are needed, which can deal with the unique challenges of not only remotely-sensed data, but also of the archaeological detection task.

In this research, a cross-domain collaboration between archaeology and computer science, the latest developments in object detection and Deep Learning (for both natural and satellite imagery) are used to develop an object detection approach based on the YOLOv4 framework and modified for the specific task of detecting archaeology in remotely-sensed LiDAR data from the Veluwe (the Netherlands). Experiments show that a general version of the YOLOv4 architecture outperforms current object detection workflows used in archaeology, while the modified version of YOLOv4, geared towards the archaeological task, reaches even higher performance. The research shows the potential and benefit of cross-domain collaboration, where expert knowledge from different research fields is used to create a more reliable detector.
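To illustrate the kind of detection pipeline the abstract describes, the sketch below runs a Darknet/YOLOv4 model over a LiDAR-derived visualisation tile using OpenCV's DNN module. This is a minimal sketch, not the authors' implementation: the file names (yolov4-archaeology.cfg, yolov4-archaeology.weights, lidar_tile.png), the 416x416 input size, and the confidence/NMS thresholds are assumptions for illustration only.

```python
import cv2
import numpy as np

# Hypothetical config/weights; the paper's trained model is not reproduced here.
net = cv2.dnn.readNetFromDarknet("yolov4-archaeology.cfg", "yolov4-archaeology.weights")
output_layers = net.getUnconnectedOutLayersNames()

# A LiDAR-derived visualisation (e.g. a hillshade raster) exported as an image tile.
image = cv2.imread("lidar_tile.png")
h, w = image.shape[:2]

# Normalise to [0, 1], resize to the network input size, and convert BGR to RGB.
blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(output_layers)

# Collect bounding boxes above a confidence threshold.
boxes, confidences = [], []
for output in outputs:
    for det in output:
        scores = det[5:]
        confidence = float(scores.max())
        if confidence > 0.5:
            # YOLO outputs a normalised centre (cx, cy) and size (bw, bh).
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            confidences.append(confidence)

# Non-maximum suppression to merge overlapping detections.
keep = cv2.dnn.NMSBoxes(boxes, confidences, score_threshold=0.5, nms_threshold=0.4)
print(f"{len(keep)} candidate archaeological objects detected")
```

In practice, a large LiDAR survey would be cut into overlapping tiles, each tile passed through the detector as above, and the resulting boxes mapped back to geographic coordinates for validation by archaeologists.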

DOI: https://doi.org/10.5334/jcaa.78 | Journal eISSN: 2514-8362
Language: English
Submitted on: Jul 5, 2021 | Accepted on: Oct 25, 2021 | Published on: Dec 8, 2021
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2021 Martin Olivier, Wouter Verschoof-van der Vaart, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.