Budka, M., Bennett, M. R., Reynolds, S. C., Barefoot, S., Reel, S., Reidy, S. and Walker, J., 2021. Sexing white 2D footprints using convolutional neural networks. PLoS ONE, 16 (8), e0255630.
Full text available as:
PDF: 2108.01554v1.pdf (Accepted Version, 3MB). Available under License: Creative Commons Attribution Non-commercial.
Official URL: http://dx.doi.org/10.1371/journal.pone.0255630
DOI: 10.1371/journal.pone.0255630
Abstract
Footprints are left, or obtained, in a variety of scenarios, from crime scenes to anthropological investigations. Determining the sex of a footprint can be useful when screening such impressions, and attempts have been made to do so using single or multi-landmark distances, shape analyses and the density of friction ridges. Here we explore the relative importance of different components in sexing two-dimensional foot impressions, namely size, shape and texture. We use a machine learning approach and compare it to more traditional methods of discrimination. Two datasets are used: a pilot dataset collected from students at Bournemouth University (N = 196) and a larger dataset collected by podiatrists at Sheffield NHS Teaching Hospital (N = 2677). Our convolutional neural network can sex a footprint with an accuracy of around 90% on a test set of N = 267 footprint images using all image components, which is better than an expert can achieve. The quality of the impressions affects this success rate, but the results are promising, and in time it may be possible to create an automated screening algorithm with which practitioners, whether medical or forensic, can obtain a first-order sexing of a two-dimensional footprint.
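The authors' scripts are linked under Additional Information below. As a rough illustration of the kind of model the abstract describes, the following is a minimal sketch of a binary convolutional classifier for greyscale footprint images, written in Python with TensorFlow/Keras. The network depth, image resolution, directory layout and training settings here are assumptions made for illustration, not the configuration used in the paper.

```python
# Illustrative sketch only: a small CNN that predicts P(male) for a
# greyscale 2D footprint image. All sizes and paths are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (160, 320)  # assumed (height, width) of a scanned footprint


def build_footprint_sexing_cnn():
    """Binary classifier: sigmoid output interpreted as probability of 'male'."""
    model = models.Sequential([
        layers.Input(shape=(*IMG_SIZE, 1)),
        layers.Rescaling(1.0 / 255),          # map pixel values to [0, 1]
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # two classes: female / male
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    # Hypothetical directory layout: footprints/<female|male>/*.png
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "footprints", labels="inferred", label_mode="binary",
        color_mode="grayscale", image_size=IMG_SIZE,
        validation_split=0.1, subset="training", seed=42)
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "footprints", labels="inferred", label_mode="binary",
        color_mode="grayscale", image_size=IMG_SIZE,
        validation_split=0.1, subset="validation", seed=42)
    model = build_footprint_sexing_cnn()
    model.fit(train_ds, validation_data=val_ds, epochs=10)
```

A model of this form outputs one probability per image and is thresholded at 0.5 to assign a sex. The roughly 90% test accuracy reported in the abstract refers to the authors' own trained network and data, not to this sketch.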
Item Type: Article
ISSN: 1932-6203
Additional Information: Data Availability: The data on which this study is based is available for replication purposes at https://doi.org/10.18746/bmth.data.00000157. The scripts are available at https://github.com/bosmart. Funded by: Quantifying the mosaic: testing modern analogues for African palaeoenvironments; Reconstructing past environmental contexts in East and South Africa between 3.5 and 0.8 million years ago; Using fossil antelope teeth and remote sensing of contemporary African vegetation community types to reconstruct hominin habitats around the Omo-Turkana basin between 3.5–1.6 Ma ago; Using fossil antelope teeth to reconstruct hominin habitats in the Omo-Turkana basin between 3.5–1.6 Ma ago.
Uncontrolled Keywords: Adolescent; Adult; Aged; Aged, 80 and over; Algorithms; Female; Foot; Humans; Image Processing, Computer-Assisted; Imaging, Three-Dimensional; Machine Learning; Male; Middle Aged; Neural Networks, Computer; Pilot Projects; Sex Factors; Walking; Young Adult
Group: Faculty of Science & Technology
ID Code: 37622
Deposited By: Symplectic RT2
Deposited On: 04 Oct 2022 09:58
Last Modified: 04 Oct 2022 09:58