Sexing Caucasian 2D footprints using convolutional neural networks.

Budka, M., Bennett, M. R., Reynolds, S. C., Barefoot, S., Reel, S., Reidy, S. and Walker, J., 2021. Sexing Caucasian 2D footprints using convolutional neural networks. PLOS One, 16 (8), e0255630.

Full text available as:

PDF (OPEN ACCESS ARTICLE)
journal.pone.0255630.pdf - Published Version
Available under License Creative Commons Attribution.
3MB

PDF
Sexing a 2D footprint_Revised.pdf - Accepted Version
Restricted to Repository staff only
Available under License Creative Commons Attribution.
254kB

DOI: 10.1371/journal.pone.0255630

Abstract

Footprints are left, or obtained, in a variety of scenarios, from crime scenes to anthropological investigations. Determining the sex of a footprint can be useful in screening such impressions, and attempts have been made to do so using single or multi-landmark distances, shape analyses and the density of friction ridges. Here we explore the relative importance of different components in sexing two-dimensional foot impressions, namely size, shape and texture. We use a machine learning approach and compare it to more traditional methods of discrimination. Two datasets are used: a pilot dataset collected from students at Bournemouth University (N = 196) and a larger dataset collected by podiatrists at Sheffield NHS Teaching Hospital (N = 2677). Our convolutional neural network can sex a footprint with an accuracy of around 90% on a test set of N = 267 footprint images using all image components, which is better than an expert can achieve. The quality of the impressions affects this success rate, but the results are promising, and in time it may be possible to create an automated screening algorithm with which practitioners of whatever sort (medical or forensic) can obtain a first-order sexing of a two-dimensional footprint.
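The kind of model the abstract describes, a convolutional neural network that maps a 2D footprint image to a binary sex prediction, can be sketched as below. This is a minimal illustrative example in PyTorch, not the authors' published architecture (their scripts are linked under Additional Information); the network name, layer sizes and input resolution are all assumptions made for the sketch.

```python
# Hypothetical sketch of a binary-classification CNN for 2D footprint
# images, in the spirit of the approach the abstract describes. Layer
# widths and the 128x64 input size are illustrative assumptions, not
# the authors' actual configuration.
import torch
import torch.nn as nn

class FootprintSexCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # low-level edge/texture filters
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # higher-level shape cues
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
        )
        self.classifier = nn.Linear(32, 1)  # single logit for the binary decision

    def forward(self, x):
        x = self.features(x)
        x = torch.flatten(x, 1)
        return torch.sigmoid(self.classifier(x))  # probability in [0, 1]

model = FootprintSexCNN()
batch = torch.rand(4, 1, 128, 64)  # four dummy grayscale footprint images
probs = model(batch)
print(probs.shape)  # torch.Size([4, 1])
```

Thresholding the output probability at 0.5 would give a first-order male/female screening decision of the sort the abstract envisages for practitioners.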

Item Type: Article
ISSN: 1932-6203
Additional Information: Data Availability: The data on which this study is based is available for replication purposes at: https://doi.org/10.18746/bmth.data.00000157. The scripts are available at https://github.com/bosmart.
Group: Faculty of Science & Technology
ID Code: 35819
Deposited By: Unnamed user with email symplectic@symplectic
Deposited On: 22 Jul 2021 16:06
Last Modified: 06 Sep 2021 10:47
