
ISSN: 2169-0170
Journal of Civil & Legal Sciences

The Unwritten Laws of American Fingerprinting

Logan Stickel*

University College London, UK

*Corresponding Author:
Logan Stickel
University College London, UK
Tel: +1 (415) 942 4387
E-mail: lstickel@hushmail.com

Received Date: September 22, 2016; Accepted Date: September 28, 2016; Published Date: September 30, 2016

Citation: Stickel L (2016) The Unwritten Laws of American Fingerprinting. J Civil Legal Sci 5:210. doi: 10.4172/2169-0170.1000210

Copyright: © 2016 Stickel L. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


Abstract

The opprobrious state of fingerprinting has prompted the need for serious methodological revision. A chronological examination of the development of fingerprinting reveals that the unchallenged historical premises of individualization in matching protocols led to the superficial evolution of collection and identification techniques that not only lack purposeful systemization and standardization, but are comparatively absent of the scientific rigor found elsewhere in forensic science. In light of the new evidential requirements created under the Daubert Standard and the follies demonstrated in significant miscarriages of justice, fingerprinting processes must be steeled to avoid becoming a biometric anachronism. To bolster its adjudicative and scientific validity, fingerprinting should become a multimodal fusion examined in accordance with a unified organizational process. Findings similarly must be reported using Bayesian statistics, whereby relative error rates are given, and should be judged circumstantially, decreasing their evaluative weight. As remedies to these validity issues already exist, the challenge remains largely in their utilization in social systems.

Keywords

Fingerprinting; Methodology; Techniques; Standardization; Miscarriages of justice; Bayesian statistics; Multimodal; History

Introduction

Fingerprints are arguably the epitome of false belief in criminal investigations. Once heralded as the infallible individual identifier, they are now subject to significant criticism in light of highly publicized miscarriages of justice and when juxtaposed with the scientific rigor of more contemporary forensic techniques [1]. Utilizing methodological approaches predating the 20th century, the foundational logical premises of fingerprinting have remained in relative stasis due to legal precedent and ubiquitous use, effectively grandfathering dangerous tautological frameworks [2]. Nonetheless, the current crisis of validity is not irreparable, for the issues stem not from the concept of individualization but from the absence of uniform systematization [3]. Thus the mandate for standardization means prescriptive measures must be taken to create universal procedures that separate scientific fact from conjecture, steel recognition formulas, and acknowledge identification errors.

To deconstruct the confounding issues of validity that surround fingerprinting and to contextualize corrective measures, the following analysis will ensue. First, the history of fingerprinting as a form of scientific identification will be reviewed to uncover the premises upon which current methodologies rest. Second, these premises will be used to demonstrate their role in the shortcomings of contemporary techniques. Third, the questioned status of fingerprinting will be surveyed through key refutations to understand pressing sociolegal demands. Lastly, the core revisions and adaptations necessary to revamp fingerprinting will be discussed. This chronological approach serves to demonstrate how historic traditions are the root source of current folly and why it will be necessary to purge antiquated ideologies for fingerprinting to become robust, statistically determined evidence.

Historical Foundations

Originations

Recorded history reveals fingerprints were first used as a true form of individual identification a millennium ago in China, where they were used as part of seals, illustrating the long union between fingerprints and the law [4]. Although the notion of fingerprints being unique biological signatures has remained, they were not studied with scrutiny until the late 1800s in Western Europe, when justice systems for the first time sought to scientifically catalogue criminals [4]. In this tumultuous environment of criminological change, Francis Galton crafted his seminal work on the biometric examination of fingerprints, demonstrating that the topographical features of a fingerprint could be described precisely by an analysis of the following morphological themes: arches, loops and whorls. After this initial observational filtering, more precise inspection could then be made by tracing friction ridge paths, looking for specific breaks, enclosures, bifurcations, and islands [5].

Scotland Yard in 1901 took these axioms and incorporated them into the Henry System, which utilized all ten fingerprints and described each within one of the three thematic codes. Based on the code and relative finger position, numeric values would then be assigned, creating 1,024 divisions for administrative filing and tracking [6]. The motifs of these categorical procedures were subsequently transplanted globally and remain the logical foundation underpinning fingerprint classification systems to date [4].
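To make the arithmetic behind those 1,024 divisions concrete, the sketch below reproduces the primary classification step of the Henry System as it is conventionally described: whorl-bearing fingers receive fixed weights and the weighted sums form a filing ratio. The finger numbering, weights, and function name here are illustrative conventions for this example, not an operational filing implementation.

```python
# Illustrative sketch of the Henry System's primary classification arithmetic.
# Conventional finger numbering assumed: right thumb = 1 ... left little = 10.
WHORL_WEIGHTS = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4, 6: 4, 7: 2, 8: 2, 9: 1, 10: 1}

def henry_primary(whorl_fingers):
    """Return the primary classification ratio (numerator, denominator)
    for the set of finger numbers (1-10) that exhibit a whorl pattern."""
    numerator = 1 + sum(WHORL_WEIGHTS[f] for f in whorl_fingers if f % 2 == 0)
    denominator = 1 + sum(WHORL_WEIGHTS[f] for f in whorl_fingers if f % 2 != 0)
    return numerator, denominator

# Each term ranges from 1 to 32, so the scheme yields 32 x 32 = 1,024 divisions.
print(henry_primary({2, 3, 6}))  # (21, 9)
```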

Usage

As fingerprinting became vogue in justice systems, it was soon used not only for offender record-keeping, but in ongoing investigations. Adhering to the Locard Exchange Principle, the active detection of fingerprints was sought at crime scenes to demonstrate an individual’s presence [7]1. It was known that offenders, in their active manipulation of the target and environment, often left traces, especially with their hands. Unless the tactile surfaces of the hands were covered, impressions would be left on items touched [8]. As the only widely recognized type of individualizing forensic evidence prior to DNA profiling, other forms being largely class identifiers, collection was deemed a high priority for its probative adjudicative value [9]2.

Legal precedence

When fingerprint evidence was first admitted in courts in the early 20th century, it was viewed with skepticism, but after People v. Jennings (1911) it began to be taken largely at face value, ushering in an era of de jure recognition of the “fingerprint examiner’s fallacy” [10]. The later creation of the Frye Standard in Frye v. United States (1923) meant that in America all forensic evidence became subject to the criteria of case relevance and general methodological acceptance [11]. This precedent greatly influenced court proceedings for over half a century, allowing profuse use of trending scientific approaches to arguably bring guilty verdicts in many crimes that previously would not have been convictable.

Uncovered premises

Fingerprints have been assumed to be unique, persistent biometric identifiers, largely negating the possibility of error and of extraneous factors such as environmental mutability [12]. This led to the notion that fingerprints carried significant evidential weight and routinely meant they were invoked with a higher standard than other forensic evidence, even as fingerprinting methods remained in relative stasis. Although in hindsight the roots of these tenets appear sound, blind adherence to them obfuscates modern methodologies.

Shortcomings of Contemporary Techniques

Superficial evolvement

With the underlying foundations largely unchallenged since their legal inauguration, processes were free to evolve without restriction. Major advancements came predominantly in specimen collection and detection techniques as well as in the computerization of data and matching schemes [4]. These have primarily succeeded in maximizing the processing capabilities of law enforcement professionals to aid investigations [13,14]. The major drawback is that the accuracy of these fingerprint identification schemes was largely assumed, as the historical premises of deducing a match remained.

Collection

The procurement of prints has markedly improved with the advent of modern reagents, powders and specialized lighting [15]. This has particularly benefited the detection of latents and partials, which can be illuminated, transposed and reconstructed without significantly degrading specimen quality [4]3,4. Regrettably, many forensic collectors, despite their advanced toolset, are not cognizant of the strictures involved in avoiding evidence contamination, as very few jurisdictions require specific educational credentials and training programs are often minimal [16].

For any fingerprint specimen to be useful, though, it has to match either a set of prints on file or that of a particular suspect. Here the issue of validating the comparison source and ensuring it is a high-quality exemplar is pivotal, but this is another problem area [17]. Many exemplars are not taken properly by law enforcement personnel, and thus those on file may not be accurate representations of the subject [18]. Even with the best technology, a valid match cannot, by definition, be made against a dubious exemplar.

Cataloging

The creation of networked computer repositories and modern high-resolution scanning devices led to a mass digital aggregation of criminal biometrics. The largest centralized system of its type is Next Generation Identification (NGI), which is based upon the architecture of the Integrated Automated Fingerprint Identification System (IAFIS)5. NGI is a comprehensive system containing over 100 million individual files that incorporates individual demographics with fingerprint, palm, iris and face scans [19]. The combinative value of these significantly improves detection profiles, as they serve as probabilistic multipliers [20]. This repository, and most emulations thereof, allows for cross-comparison searches in subscribed jurisdictions, which has naturally led to more unified reporting practices. The speed at which searches can be made has expedited overall investigations and apprehensions, bolstering criminal procedure [13]. They also ensure that data, visible to a plethora of agencies, is not manipulated or obfuscated, given this inter-agency accessibility.

The issue with this system is, once again, the component of human error. The direct linkage of local jurisdictions with larger databases means that individuals at the micro level can be responsible for the upload of input errors at the macro level, plaguing the entire system [17]. The power of the system similarly leads to an increased perception of verisimilitude, when in reality an inherent skepticism should still be maintained to reduce error [21]. These systems are also not inclusive of all citizens within their jurisdictions, further limiting their efficacy.

Matching

Matching has historically been a difficult endeavor due to intraclass variations, differences within the same print specimen caused by transposition discrepancies and the condition of the dermis at the time of surface contact [18]6. Interclass variations have also been problematic, as the comparison of fingers from the same subject can be made more difficult when prints exhibit multiple themes, a semantic limitation of the Henry System [13]. As a result, identification schemes have become increasingly dehumanized in the aim of increasing accuracy and efficiency.

Generally, prints are digitally uploaded and processed through pattern recognition programs that induce matches based on the number of commonalities with exemplars [22]. These programs typically use one or more calculative approaches in an analysis of increasing scale, moving from minutiae to skeletal correlations of the entire print [23]. After this initial computerized filtering, a human examiner is generally required to assess the match [18]. Unfortunately, there are no uniform regulations that programs or examiners must abide by. And despite such technological advancements, fingerprint examiners remain the final intrinsic part of match verification [24].
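As an illustration of what such pattern recognition programs do at the minutiae stage, the sketch below scores two prints by greedily pairing minutiae that fall within spatial and angular tolerances. It is a minimal teaching example under assumed tolerances, not the algorithm of any particular AFIS product; the data structures and thresholds are hypothetical.

```python
import math

def match_score(probe, exemplar, dist_tol=10.0, angle_tol=0.3):
    """probe/exemplar: lists of (x, y, theta) minutiae.
    Returns the number of greedily paired minutiae as a crude similarity score."""
    unused = list(exemplar)
    paired = 0
    for (x, y, t) in probe:
        for cand in unused:
            cx, cy, ct = cand
            # Pair when spatial distance and wrapped angular difference are small.
            if math.hypot(x - cx, y - cy) <= dist_tol and \
               abs(math.atan2(math.sin(t - ct), math.cos(t - ct))) <= angle_tol:
                unused.remove(cand)
                paired += 1
                break
    return paired

probe = [(10, 12, 0.1), (40, 55, 1.2), (70, 20, 2.0)]
exemplar = [(11, 13, 0.15), (42, 53, 1.1), (200, 200, 0.0)]
print(match_score(probe, exemplar))  # 2 of 3 minutiae paired
```

In practice, the resulting count is compared against a threshold or converted into a calibrated score; the absence of uniform regulation noted above means both the threshold and the calibration vary between programs and examiners.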

Macro dilemmas for validity

The validity of contemporary fingerprint examination essentially reduces to human and systems failures. As the fingerprint examiner retains the supreme determinative role in identifications, which the legal system reinforces, they must be highly versed in the science of fingerprint analysis and adhere to consistent, documentable procedures [25]. The systems failure represents a larger issue, as it is inclusive of the human element itself. The fact that human error presents such a conflagration is indicative of the lack of quality controls in the systems in place [26].

Publicized Inquiry

New burdens of proof

Amidst controversy over the usage of forensic evidence under the Frye Standard, the superseding Daubert Standard was created in Daubert v. Merrell Dow Pharmaceuticals (1993). Elaborated subsequently in General Electric Co. v. Joiner (1997) and Kumho Tire Co. v. Carmichael (1999), the cases, collectively known as the Daubert Trilogy, posit five general criteria: evidence must be falsifiable, peer reviewed, have a known error rate, disclose its techniques, and be generally accepted [27]. Furthermore, as expressed in Federal Rule of Evidence 702, the expert witness must be subject to the same scrutiny as the premised method itself, which acknowledges the intrinsic human subjective element in identifications [28]. Under the christened Daubert Standard, in United States v. Havvard (2000) and United States v. Plaza (2002), it was professed that fingerprint examinations were liable to challenge due to the absence of systemization [2]. Fingerprinting, of course, was not outright discounted, hence its continued use, but was merely determined to lack the rigor of a modern scientific discipline.

Miscarriages of justice and the press gambit

During the same time period, numerous miscarriages of justice brought the issue of false positive fingerprint identifications onto mainstream news circuits, to palpable hysteria. As proffered by Dr. French at a recent presentation at UCL, two cases that epitomize the potential for error are those involving Brandon Mayfield (US) and Shirley McKie (UK). Each demonstrates gross negligence not only by individual examiners, but on the part of whole criminal investigative agencies, despite internal controls. In these cases, erroneous fingerprint matching techniques led to accusations of criminal activity [29]. Mayfield was connected with the 2004 Madrid bombings by the FBI despite having no affirmable affiliation and was covertly monitored for months [30,31]. McKie was made a prime suspect in the 1997 Kilmarnock murder by the Scottish Criminal Record Office and was even tried in open court [32]. In both cases, myopia and false belief in the match-determination methodology blinded investigators to competing theories and led to the seeking of evidence to justify a false initial premise.

The National Research Council found the examiner error rate to be a relevant cause for concern, especially the 7.5 percent false negative figure. Additionally, inter-rater reliability was subject to considerable variance, suggesting serendipitous practices [31]. As case law suggests, these subjectivities are the source of much current legal ensnarement. When this potential for error is viewed retrospectively, it presents a huge moral dilemma. In America alone, it is estimated that over 100,000 indictments have been made on a general preponderance of fingerprint evidence, evidence now known to be refutable and fallacious [21]. This issue is at the forefront of all prospective changes to fingerprinting practice.

Systemic Revision

Organizational standardization

Fingerprinting dissension, though harkened in Daubert, is rooted in the need for parallelism in matching so that adjudications are consistent in all venues and circumstances. Courts in the United States at this date generally require only six to eight matching points, a modest sum, but this is subject to increase, and courts are increasingly requiring that the procedure and error rate be declared in accordance with federal edicts [33]. The simple delineation of the ACE-V system is quickly leading to its adoption as the new standard framework in fingerprint examination, and it should thus be implemented in jurisdictions requiring change [31]7. Unfortunately, ACE-V in itself does not bring match standardization; it only affects the general administrative organization [21]. Nevertheless, as the examples of Mayfield and McKie illustrate, bureaucratic structures that are not critical of the evidence they develop are just as suspect for the construction of error as the analysis of ridge minutiae.

Bayesianism and probabilities

It has been proposed that, as in many other arenas of scientific identification, a form of probabilistic identification be used instead of or in conjunction with a verbal scale, so as to objectify reporting and further inhibit the use of absolute match determinations [34]. Ideally a likelihood ratio (LR) should be used in which the approximated error rate is stated [35]. The inclusion of posterior odds would further increase the dynamism of reporting, for they combine LRs with prior odds to represent possible change [36]. These figures, when used in the true Bayesian approach of competing theories, emphasize the concept of falsification that is so much a part of science.
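A worked example of this reporting style is sketched below: Bayes' rule in odds form multiplies the likelihood ratio by the prior odds, and the result can be converted to a posterior probability. The LR and prior odds used are hypothetical values chosen only to show that a large LR combined with weak prior odds can still leave the source proposition improbable, which is precisely why error rates and priors should be reported rather than an absolute match asserted.

```python
def posterior_odds(likelihood_ratio, prior_odds):
    """Bayes' rule in odds form: posterior odds = LR x prior odds."""
    return likelihood_ratio * prior_odds

def odds_to_probability(odds):
    """Convert odds to a probability."""
    return odds / (1 + odds)

lr = 10_000          # hypothetical: the comparison is 10,000x more likely if same source
prior = 1 / 100_000  # hypothetical prior odds from the non-fingerprint evidence
post = posterior_odds(lr, prior)
print(post, odds_to_probability(post))  # 0.1 -> ~0.09 probability of same source
```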

This transition would likely bring an end to the deleterious theory of absolute individualization in fingerprinting that has tainted judicial judgments [21]. Given omnipresent error rates, individualization should functionally be thought of as a fallacy that only serves to unduly contaminate judgments, particularly by swaying jurors. Although the notion of singularity and personal uniqueness of fingerprints is well supported by medical research, it must be admitted that testing procedures do not allow for an absolute determination [37]. Imperfection is a given in any system and thus must be acknowledged [26].

Adjudging on circumstantiality

One area of needed improvement is the apportioning of appropriate evidential weight to a fingerprint match. In most common law systems, there are two constitutive types of evidence, direct and circumstantial [38]. This demarcation is determined essentially by a sentient presence at the event in dispute, rendering eyewitness testimony direct evidence. Other forms of evidence, including fingerprints and most forensics, are considered circumstantial [33]. By definition, circumstantial evidence should only be used collectively with a preponderance of evidence to prove guilt beyond a reasonable doubt. Historically, fingerprints have not been used in this manner and have routinely been the only piece of evidence used in prosecutions [21]. Similarly, eyewitness testimony, although considered direct evidence, has consistently been shown through decades of research to be highly subject to error, with memories subject to deterioration and internal biases [39]. Thus, even today, judges do not recognize the boundaries of evidential types, falling heavily upon interpretation and discretion, which should be curtailed by enhanced regulatory oversight.

Incorporating multimodalities

To further validate fingerprint matching processes, they should move beyond the predilection for analyzing only ridge patterns. The multimodal fusion of other finger biometric identifiers serves to add statistical power to matches, which would decrease the potential for specious determinations [40]. Two features that can be readily mapped with modern fingerprint scanners are the formations of pores and veins. A larger sum of development has been put into pores (i.e., poroscopy), as they remain at the surface of the integument and can also use a point-based mathematical identification process [41-43]. Veins, although subcutaneous, can be detected with advanced imaging, which is advantageous as it makes forgeries extremely difficult to create [44].
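A simple way to see the statistical power gained from such fusion is score-level combination of per-modality likelihood ratios, sketched below under an assumed independence of the ridge, pore and vein comparisons; the per-modality values are hypothetical.

```python
def fused_lr(lrs):
    """Combine per-modality likelihood ratios under an independence assumption."""
    combined = 1.0
    for lr in lrs:
        combined *= lr
    return combined

# Hypothetical LRs for ridge, pore and vein comparisons respectively:
print(fused_lr([500, 40, 25]))  # 500,000 -- far stronger than any single modality
```

The multiplicative combination is what the text above calls a probabilistic multiplier: each additional modality that independently favors the same source proposition compounds the overall evidential weight.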

Conclusion

Fingerprints will not soon be voided, by nature of their integration into contemporary identification procedures. Nonetheless, the current opprobrious state demonstrates that significant revision to the historic axioms of fingerprinting methodologies is warranted. The issues presented by the media and the judicial system demonstrate that the human and system failures that are part of recent folly require increased standardization and adherence to scientific procedure to mitigate threats to validity. As the solutions already exist, the challenge lies in their implementation in complex social systems. In closing, it should be understood that the very term forensic science denotes an amalgam of legal and scientific traditions which, although fundamentally different, are linked in their purpose of ascertaining the truth. Thus fingerprint evidence must be sufficiently contextualized in each hemisphere to have legitimate continuance into the 21st century and beyond.

1Locard Exchange Principle posits that every environmental interaction leaves a forensic trace.

2Class evidence can only be used to identify group characteristics; it cannot individualize.

3Latents are invisible to the naked eye and are formed by oily dermal secretions.

4Partials are incomplete fingerprints.

5NGI became operational in September 2014.

6Dermis condition is defined by superficial state (i.e., oil content, swelling, abrasions, etc.).

7The ACE-V acronym stands for Analyze, Compare, Evaluate and Verify, which is a procedural emulation of the scientific method.

References
