Introduction
39.1. Any review of practice has to be pragmatic and consider the efficient use of resources. Perspective is given by the statistics in chapter 34 which show that the vast majority of fingerprint work contributes to police intelligence without resulting in a need for evidence to be prepared for use in court. Speed may be at a premium in that context and it may be desirable to keep bureaucratic requirements to a minimum so as not to diminish effective and efficient detection of crime and criminals. Imposing strict operational constraints on fingerprint examiners in every circumstance could unduly impede the efficiency of their work. The studies referred to in chapter 35 are relevant to striking a balance.1 They suggest that there is less variability in the conclusions of practitioners when dealing with marks at either extreme of a spectrum of quality and that variability, and hence the scope for error, is greater when dealing with 'complex' marks.
39.2. The need to balance the efficiency of fingerprint comparison work against the tendency for variability to have most impact in the case of complex marks supports the view of Professor Champod that a process should be designed whereby simple marks and complex marks are handled differently.2
39.3. The focus is on complex marks generally and not simply those that may be perceived at the time of comparison to have evidential significance. It has to be recalled that when Y7 was first identified as Ms McKie's mark it could not readily have been foreseen that the finding would have evidential significance, since it was simply the elimination of one of a number of individuals believed to have had an innocent reason for being present at the scene of the crime. Procedures should allow for the fact that what may initially be an insignificant elimination may turn out to be a critical identification, and that fingerprint examiners cannot know with certainty that their findings will not have relevance beyond the intelligence gathering phase of an investigation.
39.4. The working assumption is that the complexity will derive from the quality of the mark but the same logic would apply where the comparison is complex for any other reason.
Definition of 'complex' marks
39.5. Some marks are of poorer quality than others and such marks were described at the Inquiry as being 'complex' or 'difficult'.
39.6. Professor Champod explained that the definition of a complex mark is not resolved in specialist literature and that there is room for research to provide a definition.3 Some witnesses provided guidance as to what is meant. The practice in the Netherlands is discussed later in this chapter. Professor Champod stated that some Swiss identification bureaux use a 12-point rule and other criteria to make a distinction between complex and simple cases. If a mark has fewer than 12 characteristics it will be considered as complex, as will a mark with signs of disturbance, difficulties or what practitioners would refer to as 'red flags'.4 Mr Chamberlain said that a complex or difficult mark was one where the characteristics were not very distinct within the image, with a lot of distortion or overlays. A mark with between six and eight observable characteristics would be characterised as complex.5
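By way of illustration only, the kind of triage rule described in the Swiss evidence can be sketched as follows. The point threshold, the notion of a 'red flag' list and all names used here are assumptions for the purpose of the sketch, not an agreed standard; as the evidence makes clear, no such standard yet exists.

```python
# Illustrative sketch of a hypothetical complexity triage rule combining a
# Swiss-style 12-point threshold with 'red flag' indicators (distortion,
# overlays, signs of disturbance). Names and thresholds are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MarkAssessment:
    observable_characteristics: int
    red_flags: List[str] = field(default_factory=list)  # e.g. "distortion"

def is_complex(mark: MarkAssessment, point_threshold: int = 12) -> bool:
    """Route a mark to the 'complex' process if it falls below the point
    threshold or shows any sign of disturbance ('red flags')."""
    if mark.observable_characteristics < point_threshold:
        return True
    return bool(mark.red_flags)

# A mark with 16 clear characteristics and no red flags would be routed to
# the ordinary process; the same count with visible distortion would not.
print(is_complex(MarkAssessment(16)))                  # False
print(is_complex(MarkAssessment(16, ["distortion"])))  # True
print(is_complex(MarkAssessment(8)))                   # True
```

The sketch also illustrates the limitation discussed in the following paragraphs: under a purely numerical criterion, examiners who counted 16 characteristics in Y7 would have been justified in treating it as not complex.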
39.7. Mrs Tierney was unaware of any SPSA instructions specifically about the assessment of the quality of marks.6 She thought that creating a prescriptive procedure for the assessment of complexity would be challenging since each mark has to be judged on its own merits7 and she doubted whether it would be beneficial to set a rigid statistical criterion for the quality of marks.8 That is exemplified by the experience with Y7. If a mark were to be defined as 'complex' solely by the criterion that fewer than, say, 12 characteristics can be observed, those who identified 16 characteristics would have been justified in treating it as not complex. The same could be said of QI2 Ross.
39.8. The absence of an agreed objective standard by which the complexity of a mark can be assessed creates difficulties in establishing whether a mark is 'complex', but those difficulties may be overstated: the evidence of Mr Chamberlain and Professor Champod suggests that complexity is capable of a 'high level' definition. For example, where one of the examiners involved in the ACE-V process has had doubts sufficient to merit seeking advice from a colleague, that would seem plainly to be a case of a 'complex' mark.9 Further research is required to identify factors that may produce either variability in the opinion of examiners or 'error', and that can inform the decision whether a mark is 'complex' and merits special procedures. The Metropolitan Police 'tipping point' study10 is a good example of research to ascertain the critical variables that produce practitioner inconsistency. The June 2011 OIG Review reports that after the Mayfield case the FBI and other organisations conducted research on a number of fronts, including a study to measure the accuracy and consensus of latent print examiner decisions and the development of software to provide examiners with a tool to assess fingerprint quality and a quantitative metric capable of measuring sufficiency.11 The results of that research merit close scrutiny.
Position in practice
39.9. SPSA does not have a separate process for complex marks.
39.10. Some bureaux do treat complex marks differently. At the FSS, note-taking and examination tend to be more extensive if the mark is complex and/or of poor quality.12 At the Metropolitan Police a fourth verification is undertaken of marks with low levels of disclosed detail and no unexplained features in disagreement.13 In Switzerland the level of training of examiners who are allowed to verify a complex case is different.14
The Netherlands: 'multiple procedure'
39.11. In the Netherlands a distinction is drawn between complex marks and other marks. In an ordinary case a total of two examiners will be involved in the identification and verification of a mark but in the case of a complex mark a total of three examiners are engaged15 and they follow a specific procedure: the 'multiple procedure'.16 That procedure is used in a number of situations, some of which indicate complexity, for example where examiners differ in their conclusions, where an examiner thinks that a mark is of borderline quality, or when an examiner finds anything questionable in his identification.17
39.12. Key aspects of the procedure are:
(i) Three independent experts look at the mark separately. They will not have had previous involvement with the mark. Each is provided with a copy of the mark and the fingerprint form. They work separately and complete a detailed form.18
(ii) The examiner carries out an analysis of the mark and records the findings digitally on the image of the mark using software. On the form the examiner records his views about matters such as quality, how many points are present, the location of the points, the significance of the points and possible problems in the mark. The examiner will ascribe confidence levels to the features at this stage. At the end of the analysis stage the examiner considers if the mark is suitable for identification. It is rare for an examiner to proceed to a comparison if the mark is not considered to be suitable.
(iii) During comparison the examiner will grade points of similarity and note any differences and any explanation for them. The findings are recorded. If an examiner has designated a feature as low value during the analysis stage it should not be upgraded.
(iv) A separate evaluation stage follows, and the policy in the Netherlands is to postpone evaluation until analysis and comparison are complete. At this stage the examiner carries out an evaluation, records his conclusion and notes any points to discuss with colleagues.
(v) Once each examiner has completed ACE the examiners consult and discuss their findings in detail.19 They discuss their analyses first and, if everybody agrees that a comparison should have taken place, they then discuss the comparison. Thereafter the examiners discuss their respective evaluations, both the points of similarity and the points of difference. The mark is identified only if the examiners agree that the requisite standard is met (in the Netherlands examiners operate to a numerical standard of 10 to 12 points depending on the clarity of the mark).20 The examiners must agree not only on the decision but also on each point relied upon in the identification. In arriving at their conclusion they proceed on the lowest common denominator, so it is possible for a decision to be reached that a mark is not identified even where all three have independently found 12 or more points of similarity, if they differ as to the points of similarity and, for example, after discussion can agree on no more than nine points. There is, though, the option for Mr Zeelenberg still to give evidence to the Dutch courts in that situation, and he would do so on the basis that he was 'unable to exclude'.21
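The 'lowest common denominator' consensus described in (v) can be sketched as a simple set operation. The examiner findings, point labels and 10-point standard below are invented for illustration; the actual Dutch procedure involves discussion and judgment, not a mechanical intersection.

```python
# Illustrative sketch of the 'lowest common denominator' consensus in the
# Dutch multiple procedure: only points of similarity on which every
# examiner agrees count towards the numerical standard.
from typing import List, Set

def agreed_points(findings: List[Set[str]]) -> Set[str]:
    """Return only the points of similarity found by all examiners."""
    result = set(findings[0])
    for f in findings[1:]:
        result &= f
    return result

def mark_identified(findings: List[Set[str]], standard: int) -> bool:
    """Identification requires the agreed points alone to meet the
    standard, even if each examiner individually found more."""
    return len(agreed_points(findings)) >= standard

# Each of three examiners independently finds 12 points, but they overlap
# on only nine, so the mark is not identified against a 10-point standard.
examiner_a = {f"p{i}" for i in range(1, 13)}                     # p1..p12
examiner_b = {f"p{i}" for i in range(1, 10)} | {"q1", "q2", "q3"}
examiner_c = {f"p{i}" for i in range(1, 10)} | {"r1", "r2", "r3"}
print(len(agreed_points([examiner_a, examiner_b, examiner_c])))  # 9
print(mark_identified([examiner_a, examiner_b, examiner_c], 10)) # False
```

This is the situation contemplated in the text: three independent counts of 12 or more points can still fall short of the standard once the examiners are required to agree point by point.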
Separate process for complex marks
39.13. The Dutch 'multiple procedure' accords with the key requirements that Professor Champod envisaged as being necessary for complex marks: more in-depth analysis, documentation and enhanced verification.
39.14. Analysis as part of the ACE-V process is discussed in chapter 36 and the need for documentation or note-taking in chapter 37. There is a need to analyse the whole mark, not just a target area, and for complex marks it is recommended that notes be taken at each stage of the ACE-V process.
39.15. In relation to enhanced verification, Professor Champod made two suggestions. The first was that blind verification should be considered for complex marks22 because there is a greater risk that examiners might be influenced by other examiners' findings.23 Secondly, the process of verification should not be confined to a consensus as to the conclusion but should extend to a technical review of the points relied upon by each examiner at the stages of analysis and comparison, because inconsistencies relating to individual points may highlight issues concerning the levels of tolerance being applied, the reliability of characteristics believed to be in common, or the explanations for any difference.24 That is consistent with the findings of the OIG in the Mayfield case and of this Inquiry in relation to Y7 and QI2 Ross.25
39.16. Differences between examiners, for example in relation to the interpretation of observable 'events', can raise questions about the reliability of the conclusion, and it should not be assumed that, merely because examiners agree on a finding of identification or exclusion, the finding is suitably robust if they differ on the routes by which they arrive at it. In particular, it is not necessarily sufficient that they agree that there is some common 'event' in mark and print if they differ as to the interpretation of the precise nature of the characteristic. Mr MacPherson and Mr Mackenzie, for instance, agreed the identification of QI2 Ross but with material inconsistencies between them as to their reasons: while they agreed at the generic level that there were matching 'events' at (a) SCRO points 1/10/16 and (b) SCRO points 11/12, they differed as to the interpretation of those events, with (a) being either an eyelet or a spur shape and (b) being either a lake or a bell. Those differences in interpretation were symptomatic of the lack of clarity in the mark and the application of inappropriate tolerances in the comparison.
39.17. A 'technical review' is a discussion between the examiners involved in the ACE-V comparison of a mark on the substance of their findings. The objective is to afford examiners an opportunity to reflect on any differences between them on matters of detail, with a view to assessing whether the detail on which they rely is sufficiently reliable to support the conclusion.
Commentary
39.18. The evidence to the Inquiry supports the conclusion that enhanced procedures require to be put in place for the handling of 'complex' marks.
39.19. While further research is required to identify the factors that may be relevant to the definition of 'complexity' for this purpose, in the meantime there are precedents to be found in practice in the Netherlands, Switzerland and England, and practitioners should be able, pragmatically, to know a complex mark when they see it.
39.20. There is merit in the proposal that the procedures for handling 'complex' marks should include a technical review of the substantive reasoning of each examiner. If examiners have resolved a doubt by mutually exclusive reasoning (take, for example, the conflicting interpretations by Mr MacPherson and Mr Mackenzie of SCRO points 1, 10 and 16, and 11 and 12 in QI2 Ross)26 the inconsistency ought to give rise to careful reflection on (a) the quality of the assumed similarities on which the conclusion depends or, as the case may be, (b) the cogency of the explanations for any difference. Requiring examiners to participate in a technical review would expose any inconsistency in reasoning and afford them an opportunity to reflect on the robustness of the finding, which is the essence of verification.
Questioned marks
39.21. When Ms McKie denied that she had gone as far as the bathroom in Miss Ross's house and thereby cast doubt on the identification of Y7 both the police and SCRO were presented with a novel situation and neither had appropriate procedures.27 They resorted to the ad hoc arrangements described in chapter 7 that suffered from the deficiencies identified in that chapter and in chapter 28.28
39.22. There should be a clear procedure to meet this eventuality. It is a matter for the police and COPFS whether they would wish the comparison to be checked by SPSA or by some other agency or person. Assuming that the police or COPFS require a fingerprint comparison to be reconsidered by SPSA because they are aware of a conflict with other evidence, or for any other reason, the mark should be referred to the review panel who should consider the matter in accordance with procedures as recommended in chapter 36.29
Recommendations
Research
39.23. Research should be undertaken into which marks ought to be assessed as complex.
Procedure for complex marks
39.24. The SPSA should develop a process to ensure that complex marks such as Y7 and QI2 Ross are treated differently. Such a process should include the following principal elements:
(i) Examination should be by three suitably qualified examiners.
(ii) Notes should be taken at each stage of ACE-V by every examiner involved in the process. Those notes should record the information specified in paragraph 116 of chapter 37.
(iii) No examiner should disclose his or her conclusion to another examiner until all three examiners have reached their independent conclusions.
(iv) After all three examiners have completed their individual comparisons they should meet and review the substantive basis of their conclusions. The reasons each has for their respective conclusions should be explored, even when they agree that an identification can be made. Any differences of opinion among them should be discussed in order to determine whether the conclusion is reliable. A note should be kept of the matters discussed at the technical review meeting.
Questioned marks
39.25. Where the police or COPFS require a fingerprint comparison to be reconsidered by SPSA for any reason the matter should be referred to the review panel to be addressed in accordance with the procedures recommended in chapter 36, paragraph 121.
1. See chapter 35 para 4ff
2. Professor Champod 25 November page 101
3. Professor Champod 25 November page 129; see, subsequently, the SWGFAST definition in 'Standard Terminology of Friction Ridge Examination' - http://www.swgfast.org/documents/terminology/110323_Standard-Terminology_3.0.pdf; and paragraph 2.1.2 of the 'Standard for the application of blind verification of friction ridge examinations' - http://www.swgfast.org/documents/blind-verification/110315_Blind-Verification_1.0.pdf
4. Professor Champod 25 November pages 101-102
5. Mr Chamberlain 18 November pages 12-14
6. FI_0152 para 75 Inquiry Witness Statement of Mrs Tierney
7. Mrs Tierney 12 November pages 98-99
8. FI_0152 paras 77-78 Inquiry Witness Statement of Mrs Tierney
9. See chapter 36 para 93
10. See chapter 35 para 4
11. US Department of Justice, Office of the Inspector General (2011) A Review of the FBI's Progress in Responding to the Recommendations in the Office of the Inspector General Report on the Fingerprint Misidentification in the Brandon Mayfield Case, URL: http://www.latent-prints.com/images/FBI%20Mayfield%20Progress%20062011.pdf, pdf pages 7, 19-21
12. Mr Chamberlain 18 November pages 10-11
13. Miss Hall, Mr Pugh 24 November page 67
14. Professor Champod 25 November pages 101-102
15. Mr Zeelenberg 8 October pages 6-7
16. Mr Zeelenberg 8 October pages 3ff, 31ff and FI_0201 para 4 Inquiry Witness Statement (Supp.) of Mr Zeelenberg
17. FI_0201 para 4 Inquiry Witness Statement (Supp.) of Mr Zeelenberg
18. A version of which is DB_0768
19. Mr Zeelenberg 8 October page 31ff
20. Mr Zeelenberg 8 October pages 22-23
21. Mr Zeelenberg 8 October pages 34-37
22. Professor Champod 25 November page 79
23. Professor Champod 25 November pages 111-113
24. Professor Champod 25 November pages 110-113
25. See chapter 35 para 39ff
26. See chapter 26 para 21ff
27. See chapter 7 para 184
28. See chapter 28 para 29
29. Para 122