
> From OnBoard, Newsletter of BCG


Laurel T. Baty, CG, “Avoiding Common Portfolio Pitfalls,” OnBoard 24 (May 2018): 9–10, 15.

Why are some BCG applications unsuccessful?

No portfolio is perfect; even successful ones have shortcomings. Some weaknesses are not a barrier to certification, as long as the work meets most documenting, research, and writing standards and any unmet standard is easily remediable. Samples of passing portfolios are available at many major conferences and institutes. Few people, however, ever see an unsuccessful portfolio besides the applicant and the three or four BCG evaluators who judged it. What flaws characterize a portfolio that does not pass?

Understanding how some applications fail to meet BCG’s evaluation criteria (The BCG Application Guide, rubrics, and genealogical standards1) might help other applicants avoid the same missteps and prepare a successful portfolio. Thirty-six unsuccessful applications were analyzed to identify commonalities among the evaluations. Results are summarized here.

Work Sample Choice

Poor work sample choice is a major cause of unsuccessful applications. A work sample that does not meet the Application Guide requirements usually results in insufficient material to evaluate core skills and can reduce a portfolio’s chance of passing. Common problems include

  • a research report that does not demonstrate “an in-depth and skillful use of a range of sources” (44 per cent of portfolios),
  • a kinship-determination project that omits the two required proof summaries or proof arguments (42 per cent of portfolios), and
  • a case study that does not fulfill requirements in The BCG Application Guide (28 per cent of portfolios).

Work samples should be chosen to demonstrate knowledge of records and problem-solving skills. If the BCG-supplied document is a will, choosing another type of record for the applicant-supplied document (such as a deed) offers more opportunity to display skills than choosing another will. A research report that merely describes the retrieval of records contains insufficient material to evaluate analysis and correlation skills. A kinship-determination project focused on a family that created few records, or that lived in an area of record loss due to fire or war, limits the ability to display knowledge of multiple record types.

The most common mistake with the case study is selecting a problem that does not use one of the three required techniques specified in the Application Guide.2 If a single piece of direct evidence answers the research question and no conflicting evidence exists, the problem is not suitable for the case study requirement. Another frequent error is a case study that does not resolve “a significant problem of relationship or identity.” Examples of unsuitable cases include determining an exact birth date or birthplace,3 or resolving minor spelling variations in names.4

Extent of Research

Reasonably exhaustive research, one of the five components of the Genealogical Proof Standard (GPS),5 is evaluated in the case study by rubric CS1 and in the kinship-determination project by rubric KD1.6 Research reports often have time constraints that limit their extent of research. Research report rubric RR2 measures the extent and efficiency of research in the allotted time.7 Multiple standards address extent of research: Standard 12 (broad context), Standard 14 (topical breadth), Standard 17 (extent), Standard 19 (data-collection scope), Standard 41 (evidence scope), and Standard 51 (research scope).8

Unsuccessful applications frequently rely on insufficient research and often overlook commonly used sources such as land and probate records, instead concentrating on easily accessible census and vital records.9 Unsuccessful applications bypassed commonly used sources in

  • 94 per cent of the case studies,
  • 78 per cent of the kinship-determination projects, and
  • 50 per cent of the research reports.

If research is limited to a few record types, then potentially relevant information may be missed. Executing thorough research ensures that conclusions are accurate and unlikely to be overturned.10

Quality of Evidence

Quality of evidence is evaluated in the research report by rubric RR6, in the case study by rubric CS3, and in the kinship-determination project by rubric KD3.11 Standards clearly state that genealogists prefer to reason from original records.12 Inexperienced genealogists frequently do not distinguish between original and derivative sources. Unsuccessful applications had problems with evidence quality in

  • 75 per cent of the kinship-determination projects,
  • 61 per cent of the case studies, and
  • 39 per cent of the research reports.

Meeting standards for quality of evidence is one of the most easily avoided pitfalls. Replacing indexes and derivative records, such as published abstracts, with original records will improve the evaluations and meet standards. A reference note to an index or abstract should explain why the original no longer exists or is unavailable. An explanation that the index was used because images of the original are not available online is not sufficient. A trip to a courthouse or archives may be necessary to meet standards for quality of evidence and extent of research.

Evidence Analysis and Correlation

Evidence-analysis skills are evaluated in both document work samples by rubrics DW6 through DW9.13 The document work rubrics evaluate the ability to analyze document reliability, background context, information, and evidence. Analysis and correlation skills are evaluated in the research report by rubric RR7, the case study by rubric CS4, and the kinship-determination project by rubric KD4.14 The correlation and assembly of evidence rubrics evaluate the ability to recognize and discuss connections and contradictions in evidence.

Unsuccessful portfolios frequently omit analysis and correlation of evidence. Analysis of reliability and background context was missing in 75 per cent of the document work samples as evaluated by DW6 and DW7. Insufficient or missing analysis and correlation was found in

  • 50 per cent of the kinship-determination projects,
  • 47 per cent of the research reports, and
  • 33 per cent of the case studies.

Analysis and correlation of evidence is a critical component of the GPS. Developing this core skill takes study and practice.15 One of the best ways to learn about evidence analysis and correlation is by studying articles in peer-reviewed journals, such as the National Genealogical Society Quarterly.

DNA

DNA evidence is increasingly used in portfolios. DNA is evaluated using the same documentation, research, and writing standards as other evidence. Common problems seen in work samples that incorporate DNA evidence include missing documentation, unfamiliarity with DNA terminology, not choosing the best DNA test for the problem, insufficient DNA evidence, and missing or oversimplified analysis and correlation. Each asserted relationship used to support DNA evidence and analysis needs to be documented.16 Learning to analyze and interpret DNA evidence requires concentrated study.17 DNA testing is used in conjunction with traditional documentary evidence, and care should be taken to meet extent-of-research standards before reaching a conclusion.

DNA evidence is subject to privacy considerations and is addressed in The Genealogist’s Code.18 Permission to use DNA test results should be included with any work sample that uses DNA evidence.

Renewals

Renewing associates can also learn from errors found in unsuccessful new applications. While the pass rate for renewal portfolios is substantially higher, similar flaws occur in unsuccessful renewals. The most frequent problem is poor work sample choice. The Application Guide requires that “at least one work sample must demonstrate use of the Genealogical Proof Standard.”19 Inattention to this requirement can result in denial of recertification. As with new applications, the most common reasons for failure to meet the GPS are superficial research, the use of unreliable evidence, and lack of evidence analysis and correlation.

Conclusion

Understanding the problems common to unsuccessful portfolios can help applicants avoid the mistakes made by others. Concentrating on good work sample selection, extent of research, quality of evidence, and analysis and correlation will improve the chances of a successful outcome.


Notes

Websites were viewed 18 April 2018.

1. The BCG Application Guide, 2017, PDF, Board for Certification of Genealogists (https://bcgcertification.org/wp-content/uploads/2017/10/BCG-Application-Guide-2017.pdf). “Rubrics for Evaluating New Applications for BCG Certification,” 2018, PDF, Board for Certification of Genealogists (https://bcgcertification.org/wp-content/uploads/2017/11/BCG-New-Application-Rubrics-2018.pdf). Board for Certification of Genealogists, Genealogy Standards (Nashville, Tenn.: Ancestry.com, 2014).

2. The BCG Application Guide, 6.

3. Thomas W. Jones, Mastering Genealogical Proof (Arlington, Va.: National Genealogical Society, 2013), 8. Jones defines birthplace and birth date as supporting questions that “help guide genealogical research to answer the major questions of relationship, identity, and activity.”

4. See the definitions of conflicting and compatible evidence in Genealogy Standards, 65.

5. Genealogy Standards, 1–3.

6. “Rubrics for Evaluating New Applications for BCG Certification,” 5 for CS1, 6 for KD1.

7. “Rubrics for Evaluating New Applications for BCG Certification,” 3.

8. Genealogy Standards, 12, 13, 14, 16, 25, 31.

9. “Rubrics for Evaluating New Applications for BCG Certification,” 3. The rubrics define “commonly used sources” as those “addressed by chapter titles in part 2 of Val D. Greenwood, The Researcher’s Guide to American Genealogy, 3d edition (Baltimore: Genealogical Publishing Co., 2000).” These are compiled sources and newspapers, vital records, census records, probate records, land records, court records, church records, immigration records, military records, and cemetery and burial records.

10. For reasonably exhaustive research, see Jones, Mastering Genealogical Proof, 23–29.

11. “Rubrics for Evaluating New Applications for BCG Certification,” 4 for RR6, 5 for CS3, and 6 for KD3.

12. Genealogy Standards, 23 for Standard 38, “Source preference.”

13. “Rubrics for Evaluating New Applications for BCG Certification,” 2.

14. “Rubrics for Evaluating New Applications for BCG Certification,” 4 for RR7, 5 for CS4, and 6 for KD4.

15. For an overview see Jones, Mastering Genealogical Proof, 53–78.

16. Genealogy Standards, 5–6.

17. Reference texts include Blaine T. Bettinger and Debbie Parker Wayne, Genetic Genealogy in Practice (Arlington, Va.: National Genealogical Society, 2016); and Blaine T. Bettinger, The Family Tree Guide to DNA Testing and Genetic Genealogy (Cincinnati, Ohio: Family Tree Books, 2016).

18. Genealogy Standards, 45–48. Also, Judy G. Russell, “The Ethics of DNA Testing,” OnBoard 21 (January 2015): 1–2, 7; online edition, Board for Certification of Genealogists (https://bcgcertification.org/skillbuilding-the-ethics-of-dna-testing/).

19. The BCG Application Guide, 15.


Laurel T. Baty, CG®


This article was originally published in OnBoard, BCG’s educational newsletter, and is protected by copyright. Individuals may download and print copies for their personal study. Educators are granted permission to provide copies to their students as long as BCG, OnBoard, and the appropriate author are credited as the source of the material. Republication elsewhere is not permitted.