24 July 2012

Probability in genealogy

Following on my post Lineage societies and the genealogical proof standard, leading genealogy blogger Randy Seaver and others commented: "I'm unclear as to your meaning of 'probabilistic approach.' How would that work?"

It isn't straightforward. Neither is mastering the skills needed to be a certified genealogist, yet many rise to the challenge.

The previous post quoted Elizabeth Mills at http://learn.ancestry.com/LearnMore/Article.aspx?id=803

"The most we can do is to establish probability through an expert analysis of the evidence known to date."
There's further enlightenment in Elizabeth Mills, “Working with Historical Evidence: Genealogical Principles and Standards,” Evidence: A Special Issue of the NATIONAL GENEALOGICAL SOCIETY QUARTERLY, NGSQ 87 (September 1999): 165–84, available at http://www.voicespast.com/NGS/091222_1141/appendix/evid.pdf
"Levels of Confidence
Within sound genealogical studies, information statements about dates, identities, places, relationships, and similar matters are frequently prefaced by such terms as apparently, likely, possibly, or probably—all denoting that the stated “fact” is clouded by doubt. To date, these terms have no concrete definitions; practically speaking, they take on whatever shade each individual researcher provides with his or her supporting detail."
With no concrete definitions, the use of these terms is anything but transparent, hardly the hallmark of professionalism.

The same article continues by referring to, while not endorsing, a three-level probability scale:
• Possibility, used at the “speculation” stage—a term comparable to the math/physics concepts intuition and guess.
• Probability, used at the “hypothesis” stage—a term comparable to the math/physics concepts proposal and conjecture.
• (Reasonable) certainty, used at the “proof” stage—a term signifying a convincing degree that is comparable to the math/physics concept verification.
Finally, Mills, Elizabeth Shown. Professional Genealogy: A Manual for Researchers, Writers, Editors, Lecturers, and Librarians. Baltimore: Genealogical Pub., 2001, page 463 refers to a range of probability expressions:

Possibly     There is a remote possibility .......He could have
Probably     There is a slight chance .......He must have
Certainly    There is a chance ..... Her certainly must have

All those references are at least a decade old. Shouldn't time move on in developing genealogical methodology?

A recent book, Carrier, Richard C. Proving History: Bayes's Theorem and the Quest for the Historical Jesus. Amherst, NY: Prometheus, 2012, suggests a Canon of Probabilities for use in historical studies:

Virtually Impossible = 0.0001%
Extremely Improbable = 1%
Very Improbable = 5%
Improbable = 20%
Slightly Improbable = 40%
Even Odds = 50%
Slightly Probable = 60%
Probable = 80%
Very Probable = 95%
Extremely Probable = 99%
Virtually Certain = 99.9999%
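
As a minimal sketch (the term/value pairs come from the scale quoted above; the `nearest_term` helper is my own illustration, not something Carrier proposes), such a canon could be encoded so that a numeric estimate is always reported in a standard vocabulary:

```python
# Carrier's Canon of Probabilities, as quoted above, expressed as
# (term, probability) pairs. The nearest_term helper is a hypothetical
# illustration of mapping a numeric estimate to the standard vocabulary.
CANON = [
    ("Virtually Impossible", 0.000001),
    ("Extremely Improbable", 0.01),
    ("Very Improbable", 0.05),
    ("Improbable", 0.20),
    ("Slightly Improbable", 0.40),
    ("Even Odds", 0.50),
    ("Slightly Probable", 0.60),
    ("Probable", 0.80),
    ("Very Probable", 0.95),
    ("Extremely Probable", 0.99),
    ("Virtually Certain", 0.999999),
]

def nearest_term(p):
    """Return the canon term whose anchor value is closest to p."""
    return min(CANON, key=lambda pair: abs(pair[1] - p))[0]

print(nearest_term(0.85))  # -> Probable
```

Two researchers could still disagree about the number, but at least the words attached to it would mean the same thing to both.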

If the genealogical profession agreed there were advantages in adopting such a probability scale as a standard, it could then move on to the question of how to assess those probabilities.

Those whose minds are open to the possibility of such an approach should read the first few chapters of Richard Carrier's book, or watch him explain the use of Bayes' Theorem in exploring historical issues at http://www.youtube.com/watch?v=HHIz-gR4xHo, to get an idea of how that might work. Don't be put off by his style of presentation. There are other YouTube videos on Bayes' Theorem, including a series starting at http://www.youtube.com/watch?v=XR1zovKxilw, which I found to be among the most understandable.

Neither of these is presented in a genealogical context.
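
To make the idea concrete in a genealogical setting, here is a toy Bayesian update. Every number in it is hypothetical, invented purely for illustration, and the scenario (matching a census entry to a parish register) is mine, not drawn from Carrier or the videos:

```python
# Toy Bayesian update for a genealogical question: is the John Smith
# in a census record the same man as the John Smith in a parish
# register? All probabilities below are hypothetical illustrations.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of the hypothesis after seeing the evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

prior = 0.50  # even odds before examining the details
# Evidence: the census age matches the baptism year within one year.
posterior = bayes_update(prior,
                         p_evidence_if_true=0.90,   # likely if same man
                         p_evidence_if_false=0.10)  # unlikely otherwise
print(round(posterior, 2))  # -> 0.9
```

Each additional, independent piece of evidence would feed the posterior back in as the new prior, which is the mechanical heart of the approach Carrier describes.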

More to come later ...

9 comments:

BDM said...

The GPS is all about assessing the probability of identity and relationships. Percentage numbers may satisfy some amongst us, but it’s the written language of analysis that justifies (or not) each individual genealogical conclusion.

AncestralManor said...

Dear John, speaking as a recovering mathematician: there is a basic flaw in using Bayes' Theorem to "calculate" genealogical evidence.

Math, and more particularly statistical probability, assumes that you have a reasonable way to estimate and/or count the universe of data you are assessing (turning that count into numbers) and to determine what characteristics are applicable to the population of that universe or community of people involved.

On the other hand, a qualitative evaluation of evidence relies upon THIS or THAT in the context of rules of logic that are centuries old - AND something that everyone uses in everyday life.

"If THIS (information) is true THAT (information) cannot be true," is a basic premise for the need to resolve conflicting evidence.

Resolving conflicting evidence follows many other rules of logic, in varying dimensions that include evaluating the quality of each source based on characteristics of the source - all specific to the situation at hand with the information we have at the point in time that we are evaluating the source.

Then there are threads of consistency or inconsistency through many other sources over time and place and source type.

You have to remember that statistics seeks to "normalize" a population into numbers for probability. Great for engineering - the task of harnessing the most reliable black box results for predicting what kind of outcome an input will produce, but is irrelevant for tracking the life of one specific atomic person crashing in space over time.

Particularly, since you don't have any way to actually measure every characteristic and event in the past.

Scientific experiments rely on controlled environments. Engineering applications attempt to "aim" these results into the future, not the past.

One of the most illuminating aspects of genealogical research applying the GPS is to learn real history in various microcosms and what is flawed in the historical generalizations that attempt to "normalize" any population at any point in time and place.

Sharon Sergeant

JDR said...

I appreciate you commenting Sharon as it gives the opportunity to point to another book on the application of Bayes' Theorem to the type of unique situation we encounter in family history. Check out "The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy" by Sharon Bertsch McGrayne.

Elizabeth Shown Mills said...

John, may I add a point to your extract from PROGEN? Someone has just asked me about a passage in your post and they do not have the book (Aach! Perish the thought!) to check it for themselves.

You wrote:
>Finally, Mills, Elizabeth Shown. Professional Genealogy: A Manual for Researchers, Writers, Editors, Lecturers, and Librarians. Baltimore: Genealogical Pub., 2001, page 463 refers to a range of probability expressions:

>Possibly There is a remote possibility .......He could have
>Probably There is a slight chance .......He must have
>Certainly There is a chance ..... Her certainly must have

First, let me say that I am not the author of that passage. Christine Rose is the author of the chapter in which that discussion appears. I was merely the editor.

My questioner did not understand this passage in your post. He thought "my" intent was to provide definitions for those terms. Rather, Christine’s point was to show the *impreciseness* of these “wiggle words” and to demonstrate the need to explain our reasoning for all conclusions.

I totally agree with you that we need a means by which to express, far more precisely, the weight we feel our conclusions carry. The GPS provides a basis for this, with its five criteria that provide a minimum standard for acceptability. If, say, a conclusion is NOT supported by thorough research (Point 1 of the GPS), then it would fail the most basic criterion.

The problem with attempts to mathematically quantify “levels of confidence” is the same one that genealogical software has had ever since it instituted “surety levels” decades ago. What Gene Gullible would consider a 5, someone as experienced as you might consider a 1 or 2. By extension, what Gene Gullible might consider a 99.9% probability, you might assign to a 40% or 60% rating. Yes, we think of numbers as concrete, but when they are assigned to historical evidence, they’re shapeshifters that mean only what each of us sees in them.

In short (IMHO): Asserting statistical probability in genealogy is nothing more than asserting an opinion without investing the effort to actually present a case.

In my “daily tip” at the Facebook page for Evidence Explained on 25 June, I followed up, briefly, on the wiggle words from my 1999 article and from Christine’s 2001 chapter:

MONDAY'S METHODS
Qualifiers:
When we use wiggle words in our assertions about personal details (e.g.: Luis Tomassino *apparently* came from Barcelona; Franz Ritter was *probably* the man of that name who ...) we should always, ALWAYS, *ALWAYS* explain the reasoning that underpins that qualifier. Users of our work need and deserve to know, exactly, the basis for our conclusion, our theory, or our hypothesis. And WE need to know our own reasoning, every time we come back to this piece of research.

The only feasible solution, in my opinion, is proof discussions—that last criterion of the GPS. Whether we use percentage points or “wiggle words” to express probability, they are meaningful only when we (a) identify the sources we feel are relevant, (b) explain our interpretations of what we think the relevant information means, and (c) provide a framework showing how the disparate evidence fits together to make the case for our conclusion.

JDR said...

I thank ESM for comments on both this and the previous post in the series. Thoughtful comments are always welcome.
The point has certainly been made that probability is used, but the definitions are imprecise – wiggle words. If probabilistic terms are to be used, they should be clearly defined, and that definition should be as widely understood and accepted as the GPS itself. That would be a positive step, and one I believe would be more robust if allied with a numerical scale.
ESM, having spent most of her career so far working under the pre-GPS proof regime, would surely tell us that change did not happen overnight, or easily. I am under no illusion that developing and adopting a quantitative probabilistic supplement to the GPS would be any less challenging. However, being open to considering new approaches is part of being a professional in any field. That means taking the time to investigate and consider the evidence, just as one would in developing a genealogical case, all the more so if there is movement in that direction in allied disciplines.

Harold Henderson said...

I agree that "All those references are at least a decade old. Shouldn't time move on in developing genealogical methodology?"

A good start would be to cite ESM's 2007 book _Evidence Explained._ On pp. 19-20 she proposes a hierarchy of probability adverbs, without numbers attached.

My father taught mathematics at the high-school level for many years, and one of his pet peeves was measuring to the nearest tenth and then calculating and giving the answer in thousandths or ten-thousandths! You can't increase the accuracy of the original by attaching random numbers to it.

I think we have a similar case here. Also in EE, Elizabeth lists ten factors that go into textual and evidence analysis. They are certifications and certificates, content, creator's veracity and skill, informant's purpose and reliability, language characteristics, material characteristics, penmanship, record's custodial history, record's degree of processing, and record's timeliness.

I don't see how to non-arbitrarily assign numbers to any one of these items, let alone to properly weight them against one another. And even if we could do that in one case, the relative importance and applicability will differ in the next case.

But maybe there is a way to do it. Perhaps the way to further this discussion would be to make it practical. An advocate of this approach could write up a solved genealogical problem using this approach, circulate it for comment, and submit it for publication in a reputable journal. Wouldn't that be something to chew on!

JDR said...

There's already an interesting study using probabilities: the case of Thomas Jefferson as the father of Sally Hemings's children, which uses Monte Carlo techniques and Bayes' Theorem. See Fraser D. Neiman, "Coincidence or Causal Connection? The Relationship between Thomas Jefferson's Visits to Monticello and Sally Hemings's Conceptions," The William and Mary Quarterly, Third Series, Vol. 57, No. 1 (Jan. 2000), pp. 198-210.

Helen Leary published a subsequent article, which I have not yet seen but which I understand takes the conventional "proof" approach: National Genealogical Society Quarterly, Vol. 89, No. 3 (September 2001), pp. 207, 214-218.

I agree with Harold at http://midwesternmicrohistory.blogspot.ca/2012/07/how-can-i-prove-my-mom.html -- when he writes that ' "Proof" in genealogy is not like "proof" in mathematics. If I had the power to re-boot genealogy from the beginning, I would abolish the word altogether.' It's something BCG and APG might usefully revisit.

Harold Henderson said...

LOL -- got me, dude! I won't start arguing with myself just yet. I have read neither article -- may have something to say after reading both. (BTW, it would be BCG, not APG, that would revisit the subject.)

Tony Proctor said...

Anyone reading this might be interested in a discussion of the same subject at https://groups.google.com/d/topic/soc.genealogy.britain/_VbY9XraGQ0/discussion

It turned out to be rather emotive, as some people were adamant that the idea was crazy.

However, some research into the practicality is happening.