True pedigree errors more frequent than apparent errors for single nucleotide polymorphisms

Bibliographic Details
Published in: Human Heredity, Vol. 49, No. 2, p. 65
Main Authors: Gordon, D; Heath, S C; Ott, J
Format: Journal Article
Language: English
Published: Switzerland, 01-01-1999
Description
Summary: Single nucleotide polymorphisms (SNPs) are currently being developed for use in disequilibrium analyses. These SNPs consist of two alleles with varying degrees of polymorphism. A natural design for use with SNPs is the 'haplotype relative risk' sampling design in which a father, mother, and child are typed at an SNP locus. Given such a trio of genotypes, we ask: what is the probability that a pedigree error (a change from one allele to the other) at an SNP locus will be detected using only Mendel's laws as a check? We calculate the probability of detecting such errors for a hypothetical SNP locus with varying degrees of polymorphism and for various true error rates. For the sets of allele frequencies considered, we find that the detection rates range between 25 and 30%, the detection rate being lowest when the two alleles have equal frequencies and the highest when one allele has a frequency of 10%. Based on this detection rate, we determine that the true error rate is roughly 3.3-4 times that of the apparent error rate at an SNP locus. The greatest discrepancy between true and apparent error rates occurs when allele frequencies are equal.
ISSN: 0001-5652
DOI: 10.1159/000022846
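
The summary above rests on one simple relation: only errors that create a Mendelian inconsistency within the father-mother-child trio are visible, so the apparent error rate is approximately the detection probability times the true error rate. With detection rates between 25 and 30%, the true rate is therefore about 1/0.30 ≈ 3.3 to 1/0.25 = 4 times the apparent rate. The sketch below is a minimal Monte Carlo illustration of the Mendelian check, not the authors' analytic calculation; it assumes Hardy-Weinberg parental genotypes and an error model that flips one randomly chosen allele of one randomly chosen trio member, neither of which is specified in this record.

```python
import random

def sample_genotype(p):
    """Unordered biallelic genotype, alleles coded 0/1, with P(allele 1) = p (Hardy-Weinberg assumption)."""
    return tuple(1 if random.random() < p else 0 for _ in range(2))

def sample_child(father, mother):
    """Mendelian transmission: the child receives one allele from each parent."""
    return (random.choice(father), random.choice(mother))

def mendel_consistent(father, mother, child):
    """Can the child's two alleles be split so that one comes from each parent?"""
    c1, c2 = child
    return (c1 in father and c2 in mother) or (c2 in father and c1 in mother)

def flip_one_allele(trio):
    """Assumed error model: flip one of the trio's six alleles, chosen uniformly at random."""
    genotypes = [list(g) for g in trio]
    person, slot = random.randrange(3), random.randrange(2)
    genotypes[person][slot] = 1 - genotypes[person][slot]
    return [tuple(g) for g in genotypes]

def detection_rate(p, trials=200_000):
    """Estimate the fraction of single-allele errors that produce a Mendelian inconsistency."""
    detected = 0
    for _ in range(trials):
        father, mother = sample_genotype(p), sample_genotype(p)
        child = sample_child(father, mother)
        f, m, c = flip_one_allele((father, mother, child))
        detected += not mendel_consistent(f, m, c)
    return detected / trials

if __name__ == "__main__":
    for p in (0.5, 0.3, 0.1):
        rate = detection_rate(p)
        # apparent error rate ~ detection rate * true error rate, so true/apparent ~ 1/detection
        print(f"allele frequency {p:.1f}: detection ~ {rate:.3f}, true/apparent ~ {1 / rate:.1f}")
```

With enough trials the p = 0.5 estimate should land near the low end of the published 25-30% range, and the p = 0.1 estimate near the high end, but the exact figures depend on the assumed error model rather than on the authors' published computation.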