Multiresolution identification of germ layer components in teratomas derived from human and nonhuman primate embryonic stem cells
Published in: 2008 5th IEEE International Symposium on Biomedical Imaging: From Nano to Macro, pp. 979 - 982
Format: Conference Proceeding
Language: English
Published: IEEE, 01-05-2008
Summary: We propose a system for identifying germ layer components in teratomas derived from human and nonhuman primate embryonic stem cells. Tissue regeneration and repair, drug testing and discovery, and the cure of genetic and developmental syndromes may all rest on an understanding of the biology and behavior of embryonic stem (ES) cells. Within stem cell biology, an ES cell is not considered an ES cell until it can produce a teratoma tumor (the "gold standard" test): a seemingly disorganized mass of tissue derived from all three embryonic germ layers (ectoderm, mesoderm, and endoderm). Identifying and quantifying the tissue types within teratomas derived from ES cells may expand our knowledge of normal and abnormal developmental programming and of the response of ES cells to genetic manipulation and/or toxic exposures. Because of this tissue complexity, identifying and quantifying the tissue is tedious and time consuming, but the same complexity makes the teratoma an excellent biological platform for testing robust image analysis algorithms. We use a multiresolution (MR) classification system with texture features, and develop novel nuclear texture features, to recognize germ layer components. With a redundant MR transform, we achieve a classification accuracy of approximately 88%.
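The abstract does not specify the exact feature set or classifier. The sketch below only illustrates the general kind of pipeline it describes: a redundant (undecimated) multiresolution transform computed on tissue patches, subband texture statistics as features, and an off-the-shelf classifier. The use of a stationary wavelet transform, PyWavelets, scikit-learn, and a random forest are assumptions made here for illustration, not details from the paper.

```python
# Minimal sketch (not the authors' implementation): redundant multiresolution
# texture features from a stationary (undecimated) 2-D wavelet transform,
# fed to a standard classifier. Patch size, wavelet, and classifier choice
# are illustrative assumptions.
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier


def mr_texture_features(patch, wavelet="db2", levels=3):
    """Subband energy and spread of a redundant 2-D SWT of one tissue patch.

    The patch's height and width must be divisible by 2**levels
    (e.g. 64 x 64 with levels=3).
    """
    coeffs = pywt.swt2(patch.astype(float), wavelet, level=levels)
    feats = []
    for approx, (ch, cv, cd) in coeffs:
        for band in (approx, ch, cv, cd):
            feats.append(np.mean(band ** 2))  # subband energy
            feats.append(np.std(band))        # subband spread
    return np.asarray(feats)


def classify_patches(train_patches, train_labels, test_patches):
    """Train on labeled tissue patches, predict a germ-layer class per patch."""
    X = np.stack([mr_texture_features(p) for p in train_patches])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, train_labels)
    Xt = np.stack([mr_texture_features(p) for p in test_patches])
    return clf.predict(Xt)
```

In such a pipeline, per-patch predictions over a tiled whole-slide image would yield a germ-layer label map whose patch counts give the tissue-type quantification the abstract refers to.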
ISBN: 9781424420025, 1424420024
ISSN: 1945-7928, 1945-8452
DOI: 10.1109/ISBI.2008.4541162