Consciousness and how it got to be that way

Tuesday, July 30, 2013

Evolution and Rh Negativity in Historical Populations


A Kleihauer-Betke stain showing fetal blood cells circulating in maternal blood. Even 0.5 mL of fetal red
blood cells can be detected with this method. That small amount can also be detected by the mother's immune system.


The Rh antigens on human blood cells always bothered me; what a terrible system. If a mother who does not have the Rh protein on her blood cells reproduces with a male who does, she may have a baby who also has the Rh protein. The first such baby is fine, but during the birthing process the mother stands a chance of getting enough of the baby's blood cells into her own circulation that her immune system sees them. Now, if she has another baby that is Rh+, her immune system will attack the baby's blood cells and destroy them. About 5% of the second and later Rh+ babies of Rh- women get hemolytic disease of the newborn (HDN) and are born very anemic and sick; some die.


Structure of Rh factor. That's a channel-looking protein if ever there was one. From Mike Merrick at the John Innes Centre.


So what happened, I always wondered, before Rhogam (the injection that stops an Rh- woman from making antibodies to an Rh+ baby's blood cells)? Did we go through a hundred thousand years, until the 1960s, with stillbirths and sick, anemic babies affected for their whole lives being born left and right, and life was just so scary and incomprehensible and miserable that we simply accepted this?

I looked into this during my OB-GYN rotation and ended up giving a short talk about it. It turns out that Rh negativity is mostly a European thing; especially, it turns out, a Basque thing. Europeans are about 15% Rh negative, Basques about 36%. The interesting connection is that the Rh protein functions as an ammonia channel that, among other things, confers resistance to toxoplasma, which is spread through the feces of cats - and cats did not exist in pre-Holocene Europe. There's our answer! There was no pressure for the Rh factor to be maintained once Europeans were in Europe, and the longer they were there, the more they lost it. So if the Rh factor is lost, no big deal, right?

Well, this still bothers me, for the simple reason that where we find Rh negativity, we don't find only Rh negativity. The Rh factor didn't disappear completely, so there's still a price to pay in terms of HDN and sick babies. In an all-Rh- population, things would be fine; there would be no Rh-caused HDN, and no disadvantage to being an Rh- homozygote. But in the absence of some advantage to being Rh- (or even heterozygous) that we're not aware of, it's a disadvantage balanced against no advantage. I used to tell myself that maybe the mortality numbers were low so it didn't matter, but nature keeps close score. Even if Rh negativity is only a little bit deleterious in terms of HDN, it should be selected out - eventually.



Using the Hardy-Weinberg equation, we can construct a simple model that tells us how rates of Rh negativity should change over time.

- Assume mated pairs have the same number of births on average, regardless of Rh status.

- Assume that of the second and later Rh+ babies born to an Rh- mother, 5% get HDN, and that all of these babies die. (They wouldn't all have died, even in the paleolithic, so the gene will disappear more slowly than this model predicts.) Later Rh+ babies actually get HDN at a rate higher than 5%, but again, as long as I assume 100% mortality for them, the gene will disappear more slowly than the model predicts.

- Assume there's no advantage to Rh negativity; its effect is entirely negative through HDN as above.

- Plug in current gene frequencies in Europe.

Since the gene will disappear more slowly than this model predicts due to the assumptions outlined above, we can project back based on current gene frequencies, and get dates that are probably slightly more recent than reality.
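
Here is a minimal sketch of how such a projection could be coded up. It assumes random mating and collapses the "5% of second and later Rh+ babies" rule into a single effective mortality fraction s applied to every Rh+ baby born to an Rh- mother; the function names, the starting frequency, and the s = 0.05 used in the example run are illustrative assumptions, so the generation counts it prints will not line up exactly with the figures below.

```python
# A minimal sketch of the Hardy-Weinberg projection described above.
# Assumptions: random mating, equal family sizes, and a single effective
# mortality fraction s applied to every at-risk birth (an Rh+ [Dd] baby
# born to an Rh- [dd] mother). The post's "second and later babies only"
# rule is gentler than this, so the real decline would be slower still.

def next_gen(q, s):
    """Advance the Rh-negative (d) allele frequency q by one generation.

    Under random mating, Dd offspring with dd mothers occur at frequency
    q^2 * p (mother is dd, father transmits D); a fraction s of them die.
    """
    p = 1.0 - q
    DD = p * p
    Dd = 2 * p * q - s * (q * q * p)  # at-risk heterozygote deaths removed
    dd = q * q
    survivors = DD + Dd + dd
    return (dd + 0.5 * Dd) / survivors


def generations_until(q, pheno_target, s):
    """Count generations until the Rh- phenotype frequency (q^2) falls below pheno_target."""
    gens = 0
    while q * q > pheno_target:
        q = next_gen(q, s)
        gens += 1
    return gens


if __name__ == "__main__":
    q_europe = 0.15 ** 0.5  # allele frequency giving today's ~15% Rh- phenotype
    gens = generations_until(q_europe, 0.01, s=0.05)
    print(f"{gens} generations for the Rh- phenotype to fall from 15% to 1%")
    print(f"roughly {gens * 15:,} to {gens * 20:,} years at 15-20 years per generation")
```

Because each HDN death removes one Rh+ and one Rh- allele, the per-generation change in frequency is tiny, which is why the generation counts below run into the hundreds.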

This model tells us that it will take 565 generations for the gene frequency to drop from today's European level to 1% - between 8,475 and 11,300 years, using a short generation time of 15 to 20 years. For the current Basque frequency of 36% Rh- (assuming it has remained static) to come down to the wider European frequency of 15% would have taken 208 generations, or between 3,120 and 4,160 years.

Assuming that Rh- was at fixation in the proto-Basque populations of Iberia and Gascony, then after the introduction of just 1% Rh+ homozygotes, it would have taken 22,110 to 29,480 years for the Basques to come down to today's 36%. It doesn't escape notice here that these date ranges are all post-African-exodus, and some of them are within the span of Near Eastern antiquity.

Finally and most interestingly: assuming the injection of about 8.6% Rh+ individuals into an otherwise entirely Rh- population 2,025 to 2,700 years ago, we would get the 36% we see today. That period coincides with the establishment of Phoenician and later Roman colonies in Iberia, and quickly established colonies and armies could easily have introduced a twelfth of the human DNA in the sparsely populated Iberian peninsula. This assumes that, during this time, Semitic and Indo-European peoples were Rh+.
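
As a quick check of the year arithmetic (a hedged aside: the generation counts are the post's, the 15-20 year generation time is the one assumed above, and everything here is just multiplication):

```python
# Convert the generation counts quoted above into calendar years, using the
# same 15-20 year generation time; the other scenarios' generation counts can
# be recovered by dividing their year ranges by 15 and 20.
for gens in (565, 208):
    print(f"{gens} generations is {gens * 15:,} to {gens * 20:,} years")
# 565 generations is 8,475 to 11,300 years
# 208 generations is 3,120 to 4,160 years
```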

It seems impossible without further research, particularly on ancient DNA, to distinguish between these three possibilities:

1) Rh negativity confers no advantage, and there was a founder effect that has gradually eroded. Rh positivity was lost completely in a small ancestral Basque population, and ever since, HDN has very gradually been decreasing the proportion of Rh negativity in the population; in the meantime, Rh negativity has spread throughout the western half of the Old World.

2) Rh negativity confers no advantage, and a dramatic amount of Rh positivity was introduced in antiquity by migrants from around the Mediterranean. The high Rh negativity also seen in some parts of Africa could support a Phoenician mechanism of gene flow (Rh+ into Iberia, Rh- out).

3) Rh negativity does confer some advantage, so far unknown to us, that partially or totally counterbalances the HDN problem. This is not exclusive of #1 or #2.


The Rh factor is far from the only immune incompatibility between maternal and fetal blood that can cause HDN, and we certainly have a lot to learn about the function of these markers.

Monday, July 29, 2013

To One-Beer or Two-Beer on Newcomb's Paradox

This is a bit of an inside joke for Less Wrongers, so my apologies if it doesn't make you smile. (More on Newcomb's problem here.)



Newcomb's Ranch is a (very isolated) bar on Angeles Crest Highway (CA Route 2) in Angeles National Forest, maybe an hour and a half from downtown LA. One might ask whether I one-beered or two-beered at Newcomb's. My good madam or sir, why are one or two the only options? Anyway, by the end of the night I thought I was pretty smart, but then Omega, in his function as omniscient bartender, cut me off.

Friday, July 19, 2013

Corollary Discharge and Inner Speech: Clues for Psychosis

A central feature of psychosis is the disintegration of the sense of self. A healthy person, during the course of the day, talks to themselves and rehashes past or hypothetical arguments with friends and family. The healthy person may even speak out loud during these episodes*, but they know that all these thoughts and imagined voices are exactly that: coming from their own head, under their control. In contrast, the voices that psychotic people describe are very often (actually, in my experience, usually) those of people the person can identify - usually friends and family. Again in my experience, voices of parents are the most common. What's more, I've witnessed more than once a person who had badly decompensated and told me in the emergency department that he could hear his friend's voice talking to him, and who wasn't able to reality-test that the voice must be coming from his own brain; but as he reconstituted over several days with medication, the voice eventually became an internal monologue he was having with his friend, with full recognition by the patient that it was indeed an internal monologue - just like anyone else rehashing an argument in the shower.

An interesting study in Psychological Science by Mark Scott from UBC gives evidence that our capacity for inner speech is related to our ability to tune out our own voices when speaking; the neural correlate of this suppression is called corollary discharge. This of course leads to speculation about whether this mechanism underlies the origins of language and cognitive modernity, but psychiatrists and clinical psychologists will find it immediately interesting for another, more practical reason: as a way to measure and possibly target auditory hallucinations in psychosis. Of note, this study had no direct measurement of neural correlates, instead relying on the Mann effect, a phenomenon of context-dependent perception of vocal sound (the McGurk effect is another example). That said, a 2011 study by Greenlee et al. at the University of Iowa showed, with intracranial electrode recordings in humans, that the effects of corollary discharge on hearing speech are, unsurprisingly, located in the auditory cortex.


*More than once, while I was out running on what I thought was a deserted trail, I have been caught talking to myself by an alarmed trail user coming the other way. Invariably as soon as I see them I act like I was singing to myself the whole time. Somehow this never seems to comfort them.

Wednesday, July 17, 2013

A Paleo Cognitive Regimen

This is cross-posted to MDK10 Outside.

Paleo dieters avoid post-agricultural foods. The argument is that agriculture is a product of culture and so it introduced sources of nutrition that our genes haven't caught up with, especially grains, refined sugar, and other concentrated carbs. By eating vegetables and lean meats (so the argument goes) we are more in tune with paleolithic humans and therefore should be healthier.

There are any number of problems with that argument (not least of which is that adherents often seem uninterested in empirical testing of it), but an interesting question is: couldn't we apply similar arguments to our cognition? The way we think about the world (or possibly even the fact that we think about the world) started undergoing profound changes about 40,000 years ago. Of note, this time corresponds with humans leaving Africa, developing specialized tools, hunting larger game that required team planning, and the spread of the current form of the FOXP2 gene with the subsequent use of language and the cognitive modernization of humans; it marks the boundary between the middle and upper paleolithic. That is to say, until about 40,000 years ago, we solved problems in a primitive, isolated way, and knowledge could be shared only at a much more basic level. Suddenly we have language, reasoning, and mountains of cultural transmission in the form of tools and worldviews, so that what we achieve in working memory can be extended across a whole lifetime, or indeed indefinite lifetimes (among other things, the idea of an alphabet, hit upon by some clever Phoenician roughly three thousand years ago, allows us to more easily share these ideas right now). This has profoundly changed our physical environment in ways that have outrun our genes' ability to respond - including agriculture.

Therefore, shouldn't a person making the paleo argument for diet also make a similar argument about post-paleo (or at least post-middle-paleolithic) cognition? That is to say, if you're really concerned that agriculture-based food is a threat to the flourishing of H. sapiens because of its newness and alienness, aren't things like reasoning and systematic institutional research just as bad, if not profoundly more dangerous? Shouldn't we be approaching problems with blunt emotions and a vague memory of some chance association from the last time we were in this part of the world, with no way to obtain or share knowledge from others? Shouldn't the males of the species be getting into fights with people who look different from us or who look at our woman a second too long, with a resulting homicide rate of 30%? (The actual figure for some hunter-gatherer groups.) This is what we're adapted for - our cognitive environment for thousands of centuries. How can the alien world we've built for ourselves in cognitive modernity not be hurting us?

By extension, you could even make a Nagel-like argument here that paleo defeats itself: if you think paleo is the way to go because profoundly biologically novel activities threaten an animal's well-being, then you should also eschew reasoning, and therefore eschew paleo itself, which was arrived at by a very un-paleo process (reasoning, communicating about it with large groups of people, and institutional research).

Wednesday, July 3, 2013

Intracytoplasmic Sperm Injection and Autism Risk: an Epigenetic Connection?

A JAMA paper by Sandin et al. looked at neurodevelopmental differences between IVF and non-IVF children, and considered differences between IVF methods. Considering all IVF-conceived children together, there were slight trends toward autism and mental retardation, but no significant differences (RRs 1.14 and 1.18). The big news was the RR increase with intracytoplasmic sperm injection (ICSI, in which a single sperm is injected directly into the egg): for autism and mental retardation the RRs were 4.6 and 2.35, both statistically significant. The obvious confounder here would be parental age, but they controlled for it.

A few speculative ideas about what could explain these findings:

- epigenetic changes resulting from ICSI relative to normal sperm penetration and genome delivery. These could be due to changes introduced by the handling of the sperm and its genetic material, or to the lack of a physiologic egg-penetration event, resulting in aberrant methylation
- co-occurring diseases influencing both sperm motility/penetration and neuronal migration. There are known syndromes in which ciliary defects cause loss of function in both sperm flagella and lung cilia (e.g., primary ciliary dyskinesia). Bypassing defective DNA delivery by sperm in individuals who are heterozygous for a recessive mutation, have a de novo mutation, or are mosaic would allow these defects to appear in the offspring.
- mechanical damage to chromosomes caused by the injection process
- damage to the egg membrane from the injection process, influencing neurodevelopment in ways we don't yet understand


If the first bullet at least partly explains the findings, we now have an unfortunate set of natural experiments with which to investigate the speculated relationship between epigenetics and neuropsychiatric illness. GWAS studies of these diseases have produced frustratingly few replicable results. We now know that non-coding regions of the genome produce RNAs, that disease-associated SNPs disproportionately fall outside open reading frames, and that some of these non-coding RNAs are associated with neuropsychiatric disease; possibly we can start to connect the dots.


References:

Sven Sandin, Karl-Gösta Nygren, Anastasia Iliadou, Christina M. Hultman, Abraham Reichenberg. Autism and Mental Retardation Among Offspring Born After In Vitro Fertilization. JAMA. 2013;310(1):75-84.

Matthew J. Hangauer, Ian W. Vaughn, Michael T. McManus. Pervasive Transcription of the Human Genome Produces Thousands of Previously Unidentified Long Intergenic Noncoding RNAs. PLoS Genetics. 2013;9(6).

Alexander D. Ramos, Aaron Diaz, Abhinav Nellore, Ryan N. Delgado, Ki-Youb Park, Gabriel Gonzales-Roybal, Michael C. Oldham, Jun S. Song, Daniel A. Lim. Integration of Genome-wide Approaches Identifies lncRNAs of Adult Neural Stem Cells and Their Progeny In Vivo. Cell Stem Cell. 2013;12(5):616-628.

Monday, July 1, 2013

Why Can't We Create APP Knockout Humans?

You will either die from Alzheimer's disease or from something else first. This of course is trivial, but Alzheimer's is now the third leading cause of death in the U.S., and it's on track to become the second in the next few decades. In Japan it may already be the second, surpassing heart disease. AD can only be definitively diagnosed at autopsy, but the majority of us have detectable plaques in our brains by the time we're in our 70s, whether or not we're showing clinical symptoms. The treatments we have so far merely attempt to slow the damage, and they can't even do that well; we've just learned that the (earlier-stage) oligomers are structurally different from the full plaques, which may be why the molecules we've thrown at the plaques don't interact with them well.

Thinking speculatively, what happens if we prevent the whole plaque-formation problem - which really does seem to be an issue of humans living well past our paleolithic warranty period - by knocking out the amyloid precursor protein (APP) altogether in our descendants? Knockout mice have been around for a while; what's their phenotype like? Not great. They demonstrate:

- reduced weight
- decreased neuromuscular performance
- reactive gliosis in the hippocampus and cortex (later in adult life)
- reduced synaptic plasticity
- loss of synaptic immunoreactivity for:
  - synapsin
  - synaptophysin
- defects in the corpus callosum

This does not sound like an opportunity for enhancement. Original paper here.

The knockout mice also do much worse in a cerebral ischemia paradigm, which is probably related to the reactive gliosis in the list above. If APP is involved in resistance to or recovery from infarcts, this is consistent with the observed increase in Alzheimer's and amyloid plaque formation in humans after cerebral infarcts.

Aphasia for Words with Visual Content

A fascinating case of a man who had an infarct in (unsurprisingly) his left occipital lobe and can now speak fluently using abstract words, but not imageable ones. Blogged by Neuroskeptic; here's the original paper.