Friday, 28 March 2014

Not a problem for tomorrow, but for yesterday









Supplementary material

The story of our planet is written in its rocks. The study of rock layers, called stratigraphy, can tell us a lot about the nature of our world in times past. Deeper layers of sedimentary rock correspond to time periods further and further removed from the present, and they can be distinguished from each other by their internally consistent characteristics. The presence of certain fossils can also help us mark the boundary between one time period and another: the appearance of shelled organisms, for example, marks the start of the eon called the Phanerozoic.

"Phanerozoic", you'll say. Is that something like the Jurassic? Yes, yes indeed it is, in that it describes a certain division of time in the story of the Earth. These divisions, just like geographical divisions, include large sections that are subdivided into smaller ones. Just as we have continents subdivided in countries subdivided in regions when it comes to define space, we have supereons subdivided in eras, then periods, then epochs, then ages for our geologic history. The Jurassic, for example, is one of three periods that form the Mesozoic era. It was a time of dinosaurs, though not the time of the tyrannosaurus (no matter what the founder of Jurassic Park thought); the tyrannosaurus would live millions of years later, during the third period of the Mesozoic, called the Cretaceous.

Digging vertically and looking at the remains of past forms of life (mostly in the form of fossils), we are essentially travelling through time: the deeper we dig, the further back we go. This kind of study allows us to see how life changes over time. Some forms of life, recognized by particular body types, sizes and other characteristics, change gradually over millions of years (or remain pretty much the same, depending on the circumstances). Sometimes, certain types beget two or more slightly different subforms that, in turn, accrue more and more differences until they look very different from their ancestor. And sometimes, certain lines just... stop. A particular form that might have been present, albeit in more or less modified shape, in layers going from 150 million to 70 million years ago is no longer found in any layer from that point onward. That's how we understand that this particular lineage went extinct.

In five particularly spectacular instances, we see not ten, not a hundred, but truly massive numbers of lines all disappearing within a very short span. These events are referred to as mass extinctions. The most celebrated is the one that saw the end of the dinosaurs, but it wasn't the most severe.

The most ancient, and the second-worst as far as marine life is concerned, was the Ordovician-Silurian extinction event, which occurred 450-440 million years ago. Paradoxically, this event put an end to a great expansion in animal life forms, as the different phyla that had appeared during the Cambrian were seeing a spectacular diversification. More than 60% of marine life forms were to disappear during this extinction event.

Perhaps less brutal because it was more of a slow burn than a bang (or consisted of several events happening over a relatively short time, geologically speaking), the late Devonian extinction, which likely took place over a few million years around 360 million years ago, killed half the genera living at the time. Among the most severely affected were the reef-building corals; back then, the oceans harboured massive reefs. Trilobites were also pretty hard hit, but a few lines managed to hang on.

Then came the Great Dying, the Permian extinction, which ushered in the Triassic period and the start of the Mesozoic era. Roughly 252 million years ago, this dreadful moment saw up to 96% of all marine life and 70% of land vertebrates vanish from the face of the Earth. (Well, forgive me the lyricism of the image; they didn't actually vanish; I assume they more likely dropped dead.) Marine invertebrates took the worst of it, and that was pretty much it for the remaining trilobites, which had been around for 270 million years. Compared to that kind of death toll, concepts like decimation (by which the Roman army instilled discipline in its men by killing one legionary out of ten) seem tame indeed.

Then came the Mesozoic era, starting with the Triassic period. This would be the era of the large reptiles. At the end of the first period of the era, the Triassic-Jurassic extinction event, 201 million years ago, would once again clear the table. A fifth of all marine families disappeared, and on land almost all the non-dinosaurian archosaurs died off, the crocodilians being the notable exception; only a few therapsid lines made it through as well. The dinosaurs would then rule the land for the two following periods, the Jurassic and the Cretaceous. (That they were around for some 165 million years tells us that they were remarkably good survivors; hence, the derogatory comparison of any obsolete thing or person to a dinosaur strikes me as particularly ill-informed. When our modern hipster cell phone-carrying scoffers have endured 165 million years, they'll have the right to sneer. Not before.)

But tough as they were, dinosaurs were to know their Götterdämmerung, leaving only the lovely but mostly unthreatening birds as their descendants. This extinction is known as the Cretaceous-Paleogene event, but because the Paleogene (with the Neogene) used to form what was known as the Tertiary period, we often informally call the event the Cretaceous-Tertiary, or K-T, crisis.

We're fairly sure that this event was caused, or at least greatly facilitated, by the impact of a very large asteroid (more than 10 km in diameter) that left us the Chicxulub crater in Mexico. Another smoking gun is the presence of a thin layer of clay highly enriched in iridium at the boundary between the Cretaceous and the Paleogene. Iridium is rare at the surface of our planet, but more abundant in certain asteroids and other rocks hitting us from space; the hypothesis is that as it was pulverized upon impact, the Chicxulub asteroid spread its iridium all over the globe.

Mammals appeared roughly at the same time as the early dinosaurs, but it is only with the dinosaurs' passing that we finally got our chance to shine.

Researchers estimate that each of these massive extinctions coincided with important changes in the levels of atmospheric CO2. This gas has several effects: we are all aware that it is a greenhouse gas that can cause the warming of our atmosphere, but it also causes ocean acidification through the reaction CO2 + H2O ⇌ H2CO3. The carbonic acid thus formed attacks and demineralizes the calcium carbonate shells of marine invertebrates... bad news for all involved.
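For the curious, the fuller carbonate chemistry (standard textbook equilibria, nothing specific to any one extinction event) runs like this: dissolved CO2 forms carbonic acid, which sheds protons, and the acid in turn dissolves the carbonate of shells and reefs.

    $$\mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-}$$
    $$\mathrm{CaCO_3 + H_2CO_3 \rightarrow Ca^{2+} + 2\,HCO_3^-}$$

More CO2 in the air means more carbonic acid in the water, which pushes both equilibria toward shell dissolution.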

As great CO2 producers, we'd probably be well-advised to pay attention to what we do with our world.



Thursday, 27 March 2014

Correlation and causation









Supplementary material

A fundamental logical principle is that correlation does not imply causation or, as Julius Caesar would have said on the ides of March if the subject of the day had been sophistry and not tyrannicide, "Cum hoc ergo propter hoc". (Instead, he either said "et tu Brute", "tu quoque, fili mi" or "well, damn!", depending on your sources.)

In French, this is called the stork effect, after the observed correlation between the number of stork nests and the number of babies born. The correlation is not causal (if you know where babies come from, that is. If not, please do not let me rob you of your innocence, and read no further). Let's take an example of a non-causal correlation that gives the impression of being meaningful: the case of the horoscope that turns out to be right. Today, people born under the sign of Leo can expect to have "uncommon energy to undertake new projects". Let's set aside trifles such as how this is a pretty vague prediction that might apply to just about anyone, and take it at face value. First, we should establish how many people are concerned by the prediction; we must determine our sample size. Very roughly speaking, let's divide the current world population (something around 7 222 378 000 souls) by 12 and say there are 602 000 000 Leos around. Out of all these people, there will certainly be a few who will get out of bed this morning feeling like a million bucks, ready to take on the entire world. Reading their morning paper, they would certainly be entitled to think that the astrologer who wrote their horoscope is a genius, since there is a perfect correlation between the prediction and what they observe. Astrology works, baby!

But see, the point we're making is that it's not enough to see a correlation; even before we try to determine whether it is causal, we must demonstrate that it is meaningful and not an illusion. In this case, out of our 600 million Leos, there will be far more who feel either quite normal or even a little tired than bursting with energy. (How do I know? Because that's the case with everybody.) Should a very patient statistician go around and ask every Leo how they felt this morning, it would probably be revealed that there is a far more convincing negative correlation between the horoscope's prediction and the way a typical Leo feels. In other words, one anecdote, ten testimonies, a thousand stories buttressing a case don't mean anything if they do not show that the association we're trying to demonstrate occurs more often than chance alone would predict.
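To put numbers on that intuition, here is a back-of-the-envelope sketch in Python; the 5% "bursting with energy" base rate is my own invention, purely for illustration:

    # Assume the horoscope has zero predictive power, and that on any given
    # morning about 5% of people (an invented base rate) wake up energized.
    N_LEOS = 602_000_000
    P_ENERGETIC = 0.05  # assumed, purely for illustration

    # Expected number of Leos for whom the prediction "comes true" by chance:
    hits = N_LEOS * P_ENERGETIC
    print(f"Leos ready to swear by their astrologer: ~{hits:,.0f}")
    # about 30 million glowing testimonials, from a prediction carrying no information

Thirty million satisfied customers, and the astrologer never had to know anything at all.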

A sad and much-publicized story built on a lousy confusion of correlation with causation gave rise to the autism and MMR vaccine scare, a societal aberration that started with a now-retracted medical paper and branched into all sorts of nonsense regarding the safety or even the efficacy of vaccines in general.

The story starts in the 1990s, with a group of parents of autistic kids who started a class action against the pharmaceutical companies Aventis Pasteur, SmithKline Beecham and Merck, the makers of three MMR vaccines. The basis for the lawsuit, I'd hazard to guess, is that autism is usually diagnosed in the first few years of a child's life (when they're around two and saying "no" all the time), an age that also sees kids receiving a lot of shots. The correlation will be there, of course. The lawyer handling the case, wanting some hard data to shore it up, contacted a researcher who had already published a few papers trying to link certain vaccine strains with bowel disease. A British government outfit called (at the time) the Legal Aid Board started funding the research, and stopped in 2003 because it judged the evidence inconclusive.

In 1998, the researcher published a now-infamous paper reporting a link between MMR vaccines, autism and bowel disease in the very serious journal The Lancet. Naturally, as the paper ran against the grain of pretty much all the scientific literature, it was doubted, decried, criticized and heavily scrutinized. Said scrutiny revealed serious flaws in the work, and ten of its co-authors published a partial retraction in 2004, stating plainly that they saw no link between autism and the MMR vaccine. The paper was finally fully retracted by The Lancet in 2010. Adding to the brouhaha was the revelation that the main author of the paper, several co-workers and even one reviewer had received substantial sums from the plaintiffs' legal team and/or their backers. At the very least, this is a conflict of interest. No wonder the desired effects may have been reported.

Unfortunately, even if the retraction of the paper should have brought down the whole anti-vaccine edifice, the harm was already done. Many people, fearing for their children's health, had started refusing to give them their shots; accordingly, as vaccination rates dropped (from 92% in 1996 to a low point of 79% in 2003 in the UK), mumps cases started increasing (4 204 cases in 2003, 16 436 in 2004, and 56 390 in 2005). Thank heaven, polio and smallpox are no longer at our door; but even so, many children caught perfectly preventable diseases and some of them died the most futile death imaginable: death by negligence.

It's disheartening to see how a large minority can keep an urban legend alive, despite all the evidence of the remarkable efficacy and safety of most vaccines. Smallpox is gone, for crying out loud, thanks to a global vaccination campaign! "Lord, what fools these mortals be", Puck would exclaim, were he around today.

***

The numbers in Bob's graphic are pretty much accurate, as far as my sources are concerned. The numbers regarding chicken consumption come from here, and those concerning GM's market share were calculated from the data given in this article. Other curves showing similar trends (but unrelated to autism, as far as I can tell!) include the number of left-handed people in Australia, average hamburger size since the 1950s, the number of houses with access to a phone and (with a shocking negative correlation value) the number of times a Canadian team has won the Stanley Cup since the 70s.
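The general lesson: any two quantities that merely trend in the same direction over time will produce a spectacular correlation coefficient. A toy demonstration in Python, with invented numbers rather than Bob's actual data:

    # Two made-up series that do nothing but increase over the same ten years.
    chicken_kg = [30 + 0.8 * i for i in range(10)]                   # pretend consumption
    diagnoses  = [5 + 1.1 * i + (-1) ** i * 0.3 for i in range(10)]  # pretend counts

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    print(f"r = {pearson(chicken_kg, diagnoses):.3f}")  # very close to 1.0

Shared trend, not shared cause: time is the hidden variable driving both curves.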



Wednesday, 26 March 2014

Etiquette









Supplementary material

When greeting someone by raising a hand to the brow, we imitate someone removing their hat. That we no longer habitually wear hats didn't affect the gesture, and it persists as a cultural artifact (just as the hairs on our arms rise when we're cold, in a futile attempt to form an insulating layer of pilosity. The poor guys don't realize that they're far too few to do a good job of it, and have probably been for millions of years. Stupid hairs. No wonder they keep growing on our chin while deserting our scalp).

What's the origin of the fingers-to-brow salute? Some say that by removing one's hat, one shows respect by exposing one's entire head. "Here's my face. I'm not hiding anything from you." Other, more fanciful explanations include one that harks back to the Middle Ages and to knights who would raise their visor to make eye contact. The problem I have with that one is that helms with visors were never that frequent in western history, and knights were even less so. Sure, hoi polloi often like to imitate the aristocracy, but it seems to me that such a widespread custom is more likely to have had a fairly uncomplicated origin. And hats were, by and large, pretty much ubiquitous at one time.

An even more universal form of greeting is the handshake. It goes so far back that it would be difficult to say how it originated. Some suppose that the gesture allows us to show we're not carrying a weapon. Fair enough, symbolically speaking, but if that were the whole of the story I would have expected handshakes to involve both hands; for as he's distracting you with one empty hand, who knows what the other guy is hiding behind his back? The bastard!

It's much the same thing with the clinking of glasses during a toast. Popular wisdom has it that the tradition goes back to the Middle Ages (again!) and to the supposed habit (quite exaggerated, methinks) of poisoning one's drinking buddy. By clinking glasses, wine drops from one glass would end up in the other, thus ensuring that no one would try to spike his vis-à-vis's drink for fear of getting poisoned as well. It's a nice story, but as with Rudyard Kipling's Just So Stories, the explanation raises more problems than it solves. To wit: for any amount of liquid to be exchanged between two glasses, they must be very, very full; that's a pretty good way to make sure there will be wine stains on the tablecloth. Normally, people will clink glasses very lightly so as to avoid making a mess, and also to avoid breaking the glasses. That's particularly obvious when beer glasses are involved: most people will clink the bottoms, fat and sturdy, rather than the thin and fragile tops, which clearly does not favour the exchange of beer, poisoned or not. (I once saw friends clink their glass mugs very forcefully, but the only mix produced was on the floor, amid the shattered glass.) Furthermore, even if one managed to get a few drops from one glass into the other, the poison would have to be a pretty powerful one to have an effect on the poisoner in such small quantities. Especially if the nefarious host is Mithridates VI.

Etiquette defines proper social behaviour and will naturally evolve and adapt to new societal developments (including technological ones). It is for example quite inappropriate to make a phone call to someone in the middle of the night, something we all probably realize without having to be told. There are also spontaneously developing codes of behaviour: I doubt very much that anyone ever imposed the rules for proper urinal selection in the men's bathroom, but every gentleman is probably instinctively aware of them.

Switching off one's cell phone (or putting it in vibrate mode) is also appropriate on an important date or at a funeral. It could even be that some future day, we will express our sympathies by shaking slightly, imitating a vibrating cell phone; but since by then phones will have been replaced by some other gadget, nobody will know whence the gesture came. It may even be hypothesized that it goes back to the Middle Ages, when people had to huddle for warmth at night, and when losing a family member meant colder nights ahead.



Tuesday, 25 March 2014

Spam









Supplementary material

A cursory Google search for the word "technology" combined with "rudeness" reveals signs of a growing lack of inhibition in personal interactions over the internet and associated social media. It seems that the relative anonymity, the physical distance between users and the lack of serious consequences for transgressions can generate in many an anti-social attitude, ranging from simple lack of empathy to extremely aggressive behaviour.

This is true of individuals, but it is also true of businesses; businesses that do not have the excuse of being emotional persons prone to all sorts of psychological impulses. Granted, businesses are managed by real-world people, but their decisions are expected to be the result of some kind of strategy, some kind of plan, and not just sudden whims dictated by the moment.

Naturally, businesses don't send threatening e-mails the way a few irate people do. Their goal is to get our money, not to scare us (unless they find a way to scare us into giving them money). But what they do, and with very little subtlety, is subject us to a barrage of advertising. In the case of some websites, pop-up ads and slow-loading animated publicity sidebars can make a visit very unpleasant (and often fruitless), as we can hardly distinguish the website's message from all the advertising noise. That's par for the course, I suppose, as nobody forced us to visit that website; and such sites should eventually lose traffic and disappear. So no big deal in the long run.

For e-mail, printed fliers and telephone calls, it's a different thing. In those cases we must sacrifice some of our time and attention to acknowledge the existence of, and get rid of, unwanted messages. It could be argued that it doesn't take much time to simply delete an unwanted e-mail, to hang up the phone, or to put fliers in the recycling bin. All of which is true. Furthermore, filters can be installed to keep most junk mail from ever reaching our attention. But in the communication ecosystem we witness a co-evolution: junk mail generators grow ever more clever at eluding anti-junk filters, while more and more genuine messages get treated as junk (thus forcing us, if we don't want to miss anything, to look through the long list of junk mail for anything that might be genuine). And dealing with all this junk does take time in the end.

The race between parasites (junk mail) and hosts (our communication devices) is nothing new, and I have faith that most systems will manage to stay one step ahead of the tide of junk that tries to get our attention. However, it looks as if simple rules of civil behaviour are being lost no matter what we do. Consider phone solicitation, as rude a form of communication as one can imagine. Calling people at dinner time to ask them if they're happy with their current car insurance is bad enough, but now we've gone one step further... it's a bloody machine that calls people at ungodly hours to deliver a recorded message. Are there actually people listening to such messages instead of slamming down the phone? Are there people who will actually give money to the company that rigged the whole operation? Probably very, very few, but even a tiny fraction of a massively huge number of calls can generate a profit. This principle was the basis for a famous 1995 book written by two lawyers, Mr. Canter and Ms. Siegel: How to Make a Fortune on the Information Superhighway: Everyone's Guerrilla Guide to Marketing on the Internet and Other On-Line Services. The gist of the strategy is this: flood the world with publicity for whatever you're selling, and out of the teeny tiny percentage who respond you will make good money. This would not work with printed fliers or TV ads because of the costs involved, but with e-mail it becomes all too easy.
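The arithmetic of the scheme is depressingly simple; a sketch with assumed numbers (the rates below are invented for illustration, not taken from the book):

    # Back-of-the-envelope spam economics; every figure here is an assumption.
    messages_sent = 10_000_000
    cost_per_message = 0.00001    # sending e-mail is nearly free
    response_rate = 0.0001        # one recipient in ten thousand bites
    profit_per_response = 20.00   # dollars earned per sale

    cost = messages_sent * cost_per_message
    revenue = messages_sent * response_rate * profit_per_response
    print(f"cost: ${cost:,.2f}  revenue: ${revenue:,.2f}  profit: ${revenue - cost:,.2f}")
    # With these numbers: cost $100, revenue $20,000. Hence the state of your inbox.

With postage or air time, the cost line kills the scheme; with e-mail, it doesn't.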

This type of strategy naturally means that actual information will be lost in a wave of irrelevant noise, as in the famous Monty Python sketch that gave these unwanted messages their popular name: spam. In this 1970 sketch, the famous canned meat product is ubiquitous, and communication is regularly interrupted and made difficult by Vikings chanting the word "spam". (It is delightfully absurd, of course, as is pretty much any Monty Python skit.) The word "spam" itself, used in this context, goes back to 1993 according to this source (and references therein).

What galls me the most at present is how the majority of spam messages today end with some fine print graciously informing us that we can "unsubscribe" from the mailing list (as if we had ever subscribed in the first place!!!) to avoid receiving further communications from this particular source. Well, first, one doesn't just spit in the soup and say "if you don't like it, I won't do it again". That is not appropriate behaviour, quite obviously. Secondly, judging from personal experience, these senders regularly get new e-mail lists from a variety of sources. Even if we unsubscribe after receiving some junk message and our address is duly removed from the current sending list, it will just pop up again on the sender's computer sooner or later. I keep unsubscribing from the same sender's list on a regular basis.

It's not a major crisis; it can squarely be called a rich person's problem. But what it says about the progressive loss of respect for the individual in our society is pretty sad. Simple politeness, which businesses at least used to pretend to observe so as not to antagonize their clientele, seems today to be as threatened as the Amur leopard.

Monday, 24 March 2014

Rejection











Supplementary material

The Bible, the first book to come out of Gutenberg's moveable-type printing press, is an anthology of texts that were written over the course of several centuries. As Christianity developed in the Mediterranean basin, many texts were shared and read by the early adopters of the new faith: the traditional sacred texts of the Tanakh, naturally, but also descriptions of the life of Jesus, tales of the early apostles, letters from disciples, and prophecies regarding the second coming of the messiah. Some of these new texts were written not in Hebrew but in Aramaic or even in Greek.

When it comes to religions based on scripture, many texts are supposed to have been given, dictated or at least inspired by a divine being. Naturally, in practical terms, there comes a time when someone must decide what, amid all the available material, has been divinely inspired and what has merely been written by inspired and enthusiastic mortals without official divine guidance. In the case of Mormonism, for example, the task is a relatively easy one, since the Book of Mormon was first published in 1830 by Joseph Smith and was not collected from disparate and perhaps conflicting sources. For other religions it is a little more complicated. Some of them simply do not have one single sacred reference; as for the Abrahamic religions, the composition and compilation of their sacred books (which overlap somewhat, either in actual chapters or at least in subject matter) are the subject of many, many scholarly works.

In the case of the Qur'an, the revelation to Muhammad is said to have occurred all through his life; whether the compilation of the revelations was performed during his lifetime or was begun right after his death under the first caliph, Abu Bakr (632-634), is under dispute. Ali, the Prophet's son-in-law, is for example said to have been charged by the Prophet himself with writing down the text of the Qur'an, which he would probably have done in 631-632. What is generally agreed upon is that the canonical version of the book was adopted under caliph Uthman (644-656); at that time, all personal copies (potentially tainted with errors or containing parts that disagreed with the canonical version, I imagine) were ordered burned.

With the Christian Bible, emerging as it did from the Jewish world, it was a given that the Tanakh would still be regarded as "official". The book Christians use today is divided into two parts: the Old Testament, which describes events prior to the birth of Jesus, and the New Testament, which starts with his birth and continues with the deeds of his apostles. The Old Testament includes classical books that are not found in the Jewish Bible, such as the books of Judith, Tobit, Maccabees and a few others, known as the deuterocanonical books.

With the New Testament, compilers had to deal with more recent texts; roughly 200 years of discussions were required among ecclesiastical authorities to determine what should be seen as canon and what should be rejected. (Fittingly enough, the Apocalypse was the last book to be accepted by everyone!) Different synods convened to discuss these matters, and although the current Catholic canon may already have been defined at Hippo Regius in 393, the acts of that particular council have been lost. The council of Carthage, in 397, did however read and accept a summary of those acts; we can assume that by that time, the Bible as a single book had pretty much taken the form we know today.

The biblical canon would however still see a few changes over the centuries; even today, different branches of Christianity do not always agree on the status of certain books seen as canonical by some and apocryphal by others.

As for the reviewers mentioned in the cartoon above, they are the people tasked with reading and (severely!) criticizing scientific reports submitted to journals that use peer review as a way to ensure quality. Their job is to find any fault in the work, whether in experimental design, controls, analysis or interpretation. They act anonymously, and are expected to be fair-minded but as thorough and critical as possible. Sometimes there will be two, sometimes three; not infrequently, they will not agree 100% in their analysis (in which case the editor can decide what to do, although all reviewers must generally be satisfied with a paper for it to be published). Peer review has been criticized as not being a perfect system, a criticism with which I agree (but what system is perfect?). One thing is certain, however: it puts a lot of pressure on scientists when it comes to the soundness of their work. That's why "real" science is found in peer-reviewed journals and nonsensical pseudo-science can be found everywhere else.

Because out of the three reviewers there is often one who will be really, really critical of a submitted paper and prevent it from being published, "reviewer #3" has become something of a cultural icon in scientific circles.



Friday, 21 March 2014

Sensitive doctor











Supplementary material

Look, no matter what we do, we're going to die some day. Our mortal flesh did not evolve so that it would simply keep going on, but so we could reproduce and see to it that our offspring, in turn, had a good chance of propagating their (and our) genes. On an individual basis, we can already be glad that increased sanitation, better nutrition, protection from the elements and science-based medicine pushed average life expectancy to something in the early 80s, well past the point where our continued existence can have a significant impact on the survival of grandchildren and great-grandchildren. In other words, we are blessed with longer lives than most of our ancestors could have hoped for, and it doesn't cost us much.

Unfortunately, there are circumstances that can cut short the thread of our life. Accidents, naturally, can put an end to our days suddenly and unexpectedly. Diseases, as well. In both cases, there are steps we can take to mitigate the risks.

Just one example: thanks to vaccination, we have destroyed or pretty much contained the monsters that were smallpox and tuberculosis. Consider the measure of our success: from a staggering death toll of 300 million people imputable to smallpox in the 20th century, we went down to zero when the disease was finally eradicated in 1979. In fact, we were so successful in this endeavour that many people today fear vaccines more than they do the horrible diseases those vaccines helped destroy; some will even go so far as to claim that these diseases went away by themselves. Oh, the humanity.

Another step we can take to increase our chances of living a long life is to detect signs of certain cancers as early as possible. Cancer is, generally speaking, a disease involving cells that divide uncontrollably. This proliferation can disrupt the normal function of the body and lead to death. Now, in many cases, the proliferation will start in one spot and form a mass of proliferating cells that we refer to as the "primary" tumour. The good news, if we catch it at this stage, is that we may be able to cut it out of the body (provided it's not in some critical spot); if we manage to remove all the out-of-control cells in the operation, then the cancer is gone. Were we to wait and wait and wait, we'd give the tumour the chance to shed some of its cells; these cells would travel around the body like dandelion seeds and start secondary tumours elsewhere. These new tumours, called metastases, are much harder to remove, since there are so many of them all over the place. Hence the need for speed at the detection stage.

Some cancers with a high prevalence are, luckily enough, fairly easy to spot. Most men will undergo the uncomfortable and *ahem* invasive procedure that allows a doctor to check whether the patient's prostate is hypertrophied, a possible early sign of prostate cancer. (The American Cancer Society states that prostate cancer is the second most common in American men, with 233 000 new cases and roughly 30 000 deaths annually out of a population of about 150 million.)

Among women, the most prevalent cancer is breast cancer, with similar numbers: 232 340 new cases and 39 620 deaths in 2013. Self-examination and mammograms are used to try and spot signs of a tumour, which would feel like a small, hard mass. (Unlike prostate cancer, however, breast cancer does not mostly restrict itself to older people; its societal impact is therefore greater.)

Although some men will also suffer from breast cancer, a few other cancers are restricted to women: endometrial cancer showed up in 49 560 new cases in 2013, a year that saw 8 190 lives lost to the disease. Ovarian cancer had 22 240 new cases and 14 030 deaths. Cervical cancer was found in 12 340 new cases and caused 4 030 deaths.
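For scale, those raw counts can be turned into rough annual rates per 100 000 people, reusing the back-of-the-envelope figure of about 150 million Americans of each sex quoted above:

    # Rough rate conversion from the figures quoted in the text (2013 data).
    cancers = {
        # name: (new cases, deaths)
        "prostate":    (233_000, 30_000),
        "breast":      (232_340, 39_620),
        "endometrial": (49_560, 8_190),
        "ovarian":     (22_240, 14_030),
        "cervical":    (12_340, 4_030),
    }
    population = 150_000_000  # approximate US population per sex
    for name, (cases, deaths) in cancers.items():
        print(f"{name:>11}: {cases / population * 1e5:6.1f} new cases, "
              f"{deaths / population * 1e5:5.1f} deaths per 100 000 per year")

Nothing rigorous (age structure is ignored entirely), but it gives a sense of the relative weight of each disease.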

The good news regarding the latter is that we know pretty much how it develops: most cases are caused by a virus, the human papilloma virus (HPV). And we now have vaccines against the damned thing! With a little luck, and if the anti-vaccine flat-Earthers do not convince too many people that vaccination causes spontaneous combustion or whatever disease is currently fashionable in the Hollywood jet set, we may soon lower the death toll of cervical cancer to a tiny fraction of what it was in the past. (Men should also get vaccinated, since on top of keeping them from potentially infecting their significant other, it will also protect them from genital warts. Now there's a disgusting word.)

Yes, but what about the rest of us who might already be carriers of the virus? That's where the Papanicolaou test (the Pap test, for short) comes in (pun unintended). The test relies on collecting cells from the endocervical canal and looking at them under a microscope. Pre-cancerous cells do not look like normal epithelial cells, and their presence will inform the doctor and the patient that closer scrutiny might be appropriate. This precautionary step can cut mortality by as much as 80%!

Thursday, 20 March 2014

A rose by any other name











Supplementary material

There is no firm rule when it comes to naming a new gene, a new protein or a new mutation. Some names are resolutely minimalist and flatly descriptive, as with one of the most important proteins involved in the control of cell proliferation: p53 simply means "protein, 53 kilodaltons". Way to go, guys! How are we to shake off the scientist's public image of a boring stiff if we keep giving such names to things?

Luckily, we also have plenty of names involving a little more imagination. Somehow, for historical reasons, this is particularly true of genes involved in development. Some of the earliest genes studied in this field caused malformations when they were mutated, and each gene was given a name describing said malformation: a mutated hedgehog, for example, caused spikes to appear on the cuticle of the drosophila embryo. A mutant daughterless gene would interfere with follicular development, resulting in all-male offspring. And the habit caught on, with fanciful names like gurken or knirps (pickle and umbrella in German, describing the shape of the embryo), decapentaplegic, sex combs reduced, pumilio, cactus, scylla and charybde, and many, many more.

When the mammalian homolog of hedgehog was identified, it was called Sonic hedgehog, a reference to the video game character. Pikachurin was named after the Pokémon character, but an actual gene first called Pokemon (for "POK erythroid myeloid ontogenic factor", a fine case of reverse-engineering of acronyms) suffered a name change when the owners of the Pokémon trademark protested that their property's name was being associated with a cancer-causing gene. It is now called ZBTB7A.

Before we give the impression that scientists are either beige, unimaginative stiffs or wild video game addicts, let us mention the genes tudor, valois, staufen and vasa, named after four European royal lines that ended for lack of descendants; these four genes are involved in fertility, and their mutation makes a drosophila sterile. Yes, scientists love general culture as well!

Some names are very descriptive and give a little more info than just "p53". BRCA1, for example, stands for "breast cancer gene 1"; TICAM2 stands for "TIR domain-containing adaptor molecule 2". As you can see, acronyms or quasi-acronyms are used to make unmanageably long names a bit easier to pronounce. But how does one choose to pronounce BRCA1? Do you go "Bee-Arr-Cee-Ay-One" or "Brecca One", or even "Barca One" if you have Carthaginian blood? Here again there is no fixed rule. The Autonomously Replicating Sequence (ARS) is pronounced Ay-Arr-Ess wherever "arse" is considered a bad word. Tradition and usage will usually decide how an acronym is pronounced; a bit as with YHWH, really, which has given us Yahweh or Jehovah, depending on which vowels one decides to add. (Ancient Hebrew texts, where short vowels were not written down, might indicate where such a vowel should be introduced by adding dots in certain spots, but since everyone knew how to pronounce YHWH, more precision was not needed.)



Tuesday, 18 March 2014

PSSSHHHHT!!!







Supplementary material

Most liquids contain dissolved gases, and can hold only a certain number of gas molecules at a given pressure (the limit being known as the saturation point). By increasing the pressure, it is possible to increase the number of molecules dissolved in the liquid. If the pressure is then abruptly released, the liquid finds itself supersaturated, and the excess dissolved molecules try to escape, shifting back into a gas. This is how we put the fizz in soft drinks: by saturating them with CO2 under pressure. Opening the bottle or can releases the pressure, and bubbles form.
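The relationship at work here is Henry's law: at equilibrium, the concentration C of a dissolved gas is proportional to its partial pressure p above the liquid,

    $$C = k_{\mathrm{H}}\, p$$

where the constant k_H depends on the gas, the liquid and the temperature. Pop the cap, and p drops from a few atmospheres to a tiny fraction of one; the liquid suddenly holds far more CO2 than the new equilibrium allows, and the excess has to leave.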

Now, it is clear that in a normal soft drink the gas escapes as a series of bubbles which have only a few points of origin, called nucleation points: the CO2 doesn't escape all at once from the liquid (otherwise, what would be the point?). These nucleation points, where bubbles start forming, are usually solid surfaces like a scratch on the inside of the bottle, a small piece of dust, a slice of lemon plunked into the glass or even local fluctuations in the liquid's molecular distribution.

In the famous Mentos and Diet Coke experiment, one or several Mentos mints are dropped into a soft drink (preferably a 2 L bottle). The surface of the candy, made of layers of sugars and gum arabic, is apparently home to a very large number of irregularities that serve as nucleation points. The CO2, so eager to escape its dissolved state, thus has the opportunity to start doing so simultaneously from many different points, creating an instant, thick foam of bubbles that violently bursts out of the bottle's neck.

Since the Mentos is also dense, it sinks to the bottom of the bottle, making the process even more efficient (instead of having the candy dance on the surface, floating on a bed of bubbles with part of its surface isolated from the liquid). What's more, the gum arabic in the candy lowers the surface tension of water (it is a surfactant), meaning that the gas needs less energy to escape the liquid.

Friday, 14 March 2014

Abstinence







Supplementary material

Bdelloid rotifers are microscopic invertebrates living in fresh water and moist soil. As I recall, they are commonly observed in soil samples (along with paramecia, rotifers are what I remember most from my high school classes involving a microscope). They travel around by swimming or by crawling in a leech-like way, using their head and foot in alternating steps. ("Bdelloid" means "leech-like".)

Their most remarkable characteristic is that they have been reproducing asexually for millions of years. There are no male bdelloid rotifers; only females, whose eggs are produced by mitosis instead of meiosis. Genetic analysis of the roughly 360 parthenogenetic species of bdelloid rotifers suggests that their most recent common ancestor lived around 80 million years ago.

Why is that interesting? After all, parthenogenesis is common in the living world. Bacteria reproduce asexually. Yeast can reproduce either sexually or asexually. Plants can spread by apomixis as well as by sexual reproduction. Even among animals, where sexual reproduction is the norm, we know of cases of "virginal births". For example, turkeys have given birth to offspring without fertilization (all males, because in birds the sex chromosomes ZW make a female and ZZ a male, and the eggs involved here were the result of haploid eggs doubling their genetic content, the WW combination being lethal). Komodo dragons have given birth to babies without mating. Sharks were born to partner-less mothers in aquariums located in Nebraska and Virginia. We know of strains of the brine shrimp Artemia that have been parthenogenetic for quite a long while. And the famous lizard Cnemidophorus neomexicanus is only one of several parthenogenetic lines of lizards in which only females are present. So what's the big deal about bdelloids? Well, it's really the 80 million years figure.

From what we can tell, a general rule regarding the development of parthenogenesis in animals is that it's bad news in the long term. We came to that conclusion by observing that parthenogenetic animal lines die out sooner or later, but sooner more often than later. That is understandable from different viewpoints: naturally, a parthenogenetic line has little allelic variation (only that conferred by individual mutations carried down a specific sub-line), and so the chances of producing individuals better adapted to this or that change in selective pressures are smaller than in a sexual species. Furthermore, with no meiotic recombination, each chromosome (even if they are paired) is on its own when it accrues point mutations, deletions or insertions; it is a victim of Muller's ratchet. (It is actually quite interesting to observe how the two chromosomes in each pair grow increasingly different from their partner!) (1)
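Muller's ratchet is easy to see in a toy simulation. A minimal sketch in Python; the population size and mutation rate are arbitrary choices, and selection is left out entirely for brevity:

    import random

    # Each individual is just a count of the deleterious mutations it carries.
    POP, MUTATION_RATE, GENERATIONS = 200, 0.3, 500
    population = [0] * POP

    for _ in range(GENERATIONS):
        # Asexual reproduction: each offspring copies a random parent's
        # mutation count, sometimes gaining one more. With no recombination,
        # a lineage can never shed the mutations it has accumulated.
        population = [
            parent + (random.random() < MUTATION_RATE)
            for parent in random.choices(population, k=POP)
        ]

    print("fewest mutations carried by anyone:", min(population))
    # Once the least-mutated class is lost by chance, it is gone for good:
    # the minimum only ever ratchets upward.

In a sexual population, recombination can reassemble a low-mutation genome from two mutated parents; that is precisely the repair mechanism an asexual line gives up.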

So, the bdelloids are real oddballs when it comes to parthenogenesis, having not only survived 80 million years but showing no apparent distress. A possible explanation for their amazing success in compensating for the sex-derived genetic variation that we all take for granted is that bdelloids are amazingly good at appropriating foreign DNA sequences, even those of their bacterial food. They apparently do so when repairing their own damaged DNA, and this massive horizontal gene transfer could very well compensate for what was lost with the abandoned meiosis.

As for purity rings, they are cultural artifacts indicating that the wearer intends to remain chaste until their marriage. They were made popular in the 1990s among American conservative-minded groups who promoted abstinence for teens.


(1) Dawkins, R., The Ancestor's Tale, Mariner Books, New York, 2004, p. 429.

Tuesday, 11 March 2014

Jargon





Supplementary material

The epiploon (or greater omentum) is a large fold of the peritoneum that extends down, apron-like, over the intestines.

Friday, 7 March 2014

Voodoo economics







Supplementary material

"Voodoo economics" is a term coined by George H. W. Bush in 1980, as he was opposing Ronald Reagan in the Republican primaries. It refers to economic policies that are expected to deliver certain results even though the link between said policies and results has never been demonstrated. To wit: lowering taxes for the rich was supposed to make them spend more, thus stimulating production, thus creating jobs, thus improving everyone's economic situation. (What we have observed since then is an ever-increasing gap between the most well-off and the most hard-pressed.)

Baron Samedi is a loa from the Haitian Vodou religion. He is usually represented wearing a black tuxedo, but when I drew him that way the entire cartoon came out altogether too dark; I therefore resorted to a white-tux version evoking the one seen in the James Bond movie Live and Let Die.



Thursday, 6 March 2014

You win some, you lose some








Supplementary material


The Mars Climate Orbiter was a probe meant to orbit the planet Mars and study its atmosphere, as well as act as a transmission relay for the Mars Polar Lander. It was lost in 1999 as it was trying to reach its orbit, because one piece of ground software produced results in non-SI units (pound-force seconds instead of newton-seconds) that the navigation software read as SI. This caused the computed trajectory of the probe to be inadequate for orbital insertion, and the orbiter disintegrated in the Martian atmosphere.
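Schematically, the mismatch looked like this (a Python illustration of the unit error, not the actual flight software; the impulse value is invented):

    # 1 pound-force = 4.448222 newtons.
    LBF_TO_N = 4.448222

    impulse_reported = 100.0             # produced in pound-force seconds
    impulse_as_read  = impulse_reported  # read as newton-seconds, no conversion (oops)
    impulse_actual   = impulse_reported * LBF_TO_N

    print(f"navigation modelled: {impulse_as_read:.1f} N*s")
    print(f"thrusters delivered: {impulse_actual:.1f} N*s, "
          f"{impulse_actual / impulse_as_read:.2f} times more than modelled")

Every thruster firing thus nudged the probe about 4.45 times harder than the navigation model believed, and the discrepancy accumulated over months of cruise.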

A silly mistake that cost over 300 million dollars. Still, as far as accurately sending projectiles goes, an impressive feat.



Wednesday, 5 March 2014

At the museum









Supplementary material



Darwin's frogs have a way of caring for their brood that might seem quite odd when witnessed for the first time: the male stands guard over the roughly 40 eggs that were laid by the female, and as soon as the embryos start moving he ingests them. Don't worry, though: he's actually keeping the kids in his vocal sac.

There the embryos hatch into tadpoles and they'll remain in their dad's vocal apparatus until metamorphosis, at which point they hop out of his mouth to start their frog life.

"Where do babies come from?" is a question much easier to answer when you're a Darwin's frog: "from daddy's mouth, dearie".