Sunday, February 28, 2010
Witch, plague victim, vampire -- a real trifecta of European nightmares.
Gordon-Reed begins by reconstructing the background and life of Sally's parents, the "bright mulatto" slave Elizabeth Hemings and John Wayles, English migrant, lawyer and debt collector, whose daughter Martha married Thomas Jefferson in 1772. Hence, notoriously, Sally was the half-sister of Jefferson's wife and the two were said to look alike. For some white Southerners, this quasi-incest has served as an explanation, because Jefferson can be seen as having reached out to his dead wife through Sally. A less familiar fact . . . is that, because there was interracial sex and resultant children over two generations -- Wayles with Elizabeth Hemings, Jefferson with Sally Hemings, assorted white employees and neighbours with other Hemings women -- Monticello was an intricately claustrophobic site of family ties, acknowledged and hidden. Slaves and free people had relationships based on power and dependency, but also on blood. Sally was a maid to Jefferson's daughter Polly, but Sally was also Polly's aunt, and Sally's children by Jefferson were Polly's cousins, just as Polly's son by Francis Eppes was cousin to Sally's children. These people could be mistaken for each other by strangers. This consanguinity was a central fact at Monticello, for the Hemingses formed a mulatto elite, and they kept aloof from darker slaves, with whom they almost never intermarried.
At Wired, an alarming piece on Ug99, a new variant of stem rust that threatens the global wheat crop and therefore the food supply of a few billion people:
The enemy is Ug99, a fungus that causes stem rust, a calamitous disease of wheat. Its spores alight on a wheat leaf, then work their way into the flesh of the plant and hijack its metabolism, siphoning off nutrients that would otherwise fatten the grains. The pathogen makes its presence known to humans through crimson pustules on the plant’s stems and leaves. When those pustules burst, millions of spores flare out in search of fresh hosts. The ravaged plant then withers and dies, its grains shriveled into useless pebbles. . . .

Have a nice day.
The pathogen has already been detected in Iran and may now be headed for South Asia’s most important breadbasket, the Punjab, which nourishes hundreds of millions of Indians and Pakistanis. What’s more, Ug99 could easily make the transoceanic leap to the United States. All it would take is for a single spore, barely bigger than a red blood cell, to latch onto the shirt of an oblivious traveler. The toll from that would be ruinous; the US Department of Agriculture estimates that more than 40 million acres of wheat would be at serious risk if Ug99 came to these shores, where the grain is the third most valuable crop, trailing only corn and soybeans. The economic loss might easily exceed $10 billion; a simple loaf of bread could become a luxury. “If this stuff gets into the Western Hemisphere,” Steffenson says, “God help us.”
Saturday, February 27, 2010
Friday, February 26, 2010
Some others I particularly like:
3 Never use a verb other than "said" to carry dialogue. The line of dialogue belongs to the character; the verb is the writer sticking his nose in. But "said" is far less intrusive than "grumbled", "gasped", "cautioned", "lied". I once noticed Mary McCarthy ending a line of dialogue with "she asseverated" and had to stop reading and go to the dictionary.

Margaret Atwood:
4 Never use an adverb to modify the verb "said" . . . he admonished gravely. To use an adverb this way (or almost any way) is a mortal sin.
8 You can never read your own book with the innocent anticipation that comes with that first delicious page of a new book, because you wrote the thing. You've been backstage. You've seen how the rabbits were smuggled into the hat. Therefore ask a reading friend or two to look at it before you give it to anyone in the publishing business. This friend should not be someone with whom you have a romantic relationship, unless you want to break up.

Hilary Mantel:
3 Write a book you'd like to read.

Joyce Carol Oates:
6 Keep in mind Oscar Wilde: "A little sincerity is a dangerous thing, and a great deal of it is absolutely fatal."

Colm Toibin:
2 Get on with it.
Over recent decades, autism and other developmental disorders also appear to have proliferated, along with certain cancers in children and adults. Why? No one knows for certain [but] suspicions are growing that one culprit may be chemicals in the environment. An article in a forthcoming issue of a peer-reviewed medical journal, Current Opinion in Pediatrics, just posted online, makes this explicit.

The chief author is Dr. Philip J. Landrigan, a prominent pediatrician. He told Kristof that while the science is still iffy, he is personally certain that most autism is caused by toxins.
The article cites “historically important, proof-of-concept studies that specifically link autism to environmental exposures experienced prenatally.” It adds that the “likelihood is high” that many chemicals “have potential to cause injury to the developing brain and to produce neurodevelopmental disorders.”
“The crux of this is brain development,” he said. “If babies are exposed in the womb or shortly after birth to chemicals that interfere with brain development, the consequences last a lifetime.”

Some small studies have indicated increased risk of autism and other developmental abnormalities from a variety of drugs and other chemicals:
Suspicions of toxins arise partly because studies have found that disproportionate shares of children develop autism after they are exposed in the womb to medications such as thalidomide (a sedative), misoprostol (ulcer medicine) and valproic acid (anticonvulsant). Of children born to women who took valproic acid early in pregnancy, 11 percent were autistic. In each case, fetuses seem most vulnerable to these drugs in the first trimester of pregnancy, sometimes just a few weeks after conception.

None of these studies have been replicated on a large scale, so none of this can be considered very certain. All such studies suffer from the problem that we are exposed to so many bio-active chemicals, natural and man-made -- before you get outraged over the dangers of modern life you should remember that soot from wood fires is full of dangerous poisons -- that we may never be able to figure out which chemicals or combinations of chemicals cause which problems. It is a dangerous world we inhabit.
Not this year. This morning light snow is falling around my house, and as I stood on the train platform I was scoured by a bitter wind that turned the falling flakes into little icy missiles. My yard is still buried in snow from the Great Storm of 2010, and nothing is green but the rhododendrons and hollies -- at least, the branches that weren't broken by the weight of the snow.
Ah, well, you can't be lucky every year.
Thursday, February 25, 2010
My married friends with kids don’t spend that much time with their husbands anyway (between work and child care), and in many cases, their biggest complaint seems to be that they never see each other. So if you rarely see your husband—but he’s a decent guy who takes out the trash and sets up the baby gear, and he provides a second income that allows you to spend time with your child instead of working 60 hours a week to support a family on your own—how much does it matter whether the guy you marry is The One?
Wednesday, February 24, 2010
In this paper I develop a dynamic programming model of the joint career and marital decisions of young men between the ages of 16 and 39. The results show that labor market decisions are strongly influenced by their returns in the marriage market. If there were no returns to career choices in the marriage market, men would tend to work less, study less, and choose blue-collar jobs over white-collar jobs. These results suggest that the existing literature underestimates the true returns to human capital investments by ignoring their returns in the marriage market.

Although, as one of the commenters on Matt Yglesias' blog put it, this might be
Because there’s no phenomenon for which economists can’t create a model that confirms what they already believe.
The site isn't just old, it redefines old: the temple was built 11,500 years ago—a staggering 7,000 years before the Great Pyramid, and more than 6,000 years before Stonehenge first took shape. The ruins are so early that they predate villages, pottery, domesticated animals, and even agriculture—the first embers of civilization. In fact, Schmidt thinks the temple itself, built after the end of the last Ice Age by hunter-gatherers, became that ember—the spark that launched mankind toward farming, urban life, and all that followed. . . .
The new discoveries are finally beginning to reshape the slow-moving consensus of archeology. Göbekli Tepe is "unbelievably big and amazing, at a ridiculously early date," according to Ian Hodder, director of Stanford's archeology program. Enthusing over the "huge great stones and fantastic, highly refined art" at Göbekli, Hodder—who has spent decades on rival Neolithic sites—says: "Many people think that it changes everything…It overturns the whole apple cart. All our theories were wrong."
Schmidt's thesis is simple and bold: it was the urge to worship that brought mankind together in the very first urban conglomerations. The need to build and maintain this temple, he says, drove the builders to seek stable food sources, like grains and animals that could be domesticated, and then to settle down to guard their new way of life. The temple begat the city.
Interviewer: How about venison? Were there much deer up in this area, when you were growing up?
Ms. Draper: None.
Ms. Alland: Yeah -- that's what Hidy Wilhide said, he said 1930-some he saw a deer and he said people came from miles just to see the tracks.
WHEN fearsome Baltic pirate Klaus Stortebeker was executed 600 years ago, his headless body is said to have walked 12m along the length of Hamburg quayside.
He had struck a deal with the elders of the port: any of his 70 men he managed to pass in his post-decapitation walk should be spared. The quivering corpse passed 11 fellow pirates before the executioner put out a foot and tripped him up.
Little wonder, then, that the skull of Stortebeker has fascinated Germans for so long -- and that its theft from a Hamburg museum last month has kept police busy. They interrogated members of the often reckless FC St Pauli fan club and dug deep into the city's Goth scene, before concentrating on a new possibility: that the pirate's skull has become a trophy in the turf wars between rival biker gangs. . . .
Stortebeker is regarded as a Robin Hood or even a Che Guevara figure by many north Germans because he robbed the rich merchant ships of the Hanseatic League. However, evidence of him redistributing his booty to the poor is scarce. Legend has it that after his execution, Hamburg senators found the masts of his ships had cores of gold and silver.
The possibility that Stortebeker, who was decapitated in October 1401 (or a year earlier, by some accounts), aged 40, was little more than a bloodthirsty crook has not detracted from his iconic status. He has a statue honouring him in Hamburg and a brewery in Stralsund named after him. "The skull is an important relic of Hamburg history," said Hamburg Museum director Lisa Kosok. "It is priceless." It disappeared for a few centuries but re-emerged in 1878 during excavations to expand Hamburg harbour. The age of the skull was confirmed in 1999.
The Hamburg Senate failed to keep its promise to Stortebeker and the 11 men were not spared. After chopping off the heads of all of Stortebeker's pirates, the executioner was asked if he was not a little tired. He replied that he had enough energy to execute the Senate elders as well. This was probably intended as a joke -- but the Senate ordered the executioner to be beheaded.
I'm going to have to find out the source of these stories. You can read a brief biography of Stortebeker here.
Tuesday, February 23, 2010
--Ned Block and Philip Kitcher, responding to a philosophical attack on the logic of natural selection
I am reminded of David Hume, who remarked that while there is no way to refute Bishop Berkeley's argument that the world doesn't really exist, it just isn't very convincing.
When it comes to matters of opinion or personal beliefs, it is absolutely the duty of the news media to report both sides (and any extra sides there may be, on those rare odd occasions when there are somehow more than two). It doesn't matter which one they agree with, they need to acknowledge the fact that some people think gay marriage is a right and others think the gays are forming a unicorn army that will kill us all.
When it comes to matters of fact, however, they absolutely do not have that duty. Particularly when it comes to technical or scientific matters where it takes somebody with training to speak knowledgably on the subject.
If we're talking about whether, say, vaccines cause autism, we need to hear from scientists. That's a scientific issue. We do not need to hear from Jenny McCarthy or Jim fucking Carrey, in the name of giving "both sides." Jim and Jenny don't get a side. They have no background in the subject, and it's one that requires fucking background.
Sure, they can talk about poisonous vaccines to Oprah or whoever is sitting next to them at the Lakers game all they want. They have freedom of speech. That freedom does not guarantee them a seat on a panel of experts. . . .
But we can't just disregard their opinions, can we? Yes. Yes we can. If you're going to weigh in on a scientific matter, you need to bring data, gathered by people who know what the fuck they're talking about. If the subject is medicinal marijuana, we're not going to quote a stoner who has suddenly realized his hands can talk.
As with most Phase I trials of cancer drugs, the people being treated were very sick and considered beyond help with other methods, so that the drug was able to keep 15 of 16 patients alive and reverse the disease in nine is very impressive.

The Phase I study examined primarily melanoma patients whose tumors were shown on genetic sequencing to contain the V600E mutation. A total of 21 patients were treated with escalating doses of the drug. The drug appeared to be well tolerated at doses providing projected therapeutic drug levels, although dose limiting toxicity was ultimately reached. A small cohort (3 patients) of papillary thyroid cancer patients was also treated.

The results were remarkable. Of the 16 melanoma patients with the V600E mutation, 9 had partial responses, 6 had stable disease and one had progressive disease as their best responses. None of the 5 non-mutated melanoma patients had any responses. Also, of the three papillary thyroid cancer patients treated, one had a partial response and two had stable disease.

The clinical and radiologic effects were reported as rapid and dramatic. Tumor shrinkage at multiple organ sites was seen, including liver, lung and bone. (No patients with brain metastases were treated in this trial.) Several patients also had dramatic clinical improvement in symptoms associated with their disease, such as pain.
Our knowledge of the genetics of cancer has been growing for 30 years now without leading to much in the way of help for cancer patients. Perhaps this latest generation of drugs will change that.
Monday, February 22, 2010
In deciding not to refer charges to state bar committees, Margolis does not tell us that Yoo and Bybee behaved admirably or according to the high standards that we should expect from Justice Department lawyers. Indeed, he says the opposite. Yoo and Bybee exercised poor judgment and let the Justice Department down. But Margolis argues that the Office of Professional Responsibility chose too high a standard to judge the professional responsibility of Yoo and Bybee. The OPR argued that Yoo and Bybee had "a duty to exercise independent legal judgment and to render thorough, objective, and candid legal advice." This standard, Margolis explained, is much too high a requirement and not one that Yoo and Bybee were previously warned was the standard to which they would be held.
I know what you are probably saying: shouldn't every government lawyer have to live up to this standard? Of course, they should, but the point is that this is a disciplinary proceeding. It's not about what people should do, but about how badly they have to screw things up before they are subject to professional sanctions.
Instead, Margolis argues that, judging by (among other things) a review of D.C. bar rules, the standard for attorney misconduct is set pretty damn low, and is only violated by lawyers who (here I put it colloquially) are the scum of the earth. Lawyers barely above the scum of the earth are therefore excused.
Margolis concludes that Yoo and Bybee exercised poor judgment and made bad legal arguments. But lawyers often make arguments that are bad or even laughably bad, and this by itself does not violate the very low standard set by rules of professional responsibility. These rules are set up by jurisdictions to weed out the worst offenders, leaving the rest of the legal profession to make entirely stupid, disingenuous and asinine arguments that normal people with functioning moral consciences would not make. That is to say, rules of professional misconduct are aimed at weeding out sociopaths and people driven to theft and egregious incompetence by serious drug and alcohol abuse problems; they do not guarantee that lawyers will do right by their clients, or, in this case, by the Constitution and laws of the United States of America. In effect, by setting the standard of conduct so low, rules of professional conduct effectively work to protect all those lawyers out there whose moral standing is just a hair's breadth above your average mass murderer. This is how the American legal profession simultaneously polices and takes care of its own.
There has been much debate about the early course of autism, specifically the earliest age at which autism may be detected. At present scientific evidence suggests that autism is predominantly genetic, and so researchers expect that there may be early signs of autism even in infancy. Traditionally, however, autism is not diagnosed until age 2-3, when parents bring their children to medical attention, or when signs are detected on routine well-child visits or in day-care.
Retrospective studies, largely involving review of home movies, have suggested that autism can be diagnosed as early as 6-12 months, implying that parental report is not an adequate screen because subtle signs are hard to detect without rigorous observation.
Now a group has published the first prospective study to address this question. They followed 25 children who were later diagnosed with autism spectrum disorder (ASD) (22 of whom were high risk) and 25 low risk children who were later determined to have typical development (TD). They found:
These results suggest that behavioral signs of autism are not present at birth, as once suggested by Kanner, but emerge over time through a process of diminishment of key social communication behaviors. More children may present with a regressive course than previously thought, but parent report methods do not capture this phenomenon well. Implications for onset classification systems and clinical screening are also discussed.
More precisely, they carefully assessed the children, counting instances of eye contact and smiling, for example, and found that there were no statistically significant differences between the groups at 6 months, but that almost all measures were reduced in the ASD group by 12 months.
Now Castle has published a collection of her personal essays, which Ross Posnock raves about at the New Republic.
The thing that jumped out at me from this review was an aside about painting. Castle, dragged by her mother to visit Santa Fe, shudders at the thought of seeing paintings by Georgia O'Keeffe, whom she despises, and resolves to rescue herself from the horridly popular O'Keeffe by meditating on Agnes Martin:
I’ve secretly inoculated myself with what I consider the ultimate Connoisseur’s Good Taste Vaccine. Everywhere we go, I tell myself, what I’ll really be doing is looking for the Agnes Martins. Agnes, I’ve decided, will be my private talisman, my anti-O’Keeffe…My aesthetic invulnerability assured, I’ll be able to enjoy everything else ironically.

If you have never heard of Agnes Martin, I suspect that is rather the point. Only true aesthetes like Agnes Martin, and Castle's whole pose is that she is a sort of ordinary and weak person whose only great virtue is her exquisite good taste. She may be a boot-licking sidekick, but at least she has the good taste to be Susan Sontag's boot-licking sidekick. Agnes Martin's paintings look like this:
I could go on, but you already get the idea. If your appreciation for the artistic vision is so weak that you don't see how much more refined and sophisticated these canvases are than anything by such a clumsy, obvious, and earth-bound artist as Georgia O'Keeffe, you simply don't have what it takes to join the rarefied aesthetic club where Castle makes her home. And yet, see, Castle ends up being impressed by some of O'Keeffe's paintings. Not even Castle is as much of a rarefied aesthete as she likes to think she is, and in telling you this she presents herself as an aesthete who nevertheless has the common emotions and can understand what ordinary tourists feel. An aesthete who knows how to enjoy slumming.
Castle reveals another side of herself by raving about the autobiography of jazz musician Art Pepper. Posnock:
Art is “so painfully human” she can hardly bear it: he “offered himself up with such astonishing vulnerability I found my eyes welling up repeatedly.” She finds irresistible his “superprurient adventures”— voyeurism, masturbation, chicks, needles, rage, and booze—but never stops asking herself why she is obsessed with this terminal macho man: “what a self-destructive (and self-deluding) bastard Art Pepper must have been. And what’s up with you, Terry Castle, that you claim to like this guy? I admit it: it is strange.” She takes on the skeptics and they push her to grasp the “Core Emotional Truth”—that success in art demands that “you have to stop trying to disguise who you are. The veils and pretenses of everyday life won’t work; a certain minimum truth-to-self is required.”

I find this ridiculous and bizarre. How could an expert in 18th-century fiction think that art is "truth-to-self"? (And if Agnes Martin's paintings are true to herself, what does that say about her?) No, I think Castle is simply swept away by the brutal energy of Art Pepper as she was swept away by the brutal energy of Susan Sontag. Fascination with such characters gives Castle a masochistic thrill. She is too small, too academic, too professorial to live as they lived, but she can at least enjoy reading about it, and by confessing to her enjoyment she publicly aligns herself with the brutal artistic supermen of the world and against academic small-mindedness.
In her new career as a self-examining truth-teller, Castle pours scorn on the academic world. She is especially hard on academic writing, something she once did a great deal of. Professors of all sorts seem to love Castle's new writing; Posnock says her book "understands more about the academic vocation, and the art of self-examination, than the shelf of grave and socially responsible studies of and by professors that have appeared in recent years." And, in truth, it is hard to find professors who have much good to say about the conventions of the academic world. Academic convention seems to exist mainly to be mocked. Professors are like characters in Jane Austen novels, who laugh at the foolishness of society while complying meekly with all its demands. If, oh professors, you find academic writing so frustrating to your true voices, why do you do it? If academic study somehow misses the real truths of life and art, what is the point of it? Or is scorn for academic life just a masochistic exercise for self-despising professors, who thereby revel in their own inferiority to artistic supermen?
If you don't like academic life, do something else. If you are drawn to academic life but bridle at some of its conventions, fight them. But please don't write any more long essays about your double consciousness, your ironic and detached appreciation for the pettiness of your own life and work, because no matter how minutely it is examined, small-mindedness remains small.
Sunday, February 21, 2010
I have always been on the record, in fact, since 2003, with the concept of living our values. And I think that whenever we have, perhaps, taken expedient measures, they have turned around and bitten us in the backside. We decided early on in the 101st Airborne Division we're just going to--look, we just said we'd decide to obey the Geneva Convention, to, to move forward with that. That has, I think, stood elements in good stead. We have worked very hard over the years, indeed, to ensure that elements like the International Committee of the Red Cross and others who see the conduct of our detainee operations and so forth approve of them. Because in the cases where that is not true, we end up paying a price for it ultimately. Abu Ghraib and other situations like that are nonbiodegradables. They don't go away. The enemy continues to beat you with them like a stick in the Central Command area of responsibility. Beyond that, frankly, we have found that the use of the interrogation methods in the Army Field Manual that was given the force of law by Congress, that that works.
So I have spent my life watching, not to see beyond the world, merely to see, great mystery, what is plainly before my eyes. I think the concept of transcendence is based on a misreading of creation. With all respect to heaven, the scene of miracle is here, among us. The eternal as an idea is much less preposterous than time, and this very fact should seize our attention. In certain contexts the improbable is called the miraculous.
Religious thinking has been tied to various brain regions before, but a new study (abstract) moves things a big step forward. By measuring indicators of religiosity in brain-cancer patients before and after surgery to remove their tumors, a team of researchers in Italy has found that damage to a specific region of the brain (the posterior parietal cortex) can increase a person’s feelings of “self-transcendence,” or the feeling of being connected to others and to the universe.
The parietal cortex has previously been linked to maintaining one’s sense of self — such as in keeping track of the locations of one’s various body parts.
The researchers surveyed 88 brain-cancer patients before and after surgery, asking them to answer “yes” or “no” to statements such as: “I often feel so connected to the people around me that I feel like there is no separation”; “I feel so connected to nature that everything feels like one single organism”; and “I got lost in the moment and detached from time.” People who answered “yes” to these statements score high on the trait of self-transcendence. The same people who score high on this measure are also prone to belief in things like miracles and ESP.
What the researchers found was that people who came in with tumors in the posterior parietal cortex scored higher on self-transcendence before surgery than other patients, who came in with tumors in the frontal cortex. After the tumor removal, the patients who’d had tumors in the posterior parietal cortex scored even higher on self-transcendence. Patients who’d had tumors in the frontal cortex showed no change on that trait after the surgery.
This is certainly interesting, but I resist the equation of "self-transcendence" with "religion." As one of the Nature commenters put it, the study only considered “one self-report measure, which is a coarse measure that includes some strange items.” I have complained before about the notion that spirituality can be equated with belief in ESP or a sense that "everything feels like one single organism." I believe that one can be religious without denying reality. I have seen no data that would connect a high score on quizzes like this one with attendance at worship or other sorts of religious behavior, and I don't see why we should consider it an important indicator of religion.
Saturday, February 20, 2010
Since our immigration law allows no exceptions from these bizarre deportation rules, the only hope for keeping this outstanding American in the country is a pardon from NY's governor or the President.
The teenager, a gifted student, was pleading guilty to a string of muggings committed at 15 with an eclectic crew in Manhattan’s Chinatown. The judge, who remembered the pitfalls of Little Italy in the 1950s, urged him to use his sentence — three to nine years in a reformatory — as a chance to turn his life around.
“If you do that, I am here to stand behind you,” the judge, Michael A. Corriero, promised. The youth, Qing Hong Wu, vowed to change.
Mr. Wu kept his word. He was a model inmate, earning release after three years. He became the main support of his immigrant mother, studying and working his way up from data entry clerk to vice president for Internet technology at a national company.
But almost 15 years after his crimes, by applying for citizenship, Mr. Wu, 29, came to the attention of immigration authorities in a parallel law enforcement system that makes no allowances for rehabilitation. He was abruptly locked up in November as a “criminal alien,” subject to mandatory deportation to China — the nation he left at 5, when his family immigrated legally to the United States.
We need to sweep away all of the "zero tolerance" laws we have passed over the past 20 years and let judges make decisions about what punishments are appropriate, like we used to. All criminals are not the same.
Hundreds of people taking Avandia, a controversial diabetes medicine, needlessly suffer heart attacks and heart failure each month, according to confidential government reports that recommend the drug be removed from the market. . . .

It seems FDA scientists are divided over the drug, some saying it should stay on the market as an option despite the risks. There is also a Senate investigation:
The bipartisan multiyear Senate investigation . . . sharply criticizes GlaxoSmithKline, saying it failed to warn patients years earlier that Avandia was potentially deadly.

“Instead, G.S.K. executives attempted to intimidate independent physicians, focused on strategies to minimize or misrepresent findings that Avandia may increase cardiovascular risk, and sought ways to downplay findings that a competing drug might reduce cardiovascular risk,” concludes the report.
Friday, February 19, 2010
Archaeologists announced today that they have located not just the site of the Battle of Bosworth, but the spot where – on 22 August 1485 – Richard III became the last English king to die in battle when he was cut down by Tudor swords.
Nearby Henry Tudor was crowned Henry VII, with the crown which had tumbled from the dying Richard's head.
The crucial evidence, including badges of the supporters of both kings, sword mounts, coins and 28 cannonballs, was found in fields straddling Fen Lane in the Leicestershire parish of Upton, where no historian had looked before. . . .
Another key discovery was a silver boar no bigger than a thumbnail, battered but still snarling in rage after 500 years. It was found on the edge of a field still called Fen Hole, which in medieval times was a marsh that played a crucial role in the battle, protecting the flank of Henry Tudor's much smaller army. The marsh was drained centuries ago, but Oliver said it still gets boggy in very wet summers.
After a charge in which Richard came within almost a sword's reach of Henry, he lost his horse in the marsh, a moment immortalised in the despairing cry Shakespeare bestowed upon him: "A horse! A horse! My kingdom for a horse!"
"The fact that this little boar is Richard's personal emblem, and made in silver gilt, means that it can only have been given to one of the closest members of his retinue. The man who wore this would have fought and died at Richard's side," Foard said.
"If you were to ask me what was the one find I would dream of making, which would really nail the site, it would be Richard's boar emblem on the edge of a marsh."
I have always fancied being bored on a huge and stylish scale. I’m talking Great Gatsby boredom, with everyone lying around in white clothes and floppy hats, sipping long drinks with cooling names, and being utterly and divinely bored. How sophisticated can one get, goes my thinking, that even when surrounded by the best things in life, it’s not enough? Boredom wins through.
There’s something exquisite about boredom. Like melancholy and its darker cousin sadness, boredom is related to emptiness and meaninglessness, but in a perfectly enjoyable way.
Obama-style reform is the only way to solve this problem.
Here’s the story: About 800,000 people in California who buy insurance on the individual market — as opposed to getting it through their employers — are covered by Anthem Blue Cross, a WellPoint subsidiary. These are the people who were recently told to expect dramatic rate increases, in some cases as high as 39 percent.
Why the huge increase? It’s not profiteering, says WellPoint, which claims instead (without using the term) that it’s facing a classic insurance death spiral.
Bear in mind that private health insurance only works if insurers can sell policies to both sick and healthy customers. If too many healthy people decide that they’d rather take their chances and remain uninsured, the risk pool deteriorates, forcing insurers to raise premiums. This, in turn, leads more healthy people to drop coverage, worsening the risk pool even further, and so on.
Now, what WellPoint claims is that it has been forced to raise premiums because of “challenging economic times”: cash-strapped Californians have been dropping their policies or shifting into less-comprehensive plans. Those retaining coverage tend to be people with high current medical expenses. And the result, says the company, is a drastically worsening risk pool: in effect, a death spiral.
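The feedback loop is easy to make concrete with a toy simulation (my own illustration, with made-up numbers, not WellPoint's actual figures): the premium is set to cover average claims plus overhead, enrollees whose expected costs are low drop out when the premium no longer seems worth it, and the loop repeats with a sicker, smaller pool.

```python
# Toy sketch of an adverse-selection "death spiral". Assumed rule: a person
# stays insured only while the premium is below twice their own expected
# annual cost -- a crude stand-in for how much coverage is worth to them.

def death_spiral(costs, rounds=5, load=1.1):
    """costs: expected annual claims per enrollee.
    Returns (pool size, premium) for each round."""
    pool = sorted(costs)
    history = []
    for _ in range(rounds):
        if not pool:
            break
        # Insurer prices to cover average claims plus a 10% overhead load.
        premium = load * sum(pool) / len(pool)
        history.append((len(pool), round(premium)))
        # Healthy enrollees, for whom the premium exceeds twice their
        # expected cost, drop coverage; the pool worsens.
        pool = [c for c in pool if 2 * c >= premium]
    return history

# A pool of 10 enrollees: mostly healthy, a few with high expected costs.
for size, premium in death_spiral(
        [100, 200, 300, 400, 500, 600, 800, 1000, 5000, 9000]):
    print(f"pool size {size:2d}, premium ${premium}")
```

Running this, the pool shrinks from ten enrollees to the two sickest within a few rounds while the premium several-fold increases, which is the spiral in miniature: each price hike drives out exactly the people whose presence kept the price down.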
Thursday, February 18, 2010
And then I do want to be a voice for some common-sense solutions. I’m never going to pretend like I know more than the next person. I’m not going to pretend to be an elitist. In fact, I’m going to fight the elitist because for too often and for too long now, I think the elitists have tried to make people like me and people in the heartland of America, feel like we just don’t get it and big government is just going to have to take care of us.
At least she admits she doesn't know any more than the next person.
A ONE DAY SYMPOSIUM ON THE ARCHAEOLOGY OF OUR NATION’S CAPITAL
National Museum of Natural History
Extraordinary Resources in the Nation’s Capital: The Archaeology of Washington, D.C.
Dr. Ruth Trocolli, City Archaeologist, Washington, DC
Bold, Rocky, and Picturesque: the Archaeology of Rock Creek Park, a Wooded Refuge in the Nation’s Capital.
Dr. John Bedell, Senior Archaeologist, The Louis Berger Group, Inc.
Death, Dogs and Monuments: Excavations at Historic Congressional Cemetery
Laurie Burgess, Associate Chair, Dept. of Anthropology, NMNH
“…Gardens abounding in much gay and Variagated Foliage”: Understanding George Washington's Upper Garden
Dr. Esther White, Director of Archaeology, Mount Vernon
Lost in Time: The Boy in the Iron Coffin
Deborah Hull-Walski, Collections Manager, Dept. of Anthropology, NMNH
Dr. David Hunt, Collections Manager of Physical Anthropology, NMNH
“The debate’s over,” he said. “Torture stained the honor of the United States.”
I really hope his sources speak for the Agency as a whole.
Lara Dadkhah thinks Americans are making a mistake by doing less bombing in Afghanistan:
American and NATO military leaders — worried by Taliban propaganda claiming that air strikes have killed an inordinate number of civilians, and persuaded by “hearts and minds” enthusiasts that the key to winning the war is the Afghan population’s goodwill — have largely relinquished the strategic advantage of American air dominance. Last July, the commander of Western forces, Gen. Stanley McChrystal, issued a directive that air strikes (and long-range artillery fire) be authorized only under “very limited and prescribed conditions.”
So in a modern refashioning of the obvious — that war is harmful to civilian populations — the United States military has begun basing doctrine on the premise that dead civilians are harmful to the conduct of war. The trouble is, no past war has ever supplied compelling proof of that claim.
I don't know about "proof," but those people who have studied counter-insurgency warfare in the 20th and 21st centuries have all come to the same conclusion that killing civilians is a bad idea; the US Marines' "Small Wars Manual" of the 1920s says that killing one wrong person can undo the good done by killing 100 of the right ones. So McChrystal is only following military tradition in issuing these orders.
Besides which there is the small problem that killing civilians in wartime is murder -- clearly defined as such by the Geneva Conventions. The excuse that these are accidental deaths, the unfortunate result of militarily necessary acts, wears thin after a while, especially when you consider how few American soldiers and Marines have been killed in firefights. (Most of our losses are due to IEDs.)
Our aim in Afghanistan is not to defeat the Taliban, but to persuade them to make a peace deal with the Karzai government. The fewer of their relatives we kill, the easier this will be.
The people were filled with superstitious dread, for they believed that they had neglected the honors of the gods that had been established by their fathers. In their zeal to make amends for their omission, they selected 200 of the noblest children and sacrificed them publicly.... There was in their city a bronze image of Cronus (as the Greeks called Ba'al Hammon), extending its hands, palms up and sloping toward the ground, so that each of the children when placed on the arms rolled down and fell into a gaping pit filled with fire.
Skeptics have always been doubtful about this, and the debate over whether these stories are true, or just the propaganda of Carthage's enemies, has been going on for 300 years. To answer this question archaeologists have for a century been digging in the Tophet, or temple precinct, of Carthage. Within the Tophet are buried thousands of jars containing the remains of cremated infants. But were these sacrifices, or just children who were buried close to the gods?
Now some forensic anthropologists have released the largest study yet done of human remains from the Tophet, covering the contents of 368 urns. They find that most of the identifiable remains were from babies of 2 to 5 months in age, and that 20 percent of the sample consists of pre- or peri-natal infants who were either stillborn or died at birth, and therefore would not have lived long enough to be sacrificed. They conclude:
A study led by University of Pittsburgh researchers could finally lay to rest the millennia-old conjecture that the ancient empire of Carthage regularly sacrificed its youngest citizens. An examination of the remains of Carthaginian children revealed that most infants perished prenatally or very shortly after birth and were unlikely to have lived long enough to be sacrificed, according to a Feb. 17 report in PLoS ONE.
I can't see how these findings prove anything one way or the other about Carthaginian sacrificial practices. The one thing they convince me of is that SOME of the babies whose remains were buried at the Tophet had not been sacrificed. That is interesting, because those "ambiguous inscriptions" describe the buried infants as having been dedicated to the gods. The Greek and Roman writers make it plain that infant sacrifice was not an everyday event in Carthage, but a rare response to extraordinary calamities, or, in some accounts, an annual event in which one child was chosen for sacrifice from the whole population of several hundred thousand. Such a practice is perfectly compatible with the finding that only a few of the still-identifiable remains seem to represent sacrificed children.
The findings—based on the first published analysis of the skeletal remains found in Carthaginian burial urns—refute claims from as early as the 3rd century BCE of systematic infant sacrifice at Carthage that remain a subject of debate among biblical scholars and archaeologists, said lead researcher Jeffrey H. Schwartz. Schwartz and his colleagues present the more benign interpretation that very young Punic children were cremated and interred in burial urns regardless of how they died.
"Our study emphasizes that historical scientists must consider all evidence when deciphering ancient societal behavior," Schwartz said. "The idea of regular infant sacrifice in Carthage is not based on a study of the cremated remains, but on instances of human sacrifice reported by a few ancient chroniclers, inferred from ambiguous Carthaginian inscriptions, and referenced in the Old Testament. Our results show that some children were sacrificed, but they contradict the conclusion that Carthaginians were a brutal bunch who regularly sacrificed their own children."
I wonder about the practice of burying stillborn infants in the same burying ground as the victims of sacrifice. Surely the rite of sacrifice was the most solemn and awesome religious act in Carthage; surely the slaughtered babes were given the most impressive possible funeral rites; surely their parents were assured that the child had gone to be with the gods, and that his or her death was a great service to the community. Could it be that this solemn machinery, or at least some of it, was adapted for the consolation of parents whose children died natural deaths? Did they dedicate those dead infants to the gods with the same language used for the chosen victims of sacrifice, hoping to invoke the same divine favor, and to receive the same consolation?
Wednesday, February 17, 2010
ROME, FEB. 8, 2010.- Bankers are not the cause of the global economic crisis, according to the president of the Institute for the Works of Religion. Rather, the cause is ordinary people who do not "believe in the future" and have few or no children.
I suspect that falling birth rates create economic stresses of various kinds, but since we had several economic recessions in the 1800s, the time of the western world's fastest population growth, and then the Great Depression at a time of more modest population growth, and more recently very robust economic growth in the 1990s, the argument that economic performance is related to the birth rate needs more evidence before I will take it seriously.
"The true cause of the crisis is the decline in the birth rate,” Ettore Gotti Tedeschi, said in an interview on Vatican Television's "Octava Dies."
He noted the Western world's population growth rate is at 0% -- that is, two children per couple -- and this, he said, has led to a profound change in the structure of society.
"Instead of stimulating families and society to again believe in the future and have children […] we have stopped having children and have created a situation, a negative economic context decrease," Gotti Tedeschi observed. "And decrease means greater austerity."
“With the decline in births,” he explained, “there are fewer young people that productively enter the working world. And there are many more elderly people that leave the system of production and become a cost for the collective.
The notion that people who don't have children don't "believe in the future" is very strange. The control of family size represents a determined effort to shape the future, to build a world in which we and our children both have a chance to live full lives. I believe very much in the future, thank you, and I think our future will be better with fewer than 10 billion humans on the planet.
The image of the buttoned up, reclusive widow clad in black has obscured the first half of Queen Victoria's story – that of a very natural, uninhibited young woman attracted to the sensuous and the physical who not only didn't mind nudity, but actually enjoyed it.
Many of the works were pretty bold choices for a queen. Most other Victorian wives would have felt apprehensive at the idea of their husband surrounded by nude, bathing beauties on the walls at home, but it just shows how confident she was.
Prince Albert had a taste for purity, so she may well have been attempting to loosen him up a little.
Avoid the term “global warming.” I prefer the term “global weirding,” because that is what actually happens as global temperatures rise and the climate changes. The weather gets weird. The hots are expected to get hotter, the wets wetter, the dries drier and the most violent storms more numerous.
The fact that it has snowed like crazy in Washington — while it has rained at the Winter Olympics in Canada, while Australia is having a record 13-year drought — is right in line with what every major study on climate change predicts: The weather will get weird; some areas will get more precipitation than ever; others will become drier than ever.
Exactly. I note, though, that he makes the same irritating equation of what the models predict with what "actually happens." We don't know what will "actually happen" to the weather over the next century. But we do have models, and since what they predict is pretty much in line with what we have seen over the past 20 years, it might be a good idea to take them seriously.
But there is something unique about a particular excavated area beside a rather plain looking mound -- Mound 34 -- that lies about 200 yards east of the world famous and huge Monk's Mound at Cahokia Mounds State Historic Site. The carefully sifted soil at this excavation has revealed evidence of the only known copper workshop from the Mississippian era, a culture that peaked about 1250 A.D. throughout the middle and southern portions of America. The overall Illinois state site was the location of a large, prehistoric city of perhaps 20,000 that archaeologists call Cahokia.
Ah, but this is the second time it has been discovered:
"It's the only one (copper workshop) that's been discovered," said James A. Brown, professor of archaeology at Northwestern University in Chicago.
The irony is that a self-taught archaeologist, Greg Perino, who grew up in Belleville and pioneered a sometimes heavy-handed excavation style that featured bulldozing, actually discovered the copper workshop and another nearby nearly 60 years ago. Perino died in 2005 at age 91. However, his mapping was rudimentary and it took years to relocate his find.
Showing once again that somebody already knows the answers to many archaeological questions, if only we could find him and ask him.
Tuesday, February 16, 2010
From the Telegraph:
One of the world's oldest shipwrecks has been discovered off the coast of Devon after lying on the seabed for almost 3,000 years.
The trading vessel was carrying an extremely valuable cargo of tin and hundreds of copper ingots from the Continent when it sank.
Experts say the "incredibly exciting" discovery provides new evidence about the extent and sophistication of Britain's links with Europe in the Bronze Age as well as the remarkable seafaring abilities of the people during the period.
Archaeologists have described the vessel, which is thought to date back to around 900BC, as being a "bulk carrier" of its age.
The copper and tin would have been used for making bronze – the primary product of the period which was used in the manufacture of not only weapons, but also tools, jewellery, ornaments and other items.
Archaeologists believe the copper – and possibly the tin – was being imported into Britain and originated in a number of different countries throughout Europe, rather than from a single source, demonstrating the existence of a complex network of trade routes across the Continent. . . .
The cargo recovered includes 259 copper ingots and 27 tin ingots. Also found was a bronze leaf sword, two stone artefacts that could have been sling shots, and three gold wrist torcs – or bracelets.
The team have yet to uncover any of the vessel's structure, which is likely to have eroded away. However, experts believe it would have been up to 40ft long and up to 6ft wide, and have been constructed of planks of timber, or a wooden frame with a hide hull. It would have had a crew of around 15 and been powered by paddles.
The picture shows a gold wrist torc on the seabed. Flash player presentation of lots of pictures here.
The bronze age history of western Europe is fascinating. The world seems more cosmopolitan than the early iron age that followed, with stronger ties to the Mediterranean and a wealth that was invested in wonderful monuments.
Now it is true in a general sense that scientific thinking was part of the progressive mindset during the period from 1550 to 1850 or so, when the old, aristocratic politics of Europe was overthrown. It is easy to list democratic political leaders and thinkers who were interested in science, like Benjamin Franklin and Thomas Jefferson. Ferris has dug up a delightful story I did not know,
in which George Washington and Thomas Paine floated together one night down a New Jersey creek, lighting cartridge paper at the water’s surface to determine whose theory was correct about the source of swamp gas.
I also think that the spread of world-transforming technology in the 19th century made the benefits of free thought clear to many Europeans and Americans, and I am even willing to concede that the experimenter's attitude of contempt toward authority is in some sense democratic.
But this is stuff that everybody concedes and that nobody has to write books to argue for. Ferris' stronger argument, that science was the driver of democratic thought in the early modern period, and that the experimental attitude is what sustains democracy now, is foolish. For every 17th- or 18th-century scientist who supported liberty, I can produce another equally famous scientist who comfortably served some absolutist prince -- Lavoisier, perhaps the 18th century's greatest scientist, was executed by the French revolutionaries. This argument also overlooks the crucial part that religious reform played in undermining royal absolutism; the people who fought and died to overthrow 17th-century kings were mostly religious fanatics, not humanists. I also question the direction of the causality here, that is, I think some people were interested in science because they wanted to oppose the intellectual supports of the old order, not the other way around. Ferris cites John Locke as an example of a political reformer/scientist. But Locke's ideas about the human mind, the famous "blank slate," were not based on any experimental data. They were shaped in opposition to the hereditary bias of medieval thought and the belief in original sin. As a scientist, Locke failed miserably, but his ideas did lend intellectual support to the assault on aristocratic privilege.
There is also one crucial sense in which science is not democratic: scientists believe in the truth. Once the truth is known, opposition to it becomes, not just wrong, but non-scientific. If you try to oppose the atomic theory, for example, scientists will simply laugh at you. You can see this playing out now in the global warming debate. Many scientists have decided that their models predicting catastrophe are true, so anyone opposing them is outright evil. They have no interest in putting their theories to a vote, or even in trying very hard to convince ordinary people that they are right. Something much more serious happened in the late 1800s and early 1900s, when many scientists were captured by the scientific pretensions of communism. The communist ideology, that the scientific truth about economics and sociology had been discovered and that it now depended on an elite cadre to put this discovery into action, was perfectly designed to appeal to scientists.
But the most important problem with imagining science as the basis for democracy is that science assigns no value to anything. What scientific postulate, what experiment, tells us that we should value the lives of other humans more than the lives of flat worms? Western democracy has been to a large extent built on a foundation of belief in "human rights;" what scientific evidence is there that we have such rights? As Gary Rosen put it,
Ferris’s refrain of “experiment” is a well-chosen trope. Few other words in the vocabulary of Western progress can match its prestige and practical appeal. To rely on experiment is to doubt authority, to cultivate self-awareness, to seek the reality behind natural appearances and received opinion. The experimental frame of mind encompasses the scientist in her lab, the inventor in his workshop and even (with some literary license) the reflective bohemian, the calculating entrepreneur and the shrewd democratic leader. But does it yield the “laws of nature” from which Locke and Jefferson drew the idea of universal human rights? Does it explain our reluctance today to compromise those rights in the name of expediency or results? Jeremy Bentham dismissed the idea of natural rights as “nonsense upon stilts,” because it stood in the way of a proper utilitarian calculus of human welfare. Arguably, one can find his heirs today atop the Chinese state, conducting technocratic experiments of their own and deploying the tools of modern science to preserve a “harmonious society.” For the politics of liberty, mere empiricism is not enough.
A scientific spirit is necessary to make society flourish, but to build a just and free world we need much more. We need, especially, compassion, for without compassion the pursuit of scientific excellence can easily become a workshop of horrors.
Monday, February 15, 2010
He observes that his critics don't disagree with the facts he presents, but, as I said, that is not the issue for people who dislike his work. The issue is that he has not given the "whole story." He has, in the eyes of his critics, torn a story out of its historical and political context and presented it as if it meant something in and of itself, whereas, to either Israeli or Palestinian partisans, no event has meaning except as part of a 60-year-long narrative of suffering and resistance, and to leave out that narrative is worse than lying.
In the US and Britain, there is a campaign to smear anybody who tries to describe the plight of the Palestinian people. It is an attempt to intimidate and silence – and to a large degree, it works. There is nobody these self-appointed spokesmen for Israel will not attack as anti-Jewish: liberal Jews, rabbis, even Holocaust survivors.
My own case isn't especially important, but it illustrates how the wider process of intimidation works. I have worked undercover at both the Finsbury Park mosque and among neo-Nazi Holocaust deniers to expose the Jew-hatred there; when I went on the Islam Channel to challenge the anti-Semitism of Islamists, I received a rash of death threats calling me "a Jew-lover", "a Zionist-homo pig" and more.
Ah, but wait. I have also reported from Gaza and the West Bank. Last week, I wrote an article that described how untreated sewage was being pumped from illegal Israeli settlements on to Palestinian land, contaminating their reservoirs. This isn't controversial. It has been documented by Friends of the Earth, and I have seen it with my own eyes.
The response? There was little attempt to dispute the facts I offered. Instead, some of the most high profile "pro-Israel" writers and media monitoring groups – including Honest Reporting and Camera – said I was an anti-Jewish bigot akin to Joseph Goebbels and Mahmoud Ahmadinejad, while Melanie Phillips even linked the stabbing of two Jewish people in North London to articles like mine. Vast numbers of e-mails came flooding in calling for me to be sacked.
Any attempt to describe accurately the situation for Palestinians is met like this. If you recount the pumping of sewage onto Palestinian land, "Honest Reporting" claims you are reviving the anti-Semitic myth of Jews "poisoning the wells." If you interview a woman whose baby died in 2002 because she was detained – in labour – by Israeli soldiers at a checkpoint within the West Bank, "Honest Reporting" will say you didn't explain "the real cause": the election of Hamas in, um, 2006. And on, and on.
An interesting study from the 80s, highlighted by Psyblog:
Robert P. Vallone and colleagues from Stanford University invited 144 Stanford undergrads who held a variety of views on the continuing Arab-Israeli conflict to watch some of the news coverage of the Beirut massacre (Vallone et al., 1985). The Beirut massacre was the killing of between 328 and 3,500 Palestinian and Lebanese civilians by Lebanese militia forces in September 1982.
At the time the story received huge media coverage around the world with much speculation about whether Israeli forces had allowed it to happen (a subsequent commission held the Israeli government indirectly responsible).
Some of the participants recruited for the study were moderate in their initial views; others were specifically recruited from both the pro-Arab and pro-Israeli student associations. The participants were asked if the news reports from ABC, CBS, and NBC were biased.
Here are the average ratings for the news coverage from each group:
- Pro-Israeli: 2.9 (perceived a marked pro-Arab bias)
- Neutral: 3.8 (perceived a slight pro-Arab bias)
- Pro-Arab: 6.7 (perceived a marked pro-Israeli bias)
As you can see, the pro-Israeli participants thought the news reports were biased against Israel, while the pro-Arab participants thought the same reports were biased against Arabs. This is impressive because everyone was watching exactly the same news coverage. Even more surprising, each group thought that if a neutral observer saw the coverage, it would persuade them to side with the opposing position.
This is interesting but I think it is not the whole story. One reason everybody thinks news reports are biased is that they are brief, and so they naturally omit bits of the background that partisans on both sides consider critical. In the case of the Israeli-Palestinian conflict, the pro-Israel participants probably think every story should begin with the story of the Arab war against Jewish settlers going back to 1947 or earlier, the refusal of Arab governments to recognize Israel or condemn terrorist attacks against it, the relentless terrorism against Israel, etc., in short, the whole story of Israel as a small nation beleaguered by enemies. The pro-Palestinians think the necessary background is the Israeli occupation of Arab lands and the eviction of Palestinians from their homes, going back to 1947 and continuing. So of course each thinks a story about contemporary events that focuses on those events is biased, since they see each event as part of an ongoing story.
When it comes to American domestic politics, the spectrum of possible views is so broad that any approach is offensive to somebody. Conservatives all think that the mainstream media has a "liberal bias," but on the other hand my Progressive friends all think it has a conservative, corporate bias. You can model how this happens by imagining a long line representing the spectrum of political opinion, extending from socialist eco-fanatics to strip-mining fascists. I would say that back when I watched the network news, 20 years ago, they were slightly left of center. But that still left them far to the right of real leftists. And my reading of America is that most politically engaged people are not clustered in the center, but spread out toward the fringes; so of course very few people with strong political opinions thought the news was unbiased.
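That line model is simple enough to sketch in code (a hypothetical illustration with made-up positions, not survey data): put every opinion on a scale from -10 (far left) to +10 (far right), place the outlet slightly left of center, and let each viewer's perceived bias be simply the gap between the outlet's position and their own.

```python
# Toy sketch of the spectrum-of-opinion model described above.
# Positions run from -10 (socialist eco-fanatics) to +10 (strip-mining
# fascists); all positions here are invented for illustration.

def perceived_bias(outlet, audience):
    """For each viewer, return the outlet's position relative to them:
    positive means the outlet looks conservative/corporate to that viewer,
    negative means it looks liberal to them."""
    return [outlet - viewer for viewer in audience]

outlet = -1  # slightly left of center, as the post suggests for 20 years ago
# Engaged viewers clustered toward the fringes rather than the center:
audience = [-9, -8, -7, 2, 6, 7, 8, 9]
print(perceived_bias(outlet, audience))
```

With the fringes heavily populated, nearly every viewer perceives a large bias: the far-left viewers see the slightly-left outlet as well to their right (corporate), while right-leaning viewers see it as well to their left (liberal), and almost nobody rates it as neutral.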
During the Bush years, as I drifted toward the left through opposition to Bush's wars, the torture of detainees, corporate malfeasance, Wall Street shenanigans, and the lack of any national health care policy, the media seemed to me to have fallen completely under the control of Republicans. Where were the prominent media voices opposing either the invasion of Iraq or the abuse of detainees, or calling attention to the obvious looming problems on Wall Street? Over the past two years I have detected a recentering, so perhaps the Bush years were just an outburst of militant patriotism inspired by 9-11. The NY Times, in particular, seems to have recovered its voice, along with many smaller papers, and of course now everybody is angry about Wall Street. (The Washington Post remains a neocon disaster.)
Now the main problem I see with our big media outlets is a silly striving toward balance that leaves them unable to take strong stands on anything. Coverage of the health care debate is particularly silly, because this is an area where the status quo sucks and nothing modest will even start to fix the problems. One can imagine a conservative solution (make people pay for their own health care, with insurance only for catastrophic costs), but most Americans won't go for that, so that leaves only different kinds of statist solutions as possibilities. But many reporters still seem to be looking for a middle ground that does not exist, talking up meaningless bipartisanship and asking why we can't have a more modest approach.
The east coast elite consensus that dominated the news business from the 1940s to the 1980s has dissolved, so we will not see that sort of "mainstream" news again any time soon. The future belongs to niche-marketed news, and, I hope, to the fact-checking blogs that will try to keep the distortions in line.
The picture is of a culture made by putting a nurse's hand onto sterile agar and waiting 24 hours. It takes very vigorous washing to really make hands sterile, so medical personnel are often walking around with all kinds of bugs on their hands.
Sunday, February 14, 2010
Being very attractive reduces a young adult's propensity for criminal activity and being unattractive increases it. Being very attractive is also positively associated with wages and with adult vocabulary test scores, which implies that beauty may have an impact on human capital formation. The results suggest that a labor market penalty provides a direct incentive for unattractive individuals toward criminal activity.
The source for this is given as: "Ugly Criminals" from The Review of Economics and Statistics.
The huge advantages that accrue to the beautiful, and the corresponding penalty for ugliness, are among the dark truths of our society.
Saturday, February 13, 2010
RICHMOND, FEB. 9 -- The House of Delegates is scheduled to vote Wednesday on a bill that would protect Virginians from attempts by employers or insurance companies to implant microchips in their bodies against their will.
It might also save humanity from the antichrist, some supporters think.
Del. Mark L. Cole (R-Fredericksburg), the bill's sponsor, said that privacy issues are the chief concern behind his attempt to criminalize the involuntary implantation of microchips. But he also said he shared concerns that the devices could someday be used as the "mark of the beast" described in the Book of Revelation.