Monday, June 29, 2009
But wait again! What do these researchers say?
"It's somewhat of a counter-intuitive idea," said Brice Kuhl, a doctoral student working in the lab of Associate Professor Anthony Wagner of the Psychology Department. "Remembering something actually has a cost for memories that are related but irrelevant." But this cost is beneficial: The brain's ability to weaken unimportant memories and experiences enables it to function more efficiently in the future, Kuhl said.
Eh? 'Counter-intuitive'? To whom?
It reflects rather poorly on people working in the field when this kind of thing comes up and when it makes news, as if it actually were news.
Here's the abstract:
Remembering often requires the selection of goal-relevant memories in the face of competition from irrelevant memories. Although there is a cost of selecting target memories over competing memories (increased forgetting of the competing memories), here we report neural evidence for the adaptive benefits of forgetting—namely, reduced demands on cognitive control during future acts of remembering. Functional magnetic resonance imaging during selective retrieval showed that repeated retrieval of target memories was accompanied by dynamic reductions in the engagement of functionally coupled cognitive control mechanisms that detect (anterior cingulate cortex) and resolve (dorsolateral and ventrolateral prefrontal cortex) mnemonic competition. Strikingly, regression analyses revealed that this prefrontal disengagement tracked the extent to which competing memories were forgotten; greater forgetting of competing memories was associated with a greater decline in demands on prefrontal cortex during target remembering. These findings indicate that, although forgetting can be frustrating, memory might be adaptive because forgetting confers neural processing benefits.
Not only is the notion that things should be so anything but counter-intuitive; it is an almost obvious consequence of the finitude of brain capacity—in terms of storage and processing; a separation of functions that is probably meaningless, but since this is still the way people think about it, I'm using that kind of model. It should have been self-evident, and not in need of fMRI confirmation, to anyone working in the study of attention; and those in the neural networks community should not exactly find themselves surprised.
Still, putting all this aside: with this confirmation, if it were needed, of the obvious, maybe some people will start paying attention to the facts, rather than to the obsession, in science and pop culture alike, with forcing people at all stages of life, and especially the 'later' ones, into a mode where the kind of memory common to the early stages of life—when acquisition, storage and recall of items of what you might call 'random' nature, as opposed to those contextually set, is at its peak; and understandably so—is touted as something that needs to be trained in order to retain mental capability into one's later years.
That never made sense, at least not to me. I've always thought that, given the limited capacity of the brain, there really was no option but to discard other items. In the course of one's life, things come and things go; and one's identity, which is intimately linked to one's memories—to the extent that one might be tempted to assert that the two are, in a real sense, identical—cannot really be expected to be any different.
Of course, before this filters through to the common man through the pop media and the self-interested propaganda of a whole range of private, academic and governmental organizations, it'll be a while. Much the same, I suspect, as will be the case with the new research coming out in relation to the whole antioxidant craze.
Those interested in memory might also be interested in this article.
Sunday, June 28, 2009
I point at the last paragraph in the previous blog. Never mind their rationalizations; they're all just that, as such things always are.
The world is divided—among other things—into people who 'get' XF2 and those who don't. I guess I should feel honored to belong so obviously in David Stratton's (that's the guy who talks with his hands folded in his lap; his body appears to be incapable of assuming any other position, or else he'd be hiding something) category of, and I quote: "Kids who leave their brains at home should have fun with this monster of a movie."
Seems to me like Dave and Maggie left something else at home—or maybe it's now in the freezer. It certainly isn't used any more.
Saturday, June 27, 2009
Anyway, it turns out (SPOILERS!) that Megatron isn't the real big cheese after all, but there's this creepy megalomaniac dude, lurking somewhere in a giant ship near Saturn, who's really got it in for the 'Primes', of which Optimus is the last. He's also got it in for Earth and humans and all that fleshy stuff. And, yes, he's pissed, and supported by a major phalanx of Decepticons.
The mythology is getting way out of hand, but who cares! As long as it supports yet another very cool flick, with lots of real big robots walking and talking like humans, talking smack, but fighting like robots, with huge bits and pieces flying here and there and always missing the good guys; a cool basic-hero-this-is-your-destiny dude wisecracking and bumbling his way through the story, but ultimately being a hero; a sultry female providing him with a foil and more than just support, almost managing 'equality'; a crack team of our Ranger buddies from the first movie really laying it on this time; some major military hardware being terminally damaged, while other hardware does its America-saves-the-world stuff...all that and the product-placement car—though I do prefer the old, original Camaro from the first movie. Shame on you, Megan Fox, for sitting on Shia's lap and forever taking the original Bumblebee out of the equation with your "If he's like, this super-advanced robot, why does he transform into this piece-of-crap Camaro?"
Friends of ours had an issue with the first movie, where they thought they spotted—correctly—lots of US military hardware and just generally America-saves-the-world. Well, if they had a problem there, they certainly will this time; tenfold. But should that spoil a perfectly good fun movie; or should all that product placement divert from just enjoying oneself? On the contrary, I say. Verisimilitude demands that one is true to what is. There was a brief reference to evacuating 'President Obama' from the White House to some place of safety as well; and the National Security Advisor and his presidentially imposed mission was just the kind of thing I would have expected from an Obama flunkie. Just like the previous president, in the first Transformers flick, was depicted as a bit of an asshole, in that immortal line, requesting of one of his attendants on Air Force One to "rustle me up some Ding Dongs". You gotta love Michael Bay. He's on good terms with the US Mil and obviously thinks highly of them, and especially the people who do the real work—reminds me of Ridley Scott that way—but I think his opinion of politicians is somewhere at the bottom of an aircraft carrier's bilge. Well, mine's even lower; so good on ya, Michael!
Anyway, if a movie is about the present and if you can get money for it by creating a context in which products that actually exist today are being used; what's wrong with that? It's verisimilitude, people; so stop bitching about it, those of you who do, and enjoy the ride, for chrissakes!
Bottom line, XF2 rocked. Haven't had so much just-fun since...well, since XF1, really. I take that back in the same breath: I thought Star Trek was also right up there in the super-geek-fun league, but it was definitely more adult. Movies like XF1 and XF2 will allow everybody, from 7 to 70, to have fun.
There's something—in my case not even 'guiltily'!—atavistically and profoundly pleasurable about just turning off all the bullshit plot-and-character-analysis mode and what we are told we should think and what makes sense (the Transformers mythology makes none at all, but what mythology actually does, I ask!) and what is A and B and C grade and whatever other crap floats around in your head. It's like a mental holiday; no pressure to think this or that or whatnot; just go with the flow...and when you come out of the cinema and drive back home, all the cars around you and every bit of machinery, really, start to look...well, kinda different. After the absolutely brilliant homage to the Spielberg-produced classic Gremlins near the start of XF2, not even our kitchen will ever look the same to me.
If you don't enjoy XF2, you're either living in a very impoverished universe, or you're just emotionally...well, whatever; or there simply are some things in this world that you don't 'get' and possibly never will. Poor bastards...
Friday, June 26, 2009
Excess Pounds, but Not Too Many, May Lead to Longer Life
Being overweight won’t kill you — it may even help you live longer. That’s the latest from a study that analyzed data on 11,326 Canadian adults, ages 25 and older, who were followed over a 12-year period.
The report, published online last week in the journal Obesity, found that overall, people who were overweight but not obese — defined as a body mass index of 25 to 29.9 — were actually less likely to die than people of normal weight, defined as a B.M.I. of 18.5 to 24.9.
By contrast, people who were underweight, with a B.M.I. under 18.5, were more likely to die than those of average weight. Their risk of dying was 73 percent higher than that of normal weight people, while the risk of dying for those who were overweight was 17 percent lower than for people of normal weight.
The finding adds to a simmering scientific controversy over the optimal weight for adults. In 2007, scientists at the Centers for Disease Control and Prevention and the National Cancer Institute reported that overweight adults were less likely than normal weight adults to die from a variety of diseases, including infections and lung disease.
“Overweight may not be the problem we thought it was,” said Dr. David H. Feeny, a senior investigator at Kaiser Permanente Center for Health Research in Portland, Ore., and one of the authors of the study. “Overweight was protective.”
He said the finding may be due to the fact that a little excess weight is protective for the elderly, who are at greatest risk for dying, or because many health conditions associated with being overweight, like high blood pressure, are being treated with medication. The study took into account smoking status, physical activity, age, gender and alcohol consumption. It included a separate analysis excluding those who died early in the 12-year period, in order to weed out participants who might have been thin because they were smokers or had an underlying disease, like cancer.
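For concreteness: B.M.I. is just weight in kilograms divided by the square of height in meters, and the bands the article quotes can be sketched as a tiny classifier. (This is my own illustrative helper code, with hypothetical function names; it is not from the study.)

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2


def bmi_category(bmi_value: float) -> str:
    """Classify a B.M.I. value using the bands quoted in the article."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    return "obese"


# Example: 70 kg at 1.75 m gives a B.M.I. of about 22.9 -> 'normal'.
print(bmi(70, 1.75), bmi_category(bmi(70, 1.75)))
```

The study's counter-intuitive claim is then simply that the "overweight" band here had the lowest mortality, and the "underweight" band the highest.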
Thursday, June 25, 2009
The question translates into another, namely: Is it acceptable to kill the guilty to protect the lives of the innocent?
Which raises the next question: Who is guilty and who is innocent, and who is to judge?
Where do we draw the line between 'guilty' and just someone being a bystander, at best guilty by association and of not doing anything—for whatever reason? (Which, right now, would make the whole world complicit in the crimes committed in Iran—and in any place in the world where evil is being inflicted on those who cannot avert its infliction on themselves.)
Terry Goodkind's solution to this kind of problem (in the final book of the Sword of Truth series, Confessor) was to take an easy way out; 'easy' for a writer of fiction, that is. Those who followed or supported the 'Order' (standing in, pretty thinly veiled, for any of the world's religions; and I can't really fault his strongly drawn characterizations of the absurdity of religious faith and the grotesque results it produces in human beings—I mean, just look at Iran right now, as it plays out; and then remember all the other instances where religion and dumb-assed ideology have been invoked to bring forth the worst in their practitioners) were separated from those who wanted "to live their own lives"—into non-overlapping universes. It was the only way to wrap up the books, I guess. Bit of a deus ex machina; one of many.
Goodkind's approach to how to defend oneself against an enemy both overpowering and basically not caring if they died, was equally straightforward and uncompromising; though I must confess that, at a gut level, I'm pretty much on-board with it. In fact, Teris, the Aslatrix—known to readers of Fontaine and Tethys—deals with someone in pretty much the same way in the first few chapters of Aslam.
The problem with both of Goodkind's solutions—the first one also being afflicted with the limitation that in this world there isn't anything like the "Power of Orden", the powerful magic that ultimately had to be invoked to sort things out—is that they need to be applied on a sliding scale. He ignores that by setting the slide way over to one end of the scale of judgment. But just where we slide it in each particular case...well, that's where the rub is. For, unless you're an ideologue or religious nitwit, the mark you draw on those scales of judgment—applied to others and their level of responsibility for, and complicity in, the suffering inflicted on those who cannot avert it—is entirely arbitrary. It's like abortion: how do you decide at what point a clump of cells has become a human being with a 'right to live', pretty much just like anyone who has been born?
Or is it?
The usual 'rational' way of dealing with the 'sliding scale' is to become an ethical/moral relativist—if not explicitly, then definitely by implication. This is the way of most of the enlightened intellectualigentsia, as well as of every 'open-minded' dimwit who thinks that evil is relative and a matter of opinion.
Is there a baseline of conditions one might use to allow judgment of the existence of 'evil' in human deeds? Conditions, external and internal, that would cause anyone existing under them to come to basically the same conclusions—possibly by a gut reaction, something at a more 'basic' level than 'reason'—about the nature of evil and its appearance? What we are really looking for are types of 'human universals' that, if not kept down by conditioning or contingency, would lead people to come to similar conclusions, no matter who, where and when these people exist.
Well, here are some suggestions. Suppose we had an arbitrary human exemplar, who...
- Is not a psychopath and does not have other drastic neurological or psychological conditions that would make his or her reactions to other people differ significantly from the 'average', especially in terms of the capacity for empathy. (It's been said that if the 9/11 hijackers had been able to empathize with their victims—a capacity annulled by psychopathy and/or their 'faith'—they would not have been able to do what they did.)
- Lives under conditions of economic security, and social and physical safety—including that relating to, for example, threats by outside groups.
- Has been brought up with certain narratives of a metaphysical nature, but has not been programmed with strong and persistent religious or ideological dogma containing specific instructions for the exercise of morality and ethics that judge those who do not share 'faith' as something 'lesser', possibly requiring vanquishing or conversion.
So, suppose, we had such a person or persons, or those deviating from this standard by degrees not significant in the context, and capable of displaying the 'universals' of human morals and ethics.
Would it be sensible to use such 'averaged' people and their judgments and sensibilities, especially the visceral ones, as measuring sticks to attempt a definition of what definitely is 'evil'? Like killing people who haven't done you any harm? Like inflicting suffering, often horrific, on children and the helpless? Like making the lives of others into hell, just because it happens to be convenient or suits some selfish or myopic purpose?
There is a list of 'human universals' compiled by Donald Brown—I only learned of these through reading The Blank Slate—that, whether entirely accurate or not, gives one pause for thought. But if there are such basic sensibilities, then there has to be some way to judge when these sensibilities have been violated. The violators' proximate motives can also usually be assessed—though often they will be concealed under a Tarnkappe (a cloak of invisibility) of deceit, often on a grand scale and often including a significant measure of self-deceit—and it is these on which judgment needs to be based.
And, supposing that this is so, then we need to re-ask the question, this time in a more complicated form, to account for those universal sensibilities:
Do those who take life, inflict suffering, take away those 'human rights' which are part of the 'human universal sensibilities', and so on...do these people thereby forfeit their own right to those things; like life and liberty, as the US Constitution would have it?
Does the protection of those unable to protect themselves, or having charged certain people with the task of protection, justify the taking of the lives of those who would take the lives of those that are being protected?
To take this further: Supposing the answer is 'yes', then—unless an agency or agencies charged with protection can also guarantee that they live up to their assigned protective task—should it not be the right of anyone to take whatever measures s/he considers necessary to make it possible to effect such protection? Or is the right of a society to social regulation—which in practice seems to imply, as a consequence, taking away from individuals both the right and the means for, say, self and family protection—greater than the right of individuals to be able to exercise their implicit, 'human universal', duty?
I'm not saying things are one way or the other. These are just questions. But many of them are at the heart of much domestic and international politics, and of the national and global Zeitgeist. Therefore trying to ignore them, or answering them in the usual facile way, is done at our own peril.
Wednesday, June 24, 2009
In this day and age of death and destruction delivered to our homes at dinnertime, in this day and age of numbers—"14 people died in recent riots in the streets of XYZ", and so on—people usually fail to appreciate that it's not 14 people who died, but 14 individuals, each of them with, maybe not the same but equivalent, hopes and dreams as ourselves, with lives that were just as irrevocably lost into the dreary finality of dead-forever. They weren't "14" but 14 x "1".
I don't know if that conveys what I'm trying to say here, but just to drive it home, below is a link to a video of the death of Neda Agha-Soltan, one young woman shot on the streets of Tehran. I warn you: it is a harrowing clip. Not so much because it is graphic; there are considerably more gruesome videos around for those seeking thrills and sick entertainment. But to watch the death of Neda Agha-Soltan is so shocking because it is to witness it; to witness the extinguishing of a human life, brought closer perhaps because it is a young woman, and—particularly for Westerners—a young woman who could have walked the streets and lived a life anywhere in one of our 'civilized' countries.
The clip makes no political point, unless one wishes to project it onto it—and it is easy to do so, of course, because of its context, and because the murder shown was committed by one of the literally millions of licensed thugs, the Basij, that act as the long and pervasive arm of the tyrants ruling Iran. But above all, this clip shows that every death is the death of a person—and the person in this case, could have been the girl living next door to any of us.
Please note: do not watch this clip if you think you can't handle it! It'll haunt you through your days and nights if you can't compartmentalize that kind of thing.
Tuesday, June 23, 2009
The supreme Whoever declared that since an Islamic state does not cheat it must be clear that there was no cheating in the recent elections.
Surely a supreme case of 'reasoning'—which it definitely is: logic supreme. Proving yet again that logic in itself is classic GIGO—garbage in, garbage out. And there was a lot of GI to go around here. And people are dying because they know it and because they're fed up.
Think about this, people: how many nations on Earth today could you think of, whose people are willing to make those kinds of sacrifices to shake off the brutal and naked tyranny of a militia-backed theocracy?
I know there are some who would like to! But does it really take Iran to at least try and act on it?
Whether it succeeds or fails—and I fear it will fail, and yet I hope that it will succeed, without the need for yet another lunatic bunch of despots leading it!—it should be an example to all of those who preach, but who ultimately refrain from putting their health where their loud mouths are. We could only wish to have that kind of civil courage.
Monday, June 22, 2009
I've been thinking about 'fundamentals' of existence off and on for years. Many years. A lot of things emerging from both physics and the life-sciences—areas which somehow must be connected; for me that's almost an article of faith—and also the cognitive sciences simply don't hang together, not even by hairs. For that matter, things don't hang together inside those different disciplines!
Recently, something directed me back to an article by Lee Smolin from 2006, as well as his recent book, The Trouble with Physics, which I'll really have to read sometime soon. As sometimes happens, something got nudged this way and that, and I ended up looking over some stuff that's been bouncing around in my head for a number of years, and it was like "hmmfff..." and "interesting..." and "maybe..."
Thing is—and if this isn't irony, I don't know what is!—that, at the very least in physics, we may be in a position not unlike that existing prior to Copernicus coming on the scene—with the able assistance of Galileo about a century afterward.
And don't get me started on anything to do with the 'mental' sciences! At least physics can try to stand back from the thing it looks at—unsuccessful as the attempt has proved to be, though the illusion of it is still deeply entrenched in many of those who practice it. But cognitive science? Does anybody really think that there's a single person working in the field, or seriously thinking about it, who hasn't got an axe to grind, an agenda to follow, a faith to defend against those who would assault it?
No, cognitive science also is still deeply immersed in its own pre-Copernicanism. And with it goes, inevitably, all of biology, the science of 'life'; still laboring under a total cluelessness about what 'life' actually is—or if it is anything at all distinct from whatever non-life is.
You know, with so many fascinating questions floating around, why does anybody actually bother to waste intellectual time on stupid pursuits like 'theology'?
Theology is [...] searching in a dark cellar at midnight for a black cat that isn't there. —R. A. Heinlein
Saturday, June 20, 2009
One can only hope.
Meanwhile, here are a few voices from Iran.
Thursday, June 18, 2009
I'm sure there are people who do some valuable thinking—and indeed, a number of them, many of whom I didn't know about, are mentioned in Pinker's book—but it's a lot like searching for precious gems in a city garbage dump. The overall stench is enough to turn you off trying, and giving up seems so much easier, and finding the odd gem doesn't seem worth all the trouble and tribulation and endless showers one needs, because the stench seems to cling like that of rotten fish.
The Blank Slate was a bit of an air-cleaner, to put it mildly. I have some issues with Pinker's basic scientific philosophy, but these are nuncupatory with regard to the subject at hand, which is whether there's something like—mostly genetically determined—'human nature', and whether there are variations on the theme, also mostly genetically influenced, that make one human being different from another in the sense that such differences are a) there before any 'conditioning' by culture/environment takes place, and b) may be sufficiently deep to indeed overrule the in-vogue notion that "you can be anything, if only you set your mind to it"—in a purely 'potential' sense, if you will, and ignoring the inevitable life-vicissitudes that may interfere with the implementation of whatever it is you might want to be.
In other words, just as there are people who will never, ever become Olympic athletes, no matter how hard they try or how much they might want it, so there are those who will never be piano virtuosi or first-class mathematicians—or, if I may say so, writers or film makers or actors or, even closer to my heart, story-tellers.
The Blank Slate is an attempt to summarize the evidence for 'human nature', as opposed to the wishful-thinking that would deny such a thing. People are different in how they start off on their path of life, and what they can and will do with it, is subject to the limitations imposed on them and the potentialities they happen to be endowed with. This applies to every area of life, from that requiring certain skills and talents, such as in the areas of, say, logical thought or creativity or imagination—which are partially determined by genetics—as well as those areas relating to what you might call 'social' things; like empathy and compassion. People with inadequate mirror-neuron systems may be geniuses and hugely talented in some things, but they're likely to end up exhibiting all the signs of autism; or, going off in a different direction, as psychopaths.
Pinker also provides a very informative overview of the politics of blank-slate-ism, which is an impressive litany of how irrationality rules over evidence; even in the hallowed halls of that most 'rational' of all human activities: science.
I tend to judge 'fact' books by how much they're making me tell myself "Now why didn't I consider that?" Not in the sense of not having considered any of the gazillion stupid things I might have considered—and may have, on occasion, like the existence of God, the power of 'reason' as an ultimate arbiter of what is 'truth' and other silly things like that—but of having looked at something that is basically in line with the way I understand things in a sufficiently different way to say "Now that's interesting!"
The Blank Slate provides a number of such points, and thereby qualifies as valuable reading; something that was worthwhile spending time on. Since it is a cogent book, and since I disagree with a number of aspects of Pinker's philosophy, and also some of his assertions and his interpretation of the very evidence he cites, I've also been prompted to new thoughts about why something actually isn't as he says it is, and what's missing from the book and what's missing from Pinker himself. That, too, is very valuable, because it has helped me to clarify things in my own mind that otherwise might have been left unattended.
What I come away with—apart from some interesting ways of looking at things that really have helped me to understand some important previously-missing connections—is a sense, again!, of how cognitive philosophy has become a slave to a) the computational paradigm and b) connectionist thinking. Pinker is one of a class of cognitive philosophers who qualify as more perceptive and keen to 'go with the evidence'—as the CSI shows tend to put it—than many of those wrapped up in loopy ideologies and political agendas. However, he, too, is blinkered by the limitations he imposes on his own point of view. A lot of his evidence is gleaned from studies in what you might call 'western nations', and—necessarily!—studies done during recent years, meaning, just to put a number on it, the last 100 (being generous). This means that said studies were done on urban man, and are therefore subject to the limitation of...well, of being done on urbanized humanity. If nothing else, that severely limits their value, since the societies we live in now and which have been studied do not necessarily represent anything more than a blip in the timeline of humanity. To take them and draw the sweeping conclusions Pinker draws makes him just as guilty of over-reaching his scope as those he berates. There is no evidence in anything he writes that he is aware of these limitations. He appears to be unaware of the very existence of social history. Much as he also appears unaware of the overpowering force and mind-shaping power of religion and ideology, which can alter people's minds beyond recognition and drive them into intellectual dead ends and/or darkness forever, without any hope of ever getting out of them.
The other thing that strikes me is that, like just about every other cognitive philosopher, Pinker actually has very little grasp of the actual 'nature', if you will, of 'the Mind'. That doesn't mean, I hasten to add, that I think he's missing things like 'ensoulment' or 'spirit' or anything like that. I think those things are primitive concepts emerging from the natural inclinations of people to conceptualize in what are basically 'materialist' ways. ('Soul' and 'Spirit' are materials, substances, things. They're just somewhere else than in the physical reality, that's all. But they're still essentially 'stuff'. Think 'ectoplasm'.) No, what I think cognitive science is missing—and it basically is in that corner because of the philosophical traditions out of which it grew—is that while there is no other 'stuff', there nonetheless are dimensions to 'existence', which are ultimately subject to scientific enquiry, but which we simply don't investigate because we haven't gotten our heads around the questions we need to ask.
One might argue that that doesn't matter. Science is fairly complete as it is, in terms of the scope of its questions. Well, maybe. But there are questions it cannot answer. Questions about things that are. And how they are. Not just about this and that in the human mind, in you and me, in the strange phenomenon of 'consciousness'—a problem Pinker thinks will be resolved within this decade!—and how it happens and what it is. Recursiveness and computational modules aren't 'it'; and if they were, the definitive answer would already have been produced.
The problem is that science and 'rational' thinking—I keep putting 'rational' in quotes, because it's such an abused term, almost universally not understood by its advocates!—are essentially 'materialistic' and concerned with the logic emerging from...well, the material nature of the computational processes in our brains. Anything that 'computes' needs to be material, even if it's a 'quantum computer'. We think 'material' in everything. Even Plato, with his loopy idealism, was still thinking of ideals as 'things'; some kind of 'stuff', even if it existed in an ideal universe.
The breakthrough will come when science breaks free of this, and when philosophy follows it along the path. Until then we'll have intelligent, insightful, interesting books like The Blank Slate, that help us understand a lot of things, but which still do very little to answer the questions that—deep inside of us, even the dullest, I'd guess—we'd really like to know how to ask; and preferably have them answered as well.
What are they?
Tuesday, June 16, 2009
There's also Gargamel, of course, the idiot evil wizard, who could never figure out that if only he'd used the rules of the game somewhat differently, he could long ago have nuked the pesky Smurfs out of existence, or into slavery or whatever we might have been up to.
Sunday, June 14, 2009
Anyway, there's one thing I will blog about very soon, and that is my most recent non-fiction reading, Steven Pinker's The Blank Slate. I don't read a lot of non-fiction, if for no other reason than that I think so much of it isn't worth wasting time on. The Blank Slate—to which my attention was drawn by a Pinker TED talk—was a notable exception; which is always encouraging to see. Until I get around to that though, here's something else.
As some of you may know, I do take the piss out of German foibles every now and then. I guess I should apologize, but I was born there, and occasionally I come across something that's so utterly 'typical' that my evil little self just can't seem to resist the temptation. I also do this because, unlike Americans, the Germans of today aren't beaten up on nearly enough. And the same goes, of course, for America's northern neighbor, Canada. But then one comes across this...
Words almost fail me!
Of course, one might wonder if anything like that hasn't actually happened in the US as well. Surely, with whole nations being in the throes of PC convulsions these days, you ought to be able to find something equivalent in some demented US media outlet.
Well, yes, there is, so rest easy. Or not, because the 'media outlet' in question is the Onion, which, in 2000 (yes, the year Y2K!) had the following article:
Ahh, yes, Germany, alone you are not. Don't know exactly what you're not alone in, but it's something.
Of course, in Germany meanwhile, fate decided to give at least one young German a break. When your time isn't up it just isn't up. A few inches further to one side and it would have been his head that had the hole in it, not the pavement.
Despite the redundant headline—all meteorites are 'space meteorites' (damn journalist retard)—and some really awful, scientifically illiterate writing, including the wording "bouncing off" when referring to what the object did with relation to the boy's hand, this is a definite feel-good story. When it's not your time, it's not your time. In my life I've had at least two experiences like that, and they make you think about stuff you mightn't have thought about before.
P.S. Also, one wonders idly, why does the headline use a non-metric measurement? And, no, it wasn't a 'direct hit' either, but a 'glancing' one at best. A direct hit would have gone straight through the kid. So the writer is not only ignorant of science, but doesn't even understand everyday English.