Thursday, July 30, 2009

Happiness and Perception, Disgust and Conservatism

We all know—well, we should have heard of it by now—that the world looks different to those whose glasses are half full, as opposed to those whose glasses are half empty. Whether the attribute is determined by genes or environment, or maybe both—or whether the question makes any sense!—is beside the point; we're not looking at causes right now, but only at consequences.

The essence of a 'positive' attitude isn't a blind "everything is good" or "I can do anything" or shit like that, but a "these are the cards; and I'll be damned if I'm not going to play them for all they're worth". And it now appears that people who look at the world with what amounts to a 'positive' attitude, in the sense just explained, actually see more than those who don't. Literally. We're talking about visual perception!

"...when in a positive mood, our visual cortex takes in more information, while negative moods result in tunnel vision..."

Of course, there's a downside as well:

"Good moods enhance the literal size of the window through which we see the world. The upside of this is that we can see things from a more global, or integrative perspective. The downside is that this can lead to distraction on critical tasks that require narrow focus, such as operating dangerous machinery or airport screening of passenger baggage. Bad moods, on the other hand, may keep us more narrowly focused, preventing us from integrating information outside of our direct attentional focus."

So, as usual, it's a question of balance and what you might call 'contextual appropriateness'. Well, I call it that.

And here's something else interesting...

One of the most recognizable facial expressions is disgust: the expression displayed by an individual who is exposed to a nauseating image or horrifying story. But what happens when this emotion is not expressed? When the person keeps a straight face – either intentionally or unintentionally – and pretends that nothing is wrong?...

...[People who are disgusted by something and who don't show it] experience more negative emotions. ‘They look at the world with negative eyes because they cannot get rid of their feelings of disgust by expressing them. A botox treatment also has an effect on emotional experience, therefore, and not on wrinkles alone’.

An unexpected side-effect of botox, it seems. Wrinkle-freeness appears to come at a high price, and not just financially.

I advise perusal of the article in question. It offers a lot of food for thought, beyond that about mere facial twitches. And it's not long; short enough for even the ADD generation.

And as if that weren't enough—such things seem to come together—there's another aspect to disgust, or, to be more precise, to our inclination to be disgusted. In a classic bit of evolutionary negligence, it appears that, as one would expect, what once used to be a protective mechanism against disease—see something icky and stay away from it, because if you touched it, you got ill and possibly died of some horrible disease before you had a chance to propagate your genes!—has turned into something that influences our ethical and moral assessments of the world as well.

Not only are decaying bodies with maggots all over them icky, but so are certain things that people do. And I'm not just talking about a certain drunken Australian rugby player (or whatever you call the variant of the sport the offender practiced; I know nothing about that stuff and really don't give a rat's behind about the nuances of these wally 'sports') defecating in the corridors of hotels, but about moral judgments about everything from gay practices to abortion and so on. Indeed, the ultra-conservative, anti-immortalist, religioid (now there's a bunch of insults that should shake any man in his boots!) 'ethicist' Leon Kass—whom Wikipedia, without any editorial provisos, described as a 'public intellectual'; which is an insult to intellectuals, no matter how little I think of the breed in general!—promoted a notion called something like 'the wisdom of disgust'; thus trying to justify its use in moral judgments.

As usual, we find that, on the one hand, it is really important, for our mental health and that of the societies we create, to yield to the need to let our evolutionary heritage find some sort of expression—just say "YUCK!" when it seems appropriate and you really, really want to—but, on the other, to ensure that we don't become slaves to said heritage.

We live on the brink between what we are and what we can be. It's a very thin ledge, and the winds are blowing this way and that, and it's so easy to fall one way or the other.

I know it's hard, but we have to keep on trying, to our last breath, not to allow ourselves to fall.

The existential joke life has played on the likes of Leon Kass is that, while they think that humans—and they themselves, of course; especially they!—are the pinnacle of creation and the best-possible-image of some monomaniacal God, the belief that an evolutionary survival tool like disgust has any value at all beyond what it happened to have to a bunch of primitives who had no clue about the realities of microbiology, negates everything they'd like to claim about their own superior intellectual and existential status in the world.

The joke is on them. The whole damn joke. The rest of us just have to make sure that their primitive ways of thinking—did I say 'thinking'?—don't drag us back to the times when these things still had survival value.

Wednesday, July 29, 2009

When you're thinking about the end of the world, the cost of milk doesn't matter much.

OK, so what does that mean? Well, it's pretty damn obvious, isn't it? If your head is engaged with dealing with large-scale problems, the little ones don't seem so important or pressing anymore.

This observation isn't exactly original, and what follows probably isn't either; but even those who know intellectually that cognition 'works' in this way appear to spend much of their lives blissfully unaware of the fact that they live them according to exactly the same principles. Or, I should say, according to the corollary to the theorem of the title, which is: "If you don't have 'bigger' problems to think about, then the small ones will expand to fill the available head-space." In other words, you lose perspective on where you fit into the grand scheme of anything. Or into no scheme at all, because there probably really isn't anything like a 'scheme'.

The same goes, of course, for such things as 'life purpose' and just generally for the 'why' of whatever it is we're doing. If you see yourself (that's you, over there; just you!) and your physical being—and we are all utterly dependent on and defined by the image of ourselves as a physical being, as mapped into the network of our brains, currently and also from the point of view of chronology—in the context of the cosmos and its gazillion galaxies and apparently endless space, then you're likely to take a very different perspective on your own importance than if you were in the body, and living the life, of...oh, let's say, to pick a suitably obvious example, Donald Trump...you know, the one of the gazillion dollars, grotesquely overinflated self-esteem, ineffable vanity and truly laughable comb-over. (I thought I'd pick on someone other than a member of the group of my usual favorites: i.e. politicians.)

Applying this to a social context, maybe the most important thing we lose is our sense of empathy. What I mean by that, in this case, is some deep-down understanding that the world does not just 'look' different, but is perceived as actually being different, by every human being. Never mind that underlying it all is a substrate of the 'real' that is not subject to being changed by anybody's 'perception', as New Age lore would have it—at least not as they would understand it. What matters is that our interpretation of what this or that actually signifies—or 'means', which is maybe a more familiar term—is drastically different between individuals. Even if we happen to agree on this or that and what it means and so on...it's still different. Words, or any form of communication, can never convey, and even less create, a complete, identical understanding of another's 'meaning'. We merely pretend that it's good enough and leave it at that. And that's in the best-case scenario.

Back to problems and head-filling. Have you noticed how, when you latch onto an idea about the world or when your attention has been drawn to something, there suddenly appear to be instances supporting the theory or attracting the attention everywhere? That is, by the way, how 'fixed ideas' become established and apparently cannot be removed. The only way to get rid of them is to replace them with something different. If you have some story going round and round in your head, as stories tend to do, then the only way to stop it from doing that is to replace it with another story. This technique is the only one that actually works; though it, too, requires practice, skill and, often, fortuitous circumstance. The best way to try it is at night, when you wake up at some ungodly hour and there are these thoughts going round and round in your head and you don't seem to be able to stop them.

The bottom line to all this is that your brain—those neurons still functioning!—requires these circulating narratives in order to stay alive and kicking. That's just basic use-it-or-lose-it stuff. If you don't control the narrative, the narrative will control you. Some narrative has to circulate, and it's up to you to determine what it is. You may not be completely in control of this process—and rightly may you ask what I mean when I say 'you'!—but this is a matter of practice.

The notion that one narrative can be replaced by another, that this is indeed what is 'going on' in your head, if you will—in other words, the very thing I'm telling you here!—is, of course, just another narrative; which is, by you reading it, being added to the set of your existing stories. Whether it starts circulating and growing and having noticeable effects, or just fizzles into oblivion, depends on a gazillion factors, 'internal' and 'external' to you, all of which define you as an individual. You are what goes on in your head, in interaction with everything else that goes on in your head; and what goes on in your body and the world you interact with.

Have a look at the stories dominating your life. Have a look at where they come from, who injected them, who owns them, who controls their development and thereby controls you. Be prepared for some very scary revelations.

And then ask yourself if you really want to be what you are being made to be by those who try to write your life for you—our lives for us.

Tuesday, July 28, 2009

Self-Help or Self-Hindrance?

This is probably worth quoting inline. There are other studies that support this; which makes one wonder if self-help doesn't actually qualify as a potential—as they now like to call it—'Public Health Crisis'. Things are not always what they appear to be. What can one possibly do to make people more cognizant of this ubiquitous fact of life?

Nothing, I guess...

The Problem With Self-help Books: The Negative Side To Positive Self-statements

ScienceDaily (July 3, 2009) — In times of doubt and uncertainty, many Americans turn to self-help books in search of encouragement, guidance and self-affirmation. The positive self-statements suggested in these books, such as "I am a lovable person" or "I will succeed," are designed to lift a person's low self-esteem and push them into positive action.

According to a recent study in Psychological Science, however, these statements can actually have the opposite effect.

Psychologists Joanne V. Wood and John W. Lee from the University of Waterloo, and W.Q. Elaine Perunovic from the University of New Brunswick, found that individuals with low self-esteem actually felt worse about themselves after repeating positive self-statements.

The researchers asked participants with low self-esteem and high self-esteem to repeat the self-help book phrase "I am a lovable person." The psychologists then measured the participants' moods and their momentary feelings about themselves. As it turned out, the individuals with low self-esteem felt worse after repeating the positive self-statement compared to another low self-esteem group who did not repeat the self-statement. The individuals with high self-esteem felt better after repeating the positive self-statement--but only slightly.

In a follow-up study, the psychologists allowed the participants to list negative self-thoughts along with positive self-thoughts. They found that, paradoxically, low self-esteem participants' moods fared better when they were allowed to have negative thoughts than when they were asked to focus exclusively on affirmative thoughts.

The psychologists suggested that, like overly positive praise, unreasonably positive self-statements, such as "I accept myself completely," can provoke contradictory thoughts in individuals with low self-esteem. Such negative thoughts can overwhelm the positive thoughts. And, if people are instructed to focus exclusively on positive thoughts, they may find negative thoughts to be especially discouraging.

As the authors concluded, "Repeating positive self-statements may benefit certain people [such as individuals with high self-esteem] but backfire for the very people who need them the most."

I was going to leave the 'l' out of 'public', to see if anyone noticed, but then I thought better of it.

Monday, July 27, 2009

Romance Forever?

Well, here's a newsflash that should put a spanner in the works of commonly spouted wisdom about love and life. And hopeless romantics should indeed take heart.

Contrary To Widely Held Beliefs, Romance Can Last In Long-term Relationships

...companionship love, which is what many couples see as the natural progression of a successful relationship, may be an unnecessary compromise. "Couples should strive for love with all the trimmings," [the researcher] said. "And couples who've been together a long time and wish to get back their romantic edge should know it is an attainable goal that, like most good things in life, requires energy and devotion."

All warm and fuzzy. But, yes, don't expect it to happen miraculously. Those who expect it to be 'free' will be disappointed. No surprises there.

Sunday, July 26, 2009

Violence (sorry, that should be "V I O L E N C E")

VIOLENCE, like GLOBAL WARMING and THE GLOBAL FINANCIAL CRISIS, is one of those things in regard to which everybody has an agenda. It's also one of those areas of scientific research that's almost invariably slanted and littered with whatever the researchers happen to think 'violence' actually is. That determines the questions asked, and as we should all know by now—'should', but obviously don't—questions pretty much determine the likely answers.

All the more refreshing then to read that there are people who obviously have discerned the problem, as they talk about...

...the need for a more general conceptualization of the effects of exposure to TV violence, one that takes into account personality differences, ethnic differences, the social context in which TV is viewed, variations in the dramatic context, and other potentially significant moderating factors.

Indeed, video game violence isn't TV violence, isn't cinema violence, isn't book violence. And within each of those violence-delivery media there are huge differences in the meaning and purpose, or complete lack thereof, of violence. Each will have completely different effects on the audience, because what matters is not the violence itself, but the story within which violence appears. And, let's face it, every story that holds any interest value to adults will contain some form of violence—occasioned by conflict, which is not only the source of violence, but also the driving force behind all 'story'; something that has far-reaching consequences, especially for a 'narrative' view of the human mind. The only question about the violence in narrative is whether it's physical or psychological—or both, of course. This may be denied by the arty-farty movie crowd, for example, but in the final analysis a spade is a spade, no matter what you paint on it.

Also—and this is really needed here—we need a simple reality check on the whole 'violence' debate, which is provided by what you might call 'real life'. For, at least in my experience, the violence experienced in life, especially on the psychological side, tends to be much more in-your-face and to have a greater impact on the development of people—meaning mostly children and young people, because that's who's being studied in all the media-violence research—than anything the media serve up. The often outright brutality and inherently oppressive nature, at all levels, of the growing-up environment almost all children are exposed to even in the so-called 'civilized' world—think of your average 14 years of compulsory 'school'!—surely screws them up more than any TV or movie violence ever will.

Saturday, July 25, 2009

Keaen, Finister, Tergan, Fontaine, Tethys, Seladiënna, Continuity Slip

Esteemed Readers,

Yes, there appear to be some, because there actually are sales. It's not a flood, but people read the books.

If you are in my 'readership', could you please, please do me a big favor and write some reviews—if that seems like a worthwhile thing to do. Such 'worthwhile-ness' might exist if you either consider the books a total waste of your time and money, which is possible, or if you are in that select group who not only know about them, but have read them and think that it's a damn shame that they linger in obscurity and the world of self-publishing, while so much crap—crap that does get published and pushed into the book outlets—is being peddled in the shops.

They say that any publicity is better than none, and evidence appears to support that contention. The best publicity I can think of is my readers thinking that my stories are good enough to be worth investing their precious time in writing reviews—especially if they're not in the business, as it were, of reviewing. And if people are prompted to express their enthusiasm for something they've read, so much the better—I know this sounds a tad over the top, but we're talking 'promotion' here, and besides, there's nothing wrong with being enthusiastic about something one really likes; as opposed to "Oh, yeah, that was OK."

If anybody feels like that about the Tethys series, Seladiënna or maybe even the lightweight Continuity Slip, please do me that favor, if you could.

Another thing. If you purchase these, it's faster and cheaper for you—if you live in Europe, the US or Australia—to buy the books directly from lulu.com. The easiest way to find them there is to go to this site, and click on the 'i' links beside the books at the bottom. This will get you to a page where all the book covers shown link to the lulu.com originals—and you can get the download versions as well. The latter are cheaper, but you know what I think of reading books from a screen. Still, better that way than not at all, I guess.

Finister and Tergan are also available from Amazon, and I'll add the other ones in the Tethys series within the next few months. It's just that I still have some ideas about changing a few cover images. Maybe, maybe not. We'll see. Revising books for Amazon distribution costs me extra money, which I can ill afford to spend at the moment. On lulu.com it costs nothing.

Another benefit of people getting my books from lulu.com is purely for me: I get more royalties/book. Call me selfish, but even that way I won't be getting rich from the sales!

Reviews from lulu.com are easily copied and pasted to the relevant Amazon pages, or vice versa, of course. Saves a lot of work.

So, if you could do that—especially if you like my stories—it would be very much appreciated. One day, if and when some publisher finally picks them up, you can say that not only are you among those who 'discovered' them, but also that you have editions from a time when publishers didn't even look at them.

To those who choose to help me with this, a heartfelt thanks. I mean it. A story-teller is nothing without his or her audience.

Friday, July 24, 2009

Meat is Murder

While I am an 'ethical' vegetarian, I still think this is very damn funny.

So, go on, vegos, have a chuckle. I dare you.

The meatos will, of course, laugh without inhibition.

Thursday, July 23, 2009

Only in the USA (well, maybe...)

I have taken the piss out of the Germans on occasion, and well do they deserve it. But this here is so 'American', it makes me want to cringe in embarrassment for those Americans who surely must cringe just as much as I; and who probably feel that this ridiculous circus only adds to the huge trove of anti-American prejudice already littering the world.

2 cruisers lead Jackson mementos to cemetery


Two hearses jammed with stuffed animals left in memory of Michael Jackson were given a two-car police escort Friday to the toys' burial at Woodlawn Cemetery...

OK, so the police higher-up muckamucks weren't happy with the action, as you'll find out when you read past the first sentence, but that doesn't change the bizarre nature of what's going on in the aftermath of MJ's demise.

I understand that it was unavoidable: the time, energy and bandwidth wasted on reporting on the death of Michael Jackson and its interminable aftermath.

Let's have a little reality check though. When the catharsis has run its course and people have stopped sobbing and self-indulging in their real or proxy grief over the King of Pop; when history has run its course a bit further and revealed the utter irrelevancy of all of this...

...then maybe it will become clear that the murder of Neda Agha-Soltan, drowning in her blood on the streets of Tehran, will have made more of a difference than anything produced by the 'pop' culture that so occupies the dull-witted public's attention and clogs up its mental arteries.

By now, of course, the news of that girl's death has long passed from the short-term attention of most western media and their consumers.

I mean, who cares about the millions in Iran trying to attain something that we not only take for granted, but treat in a very cavalier and careless fashion, by surrendering more and more of it to progressively more powerful nanny-state governments in the name of so many dim-witted rationalizations that one hardly knows where to start enumerating them.

Oh, yes, that 'something' is often called 'liberty'.

I just hope these frogs will be happy

Villagers solemnise a frog marriage at Madhyaboragari village, about 85 km (52 miles) east of the eastern Indian city of Siliguri.

The frog marriage is a traditional ritual observed by the rural folk to appease the gods to bring in rain and ensure a good harvest.

© Thomson Reuters 2009 All rights reserved

I just hope that the marriage, executed while the participants are tied up with strings, so they don't bolt, isn't entirely a sham. The way things stand—at least in certain places and incarnations of 'civilization'—the whole thing might be a prelude to the participants being killed and eaten.

Let's hope this is different. However, the marriage, judging by the evidence, is definitely 'arranged'.

Wednesday, July 22, 2009

Where are they now?

This came up the other day in conversation, as such things do. There are, of course, TV programs that have this as a subject, but the conversation was more about actors specifically. You remember them from times gone by, but they've kind of sunk from sight—at least the kind of sight that has them appear in tabloids, on current movie posters or heading up casts on TV series. Doesn't mean they've stopped working, but they're not working as prominently and 'out there'.

I have little hope that the same will happen to some of the current headliners, many of whom—as has always been the case, I suppose—are basically a bunch of fatuous bores, kept aloft and in prominence by publicity and media machines, all of whom make whopping profits out of doing it. Gossip rags are even worse offenses against sensibility—and the 'truth', of course!—than your average mainstream 'newspaper'. And that's saying something!

Anyway, among the 'where-are-they-now?' actors, I recently chanced upon two moderately heartwarming examples of former 'stars'—not huge ones, but they definitely had their day in the cinema or on TV—who may have sunk into some obscurity, but who not only are still 'working' and presumably earning a living, but also ended up getting married to their romantic or semi-romantic opposites in whatever show or movie made them famous; and who, surprisingly perhaps, are still married. Which, of course, makes my 'two examples' into 'four'!

The couples in question are Paul Hogan and Linda Kozlowski, who met on the set of Crocodile Dundee in the latter 1980s and hooked up soon after...

...and Michael Brandon and Glynis Barber, who appeared in the TV series Dempsey and Makepeace in the latter 1980s.

Must've been the time for that kind of thing. Call me a silly romantic, but I always thought that was kind of nice in both cases. The pairs had obvious on-screen 'chemistry', as people tend to call it, and that wasn't just imagined—obviously. The fact that they're still together in both cases, and even have produced offspring...well, it kind of shows how people can move on in life, and how being in the spotlight of fame doesn't have to screw you up. When you compare that to the people it does screw up, it makes you feel all warm and fuzzy.

Or not, depending on your disposition. You may think it's all pretty silly, really. Indeed you may.

But sometimes it's just...nice...when a bit of fiction becomes a bit of reality. Of course, it doesn't happen all that often—which makes these four people into something remarkable.

I think so, anyway.

Tuesday, July 21, 2009

There are some things one would rather not know about...

...and there was something I read today that's even more despicable than the flogging I blogged about some time ago. Actually, it's just the natural extension of it; the logical consequence of a whole series of developments and mental events that brutalize a person to the point of no redemption. Or, should I say, of a person allowing himself to be brutalized. For no one is just a victim. 'Choice' does come into it, and explanation cannot become exculpation here.

Warning: this link, though it does not contain a video, nonetheless has descriptions of acts of brutality against women; which, if someone wrote about them in fiction, would be dismissed as the feverish imaginings of a sick mind. Don't go there, if you don't think you can stomach it.

In the event, I happen to have a suitable place in Aslam, where, without specifics, I can at least allude to these events.

I have a lot of imagination—at least I think so—and that goes for the good and the bad. But it would never have occurred to me that, as a matter of routine, people would really do this kind of thing. People who call themselves 'civilized'.

In this case though, it is purely incidental that it's all under the umbrella of 'religion'—even though it comes in handy, as it usually does. But here it is nakedly just what seems like ordinary people becoming unforgivably and, at least to my mind, irredeemably evil.

Saturday, July 18, 2009

And of a sudden, one finds that time has run out

Trains of thought drifting idly into strange directions...

Kafka's Before the Law is to a great degree about how people will do anything not to do what's needed to realize what they think they want—and probably actually do want. Until it's too late, that is; and then they can't do a damn thing...because, well, it's too late, and nothing they'll do will make a difference anymore, because the time for doing it has come and gone. Forever. Forever for this life. And since there's only one of those it is indeed forever for all eternity.

All that's left is usually bitter regret, which is, if the person concerned is capable of it, followed by endless rationalizations as to why it's either OK that things turned out as they did, or why they 'should have'; or else by depression and the darkness accompanying the deep knowledge, no matter how hidden, that one has failed—and that one only failed because one didn't 'do' when one should have. In other words, there really is one and only one person to blame for the dismal position one has ended up in.

There are two basic rules of thumb when it comes to 'life'. They are not absolute, but they apply almost universally—with the odd exception.

Rule 1: Time probably is not on your side, unless you can prove, beyond reasonable doubt, that it is.

If anything it works against you in every way conceivable. Every heartbeat brings you closer to death: that's a given, no matter how long you live. Every heartbeat also takes you closer to, and ultimately carries you past, life's (missed) opportunities and milestones.

It can't be overemphasized how dreadfully final and unforgiving time is. And every opportunity that drifts past is indeed an opportunity gone forever. It doesn't matter how we rationalize that we didn't grasp it, or whether the reasons given are valid or not. Contingency doesn't give a shit. Opportunity gone. Forever. Period.

Corollary to Rule 1: There's never enough time to do what wasn't done when one missed one's chance to do it.

There are no exceptions to this rule. None. That's 'none'. Not a one. There never were, there aren't and there never ever, in the entire future history of the cosmos, will be any such exceptions.

Rule 2: Only dead fish go with the flow.

This saying has recently been revived by a certain US politician. I rather like it, though I've been known to tell people to 'go with the flow' and stuff like that. Thing is, if the fish is alive, it will actually not just 'go with the flow', but it'll use the flow to get where it's going faster. Unless it swims against it, like salmon, for example, when they work their way upstream against some pretty formidable odds and forces. That is an option, of course, though it's kind of exhausting, and it'd better be worth it!

This is, of course, the point: it's not about going with the flow, but using it.

Don't just be a dead fish. You'll just start to smell. Very badly.

What turns a live fish into a dead one? (Yes, I know, I'm running the metaphor ragged!)

The usual suspects. Rationalization. Denial. Bullshit reasons that seem true and valid, but really are born out of the fear of facing that which might really make a difference to our lives—for a change. It's so much easier, by and large, just to carry on as things are. Even if it means that opportunities drift past.

Never to return.

Friday, July 17, 2009

DARK CITY and THE 13TH FLOOR (definite spoilers!)

I recently had occasion to re-watch both of these movies, which date from just about 10 years ago, give or take one.

Dark City was released in 1998; The 13th Floor in 1999. Both are infinitely superior, in terms of story-telling, characterization and asking existential questions, to the pretentious Matrix (1999) and its crap sequels.

Often, as is demonstrated by comparing the three movies, in order to tell a story and make it have a point, you really have to go easy on heaping up philosophical claptrap, even if it's wrapped up in CGI Kung Fu scenes.

If you're looking for superb impressions of existential angst and despair, you won't find them in the overblown in-your-face sequence of Neo emerging from his goopy enclosure—but you may indeed feel something clamp tightly around your chest when you follow Douglas Hall, as he drives out of the city and across deserted country roads to the end of his world, and faces the reality of his own artificiality and the destruction of everything he believed about himself; or Inspector Bumstead's numb and futile groping for his memories of a place called 'Shell Beach', or how to get there; and yet it's a place everybody knows—but can never get to.

Maybe the most desolate line is uttered by 'Douglas Hall' (13th Floor), after he realizes the truth and confronts a 'downloaded' person in his artificial world, who tells him that he does, despite everything, indeed have a 'soul'.

"...how can I?...none of this is real. You pull the plug; I disappear; and nothing I ever say, nothing I ever do, will ever matter."

You can't get it more existential-angst-ish than that.

If you haven't seen either of these flicks, treat yourselves to them. If you can't find them at video stores, just download them from your friendly neighborhood p2p network instead. Both deserve the appellation 'classic' and not having seen them is a definite loss.


Tuesday, July 14, 2009

Here we just sell small rectangular objects. They're called books. They require a little effort on your part, and make no bee-bee-bee-bee-beeps.

It's just possible that the 'Nothing', the dreadful thing that was about to consume Fantasia in the enchanting classic The Neverending Story, may be the computer.

I'm not saying this as a kind of Luddite, because I'm anything but that. And I rarely hanker for 'the old days' when things were better—which some may have been, but a great many really weren't—and when we didn't have this and that which screws up our lives now; mainly because there were other things that did the same job, or an even better one, instead of what we have now. But in this instance things are different. In this instance what is being phased out—or at least people are attempting to do so, in the name of everything under the sun, from imagination to conservation of natural resources—is the book as a medium of story-telling; to be replaced either by books on the screens of computers or by digital reading devices like the Kindle.

The question as to why we should continue to read stories in books, as opposed to doing it on the screens of electronic devices, has been discussed by many people and in many contexts. So far, it has always appeared that any preferences either way were mainly a matter of taste or just habit.

Well, it now appears that there may be more to it...

Storybooks On Paper Better For Children Than Reading Fiction On Computer Screen

Clicking and scrolling interrupt our attentional focus. Turning and touching the pages instead of clicking on the screen influence our ability for experience and attention. The physical manipulations we have to do with a computer, not related to the reading itself, disturb our mental appreciation...

...reading on a screen generates a new form of mental orientation. The reader loses both the completeness and constituent parts of the physical appearance of the reading material. The physical substance of a book offers tranquility. The text does not move on the page like it does on a screen.

"Several experiments in cognitive psychology have shown how a change of physical surroundings has a potentially negative affect on memory. We should include this in our evaluation of digital teaching aids. The technology provides for a number of dynamic, mobile and ephemeral forms of learning, but we know little about how such mobility and transience influence the effect of teaching. Learning requires time and mental exertion and the new media do not provide for that," ...

"We experience to day a one-sided admiration for the potentials in the technology. ICT is now introduced in kindergarten without much empirical research on how it influences children’s learning and development. The whole field is characterized by an easy acceptance and a less subtle view of the technology,"...

"Critical perspectives on new technologies are often brushed aside as a result of moral panic and doomsday prophecies. ...there is generally little reflection around digital teaching material. What we need, is a more nuanced view on the potentials and limitations of all technologies – even of the book. Very often important discussions about technology and learning have a tendency to reduce a complex field to a question about being for or against," ...

The development of digital media leads to a need for more sophisticated concepts of reading and writing and a new understanding of these activities.

...

Even if children and young people do not read as many novels in book form any more, one may still argue that they actually read more than before. Most of what they do on a computer or on their cell phones, is exactly reading and writing.

"...we understand more and better when reading on paper than when we read the same text on a screen. We avoid navigating and the small things we don't think about, but which subconsciously takes attention away from the reading. Also texts on a screen are often not adapted to the screen format. The most important difference is when the text becomes digital. Then it loses its physical dimension, which is special to the book, and the reader loses his feeling of totality."

...hypertext stories... exploit the multimedia possibilities of a computer and use both hypertext, video, sound, pictures and text. They are constructed in such a way that clicking one's way around them comes close to a literary computer game.

...

"The digital hypertext technology and its use of multimedia are not open to the experience of a fictional universe where the experience consists of creating your own mental images. The reader gets distracted by the opportunities for doing something else."...

The last paragraph may be the most significant, because it folds into the consideration the factor of 'distance' between reader and fictional world, and its importance not only to involvement in a particular story, but also to overall mental development. It appears that this involvement may be a critical factor in the development of a number of human faculties; and, unsurprisingly, it has to do with our relationship to 'narrative'.

Interesting pointers at the significance of all this may be found, inter alia, in research such as this:

Researcher Links Storytelling And Mathematical Ability

Two-thirds Of School-age Children Have An Imaginary Companion By Age 7

Imaginary Friendships Could Boost Child Development

Children Better Prepared For School If Their Parents Read Aloud To Them

Very Young Children Can Step Into The Minds Of Storybook Characters

I don't know if you ever had that experience: you walk into an office, the one you work in (you do, don't you?) or maybe a bank or whatever, and you see a whole bunch of people sitting before rectangular screens. I tend to take note of this. Once you've done it, the sense of the surreal tends to linger and draw your attention to the unnaturalness of it all.

Well, this is just one example of humans being put into situations for which they are cognitively unprepared. The results of extended exposure to such things range from the mild to the severe, from low-level chronic cognitive dysfunctionality at all levels of cognition, decision making, 'thought', emotion and action, to acute and obvious mental 'problems' that can become seriously destructive. I've been a software developer for many years, and I can assure you, from personal observation of my own self, as well as of others surrounding me who worked in similar roles, that the effects are real and not anything to joke about. I still spend a shitload of hours behind a computer—as a tech writer as well as doing things like writing novels, laying out books, designing covers, editing video etc etc—and getting away from it and to a book, instead of gluing my eyes to the TV, another rectangular screen, for the remainder of my day, or just to some views of things that aren't rectangular and glowing, like trees and grass and such like...that's a tonic for my human perceptions. The same goes for the things one does. Like physical exercise. Training of body coordination—martial arts are great for that, though they're on the back-burner right now. And so on.

We're not necessarily—well, we aren't, period!—evolutionarily adapted to reading books either, but in the context of narrative delivery, and with the social backdrop of having been read to from a book (Like that's going to happen with a Kindle! Mum reading to the kid from the screen. Ha!) and the whole-sense associations we have coming from that...a book is so much closer to something we can relate to. Thing is, books don't stand alone and without context against electronic media for narrative delivery. They have a historical/social context, and this actually matters, because humans are social/historical creatures. We can't exist without that. Books also are physical entities. A novel in a book is a 'package' of sorts, a whole, a unity. We close a book and take the story with us. Complete union of the object and its content. You cannot get that in a laptop, PDA or Kindle.

This isn't atavism, but connectedness to our nature; the fabric of our being.

Books also don't require electricity to run, nor are they likely to fail to deliver if hit by power surges or cosmic rays. There is a sense of comforting security about them, and that, too, must be taken into account when considering such things. There's a profound difference between a 'library' of e-books—an abominable misnomer, insulting every book in sight—and one of books. You just can't develop the same relationship with them. You can't just pick one up, open it and read a page, or two, or one here and one there, or a chapter; or flick forth and back; or stick your finger into the book to keep the page you're reading and flick back to something you've read before. The e-book is hidden on a medium unreadable by you, and requires an intermediate, very complex and fragile mechanism, whose functioning, though many of us use it routinely, is basically incomprehensible to all but those who actually know this or that—and preferably more—about the technology.

If you want to do the finger-in-between-pages trick and look back, maybe to confirm that a plot point indeed makes sense, or just because there was a scene there that relates to what one reads now and is somehow particularly interesting, enticing and/or thrilling...try that with an e-book and see if it comes in any way close to the 'book'-experience.

I know, many people claim they're just as happy reading from a screen as from a real book—and there are those who love audio-books, something I've written about not so long ago, without much 'liking' involved—but I think the Bastian experience in the store is unrepeatable by e-book and will be so forever:

Mr. Koreander: Your books are safe. While you're reading them, you get to become Tarzan or Robinson Crusoe.
Bastian: But that's what I like about 'em.
Mr. Koreander: Ahh, but afterwards you get to be a little boy again.
Bastian: Wh-what do you mean?
Mr. Koreander: Listen. Have you ever been Captain Nemo, trapped inside your submarine while the giant squid is attacking you?
Bastian: Yes.
Mr. Koreander: Weren't you afraid you couldn't escape?
Bastian: But it's only a story.
Mr. Koreander: That's what I'm talking about. The ones you read are safe.
Bastian: And that one isn't?

We live in a world where "It's only a story" is part of our civilization's fabric; an insidiously pervasive and obvious truth, whose obviousness hides its contextual decrepitude. As if this truth were virtuous in some way. As if it helped us become better, wiser, more loving, more passionate, more intelligent and able to cope with what the world throws at us...and so on, and so on.

It's never "just a story", because everything that happens to us, everything we think about, we think about in narrative terms and sequences. Everything we hear said in the English language—unless its some pretentious avant-garde random linguistic abomination crap passing for 'art', probably!—is ultimately narrative. Every sentence is a more or less explicit mini or micro narrative; even those relating to the most abstract and obtuse kinds of subjects.

"It's just a story" may be the most existentially unperceptive thing anybody can say about...

...about our stories. But to actually understand that, you may just need—books.

Thursday, July 09, 2009

News that Matter

Among the crap that passes for news—and, yes, I'm including those on the 'political' front, and, yes, those promulgated by the 'serious' organs (love the word 'organs' when connected to news!)—the ones that tend to get lost in the melee of attention grabbers are those that relate to us individually. Stuff that's about ourselves as people, as creatures, as minds, as mortal beings who would love not to have to get sick and/or age and (no 'or' here!) die, and things like that.

We tend to forget about these kinds of things, unless they're held up in our faces—because we or those we care about get sick or old or dead. You listen to, read, or watch the crap on the 'media' and you drown in what amounts to irrelevancies.

Science is a vast field, and it is true that much of it probably is of no interest to most people, who'd rather hear about the results emerging from said science—well, some of them. But surely those parts relating to anything having to do with our health and also our nature—as much as science can contribute to the understanding of the latter—should be of major interest and attention.

Well, here's a way to be reminded daily, and prompted maybe to probe and do some directed internet browsing. Just pick the topics you're interested in. Trust me, it's worth it.

Wednesday, July 08, 2009

Photoshop does not always rule

Have a look at this image. I can't hotlink it, because it's blocked, but if you go to the site, you'll see some truly amazing photography.

A lot of photography—au naturel or composited with tools like Photoshop—even that exhibited at a fairly broad level, at exhibitions large and small, prestigious or just 'local', is, not to put too fine a point on it, lifeless, utterly dull and devoid of aesthetic value. It's the kind of crap you walk past at exhibitions, and you may stop and ponder, so as not to appear like a cretin, and as you do this, dragged to the place by who-knows-who, you try to see if it says anything to you at all; if you can maybe squeeze a little bit of meaning out of the dismal offerings.

It's something one usually remains silent about, because it's not socially acceptable to voice one's misgivings or assessments of the total void-ness of the offerings; if only not to offend the artist. That's the job of mean-spirited reviewers. Besides, who knows? There may be 'meaning' in this and you just are the one who can't see it. But that last sentence is tokenism. In my experience, my personal lack of discovery of content in much of this work, like so much in the 'Arts', is usually shared by others. It just takes some effort to get them to open up and admit it.

There's also the argument—for me this is maybe more relevant in a literary and motion-picture context—that the person engaging in activities assessed by others as falling under the 'artistic' rubric may himself or herself be producing a lot of similar...ahh, let me call it 'stuff'...that has meaning mainly, or maybe only really, to the creator. So, gotta be careful judging! Gotta leave that to critics; they will not be taken to task for producing shit. They don't, after all, create anything but wind and/or meaningless symbols on a written page.

I digress, as I often do.

The bottom line is: go to this photographer's website and have a poke around. As for me, I can't read a single line of what's written, but the images are more than sufficient. There's something about Russian photographers (and those from nations associated with what used to be 'Russia'), and what they see through the lenses of their cameras, that may be unrivaled anywhere on Earth.

Tuesday, July 07, 2009

Reason and Rationality

Write or print this on a BIG piece of paper and stick it on a wall where it's in-your-face.

Just because people use reason, that don't mean they're rational.

This is what one might call the 'Objectivist Fallacy' (that is, the Rand-version of 'Objectivism', as opposed to the metaphysical kind): the notion that what has become a tool for survival, developed and refined through the process of evolution, is actually somehow 'fundamental' to how we should understand ourselves and our essence.

The evidence to support the Objectivist Fallacy is so close to non-existent that it's almost on par with the evidence to support the existence of God. On the other hand, there is evidence, as for example popularized by Dan Ariely, that irrationality is indeed our basic mode of functioning. The joke—on Rand-ian Objectivists; and that's only one of the jokes, because they unwittingly deliver others quite of their own accord—is, of course, that we can use reason, that powerful tool, to analyze irrational behavior; to the extent of actually being able to predict its occurrence and actions.

The whole thing came up, BTW, because a friend of mine simply appeared at a loss to explain what he considered the irrational behavior of tenants who rent a part of his rural property. Why would they act such and such, if this made no sense whatsoever?

Because...see above.

And surely the very assumption that other people share one's own sense of what is rational and sensible is in itself entirely irrational. The results of the use of the tool of reason are entirely dependent on pre-existing assumptions—like my friend's mistaken one! That's the danger with tools. To someone skilled in the use of a hammer, every problem looks like a nail. To someone skilled in the use of reason, every problem looks like it's just waiting there for a rational solution. Which it may well be, and in some cases it will be subject to rational resolution; but if said problem relates to matters of psychology, individual and social, the only workable assumption from which to proceed is that other people use reason but are not actually motivated in their actions by it.

That we tend to believe otherwise may be closely related to this snippet of information:

Brain Represents Tools As Temporary Body Parts

...when we use a tool—even for just a few minutes—it changes the way our brain represents the size of our body. In other words, the tool becomes a part of what is known in psychology as our body schema...

I would like to submit that the processes that enable the Objectivist Fallacy to exist are related to processes not dissimilar to this. It's not exactly the same thing, of course—after all, the 'tool of reason' is internal, if you will, and usually implicit, so we don't actually think about it as a tool. But that may be exactly the reason why the confusion arises so easily, persistently and ubiquitously.

Thursday, July 02, 2009

Imagination, Memory and Age

Here's something for people—probably mostly those not still wet from crawling out of the egg—to think about. I'll comment on the significance of these studies at the end, but first let me present them in summary, with brief excerpts from the relevant pages. You can read the complete articles at the links provided.

First of all there's the article I cited in the previous blog. You might want to refresh your memory on this (pun intended).

Then there is:

Having A Higher Purpose In Life Reduces Risk Of Death Among Older Adults

...Purpose in life reflects the tendency to derive meaning from life’s experiences and be focused and intentional...

...Possessing a greater purpose in life is associated with lower mortality rates among older adults...

Lack Of Imagination In Older Adults Linked To Declining Memory

...the ability of older adults to form imaginary scenarios is linked to their ability to recall detailed memories...

...episodic memory, which represents our personal memories of past experiences, "allows individuals to project themselves both backward and forward in subjective time."...

...Therefore, in order to create imagined future events, the individual must be able to remember the details of previously experienced ones, extract various details and put them together to create an imaginary event, a process known as constructive episodic simulation...

Think Memory Worsens With Age? Then Yours Probably Will

...Thinking your memory will get worse as you get older may actually be a self-fulfilling prophecy. Researchers at North Carolina State University have found that senior citizens who think older people should perform poorly on tests of memory actually score much worse than seniors who do not buy in to negative stereotypes about aging and memory loss...

Imaging Pinpoints Brain Regions That 'See The Future'

...remembering the past and envisioning the future may go hand-in-hand, with each process sparking strikingly similar patterns of activity within precisely the same broad network of brain regions...

..."In our daily lives, we probably spend more time envisioning what we're going to do tomorrow or later on in the day than we do remembering, but not much is known about how we go about forming these mental images of the future," says Karl Szpunar, lead author of the study and a psychology doctoral student in Arts & Sciences at Washington University...

..."Our findings provide compelling support for the idea that memory and future thought are highly interrelated and help explain why future thought may be impossible without memories."...

-----

Most of this relates to 'older' people, but that makes sense, since they're the ideal subjects. Nothing works better, for decent scientific investigation into such things, than watching things go wrong, and how; especially if traits or capabilities disappear and such events can be linked to neurological phenomena; in this instance conditions connected to, for example, the hippocampus.

Summing it up in as few words as possible: it appears, strongly so, that imagination and memory are very closely linked, if not basically the same thing. This is, in itself, a very important point from a purely philosophical perspective.

Of particular interest here is that we're mostly talking about 'episodic' memory and imagination. In other words, about mental narrative. One should also consider that 'purpose' in life is linked to all of this because 'purpose' is nothing but narrative about the larger context of one's existence and one's future existence.

I'll leave that stewing for a while, because I think readers can draw their own conclusions—which are fairly obvious and in-your-face, particularly with regard to what it actually says about the very nature of our memories and our very minds. I've maintained for a long time, with nobody really paying attention, that all explicit memory recall is either completely 'episodic'—meaning narrative—or at the very least framed in a context of narrative. In other words, whether we know it or not, every time we remember anything at all, we tell a mental story in which the remembered item features prominently.

All other memory is 'implicit' and cannot actually be made explicit—or 'conscious', as some might put it—unless it is done, again, in the context of mental narrative. What connectionists refer to as 'associative memory' is actually misnamed, as it really should be called 'narrative memory'. Narrative isn't, as cognitive science would have it, just one incidental aspect of cognition and what we do with it. Narrative lies at the core of cognition and consciousness. It is the way these things work.

Narrative as 'story'—usually told in some form—is merely that form associated with the existence of some form of language; which, and I agree with Pinker in this, is by and large a tool of communication. It refers to 'internal' communication also. Language also imposes structure/constraints on certain categories and types of narratives that—in a swirling cloud around a center that we perceive as 'self', but which exists only in the same way that a 'center of mass' exists with regard to some physical body—constantly circulate in our brains. And when people say 'competing memes' they should be thinking of 'competing narratives', and a lot of things would become so much clearer. And our lives...well, they are, in a very real sense, stories, resulting from the interaction of our mental and physical contexts.

I have digressed. Something more practical:

Having read the articles above, and taking into account that use-it-or-lose-it is the absolute order of the day with just about everything physiological and cognitive, does it not suggest itself that, rather than training 'memory' to keep people alive—which is really boring; at least I think so; and besides, it produces a potentially unwelcome spin-off, namely too many irrelevant memories of really dumb bits of data—they should really be stimulated/encouraged to let their imaginations run free instead? It's so much more interesting and it has the desirable side-effect of training them to indeed have plans for tomorrow; by virtue of their capacity to create narratives for the future.

Ken Robinson has criticized the manner in which today's schools all over the world stifle creativity and the courage to innovate, and how they de-emphasize the value of 'imagination', favoring instead the learning of factual things—that is, committing things to memory, be they scientific facts or 'social norms'—thus making children into what is perceived, rather myopically and unimaginatively, as 'useful citizens'. Given what I said above, is it not clear that the problem, which showed up by considering 'older' people, may actually spread into ever-earlier age groups over time?

In a population that is getting older—and in which, or so I hope, many of us alive today may actually achieve timely escape velocity into a life of literally hundreds of productive years—is this not something that we should be acutely concerned about?

And, as a final thought: I was just reminded, by a programme on the radio, of the notion that the peak of creativity in many professions is considered to be achieved in a person's early years; typically in science you're talking 30-ish and in the arts 40-ish. I wonder how much of that—just like the memory-loss myth alluded to in an article above—is a self-fulfilling prophecy.

Actually I don't wonder at all.