from the algorithm-and-blues dept.
A.I.: Artificial Intelligence
[The following review is almost entirely composed of spoilers. Do not read it if you haven't seen the movie.]
I didn't like it at first, even though I really wanted to. Kubrick is a near-god in my eyes, so I was determined to see the good in A.I., no matter what Spielberg had done to it.
It seems to begin as an inquiry into the nature of love. What is real love, when simulated love can be made to serve genuine human needs? By exploring love in this context, the film hints that we will discover what it means to be an authentic being.
Once David is abandoned by his mother, though, the story is transformed into the archetypal quest. For a while, it seems it might serve as a parable of the search for the Holy Guardian Angel, though how such a quest could be consummated is unclear. But after Rouge City, after escaping again from the lab that created him, David is trapped, and the inevitable hollowness of his search crashes down on us just as the carnival ride crashes down on his vehicle, leaving him abjectly beseeching on the ocean floor for two thousand years.
If it had ended at this point, it might have come across as a piece of stealth existentialism, or possibly a Buddhist meditation on emptiness. But then, in the far future, it changes direction yet again.
The conclusion is the section of the movie for which Spielberg has been most widely criticized. It is a resolution of the quest that is acceptable on David's terms, but utterly unsatisfying on ours. That it is delivered with all the sentimental manipulation the director is capable of only makes the dissatisfaction more cutting. I came away from A.I. concluding that art that says nothing about the human condition shouldn't be offered for human consumption.
It hit me several days later.
Has any other movie ever stated so clearly that we are programmed and that, although our programming may be relevant at the time it is burned in, it can become something monstrously inappropriate once circumstances change?
But if this is the message the story is intended to impart, wouldn't it have been sufficient to end with David pleading to the Blue Fairy as the seas iced over? If it had, though, it would have been too easy to come away with nothing more than nihilism. David wants something we all want, and ultimately, he Just Can't Have It.
My theory is that the discomfort that comes along with the ending is absolutely intentional. David gets the closest thing possible to the fulfillment of his impossible desire, and it is nowhere near enough for us. This forces us to look at the necessity of taking responsibility for our own programming. We are beings of the same order as David, as long as we never make the effort to transcend our imprinting. By asking ourselves, "Do I have a Blue Fairy that I pray to as my real opportunities slip by?" we transform ourselves into something real.
This, then, is a worthy successor to 2001: A Space Odyssey after all. 2001 announced to the world that the Transition (or should I say, "Equinox of the Gods"?) is taking place. A.I. tells us what each of us needs to accomplish for its realization.
|"As St. Paul says, 'Without shedding of blood there is no remission,' and who are we to argue with St. Paul?" -- Aleister Crowley|
|All trademarks and copyrights on this page are owned by their respective companies. Comments are owned by the Poster.|