Friday, March 20, 2009
God, Allah, the Universe and Hinduism
The Nature of God: THE SUPERIORITY OF HINDUISM TO ISLAM
from Part Two, Chapter 2 of INDIA IN DANGER: THE OFFENSIVE OF ISLAM by Bojil Kolarov
Chapter 2: About the superiority of Hinduism to Islam
Ignorant people all over the world often call the Hindus "polytheists". However, this is not at all true. More than that, Hinduism is the only religion in the world able to explain the nature of the Only God and to defend the ideas of monism from the attacks of the atheists. The delusion comes, first, from the ignorance of the monotheists, who do not know what exactly they are praying to, and second, from the imperfection of human language, which is unable to express in words the whole complexity of the higher reality. What exactly are we talking about?
As already stated, God is another name for the Absolute Spirit. It is made of the highest and finest substance of the Universe – the spiritual substance, which is the cause of the material worlds. Therefore, God, being the pure essence of spirit, need not be a personality (insofar as personality implies the presence of another, rougher substance – the mental one). The ancient Indians understood this very well, and that is why they named the Deity with the word "Brahman", i.e. the impersonal higher reality or the pure spirit of the Universe.
However, the not-so-enlightened people of the Near East, unable to understand the real nature of Brahman, turned the latter into a personality. Unknowingly they decided that if God is to be considered Almighty, he must more or less resemble man. So they ascribed to him an Ego (a mind), and began giving his individuality different names such as Jehovah, Allah, etc. The Muslims and the Christians thought that in this way they exalted God, but in fact they abased Him, because nothing can be higher than the pure, virgin Spirit, "unpolluted" by any Ego. This was certainly known to Moses, Christ and Mohammad, but they were addressing rather primitive people and could not preach a very complex philosophy. As for the so-called "Gods" of the Hindus, the worship of whom has provoked real outbursts of indignation among the followers of the monotheistic religions, we must say they should not be taken so literally, but rather as corresponding to the angels and archangels of Christianity and Islam. For example, it is commonly said that Hinduism has about three hundred million gods. Obviously, this is a matter of words: it is not so important whether the higher creatures in the Universe are called "Gods" or "Angels"; the content of the word matters more. What matters most is to accept the undivided authority in the Universe, its sole First Source.
From all we have said so far it turns out that the Hindus are not at all polytheists, but rather monists. The Hindus' Brahman corresponds to the Muslims' Allah, and the Hindus' gods to the Muslims' angels and archangels. The fact that Muslim clergymen for a thousand years could not understand this obvious point is a clear sign of the imperfection of Islam. The Indian priests have always known that Islam is a rather rough and simplified approach to the problems of metaphysics and philosophy.
This is illustrated by the fact that the Muslims do not know where to look for God. Considering Him a personality, they naturally seek Him somewhere in the objectivised world (often in the sky), unable to understand that He is inside man. The truth about the identity between the Absolute Spirit (Brahman) and the individual Spirit (Atman) is revealed only in Vedanta. The philosophic system of Vedanta is undoubtedly the highest point which human speculation has ever reached, and only it can explain and unite all the world's religions. To compare Muslim philosophy with Vedanta is like comparing the dry mountains of Yemen with the Himalayas.
More at http://bojilkolarov.voiceofdharma.com/offensive.html#2ch2
from
INDIA IN DANGER: THE OFFENSIVE OF ISLAM
by Bojil Kolarov
http://bojilkolarov.voiceofdharma.com/offensive.html
AND
"The God"
The religion of Islam has as its focus of worship a deity by the name of "Allah." The Muslims claim that Allah in pre-Islamic times was the biblical God of the Patriarchs, prophets, and apostles. The issue is thus one of continuity. Was "Allah" the biblical God or a pagan god in Arabia during pre-Islamic times? The Muslims' claim of continuity is essential to their attempt to convert Jews and Christians, for if "Allah" is part of the flow of divine revelation in Scripture, then it is the next step in biblical religion. Thus we should all become Muslims. But, on the other hand, if Allah was a pre-Islamic pagan deity, then its core claim is refuted. Religious claims often fall before the results of hard sciences such as archeology. We can endlessly speculate about the past or go and dig it up and see what the evidence reveals. This is the only way to find out the truth concerning the origins of Allah. As we shall see, the hard evidence demonstrates that the god Allah was a pagan deity. In fact, he was the Moon-god who was married to the sun goddess and the stars were his daughters.
from http://www.biblebelievers.org.au/moongod.htm
AND FURTHER . . .
The Ultimate Reality
[quote]
In Hindu esoteric imagination, the supreme and ultimate reality is believed to reside in the Universal Soul, which is said to pervade the entire manifested cosmos. The cosmos itself is thought to have evolved from this abstract entity, which is formless and devoid of any qualitative attributes (Skt. Nirguna Brahman). It is neither male nor female, and is infinite, without beginning or end. It is both around us and inside us. The goal indeed of all spiritual practice is to unite with this Supreme Soul.
To the eternal credit of Indian creativity, abstract concepts such as the one above are made intelligible to ordinary mortals like you and me through the invention of various forms which make comprehensible the ultimate, formless reality. Thus the Nirguna Brahmana (Nirguna - without quality) becomes Saguna Brahmana (Saguna - having qualities). This transformed entity is known in Sanskrit as Ishvara.
The entire universe, along with the dynamic processes underlying it, is said to stem from Ishvara. For example, when Ishvara creates the universe, he is called Brahma, when he protects, he is called Vishnu, and when he destroys, he is Shiva. The three together constitute the trinity, which controls the universe and all its functions.
--from a Hindu site (not sure which one, but, as I recall, it might be http://www.hindu-blog.com/)
Farthest end of Universe Man has ever seen
By Ganpati Sarkar
"If you are thinking those bright dots as stars then think again . They are not. In fact, each of them , each single , bright or dim , every single dot here in this picture is a galaxy , and each galaxy has millions of stars in it".
www.ganpatisarkar.com/2008_04_01_archive.html
CU-Boulder Supercomputer Simulation Of Universe Expected To Help In Search For Missing Matter
Dec. 6, 2007
Pictured is a portion of a supercomputer simulation of the universe showing a region roughly 1.5 billion light-years on a side. The bright object in the center is a galaxy cluster about 1 million-billion times the mass of the sun. In between the filaments, which store most of the universe's mass, are giant, spherical voids nearly empty of matter.
Much of the gaseous mass of the universe is bound up in a tangled web of cosmic filaments that stretch for hundreds of millions of light-years, according to a new supercomputer study by a team led by the University of Colorado at Boulder.
The study indicated a significant portion of the gas is in the filaments -- which connect galaxy clusters -- hidden from direct observation in enormous gas clouds in intergalactic space known as the Warm-Hot Intergalactic Medium, or WHIM, said CU-Boulder Professor Jack Burns of the astrophysical and planetary sciences department. The team performed one of the largest cosmological supercomputer simulations ever, cramming 2.5 percent of the visible universe inside a computer to model a region more than 1.5 billion light-years across. One light-year is equal to about six trillion miles.
A paper on the subject will be published in the Dec. 10 issue of the Astrophysical Journal. In addition to Burns, the paper was authored by CU-Boulder Research Associate Eric Hallman of APS, Brian O'Shea of Los Alamos National Laboratory, Michael Norman and Rick Wagner of the University of California, San Diego, and Robert Harkness of the San Diego Supercomputing Center.
It took the researchers nearly a decade to produce the extraordinarily complex computer code that drove the simulation, which incorporated virtually all of the known physical conditions of the universe reaching back in time almost to the Big Bang, said Burns. The simulation -- which uses advanced numerical techniques to zoom in on interesting structures in the universe -- modeled the motion of matter as it collapsed due to gravity and became dense enough to form cosmic filaments and galaxy structures.
"We see this as a real breakthrough in terms of technology and in scientific advancement," said Burns. "We believe this effort brings us a significant step closer to understanding the fundamental constituents of the universe."
According to the standard cosmological model, the universe consists of about 25 percent dark matter, 70 percent dark energy and around 5 percent normal matter, said Burns. Normal matter consists primarily of baryons -- hydrogen, helium and heavier elements -- and observations show that about 40 percent of the baryons are currently unaccounted for. Many astrophysicists believe the missing baryons are in the WHIM, Burns said.
"In the coming years, I believe these filaments may be detectable in the WHIM using new state-of-the-art telescopes," said Burns, who along with Hallman is a fellow at CU-Boulder's Center for Astrophysics and Space Astronomy. "We think that as we begin to see these filaments and understand their nature, we will learn more about the missing baryons in the universe."
Two of the key telescopes that astrophysicists will use in their search for the WHIM are the 10-meter South Pole Telescope in Antarctica and the 25-meter Cornell-Caltech Atacama Telescope, or CCAT, being built in Chile's Atacama Desert, Burns said. CU-Boulder scientists are partners in both observatories. The CCAT telescope will gather radiation from sub-millimeter wavelengths, which are longer than infrared waves but shorter than radio waves. It will enable astronomers to peer back in time to when galaxies first appeared -- just a billion years or so after the Big Bang -- allowing them to probe the infancy of the objects and the process by which they formed, said Burns.
The South Pole Telescope looks at millimeter, sub-millimeter and microwave wavelengths of the spectrum and is used to search for, among other things, cosmic microwave background radiation - the cooled remnants of the Big Bang, said Burns. Researchers hope to use the telescopes to estimate heating of the cosmic background radiation as it travels through the WHIM, using the radiation temperature changes as a tracer of sorts for the massive filaments.
The CU-Boulder-led team ran the computer code for a total of about 500,000 processor hours at two supercomputing centers -- the San Diego Supercomputer Center and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. The team generated about 60 terabytes of data during the calculations, equivalent to three to four times the digital text in all the volumes in the U.S. Library of Congress, said Burns.
Burns said the sophisticated computer code used for the universe simulation is similar in some respects to a code used for complex supercomputer simulations of Earth's atmosphere and climate change, since both investigations focus heavily on fluid dynamics. The Astrophysical Journal study was funded by NASA, the National Science Foundation and the U.S. Department of Energy through the Los Alamos National Laboratory.
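[A quick back-of-the-envelope check, in Python, of the figures quoted in the release above. The light-year conversion, the 60 terabytes, and the 500,000 processor hours come from the release itself; the 1,000-processor count is purely a hypothetical used here to translate processor hours into wall-clock time.]
# Back-of-the-envelope checks of the figures quoted in the CU-Boulder release.
# Everything comes from the release itself except the hypothetical 1,000-processor
# count used to illustrate the 500,000 processor-hour figure.

LY_IN_MILES = 6e12                # "one light-year is equal to about six trillion miles"
box_ly = 1.5e9                    # simulated region roughly 1.5 billion light-years across
print(f"Simulation box width: ~{box_ly * LY_IN_MILES:.1e} miles")     # ~9.0e+21 miles

data_tb = 60                      # terabytes generated during the calculations
loc_multiples = 3.5               # "three to four times" the Library of Congress text
print(f"Implied Library of Congress text size: ~{data_tb / loc_multiples:.0f} TB")

cpu_hours = 500_000               # total processor hours reported
hypothetical_cpus = 1_000         # assumption, for illustration only
print(f"Wall-clock time on {hypothetical_cpus} processors: ~{cpu_hours / hypothetical_cpus / 24:.0f} days")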
Contact: Jack Burns, (303) 735-0963 Jack.Burns@cu.edu Jim Scott, (303) 492-3114
www.colorado.edu/news/releases/2007/478.html
A SWIRE Picture is Worth Billions of Years
These spectacular images, taken by the Spitzer Wide-area Infrared Extragalactic (SWIRE) Legacy project, encapsulate one of the primary objectives of the Spitzer mission: to connect the evolution of galaxies from the distant, or early, universe to the nearby, or present day, universe.
The larger picture (top) depicts one-tenth of the SWIRE survey field called ELAIS-N1. In this image, the bright blue sources are hot stars in our own Milky Way, which range anywhere from 3 to 60 times the mass of our Sun. The fainter green spots are cooler stars and galaxies beyond the Milky Way whose light is dominated by older stellar populations. The red dots are dusty galaxies that are undergoing intense star formation. The faintest specks of red-orange are galaxies billions of light-years away in the distant universe.
The three lower panels highlight several regions of interest within the ELAIS-N1 field. The Tadpole galaxy (bottom left) is the result of a recent galactic interaction in the local universe. Although these galactic mergers are rare in the universe's recent history, astronomers believe that they were much more common in the early universe. Thus, SWIRE team members will use this detailed image of the Tadpole galaxy to help understand the nature of the "faint red-orange specks" of the early universe. The middle panel features an unusual ring-like galaxy called CGCG 275-022. The red spiral arms indicate that this galaxy is very dusty and perhaps undergoing intense star formation. The star-forming activity could have been initiated by a near head-on collision with another galaxy. The most distant galaxies that SWIRE is able to detect are revealed in a zoom of deep space (bottom right). The colors in this feature represent the same objects as those in the larger field image of ELAIS-N1.
The observed SWIRE fields were chosen on the basis of being "empty" or as free as possible from the obscuring dust, gas, and stars of our own Milky Way. Because Earth is located within the Milky Way galaxy, there is always a screen of Milky Way objects blocking our view of the rest of the universe. In some places, our view of the larger universe is less obscured than others and for the most part is considered "empty." These are prime observing spots for astronomers interested in studying objects beyond the Milky Way.
ELAIS-N1 is only one of six SWIRE survey fields. The full survey covers 49 square degrees of the sky, equivalent to the area covered by about 250 full moons. The SWIRE image is a 3-channel false-color composite, where blue represents visible green light (light that would appear to be blue/green to the human eye), green captures 3.6 microns, and red represents emissions at 8 microns.
Interesting Note: From the Earth the SWIRE image (top image) can be seen in one square degree of sky, or a patch of sky that is approximately the size of a pea held out at arm's length.
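[A quick sanity check of the sky-coverage comparison above, sketched in Python. The 49 square degrees comes from the caption; the half-degree angular diameter of the full moon is a standard value assumed here, not a figure from the caption.]
import math

# The full SWIRE survey covers 49 square degrees; the caption compares this
# to "about 250 full moons". Assuming a 0.5-degree lunar diameter:
moon_diameter_deg = 0.5
moon_area_sq_deg = math.pi * (moon_diameter_deg / 2) ** 2    # ~0.196 square degrees

survey_area_sq_deg = 49.0
print(f"Full moons needed to cover the survey: ~{survey_area_sq_deg / moon_area_sq_deg:.0f}")
# -> ~250, matching the caption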
http://www.nasaimages.org/luna/servlet/detail/nasaNAS~12~12~64107~168461:A-SWIRE-Picture-is-Worth-Billions-o
A SWIRE Picture is Worth Billions of Years
Spitzer Wide-area Infrared Extragalactic (SWIRE)
NASA/JPL-Caltech/C. Lonsdale (Caltech/IPAC) and the SWIRE Team
2005/10/27
Galaxy evolution in cyber universe matches astronomical observations in fine detail
June 5th, 2006
This image from a supercomputer simulation of the evolution of the universe shows a cubic volume of outer space measuring approximately 280 million light years across. At this stage, the universe is 13.4 billion years old (the present). The bright dots correspond with high concentrations of dark matter, which are associated with sites of galaxy formation. The simulation shows how dark matter, an invisible material of unknown composition, herded luminous matter in the universe from its initial smooth state into the cosmic web of galaxies and galaxy clusters that populate the universe.
Credit: Image courtesy of Andrey Kravtsov
Scientists at the University of Chicago have bolstered the case for a popular scenario of the big bang theory that neatly explains the arrangement of galaxies throughout the universe. Their supercomputer simulation shows how dark matter, an invisible material of unknown composition, herded luminous matter in the universe from its initial smooth state into the cosmic web of galaxies and galaxy clusters that populate the universe.
Hindu Concept of the Beginning and End of Universe
http://www.hindu-blog.com/2007/04/hindu-concept-of-beginning-and-end-of.html
The video [at http://www.hindu-blog.com/2007/04/hindu-concept-of-beginning-and-end-of.html] compares the concept of the beginning and end of the universe in Hinduism with that of modern cosmology. The video is presented by Carl Edward Sagan, an American astronomer and astrobiologist and a highly successful popularizer of astronomy, astrophysics, and other natural sciences.
Below is the transcript of the video, since the subject matter is very complex and you might need to listen to it repeatedly.
Hindu religion is the only one of the world’s great faiths dedicated to the idea that the cosmos itself undergoes an immense, indeed an infinite number of deaths and rebirths.
It is the only religion in which the time scales correspond, no doubt, by accident, to those of modern scientific cosmology. Its cycles run from our ordinary day and night to a day and night of Brahma 8.64 billion years long. Longer than the age of the earth or the sun and about half of the time since the big bang. And there are much longer time scales still.
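[An aside, not part of Sagan's narration: the traditional Puranic figures behind these time scales can be checked with simple arithmetic. The specific values below are the commonly cited ones, assumed here for illustration; the 15-20 billion year age of the universe is the figure Sagan uses later in this transcript.]
# Traditional Puranic time scales (commonly cited values, assumed for illustration).
maha_yuga_years = 4_320_000                  # one cycle of the four yugas
day_of_brahma = 1_000 * maha_yuga_years      # one kalpa: 4.32 billion years
day_and_night = 2 * day_of_brahma            # 8.64 billion years, as quoted above

year_of_brahma = 360 * day_and_night         # 360 such days and nights
life_of_brahma = 100 * year_of_brahma        # "a hundred Brahma years"

age_since_big_bang = 17.5e9                  # midpoint of the 15-20 billion years cited later
print(f"Day and night of Brahma: {day_and_night / 1e9:.2f} billion years")
print(f"Fraction of time since the Big Bang: {day_and_night / age_since_big_bang:.2f}")   # ~0.5
print(f"Life of Brahma: {life_of_brahma / 1e12:.1f} trillion years")                      # ~311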
There is the deep and the appealing notion that the universe is but the dream of the god who after a hundred Brahma years… dissolves himself into a dreamless sleep… and the universe dissolves with him… until after another Brahma century… he stirs… recomposes himself and begins again the dream… the great cosmic lotus dream.
Meanwhile… elsewhere… there are an infinite number of other universes… each with its own god… dreaming the cosmic dream…
These great ideas are tempered by another, perhaps still greater: it is said that men may not be the dreams of the gods, but rather that the gods are the dreams of men.
In India, there are many gods and each god has many manifestations. These Chola bronzes cast in the eleventh century include several different incarnations of the god Shiva. Seen here at his wedding.
The most elegant and sublime of these bronzes is a representation of the creation of the Universe at the beginning of each cosmic cycle – a motif known as the cosmic dance of Shiva. The god has four hands. In the upper right hand is the drum whose sound is the sound of creation. And in the upper left hand is a tongue of flame… a reminder that the universe, now newly created, will, billions of years from now, be utterly destroyed. Creation. Destruction.
These profound and lovely ideas are central to ancient Hindu beliefs, as exemplified in this Chola temple at …. They are kind of reminiscent of modern astronomical ideas. Without doubt the universe has been expanding since the big bang, but it is by no means clear that it will continue to expand forever. If there is less than a certain amount of matter in the universe, then the mutual gravitation of the receding galaxies will be insufficient to stop the expansion and the Universe will run away forever. But if there is more matter than we can see… hidden away in black holes, say, or in hot but invisible gas between galaxies… then the universe holds together and partakes in a very Indian succession of cycles… expansion followed by contraction… cosmos upon cosmos… Universes without end. If we live in such an oscillating universe, then the Big Bang is not the creation of the cosmos but merely the end of the previous cycle, the destruction of the last incarnation of the cosmos.
Neither of these modern cosmologies may be altogether to our liking. In one cosmology, the universe is created somehow from nothing 15 to 20 billion years ago and expands forever. The galaxies mutually recede until the last one disappears over our cosmic horizon. Then the galactic astronomers are out of business… the stars cool and die… matter itself decays… and the Universe becomes a thin cold haze of elementary particles.
In the other, the oscillating universe, the cosmos has no beginning and no end, and we are in the midst of an infinite cycle of cosmic deaths and rebirths, with no information trickling through the cusps of the oscillation: nothing of the galaxies, stars, planets, life forms or civilizations evolved in the previous incarnation of the universe trickles through the cusp, filters past the Big Bang, to be known in our universe.
The death of the universe in either cosmology may seem a little depressing. But we may take some solace in the time scales involved. These events will take tens of billions of years or more. Human beings, or our descendants, whoever they might be, can do a great deal of good in the tens of billions of years before the cosmos dies.
For those with a slow (low-kbps) connection, the video might pause a number of times the first time it plays. But after the video has played through once, you can play it again and view it without pauses.
Posted by abhilash on 28.4.07
Follow the link for the latest on: Hindu, Hinduism, Science, videos
14 comments at
http://www.hindu-blog.com/2007/04/hindu-concept-of-beginning-and-end-of.html
Carl Sagan
From Wikipedia, the free encyclopedia
Carl Edward Sagan, Ph.D. (November 9, 1934 – December 20, 1996) was an American astrochemist, author, and highly successful popularizer of astronomy, astrophysics and other natural sciences. He pioneered exobiology and promoted the Search for Extra-Terrestrial Intelligence (SETI).
[excerpt]
Sagan wrote frequently about religion and the relationship between religion and science, expressing his skepticism about the conventional conceptualization of God as a sapient being. Sagan once stated, for instance, that "The idea that God is an oversized white male with a flowing beard, who sits in the sky and tallies the fall of every sparrow is ludicrous. But if by 'God' one means the set of physical laws that govern the universe, then clearly there is such a God. This God is emotionally unsatisfying ... it does not make much sense to pray to the law of gravity."[24] Sagan is also widely regarded as a freethinker or skeptic; one of his most famous quotations, in Cosmos, was, "Extraordinary claims require extraordinary evidence." (This was actually based on a nearly identical earlier quote by fellow CSICOP founder Marcello Truzzi, "Extraordinary claims require extraordinary proof."[25]) The quote is also known, under different wording, as the principle of Laplace—attributed to Pierre-Simon Marquis de Laplace (1749-1827), a French mathematician and astronomer: "The weight of evidence for an extraordinary claim must be proportioned to its strangeness." Sagan was, however, not an atheist, expressing that, "An atheist has to know a lot more than I know."[26] In reply to a direct question in 1996 about his religious beliefs, "Sagan gave a direct answer: 'I'm agnostic.'"[27]
http://en.wikipedia.org/wiki/Carl_Sagan
ORIGIN OF HINDUISM
Origin of Hinduism - I
If you ask a person who has perceived the essence of Hinduism about the origin of Hinduism, the answer will be a simple smile. This is because Hinduism has no history; it believes in the present. This might be hard for a common man to digest, because we live in a world which gives so much importance to history. If you are looking for dates and other facts, you can find them in these articles – Year of origin of Hinduism, History of Hindu religion, and Origin of the term Hindu.
http://www.hindu-blog.com/2006/09/origin-of-hinduism-i.html
The Religion of Arabia
During the nineteenth century, Arnaud, Halevy and Glaser went to Southern Arabia and dug up thousands of Sabean, Minaean, and Qatabanian inscriptions which were subsequently translated. In the 1940's, the archeologists G. Caton Thompson and Carleton S. Coon made some amazing discoveries in Arabia. During the 1950's, Wendell Phillips, W.F. Albright, Richard Bower and others excavated sites at Qataban, Timna, and Marib (the ancient capital of Sheba). Thousands of inscriptions from walls and rocks in Northern Arabia have also been collected.
Reliefs and votive bowls used in worship of the "daughters of Allah" have also been discovered. The three daughters, al-Lat, al-Uzza and Manat are sometimes depicted together with Allah the Moon-god represented by a crescent moon above them. The archeological evidence demonstrates that the dominant religion of Arabia was the cult of the Moon-god.
In Old Testament times, Nabonidus (555-539 BC), the last king of Babylon, built Tayma, Arabia as a center of Moon-god worship. Segall stated, "South Arabia's stellar religion has always been dominated by the Moon-god in various variations." Many scholars have also noticed that the Moon-god's name "Sin" is a part of such Arabic words as "Sinai," the "wilderness of Sin," etc. When the popularity of the Moon-god waned elsewhere, the Arabs remained true to their conviction that the Moon-god was the greatest of all gods. While they worshipped 360 gods at the Kabah in Mecca, the Moon-god was the chief deity. Mecca was in fact built as a shrine for the Moon-god.
http://www.biblebelievers.org.au/moongod.htm
Hindu Gods and Goddesses
. . . numerous gods and goddesses in the form of idols and images serve the purpose of concentration.
In the words of Sir C P Ramaswamy Iyer "Hinduism has recognized different stages of evolutionary progress in the case of several races and classes of mankind and has not only authorized but actually encouraged the adoration of pictures and images as a means of concentration."
Thousands of years ago Sanatana Dharma, or Hinduism, flourished in an India (Bharat) which had many races and tribes. They worshiped a multiplicity of objects, gods and goddesses, and many of them were manifestations of different aspects of Mother Nature.
Take a look at the three main deities in Hinduism – Brahma, Vishnu and Shiva. Brahma creates, Vishnu sustains and Shiva destroys. They are nothing but three different faces of Mother Nature. Nature creates, nature sustains and nature destroys.
Each god and goddess in Hinduism is a path to reach the ultimate reality. If you are not happy with the 330 million, create a new god and pray to it. This new god will be a new path to merge with the Brahman.
You may also like to read
The symbolism of 330 million gods and about the number 33
http://www.hindu-blog.com/2007/05/hindu-gods-and-goddesses.html
Vishnu is . . . described in the Bhagavad Gita as having a 'Universal Form' (Vishvarupa) which is beyond the ordinary limits of human sense perception.[8]
http://en.wikipedia.org/wiki/Vishnu
The ten incarnations, or 'Dasa Avatara', of Lord Vishnu are an extraordinary record of the evolution of human life and the advance of human civilization. In Hindu religion, the three main deities are Lord Brahma, Vishnu and Shiva. Brahma creates, Vishnu protects and Shiva destroys - three faces of Mother Nature. Lord Vishnu descends on Earth to uphold dharma and to cleanse the Earth of evil. So far, Lord Vishnu has appeared nine times on Earth and the tenth, Kalki, is expected.
from Hinduism and the Evolution of Human civilization
http://www.hindu-blog.com/2007/06/ten-incarnations-of-lord-vishnu-in.html
Rare 'Star-Making Machine' Found in Distant Universe
July 10th, 2008
The green and red splotch in this image is the most active star-making galaxy in the very distant universe. Credit: NASA
Astronomers have uncovered an extreme stellar machine -- a galaxy in the very remote universe pumping out stars at a surprising rate of up to 4,000 per year. In comparison, our own Milky Way galaxy turns out an average of just 10 stars per year.
The discovery, made possible by several telescopes including NASA's Spitzer Space Telescope, goes against the most common theory of galaxy formation. According to the theory, called the Hierarchical Model, galaxies slowly bulk up their stars over time by absorbing tiny pieces of galaxies -- and not in one big burst as observed in the newfound "Baby Boom" galaxy.
"This galaxy is undergoing a major baby boom, producing most of its stars all at once," said Peter Capak of NASA's Spitzer Science Center at the California Institute of Technology, Pasadena. "If our human population was produced in a similar boom, then almost all of the people alive today would be the same age." Capak is lead author of a new report detailing the discovery in the July 10th issue of Astrophysical Journal Letters .
The Baby Boom galaxy, which belongs to a class of galaxies called starbursts, is the new record holder for the brightest starburst galaxy in the very distant universe, with brightness being a measure of its extreme star-formation rate. It was discovered and characterized using a suite of telescopes operating at different wavelengths. NASA's Hubble Space Telescope and Japan's Subaru Telescope, atop Mauna Kea in Hawaii, first spotted the galaxy in visible-light images, where it appeared as an inconspicuous smudge due to its great distance.
It wasn't until Spitzer and the James Clerk Maxwell Telescope, also on Mauna Kea in Hawaii, observed the galaxy at infrared and submillimeter wavelengths, respectively, that the galaxy stood out as the brightest of the bunch. This is because it has a huge number of youthful stars.
When stars are born, they shine with a lot of ultraviolet light and produce a lot of dust. The dust absorbs the ultraviolet light but, like a car sitting in the sun, it warms up and re-emits light at infrared and submillimeter wavelengths, making the galaxy unusually bright to Spitzer and the James Clerk Maxwell Telescope.
To learn more about this galaxy's unique youthful glow, Capak and his team followed up with a number of telescopes. They used optical measurements from Keck to determine the exact distance to the galaxy -- a whopping 12.3 billion light-years. That's looking back to a time when the universe was 1.3 billion years old (the universe is approximately 13.7 billion years old today).
"If the universe was a human reaching retirement age, it would have been about 6 years old at the time we are seeing this galaxy," said Capak.
The astronomers made measurements at radio wavelengths with the National Science Foundation's Very Large Array in New Mexico. Together with Spitzer and James Clerk Maxwell data, these observations allowed the astronomers to calculate a star-forming rate of about 1,000 to 4,000 stars per year. At that rate, the galaxy needs only 50 million years, not very long on cosmic timescales, to grow into a galaxy equivalent to the most massive ones we see today.
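[Two quick checks of the numbers above, sketched in Python. The 65-year "retirement age" and the one-star-roughly-one-solar-mass simplification are assumptions made only for illustration; the other figures come from the article itself.]
# Rough checks of the "Baby Boom" galaxy figures.
universe_age_gyr = 13.7           # approximate age of the universe today
galaxy_epoch_gyr = 1.3            # the universe's age at the time we see this galaxy
retirement_age_years = 65         # assumption, for Capak's "reaching retirement age" analogy
print(f"Universe-as-retiree age then: ~{retirement_age_years * galaxy_epoch_gyr / universe_age_gyr:.0f} years")

stars_per_year = 4_000            # upper end of the 1,000 to 4,000 per year estimate
growth_time_years = 50e6          # "only 50 million years"
print(f"Stars formed at the peak rate over 50 Myr: ~{stars_per_year * growth_time_years:.1e}")  # ~2e11

milky_way_rate = 10               # the Milky Way's average, per the article
print(f"Rate compared with the Milky Way: ~{stars_per_year / milky_way_rate:.0f}x")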
While galaxies in our nearby universe can produce stars at similarly high rates, the farthest one known before now was about 11.7 billion light-years away, or a time when the universe was 1.9 billion years old.
"Before now, we had only seen galaxies form stars like this in the teenaged universe, but this galaxy is forming when the universe was only a child," said Capak. "The question now is whether the majority of the very most massive galaxies form very early in the universe like the Baby Boom galaxy, or whether this is an exceptional case. Answering this question will help us determine to what degree the Hierarchical Model of galaxy formation still holds true."
"The incredible star-formation activity we have observed suggests that we may be witnessing, for the first time, the formation of one of the most massive elliptical galaxies in the universe," said co-author Nick Scoville of Caltech, the principal investigator of the Cosmic Evolution Survey, also known as Cosmos. The Cosmos program is an extensive survey of a large patch of distant galaxies across the full spectrum of light.
"The immediate identification of this galaxy with its extraordinary properties would not have been possible without the full range of observations in this survey," said Scoville.
Source: NASA
http://www.physorg.com/news134922376.html
Images and Comments from nexusnovel.wordpress.com/.../
COMMENTS:
Arvind
***
In many respects new physics echoes the understanding of mystic seers and sages. The world is apparently solid but at subatomic level it is largely comprised of empty space like the void or shunya, or as energy vibrations of atoms manifest in the sound of AUM.
Syed
what every written above is just foolishness!!!y Brahma have 4 heads??y not 1?y not 5??and if AUM is the sound,then u must know that sound doesnt travell in space,so ur Brahma is absent there means AUM also absent in space!!!!all concepts r false and imaginary!!!man created!!!nothing else!!!b calm and cool now think that what every u ppl r doing is correct or not???ask ur heart!!!very best of luck!! i dont want u ppl will face musik on judgmentday!!!save urself,u wont get nething by doing such deeds!!!bye
on October 18, 2006 at 2:59 am
Images and Comments from nexusnovel.wordpress.com/.../
Hulagu
Syed - You should stick to your education from mohammad and the koran who say the earth is flat like a carpet and that the sun sets in a muddy pond in the arabian desert.
The AUM "sound" is a vibrating frequency just like light is and many cosmic rays that travel through space and are measurable on earth.
AUM is not a literal "sound" heard by the ear - it is a vibration FELT in the body near the area of the ear, especially when you close your ears to all mundane sounds and when it is all quiet. It is the vibration felt by the soul resident in the body in the head, just like the soul is the driving force behind breath. When you are calm, and can feel this vibration, even in a totally dark room, you can sense the light of the soul with eyes closed. Consciousness of this OM vibration, inner light and breath means that you are beginning to realize who you are beyond the body.
On your day of judgement, all this knowledge will be forced down on you and will come as a complete shock to you.
Hulagu 1258
May 3, 2007 at 3:20 pm
This fact answers the questions, "Why is Allah never defined in the Qur'an? Why did Muhammad assume that the pagan Arabs already knew who Allah was?" Muhammad was raised in the religion of the Moon-god Allah. But he went one step further than his fellow pagan Arabs. While they believed that Allah, i.e. the Moon-god, was the greatest of all gods and the supreme deity in a pantheon of deities, Muhammad decided that Allah was not only the greatest god but the only god.
In effect he said, "Look, you already believe that the Moon-god Allah is the greatest of all gods. All I want you to do is to accept the idea that he is the only god. I am not taking away the Allah you already worship. I am only taking away his wife and his daughters and all the other gods." This is seen from the fact that the first point of the Muslim creed is not, "Allah is great" but "Allah is the greatest," i.e., he is the greatest among the gods. Why would Muhammad say that Allah is the "greatest" except in a polytheistic context? The Arabic word is used to contrast the greater from the lesser. That this is true is seen from the fact that the pagan Arabs never accused Muhammad of preaching a different Allah than the one they already worshipped. This "Allah" was the Moon-god according to the archeological evidence. Muhammad thus attempted to have it both ways. To the pagans, he said that he still believed in the Moon-god Allah. To the Jews and the Christians, he said that Allah was their God too. But both the Jews and the Christians knew better and that is why they rejected his god Allah as a false god.
Al-Kindi, one of the early Christian apologists against Islam, pointed out that Islam and its god Allah did not come from the Bible but from the paganism of the Sabeans. They did not worship the God of the Bible but the Moon-god and his daughters al-Uzza, al-Lat and Manat. Dr. Newman concludes his study of the early Christian-Muslim debates by stating, "Islam proved itself to be...a separate and antagonistic religion which had sprung up from idolatry." Islamic scholar Caesar Farah concluded "There is no reason, therefore, to accept the idea that Allah passed to the Muslims from the Christians and Jews."
The Arabs worshipped the Moon-god as a supreme deity. But this was not biblical monotheism. While the Moon-god was greater than all other gods and goddesses, this was still a polytheistic pantheon of deities. Now that we have the actual idols of the Moon-god, it is no longer possible to avoid the fact that Allah was a pagan god in pre-Islamic times.
Is it any wonder then that the symbol of Islam is the crescent moon? That a crescent moon sits on top of their mosques and minarets? That a crescent moon is found on the flags of Islamic nations? That the Muslims fast during the month which begins and ends with the appearance of the crescent moon in the sky?
http://www.biblebelievers.org.au/moongod.htm
. . . among ancient references, we do seem to find in the Papyrus of Ani several references to the god, though here, his name has been translated as Iah:
In Chapter 2:
"A spell to come forth by day and live after dying. Words spoken by the Osiris Ani: O One, bright as the moon-god Iah; O One, shining as Iah; This Osiris Ani comes forth among these your multitudes outside, bringing himself back as a shining one. He has opened the netherworld. Lo, the Osiris Osiris [sic] Ani comes forth by day, and does as he desires on earth among the living."
And again, in Chapter 18:
"[A spell to] cross over into the land of Amentet by day. Words spoken by the Osiris Ani: Hermopolis is open; my head is sealed [by] Thoth. The eye of Horus is perfect; I have delivered the eye of Horus, and my ornament is glorious on the forehead of Ra, the father of the gods. Osiris is the one who is in Amentet. Indeed, Osiris knows who is not there; I am not there. I am the moon-god Iah among the gods; I do not fail. Indeed, Horus stands; he reckons you among the gods."
Yah, the Other Egyptian Moon God, by Jimmy Dunn
http://www.touregypt.net/featurestories/yah.htm
God, Allah, the Universe and Hinduism
Friday, January 16, 2009
In this time of "Peace at any Price", Why Study War?
From City Journal
[Affluent Western societies have often proved reluctant to use force to prevent greater future violence. “War is an ugly thing, but not the ugliest of things,” observed the British philosopher John Stuart Mill. “The decayed and degraded state of moral and patriotic feeling which thinks that nothing is worth war is much worse.” -- Victor Davis Hanson]
Why Study War?
Military history teaches us about honor, sacrifice, and the inevitability of conflict.
Victor Davis Hanson
Summer 2007
Try explaining to a college student that Tet was an American military victory. You’ll provoke not a counterargument—let alone an assent—but a blank stare: Who or what was Tet? Doing interviews about the recent hit movie 300, I encountered similar bewilderment from listeners and hosts. Not only did most of them not know who the 300 were or what Thermopylae was; they seemed clueless about the Persian Wars altogether.
It’s no surprise that civilian Americans tend to lack a basic understanding of military matters. Even when I was a graduate student, 30-some years ago, military history—understood broadly as the investigation of why one side wins and another loses a war, and encompassing reflections on magisterial or foolish generalship, technological stagnation or breakthrough, and the roles of discipline, bravery, national will, and culture in determining a conflict’s outcome and its consequences—had already become unfashionable on campus. Today, universities are even less receptive to the subject.
This state of affairs is profoundly troubling, for democratic citizenship requires knowledge of war—and now, in the age of weapons of mass annihilation, more than ever.
I came to the study of warfare in an odd way, at the age of 24. Without ever taking a class in military history, I naively began writing about war for a Stanford classics dissertation that explored the effects of agricultural devastation in ancient Greece, especially the Spartan ravaging of the Athenian countryside during the Peloponnesian War. The topic fascinated me. Was the strategy effective? Why assume that ancient armies with primitive tools could easily burn or cut trees, vines, and grain on thousands of acres of enemy farms, when on my family farm in Selma, California, it took me almost an hour to fell a mature fruit tree with a sharp modern ax? Yet even if the invaders couldn’t starve civilian populations, was the destruction still harmful psychologically? Did it goad proud agrarians to come out and fight? And what did the practice tell us about the values of the Greeks—and of the generals who persisted in an operation that seemingly brought no tangible results?
I posed these questions to my prospective thesis advisor, adding all sorts of further justifications. The topic was central to understanding the Peloponnesian War, I noted. The research would be interdisciplinary—a big plus in the modern university—drawing not just on ancient military histories but also on archaeology, classical drama, epigraphy, and poetry. I could bring a personal dimension to the research, too, having grown up around veterans of both world wars who talked constantly about battle. And from my experience on the farm, I wanted to add practical details about growing trees and vines in a Mediterranean climate.
Yet my advisor was skeptical. Agrarian wars, indeed wars of any kind, weren’t popular in classics Ph.D. programs, even though farming and fighting were the ancient Greeks’ two most common pursuits, the sources of anecdote, allusion, and metaphor in almost every Greek philosophical, historical, and literary text. Few classicists seemed to care any more that most notable Greek writers, thinkers, and statesmen—from Aeschylus to Pericles to Xenophon—had served in the phalanx or on a trireme at sea. Dozens of nineteenth-century dissertations and monographs on ancient warfare—on the organization of the Spartan army, the birth of Greek tactics, the strategic thinking of Greek generals, and much more—went largely unread. Nor was the discipline of military history, once central to a liberal education, in vogue on campuses in the seventies. It was as if the university had forgotten that history itself had begun with Herodotus and Thucydides as the story of armed conflicts.
What lay behind this academic lack of interest? The most obvious explanation: this was the immediate post-Vietnam era. The public perception in the Carter years was that America had lost a war that for moral and practical reasons it should never have fought—a catastrophe, for many in the universities, that it must never repeat. The necessary corrective wasn’t to learn how such wars started, went forward, and were lost. Better to ignore anything that had to do with such odious business in the first place.
The nuclear pessimism of the cold war, which followed the horror of two world wars, also dampened academic interest. The postwar obscenity of Mutually Assured Destruction had lent an apocalyptic veneer to contemporary war: as President Kennedy warned, “Mankind must put an end to war, or war will put an end to mankind.” Conflict had become something so destructive, in this view, that it no longer had any relation to the battles of the past. It seemed absurd to worry about a new tank or a novel doctrine of counterinsurgency when the press of a button, unleashing nuclear Armageddon, would render all military thinking superfluous.
Further, the sixties had ushered in a utopian view of society antithetical to serious thinking about war. Government, the military, business, religion, and the family had conspired, the new Rousseauians believed, to warp the naturally peace-loving individual. Conformity and coercion smothered our innately pacifist selves. To assert that wars broke out because bad men, in fear or in pride, sought material advantage or status, or because good men had done too little to stop them, was now seen as antithetical to an enlightened understanding of human nature. “What difference does it make,” in the words of the much-quoted Mahatma Gandhi, “to the dead, the orphans, and the homeless whether the mad destruction is wrought under the name of totalitarianism or the holy name of liberty and democracy?”
The academic neglect of war is even more acute today. Military history as a discipline has atrophied, with very few professorships, journal articles, or degree programs. In 2004, Edward Coffman, a retired military history professor who taught at the University of Wisconsin, reviewed the faculties of the top 25 history departments, as ranked by U.S. News and World Report. He found that of over 1,000 professors, only 21 identified war as a specialty. When war does show up on university syllabi, it’s often about the race, class, and gender of combatants and wartime civilians. So a class on the Civil War will focus on the Underground Railroad and Reconstruction, not on Chancellorsville and Gettysburg. One on World War II might emphasize Japanese internment, Rosie the Riveter, and the horror of Hiroshima, not Guadalcanal and Midway. A survey of the Vietnam War will devote lots of time to the inequities of the draft, media coverage, and the antiwar movement at home, and scant the air and artillery barrages at Khe Sanh.
Those who want to study war in the traditional way face intense academic suspicion, as Margaret Atwood’s poem “The Loneliness of the Military Historian” suggests:
Confess: it’s my profession
that alarms you.
This is why few people ask me to dinner,
though Lord knows I don’t go out of my
way to be scary.
Historians of war must derive perverse pleasure, their critics suspect, from reading about carnage and suffering. Why not figure out instead how to outlaw war forever, as if it were not a tragic, nearly inevitable aspect of human existence? Hence the recent surge of “peace studies” (see “The Peace Racket”).
The university’s aversion to the study of war certainly doesn’t reflect public lack of interest in the subject. Students love old-fashioned war classes on those rare occasions when they’re offered, usually as courses that professors sneak in when the choice of what to teach is left up to them. I taught a number of such classes at California State University, Stanford, and elsewhere. They’d invariably wind up overenrolled, with hordes of students lingering after office hours to offer opinions on the battles of Marathon and Lepanto.
Popular culture, too, displays extraordinary enthusiasm for all things military. There’s a new Military History Channel, and Hollywood churns out a steady supply of blockbuster war movies, from Saving Private Ryan to 300. The post–Ken Burns explosion of interest in the Civil War continues. Historical reenactment societies stage history’s great battles, from the Roman legions’ to the Wehrmacht’s. Barnes and Noble and Borders bookstores boast well-stocked military history sections, with scores of new titles every month. A plethora of websites obsess over strategy and tactics. Hit video games grow ever more realistic in their reconstructions of battles.
The public may feel drawn to military history because it wants to learn about honor and sacrifice, or because of interest in technology—the muzzle velocity of a Tiger Tank’s 88mm cannon, for instance—or because of a pathological need to experience violence, if only vicariously. The importance—and challenge—of the academic study of war is to elevate that popular enthusiasm into a more capacious and serious understanding, one that seeks answers to such questions as: Why do wars break out? How do they end? Why do the winners win and the losers lose? How best to avoid wars or contain their worst effects?
A wartime public illiterate about the conflicts of the past can easily find itself paralyzed in the acrimony of the present. Without standards of historical comparison, it will prove ill equipped to make informed judgments. Neither our politicians nor most of our citizens seem to recall the incompetence and terrible decisions that, in December 1777, December 1941, and November 1950, led to massive American casualties and, for a time, public despair. So it’s no surprise that today so many seem to think that the violence in Iraq is unprecedented in our history. Roughly 3,000 combat dead in Iraq in some four years of fighting is, of course, a terrible thing. And it has provoked national outrage to the point of considering withdrawal and defeat, as we still bicker over up-armored Humvees and proper troop levels. But a previous generation considered Okinawa a stunning American victory, and prepared to follow it with an invasion of the Japanese mainland itself—despite losing, in a little over two months, four times as many Americans as we have lost in Iraq, casualties of faulty intelligence, poor generalship, and suicidal head-on assaults against fortified positions.
It’s not that military history offers cookie-cutter comparisons with the past. Germany’s World War I victory over Russia in under three years and her failure to take France in four apparently misled Hitler into thinking that he could overrun the Soviets in three or four weeks—after all, he had brought down historically tougher France in just six. Similarly, the conquest of the Taliban in eight weeks in 2001, followed by the establishment of constitutional government within a year in Kabul, did not mean that the similarly easy removal of Saddam Hussein in three weeks in 2003 would ensure a working Iraqi democracy within six months. The differences between the countries—cultural, political, geographical, and economic—were too great.
Instead, knowledge of past wars establishes wide parameters of what to expect from new ones. Themes, emotions, and rhetoric remain constant over the centuries, and thus generally predictable. Athens’s disastrous expedition in 415 BC against Sicily, the largest democracy in the Greek world, may not prefigure our war in Iraq. But the story of the Sicilian calamity does instruct us on how consensual societies can clamor for war—yet soon become disheartened and predicate their support on the perceived pulse of the battlefield.
Military history teaches us, contrary to popular belief these days, that wars aren’t necessarily the most costly of human calamities. The first Gulf War took few lives in getting Saddam out of Kuwait; doing nothing in Rwanda allowed savage gangs and militias to murder hundreds of thousands with impunity. Hitler, Mao, Pol Pot, and Stalin killed far more off the battlefield than on it. The 1918 Spanish flu epidemic brought down more people than World War I did. And more Americans—over 3.2 million—lost their lives driving over the last 90 years than died in combat in this nation’s 231-year history. Perhaps what bothers us about wars, though, isn’t just their horrific lethality but also that people choose to wage them—which makes them seem avoidable, unlike a flu virus or a car wreck, and their tolls unduly grievous. Yet military history also reminds us that war sometimes has an eerie utility: as British strategist Basil H. Liddell Hart put it, “War is always a matter of doing evil in the hope that good may come of it.” Wars—or threats of wars—put an end to chattel slavery, Nazism, fascism, Japanese militarism, and Soviet Communism.
Military history is as often the story of appeasement as of warmongering. The destructive military careers of Alexander the Great, Caesar, Napoleon, and Hitler would all have ended early had any of their numerous enemies united when the odds favored them. Western air power stopped Slobodan Milošević’s reign of terror at little cost to NATO forces—but only after a near-decade of inaction and dialogue had made possible the slaughter of tens of thousands. Affluent Western societies have often proved reluctant to use force to prevent greater future violence. “War is an ugly thing, but not the ugliest of things,” observed the British philosopher John Stuart Mill. “The decayed and degraded state of moral and patriotic feeling which thinks that nothing is worth war is much worse.”
Why Study War?
Military history teaches us about honor, sacrifice, and the inevitability of conflict.
Victor Davis Hanson
Summer 2007
[Affluent Western societies have often proved reluctant to use force to prevent greater future violence. “War is an ugly thing, but not the ugliest of things,” observed the British philosopher John Stuart Mill. “The decayed and degraded state of moral and patriotic feeling which thinks that nothing is worth war is much worse.” -- Victor Davis Hanson]
Try explaining to a college student that Tet was an American military victory. You’ll provoke not a counterargument—let alone an assent—but a blank stare: Who or what was Tet? Doing interviews about the recent hit movie 300, I encountered similar bewilderment from listeners and hosts. Not only did most of them not know who the 300 were or what Thermopylae was; they seemed clueless about the Persian Wars altogether.
It’s no surprise that civilian Americans tend to lack a basic understanding of military matters. Even when I was a graduate student, 30-some years ago, military history—understood broadly as the investigation of why one side wins and another loses a war, and encompassing reflections on magisterial or foolish generalship, technological stagnation or breakthrough, and the roles of discipline, bravery, national will, and culture in determining a conflict’s outcome and its consequences—had already become unfashionable on campus. Today, universities are even less receptive to the subject.
This state of affairs is profoundly troubling, for democratic citizenship requires knowledge of war—and now, in the age of weapons of mass annihilation, more than ever.
I came to the study of warfare in an odd way, at the age of 24. Without ever taking a class in military history, I naively began writing about war for a Stanford classics dissertation that explored the effects of agricultural devastation in ancient Greece, especially the Spartan ravaging of the Athenian countryside during the Peloponnesian War. The topic fascinated me. Was the strategy effective? Why assume that ancient armies with primitive tools could easily burn or cut trees, vines, and grain on thousands of acres of enemy farms, when on my family farm in Selma, California, it took me almost an hour to fell a mature fruit tree with a sharp modern ax? Yet even if the invaders couldn’t starve civilian populations, was the destruction still harmful psychologically? Did it goad proud agrarians to come out and fight? And what did the practice tell us about the values of the Greeks—and of the generals who persisted in an operation that seemingly brought no tangible results?
I posed these questions to my prospective thesis advisor, adding all sorts of further justifications. The topic was central to understanding the Peloponnesian War, I noted. The research would be interdisciplinary—a big plus in the modern university—drawing not just on ancient military histories but also on archaeology, classical drama, epigraphy, and poetry. I could bring a personal dimension to the research, too, having grown up around veterans of both world wars who talked constantly about battle. And from my experience on the farm, I wanted to add practical details about growing trees and vines in a Mediterranean climate.
Yet my advisor was skeptical. Agrarian wars, indeed wars of any kind, weren’t popular in classics Ph.D. programs, even though farming and fighting were the ancient Greeks’ two most common pursuits, the sources of anecdote, allusion, and metaphor in almost every Greek philosophical, historical, and literary text. Few classicists seemed to care any more that most notable Greek writers, thinkers, and statesmen—from Aeschylus to Pericles to Xenophon—had served in the phalanx or on a trireme at sea. Dozens of nineteenth-century dissertations and monographs on ancient warfare—on the organization of the Spartan army, the birth of Greek tactics, the strategic thinking of Greek generals, and much more—went largely unread. Nor was the discipline of military history, once central to a liberal education, in vogue on campuses in the seventies. It was as if the university had forgotten that history itself had begun with Herodotus and Thucydides as the story of armed conflicts.
What lay behind this academic lack of interest? The most obvious explanation: this was the immediate post-Vietnam era. The public perception in the Carter years was that America had lost a war that for moral and practical reasons it should never have fought—a catastrophe, for many in the universities, that it must never repeat. The necessary corrective wasn’t to learn how such wars started, went forward, and were lost. Better to ignore anything that had to do with such odious business in the first place.
The nuclear pessimism of the cold war, which followed the horror of two world wars, also dampened academic interest. The postwar obscenity of Mutually Assured Destruction had lent an apocalyptic veneer to contemporary war: as President Kennedy warned, “Mankind must put an end to war, or war will put an end to mankind.” Conflict had become something so destructive, in this view, that it no longer had any relation to the battles of the past. It seemed absurd to worry about a new tank or a novel doctrine of counterinsurgency when the press of a button, unleashing nuclear Armageddon, would render all military thinking superfluous.
Further, the sixties had ushered in a utopian view of society antithetical to serious thinking about war. Government, the military, business, religion, and the family had conspired, the new Rousseauians believed, to warp the naturally peace-loving individual. Conformity and coercion smothered our innately pacifist selves. To assert that wars broke out because bad men, in fear or in pride, sought material advantage or status, or because good men had done too little to stop them, was now seen as antithetical to an enlightened understanding of human nature. “What difference does it make,” in the words of the much-quoted Mahatma Gandhi, “to the dead, the orphans, and the homeless whether the mad destruction is wrought under the name of totalitarianism or the holy name of liberty and democracy?”
The academic neglect of war is even more acute today. Military history as a discipline has atrophied, with very few professorships, journal articles, or degree programs. In 2004, Edward Coffman, a retired military history professor who taught at the University of Wisconsin, reviewed the faculties of the top 25 history departments, as ranked by U.S. News and World Report. He found that of over 1,000 professors, only 21 identified war as a specialty. When war does show up on university syllabi, it’s often about the race, class, and gender of combatants and wartime civilians. So a class on the Civil War will focus on the Underground Railroad and Reconstruction, not on Chancellorsville and Gettysburg. One on World War II might emphasize Japanese internment, Rosie the Riveter, and the horror of Hiroshima, not Guadalcanal and Midway. A survey of the Vietnam War will devote lots of time to the inequities of the draft, media coverage, and the antiwar movement at home, and scant the air and artillery barrages at Khe Sanh.
Those who want to study war in the traditional way face intense academic suspicion, as Margaret Atwood’s poem “The Loneliness of the Military Historian” suggests:
Confess: it’s my profession
that alarms you.
This is why few people ask me to dinner,
though Lord knows I don’t go out of my
way to be scary.
Historians of war must derive perverse pleasure, their critics suspect, from reading about carnage and suffering. Why not figure out instead how to outlaw war forever, as if it were not a tragic, nearly inevitable aspect of human existence? Hence the recent surge of “peace studies” (see “The Peace Racket”).
The university’s aversion to the study of war certainly doesn’t reflect public lack of interest in the subject. Students love old-fashioned war classes on those rare occasions when they’re offered, usually as courses that professors sneak in when the choice of what to teach is left up to them. I taught a number of such classes at California State University, Stanford, and elsewhere. They’d invariably wind up overenrolled, with hordes of students lingering after office hours to offer opinions on the battles of Marathon and Lepanto.
Popular culture, too, displays extraordinary enthusiasm for all things military. There’s a new Military History Channel, and Hollywood churns out a steady supply of blockbuster war movies, from Saving Private Ryan to 300. The post–Ken Burns explosion of interest in the Civil War continues. Historical reenactment societies stage history’s great battles, from the Roman legions’ to the Wehrmacht’s. Barnes and Noble and Borders bookstores boast well-stocked military history sections, with scores of new titles every month. A plethora of websites obsess over strategy and tactics. Hit video games grow ever more realistic in their reconstructions of battles.
The public may feel drawn to military history because it wants to learn about honor and sacrifice, or because of interest in technology—the muzzle velocity of a Tiger Tank’s 88mm cannon, for instance—or because of a pathological need to experience violence, if only vicariously. The importance—and challenge—of the academic study of war is to elevate that popular enthusiasm into a more capacious and serious understanding, one that seeks answers to such questions as: Why do wars break out? How do they end? Why do the winners win and the losers lose? How best to avoid wars or contain their worst effects?
A wartime public illiterate about the conflicts of the past can easily find itself paralyzed in the acrimony of the present. Without standards of historical comparison, it will prove ill equipped to make informed judgments. Neither our politicians nor most of our citizens seem to recall the incompetence and terrible decisions that, in December 1777, December 1941, and November 1950, led to massive American casualties and, for a time, public despair. So it’s no surprise that today so many seem to think that the violence in Iraq is unprecedented in our history. Roughly 3,000 combat dead in Iraq in some four years of fighting is, of course, a terrible thing. And it has provoked national outrage to the point of considering withdrawal and defeat, as we still bicker over up-armored Humvees and proper troop levels. But a previous generation considered Okinawa a stunning American victory, and prepared to follow it with an invasion of the Japanese mainland itself—despite losing, in a little over two months, four times as many Americans as we have lost in Iraq, casualties of faulty intelligence, poor generalship, and suicidal head-on assaults against fortified positions.
It’s not that military history offers cookie-cutter comparisons with the past. Germany’s World War I victory over Russia in under three years and her failure to take France in four apparently misled Hitler into thinking that he could overrun the Soviets in three or four weeks—after all, he had brought down historically tougher France in just six. Similarly, the conquest of the Taliban in eight weeks in 2001, followed by the establishment of constitutional government within a year in Kabul, did not mean that the similarly easy removal of Saddam Hussein in three weeks in 2003 would ensure a working Iraqi democracy within six months. The differences between the countries—cultural, political, geographical, and economic—were too great.
Instead, knowledge of past wars establishes wide parameters of what to expect from new ones. Themes, emotions, and rhetoric remain constant over the centuries, and thus generally predictable. Athens’s disastrous expedition in 415 BC against Sicily, the largest democracy in the Greek world, may not prefigure our war in Iraq. But the story of the Sicilian calamity does instruct us on how consensual societies can clamor for war—yet soon become disheartened and predicate their support on the perceived pulse of the battlefield.
Military history teaches us, contrary to popular belief these days, that wars aren’t necessarily the most costly of human calamities. The first Gulf War took few lives in getting Saddam out of Kuwait; doing nothing in Rwanda allowed savage gangs and militias to murder hundreds of thousands with impunity. Hitler, Mao, Pol Pot, and Stalin killed far more off the battlefield than on it. The 1918 Spanish flu epidemic brought down more people than World War I did. And more Americans—over 3.2 million—lost their lives driving over the last 90 years than died in combat in this nation’s 231-year history. Perhaps what bothers us about wars, though, isn’t just their horrific lethality but also that people choose to wage them—which makes them seem avoidable, unlike a flu virus or a car wreck, and their tolls unduly grievous. Yet military history also reminds us that war sometimes has an eerie utility: as British strategist Basil H. Liddell Hart put it, “War is always a matter of doing evil in the hope that good may come of it.” Wars—or threats of wars—put an end to chattel slavery, Nazism, fascism, Japanese militarism, and Soviet Communism.
Military history is as often the story of appeasement as of warmongering. The destructive military careers of Alexander the Great, Caesar, Napoleon, and Hitler would all have ended early had any of their numerous enemies united when the odds favored them. Western air power stopped Slobodan Milošević’s reign of terror at little cost to NATO forces—but only after a near-decade of inaction and dialogue had made possible the slaughter of tens of thousands. Affluent Western societies have often proved reluctant to use force to prevent greater future violence. “War is an ugly thing, but not the ugliest of things,” observed the British philosopher John Stuart Mill. “The decayed and degraded state of moral and patriotic feeling which thinks that nothing is worth war is much worse.”
Indeed, by ignoring history, the modern age is free to interpret war as a failure of communication, of diplomacy, of talking—as if aggressors don’t know exactly what they’re doing. Speaker of the House Nancy Pelosi, frustrated by the Bush administration’s intransigence in the War on Terror, flew to Syria, hoping to persuade President Assad to stop funding terror in the Middle East. She assumed that Assad’s belligerence resulted from our aloofness and arrogance rather than from his dictatorship’s interest in destroying democracy in Lebanon and Iraq, before such contagious freedom might in fact destroy him. For a therapeutically inclined generation raised on Oprah and Dr. Phil—and not on the letters of William Tecumseh Sherman and William Shirer’s Berlin Diary—problems between states, like those in our personal lives, should be argued about by equally civilized and peaceful rivals, and so solved without resorting to violence.
Yet it’s hard to find many wars that result from miscommunication. Far more often they break out because of malevolent intent and the absence of deterrence. Margaret Atwood also wrote in her poem: “Wars happen because the ones who start them / think they can win.” Hitler did; so did Mussolini and Tojo—and their assumptions were logical, given the relative disarmament of the Western democracies at the time. Bin Laden attacked on September 11 not because there was a dearth of American diplomats willing to dialogue with him in the Hindu Kush. Instead, he recognized that a series of Islamic terrorist assaults against U.S. interests over two decades had met with no meaningful reprisals, and concluded that decadent Westerners would never fight, whatever the provocation—or that, if we did, we would withdraw as we had from Mogadishu.
In the twenty-first century, it’s easier than ever to succumb to technological determinism, the idea that science, new weaponry, and globalization have altered the very rules of war. But military history teaches us that our ability to strike a single individual from 30,000 feet up with a GPS bomb or a jihadist’s efforts to have his propaganda beamed to millions in real time do not necessarily transform the conditions that determine who wins and who loses wars.
True, instant communications may compress decision making, and generals must be skilled at news conferences that can now influence the views of millions worldwide. Yet these are really just new wrinkles on the old face of war. The improvised explosive device versus the up-armored Humvee is simply an updated take on the catapult versus the stone wall or the harquebus versus the mailed knight. The long history of war suggests no static primacy of the defensive or the offensive, or of one sort of weapon over the other, but just temporary advantages gained by particular strategies and technologies that go unanswered for a time by less adept adversaries.
So it’s highly doubtful, the study of war tells us, that a new weapon will emerge from the Pentagon or anywhere else that will change the very nature of armed conflict—unless some sort of genetic engineering so alters man’s brain chemistry that he begins to act in unprecedented ways. We fought the 1991 Gulf War with dazzling, computer-enhanced weaponry. But lost in the technological pizzazz was the basic wisdom that we need to fight wars with political objectives in mind and that, to conclude them decisively, we must defeat and even humiliate our enemies, so that they agree to abandon their prewar behavior. For some reason, no American general or diplomat seemed to understand that crucial point 16 years ago, with the result that, on the cessation of hostilities, Saddam Hussein’s supposedly defeated generals used their gunships to butcher Kurds and Shiites while Americans looked on. And because we never achieved the war’s proper aim—ensuring that Iraq would not use its petro-wealth to destroy the peace of the region—we have had to fight a second war of no-fly zones, and then a third war to remove Saddam, and now a fourth war, of counterinsurgency, to protect the fledgling Iraqi democracy.
Military history reminds us of important anomalies and paradoxes. When Sparta invaded Attica in the first spring of the Peloponnesian war, Thucydides recounts, it expected the Athenians to surrender after a few short seasons of ravaging. They didn’t—but a plague that broke out unexpectedly did more damage than thousands of Spartan ravagers did. Twenty-seven years later, a maritime Athens lost the war at sea to Sparta, an insular land power that started the conflict with scarcely a navy. The 2003 removal of Saddam refuted doom-and-gloom critics who predicted thousands of deaths and millions of refugees, just as the subsequent messy four-year reconstruction hasn’t evolved as anticipated into a quiet, stable democracy—to say the least.
The size of armies doesn’t guarantee battlefield success: the victors at Salamis, Issos, Mexico City, and Lepanto were all outnumbered. War’s most savage moments—the Allied summer offensive of 1918, the Russian siege of Berlin in the spring of 1945, the Battle of the Bulge, Hiroshima—often unfold right before hostilities cease. And democratic leaders during war—think of Winston Churchill, Harry Truman, and Richard Nixon—often leave office either disgraced or unpopular.
It would be reassuring to think that the righteousness of a cause, or the bravery of an army, or the nobility of a sacrifice ensures public support for war. But military history shows that far more often the perception of winning is what matters. Citizens turn abruptly on any leaders deemed culpable for losing. “Public sentiment is everything,” wrote Abraham Lincoln. “With public sentiment nothing can fail. Without it nothing can succeed. He who molds opinion is greater than he who enacts laws.” Lincoln knew that lesson well. Gettysburg and Vicksburg were brilliant Union victories that by summer 1863 had restored Lincoln’s previously shaky credibility. But a year later, after the Wilderness, Spotsylvania, Petersburg, and Cold Harbor battles—Cold Harbor claimed 7,000 Union lives in 20 minutes—the public reviled him. Neither Lincoln nor his policies had changed, but the Confederate ability to kill large numbers of Union soldiers had.
Ultimately, public opinion follows the ups and downs—including the perception of the ups and downs—of the battlefield, since victory excites the most ardent pacifist and defeat silences the most zealous zealot. After the defeat of France, the losses to Bomber Command, the U-boat rampage, and the fall of Greece, Singapore, and Dunkirk, Churchill took the blame for a war as seemingly lost as, a little later, it seemed won by the brilliant prime minister after victories in North Africa, Sicily, and Normandy. When the successful military action against Saddam Hussein ended in April 2003, over 70 percent of the American people backed it, with politicians and pundits alike elbowing each other aside to take credit for their prescient support. Four years of insurgency later, Americans oppose a now-orphaned war by the same margin. General George S. Patton may have been uncouth, but he wasn’t wrong when he bellowed, “Americans love a winner and will not tolerate a loser.” The American public turned on the Iraq War not because of Cindy Sheehan or Michael Moore but because it felt that the battlefield news had turned uniformly bad and that the price in American lives and treasure for ensuring Iraqi reform was too dear.
Finally, military history has the moral purpose of educating us about past sacrifices that have secured our present freedom and security. If we know nothing of Shiloh, Belleau Wood, Tarawa, and Chosin, the crosses in our military cemeteries are just pleasant white stones on lush green lawns. They no longer serve as reminders that thousands endured pain and hardship for our right to listen to what we wish on our iPods and to shop at Wal-Mart in safety—or that they expected future generations, links in this great chain of obligation, to do the same for those not yet born. The United States was born through war, reunited by war, and saved from destruction by war. No future generation, however comfortable and affluent, should escape that terrible knowledge.
What, then, can we do to restore the study of war to its proper place in the life of the American mind? The challenge isn’t just to reform the graduate schools or the professoriate, though that would help. On a deeper level, we need to reexamine the larger forces that have devalued the very idea of military history—of war itself. We must abandon the naive faith that with enough money, education, or good intentions we can change the nature of mankind so that conflict, as if by fiat, becomes a thing of the past. In the end, the study of war reminds us that we will never be gods. We will always just be men, it tells us. Some men will always prefer war to peace; and other men, we who have learned from the past, have a moral obligation to stop them.
Studying War: Where to Start
While Thucydides’ Peloponnesian War, a chronicle of the three-decade war between Athens and Sparta, establishes the genre of military history, the best place to begin studying war is with the soldiers’ stories themselves. E. B. Sledge’s memoir of Okinawa, With the Old Breed, is nightmarish, but it reminds us that war, while it often translates to rot, filth, and carnage, can also be in the service of a noble cause. Elmer Bendiner’s tragic retelling of the annihilation of B-17s over Germany, The Fall of Fortresses: A Personal Account of the Most Daring, and Deadly, American Air Battles of World War II, is an unrecognized classic.
From a different wartime perspective—that of the generals—U. S. Grant’s Personal Memoirs is justly celebrated as a model of prose. Yet the nearly contemporaneous Memoirs of General W. T. Sherman is far more analytical in its dissection of the human follies and pretensions that lead to war. Likewise, George S. Patton’s War As I Knew It is not only a compilation of the eccentric general’s diary entries but also a candid assessment of human nature itself.
Fiction often captures the experience of war as effectively as memoir, beginning with Homer’s Iliad, in which Achilles confronts the paradox that rewards do not always go to the most deserving in war. The three most famous novels about the futility of conflict are The Red Badge of Courage, by Stephen Crane, All Quiet on the Western Front, by Erich Maria Remarque, and August 1914, by Aleksandr Solzhenitsyn. No work has better insights on the folly of war, however, than Euripides’ Trojan Women.
Although many contemporary critics find it passé to document landmark battles in history, one can find a storehouse of information in The Fifteen Decisive Battles of the World, by Edward S. Creasy, and A Military History of the Western World, by J. F. C. Fuller. Hans Delbrück’s History of the Art of War and Russell F. Weigley’s The Age of Battles center their sweeping histories on decisive engagements, using battles like Marathon and Waterloo as tools to illustrate larger social, political, and cultural values. A sense of high drama permeates William H. Prescott’s History of the Conquest of Mexico and History of the Conquest of Peru, while tragedy more often characterizes Steven Runciman’s spellbinding short account The Fall of Constantinople 1453 and Donald Morris’s massive The Washing of the Spears, about the rise and fall of the Zulu Empire. The most comprehensive and accessible one-volume treatment of history’s most destructive war remains Gerhard L. Weinberg’s A World at Arms: A Global History of World War II.
Relevant histories for our current struggle with Middle East terrorism are Alistair Horne’s superb A Savage War of Peace: Algeria 1954–1962, Michael Oren’s Six Days of War, and Mark Bowden’s Black Hawk Down. Anything John Keegan writes is worth reading; The Face of Battle remains the most impressive general military history of the last 50 years.
Biography too often winds up ignored in the study of war. Plutarch’s lives of Pericles, Alcibiades, Julius Caesar, Pompey, and Alexander the Great established the traditional view of these great captains as men of action, while weighing their record of near-superhuman achievement against their megalomania. Elizabeth Longford’s Wellington is a classic study of England’s greatest soldier. Lee’s Lieutenants: A Study in Command, by Douglas Southall Freeman, has been slighted recently but is spellbinding.
If, as Carl von Clausewitz believed, “War is the continuation of politics by other means,” then study of civilian wartime leadership is critical. The classic scholarly account of the proper relationship between the military and its overseers is still Samuel P. Huntington’s The Soldier and the State: The Theory and Politics of Civil-Military Relations. For a contemporary J’accuse of American military leadership during the Vietnam War, see H. R. McMaster’s Dereliction of Duty: Lyndon Johnson, Robert McNamara, the Joint Chiefs of Staff, and the Lies That Led to Vietnam.
Eliot A. Cohen’s Supreme Command: Soldiers, Statesmen, and Leadership in Wartime is purportedly a favorite read of President Bush’s. It argues that successful leaders like Ben-Gurion, Churchill, Clemenceau, and Lincoln kept a tight rein on their generals and never confused officers’ esoteric military expertise with either political sense or strategic resolution.
In The Mask of Command, Keegan examines the military competence of Alexander the Great, Wellington, Grant, and Hitler, and comes down on the side of the two who fought under consensual government. In The Soul of Battle, I took that argument further and suggested that three of the most audacious generals—Epaminondas, Sherman, and Patton—were also keen political thinkers, with strategic insight into what made their democratic armies so formidable.
How politicians lose wars is also of interest. See especially Ian Kershaw’s biography Hitler, 1936–1945: Nemesis. Mark Moyar’s first volume of a proposed two-volume reexamination of Vietnam, Triumph Forsaken: The Vietnam War, 1954–1965, is akin to reading Euripides’ tales of self-inflicted woe and missed chances. Horne has written a half-dozen classics, none more engrossing than his tragic To Lose a Battle: France 1940.
Few historians can weave military narrative into the contemporary political and cultural landscape. James McPherson’s Battle Cry of Freedom does, and his volume began the recent renaissance of Civil War history. Barbara Tuchman’s The Guns of August describes the first month of World War I in riveting but excruciatingly sad detail. Two volumes by David McCullough, Truman and 1776, give fascinating inside accounts of the political will necessary to continue wars amid domestic depression and bad news from the front. So does Martin Gilbert’s Winston S. Churchill: Finest Hour, 1939–1941. Donald Kagan’s On the Origins of War and the Preservation of Peace warns against the dangers of appeasement, especially the lethal combination of tough rhetoric with no military preparedness, in a survey of wars from ancient Greece to the Cuban missile crisis. Robert Kagan’s Dangerous Nation reminds Americans that their idealism (if not self-righteousness) is nothing new but rather helps explain more than two centuries of both wise and ill-considered intervention abroad.
Any survey on military history should conclude with more abstract lessons about war. Principles of War by Clausewitz remains the cornerstone of the science. Niccolò Machiavelli’s The Art of War blends realism with classical military detail. Two indispensable works, War: Ends and Means, by Angelo Codevilla and Paul Seabury, and Makers of Modern Strategy, edited by Peter Paret, provide refreshingly honest accounts of the timeless rules and nature of war.
—Victor Davis Hanson
http://www.city-journal.org/html/17_3_military_history.html
Wednesday, December 31, 2008
HAPPY NEW YEAR!
. . . to all friends, readers, visitors, supporters, and supporting blogs,
from . . .
Islamic Danger to Americans
How to Stop the Islamic Jihad
Islamic Danger FU
The Jew in Yellow
islamic Danger 2U
Islamic Danger to Bharat (India)
Islamic Danger in History
Islamic Danger (original, now censored)
On the Back of My Mind
The Islamic Danger family of blogs
May the new year bring us all joy and glorious times, with the opposite to all who wish us ill and seek to destroy us.
Saturday, December 13, 2008
Nietzschean Musings: God’s Suicide
by Mandavya Atri
http://www.antichristian-phenomenon.com/
from
http://business-houses-of-jesus-christ.blogspot.com/2008/08/nietzschean-musings-gods-suicide.html
via http://anaryasviews.blogspot.com/
Friedrich Nietzsche wrote in his “Thus Spoke Zarathustra” that “God is dead! And we have killed him!” While I admire Nietzsche’s philosophy very much, I think we should dig even further into this matter. We never laid a finger on God; our hands are clean! God committed suicide! He’s dead, by his own hand!
-
I won’t bother with the Old Testament, when he was just a tribal god, similar to so many others, and “inspired” by the other gods “who didn’t exist”. Let the Jews worry about that. I’m talking about when he was “promoted” to a universal god, for all human beings. Suddenly he changed his old ways and decided to use other methods. He appeared to us in the form of a bastard son and made his new rules heard. Now the whole human race became his “chosen ones”, and bloodshed, mass exterminations, and crushing the infidels were no longer required. Suddenly he spoke of love, peace, and tolerance between people.
-
Jesus (as he is portrayed in the Bible) appears as a peaceful, passive being, devoid of any pride or ego. He asks us to love our neighbour, to love our fellow humans as we love ourselves, and to grant permanent forgiveness to those who hate, hurt, or wrong us, no matter how great the wrong. We always have to forgive and never resist when we are attacked, insulted, or beaten. The rules about chastity are tightened even further: now not only is sex before marriage or cheating on your partner sinful, but even looking at someone of the opposite sex and secretly desiring him or her is treated as a deadly sin. The same goes for verbally insulting someone and calling them “fools”. This is God’s new ideology, which we must follow in order to achieve eternal happiness. This ideology brought by Jesus is more than just an ideology. This is what God represents. This is God.
-
But then we see how God inflicts wounds on himself when, just chapters after he told us not to call people “fools”, he does so himself. He goes even further when he attacks the merchants selling things in the temple. His passivity and friendly attitude suddenly change, and in doing so he puts another nail in his own coffin. If Jesus was the wisest man alive, why didn’t he use his infinite wisdom to peacefully convince the narrow-minded merchants to leave the temple?
-
Tolerance is abandoned again when he calls those who do not listen to his teachings “vipers and vermin”. Where has “love thy neighbour” gone? Surely no one speaks this way to a neighbour he loves, even when that neighbour does something wrong. But Jesus seals his fate as God when he calls for eternal punishment in hell for the infidels who do not follow him. Righteous and goodhearted actions are deemed nothing without faith in Christ; without this faith, we all go to hell. This is the moment God renders his ideology obsolete. This is when God committed suicide. He is the one who asks for unlimited forgiveness, but he is also the one who offers unlimited punishment for limited sins committed during a pathetic lifetime of a few years that doesn’t even matter in the whole context of history.
-
It is obvious that God cannot follow his own rules. His words are contradicted either by other words or by his actions. If one who does not follow these rules deserves eternal death, and God himself cannot follow them, then he sentences himself to death: he commits suicide. We are innocent. As such, his ideology is rendered pointless and useless to us.
-
But even after his suicide, God still has followers. They call themselves “Christians”. But what is a Christian? Nietzsche wrote in “The Antichrist” that the only Christian in human history died on the cross. Again, I go even further and say there has never been such a thing as a Christian, since no one has ever followed the rules required by the suicidal god to earn this title. If we go out on the streets and ask 1,000 people “What is a Christian, and what does it take to be one?”, it is guaranteed that we will get exactly 1,000 definitions. The truth is that after God committed suicide, people performed an autopsy on him and dissected him, each taking the part that suits him or her best. We will find that some don’t consider sex before marriage or sexual lust a sin at all; when we bring up the lines quoted above, they say either that these are not important or that they are just a figure of speech. Others, who consider themselves fundamentalists, find nothing wrong in rejecting atheists and people who don’t believe what they believe. Or why don’t we try hitting one of these Christians in the face? Will they be real Christians and turn the other cheek, perhaps while trying to talk to us rationally, or will they do as a “mere human” would and hit back?
-
Also, who actually believes in this shallow, senseless emotion called “love for everybody”? Do you really think anyone actually experiences this “love” for everyone in the world? No! Some might have the guts to say so, but those are just words; they couldn’t care less if two blocks away there were a car accident in which people died. But what about loving Christ more than your wife, girlfriend, mother, father, and so on? Does anyone even think of Christ when they are with someone they love? Christian teachings are deeply incompatible with human personality; that’s why no one can follow them. As I said, the only thing Christians can do is take God apart and reassemble him in the way they choose.
-
Let’s face it: for every “Christian” there is a Jesus who fits their own interests and interpretation. There are Orthodox Christians, Catholic Christians, Protestant Christians, Jehovah’s Witness Christians, and a million other kinds of Christians, but no actual Christian. There never has been.
Sunday, August 24, 2008
Source: The Antichristian Phenomenon
URL: http://www.antichristian-phenomenon.com/
God Of History
by Rebecca Bynum (Dec. 2008)
published at
http://www.newenglishreview.org/custpage.cfm/frm/29231/sec_id/29231
One of the most confusing aspects of modern Judeo-Christian thought lies in the attempt to reconcile two opposing concepts of God. One is of God as the loving and merciful Father of the individual, who is concerned primarily with individual salvation and survival after death. The other is of God as an actor in history, who controls and shapes the historical drama for his purpose, disregarding the individual, as is often depicted in the Bible. In his book After Auschwitz, Richard L. Rubenstein proposes that theology itself is essentially an attempt to diminish the cognitive dissonance that belief in both these aspects of God causes in the believer. There is a gulf between the Biblical God of history and the God of individual human experience, a gulf that theologians attempt to bridge. That gulf has grown wider, and those theological bridges less tenable, in the face of the unprecedented scale of death and destruction wrought by man in the twentieth century.
In examining this problem it is evident that though God himself must be conceived of as eternal and unchanging, human awareness and understanding of God has been an evolving quest through the generations. The Bible contains not only a record of the concept of God as it has evolved over the centuries, but also a historical record of the Jewish people and descriptions of their national drama, in which God is thought to take a special interest. This record is traditionally interpreted as that of a God who is involved in the reward and punishment of the Jewish people as a whole, chastising them when they stray and rewarding them when they are faithful to his word. The Jews are thought to be held to a higher standard of obedience due to the idea that God has chosen them to be bearers of the divine light.
We are then confronted with the theological problem, not only of flawed divine justice (as all collective punishments and rewards would necessarily be flawed, if not entirely unjust), but also the idea that God must then be involved with evil, even to partake of evil, in order to dispense these collective punishments. So, either God is omnipotent and unjust or he is just but not omnipotent. A third option, that God is self-limited for the purpose of allowing mankind freedom of will, is rarely taken up, for the idea of a punishing God is deeply embedded in both Jewish and Christian thought.
Many Jewish and Christian theologians are of the opinion that God exists in a realm beyond good and evil and that he works his will by using evil as a necessary means to teach and perfect imperfect man, who is ever tending toward selfishness, egotism and greed, and is forever forgetting his obligation to God. In fact, some Jews and Christians even describe the Holocaust as part of the divine plan, that God actually used Hitler in order to punish the Jews. They differ only on the reason for this punishment. Some Jewish theologians have proposed that the Holocaust occurred because the Jews were not faithful enough to the Torah and too assimilated into gentile society (even though the conservative orthodox Polish population bore the full brunt of the atrocities while the more assimilated population in America escaped). Some Christian theologians have surmised that the Jews as a people were still being punished for having rejected Christ (even though the actual rejection of Jesus occurred only on the part of Annas, Caiaphas and a handful of leaders of the Sanhedrin – not the common people, whose very embrace of Jesus had aroused the fear and ire of those same men). Neither of these theological explanations evokes a loving, trustworthy, fatherly God, but rather an anthropomorphic despot, unworthy of enlightened worship. A world in which the creature is on a higher moral plane than the creator poses a theological dilemma of the most profound sort.
The depiction of deity as a vengeful and jealous despot is entirely in keeping, however, with the earliest records of human theological thought as contained in the Old Testament, where man’s conception of God began as a tribal deity, chiefly concerned with the welfare of the tribe and one who therefore backed the tribe against enemy tribes which had their own gods. Later, as the concept of deity enlarged, God was envisioned as being the God of all the peoples of the earth and finally of the universe as well.
When Isaiah proclaimed “Thus saith the Lord, ‘Heaven is my throne, and the earth is my footstool,’”[1] the older Bedouin tribal god of vengeance and jealousy was transformed into a God of transcendent majesty, a universal God ruling heaven and earth. One can easily imagine the emotional need of the people to uphold the idea that even though God has grown larger and is now Lord of all the earth, they who first understood this, nonetheless desired to think of themselves as the nearest to him. “I will take you as my people, and I will be your God. Then you shall know that I am the LORD your God who brings you out from under the burdens of the Egyptians.”[2]
Much theological confusion might be avoided if the older conception of God rooted in an ancient tribal deity, a bloodthirsty God who demands sacrifice and appeasement, could be seen as just that, an early human conception of God that can no longer be justified in the modern era.
It makes more sense to understand God as so respecting human free will, that he allows the full consequences of that freedom to reign, if only during man’s short time on earth, a time when God’s will bows to human will, so that man will be free to choose goodness over evil, truth over error, and the beauty of selflessness over the ugliness of the selfish act. The fact that God allows the tares to grow with the wheat until the harvest, does not necessarily mean that God actively participates in evil or that he is punishing man, only that he is giving mankind a choice.
If it is God’s desire to foster courage, faith, loyalty, altruism and devotion within the individual, then the environment man finds himself in must contain danger, uncertainty, betrayal, cruelty and loneliness. He must have an environment in which there is a difference between that which is and that which should be, otherwise there would be no necessity for faith, the reaching for that which is higher and better, for values which lie beyond the material world. There would be no need to reach for God.
We have inherited a tradition in which the higher concepts of God are shackled to those which are lower. In the Christian tradition, we have the idea that God is so bloodthirsty that he was not satisfied with human suffering until he saw his own innocent son dying upon the cross. And even then he was not satisfied with the suffering of the Jews who, after three millennia of persecution at the hands of the Babylonians, the Egyptians and the Christians, must be further punished for the supreme crime of deicide, even though it was God who required this sacrifice in the first place. The idea that the humanly conceived and executed cold-blooded murder of six million Jews, for the crime of being Jews, could seriously be considered as part of “God’s plan” by some Jewish and Christian theologians is appalling. A more stomach-turning conclusion can hardly be imagined. Is there any wonder millions of Jews and Christians are turning away from the old faiths? Cognitive dissonance has reached the breaking point.
Richard L. Rubenstein concludes that we are living through an age of the “death of God.” By this, I believe he means the death of the idea, or hope, that God will deal with his chosen people by means of miraculous intervention, that the Jews will have divine protection. This conception was dealt its death blow at Auschwitz. Indeed, there is even doubt that without the religious concept of “covenant and election” the Jewish people can survive as a distinct people. It is likewise doubtful whether Christianity can continue in its present form without the idea that Jesus was sacrificed for our sins, but it is equally impossible to believe that Jesus took away the sins of the world with his death. The Holocaust stands as a stunning rebuke to both religious conceptions, making both seem feeble and child-like in the face of such horror.
Furthermore, we are facing theological assault by a religion that claims to restore the original monotheistic concept of God. One which declares that man's conception of God cannot, must not and shall not evolve. Indeed it would be difficult to imagine a more primitive God. The bloodthirsty Allah delights in the tortures of his hell and rewards those who, by slaying his "enemies," serve him, by bestowing upon them the sensual delights of a heavenly brothel.
An effective response lies not in clinging to our own more primitive God concepts, but rather in declaring the God concept itself as one which has evolved and must evolve in order for civilization to be strengthened and renewed.
Perhaps the old God concepts must die before the new may spring forth to take their place. As Jesus said, “Most assuredly, I say to you, unless a grain of wheat falls into the ground and dies, it remains alone; but if it dies, it produces much grain.”[3]
--------------------------------------------------------------------------------
[1] The Bible, Isaiah 66:1 (King James Version)
[2] The Bible, Exodus 6:7 (New King James Version)
[3] The Bible, John 12:24 (New King James Version)