When the earliest civilizations appeared (in Mesopotamia, Egypt, India and China), they were largely constrained by their natural environment and by the climate. Religion, Science and Art were largely determined by extra-human factors, such as seasons and floods. Over the course of many centuries, humans have managed to change the equation in their favor, reducing the impact of natural events on their civilization and increasing the impact of their civilization on nature (for better and for worse). How this came about is pretty much the history of knowledge. Knowledge has been, first and foremost, a tool for becoming the “subject” of change, as opposed to being the “object” of change.
One could claim that the most important inventions date from prehistory, and that “history” has been nothing more than an application of those inventions. Here is a quick rundown (in parentheses, the date of the earliest specimen found so far and the place where it was found): tools (2 million years ago, Africa), fire (1.9 million years ago, Africa), buildings (400,000 BC, France), burial (70,000 BC, Germany), art (28,000 BC), farming (14,000 BC, Mesopotamia), animal domestication (12,000 BC), boats (8,000 BC, Holland), weapons (8,000 BC), pottery (7,900 BC, China), weaving (6,500 BC, Palestine), money (sometime before the invention of writing, Mesopotamia), musical instruments (5,000 BC, Mesopotamia), metal (4,500 BC, Egypt), the wheel (3,500 BC, Mesopotamia), writing (3,300 BC, Mesopotamia), glass (3,000 BC, Phoenicia), the sundial (3,000 BC, Egypt).
Once the infrastructure was in place, knowledge increased rapidly on all fronts: agriculture, architecture (from the ziggurat of the Sumerians to the pyramids of the Egyptians to the temples of the Greeks), bureaucracy (from the city-states of the Sumerians to the kingdom of Egypt, from the empire of Persia to the economic empire of Athens), politics (from the theocracies of Mesopotamia and Egypt to the democracy of Athens), religion (from the anthropomorphic deities of Mesopotamia to the complex metaphysics of Egypt, from the tolerant pantheon of the Greeks to the one God of the Persians and the Jews), writing (from the “Gilgamesh” in Mesopotamia to the “Adventures of Sinuhe” in Egypt to the “Bible” of the Jews to Homer’s epics in Greece), economics (from the agricultural societies of Mesopotamia and Egypt to the trade-based societies of Phoenicia and Athens), transportation (from the horse-drawn chariots of Mesopotamia to the Greek trireme), art (from the funerary painting of the Egyptians to the realistic sculptures of the Greeks), etc.
For a while, Religion acted as, basically, a compendium of knowledge (about life, society and the universe). In India, the Vedas and the Upanishads painted a cyclical picture of the universe. Right and wrong actions increase the positive and negative potential energy (“apurva”) associated with each person. Apurva is eventually released (in this or the next life) and causes good or evil to the person. Basically, misfortune is caused by prior wrongful deeds. It is not only deserved but even required. Life is a loop from the individual back to the individual. This was cosmic justice totally independent of the gods. Wisdom is the realization that everything is suffering, but the realization of suffering does not lead to pessimism: it leads to salvation. Salvation does not require any change in the world. It requires a realization that everything is part of an absolute, or Brahman. Salvation comes from the union of the individual soul (“atman”) with the universal soul (“brahman”). “Maya”, the plurality of the world, is an illusion of the senses. Salvation comes from “moksha”: liberation from maya and experience of Brahman. By experiencing the divine within the self, one reaches pure knowledge and becomes one with the eternal, infinite, and conscious being. Nothing has changed in the world: it is the individual’s state of mind that has changed. Self-knowledge is knowledge of the absolute.
Buddha focused on suffering, the ubiquitous condition of living beings, but ended up denying the existence of the self: only events exist, and the enduring self is an illusion (the “atman” is an illusion). Each moment is an entirely new existence, influenced by all other moments. To quote a Buddhist scripture, “only suffering exists, but no sufferer is to be found”. Suffering can be ended by overcoming ignorance and attachment to Earthly things.
From ancient times, China displayed a holistic approach to nature, man, and government. Chinese religion realized the fundamental unity of the physical, the emotional and the social. Particularly during the Zhou dynasty, Chinese religion was natural philosophy. There was no fear of damnation, no anxiety of salvation, no prophets, no dogmas. Confucius was much more interested in the fate of society than in the fate of the souls of ordinary people. He believed that the power of example was the ideal foundation of the social contract: a ruler, a father, a husband have to “deserve” the obedience that is due to them. Thus, Confucius’ philosophy was about the cultivation of the self, how to transform the ordinary individual into the ideal man. The ultimate goal of an individual’s life is self-realization through socialization. If Confucius focused on society, Lao-tzu focused on nature. He believed in a “tao”, an ultimate unity that underlies the world’s multiplicity. There is a fundamental reality in the continuous flow and change of the world: the “way” things do what they do. Understanding the “tao” means identifying with the patterns of nature, achieving harmony with nature. The ideal course of action is “action through inaction” (“wuwei”): to flow with the natural order. The “tao” is the infinite potential energy of the universe. “Qi” is vital energy/matter in constant flux that arises from the “tao”, and “qi” is regulated by the opposites of “yin” and “yang”. Everything is made of yin and yang.
Note that neither Buddhism nor Confucianism nor Taoism were “religions”, in the sense of worshipping a God. In fact, they all denied the importance of gods.
In Persia, on the other hand, Zarathustra believed in one supreme God, similar to the Indian “absolute” of Brahman, except that it was opposed by a divine enemy, and the world was due to the titanic battle between these two supernatural beings: Ahura-Mazda, the spiritual, immaterial, creator god who is full of light and good, and Ahriman, the god of darkness and evil. Unlike previous religions, this one was eschatological: at the end of time, Ahura-Mazda shall emerge victorious, and, after the apocalyptic ending and a universal judgement that will take place on Earth, all humans (even sinners) shall resurrect.
Judaism, which grew out of a synthesis of Mesopotamian, Arabian, Persian and Egyptian religious cults, was originally only the religion of the Jews, and El was originally the nomadic god of a nomadic people (not tied to a sanctuary but “god of the father”). It was a god of punishment and wrath, and Jewish religion was conceived as, basically, obedience to El, with the reward for the Jewish people being the Promised Land. The “Old Testament” is largely silent about the rest of humanity, and largely silent about the afterlife. This was a god who spoke directly to its people (the Jews). The earliest prophets of the kingdom of Mari had been visionary mystics in charge of foretelling the future and interpreting natural events as divine messages on behalf of the royalty. The Biblical prophets, on the other hand, addressed the people (and, eventually, “all nations”), and their main mission was to promote a higher form of morality and justice. Judaism, in its expectation that a Messiah would come and deliver the Jews from their suffering, was largely indifferent towards unbelievers. In the meantime, the misadventures of the Jewish people were due to the fact that the Jews disobeyed their god. But, at some point, El and Yahweh became synonymous, and, eventually, Yahweh became the “only” god (“There is no other god besides me”). The “Old Testament”, which originally was a history of the Jews, acquired a universal meaning.
Both Mazdaism and Judaism became monotheistic religions and denounced all other gods as mere “idols” not worthy of worship.
A major step in the evolution of knowledge was the advent of Philosophy. Both in Greece and India, the explosion in Philosophy and Science was enabled by a lack of organized religion: both regions had a form of “rational superstition” rather than the theocracies of Mesopotamia and Egypt. The gods of the Greek and of the Indian pantheon were superhuman, but never tried to explain all that happens on this planet. Philosophers and scientists were able to speculate on the nature of the universe, of human life and of the afterlife without offending the state and without fearing for their lives.
In India, six “darshana” (philosophical schools) tried to answer the fundamental questions: Is there a God? Is the world real? Samkhya believed that there is no God and that the world is real (due to the interaction between two substances, prakriti and purusha). Yoga believed in a supreme being (Isvara) and that the world is real.
Vedanta believed in Brahman and that the world is not real (it is an emanation of Brahman, the only substance that truly exists).
In Greece, Pythagoras was perhaps the first philosopher to speculate about the immortality of the soul. Heraclitus could not believe in the immortality of anything, because he noticed that everything changes all the time (“you cannot enter the same river twice”), including us (“we are and we are not”). On the contrary, Parmenides, the most “Indian” of the Greek philosophers, believed that nothing ever changes: there is only one, infinite, eternal and indivisible reality, and we are part of this unchanging “one”, despite the illusion of a changing world that comes from our senses. Zeno even proved the impossibility of change with his famous paradoxes (for example, fast Achilles can never catch up with a slow turtle if the turtle starts ahead, because Achilles has to reach the current position of the turtle before passing it, and, when he does, the turtle has already moved ahead, a process that can be repeated forever). Democritus argued in favor of atomism and materialism: everything is made of atoms, including the soul. Socrates was a philosopher of wisdom, who noticed that wisdom is knowing what one does not know. His trial (the most famous religious trial before Jesus’) signaled the end of the dictatorship of traditional religion. Plato ruled out the senses as a reliable source of knowledge, and focused instead on “ideas”, which exist in a world of their own, are eternal and are unchangeable. He too believed in an immortal soul, trapped in a mortal body. By increasing its knowledge, the soul can become one with the ultimate idea of the universe, the idea of all ideas. On the contrary, Aristotle believed that knowledge “only” comes from the senses, and that a mind is physically shaped by perceptions over a lifetime. He proceeded to create different disciplines to study different kinds of knowledge.
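(A modern footnote to Zeno, a worked sum rather than part of the original argument: the infinitely many catch-up stages add up to a finite distance. If Achilles runs ten times as fast as the turtle and starts 100 meters behind, the successive stages form a convergent geometric series:

$$100 + 10 + 1 + \tfrac{1}{10} + \dots = \sum_{n=0}^{\infty} 100 \left(\tfrac{1}{10}\right)^n = \frac{100}{1 - \tfrac{1}{10}} \approx 111.1 \text{ meters},$$

after which Achilles is ahead: an infinite sequence of steps, but a finite distance and a finite time.)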
The Hellenistic age that followed Alexander’s unification of the “oikoumene” (the world that the Greeks knew) on a level never seen before fostered a new synthesis of views of the world. Hellenistic philosophy placed more emphasis on the happiness of the individual, while Hellenistic religion placed more emphasis on the salvation of the individual. Cynics, who thought that knowledge is impossible, saw attachment to material things as the root problem, and advocated a return to nature. Skeptics, who agreed that knowledge is impossible, thought that the search for knowledge causes angst, and therefore one should avoid having beliefs of any sort. Epicureans, who had a material view of the world (the universe is a machine and humans have no special status), claimed that superstitions and fear of death cause angst. Stoics viewed the entire universe as a manifestation of god, and happiness as surrendering the self to the divine order of the cosmos, as living in harmony with nature.
From the very beginning, knowledge was also the by-product of the human quest for an answer to the fundamental questions: Why are we here? What is the meaning of our lives? What happens when we die? Is it possible that we live forever in some other form? The afterlife and immortality are not knowledge, since we don’t “know” them yet, but humans used knowledge to reach different conclusions about these themes. The civilizations of Mesopotamia were mainly interested in “this” life. The Egyptians were obsessed with the afterlife, with immortality originally granted only to the pharaoh but eventually extended to everybody (via the mysteries of Osiris, the first major ritual about the resurrection). The ancient Greeks did not care much for immortality, as Ulysses showed when he declined the goddess’ invitation to spend eternity with her and preferred to return to his home; but later, in the Hellenistic period, a number of religious cults focused on resurrection: the Eleusinian mysteries (about Demeter’s search through the underworld for her daughter Persephone), the Orphic mysteries (about Orpheus’ attempt to bring back his wife Eurydice from the underworld) and the Dionysian mysteries (about Dionysus, resurrected by his father Zeus). The Romans cared for the immortality of their empire, and were resigned to the mortality of the individual; but it was under Roman rule that a new Jewish religion, Christianity, was founded on the notion that Jesus’ death and resurrection can save all humans.
The other great theme of knowledge was (and still is) the universe: what is the structure of the world that we live in? Neither the Indian nor the Greek philosophers could provide credible answers. They could only speculate. Nonetheless, the Hellenistic age fostered progress in mathematics (Euclid’s “Elements” and Diophantus’ “Arithmetica”) and science (Eratosthenes’ calculation of the circumference of the Earth, Archimedes’ laws of mechanics and hydrostatics, Aristarchus’ heliocentric theory, Ptolemy’s geocentric theory). The Romans’ main contribution to the history of knowledge may well be engineering, which, after all, is but the practical application of science to daily life. The Romans, ever the practical people, made a quantum leap in construction: from aqueducts to public baths, from villas to amphitheaters. At the same time, they too created a new level of unification: the unification of the Mediterranean world.
The intellectual orgy of Greek philosophy opened the western mind. The Romans closed it when they adopted Christianity as “the” imperial religion and turned it into a dogma. Christianity was born a Jewish religion, but it was “relocated” to Rome and thus, automatically, turned into a universal religion. Jesus’ god was substantially different from the original El/Yahweh of the “Old Testament”: it was, first and foremost, a god of love. Jesus was the very son of God, sent to the Earth to die for humans and thus save them from the original sin. St Paul made it clear that it was love for everybody, not just for the Jews; and that the “kingdom” of the Christian faith, God’s reward for the faithful, was in heaven, not on Earth. The catch was that unbelievers were no longer immune from God’s judgement: they risked eternal damnation. The reward for the faithful was resurrection, just like Jesus had resurrected. Christianity was the culmination of a tradition of mysteries for the salvation of the individual, of religion for the ordinary man and woman, even for the slaves. Its central theme was one of resurrection and eternal life available to everybody. Indirectly, it was also an ideology of universality and equality.
In fact, both Buddhism and Christianity, and, to some extent, Confucianism, were universal and egalitarian. They were not exclusive to a race, a gender or a social class. This achievement in religion marks a conceptual step in which ordinary people (even slaves) were beginning to see themselves as equal to the kings, albeit powerless.
Islam, another offshoot of Judaism, was the culmination of the trend towards monotheistic, eschatological, egalitarian and universal religions. Islam borrowed from the Persian prophet Mani the idea of a succession of revelations given to different peoples by the very same God (Allah), and it borrowed from Christianity the idea of universal brotherhood and the mission to convert the unbelievers. But, unlike its predecessors, Islam was also an ideology, because it prescribed how to build a state. It made it the duty of every Muslim to struggle for the creation of a universal Islamic state. Islam’s Earthly mission was to reform society and to form a nation. Islam’s mission was inherently political. The ultimate aim of the Islamic state is to develop social justice. What had been a subtle message in Christianity became an explicit message in Islam. In fact, the entire Muslim population (not just the priestly class) is in charge of running the Islamic state. Humans are granted limited popular sovereignty under the suzerainty of God.
The Islamic philosophers felt the need to reconcile Islam and Greek philosophy. The two who exerted the strongest influence on the West, Abu Ali al-Husain ibn Abdallah ibn Sina (Avicenna) and Abu al-Walid Muhammad ibn Ahmad ibn Muhammad ibn Rushd (Averroes), achieved such a momentous unification of religion and philosophy by envisioning the universe as a series of emanations from Allah, from the first intelligence to the intelligence of humans. This allowed them to claim that there is only one truth, which appears as two truths: religion for the uneducated masses and philosophy for the educated elite. But there is no conflict between reason and revelation: ultimately, they both reach the same conclusions about the existence of Allah. The Sufis, best represented by Ibn Arabi, added an almost Buddhist element: human consciousness is a mirror of the universal, eternal, infinite consciousness of Allah. Allah reveals himself to himself through human consciousness. The Sufi wants to achieve a state of participation in the act of self-revelation. The human condition is one of longing, of both joy (for having experienced the divine) and sorrow (for having lost the divine).
The invasions of the “barbarian” peoples of the east, the Arab invasion from the south and the wars against the Persian empire led to the decadence of Roman civilization and to the “dark age” that lasted a few centuries. The obliteration of culture was such that, eventually, Europe had to re-learn its philosophy, science and mathematics from the Arabs.
The Christian dogma contributed to the decline of the Greek ideal. Rationality was replaced by superstition. Virtue was replaced by faith. Justice in this world was replaced with justice in the next world. The free exercise of reason was replaced with obedience to the Church. The Greek tolerance for foreign faiths was replaced by the intolerance of the Church. Nonetheless, Christian thinkers too tried to reconcile religion and philosophy. St Augustine preached the separation (grounded in Greek philosophy) of body and soul, of bodily life and spiritual life: the pleasures of the body distract from the truth of the soul.
During the “dark ages”, the Christian conversion of the European pagans, from Russia to Scandinavia, was completed. The Church, in fact, replaced the Roman empire as the unifying element of Europe. The Church controlled education. The Church controlled the arts. The Church even controlled the language: Latin.
The Arab invasion disrupted the economic and political unity of the Mediterranean Sea, and the rise of the Frankish kingdom, soon to be renamed “Holy Roman Empire” (a mostly landlocked empire), caused a redesign of the main trade routes away from the sea. Venice alone remained a sea-trading power, and, de facto, the only economic link between the Holy Roman Empire and the Eastern Roman Empire. This “inland” trade eventually caused a “commercial” revolution. Trade fairs appeared in Champagne, Flanders and northern Germany, creating a new kind of wealth in those regions. The Italian communes became rich enough to be able to afford their own armies and thus become de-facto independent and develop economies entirely based on trade. In northern Europe, a new kind of town was born that did not rely on the Mediterranean Sea. Both in the north and in the south, a real bourgeois class was born. The medieval town was organized around the merchants, and then the artisans and the peasants.
As the horse became the main element in warfare, the landowner, who alone could afford to breed and equip warhorses, became the most powerful warrior. A new kind of nobility was created, a land-owning nobility. The collapse of central authority in western Europe led to feudalism, a system in which the nobility enjoyed ever greater power and freedom, a global “political” revolution.
Thus the “medieval synthesis”: Church, cities, kings (clergy, bourgeoisie, nobility).
But a fourth element was even more important for the history of knowledge. As Rome decayed, and Alexandria and Antioch fell to the Muslims, the capital of Christian civilization moved to Constantinople (Byzantium). Despite the Greek influence, this cosmopolitan city created great art but little or no philosophy or science. It was left to the monasteries of western Europe to preserve the speculative traditions of the Greek world, except that they were mainly used to prove the Christian dogma. Monasticism was nonetheless crucial for the development of philosophy, music, painting. The anarchy of the “dark age” helped monasteries become a sort of refuge for the intellectuals. As the choice of lay society came down to being a warrior or a peasant, being a monk became a more and more appealing alternative. Eventually, the erudite atmosphere of the monasteries inspired the creation of universities. And universities conferred degrees that allowed graduates to teach in any Christian country, thus fueling an “educational” revolution. Johannes Scotus Erigena, Peter Abelard, Thomas Aquinas, Johannes Eckhart, John Duns Scotus (the “scholastics”) were some of the beneficiaries. Western philosophy restarted with them. As their inquiries into the nature of the world became more and more “logical”, their demands on philosophy became stricter. Eventually, Roger Bacon came to advocate that Science be founded on logic and observation; and William of Occam came to advocate the separation of Logic and Metaphysics, i.e. of Science and Church.
The commercial revolution of the new towns was matched by an “agricultural” revolution of the new manors. The plough (the first application of non-human power to agriculture), the three-field rotation (wheat/rye, oats/legumes, fallow) and the horseshoe caused an agricultural revolution in northern Europe that fostered rapid urbanization and higher standards of living. Improved agricultural techniques motivated the expansion of arable land via massive deforestation.
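To make the three-field rotation concrete (a schematic sketch, with hypothetical field labels A, B and C): each field cycled through winter grain, spring crop and fallow, so that in any given year two thirds of the land was productive, whereas the older two-field system left half the land idle.

Year 1: field A wheat/rye, field B oats/legumes, field C fallow
Year 2: field A oats/legumes, field B fallow, field C wheat/rye
Year 3: field A fallow, field B wheat/rye, field C oats/legumes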
In the cities, a “technological” revolution took place. It started with the technology of the mill, which was pioneered by the monasteries. Mills became pervasive for grinding grain, fulling cloth, pressing olives and tanning. Textile manufacturing was improved by the spinning wheel (the first instance of belt transmission of power). The mill, though, was only the most popular instance of a machine, for this was the first age of machines. The mechanical clock was the first machine made entirely of metal.
There also was a military revolution, due to the arrival of gunpowder. Milan became the center of weapon and armor manufacturing. Demand for cannons and handguns created a whole new industry.
Finally, an “engineering/artistic” revolution also took place, as more and more daring cathedrals started dotting the landscape of Christendom. Each cathedral was an example of “total art”, encompassing architecture, sculpture, painting, carpentry, glasswork. The construction of a cathedral was a massive enterprise that involved masons, workers, quarrymen, smiths, carpenters, etc. Not since the Egyptian pyramids had something so spectacular been tried. Each cathedral was a veritable summa of European civilization.
The political, commercial, agricultural, educational, technological and artistic revolutions of the Middle Ages converged in the 13th century (the “golden century”) to create an economic boom such as had not been seen for almost a millennium.
Improved communications between Europe and Asia, thanks to the Mongol Empire that had made travel safe from the Middle East to China, particularly on the “silk road”, and to the decline of the Viking and Saracen pirates, led to a revival of sea trade, especially by the Italian city-states that profited from a triangular trade Byzantium-Arabs-Italy.
Florence, benefiting from the trade of wool, and Venice, benefiting from the sea trade with the East, became capitalistic empires. Venice sponsored technological innovation that enabled long-distance and winter voyages, while Florence sponsored financial innovation that made it possible to lend, borrow and invest capital worldwide. The Italian cities had a vested interest in improved education, as they needed people skilled in geography, writing, accounting, technology, etc. It is not a coincidence that the first universities were established in Italy.
The economic boom was brought to an abrupt stop by a plague epidemic (“the Black Death”) that decimated the European population. But the Black Death also had its beneficial effects. The dramatic decrease in population led to a higher standard of living for the survivors, as the farmers obtained more land per capita and the city dwellers could command higher wages. The higher cost of labor prompted investments in technological innovation. At the same time, wealthy people bequeathed their fortunes to the creation of national universities, which greatly increased the demand for books. The scarcity of educated people prompted the adoption of vernacular languages instead of Latin in the universities.
Throughout the Middle Ages, the national literatures had produced national epics such as “Beowulf” (900, Britain), “Edda” (1100, Scandinavia), “Cantar del Cid” (1140, Spain), Chretien de Troyes’ “Perceval” (1175, France), “Slovo o Polku Igoreve” (1185, Russia), “Nibelungen” (1205, Germany), “Chanson de Roland” (1200, France), Wolfram Von Eschenbach’s “Parzival” (1210, Germany). Dante Alighieri’s “Divine Comedy” (1300) heralded a new age, in which the vernacular was used for the highest possible artistic aims, a veritable compendium of knowledge. After languishing for centuries, European poetry bloomed with Francesco Petrarca’s “Canzoniere” (1374, Italy), Geoffrey Chaucer’s “Canterbury Tales” (1400, England), Inigo Santillana’s “Cancionero” (1449, Spain), Francois Villon’s “Testament” (1462, France). And Giovanni Boccaccio’s “Decameron” (1353, Italy) laid the foundations for narrative prose.
In observance of the diktat of the Second Council of Nicaea (787), that the visual artist must work for the Church and remain faithful to the letter of the Bible, medieval art was permeated by an aesthetics of “imitation”. Christian art was almost a reversal of Greek art, because the emphasis shifted from the body (mortal, whose movement is driven by emotions) to the soul (immortal, immune to emotions), from realism and movement to spirituality and immanence. Christian art rediscovered Egyptian and Middle-eastern simplicity via Byzantine art. Nonetheless, centuries of illuminated manuscripts, mosaics, frescoes and icons eventually led to the revolution in painting best represented by Giotto’s “Scrovegni Chapel” (1305). While Italian artists were re-founding Greco-Roman art based on mathematical relationships and a sense of three-dimensional space, as in Paolo Uccello’s “Battle of San Romano” (1456), Masaccio’s “Trinity” (1427) and Piero della Francesca’s “Holy Conversation” (1474), Northern European painters became masters of a “photographic” realism, as in Jan Van Eyck’s “The Virgin of the Chancellor Rolin” (1436) and “The Arnolfini Marriage” (1434).
Before Europe had time to recover from the Black Death, the unity of the Mediterranean was shattered again by the fall of Byzantium (1453) and the emergence of the Ottoman empire (a Muslim empire) as a major European power.
However, Europe was coming out of the “dark age” with a new awareness of the world. Marco Polo had brought news of the Far East. Albertus Magnus did not hesitate to state that the Earth is a sphere. Nicolas Oresme figured out that the rotation of the Earth on an axis explains the daily motion of the universe.
In China, the Han and Tang dynasties had been characterized by the emergence of a class of scholar-officials and by a cultural boom. The Sung dynasty amplified those social and cultural innovations. The scholar-officials became the dominant class in Chinese society. The state was run like an autocratic meritocracy, but nonetheless a meritocracy. As education was encouraged by the state, China experienced a rapid increase in literacy, which led to a large urban literate class. The competence of the ruling class fostered technological and agrarian innovations that created the most advanced agriculture, industry and trade in the world. When Europe was just beginning to get out of its “dark age”, China was the world’s most populous, prosperous and cultured nation. The Mongol invasion (the Yuan dynasty) did not change the character of that society, but, in fact, added an element of peace: the “pax tatarica” guaranteed by the invincible Mongol armies.
India was the only part of the non-Chinese world that Chinese scholars were fascinated with. They absorbed Indian culture over the centuries, and particularly adopted one philosophical school of India: Buddhism. First came “Pure Land” or Jodo Buddhism (4th c), with its emphasis on devotion instead of meditation; then came Tiantai/Tendai (6th c), Huayan/Kegon (7th c) and Chan/Zen (6th c). The latter, a fusion of Buddhism and Taoism, focused on the attainment of sudden enlightenment (“satori”). According to the Northern school (Shen-hsiu), satori was to be obtained by gradual enlightenment through guided meditation, while the Southern school (Huineng) allowed for satori through individual meditation. Zen promoted spontaneous thinking, as opposed to the philosophical investigation of Confucianism, and spontaneous behavior, as opposed to the calculated behavior of Confucianism. Zen is the “everyday mind”.
Japan had adopted Buddhism as a state religion as early as 604, under Prince Shotoku Taishi, next to a Confucian-style constitution and the native Shinto cult. The various Buddhist schools arrived from China in the following centuries (the Tendai school in the 9th century, the Jodo school in the 12th century), until Zen Buddhism reached Japan during the 13th century. Zen became popular among the military class (the “samurai”) that embodied the noble values in an age of anarchy. In turn, the Zen monk came to behave like a spiritual samurai. From 1192 till 1333, Japan was ruled by “shoguns” (military leaders) with residence in Kamakura (the “bakufu” system of government), while the emperor (or “mikado”) became a figurehead. Even the equivalent of the scholar-official of China was military: during the 17th century, the ideal man was the literate warrior who lived according to “bushido” (the “way of the warrior”). Japan remained largely isolated until 1854, when the USA forced Japan to sign a treaty that opened Japan to foreign trade, a humiliation that led to the restoration of imperial power (1868) after so many centuries of military rule.
Japan’s native religion, Shinto, provides the basis for the imperial institutions. It is, in fact, a form of Japanese patriotism. It declares Japan a divine country, and the emperor a descendant of the gods. Shinto is polytheistic in the extreme, admitting in its pantheon not only thousands of spirits (“kami”), personifying the various aspects of the natural world, and ancestors, but also the emperors and the deified heroes of the Japanese nation, and even foreign deities. Shinto is non-exclusive: a Shintoist can be a Buddhist, a Catholic, etc. The reason is that there is no competition between Shinto and the metaphysics of the other religions. Shinto is a religion to deal with ordinary lives, based on the belief that humans can affect Nature by properly honoring the spirits. When Japan adopted Buddhism, the native spirits were recast as manifestations of Buddha.
The “Rinzai” school of Zen Buddhism believed in sudden enlightenment while concentrating on a koan (“sanzen”, or conversation with a master). The “Soto” school believed in gradual enlightenment through meditation in daily life (“zazen”, or sitting meditation). But the traditions of Japanese society surfaced also in Zen Buddhism: satori can be facilitated by martial arts, tea ceremonies, gardening, Haiku poetry, calligraphy, Noh drama, etc.
In marked contrast to the western civilizations, the eastern civilizations of India, China and Japan displayed little interest in the forceful spread of their religious beliefs.
Luckily for Christian Europe, in 1492 Spain opened a new front of knowledge: having freed itself of the last Arab kingdom, it sponsored the journey of Christopher Columbus to the “West Indies”, which turned out to be a new continent. That more or less accidental event marked the beginning of the “colonial” era, of “world trade”, and of the Atlantic slave trade; and, in general, of a whole new mindset.
Other factors were shaping the European mind: Gutenberg’s printing press (1456), which made it possible to satisfy the growing demand for books; Martin Luther’s Reformation (1517), which freed the northern regions from the Catholic dogma; Copernicus’ heliocentric theory (1530), which removed the Earth (and thus Man) from the center of the universe; and the advent of the nation states (France, Austria, Spain, England and, later, Prussia).
However, it was not the small European nations that ruled the world at the end of the Middle Ages. The largest empires (the “gunpowder empires”) were located outside Europe. Gunpowder was only one reason for their success. They had also mastered the skills of administering the strong, centralized bureaucracy required to support an expensive military. In general, they dwarfed Europe in one basic dimension: knowledge. While Europe was just coming out of its “dark age”, the gunpowder empires were at their cultural peak. The Ottoman Empire, whose capital Istanbul was the largest city in Europe, was a melting pot of races, languages and religions. It was a sophisticated urban society, rich in universities and libraries, devoted to mathematics, medicine and manufacturing. The Safavid Empire of Persia, which controlled the silk trade, was a homogeneous state of Muslim Persians. The Mughal Empire of India, an Islamic state in a Hindu country, was also a melting pot of races, languages and religions. Ming China was perhaps the most technologically and culturally advanced of all countries.
The small European countries could hardly match the knowledge and power of these empires. And, still, a small country like Portugal or Holland ended up controlling a larger territory (stretching across multiple continents) than any of those empires. A dis-united Europe of small and poor states, caught up in an endless loop of internecine wars, speaking different languages, technologically backwards, forced to import science, philosophy and technology from the Muslims, and with fewer people and resources than the Asian empires, managed to conquer the entire world (with the sole notable exception of Japan). Perhaps the problem was with the large-scale bureaucracies of those Asian empires, which, in the long term, became less and less competitive, and more and more obscurantist. In some cases, their multi-ethnic nature caused centrifugal forces. Or perhaps Europe benefited from its own anarchy: continuous warfare created continuous competition and a perennial arms race. Perhaps the fact that no European power decisively defeated the others provided a motivation to improve that was missing in the more stable empires of the East. After all, the long-range armed sailing ships, which opened the doors to extra-European colonization, were the product of military build-up. Soon, world trade came to be based on sea transportation, which was controlled by Europeans. The printing press, which the gunpowder empires were slow to adopt (or even banned), slowly changed the balance of knowledge. World trade was creating more demand for technological innovation (and science), while the printing press was spreading knowledge throughout the continent. And all of this was funded with the wealth generated by colonialism. While the Asian empires were busy enjoying their stability, the small European countries were fighting for supremacy, anywhere and anytime; and, eventually, they even overthrew those much larger empires.
Nowhere was the apparent paradox more intriguing than in Italy, a fragmented, war-torn peninsula that, nonetheless, became the cultural center of Europe. On a smaller scale, it was the same paradox: the tiny states of Italy and the Netherlands were superior in the arts to the powerful kingdoms of Spain, France and England. In this case, though, the reason is to be found in the socio-economic transformation of the Middle Ages that had introduced a new social class: the wealthy bourgeoisie. This class was more interested in the arts than the courts (which were mainly interested in warfare). The main “customer” of the arts was still the Church, but private patronage of art became more and more common. This, in turn, led to an elite of art collectors and critics. Aesthetics led to appreciation of genius: originality, individuality, creativity. Medieval art was imitation, Renaissance art was creation.
Perhaps the greatest invention of the Renaissance was the most basic of all from the point of view of knowledge: the self. The Egyptians and the Greeks did not have a truly unified view of the self, a unique way to refer to the “I” who is the protagonist of a life and, incidentally, is also a walking body. The Greeks used different terms (pneuma, logos, nous, psyche) to refer to different aspects of the “I”. The Middle Ages were the formative stage of the self, when the “soul” came to be identified with the thinking “I”. The Renaissance simply exalted that great medieval invention, the “I”, that had long been enslaved to religion. The “I” was now free to express and affirm itself.
In a nutshell, the “Rinascimento” (Renaissance art) adapted classical antiquity to Biblical themes. This was its fundamental contradiction: a Christian art based on Pagan art. An art that was invented (by the Greeks) to please the pagan gods and (by the Romans) to exalt pagan emperors was translated into an art to pay tribute to the Christian dogma. Leonardo da Vinci’s “The Last Supper” (1497) and Michelangelo Buonarroti’s “The Universal Judgement” (1541) are possibly the supreme examples in painting, while architects such as Donato Bramante and Gianlorenzo Bernini dramatically altered the urban landscapes. But there was also an obsession with ordering space, as manifested in Sandro Botticelli’s “Allegory of Spring” (1478) and Raffaello Sanzio’s “The School of Athens” (1511). In the Netherlands, Hieronymus Bosch’s “The Garden of Delights” (1504) was perhaps the most fantastic piece of art in centuries.
The Renaissance segued into the Baroque age, whose opulence really signified the triumph of European royalty and religion. Aesthetically speaking, the baroque was a restoration of order after the creative disorder of the Renaissance. The least predictable of the visual arts remained painting, with Pieter Bruegel’s “Triumph of Death” (1562), El Greco’s “Toledo” (1599), Peter Paul Rubens’ “Debarquement de Marie de Medicis” (1625), Rembrandt’s “Nightwatch” (1642) and Jan Vermeer’s “Malkunst” (1666). In Italy, Giovanni Palestrina, Claudio Monteverdi (b. 1567) and Girolamo Frescobaldi (b. 1583) laid the foundations for classical music and the opera. The national literary scenes bloomed. Masterpieces of poetry included Ludovico Ariosto’s “Orlando Furioso” (1532), Luis Vaz de Camoes’ “Os Lusiadas” (1572), Torquato Tasso’s “Gerusalemme Liberata” (1575), Pierre de Ronsard’s “Sonnets pour Helene” (1578), John Donne’s “Holy Sonnets” (1615) and John Milton’s “Paradise Lost” (1667). Even more characteristic of the era was theater: Gil Vicente’s “Auto da Barca do Inferno” (1516), Christopher Marlowe’s “Doctor Faustus” (1592), William Shakespeare’s “Hamlet” (1601) and “King Lear” (1605), Lope de Vega Carpio’s “Fuente Ovejuna” (1614), Pedro Calderon’s “El Gran Teatro del Mundo” (1633), Moliere’s “Le Misanthrope” (1666) and Jean Racine’s “Phedre” (1677). Francois Rabelais’ “Gargantua et Pantagruel” (1552) and Miguel de Cervantes’ “Don Quijote” (1615) laid the foundations of the novel.
Progress in science was as revolutionary as progress in the arts, thanks to Tycho Brahe, who discovered a new star; Johannes Kepler, who discovered the laws of planetary motion; Francis Bacon, who advocated knowledge based on objective empirical observation and inductive reasoning; and, finally, Galileo Galilei, who realized that linear uniform motion (not rest) is the natural motion of all objects and that forces cause acceleration (an acceleration that is the same for all falling objects, implying that the same force must cause all objects to fall). Suddenly, the universe did not look like the perfect, eternal, static order that humans had been used to for centuries. Instead, it looked as disordered, imperfect and dynamic as the human world.
New inventions included: the telescope (1608), the microscope (1590s), the pendulum clock (1657), the thermometer (1611), the barometer (1644).
Both the self and the world were now open again to philosophical investigation. Rene Descartes neatly separated matter and mind, two different substances, each governed by its own set of laws (physical or mental). While the material world, including the body, is ultimately a machine, the soul is not: it cannot be “reduced” to the material world. His “dualism” was opposed by Thomas Hobbes’ “materialism”, according to which the soul is merely a feature of the body and human behavior is caused by physical laws.
Baruch Spinoza disagreed with both. He thought that only one substance exists: God. Nature is God (“pantheism”). The universe is God. This one substance is neither physical nor mental, and it is both. Things and souls are (finite) aspects (or “modes”) of that one (infinite) substance. Immortality is becoming one with God/Nature, realizing the eternity of everything.
Gottfried Leibniz went in the other direction: only minds exist, and everything has a mind. Matter is made of minds (“panpsychism”). Minds come in degrees, starting with matter (whose minds are very simple) and ending with God (whose mind is infinite). The universe is the set of all finite minds (or “monads”) that God has created. Their actions have been pre-determined by God. Monads are “clocks that strike hours together”.
Clearly, the scientific study of reality depended on perception, on the reliability of the senses. John Locke thought that all knowledge derives from experience (“empiricism”), and noticed that we only know the ideas and sensations in our mind. Those ideas and sensations are produced by perceptions, but we will never know for sure what caused those perceptions, how reality truly is out there: we only know the ideas that are created in our mind. Ideas rule our mind.
On the contrary, George Berkeley, starting from the same premises (all we know is our perceptions), reached the opposite conclusion: that matter does not even exist, that only mind exists (“idealism”). Reality is inside our mind: an object is an experience. Objects do not exist apart from a subject that thinks them. The whole universe is a set of subjective experiences. Locke thought that we can never know how the world really is, but Berkeley replied that the world is exactly how it appears: it “is” what appears, and it is inside our mind. Our mind rules ideas.
David Hume increased the dose of skepticism: if all ideas come from perception, then mind is only a theater in which perceptions play their parts in rapid succession. The self is an illusion. Mental life is a series of thoughts, feelings, sensations. A mind is a series of mental events. The mental events do exist. The self that is supposed to be thinking or feeling those mental events is a fiction.
Observation led physicists to their own view of the world. By studying gases, Robert Boyle concluded that matter must be made of innumerable elementary particles, or atoms. The features of an object are due to the features and to the motion of the particles that compose it.
Following Galileo’s intuitions and adopting Boyle’s atomistic view, Isaac Newton worked out a mathematical description of the motion of bodies in space and over time. He posited an absolute time and an absolute space, made of ordered instants and points. He assumed that forces can act at a distance, and introduced an invisible “gravitational force” as the cause of planetary motion. He thus unified terrestrial and celestial Mechanics: all acceleration is caused by forces, the force that causes free fall being the gravitational force, and that force being also the same force that causes the Earth to revolve around the Sun. Forces act on masses, a mass being the quantitative property that expresses Galileo’s inertia (the property of a material object to remain either at rest or in uniform motion in the absence of external forces). Philosophers had been speculating that the universe might be a machine, but Newton did not just speculate: he wrote down the formulas.
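In modern notation (Newton’s own presentation in the “Principia” was geometric), those formulas amount to the second law of motion and the law of universal gravitation:

$$F = ma, \qquad F = G\,\frac{m_1 m_2}{r^2},$$

where m is the accelerated mass, m_1 and m_2 are any two masses, r is the distance between them, and G is a universal constant (only measured much later). The same pair of equations covers both the falling apple and the orbiting planet.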
Significant innovations were also introduced, for the first time in a long time, in Mathematics. Blaise Pascal invented the mathematical theory of probability (and built the first mechanical adding machine). Leibniz envisioned a universal language of logic (a “lingua characteristica”) that would allow one to derive all possible knowledge simply by applying combinatorial rules of logic. Arabic numbers had been adopted in the 16th century. Systematic symbolic notation for algebra was introduced by Francois Vieta. John Napier invented logarithms. Descartes had developed analytical geometry, and Newton and Leibniz independently developed calculus.
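To illustrate just one of these advances: the point of Napier’s logarithms is that they reduce multiplication to addition, so that a product could be computed with two table lookups and a sum:

$$\log(ab) = \log a + \log b, \qquad \text{e.g.} \quad \log_{10}(100 \times 1000) = 2 + 3 = 5, \ \text{hence} \ 100 \times 1000 = 10^5.$$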
It might not be a coincidence that a similar scientific, mathematical approach can be found in the great composers of the era: Antonio Vivaldi, George Frideric Handel and Johann Sebastian Bach.
The next big quantum leap in knowledge came with the “industrial” revolution. It is hard to pinpoint the birth date of the industrial revolution (in 1721 Thomas Lombe built perhaps the first factory in the world, in 1741 Lewis Paul opened the first cotton mill, in 1769 James Watt patented his improved steam engine), but it is clear where it happened: Manchester, England. That city benefited from a fortunate combination of factors: water mills, coal mines, Liverpool’s port and, last but not least, clock-making technology (the earliest factory mechanics were clock-makers). These factors were all in the hands of the middle class, so it is not surprising that the middle class (not the aristocracy or the government) ended up managing most of the enterprises.
The quantum leap in production translated into a quantum leap in transportation: in 1782 the first steamboat sailed up the Clyde, in 1787 John Wilkinson built the first iron boat, in 1812 Henry Bell started the first commercial steamboat service in Glasgow, in 1819 the “Savannah” completed the first transatlantic crossing by a steamboat, in 1820 the first iron steamship was built, etc. By 1892, Britain’s tonnage and sea trade exceeded those of the rest of the world combined. At its peak, Britain had only 2% of the world’s population, but produced almost 20% of the world’s manufacturing output.
One of the most tangible side-effects of the industrial revolution was the British Empire. There had been “empires” before, and even larger ones (the Mongol empire). But never before had an empire stretched over so many continents: Africa, America, Oceania, Asia. The Roman empire had viewed itself as an exporter of “civilization” to the barbaric world, but the British Empire upped the ante by conceiving its imperialism as a self-appointed mission to redeem the world. Its empire was a fantastic business venture, that exported people, capital and goods, and created “world trade”, not just regional trade. This enterprise was supported by a military might that was largely due to financial responsibility at home. Despite the fact that France had a larger population and more resources, Britain managed to defeat France in the War of the Spanish Succession (1702-1713), in the Seven Years’ war (1756-1763) and in the Napoleonic wars (1795-1815).
Managing the British Empire was no easy task. One area that had to be vastly improved to manage a global empire was the area of global communications: steamships, railroads, the telegraph, the first undersea cable and a national post system unified the colonies as one nation. They created the first worldwide logistical system. Coal, a key element in a country in which wood was scarce, generated additional momentum for the improvement of shipbuilding technology and the invention of railroads (1825).
Other areas that the British Empire needed to standardize were finance and law. Thus the first economic and legal systems that were global, not only regional, were born. British economic supremacy lasted until 1869, when the first transcontinental railroad, connecting the American prairies with the Pacific coast, introduced a formidable new competitor: the USA.
No wonder, thus, that Adam Smith felt a new discipline had to be created, one that studied the dynamics of a complex economy based on the production and distribution of wealth. He explained the benefits of free competition and free trade, and how competition can work for the common good (as an “invisible hand”).
Jeremy Bentham (1789) introduced “utilitarian” criteria to decide what is good and what is bad: goodness is what guarantees “the greatest happiness for the greatest number of people”. The philosophy of “utilitarianism” was later perfected by John Stuart Mill, who wrote that “pleasure and freedom from pain are the only things desirable as ends”, thus implying that good is whatever promotes pleasure and prevents pain.
France was much slower in adopting the industrial revolution, and never came even close to matching the pace of Britain’s industrialization, but the kingdom of the Bourbons went through a parallel “intellectual” revolution that was no less radical and influential: “Les Lumieres”, or the Enlightenment. It started in the salons of the aristocracy, usually run by the ladies, and then it spread throughout French society. The “philosophes” believed, first and foremost, in the power of Reason and in Knowledge, as opposed to the religious and political dogmas. They hailed progress and scorned conservative attitudes. The mood changed dramatically, as these philosophers were able to openly say things that a century earlier would have been anathema. Scientific discoveries (Copernicus, Galileo, Newton), the exploration of the world, the printing press and a religious fatigue after so many religious wars led to cultural relativism: there are no dogmas, and only facts and logic should determine opinions. So they questioned authority (Aristotle, the Bible) across the board. Charles de Montesquieu, Denis Diderot, Voltaire, Rousseau favored a purely rational religion and carried out a moral crusade against intolerance, tyranny, superstition.
Julien LaMettrie was the ultimate materialist: he thought the mind is nothing but a machine (a computer, basically) and thoughts are due to the physical processes of the brain. There is nothing special about a mind or a life. Humans are just like all other animals.
Charles Bonnet speculated that the mind may not be able to influence the body, but might simply be a side-effect of the brain (“epiphenomenalism”).
Paul-Henri Holbach believed that humankind’s miseries are mostly caused by religion and superstition, that there is no God handing out rewards or punishment, that the soul dies when the body dies, that all phenomena can be understood in terms of the features of matter.
Georges Buffon concocted the first western account of the history of life and of the Earth that was not based on the Bible.
The American revolution (1776) was, ultimately, a practical application of the Enlightenment, a feasibility study of the ideas of the Enlightenment. The French Revolution (1789-94) was a consequence of the new political discourse, but also signaled an alliance between the rising bourgeoisie, the starving peasants and the exploited workers. Its outcome was that the “nation” replaced “God” and “King”: nationalism was born. By the turn of the century, the Enlightenment had also fathered a series of utopian ideologies, from Charles Fourier’s phalanxes to Claude Saint-Simon’s proto-socialism to Pierre Proudhon’s anarchism.
In marked contrast with the British and French philosophers, the Germans developed a more “spiritual” and less “materialistic” philosophy. The Germans were less interested in economics, society and politics, and much more interested in explaining the universe and the human mind: what we are, and what is the thing out there that we perceive.
Immanuel Kant single-handedly framed the problem for future generations of philosophers. Noticing that the mind cannot perceive reality as it is, he believed that phenomena exist only insofar as the mind turns perceptions into ideas. The empirical world that appears to us is only a representation that takes place inside our mind. Our mind builds that representation thanks to some a priori knowledge in the form of categories (such as space and time). These categories allow us to organize the chaotic flow of perceptions into an ordered, meaningful world. Knowledge consists in categorizing perceptions. In other words, Kant said that knowledge depends on the structure of the mind.
Other German philosophers envisioned an even more “idealistic” philosophy.
Johann Fichte thought that the natural world is constructed by an infinite self as a challenge to itself and as a field in which to operate. The Self needs the non-Self in order to be.
Friedrich Schelling believed in a fundamental underlying unity of nature, which led him to view Nature as God, and to deny the distinction between subject and object.
The spiritual theory of reality reached its apex with Georg-Wilhelm-Friedrich Hegel. He too believed in the unity of nature, that only the absolute (infinite pure mind) exists, and that everything else is an illusion. He argued this by noticing that every “thesis” has an “antithesis”, and that the two can be resolved at a higher level by a “synthesis”; each synthesis becomes, in turn, a thesis with its own antithesis, which is resolved at a higher level of synthesis, and so forth. This endless loop leads to higher and higher levels of abstraction. The limit of this process is the synthesis of all syntheses: Hegel’s absolute. Reality is the “dialectical” unfolding of the absolute. Since we are part of the absolute, as we develop our dialectical knowledge it is, in a sense, the absolute that is trying to know itself. We suffer because we are alienated from the absolute instead of being united with it. Hegel applied the same “dialectical” method to history, believing that history is due to the conflict of nations, conflicts that are resolved on a higher plane of political order.
Arthur Schopenhauer (1819) opened a new dimension to the “idealistic” discourse by arguing that a human being is both a “knower” and a “willer”. As knowers, humans experience the world “from without” (the “cognitive” view). As free-willing beings, humans are also provided with a “view from within” (the “conative” view). The knowing intellect can only scratch the surface of reality, while the will is able to grasp its essence. Unfortunately, the will’s constant urge for ever more knowledge and action causes human unhappiness: we are victims of our insatiable will. In Buddhist-like fashion, Schopenhauer reasoned that the will is the origin of human suffering: the less one “wills”, the less one suffers. Salvation can come through a “euthanasia of the will”.
Ludwig Feuerbach inverted Hegel’s relationship between the individual and the Absolute and saw religion as a way to project the human experience (“species being”) into the concept of God.
Soren Kierkegaard (1846) saw philosophy and science as vain and pointless, because the thinker can never be a detached, objective, external observer: the thinker is someone who exists and is part of what is observed. Existence is both the thinker’s object and condition. He thought that philosophers and scientists missed the point. What truly matters is the pathos of existing, not the truth of Logic. Logic is defined by necessity, but existence is dominated by possibility. Necessity is a feature of being, possibility is a feature of becoming. He focused on the fact that existence is possibility, possibility means choice, and choice causes angst. We are trapped in an “aut-aut”, between the aesthetic being (whose life is paralyzed by multiple possibilities) and the ethic being (whose life is committed to one choice). The only way out of the impasse is faith in God.
Inventions and discoveries of this age include Alessandro Volta’s battery, a device that converts chemical energy into electricity, and John Dalton’s theory that matter is made of atoms of differing weights. Taking Newton literally, Pierre-Simon Laplace argued that the future is fully determined: given the initial conditions, every future event in the universe can be calculated. The primacy of empirical science (“positivism”) was championed by Auguste Comte, who described the evolution of human civilization as three stages, corresponding to three stages of the human mind: the theological stage (in which events are explained by gods, and kings rule); the abstract stage (in which events are explained by philosophy, and democracy rules); and the scientific (“positive”) stage (in which there is no absolute truth, but science provides generalizations that can be applied to the real world).
Hermann von Helmholtz offered a detailed picture of how perception works, one that emphasized how an unconscious process in the brain was responsible for turning sense data into thought and for mediating between perception and action.
In Mathematics, George Boole resuscitated Leibniz’s program of a “lingua characteristica” by applying algebraic methods to a variety of fields. His idea was that the systematic use of symbols eliminated the ambiguities of natural language. A number of mathematicians realized that the traditional (Euclidean) geometry was not the only possible geometry. Non-Euclidean geometries were developed by Carl-Friedrich Gauss, Nikolaj Lobachevsky (1826), Janos Bolyai (1829) and Bernhard Riemann (1854). The latter realized that the flat space of Euclidean geometry (the flat space used by Newton) was not necessarily the only possible kind of space: space could be curved, and he developed a geometry for curved space (in which even a “straight” line is curved, by definition). Each point of that space can be more or less curved, according to a “curvature tensor”.
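A minimal sketch of what Boole’s symbolic method looked like (notation modernized): letters stand for classes of things, multiplication for the intersection of two classes, and the one law that separates his algebra from ordinary arithmetic is the “index law”, whose only numeric solutions are 0 and 1:

```latex
xy = yx \qquad \text{(commutativity)}
z(x + y) = zx + zy \qquad \text{(distributivity)}
x^2 = x \qquad \text{(the ``index law''; in numbers it holds only for } x = 0 \text{ and } x = 1\text{)}
```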
Somehow, the convergence of utopianism, idealism and positivism yielded Karl Marx’s historical materialism. Marx was fully aware that humans are natural beings who have to interact with nature (work) in order to survive. Labor converts the raw materials of nature into the products that help humans survive. But in the industrial society the difference between the cost of manufacturing a product and the price that people are willing to pay for it had created a “surplus value” that was making the capitalist class richer and richer, while hardly benefiting the working class at all. Marx set out to analyze the “alienation” caused to the working class by the fact that producer and product had been separated. He envisioned the society of his time as divided into two antagonistic classes: the proletariat and the bourgeoisie. And he envisioned the whole of human history as a conflict not of nations but of classes. His remedy was socialism: all citizens should own the tools of production. After socialism, the final stage of human history was to be communism: the full equality of a class-less society.
While human knowledge was expanding so rapidly, literature was entering the “romantic” age. The great poets of the age were William Blake and William Wordsworth in England, Friedrich Hoelderlin and Johann-Wolfgang Goethe in Germany, and Giacomo Leopardi in Italy. With the exception of Carlo Goldoni’s comedies in Italy, theater was dominated by German drama: Gotthold-Ephraim Lessing, Friedrich von Schiller, Georg Buchner. The novel became a genre of equal standing with poetry and theater via Goethe’s “Wilhelm Meister” (1796), Stendhal’s “Le Rouge et Le Noir” (1830), Honore’ de Balzac’s “Le Pere Goriot” (1834), Nikolaj Gogol’s “Dead Souls” (1842), Emily Bronte’s “Wuthering Heights” (1847), Herman Melville’s “Moby Dick” (1851), Gustave Flaubert’s “Madame Bovary” (1857) and Victor Hugo’s “Les Miserables” (1862).
While painting was relatively uneventful compared with the previous age, despite the originality of works such as Francisco Goya’s “Aquelarre” (1821) and Jean-Francois Millet’s “The Gleaners” (1851), this was the age of classical music, which boasted the geniuses of Wolfgang-Amadeus Mozart, Franz-Peter Schubert and Ludwig Van Beethoven.
In the meantime, the world had become a European world. The partition of Africa (1885) had given Congo to Belgium, Mozambique and Angola to Portugal, Namibia and Tanzania to Germany, Somalia to Italy, Western Africa and Madagascar to France, and then Egypt, Sudan, Nigeria, Uganda, Kenya, South Africa, Zambia, Zimbabwe, Botswana to Britain. Then there were the “settler societies” created by the European immigrants who displaced the natives: Canada, USA, Australia, South Africa. In subject societies such as India’s (and, de facto, China’s), few Europeans ruled over huge masses of natives. The mixed-race societies of Latin America were actually the least “European”. There were fewer and shorter Intra-European wars but many more wars of conquest elsewhere. Europeans controlled about 35% of the planet in 1800, 67% in 1878, 84% in 1914.
Japan was the notable exception. It had been the least “friendly” to the European traders, and it became the first (and only) non-European civilization to “modernize” rapidly. In a sense, it became a “nation” in the European sense of the word. It was also the first non-European nation to defeat a European power (Russia). No wonder that the Japanese came to see themselves as the saviors of Asia: they were the only ones that had resisted European colonization.
To ordinary people, the age of wars among the European powers seemed to be only a distant memory. The world was becoming more homogeneous and less dangerous. One could travel from Cairo to Cape Town, or from Lisbon to Beijing, with minimal formalities. It was “globalization” on a scale never seen before and not seen again for a century. Such a sense of security had not been felt since the days of the Roman empire, although, invisible to most, this was also the age of a delirious arms race such as the world had never seen.
No wonder that European population increased dramatically at the end of the 19th century. In 30 years, Germany’s population grew by 43%, Austria-Hungary’s by 35%, Britain’s by 26%. A continuous flow of people emigrated to the Americas.
After the French revolution, nationalism became the main factor of war. Wars were no longer feuds between kings, they were conflicts between peoples. This also led to national aspirations by the European peoples who did not have a country yet: notably Italians and Germans, who were finally united in 1861 and 1871 (but also the Jews, who had to wait much longer for a homeland). Nationalism was fed by mass education (history, geography, literature), which included, more or less subtly, the exaltation of the national past.
France lived its “Belle Epoque” (the 40 years of peace between 1871 and 1914). It was the age in which cafes (i.e., the lower classes) replaced the salons (i.e., the higher classes) as the cultural centers. And this new kind of cultural center witnessed an unprecedented convergence of sex, art and politics. Poetry turned towards “Decadentism” and “Symbolism”, movements pioneered by Charles Baudelaire’s “Les Fleurs du Mal” (1857), Lautreamont’s “Les Chants de Maldoror” (1868), Arthur Rimbaud’s “Le Bateau Ivre” (1871), Paul Verlaine’s “Romances sans Paroles” (1874) and Stephane Mallarme’s “L’apres-midi d’un Faune” (1876). Painters developed “Impressionism”, which peaked with Claude Monet, and then “Cubism”, which peaked with Pablo Picasso, and, in between, original styles were pursued by Pierre-Auguste Renoir, Georges Seurat, Henri Rousseau, Paul Gauguin and Henri Matisse. France had most of the influential artistic movements of the time. In the rest of Europe, painting relied on great individualities: Vincent van Gogh in Holland, Edvard Munch in Norway, Gustav Klimt in Austria and Marc Chagall in Russia. French writers founded “Dadaism” (1916) and “Surrealism” (1924), and an Italian in Paris founded “Futurism” (1909). They inherited the principle of the “Philosophes”: question authority and defy conventions, negate aesthetic and moral values. At the same time, they reacted against the ideological values of the Enlightenment itself: Dadaism exalted irrationality, Surrealism was fascinated by dreams, Futurism worshipped machines.
Berlin, in the meantime, had become not only the capital of a united Germany but also the capital of electricity. Germany’s pace of industrialization had been frantic. Werner von Siemens founded Siemens in 1847. In 1866 that company invented the first practical dynamo. In 1879 Siemens demonstrated the first electric railway, and in 1881 it demonstrated the first electric tram system. In 1887 Emil Rathenau founded Siemens’ main competitor, the Allgemeine Elektrizitats-Gesellschaft (AEG), specializing in electrical engineering, whereas Siemens was specializing in communication and information. In 1890 AEG developed the alternating-current motor (invented in the USA by Nikola Tesla) and the generator, which made it possible to build the first power plants: alternating current made it easier to transmit electricity over long distances. In 1910 Berlin was the greatest center of electrical production in the world. Germany’s industrial output had passed France’s (in 1875) and Britain’s (in 1900). Berlin was becoming a megalopolis, as its population grew from 1.9 million in 1890 to 3 million in 1910.
Electricity changed the daily lives of millions of people, mainly in the USA, because it enabled the advent of appliances, for example Josephine Cochrane’s dishwasher (1886), Willis Carrier’s air conditioner (1902), and General Electric’s commercial refrigerator (1911). Life in the office also changed dramatically. First (in 1868) Christopher Latham Sholes introduced a practical typewriter that changed the concept of correspondence, and then (in 1885) William Burroughs introduced an adding machine that changed the concept of accounting.
Progress in transportation continued with Friedrich von Harbou’s dirigible (1873), Daimler and Maybach’s motorcycle (1885), Karl Benz’s gasoline-powered car (1886) and Wilbur and Orville Wright’s airplane (1903). But, more importantly, the USA introduced a new kind of transportation, not physical (of people) but virtual (of information). The age of communications was born with Samuel Morse’s telegraph (1844), Alexander Bell’s telephone (1876), Thomas Edison’s phonograph (1877) and Kodak’s first consumer camera (1888). Just as Louis Daguerre had invented the “daguerreotype” in 1839 only to see his invention improved mainly in the USA, so the Lumiere brothers invented cinema (in 1895) only to see the new invention quickly become an American phenomenon. The most dramatic of these events was perhaps Guglielmo Marconi’s transatlantic radio transmission of 1901, when the world seemed to shrink.
“Creationist” views of the world had already been attacked in France by the “philosophes”. In the age of Progress, a new and much more scientific attack came from Britain.
Herbert Spencer attempted a synthesis of human knowledge that led him to posit the formation of order as a pervasive feature of the universe. Basically, the universe is “programmed” to evolve towards more and more complex states. In particular, living matter continuously evolves. The fittest forms of life survive. Human progress (wealth, power) results from a similar survival of more advanced individuals, organizations, societies and cultures over their inferior competitors.
Charles Darwin explained how animals evolved: through the combination of two processes, one of variation (the fact that children are not identical to the parents, and are not identical to each other) and one of selection (the fact that only some of the children survive). The indirect consequence of these two processes is “adaptation”, whereby species tend to evolve towards the configuration that can best cope with the environment. The “struggle for survival” became one of the fundamental laws of life. In a sense, Darwin had merely transferred Adam Smith’s economics to biology. But he had also introduced an important new paradigm: “design without a designer”. Nature can create amazingly complex and efficient organisms without any need for a “designer” (whether human or divine). Humans are used to the idea that someone designs, and then builds, an artifact. A solution to a problem requires some planning. But Darwin showed that Nature uses a different paradigm: it lets species evolve through the combined forces of variation and selection, and the result is a very efficient solution to the problem (survival). No design and no planning are necessary. It was more than a theory of evolution: it was a new way of thinking, one that was immediately applied to economics, sociology, history, etc.
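The paradigm can be captured in a few lines of code. Below is a minimal sketch (the target bit string, population size and mutation rate are all invented for illustration): repeated variation and selection, with no designer anywhere, converge on an organism that fits its “environment”.

```python
import random

# The "environment": a hypothetical target that defines fitness.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

def fitness(genome):
    # How well the organism copes with the environment.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.1):
    # Variation: children are not identical to their parents.
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for generation in range(100):
    # Selection: only the fittest half survives to reproduce.
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

print(fitness(population[0]), "of", len(TARGET))  # typically a near-perfect fit
```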
Ernst Haeckel argued that “ontogeny recapitulates phylogeny”: the development of the body in the individual of a species (or ontogeny) summarizes the evolutionary development of that species (phylogeny).
Far less publicized, but no less dramatic, was the discovery of Gregor Mendel. He set out to explain why children do not inherit the average of the traits of their parents (e.g., a color in between the black eyes of the mother and the blue eyes of the father) but only the trait of one or the other (black or blue eyes). He came up with a simple but, again, revolutionary explanation: there are units of transmission of traits (which today we call “genes”), and one inherits not a mathematical combination of one’s parents’ traits but either one unit or the other. Mendel introduced an important distinction: the “genotype” (the program that determines how an organism looks) versus the “phenotype” (the way the organism actually looks).
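Mendel’s scheme is easy to simulate. The sketch below assumes a hypothetical pair of alleles, “B” (black eyes, dominant) and “b” (blue eyes, recessive); crossing two hybrids reproduces Mendel’s famous 3:1 ratio of phenotypes, with no blending anywhere.

```python
import random
from collections import Counter

def child(parent1, parent2):
    # Each parent passes on one unit (allele), chosen at random, not an average.
    return random.choice(parent1) + random.choice(parent2)

def phenotype(genotype):
    # The dominant "B" masks the recessive "b".
    return "black" if "B" in genotype else "blue"

# Cross two hybrids (Bb x Bb).
offspring = [child("Bb", "Bb") for _ in range(10000)]
print(Counter(phenotype(g) for g in offspring))  # roughly 7500 black, 2500 blue
```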
Similar progress was going on in the study of the human mind. Paul Broca studied brain lesions to understand the structure of the brain and how it determines human behavior and personality.
The acceleration in Physics had been dramatic since Newton’s unification of terrestrial and celestial Mechanics, but the age of steam and electrical power introduced the first strains on its foundations. In 1824, Sadi Carnot had worked out a preliminary science of heat, or Thermodynamics, and in 1864 James Clerk Maxwell unified electricity and magnetism, thus founding Electromagnetism. In 1887 Heinrich Hertz discovered radio waves, and in 1895 Wilhelm-Conrad Roentgen discovered X-rays. Newton’s Physics had not been designed for these phenomena.
In 1896 radioactivity was discovered and led Physicists to believe that the atom was not indivisible, that it had its own structure. In 1900 Max Planck invented Quantum Theory by positing that energy can only be transmitted in discrete “quanta”. In 1905 Albert Einstein published “The Special Theory of Relativity”. In 1911 Ernest Rutherford showed how the atom is made of a nucleus and orbiting electrons.
Newton’s Physics viewed the world as a static and reversible system that undergoes no evolution, whose information is constant in time. Newton’s Physics was the science of being. But his Physics was not very useful for understanding the world of machines, a dynamic world of becoming. Thermodynamics describes an evolving world in which irreversible processes occur: something changes and can never be undone. Thermodynamics was the science of becoming. The science of being and the science of becoming describe dual aspects of nature. Thermodynamics was born to study gases: systems made of a myriad small particles in frantic motion. Newton’s Physics would require a dynamic equation for each of them, which is just not feasible. Thermodynamics describes a macroscopic system by global properties (such as temperature, pressure, volume). Global properties are due to the motion of the system’s particles (e.g., temperature is the average kinetic energy of the molecules of a system). They are fundamentally stochastic, which implies that the same macro-state can be realized by different micro-states (e.g., a gas can have the same temperature at different points in time even if its internal state is changing all the time). Sadi Carnot realized that a “perpetual-motion” machine was not possible: it is not possible to continuously convert energy from one form to another and back. The reason is that any transformation of energy has a “cost”, which came to be called “entropy”. That quantity became the real oddity of Thermodynamics. Everything else was due to a stochastic view of a complex system (as opposed to Newton’s deterministic view of a simple system), but entropy was a new concept, one that embodied a fundamental feature of our universe: things decay, and some processes are not reversible. Heat flows spontaneously from hot to cold bodies, but the opposite never occurs. You can dissolve a lump of sugar in a cup of coffee, but, once it is dissolved, you can never bring it back. You may calculate the amount of sugar, its temperature and many other properties, but you cannot bring it back. Some happenings cannot be undone. The second law of Thermodynamics states that the entropy (of an isolated system) can never decrease. It is a feature of this universe that natural processes generate “entropy”. This translates into a formula that is not an equality. Newton’s Physics was built on the equal sign (something equals something else). Thermodynamics introduced the first law of nature that was an inequality.
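The contrast fits in one line. In the notation later introduced by Clausius (not Carnot’s own), entropy is defined through reversible heat exchange, and the second law is an inequality rather than an equation:

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S_{\text{isolated system}} \ge 0
```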
Ludwig Boltzmann interpreted entropy as a measure of disorder in a system. He offered a statistical definition of entropy: the entropy of a macrostate is the logarithm of the number of its microstates. Entropy measures the very fact that many different microscopic states of a system may result in the same macroscopic state. One can interpret this fact as “how disordered the system is”. This interpretation combined with the second law of Thermodynamics led to the fear of an “eternal doom”: the universe must evolve in the direction of higher and higher entropy, thus towards the state of maximum entropy, which is absolute disorder, or the “heat death”.
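Boltzmann’s statistical definition, in the form later engraved on his tombstone (W is the number of microstates compatible with a given macrostate, k is the constant now named after him):

```latex
S = k \log W
```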
Maxwell’s Electromagnetism introduced another paradigm shift: the concept of field (pioneered by Michael Faraday). Maxwell proved that electricity and magnetism, apparently related to different phenomena, are, in reality, the same phenomenon. Depending on circumstances, one can witness only the electrical or only the magnetic side of things, but they actually coexist all the time. The electric force is created by changes in the magnetic field. The magnetic force is created by changes in the electric field. The oddity was that the mathematical expression of these relations between electric and magnetic forces turned out to be “field equations”, equations describing not the motion of particles but the behavior of fields. Changes in a field radiate through space as waves. Gravitation was no longer the only example of action at a distance: just as there was a “gravitational field” associated with any mass, so there turned out to exist an “electromagnetic” field associated with any electric charge. Light itself was shown to be made up of electromagnetic waves.
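In the compact vector form later due to Oliver Heaviside (here in SI units, with ρ the charge density and J the current density), Maxwell’s field equations read:

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```

These equations admit wave solutions traveling at speed c = 1/√(μ₀ε₀), which is how light was identified with electromagnetic radiation.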
Ernst Mach was another influential Physicist who had a powerful intuition. He envisioned the inertia of a body (the tendency of a body at rest to remain at rest and of a body in motion to continue moving in the same direction) as the consequence of a relationship of that body with the rest of the matter in the universe. Basically, he thought that each body in the universe interacts with all the other bodies in the universe, even at gigantic distances, and its inertia is the sum of those myriad interactions.
The last of the major ideas in Physics before Relativity came from Henri Poincare, who pioneered “chaos” theory when he pointed out that a slight change in the initial conditions of some equations results in large-scale differences. Some systems live “at the edge”: a slight change in the initial conditions can have catastrophic effects on their behavior.
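A standard modern illustration of this sensitivity (the logistic map, which postdates Poincare; the starting values are invented for illustration) takes only a few lines: two trajectories that start a billionth apart soon have nothing in common.

```python
def logistic(x, r=4.0):
    # A simple nonlinear rule, iterated over and over.
    return r * x * (1 - x)

x, y = 0.2, 0.2 + 1e-9  # two almost identical initial conditions
for step in range(50):
    x, y = logistic(x), logistic(y)

print(abs(x - y))  # typically of order 1: the tiny initial difference has exploded
```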
The intellectual leadership, though, was passing to the Mathematicians.
By inventing “set theory”, Georg Cantor emancipated Mathematics from its traditional domain (numbers). He also introduced “numbers” to deal with infinite quantities (“transfinite” numbers), because he realized that space and time are made of infinite points, and that, between any two points, there always exists an infinite number of points. Nonetheless, an infinite series of numbers can have a finite sum. These were, after all, the same notions that, centuries before Cantor, had puzzled Zeno. Cantor gave them mathematical legitimacy.
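The claim that an infinite series can have a finite sum is exactly what resolves Zeno’s runner: infinitely many ever-shorter steps add up to a finite distance.

```latex
\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1
```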
Gottlob Frege (1884) aimed at removing intuition from arithmetic. He thus set out, just like Leibniz and Boole before him, to replace natural language with the language of Logic, “predicate calculus”. Extending Cantor’s program, Frege turned Mathematics itself into a branch of Logic: using Cantor’s “sets”, he reconstructed the cardinal numbers by a purely logical method that did not rely on intuition.
Frege realized that Logic was about the “syntax”, not the “semantics”, of propositions. An expression has a “sense” (or intension) and a “reference” (or extension): “red” is the word for the concept of redness and the word for all the things that are red. In some cases, expressions with different senses actually have the same referent: for example, “the star of the morning” and “the star of the evening”, which both refer to Venus. In particular, propositions of Logic can have many senses, but have only one of two referents: true or false.
Giuseppe Peano was pursuing a similar program at the same time, an “axiomatization” of the theory of natural numbers.
Charles Peirce gave a pragmatic definition of “truth”: something is true if it can be used and validated. Thus, truth is defined by consensus. Truth is not agreement with reality, it is agreement among humans. Truth is “true enough”. Truth is not eternal. Truth is a process, a process of self-verification. In general, he believed that an object is defined by the effects of its use: a definition that works well is a good definition. An object “is” its behavior. The meaning of a concept consists in its practical effects on our daily lives: if two ideas have the same practical effects on us, they have the same meaning.
Peirce was therefore more interested in “beliefs” than in “truths”. Beliefs lead to habits that get reinforced through experience. He saw that the process of habit creation is pervasive in nature: all matter can be said to acquire habits, except that the “beliefs” of inert matter have been fixed to the extent that they can’t be changed anymore. Habit is, ultimately, what makes objects what they are. It is also what makes us what we are: I am my habits. Habits are progressively removing chance from the universe. The universe is evolving from an original chaos in which chance prevailed and there were no habits towards an absolute order in which all habits have been fixed.
At the same time, Peirce realized that Frege’s theory of sense and referent was limited. Peirce introduced the first version of “semiotics” that focused on what signs are. An index is a sign that bears a causal relation with its referent (for example, cigarette smoke “means” that someone was in the room). An icon is a sign that bears a relation of similarity with its referent (for example, the image of a car refers to the car). A symbol is a sign that bears a relation with its referent that is purely conventional (for example, the word “car” refers to a car). A sign refers to an object only through the mediation of other signs (or “interpretants”). There is an infinite regression of interpretants from the signifier (the sign) to the signified (the referent). A dictionary defines a word in terms of other words, which are defined in terms of other words, which are defined in terms of other words, and so forth. Peirce believed that knowing is “semiosis” (making signs) and semiosis is an endless process.
Philosophy was less interested in Logic and more interested in the human condition, the “existentialist” direction that Schopenhauer and Kierkegaard had inaugurated.
Friedrich Nietzsche believed that humans are driven by the “will to power”, an irresistible urge to order the course of one’s experiences (an extension of Schopenhauer’s will to live). All living beings strive for a higher order of their living condition to overcome their present state’s limitations. Human limitations are exemplified by Science: Science is only an interpretation of the world. Truth and knowledge are only relative to how useful they are to our “will to power”. He viewed Christian morality as a device invented by the weak to assert their will to power over the strong, a “slave morality”. He believed that Christian values had become obsolete (“God is dead”) and advocated a new morality founded on the ideal of the “superman”, who rises above the masses and solves the problems of this world, not of the otherworld.
Henri Bergson was, instead, a very spiritual philosopher, for whom reality was merely the eternal flow of a pantheistic whole. This flow has two directions: the upward flow is life, the downward flow is inert matter. Humans are torn between Intellect and Intuition: Intellect is life observing inert matter (in space), whereas Intuition is life observing life (in time). Intellect can “understand” inert matter, but only Intuition can “grasp” life. In order to understand matter, Intellect breaks it down into objects located in space. Intuition, instead, grasps the flow of life as a whole in time.
Francis-Herbert Bradley was the last major “idealist”. He argued that all categories of science (e.g., space and time) can be proven to be contradictory, which proves that the world is a fiction, a product of the mind. The only reality has to be a unity of all things, the absolute.
Inevitably, the focus of knowledge shifted towards the psyche.
William James adapted Peirce’s “pragmatism” to the realm of the mind. He believed that the function of mind is to help the body to live in an environment, just like any other organ. The brain is an organ that evolved because of its usefulness for survival. The brain is organized as an associative network, and associations are governed by a rule of reinforcement, so that it creates “habits” out of regularities (stimulus-response patterns). A habit gets reinforced as it succeeds. The function of thinking is pragmatic: to produce habits of action. James was intrigued by the fact that the brain, in doing so, also produced “consciousness”, but thought that mental life is not a substance, it is a process (“the stream of consciousness”).
Edward Thorndike postulated the “law of effect”: animals learn based on the outcome of their actions. He envisioned the brain as a network: learning occurs when elements are connected. Behavior is due to the association of stimuli with responses that is generated through those connections. A habit is a chain of “stimulus-response” pairs.
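The law of effect is itself almost an algorithm. Here is a minimal sketch, with an invented two-response scenario (a lever that yields food, a corner that yields nothing): rewarding one response strengthens its stimulus-response connection until the habit dominates.

```python
import random

# Initial strengths of two hypothetical stimulus-response connections.
weights = {"lever": 1.0, "corner": 1.0}

def act():
    # The animal picks a response with probability proportional to strength.
    total = sum(weights.values())
    r = random.uniform(0, total)
    for response, w in weights.items():
        r -= w
        if r <= 0:
            return response
    return response

for trial in range(1000):
    response = act()
    if response == "lever":       # only pressing the lever yields food
        weights[response] += 0.1  # the reward reinforces the connection

print(weights)  # the rewarded habit now dominates
```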
Wilhelm-Max Wundt had founded Psychology to study the psyche via experiments and logic, not mere speculation. The classical model of Psychology was roughly this. Actions have a motive. Motives are hosted in our minds and controlled by our minds. Motives express an imbalance between desire and reality that the mind tries to remedy by changing the reality via action. An action, therefore, is meant to restore the balance between reality and our desires. But what about dreams?
Sigmund Freud was less revolutionary than he seemed to be, because, in principle, he simply applied the classical model of Psychology. He decided that dreams have a motive, that those motives are in the mind, and that they are meant to remedy an imbalance. Except that the motives of dreams are not conscious: the mind contains both conscious motives and unconscious motives. There is a repertory of motives that our mind, independent of our will, has created over the years, and they participate daily in determining our actions. Freud’s revolution was in separating motive and awareness. A dream is only apparently meaningless: it is meaningless if interpreted from the conscious motives. But the dream is perfectly logical if one considers also the unconscious motives. The meaning of a dream is hidden, and reflects memories of emotionally meaningful experiences. Dreams are not prophecies, as ancient oracles believed, but hidden memories. Psychoanalysis was the discipline invented by Freud to sort out the unconscious mess.
Freud divided the self into different parts that coexist. The ego perceives, learns and acts consciously. The super-ego is the (largely unconscious) moral conscience that was created during childhood by parental guidance as an instrument of self-repression. The id is the repertory of unconscious memories created by “libido”.
Somewhat unnecessarily, Freud painted a repulsive picture of the human soul. He believed that the main motive was “libido” (sexual desires) and that a child is, first and foremost, a sexual being. As parents repress the child’s sexuality, the child undergoes oral, anal and phallic stages. Boys desire sex with their mother and are afraid their father wants to castrate them. Girls envy the penis and are attracted to their father. And so forth.
Carl Jung shifted the focus towards a different kind of unconscious, the collective unconscious. He saw motives not so much in the history of the individual as in the history of the entire human race. His unconscious is a repertory of motives created over the millennia and shared by all humankind. Its “archetypes” spontaneously emerge in all minds. All human brains are “wired” to create some myths rather than others. Thus mythology is the key to understanding the human mind, because myths are precisely the keys to unlock those motives. Dreams reflect this collective unconscious, and therefore connect the individual with the rest of humankind and its archaic past. For Jung, the goal of Psychoanalysis is a spiritual renewal through the mystical connection with our primitive ancestors.
Another discipline invented at the turn of the century was Hermeneutics. Wilhelm Dilthey argued that human knowledge can only be understood by placing the knower’s life in its historical context. Understanding a text implies understanding the relationship between the author and the author’s age. This applies in general to all cultural products, because they are all analogous to written texts.
Ferdinand Saussure was the father of “Structuralism”. The meaning of any human phenomenon (e.g., language) lies in the network of relationships that it is part of. A sign is meaningful only within the entire network of signs, and the meaning of a sign “is” its relationship to other signs. Language is a system of signs having no reference to anything outside itself. He also separated “parole” (a specific utterance in a language, or a speaker’s performance) from “langue” (the entire body of the language, or a speaker’s competence), thus laying the foundations for Linguistics.
Edmund Husserl’s aim was to found “Phenomenology”, the science of phenomena. He believed that the essence of events is not their physical description provided by science, but the way we experience them. In fact, science caused a crisis by denying humans the truth of what they experience, by moving away from phenomena as they are. He pointed out that consciousness is “consciousness of”: it correlates the act of knowing (“noesis”) of the subject and the object that is known (“noema”). The self knows a phenomenon “intuitively”. The essence (“eidos”) of a phenomenon is the sum of all possible “intuitive” ways of knowing that phenomenon. The eidos can be achieved only after “bracketing out” the physical description of the phenomenon, only after removing the pollution of science from the human experience, so that the self can experience a purely transcendental knowledge of the phenomenon. This would restore the unity of subject and object that science separated.
In Physics, a number of ideas were converging towards the same view of the world. Henri Poincare showed that the speed of light has to be the maximum speed and that mass depends on speed. Hendrik Lorentz unified Newton’s equations for the dynamics of bodies and Maxwell’s equations for the dynamics of electromagnetic waves in one set of equations, the “Lorentz transformations”. These equations, which were hard to dispute because both Newton’s and Maxwell’s theories were confirmed by countless experiments, contained a couple of odd implications: bodies seemed to contract with speed, while clocks seemed to slow down.
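For two inertial frames in relative motion at speed v along the x axis, the Lorentz transformations read:

```latex
x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^2}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
```

Since γ > 1 for any moving body, lengths measured in the moving frame contract and clocks slow down, which are precisely the two “odd implications”.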
Albert Einstein devised an elegant unification of all these ideas that matched, in scope, the one provided two centuries earlier by Newton. He used strict logic. His axioms were that the laws of nature must be uniform, that those laws must be the same in all frames of reference that are “inertial” (at rest or moving in uniform linear motion), and that the speed of light is the same for all such observers. He took the oddities of the Lorentz transformations literally: length and duration appear different to different observers, depending on their state of motion, because space and time are relative. “Now” and “here” became meaningless concepts. The implications of his axioms were even more powerful. All physical quantities were now expressed in four dimensions, a time component and a three-dimensional space component. One quantity, in particular, represented both energy and momentum, depending on the space-time coordinate that one examined. It also yielded the equivalence between mass and energy (E=mc2). Time does not flow (no more than space does): it is just a dimension. A life is a series of points in space-time, points that have both a spatial and a temporal component.
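In the modern notation, energy and momentum are components of a single four-vector, and its invariant length yields the famous equivalence (for a body at rest, with momentum p = 0, it reduces to E = mc²):

```latex
E^2 = (pc)^2 + (mc^2)^2 \quad\Longrightarrow\quad E = mc^2 \ \ \text{(for } p = 0\text{)}
```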
Einstein’s world was still Newton’s world, though, in some fundamental ways. For example, it was deterministic: the past determines the future. There was one major limitation: because nothing can travel faster than light, there is a limit to what can happen in one’s life. Each observer’s history is constrained by a cone of light within the space-time continuum radiating from the point (space and time) where the observer “is”.
Einstein’s next step was to look for a science that was not limited to “inertial” systems. He believed that phenomena should appear the same for all systems accelerated with respect to one another. His new formulas had new startling implications. The dynamic of the universe was reduced to the interaction between masses and the geometry of space-time: masses curve space-time, and the curvature of space-time determines how masses move. Space-time is warped by all the masses that it is studded with. Every object left to itself moves along a “geodesic” of space-time (the shortest route between two points on the warped surface of space-time). It so happens that space-time is warped, and thus objects appear to be “attracted” by the objects in space-time that have warped it. But each object is simply moving on a geodesic (the equivalent of a straight line in traditional “flat” space). It is space-time that is curved, not the geodesic (the trajectory) of the body. Space-time “is” the gravitational field. Einstein thus reduced Physics to Geometry. The curvature of space-time is measured by a “curvature tensor” (as in Riemann’s geometry) such that each point in space-time is described by ten numbers (the “metric tensor”). If the metric tensor is reduced to zero curvature, one obtains traditional Physics in traditional flat space. Curvature (i.e., a gravitational field) also causes clocks to slow down and light to be deflected.
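The whole theory condenses into one field equation, relating the curvature of space-time (the left side, the Einstein tensor G) to the distribution of mass and energy (the right side, the stress-energy tensor T); the metric tensor has ten independent components per point, the “ten numbers” mentioned above:

```latex
G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```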
Surprisingly, Einstein’s Relativity, which granted a special status to the observer, re-opened the doors to Eastern spirituality. Nishida Kitaro was perhaps the most distinguished Eastern philosopher to attempt a unification of western science and Zen Buddhism. In Nishida’s system, western science is like a robot without feelings or ethics that provides the rational foundations for life, while Zen provides the feelings and the ethics. “Mu” is the immeasurable moment in space-time (“less than a moment”) that has to be “lived” in order to reach the next “mu”. The flow of “mu” creates a space-time topology. Mu’s infinitesimally brief presence creates past, present, and future. The “eternal now” contains one’s whole being and also the being of all other things. The present is merely an aspect of the eternal. The eternal generates a present at every moment. Mu also creates self-consciousness and free will. There is a fundamental unity of the universe, in particular between the self and the world. Each self and each thing are expressions of the same reality, God. The self is not a substance: it is nothingness (“to study the self is to forget the self”). Religion, not science, is the culmination of knowledge. It is also the culmination of love.
The European countries (and at least two of their former colonies, Brazil and the USA) experienced an unprecedented boom in literature. The great novels of the time expanded over the genres invented by the previous generations: Leo Tolstoj’s “War and Peace” (1869), George Eliot’s “Middlemarch” (1872), Emile Zola’s “L’Assommoir” (1877), Fedor Dostoevsky’s “Brothers Karamazov” (1880), Joaquim-Maria Machado de Assis’ “Memorias Postumas” (1881), Joris Huysmans’ “A Rebours” (1884), Perez Galdos’ “Tristana” (1892), Jose-Maria Eca de Queiros’ “Casa de Ramires” (1897), Thomas Mann’s “Buddenbrooks” (1901), Henry James’s “The Golden Bowl” (1904), Luigi Pirandello’s “Il Fu Mattia Pascal” (1904), Joseph Conrad’s “Nostromo” (1904), Maksim Gorkij’s “The Mother” (1907) and Franz Kafka’s “Der Prozess” (1915).
Theatre was largely reinvented both as a realist and as a fantastic art through Henrik Ibsen’s “The Wild Duck” (1884), Gerhart Hauptmann’s “The Weavers” (1892), Alfred Jarry’s “Ubu Roi” (1896), Arthur Schnitzler’s “Reigen” (1896), August Strindberg’s “A Dream Play” (1902), Anton Chekhov’s “The Cherry Orchard” (1904), Frank Wedekind’s “Pandora’s Box” (1904) and Bernard Shaw’s “Pygmalion” (1914).
Poetry outside of France’s “isms” ranged from Robert Browning’s “The Ring And The Book” (1869) to Gerard-Manley Hopkins’ “The Wreck Of The Deutschland” (1876), from Ruben Dario’s “Prosas Profanas” (1896) to Giovanni Pascoli’s “Canti di Castelvecchio” (1903), from Antonio Machado’s “Campos de Castilla” (1912) to Rabindranath Tagore’s “Gitanjali” (1913). In the new century, France still led the way in literary fashion with Guillaume Apollinaire’s “Alcools” (1913) and Paul Valery’s “La Jeune Parque” (1917).
Classical music reflected the nationalist spirit of the age (Richard Wagner in Germany, Hector Berlioz in France, Modest Moussorgsky in Russia, Giuseppe Verdi in Italy, Antonin Dvorak in Bohemia, Fryderyk Chopin in Poland, Ferenc Liszt in Hungary) and the impact of Beethoven’s symphonies on the German-speaking world (Johannes Brahms, Richard Strauss, Anton Bruckner and Gustav Mahler).
At the beginning of the new century, a number of compositions announced that the classical format was about to exhaust its mission: Aleksandr Skrjabin’s “Divine Poem” (1905), Arnold Schoenberg’s “Pierrot Lunaire” (1912), Claude Debussy’s “Jeux” (1912), Igor Stravinskij’s “Le Sacre du Printemps” (1913), Charles Ives’ “Symphony 4” (1916), Sergej Prokofiev’s “Classical Symphony” (1917) and Erik Satie’s “Socrate” (1918).
All the progress in Science, Philosophy and the Arts did not help avert a new international war, one so large that it was called a “world war”. Its immediate causes (1914) were insignificant. The real causes were the “nations” themselves. The nationalistic spirit caused the confrontation, and the confrontation caused a massive arms race, and this race turned each European nation into a formidable war machine. Soldiers were transported by battleship, submarine, zeppelin, air fighter, train, car and tank. Enemies were killed with grenades, cannons, machine guns, torpedoes, bombs. 60 million men were mobilized. 8 million died. Serbia, Russia, France, Britain, Japan, Canada, Australia, Italy (1915), China (1917) and the USA (1917) won against Austria, Germany and Turkey. Russia was allied with the winners, but had to withdraw to take care of its own revolution (1917).
The post-war age opened with three new political “isms”: Vladimir Ilic Lenin’s communism (1917), Benito Mussolini’s fascism (1922) and Adolf Hitler’s nazism (1933). Mussolini and Hitler capitalized on the nationalist spirit of the two youngest nations of Europe. The Russian revolution was two revolutions in one. The first one (in February) was caused by food shortages, and involved women, workers and soldiers. The second one (in October) was in reality a coup by Lenin’s Bolshevik Party, determined to apply Leon Trotsky’s program of “permanent revolution” (bypass the bourgeois-democratic society and aim directly for the dictatorship of the proletariat). Lenin inaugurated a collectivist economy supported by a terror apparatus. Lenin was succeeded by Joseph Stalin, under whose rule Marxism-Leninism became the euphemism for a vast, pervasive, centralized bureaucracy in charge of every aspect of life (the “nomenklatura” system). The communist goal required the mobilization of all human and material resources to generate economic power, which in turn guaranteed political and military power.
The three “isms” had something in common, besides the totalitarian regime: they soon became ideologies of mass murder. Lenin’s was scientific, with the goal to create absolute dictatorship (of the proletariat) via absolute violence; Stalin’s was political, to safeguard and increase his own power; Hitler’s was racist, to annihilate inferior races; Mao’s was idealist, to create a just society.
But they did not invent genocide: 2.4 million Chinese died in the 1911 revolution and 2 million would die in the civil war of 1928-1937, the Ottoman empire slaughtered 1.2 million Armenians in 1915, and World War I killed 8 million soldiers. Britain had already experimented with concentration camps in the Boer war (1899-1902).
However, the numbers escalated with the new ideologies of mass murder: Lenin’s “revolution” killed 5 million; Stalin’s purges of 1936-37 killed 13 million; World War 2 killed 55 million, of which millions in Hitler’s gas chambers; Mao’s “Great Leap Forward” (1958-1961) caused the death of perhaps 30 million and his “cultural revolution” (1966-1969) caused the death of perhaps 11 million.
In the meantime, Physics was still in a fluctuating state.
Niels Bohr (1913) showed that electrons are arranged in concentric shells outside the nucleus of the atom, with the number of electrons determining the atomic number of the atom and the outermost shell of electrons determining its chemical behavior. Ernest Rutherford (1919) showed that the nucleus of the atom contains positively charged particles (protons) in equal number to the number of electrons. In 1932 James Chadwick showed that the nucleus of the atom also contains electrically neutral particles (neutrons): isotopes are atoms of the same element (containing the same number of electrons/protons) but with different numbers of neutrons. Their model of the atom was another case of Nature preferring only discrete values instead of all possible values. (Max Planck had shown in 1900 that atoms can emit energy only in discrete amounts.)
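For the hydrogen atom, Bohr’s model yields discrete energy levels, one per shell, another case of Nature preferring discrete values:

```latex
E_n = -\frac{13.6\ \text{eV}}{n^2}, \qquad n = 1, 2, 3, \ldots
```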
At this point, Physics was aware of three fundamental forces: the electromagnetic force, the gravitational force and now the nuclear force.
The theory that developed from these discoveries was labeled “Quantum Mechanics”. It was born to explain why Nature prefers some “quanta” instead of all possible values. Forces are due to exchanges of discrete amounts of energy (“quanta”).
The key intuition came in 1923, when Louis de Broglie argued that matter can be viewed both as particles and as waves: they are dual aspects of the same reality. This also explained the energy-frequency equivalence discovered by Albert Einstein in 1905: the energy of a photon is proportional to the frequency of the radiation.
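Both relations fit in one line: Einstein’s energy-frequency equivalence for light, and de Broglie’s wavelength for matter (h is Planck’s constant, ν the frequency, p the momentum):

```latex
E = h\nu, \qquad \lambda = \frac{h}{p}
```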
Max Born realized (1926) that the “wave” corresponding to a particle was a wave of probabilities: it was a representation of the state of the particle. Unlike a point-like particle, a wave can be in several places at the same time. The implication was that the state of a particle was not a specific value, but a range of values. A “wave function” specifies the values that a certain quantity can assume, and, in a sense, states that the quantity “has” all those values (e.g., the particle “is” in all the places compatible with its wave function). The “wave function” summarizes (“superposes”) all the possible alternatives. Erwin Schroedinger’s equation describes how this wave function evolves in time, just like Newton’s equations describe how a classical physical quantity evolves in time. The difference is that, at every point in time, Schroedinger’s equation yields a range of values (the wave function), not a specific value.
The probability associated with each of those possible values is the probability that an observation would reveal that specific value (e.g., that an observation would find the particle in one specific point). This was a dramatic departure for Physics. Determinism was gone, because the state of a quantum system cannot be determined anymore. Chance had entered the picture, because, when a Physicist performs an observation, Nature decides randomly which of the possible values to reveal. And a discontinuity had been introduced between unobserved reality and observed reality: as long as nobody measures it, a quantity has many values (e.g., a particle is in many places at the same time), but, as soon as someone measures it, the quantity assumes only one of those values (e.g., the particle is in one specific place).
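In modern notation: Schroedinger’s equation prescribes the deterministic evolution of the wave function ψ, while Born’s rule turns ψ into probabilities (here Ĥ is the Hamiltonian, the operator representing the system’s energy):

```latex
i\hbar\,\frac{\partial \psi}{\partial t} = \hat{H}\,\psi, \qquad P(x) = |\psi(x)|^2
```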
The fact that the equations of different quantities were linked together (a consequence of Einstein’s energy-frequency equivalence) had another odd implication, expressed by Werner Heisenberg’s “uncertainty principle”: there is a limit to the precision with which we can measure quantities. The more precise we want to be about a certain quantity, the less precise we will be about some other quantity.
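For position and momentum, the uncertainty principle takes the form (ħ is Planck’s constant divided by 2π):

```latex
\Delta x \,\Delta p \ \ge\ \frac{\hbar}{2}
```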
Quantum Mechanics also suggested that space-time may be discrete: a minimum size to lengths and intervals, below which Physics ceases to operate. Thus, there would be a limit to how small a physical system can be.
Later, Physicists would realize that in Quantum Mechanics even the vacuum is not what it seems: it is not empty.
Besides randomness, which was already difficult to digest, Physicists also had to accept “non-locality”: a system can affect a distant system despite the fact that they are not communicating. If two systems get entangled in a single wave function, they will remain so forever, even if they move to opposite sides of the universe, at a distance at which a signal cannot travel in time to tell one what the other one is doing.
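In the later notation of quantum theory, the simplest entangled state of two particles is the “singlet”, in which neither particle has a definite spin of its own, yet a measurement on one instantly fixes the outcome on the other:

```latex
|\psi\rangle = \frac{1}{\sqrt{2}}\left(\,|\uparrow\downarrow\rangle - |\downarrow\uparrow\rangle\,\right)
```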
If this were not enough, Paul Dirac (1928) realized that the equations of Quantum Mechanics allowed for “anti-matter” to exist next to ordinary matter: for example, a particle (the positron) that looks just like the electron but carries a positive charge. Dirac’s equations for the electron in an electromagnetic field, which combined Quantum Mechanics and Special Relativity, transferred Quantum Theory outside Mechanics, into Quantum Electrodynamics.
Perhaps the most intriguing aspect of Quantum Mechanics is that a measurement causes a “collapse” of the wave function. The observer changes the course of the universe by the simple act of looking at a particle inside a microscope.
This led to different “interpretations” of Quantum Mechanics. Niels Bohr argued that maybe only phenomena are real. Werner Heisenberg, instead, thought that maybe the world “is” made of possibility waves. Paul Dirac thought that Quantum Mechanics simply represents our (imperfect) knowledge of a system. Hugh Everett took the multiple possible values of each quantity literally, and hypothesized that we live in an ever multiplying “multiverse”: at each point in time, the universe splits according to all the possible values of a measurement. In each new universe one of the possible values is observed, and life goes on.
John Von Neumann asked at which point the collapse occurs. If a measurement causes Nature to choose one value, and only one, among the many that are allowed by Schroedinger’s equation, “when” does this occur? In other words, where in the measuring apparatus does this occur? The measurement is performed by having a machine interact with the quantum system and eventually deliver a visual measurement to the human brain. Somewhere in this process a range of possibilities collapses into one specific value. Somewhere in this process the quantum world of waves collapses into the classical world of objects. Measurement consists in a chain of interactions between the apparatus and the system, whereby the states of the apparatus become dependent on the states of the system. Eventually, states of the observer’s consciousness are made dependent on states of the system, and the observer “knows” what the value of the observable is. If we proceed backwards, this seems to imply that the “collapse” occurs in the conscious being, and that consciousness creates reality.
Einstein was the main critic: he believed that Quantum Mechanics was an incomplete description of the universe, and that some “hidden variables” would eventually turn it into a deterministic science just like traditional science and his own Relativity.
From the beginning, it was obvious what was going to be the biggest challenge for Quantum Mechanics: discovering the “quantum” of gravitation. Einstein had explained gravitation as the curvature of space-time, but Quantum Mechanics was founded on the premise that each force is due to the exchange of quanta; gravity did not seem to work that way.
A further blow to the traditional view of the universe came from Edwin Hubble’s discovery (1929) that the universe is expanding. It is not only the Earth that is moving around the Sun, and the Sun that is moving around the center of our galaxy: but all galaxies are moving away from each other.
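Hubble’s observation reduces to a simple proportionality: the farther a galaxy, the faster it recedes (v is the recession velocity, d the distance, and H₀ the constant now named after Hubble):

```latex
v = H_0\, d
```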
The emerging discipline was Biology. By the 1940s Darwin’s theory of evolution (variation plus selection) had finally been wed to Mendel’s theory of genetic transmission (mutation), yielding the “modern synthesis”. Basically, Mendel’s mutation explained where Darwin’s variation came from. At the same time, biologists focused on populations, not individuals, using the mathematical tool of probabilities. “Population Genetics” was born.
Erwin Schroedinger noticed an apparent paradox in the biological world: as species evolve and as organisms grow, life creates order from disorder, thus contradicting the second law of Thermodynamics. The solution to this paradox is that life is not a “closed” system: the biological world is a world of energy flux. An organism stays alive (i.e., maintains its highly organized state) by absorbing energy from the outside world and processing it to decrease its own entropy (i.e., increase its own order). “Living organisms feed upon negative entropy”. Life is “negentropic”. The effect of life’s negentropy is that entropy increases in the outside world. The survival of a living being depends on increasing the entropy of the rest of the universe.
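Schroedinger’s resolution can be written as an entropy balance: the organism is an open system, so the second law only constrains the total (the subscripts are illustrative labels):

```latex
\Delta S_{\text{organism}} + \Delta S_{\text{environment}} \ \ge\ 0
```

The organism can keep its own entropy decreasing as long as it exports at least as much entropy to its environment.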
However, the lives of ordinary people were probably more affected by a humbler kind of science that became pervasive: synthetic materials. In 1907 Leo Baekeland invented the first plastic (“bakelite”). In 1925 cellophane was introduced and in 1930 it was the turn of polystyrene. In 1935 Wallace Carothers invented nylon.
The influence of Einstein can also be seen on Samuel Alexander, who believed in “emergent evolution”: existence is hierarchically arranged and each stage emerges from the previous one. Matter emerges from space-time, life emerges from matter, mind emerges from life, God emerges from mind.
Arguing against idealism, materialism and dualism alike, Bertrand Russell took Einstein literally and adopted the view that there is no substance (“neutral monism”): everything in the universe is made of space-time events, and events are neither mental nor physical. Matter and mind are different ways of organizing space-time.
Elsewhere, he conceived of consciousness as a sense organ that allows us to perceive some of the processes that occur in our brain. Consciousness provides us with direct, immediate awareness of what is in the brain, whereas the senses “observe” what is in the brain. What a neurophysiologist really sees while examining someone else’s brain is part of her own (the neurophysiologist’s) brain.
But Bertrand Russell was perhaps more influential in criticizing Frege’s program. He found a paradox that seemed to terminate the program to formalize Mathematics: the class of all the classes that are not members of themselves is both a member and not a member of itself (the barber who shaves all barbers who do not shave themselves both shaves and does not shave himself). He solved the paradox (and other similar paradoxes, such as the proposition “I am lying” which is true if it is false and false if it is true) by introducing a “theory of types”, which basically resolved logical contradictions at a higher level.
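In set-theoretic notation the paradox is one line: define the class R of all classes that are not members of themselves, and ask whether R belongs to itself:

```latex
R = \{\, x \mid x \notin x \,\} \quad\Longrightarrow\quad R \in R \iff R \notin R
```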
Ludwig Wittgenstein erected another ambitious logical system. Believing that most philosophical problems are non-issues created by linguistic misunderstandings, he set out to investigate the nature of language. He concluded that the meaning of the world cannot be understood from inside the world, and thus metaphysics cannot be justified from inside the world (no more and no less than religion or magic). Mathematics also lost some of its appeal: it cannot be grounded in the world, therefore it is but a game played by mathematicians.
Wittgenstein saw that language has a function, that words are tools. Language is a game between people, and it involves more than a mere transcription of meaning: it involves assertions, commands, questions, etc. The meaning of a proposition can only be understood in its context, and the meaning of a word is due to the consensus of a society. To understand a word is to understand a language.
Edward Sapir argued that language and thought influence each other. Thought shapes language, but language also shapes thought. In fact, the structure of a language exerts an influence on the way its speakers understand the world. Each language contains a “hidden metaphysics”, an implicit classification of experience, a cultural model, a system of values. Language implies the categories by which its speakers not only communicate but also think.
Lev Vygotsky reached a similar conclusion from a developmental viewpoint: language mediates between society and the child. Language guides the child’s cognitive growth. Thus, cognitive faculties are merely internalized versions of social processes that we learned via language as children. Thus, one’s cognitive development (way of thinking) depends on the society in which she grew up.
Something similar to the wave/particle dualism of Physics was taking place in Psychology. Behaviorists such as John Watson, Ivan Pavlov and Burrhus Skinner believed that behavior is due to stimulus-response patterns. Animals learn how to respond to a stimulus based on reward/punishment, i.e. via selective reinforcement of random responses. All of behavior can be reduced to such “conditioned” learning. This also provided an elegant parallel with Darwinian evolution, which is also based on selection by the environment of random mutations. Behaviorists downplayed mind: thoughts have no effect on our actions.
Cognitivists such as Max Wertheimer, Wolfgang Kohler and Karl Lashley (the “gestalt” school) believed just the opposite: an individual stimulus does not cause an individual response. We perceive (and react to) “form”, as a whole, not individual stimuli. We recognize objects not by focusing on the details of each image, but by perceiving the image as a whole. We solve problems not by breaking them down into more and more minute details, but via sudden insight, often by restructuring the field of perception. Cognitivists believed that the processing (thought) between input and output was the key to human behavior, whereas Behaviorists believed that behavior was just a matter of linking outputs with inputs.
Cognitivists conceived the brain as a holistic system. Functions are not localized but distributed around the brain. If a piece of the brain stops working, the brain as a whole may still be working. They envisioned memory as an electromagnetic field, and a specific memory as a wave within that field.
Otto Selz was influenced by this school when he argued that solving a problem entails recognizing the situation and filling in the gaps: the situation carries information in excess of what is apparent, and that excess contains the solution. Thus solving a problem consists in comprehending it, and comprehending it consists in reducing the current situation to a past situation. Once we “comprehend” it, we can also anticipate what comes next: inferring is anticipating.
Last, but not least, Frederic Bartlett suggested that memory is not a kind of storage, because it obviously does not retain the individual words and images. Memory “reconstructs” the past. We are perfectly capable of describing a scene or a novel or a film even though we cannot remember the vast majority of the details. Memory has “encoded” the past in an efficient format of “schemata” that bear little resemblance to the original scenes and stories, but that take little space and make it easy to reconstruct them when needed.
Kurt Goldstein’s theory of disease is also an example of cognitivist thinking. Goldstein took issue with dividing an organism into separate “organs”: it is the whole that reacts to the environment. A “disease” is the manifestation of a change in the relationship between the organism and its environment. Healing is not a “repair”, but an adaptation of the whole organism to the new state. A sick body is, in fact, a system that is undergoing global reorganization.
Jean Piaget focused entirely on the mind, and specifically on the “growth” of the mind. He realized that, during our lifetime, the mind grows, just like the body grows. For him cognition was self-regulation: organisms need to constantly maintain a state of equilibrium with their environment.
Piaget believed that humans achieve that equilibrium through a number of stages, each stage corresponding to a reorganization of our cognitive life. This was not a linear, gradual progress of learning, but a discontinuous process of sudden cognitive jumps. Overall, the growth of the mind was a transition from the stage of early childhood, in which the dominant factor is perception, which is irreversible, to the stage of adulthood, in which the dominant factor is abstract thought, which is reversible.
Charlie-Dunbar Broad was a materialist in the age of behaviorists and cognitivists. He believed that mind was an emergent property of the brain, just like electricity is an emergent property of conductors. Ultimately, all is matter.
That is not to say that the “spiritual” discourse was dead. Martin Buber argued that humans were mistaken in turning subjects into objects, thereby losing the meaning of God. He argued that our original state was one of “I-You”, in which the “I” recognizes other “I”’s in the world, but we moved towards an “I-It” state, in which the “I” sees both objects and people merely as means to an end. This changes the way in which we engage in dialogue with each other, and thus our existence. Thus we lost God, who is the “Eternal You”.
For Martin Heidegger, the fundamental question was the question of “being”. A conceptual mistake is to think of the human being as a “what” instead of a “who”. Another conceptual mistake is to separate the “who” from the “what”: the human being is part of the world at the same time that it is the observer of the world. The human being is not simply “Dasein” (existence) but “Da-sein” (“being there”, existing in the world). We cannot detach ourselves from reality because we are part of it. We just “act”: we are “thrown” into action. We know what to do because the world is not a world of particles or formulas: it is a world of meaning, which the mind can understand. Technology alienates humans because it recasts the natural environment as merely a reservoir of resources to be exploited, when in fact that environment once provided them with an identity.
Vladimir Vernadsky introduced the concept of the “biosphere” to express the unity of all life.
Alfred Whitehead believed in the fundamental unity of the world, due to the continuous interaction of its constituents, and that matter and mind were simply different aspects of the one reality, due to the fact that mind is part of the bodily interaction with the world. He thought that every particle is an event having both an “objective” aspect of matter and a “subjective” aspect of experience. Some material compounds, such as the brain, create the illusion that we call “self”. But the mental is not exclusive to humans, it is ubiquitous in nature.
The relationship of the self with the external reality was also analyzed by George Herbert Mead, who saw consciousness as, ultimately, a feature in the world, located outside the organism and due to the interaction of the organism with the environment. Consciousness “is” the qualities of the objects that we perceive. Those qualities are perceived the way they are because of the acts that we performed. The world is the result of our actions. It is our acting in the environment that determines what we perceive as objects. Different organisms may perceive different objects. We are actors as well as observers (of the consequences of our actions). Consciousness is not the brain process: the brain process is only the switch that turns consciousness on or off. Consciousness is pervasive in nature. What is unique to humans, as social species, is that they can report on their conscious experiences. That “reporting” is what we call the “self”. A self always belongs to a society of selves.
Sarvepalli Radhakrishnan believed that science was proving a universal process of evolution at different levels (material, organic, biological, social) whose ultimate goal was to reveal the absolute (the spiritual level). Human consciousness is not the last step in evolution, but will be succeeded by the emergence of a super-consciousness capable of realizing the union with a super-human reality that human science cannot grasp.
Muhammad Iqbal believed that humans are imperfect egos who are striving to reach God, the absolute ego.
However, it was an economist, John Maynard Keynes, who framed the fundamental philosophical problem of the post-industrial state. As citizens no longer need to worry about survival, “man will be faced with his real, permanent problem: how to use his freedom”.
But Karl Jaspers saw existence as a contradiction in terms. In theory humans are free to choose the existence they prefer, but in practice it is impossible to transcend one’s historical and social background. Thus one is only truly free in accepting one’s destiny. Ultimately, we can only glimpse the essence of our own existence; we cannot change it.
The ambition of creating a universal language à la Leibniz to find the solution to all philosophical problems had not died either. Several philosophers, particularly the “logical positivists” (such as Rudolf Carnap and Alfred-Jules Ayer), shed new light on this program. Carnap believed that meaning could be found only in the marriage of science and Frege’s symbolic logic. He believed in the motto “the meaning of a proposition is its method of verification”, which put all the responsibility on the senses. He demoted Philosophy to a second-rate discipline whose only function would be to clarify the “syntax” of the logical-scientific discourse. The problem is that the senses provide a subjective view of the world, and therefore the “meaning” derived from verification is personal, not universal. Also, it is not clear how one can “verify” statements about history. Soon it became clear that even scientific propositions cannot quite be “verified” in an absolute way. Last, but not least, Carnap could not justify the very principle of verification by its own criterion: the principle of verification is itself unverifiable.
Karl Popper clarified that truth is always and only relative to a theory: no definition of absolute truth is possible. The issue is not what is “true”, but what is “scientific”. Popper argued that matching the facts was not enough to qualify as “scientific”: a scientific theory should also provide the means to falsify itself.
Symbolic logic had made huge progress and reached an impressive level of sophistication. The implicit premise of much work on Logic was that the laws of thought are the laws of logic, and vice versa. After Frege, contributions to “axiomatizing” Mathematics and Language had come from Russell, Whitehead and Wittgenstein. David Hilbert interpreted the spirit of his age when he advanced his program of “formal systems”, which, again, was an adaptation of Leibniz’s old dream: devising an automatic procedure such that, by applying a set of rules on a set of axioms, one could prove any possible theorem. A major setback for Hilbert’s program was Kurt Goedel’s theorem of incompleteness (1931): every formal system (that contains arithmetic) also contains at least one proposition that cannot be proven true or false (an unprovable proposition). In other words, there is always an unprovable theorem in every system à la Hilbert. Thus Hilbert’s program appeared to be impossible. Alan Turing (1936) then made precise what a “procedure” is, by means of what came to be called the “Turing Machine” (an imaginary machine, not a physical one). Such a machine is capable of a few elementary operations on symbols (read the current symbol, process it, write a new symbol, move on) and is capable of remembering its own state. One can imagine an infinite number of Turing machines, depending on the rules to manipulate the symbols. Turing then imagined a “universal” machine capable of simulating all possible Turing machines. Turing showed that, given infinite time and infinite memory, such a universal machine could prove any provable theorem, while also proving that no machine can decide in advance whether an arbitrary proposition is provable: Hilbert’s hoped-for decision procedure cannot exist.
Turing did more than settle Hilbert’s program: he introduced a whole new vocabulary. Reasoning had been reduced to computation, which was manipulation of symbols. Thus “thinking” had been reduced to symbol processing. Also, Turing shifted the emphasis from “formulas” to “algorithms”: the Turing Machine executes a series of instructions. Today’s computer is but the physical implementation of a universal Turing machine with a finite memory.
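A Turing machine is simple enough to sketch in a few lines of code. The following is an illustrative toy (the rule format and names are ours, not Turing’s): a transition table maps (state, symbol) to (symbol to write, direction to move, next state), and the machine repeats that one elementary step until it halts.

```python
# A toy Turing machine (illustrative sketch; rule format and names invented).
# The transition table maps (state, symbol) -> (symbol to write, move, next state).

def run(tape, rules, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)    # read the current symbol
        write, move, state = rules[(state, symbol)]
        cells[head] = write                # write the new symbol
        head += 1 if move == "R" else -1   # move the head
    return "".join(cells[i] for i in sorted(cells))

# Example table: sweep right, inverting bits, and halt at the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run("1011", rules))  # -> 0100_
```

A different table yields a different machine; a “universal” machine is simply one whose table interprets another machine’s table supplied on the tape.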
Alfred Tarski found the “truth” that Carnap was looking for. Tarski noticed a subtle but key difference between the fact that “p” is true and the sentence “p is true”. The fact and the sentence are actually referring to two different things, or to the same thing at different levels. The latter is a “meta-sentence”, expressed in a meta-language. The sentences of the meta-language are about sentences of the language. Tarski realized that truth within a theory can be defined only relative to another theory, the meta-theory. In the meta-theory one can define (one can list) all the statements that are true in the theory. Tarski replaced the intuitive notion of “truth” with an infinite series of rules which define truth in a language relative to truth in another language.
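Tarski’s criterion is usually summarized by the “T-schema”, stated in the meta-language (the standard textbook rendering):

$$\text{“}p\text{” is true in } L \iff p$$

For example, the meta-language sentence “ ‘snow is white’ is true if and only if snow is white” mentions an object-language sentence on the left and uses it on the right.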
Ernst Cassirer adopted the view that was coming from Logic: that the human mind is a symbolic system, and that “understanding” the world means turning it into symbols. The difference between animals and humans is that animals live in the world, whereas humans live in a symbolic representation of the world. All cultural artifacts are symbolic forms that mediate between the self and the world.
Alfred Korzybski made a similar distinction between animals and humans. Animals are only hunters and gatherers, activities that are bound to the territory, i.e. they are “space-binders”. Humans, instead, developed agriculture, which is bound to a memory of the past and to prediction of the future, i.e. they are “time-binders”. Time-binding is enabled by the manipulation of symbols, and allows knowledge to be transmitted to other humans.
Peirce’s influence offered a different take on truth and meaning. John Dewey viewed knowledge as a way to generate certainty from doubt (habits from chaos). When faced with an indeterminate situation, humans work out a scientific or common-sense theory of that situation that reduces its indeterminacy. Charles Morris developed a theory of signs (“semiotics”) based on the theories of Peirce and Saussure, and separated three disciplines of signs: “syntax” studies the relation between signs and signs; “semantics” studies the relation between signs and objects; “pragmatics” studies the relation between signs, objects and users.
World War 2 (1939-1945) actually started during the 1930s, when Germany (Austria 1938, Czechoslovakia 1938), Italy (Ethiopia 1936) and Japan (Manchuria 1931, China 1937, Indochina 1940) began expanding their respective territories through a policy of aggression and annexation. Eventually, after Germany invaded Poland and France (1939-1940), the powers of the world fell into two camps: Britain, the USA and Russia (who won) against Germany, Italy and Japan (the “Axis”). World War 2 made explicit the commitment to genocide: Germany slaughtered Jews by the millions in gas chambers, and the USA won the war by detonating the first nuclear weapons (the first practical application of Einstein’s Relativity).
In fact, both shame of the apocalypse (the two world wars just ended) and fear of the apocalypse (the nuclear holocaust) permeated the cultural mood of the era.
In poetry the apocalyptic spirit of the time was captured by Rainer Maria Rilke’s “Duineser Elegien” (1923), William Yeats’ “The Tower” (1928), Czeslaw Milosz’s “Poem of the Stony Time” (1933), Fernando Pessoa’s “Mensagem” (1933), Federico Garcia-Lorca’s “Llanto por Ignacio Sanchez Mejias” (1935), Eugenio Montale’s “La Bufera” (1941), Nazim Hikmet’s “In This Year” (1941), Wallace Stevens’s “Notes Toward A Supreme Fiction” (1942), Thomas-Stearns Eliot’s “Four Quartets” (1942), Juan-Ramon Jimenez’s “La Estacion Total” (1946).
In theater, new forms of expression were invented to deliver the message, from Ernst Toller’s “Masse-Mensch” (1921) and Luigi Pirandello’s “Enrico IV” (1922) to Paul Claudel’s “Le Soulier de Satin” (1928), Jean Giraudoux’s “Electre” (1937) and Bertolt Brecht’s “Leben des Galilei” (1939).
But this was certainly the century of the novel. The spirit of the times was captured by James Joyce’s “Ulysses” (1922), Marcel Proust’s “A la Recherche du Temps Perdu” (1922), Italo Svevo’s “La Coscienza di Zeno” (1923), Francis-Scott Fitzgerald’s “The Great Gatsby” (1925), Andre’ Gide’s “Les Faux-Monnayeurs” (1925), Virginia Woolf’s “To The Lighthouse” (1927), Julien Green’s “Adrienne Mesurat” (1927), Stanislaw Witkiewicz’s “Insatiability” (1930), Louis-Ferdinand Celine’s “Voyage au Bout de la Nuit” (1932), William Faulkner’s “Light in August” (1932), Robert Musil’s “The Man Without Qualities” (1933), Elias Canetti’s “Auto Da Fe” (1935), Jean-Paul Sartre’s “La Nausee” (1938), Flann O’Brien’s “At Swim-Two-Birds” (1939), Joseph Roth’s “Die Legende vom heiligen Trinker” (1939), Mikhail Bulgakov’s “The Master and Margarita” (1940), Albert Camus’s “The Stranger” (1942), Jorge-Luis Borges’ “Ficciones” (1944), Julien Gracq’s “Un Beau Tenebreux” (1945), Hermann Broch’s “Der Tod des Vergil” (1945).
If literature was getting, overall, more “narrative”, painting became more abstract and symbolic with Rene Magritte’s “Faux Miroir” (1928), Salvador Dali’s “La Persistance de la Memoire” (1931), Paul Klee’s “Ad Parnassum” (1932), Pablo Picasso’s “Guernica” (1937), Max Ernst’s “Europe After the Rain II” (1942). Constantin Brancusi and Hans Arp were the giants of sculpture.
Music continued its journey away from the classical canon with Leos Janacek’s “Glagolitic Mass” (1926), Edgar Varese’s “Ionisation” (1933), Alban Berg’s “Violin Concerto” (1935), Bela Bartok’s “Music for Strings, Percussion and Celesta” (1936), Olivier Messiaen’s “Quatuor pour la Fin du Temps” (1941) and Goffredo Petrassi’s “Coro di Morti” (1941).
On the lighter side, new forms of entertainment and mass media were born, mostly in the USA. In 1914 composer Jerome Kern had staged the first “musical”. In 1927 Hollywood debuted the “talking movie” (films with synchronized voice and music). In 1927 Philo Farnsworth invented television.
Cinema was by far the most influential of the new forms of art, thanks to films such as David-Wark Griffith’s “The Birth of a Nation” (1915), Victor Sjostrom’s “Phantom Chariot” (1920), Erich von Stroheim’s “Greed” (1924), Sergei Eisenstein’s “Battleship Potemkin” (1925), Fritz Lang’s “Metropolis” (1926), Josef von Sternberg’s “Der Blaue Engel” (1930), the Marx Brothers’ “Duck Soup” (1933), Charlie Chaplin’s “Modern Times” (1936), Jean Renoir’s “La Grande Illusion” (1937), Howard Hawks’s “Bringing Up Baby” (1938), Orson Welles’s “Citizen Kane” (1941), Frank Capra’s “Meet John Doe” (1941).
But the visual arts also added a new form: the comics. The comics came to compete with the novel and the film, and reached their artistic peak with “Little Nemo” (1905), “Popeye” (1929), “Buck Rogers” (1929), “Tintin” (1929), “Mickey Mouse” (1930), “Dick Tracy” (1931), “Alley Oop” (1933), “Brick Bradford” (1933), “Flash Gordon” (1934), “Li’l Abner” (1934), “Terry and the Pirates” (1934).
America’s contribution to music included Afro-American music: the blues was born around 1912, jazz in 1917, gospel in 1932, rhythm’n’blues in 1942, bebop in 1945.
After World War 2, Stalin’s Soviet Union became an exporter of “revolutions” throughout the world, an ideological empire that had few precedents in history: Mao Tze-tung’s China in 1949, Ho Chi Minh’s Vietnam in 1954, Fidel Castro’s Cuba in 1959, Julius Nyerere’s Tanzania in 1961, Kenneth Kaunda’s Zambia in 1964, Siad Barre’s Somalia in 1969, Mengistu Haile Mariam’s Ethiopia in 1974, Samora Machel’s Mozambique in 1975, Pol Pot’s Cambodia in 1975, Robert Mugabe’s Zimbabwe in 1980, Arap Moi’s Kenya in 1982, etc. The USA retaliated by supporting anti-communist regimes around the world (often as totalitarian as the ones imposed by the communist revolutions). In Latin America, for example, the Soviet Union, via its proxy of Cuba, sponsored a series of national insurrections, while the USA supported “caudillos” that were no more democratic than Hitler (Guatemala 1960, Bolivia 1965, Peru 1970, Chile 1973, Colombia 1979, El Salvador 1980).
Both the Soviet Union and the USA fought for supremacy in what was termed a “Cold War”, a war that was never fought directly but only indirectly, everywhere and all the time. They both became military superpowers by amassing thousands of nuclear weapons. The nuclear deterrence worked insofar as they never struck at each other. But the consequence was that the theater of military operations was the entire planet.
The “Cold War” resulted in a partition of the world into two spheres of influence: Soviet and American. An “iron curtain” divided Europe in two, and the Wall (1961) that divided West and East Berlin was its main symbol.
A parallel process, soon engulfed in the Cold War, was the decolonization of Africa and Asia. Mahatma Gandhi was the most celebrated of the independence leaders. The European powers granted independence to most of their colonies. New countries were born (India and Pakistan in 1947, Israel in 1948, Indonesia in 1949, Ghana in 1957, and most of Africa followed within a decade). The exceptions (Algeria, Angola and the other Portuguese colonies) suffered decade-long wars of independence. Even where independence had been granted, internecine civil wars caused massive convulsions, again exploited by the two superpowers for their power games.
Another by-product of the post-war order was the birth of Arab nationalism with Egyptian leader Gamal Nasser.
The most visible political decline was Britain’s. While its empire was disintegrating and its economy was growing more slowly than those of Germany, Japan and France (which soon passed it in GDP terms), Britain maintained an aloof attitude, reveling in its diversity: it did not join the European Community, it did not adopt the metric system, and its industrial infrastructure grew outdated. It reorganized the empire as the Commonwealth, but that was now a cost, no longer a source of revenues. Despite being the real winner of World War 2, Britain rapidly became irrelevant.
The western European countries, assembled around a USA-led alliance (NATO), were free and democratic (with the exception of the Iberian peninsula) but were nonetheless torn between a socialist left and a capitalist right. De facto, they all adopted different versions of the same model: a social-democratic state that guaranteed rights to workers and sheltered citizens through a generous social safety net.
Among armed conflicts, two were particularly significant: the Arab-Israeli conflict (1948-2004) and the USA-Vietnam war (1964-1973). They both dragged the USA into long and expensive military ventures.
Despite the political gloom, the post-war age was the age of consumerism, of the economic boom (in the USA, Japan and western Europe), of the “baby boom” and of the mass media.
The office was mechanized and electrified thanks to a deluge of calculators, photocopiers, telefax machines, telex machines, touch-tone phones, and, finally, mainframe computers (1964).
Landmarks in communications were the telephone cable across the Atlantic (1956) and the first telecommunication satellite (1962).
Landmarks in transportation were Pan Am’s first transatlantic flight (1939), the long-distance jet (1958) and the wide-body jet (1967).
Commercial television introduced cheap forms of mass entertainment.
The 33-1/3 RPM long-playing vinyl record (1948) and the transistor radio (1954) changed the way people (especially young people) listened to music.
A youth culture began to appear in the USA in the 1950s, initially blasted as a culture of “juvenile delinquents”, and evolved into the generation of the free-speech movement, of the civil-rights movement, of the anti-war movement and of the hippies. It then migrated to Europe, where it transformed into the student riots of 1968.
Rock music was very much the soundtrack of the youth movement. Rock’n’Roll was the music of the young rebels who reacted against the repressive conventions of post-war society. Bob Dylan and the militant folk-singers interpreted young people’s distrust of the Establishment and their idealistic dreams. Psychedelic music was an integral part of the hippie movement (and dramatically changed the concept of “song” by introducing atonal and anarchic elements).
The 1960s were also the age of the sexual revolution and of feminism, announced by Simone de Beauvoir.
The single most emotional event for the collective imagination was space exploration, largely fueled by the rivalry between the USA and the Soviet Union. In 1957 the Soviet Union launched the first artificial satellite, the “Sputnik”. In 1961 Yuri Gagarin became the first human in space. In 1962 the USA launched the first telecommunication satellite, the “Telstar”. In 1969 Neil Armstrong became the first human to set foot on the Moon.
However, progress in Physics was certainly not limited to space. In fact, Physics was booming from the very small to the very large.
Astronomy revealed a whole new world to the peoples of the Earth, who used to believe (just a few thousand years earlier) that the Earth was all there was. There are billions of galaxies in the universe, each made of billions of stars (roughly 200 billion in ours, the Milky Way), and planets orbit the stars (nine, by the count of the time, around ours, the Sun). Pluto, the outermost of the solar planets, turned out to be 5.9 billion km from the Sun, a distance that no human could hope to cover during a lifetime. Distances were suddenly measured in “light-years”, one light-year being about 9.5 trillion km, a distance that would have been unimaginable just a century before. The nearest star is “Alpha Centauri”, 4.3 light-years from the Earth. Sirius, the brightest star in the sky, is actually 8.7 light-years away. The center of the Milky Way is 26 thousand light-years from the Sun. Andromeda, the nearest large galaxy, is 2.2 million light-years away.
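The arithmetic behind the new unit is simple: light travels at about 300,000 km per second, and a year lasts about 31.6 million seconds, so

$$1\ \text{light-year} \approx (3\times10^{5}\ \text{km/s}) \times (3.16\times10^{7}\ \text{s}) \approx 9.5\times10^{12}\ \text{km}.$$

By this yardstick even Pluto, at 5.9 billion km, is a mere five and a half light-hours away.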
In 1965 the “microwave background radiation” was discovered, a remnant of some catastrophic event far back in the past of the universe. That event was named the “Big Bang”: the universe was born when a massive explosion sent the original energy hurling away in all directions. Eventually, gravitation caused pieces of matter to coalesce, thus forming the structures that we observe today (galaxies, stars, planets), leaving behind the background radiation and the expansion of the universe that is still going on. Depending on how much mass there is in the universe, this expansion may some day be reversed (and end in a “Big Crunch”) or continue forever. Cosmologists also realized that there are different kinds of “stars”. Some are very small and very heavy, and spin frantically around their axis (“pulsars”). Some have collapsed into “black holes”, bodies whose gravitational field is so strong that nothing can escape them, not even light. Inside a black hole, time and space in effect switch roles: an object must move forward in space (towards the center of the black hole) as inexorably as, outside, it must move forward in time.
The oddities of Cosmology fueled a boom in science fiction (comics, films, novels, tv series).
As for the “very small”, the view of matter as made of three particles (electron, proton and neutron) was shattered by the discovery of a multitude of subatomic particles. The radioactive decay of atomic nuclei, first observed in 1896 by Antoine Becquerel, Pierre Curie and Marie Curie, had already signaled the existence of a fourth fundamental force (the “weak” force) in addition to the known three (gravitational, electromagnetic, and nuclear or “strong”). Wolfgang Pauli in 1930 inferred the existence of the neutrino to explain a particular case of radioactive decay. Another source of new particles was the study of “cosmic rays”, which Victor Franz Hess had identified as radiation coming from space (1912). This led to the discovery of muons (1937) and pions (predicted in 1935 by Yukawa Hideki). In 1963 Murray Gell-Mann hypothesized that the nucleus of the atom was made of smaller particles. In 1967 the theory of quarks (Quantum Chromodynamics) debuted: the nucleus of the atom (neutrons and protons) is made of quarks, which are held together by gluons. Quarks differ from previously known particles because their magic number is “three”, not two: there are six “flavors” of quark, each coming in three “colors” (and each having, as usual, its anti-quark), and they combine not in pairs but in trios.
Forces are mediated by discrete packets of energy, represented as virtual particles or “quanta”. The quantum of the electromagnetic field (e.g., of light) is the photon: any electromagnetic phenomenon involves the exchange of a number of photons between the particles taking part in it. Other forces are defined by other quanta: the weak force by the W particle, gravitation (supposedly) by the graviton and the nuclear force by gluons. Particles can be divided according to a principle first formulated (in 1925) by Wolfgang Pauli: some particles (the “fermions”, named after Enrico Fermi) never occupy the same state at the same time, whereas other particles (the “bosons”, named after Satyendra Bose) do. The wave functions of two fermions can never completely overlap, whereas the wave functions of two bosons can completely overlap (the bosons basically lose their identity and become one). Fermions (the electron and its family, the “leptons”, as well as the quarks that compose “hadrons” such as protons and neutrons) make up the matter of the universe, while bosons (photons, gravitons, gluons) are the virtual particles that glue the fermions together. Bosons therefore represent the forces that act on fermions. They are the quanta of interaction. An interaction is always implemented via the exchange of bosons between fermions. (There exist particles that are bosons but do not represent interactions, the so-called “mesons”, which are made of quarks and decay very rapidly.)
There are twelve leptons: the electron, the muon, the tau, their three neutrinos, and their six anti-particles. There are 36 quarks: six flavors, each in three colors, plus the corresponding anti-quarks. Thus there are 4 forces, 36 quarks, 12 leptons, 12 bosons.
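The bookkeeping, spelled out:

$$6\ \text{flavors} \times 3\ \text{colors} \times 2\ (\text{quark or anti-quark}) = 36, \qquad (3\ \text{charged leptons} + 3\ \text{neutrinos}) \times 2 = 12.$$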
Science was even applied to life itself.
Ilya Prigogine developed “Non-equilibrium Thermodynamics” to explain phenomena far from equilibrium, such as life itself. He divided nature into “conservative” systems (the ones studied by classical Physics) and “dissipative” systems (subject to fluxes of energy/matter), and noticed that the latter are ubiquitous in nature: everything that is alive is a dissipative system. They create order by feeding on external energy/matter: they are non-equilibrium systems that are sustained by a constant influx of matter/energy. He realized that such systems exhibit spontaneous development of order: they are self-organizing systems, which maintain their internal organization by trading matter/energy with the environment.
James Jerome Gibson looked at life from the point of view of a network of integrated living beings. A living being does not exist in isolation. In fact, its main purpose is to pick up information from the environment. All the information needed to survive is available in the environment. Thus information originates from the interaction between the organism and its environment. Information “is” the continuous energy flow of the environment.
The other great fascination was with computers, which had been developed during the war to crack secret German codes. In 1946 the first general-purpose electronic computer, the “Eniac”, was unveiled. In 1947 William Shockley’s team at Bell Labs invented the transistor. In 1951 the first commercial computer was built, the “Univac”. In 1955 John McCarthy founded “Artificial Intelligence”, a discipline to study whether “intelligent” machines could ever be built. In 1957 John Backus (at IBM) unveiled the FORTRAN programming language, the first machine-independent language. In 1958 Jack Kilby (at Texas Instruments) and in 1959 Robert Noyce (at Fairchild) invented the integrated circuit, or “microchip”, which made it possible to build smaller computers. In 1964 IBM introduced its “operating system” for mainframe computers (the “OS/360”). In 1965 Gordon Moore predicted that the processing power of computers would double every 18 months. Also in 1965 DEC introduced the first mini-computer, the PDP-8, which used integrated circuits.
Genetics rapidly became the most exciting field in Biology. Each living cell contains deoxyribonucleic acid (DNA for short), identified in 1944 by Oswald Avery as the carrier of genetic information; and, in 1953, Francis Crick and James Watson figured out the double-helix structure of the DNA molecule: genetic information is encoded in a rather mathematical form, which was christened the “genetic code” because that is what it is, a code written in an alphabet of four “letters” (physically, the four bases of the nucleotides). Crick reached the conclusion that information flows only from the (four-letter) nucleic acids of DNA to the (twenty-letter) amino acids of proteins, never the other way around. In other words: the genes encoded in DNA determine the organism. An organism owes its structure to its “genome”, its repertory of genes. It took a few more years for biologists to crack the genetic code, i.e. to figure out how the four-letter language of DNA is translated into the twenty-letter language of proteins. Biologists also discovered ribonucleic acid (RNA), the single-strand molecule that partners with DNA to manufacture proteins.
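The combinatorics of the code explains why it had to be redundant: three-letter “words” (codons) drawn from a four-letter alphabet give 4 × 4 × 4 = 64 possibilities for only about twenty amino acids. The sketch below illustrates the translation scheme with a small excerpt of the real codon table (a handful of entries out of the full 64):

```python
# How the 4-letter DNA alphabet maps to amino acids: read triplets (codons)
# until a stop codon. CODON_TABLE is a small excerpt of the real table.
CODON_TABLE = {
    "ATG": "Met", "TGG": "Trp",
    "GCT": "Ala", "GCC": "Ala", "GCA": "Ala", "GCG": "Ala",  # redundancy
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Read a DNA string three letters at a time, stopping at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i+3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return "-".join(protein)

print(translate("ATGGCTGCCTAA"))  # -> Met-Ala-Ala
```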
Less heralded, but no less powerful in changing our view of our species, was the progress made by neurologists in understanding how the brain works. The neuron had been identified in 1891, building on the work of Santiago Ramon y Cajal, and in 1898 Edward Thorndike had already proposed that the brain was a “connectionist” system (that the connections, not the units, were the key to its working). But the picture remained fuzzy until Donald Hebb showed (1949) that those connections were dynamic, not static, and that they changed according to a system of punishment and reward, or “selectively”: a connection that was used to produce useful behavior was reinforced, one that was part of a failure was weakened. As new techniques allowed neurologists to examine the electrical and chemical activity of the brain, it became clear that neurons communicate via “neurotransmitters”. A neuron is nothing more than a generator of impulses, activated when the sum of its inputs (the neurotransmitters received from other neurons, weighted according to the “strength” of the corresponding connections) exceeds a certain potential. The connections between neurons are continuously adjusted to improve the accuracy of the brain’s responses. Basically, each operation of recognition is also an operation of learning, because connections are refined every single time they are used. The structure of the brain also became clearer, in particular the fact that there are two hemispheres, the left being dominant for language and speech, the right being dominant for visual and motor tasks.
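The neuron just described reduces to a few lines of arithmetic. This is a deliberately toy rendering of Hebb’s idea, with invented numbers: a unit fires when the weighted sum of its inputs exceeds a threshold, and the connections that were active during a firing are reinforced.

```python
# Toy neuron: fires when the weighted sum of inputs exceeds a threshold;
# connections active during a firing are strengthened (Hebb's rule, roughly).
threshold = 1.0
weights = [0.4, 0.6, 0.2]   # invented connection strengths

def fires(inputs):
    return sum(w * x for w, x in zip(weights, inputs)) > threshold

def hebbian_update(inputs, rate=0.1):
    """Strengthen the connections that were active when the neuron fired."""
    if fires(inputs):
        for i, x in enumerate(inputs):
            weights[i] += rate * x

hebbian_update([1, 1, 0])   # 0.4 + 0.6 = 1.0, not above threshold: no firing
hebbian_update([1, 1, 1])   # 1.2 > 1.0: fires, active connections reinforced
print(weights)              # -> approximately [0.5, 0.7, 0.3]
```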
Michel Jouvet discovered that REM (“rapid eye movement”) sleep is generated in the pontine brain stem (or “pons”). The pons sends signals to the eye muscles (causing the eye movement), to the midbrain (causing a low level of brain activity and inhibition of muscle movements), and to the thalamus. The thalamus then excites the cortex, which receives a valid sensory signal from the thalamus and interprets it as if it were coming from the sense organs. During REM sleep several areas of the brain are working frantically, and some of them are doing exactly the same job they do when the brain is awake. The only major difference is that the stimuli they process now come from an internal source rather than from the environment: during dreams the sensory input is generated inside the brain itself, by the brain stem via the thalamus.
The obsession with Alan Turing’s abstract “machine” and with the first concrete computers shaped the intellectual landscape of many thinkers. Turing had argued that the computer was, basically, an “intelligent” being; John Von Neumann, with his embryonic experiments on artificial life or “cellular automata” (1947), had also shown how such a machine could be made to reproduce and evolve.
It was Turing himself who framed the philosophical issue for the next generations, with what came to be known as the “Turing test” (1950): a machine can be said to be intelligent if a human being, asking all sorts of questions, cannot tell whether the answers come from a human being or from a machine. It was, ultimately, a “behaviorist” approach to defining intelligence: if a machine behaves exactly like a human being, then it is as intelligent as the human being.
Kenneth Craik viewed the mind as a particular type of machine which is capable of building internal models of the world and of processing them to produce action. Craik’s emphasis was on the internal representation and on the symbolic processing of such a representation.
Craik’s ideas formed the basis for Herbert Simon’s and Allen Newell’s theory of mind, according to which the human mind is but a “physical symbol processor”. The implication was that the computer was, indeed, intelligent: it was just a matter of programming it the way the mind is programmed. They proceeded to implement a “General Problem Solver”, a computer program that, using logic, would be able to solve any possible problem (Hilbert’s dream).
The implicit assumption behind the program of “Artificial Intelligence” was that the “function” is what matters: the “stuff” (brains or integrated circuits) is not important.
Hilary Putnam argued that the same mental state can be realized in more than one physical state; for example, pain can be realized by more than one brain (despite the fact that all brains are different). Therefore, the physical state is not all that important. It is the “function” that makes a physical state of the brain also a mental state. Mental states are not mere decorations: they have a function. The consequence of this conclusion, though, is that a mind does not necessarily require a brain. In fact, a computer does precisely what a mind does: perform functions that can be implemented by different physical states (different hardware). The “functionalist” approach popularized the view that the mind is the software and the brain is its hardware. The execution of that program (the mind) in a hardware (brain or computer) yields behavior.
Jerry Fodor speculated that the mind represents knowledge in terms of symbols, and then manipulates those symbols to produce thought. The manipulation of those symbols is purely syntactic (without knowing what those symbols “mean”). The mind uses a “language of thought” (or “mentalese”), common to all sentient beings, and produced through evolution, to build those mental representations.
The program of Artificial Intelligence was not as successful as its pioneers hoped because they neglected the importance of knowledge. An “intelligent” system is only capable of performing logical operations, no matter how many and how smart; but, ultimately, even the most intelligent human being in the world needs knowledge to make sensible decisions. In fact, a person with a lot of knowledge is likely to make a more sensible decision than a much more clever person with very little knowledge. Thus the primacy shifted from “intelligence” to “knowledge”: “Expert Systems” (first conceived around 1965) apply a “general problem solver” to a “knowledge base”. The knowledge base is built by “cloning” a human expert (usually via a lengthy process of interviews). A knowledge base encodes facts and rules that are specific to the “domain” of knowledge of the human expert. Once the appropriate knowledge has been “elicited”, the expert system behaves like a human expert.
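A minimal sketch of the expert-system idea, with invented medical-flavored rules: the “knowledge base” is a list of if-then rules, and the inference engine simply keeps firing rules until no new facts appear (so-called forward chaining).

```python
# Forward chaining over a toy knowledge base. Rules and facts are invented
# for illustration; a real expert system would have hundreds of rules.
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected", "short_breath"}, "see_doctor"),
]

def forward_chain(facts):
    facts = set(facts)
    changed = True
    while changed:                 # keep going until no rule adds a new fact
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)   # the rule "fires"
                changed = True
    return facts

print(forward_chain({"fever", "cough", "short_breath"}))
# -> includes "flu_suspected" and "see_doctor"
```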
In parallel to the “knowledge-based” school, the search for machine intelligence pursued other avenues as well.
Norbert Wiener noticed that both living systems and machines are “control systems”, systems in which “feedback” is employed to maintain internal “homeostasis” (a steady state). A thermostat is a typical control system: it senses the temperature of the environment and directs the heater to switch on or off; this causes a change in the temperature, which in turn is sensed by the thermostat; and so forth. Every living system is also a control system. Both living systems and machines are “cybernetic” systems. The “feedback” that allows a system to control itself is, ultimately, an exchange of information between the parts of the system.
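The thermostat loop can be written down directly. This toy simulation (all numbers invented) shows the negative feedback at work: the sensed temperature decides the action, the action changes what is sensed next, and the system settles near its steady state.

```python
# Toy thermostat: the room drifts toward cold unless heated. Invented physics.
setpoint, temperature = 20.0, 15.0

for minute in range(10):
    heater_on = temperature < setpoint            # negative-feedback decision
    temperature += 1.0 if heater_on else -0.5     # effect of the action
    print(f"minute {minute}: {temperature:.1f} degrees, "
          f"heater {'on' if heater_on else 'off'}")
# The temperature climbs to the setpoint and then oscillates around it:
# homeostasis maintained by the feedback loop.
```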
Claude Shannon worked out a no less influential metaphor for machines: they are also similar to thermodynamic systems. The entropy of a thermodynamic system is a measure of disorder, i.e. a measure of the random distribution of atoms. As entropy increases, that distribution becomes more homogeneous. The more homogeneous the distribution is, the less “informative” it is. Therefore, entropy is also a measure of the lack of information.
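In Shannon’s standard formulation, a source that emits symbols with probabilities $p_i$ has entropy

$$H = -\sum_i p_i \log_2 p_i \quad \text{bits per symbol.}$$

A fair coin yields $H = 1$ bit per toss; a coin biased 90/10 yields only about 0.47 bits; a perfectly predictable source yields 0, telling the receiver nothing new.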
Yet another version of the facts was delivered by the proponents of “Neural Networks”.
An artificial “neural network” is a piece of software or hardware that simulates the neural network of the brain. Several simple units are connected together, with each unit connecting to any number of other units. The “strength” of the connections can fluctuate from zero strength to infinite strength. Initially the connections are set randomly. During a “training” period, the network is made to adjust the strength of the connections using some kind of feedback: every time an input is presented, the network is told what the output should be and adjusts its connections accordingly. The network continues to learn forever, as each new input causes a readjustment of the connections.
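Such a training loop is short enough to write out in full. The toy below (a single-layer network with invented parameters, learning the logical OR function) follows the recipe just described: random initial connections, then repeated adjustment driven by the error between actual and desired output.

```python
# A single-layer network trained on logical OR. All parameters invented.
import random

weights = [random.uniform(-1, 1) for _ in range(2)]   # random connections
bias = random.uniform(-1, 1)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

for epoch in range(50):                       # the "training period"
    for inputs, target in data:
        activation = sum(w * x for w, x in zip(weights, inputs)) + bias
        output = 1 if activation > 0 else 0
        error = target - output               # the feedback signal
        for i, x in enumerate(inputs):
            weights[i] += 0.1 * error * x     # strengthen/weaken connections
        bias += 0.1 * error

for inputs, target in data:
    out = 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0
    print(inputs, "->", out)                  # matches the OR truth table
```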
The difference between “expert systems” and “neural networks” is actually quite ideological. Expert systems operate at the level of knowledge, whereas neural networks operate at the level of connections. In a way, they describe two different ways to look at human intelligence: as a brain that produces intelligent behavior, and as a mind that produces intelligent decisions.
Needless to say, the mind-body problem, originally introduced by Descartes, was revived by the advent of the computer.
Dualists largely extended an intuition by Charlie-Dunbar Broad, that the universe is a series of layers, and that each layer yields the following layer but cannot explain the new properties that emerge with it. For example, the layer of elementary particles yields the layer of macroscopic phenomena. Each new layer is an “emergent” phenomenon of a lower layer. Thus the mind is an emergent property of the brain, and not a separate substance. The new dualism was a dualism of properties, not a dualism of substances. Dualism was restated as “supervenience”. Biological properties “supervene” (or “are supervenient”) on physical properties, because the biological properties of a system are determined by its physical properties. By the same token, mental properties are supervenient on neural properties.
Herbert Feigl revived materialism in the age of the brain: the mind is created by the neural processes in the brain. We have not yet explained how this happens, just as for centuries humans could not explain lightning or magnetism. Nonetheless, mental states “are” physical states of the brain. Philosophers such as Donald Davidson realized that it is implausible to assume that for every single mental state there is a unique physical state of the brain (a one-to-one correspondence). For example, a person can have the same feeling twice, despite the fact that the configuration of the brain has changed. Thus, Feigl’s “identity theory” was revised to admit that many physical states of the brain may yield the same mental state.
Another form of materialism (“eliminative” materialism) originated with Paul Feyerabend and Richard Rorty, who believed that mental states do not exist. The mental world is only a vocabulary of vague terms that don’t refer to real entities. The mind-body dualism is a false problem that leads to false problems.
By that line of reasoning, Gilbert Ryle revived behaviorism in the context of the mind-body problem: the mental vocabulary does not refer to the structure of something, but simply to the way somebody behaves or will behave. The mind “is” the behavior of the body. Descartes invented a myth: the mind inside the body (“the ghost in the machine”).
Existentialism was the big philosophical “ism” of Post-war Europe. Existentialists focused on the human experience. Theirs was a philosophy of the crisis of values. The object and the subject of Existentialism are the same: the “I”.
Jean-Paul Sartre believed that there is no God. The individual is alone. There is no predestination (no “human nature”) determining our actions. We are free to act as we will. It is our actions that determine our nature. Existence (the free I) precedes essence (the I’s nature). In the beginning, the individual is nothing. Then she defines herself by her actions. Each individual is fully responsible for what she becomes. This total freedom causes angst. It is further amplified by the fact that an individual’s choices affect the whole of humankind. Existentialism abolishes God, but emphasizes that its atheism increases (not decreases) the individual’s responsibility for her actions. It complicates, not simplifies, the moral life. “We are alone, with no excuses”.
Maurice Merleau-Ponty countered that human freedom is never total: it is limited by our body. The individual is, first and foremost, a “situated” being, a body that lives in an environment. The body is not just an object surrounded by objects: it is the very subject of experience, which interacts with the environment. The body shapes the environment, but, in turn, the environment shapes the body, whose freedom is therefore limited by the way the environment shapes it. The same conditioning exists in society: the body is a linguistic actor, but its linguistic action is constrained by the language it uses (the meaning of a linguistic action is constructed on the basis of a meaning acquired from the language). Ditto at the level of society: we are political agents, but our political actions are shaped by the historical background. At all levels, there are a “visible” and an “invisible” dimension of being that continuously affect each other.
A number of thinkers related to the zeitgeist of Existentialism even if they did not belong to the mainstream of it.
David Bohm pursued Einstein’s hope of finding “hidden variables” that would remove randomness from Quantum Theory. Bohm hypothesized the existence of a potential that permeates the universe. This potential, which lies beyond the four-dimensional geometry of space-time, generates a field that acts upon particles the same way a classical potential does. This field can be expressed as the mother of all wave functions, a real wave that guides the particle (the “pilot-wave”). This field is, in turn, affected by all particles: everything in the universe is entangled in everything else. The universe is an undivided whole in constant flux. Similarly, at the higher dimension (the “implicate order”) there is no difference between matter and mind. That difference arises within the “explicate order” (the conventional space-time of Physics). As we travel inwards, we travel towards that higher dimension, the implicate order, in which mind and matter are the same. As we travel outwards, we travel towards the explicate order, in which subject and object are separate. Mind and matter can never be completely separated because they are entangled in the same quantum field. Thus every piece of matter has a rudimentary mind-like quality.
Aurobindo Ghose speculated that Brahman first involutes (focuses on itself), then materializes (the material universe), and then evolves into consciousness. We are part of this process, which is still going on. Human consciousness is the highest stage of consciousness so far reached by Brahman, but not the last one, as proven by the fact that social, cultural and individual life in the human world are still imperfect.
Sayyid Qutb, the philosopher of militant Islam, lived in the dream of a purified world dedicated to the worship of God alone. Human relationships should be founded on the belief in the unity of God. Pagan ignorance (for example, of Christians and Jews) is the main evil in the world, because it rebels against God’s will and establishes secular societies that violate God’s sovereignty on Earth. The separation of church and state is “the” problem.
Pierre Teilhard de Chardin saw evolution as a general law of nature: the universe’s matter-energy is progressing towards ever increased complexity. Humans mark the stage when evolution leaves the “biosphere” and enters the “noosphere” (consciousness and knowledge). The evolution of the noosphere will end in the convergence of matter and spirit into the “omega point”.
Throughout the second half of the century, Structuralism was one of the dominant paradigms of philosophy; its program was to uncover the real meaning hidden in a system of signs.
Claude Levi-Strauss extended it to social phenomena, which he considered systems of signs just like language. Myths from different cultures (myths whose contents are very different) share similar structures. Myth is a language, made of units that are combined together according to certain rules. The “langue” is the myth’s timeless meaning, the “parole” is its historical setting. The “mytheme” is the elementary unit of myth. Mythemes can be read both diachronically (the plot that unravels, the sequence of events) and synchronically (the timeless meaning of it, the “themes”). The themes of myths are binary relationships between two opposing concepts (e.g., between selfishness and altruism). Binary logic is, in a sense, the primordial logic, and mythical thinking is, in a sense, logical thinking. Mythical thinking is inherent to the human mind: it is the human way of understanding nature and the human condition. Conversely, myths are tools that we can use to find out how the human mind works.
Roland Barthes transformed Saussure’s Structuralism into “semiology”, a science of signs to unveil the meaning hidden in the “langue” of cultural systems such as cinema, music, art.
Structuralism often reached provocative conclusions that had social and political implications.
Michel Foucault analyzed the mechanisms of western (liberal, democratic) society. Western society jails fools who, in ancestral societies, were tolerated or even respected as visionaries. Foucault found disturbing this tendency to criminalize the creative force of madness. Basically, western societies torture the minds of criminals, whereas totalitarian societies tortured their bodies. Prisons are, de facto, an instrument of social control, a device to train minds that do not comply with the dogmas. Thus western societies are vast mechanisms of repression, no less oppressive than the totalitarian regimes they replaced. Similar arguments can be made for sexuality and crime.
Jacques Lacan analyzed the unconscious as a system of signs. Motives are signifiers which form a “signifying chain”: the subconscious “is” that chain. This chain is permanently unstable because it does not refer to anything: the self itself is a fiction of the subconscious. A baby is born with a united psyche; later in life, as the baby separates from the mother, that unity is broken and the self is born; and the rest of one’s lifetime is spent trying to reunite the self and the other. Psychic life is a permanent struggle between two “consciousnesses”.
Dilthey’s Hermeneutics was also influential. Hans-Georg Gadamer applied it to Husserl’s Phenomenology and derived a discipline of “understanding”, understood as the “fusion of horizons” between a past text and a present interpreter.
Paul Ricoeur believed that the symbols of pre-rational culture (myth, religion, art, ideology) hide meaning that can be discovered by interpretation. There are always a patent and a latent meaning. A similar dichotomy affects human life, which is torn between the “voluntary” and the “involuntary” dimensions, between the “bios” (one’s spatiotemporal life) and the “logos” (one’s ability to grasp universal spacetime). He made a distinction between language and discourse: language is, indeed, only a system of signs, and therefore timeless, but discourse always occurs at some particular moment of time, i.e. it depends on the context. A language is a necessary condition for communication, but it itself does not communicate: only discourse communicates. The signs in a language system refer only to other signs in it, but discourse refers to a world. Discourse has a time dimension that is due to the merging of two different kinds of time: cosmic time (the uniform time of the universe) and lived time (the discontinuous time of our life’s events). Historical time harmonizes these two kinds of time.
The debate on language proceeded in multiple directions. Wilfrid Sellars conceived a sort of linguistic behaviorism: thoughts are to the linguistic behavior of linguistic agents what molecules are to the behavior of gases.
Roman Jakobson, the leading exponent of “formalism”, decomposed the act of communication into six elements: a message is sent by an addresser to an addressee, and the two share a common code, a physical channel and a context. These elements reveal that communication performs many functions in one.
The speculation on language culminated with Noam Chomsky’s studies on grammar. Chomsky rephrased Saussure’s dichotomy of langue and parole as competence and performance: we understand sentences that we have never heard before, thus our linguistic competence exceeds our linguistic performance. In fact, the number of sentences that we can “use” is potentially infinite. Chomsky concluded that what we know is not the infinite set of sentences of the language, but only a finite system of rules that defines how to build sentences. We know the “grammar” of a language. Chomsky separated syntax from semantics: a sentence can be “well-formed” without being meaningful (e.g., “the apple took the train”). In doing so, Chomsky reduced the problem of speaking a language to a problem of formal logic (because a grammar is a formal system). Chomsky realized that it was not realistic to presume that one learns a grammar from the sentences that one hears (a fraction of all the sentences that are possible in a language). He concluded that human brains are designed to acquire a language: they are equipped at birth with a “universal grammar”. We speak because our brain is meant to speak. Language “happens” to a child, just like growth. Chomsky’s universal grammar is basically a “linguistic genotype” that all humans share.
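A grammar in Chomsky’s sense is literally a finite rule table that generates an unbounded set of sentences. The toy context-free grammar below (rules and vocabulary invented for illustration) even produces well-formed nonsense like “the apple took the train”, making the syntax/semantics point concrete:

```python
# A toy context-free grammar: finitely many rules, infinitely many sentences
# (the recursive NP rule allows unbounded nesting). Vocabulary is invented.
import random

grammar = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],   # recursion -> infinity
    "VP": [["V", "NP"], ["V"]],
    "N":  [["apple"], ["train"], ["linguist"]],
    "V":  [["took"], ["saw"], ["sleeps"]],
}

def generate(symbol="S"):
    if symbol not in grammar:               # terminal word: emit it
        return [symbol]
    production = random.choice(grammar[symbol])
    return [word for part in production for word in generate(part)]

print(" ".join(generate()))   # e.g. "the apple took the train":
                              # well-formed, not necessarily meaningful
```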
As Sellars had already noted, Chomsky’s analysis of the structure of language was not enough, though, to explain the phenomenon of language among humans. John-Langshaw Austin argued that the function of sentences is not so much to describe the state of the world as to cause action in the world. He classified a speaker’s “performative” sentences (requests, promises, orders, etc) based not on their structure but on their “effect” on the listener. We speak for a reason. “Pragmatics” is the study of “speech acts”. A speech act is actually made up of three components: a “locutionary” act (the words employed to deliver the utterance), an “illocutionary” act (the type of action that it performs, such as commanding, promising, asking) and a “perlocutionary” act (the effect that the act has on the listener, such as believing or answering).
There is more to a sentence than its meaning: a sentence is “used” for a purpose. Paul Grice realized that speech acts work only if the listener cooperates with the speaker, and the speaker abides by some common-sense rules: the speaker wants to be understood and to cause an action, and the listener makes this assumption in trying to understand the speaker’s purpose. Grice believed that some “maxims” help the speaker say more than the words she is actually uttering: those maxims constitute implicit knowledge that the listener uses in order to grasp the purpose of the utterance. Language has meaning to the extent that some conventions hold within the linguistic community.
The intimidating progress of Science caused a backlash of sorts among philosophers, who disputed Science’s very foundations. After all, scientific hypotheses cannot be tested in isolation from the whole theoretical network within which they are formulated.
Aleksandr Koyre’ and Gaston Bachelard had already noted that scientific progress is not linear: it occurs in spurts. Thomas Kuhn formalized that intuition with the concept of “paradigm shifts”. At any point in time the scientific community agrees on a scientific paradigm. New evidence tends to be accommodated in the ruling paradigm. When the ruling paradigm collapses because of some evidence that cannot be accommodated, then a paradigm shift occurs. A paradigm shift results in a different way of looking at the world, analogous to a religious conversion. Scientific revolutions are, ultimately, linguistic in character. Thus the truth of a theory does not depend exclusively on the correspondence with reality. The history of science is the history of the transformations of scientific language.
Willard Quine argued that a hypothesis can be verified true or false only relative to some background assumptions, a condition that rapidly becomes recursive: each statement in a theory partially determines the meaning of every other statement in the same theory. One builds a “web of beliefs”, and each belief in the web is based on some other beliefs of the same web. Each belief contributes to support the entire web and is supported by the entire web. The web as a whole fits the requirements of Science. But there might be several such webs that would work as well: scientific theories are “underdetermined” by experience. It is the same situation as with language: there are always many (potentially infinite) interpretations of a discourse depending on the context. A single word has no meaning: its meaning is always relative to the other words that it is associated with. The meaning of a sentence depends on the interpretation of the entire language. Its meaning can even change in time. For example, it is impossible to define what a “correct” translation of a sentence is from one language to another, because that depends on the interpretations of both entire languages. Translation from one language to another is indeterminate. Translation is possible only from the totality of one language to the totality of another language.
Another strong current of thinkers was the Marxist one, which frequently spent more time criticizing capitalism than heralding socialism.
Juergen Habermas added an element that was missing from Marx’s “materialistic” treatment of society: social interaction, the human element. Societies rely both on labor (instrumental action) and socialization (communicative action). What we are witnessing is not so much alienation but a crisis of institutions that manipulate individuals. Communicative Action, not the revolution of the proletariat, can transform the world and achieve a more humane and just society based on free and unconditioned debate among equal citizens.
Herbert Marcuse analyzed the operation of mass societies and concluded that they seduce the citizens with the dream of individual liberty only to enslave them in a different way. The only true revolution is emancipation from the economic loop that enslaves us. Such a revolution would bring about an ideal state in which technology is used to provide individual happiness, not surplus.
Theodor Adorno warned that reason has come to dominate not only nature, but also humanity itself, and therefore Western civilization is moving towards self-destruction. For example, mass-culture industries manipulate the masses into cultivating false needs.
Cinema was probably the most faithful interpreter of the times through its well-established genres: Akira Kurosawa’s “Rashomon” (1950), Billy Wilder’s “Sunset Boulevard” (1950), Vittorio De Sica’s “Miracle in Milan” (1951), Kenji Mizoguchi’s “Ugetsu Monogatari” (1953), Yasujiro Ozu’s “Tokyo Monogatari” (1953), Elia Kazan’s “On The Waterfront” (1954), Ingmar Bergman’s “Seventh Seal” (1956), John Ford’s “The Searchers” (1956), Don Siegel’s “Invasion of the Body Snatchers” (1956), Alfred Hitchcock’s “North By Northwest” (1959), Jean-Luc Godard’s “Breathless” (1959), Federico Fellini’s “La Dolce Vita” (1960), John Huston’s “The Misfits” (1961), Robert Aldrich’s “Hush Hush Sweet Charlotte” (1965), Michelangelo Antonioni’s “Blow-Up” (1966), Luis Bunuel’s “Belle de Jour” (1967), Roman Polanski’s “Rosemary’s Baby” (1968), Stanley Kubrick’s “2001: A Space Odyssey” (1968), Sergio Leone’s “Once Upon a Time in The West” (1968), Sam Peckinpah’s “The Wild Bunch” (1969).
Music moved further away from the tradition of consonant music with John Cage’s “Concerto for Prepared Piano” (1951), Pierre Boulez’s “Le Marteau Sans Maitre” (1954), Luigi Nono’s “Canto Sospeso” (1956), Karlheinz Stockhausen’s “Gesang der Junglinge” (1956), Iannis Xenakis’s “Orient Occident” (1960), Benjamin Britten’s “War Requiem” (1962), Krzysztof Penderecki’s “Passio Secundum Lucam” (1965), Luciano Berio’s “Sinfonia” (1968).
Poetry explored a much broader universe of forms: Pablo Neruda’s “Canto General” (1950), Carlos Drummond de Andrade’s “Claro Enigma” (1951), Paul Celan’s “Mohn und Gedaechtnis” (1952), George Seferis’s “Emerologio Katastromatos” (1955), Yannis Ritsos’s “Moonlight Sonata” (1956), Pier Paolo Pasolini’s “Le Ceneri di Gramsci” (1957), Ezra Pound’s “Cantos” (1960), Vladimir Holan’s “A Night with Hamlet” (1964), Vittorio Sereni’s “Gli Strumenti Umani” (1965), Andrea Zanzotto’s “La Beltà” (1968).
Fiction was the most prolific of the literary genres: Cesare Pavese’s “La Luna e i Falo’” (1950), Elsa Morante’s “L’Isola di Arturo” (1957), Italo Calvino’s “Il Barone Rampante” (1957), Carlo-Emilio Gadda’s “La Cognizione del Dolore” (1963), Alejo Carpentier’s “Los Pasos Perdidos” (1953), Jose Donoso’s “Coronacion” (1957), Gabriel Garcia Marquez’s “Cien Años de Soledad” (1967), Malcolm Lowry’s “Under the Volcano” (1947), William Gaddis’ “The Recognitions” (1955), Wilson Harris’ “Palace of the Peacock” (1960), Anthony Burgess’s “A Clockwork Orange” (1962), Janet Frame’s “Scented Gardens for the Blind” (1963), Saul Bellow’s “Herzog” (1964), John Barth’s “Giles Goat-Boy” (1966), Yukio Mishima’s “The Temple of the Golden Pavilion” (1956), Boris Pasternak’s “Doctor Zhivago” (1957), Witold Gombrowicz’s “Pornography” (1960), Gunther Grass’ “Die Blechtrommel” (1959), Thomas Bernhard’s “Verstoerung” (1967), Elias Canetti’s “Auto da Fe” (1967), Raymond Queneau’s “Zazie dans le Metro” (1959), Julio Cortazar’s “Rayuela” (1963), Carlos Fuentes’s “La Muerte de Artemio Cruz” (1962), Jorge Amado’s “Dona Flor” (1966), Kobo Abe’s “The Woman in the Dunes” (1962), Kenzaburo Oe’s “The Silent Cry” (1967), Patrick White’s “Voss” (1957).
Theatre built upon the innovations of the first half of the century: Tennessee Williams’ “A Streetcar Named Desire” (1947), Arthur Miller’s “Death of a Salesman” (1949), Samuel Beckett’s “En Attendant Godot” (1952), Friedrich Durrenmatt’s “The Visit of the Old Lady” (1956), Max Frisch’s “Herr Biedermann und die Brandstifter” (1958), Harold Pinter’s “The Caretaker” (1959), Eugene Ionesco’s “Rhinoceros” (1959), John Arden’s “Serjeant Musgrave’s Dance” (1959), Peter Weiss’ “Marat/Sade” (1964).
An epoch-defining moment was the landing on the Moon by USA astronauts (1969), an event that symbolically closed the decade of the boom.
If the Moon landing had seemed to herald complete domination by the USA, the events of the following decades seemed to herald its decline. The USA was defeated militarily in Vietnam (1975), Lebanon (1983) and Somalia (1992). The oil crisis of the 1970s created a world-wide economic crisis. The USA lost one of its main allies, Iran, to an Islamic revolution (1979) that was as significant for the political mood of the Middle East as Nasser’s Arab nationalism had been for the previous generation. After 30 years of rapid growth, both Japan and Germany became economic powers that threatened the USA globally. Both countries caught up with the USA in terms of average wealth. Militarily, the Soviet Union remained a formidable global adversary, extending its political influence to large areas of the developing world.
Other problems of the age were drugs and AIDS. The culture of drugs and the holocaust of AIDS marked the depressed mood of the arts. Soon, another alarming term would surface in the apocalyptic language: global warming.
However, space exploration continued, still propelled by the desire of the USA and the Soviet Union to compete anytime anywhere. In 1970 and 1971 the Soviet Union sent spacecraft to our neighbors, Venus and Mars. In 1977 the USA launched the Voyager probes to explore the outer planets and eventually leave the solar system. In 1981 the USA launched the first space shuttle. In 1986 the Soviet Union launched the permanent space station “MIR”. In 1990 the USA launched the Hubble space telescope.
Computers staged another impressive conceptual leap by reaching the desk of ordinary folks: the micro-processor (1971) enabled the first “personal” computer (1974), which became ubiquitous from 1981 on.
As mass-media became more pervasive, they also changed format: the video-cassette recorder (1971), the cellular telephone (1973), the portable stereo (1979), the compact disc (1981), the DVD (1995). Ultimately, these innovations made entertainment, communication and culture more “personal” and more “portable”.
Classical music reflected the complex world of the crisis with Dmitrij Shostakovic’s “Symphony 15” (1971), Morton Feldman’s “Rothko Chapel” (1971), Gyorgy Ligeti’s “Double Concerto” (1972), Henryk Gorecki’s “Symphony 3” (1976), Arvo Part’s “De Profundis” (1980), Witold Lutoslawski’s “Symphony 3” (1983).
The novel continued to experiment with newer and newer formats and structures: Vladimir Nabokov’s “Ada” (1969), Michel Tournier’s “Le Roi des Aulnes” (1970), Ismail Kadare’s “Chronicle in Stone” (1971), Danilo Kis’s “Hourglass” (1972), Thomas Pynchon’s “Gravity’s Rainbow” (1973), Manuel Puig’s “El Beso de la Mujer Arana” (1976), Barbara Pym’s “Quartet in Autumn” (1977), Mario Vargas-Llosa’s “La Tia Julia” (1977), Nadine Gordimer’s “Burger’s Daughter” (1979), Salman Rushdie’s “Midnight’s Children” (1980), Elfriede Jelinek’s “Die Ausgesperrten” (1980), Toni Morrison’s “Tar Baby” (1981), Uwe Johnson’s “Jahrestage” (1983), Jose Saramago’s “Ricardo Reis” (1984), Milan Kundera’s “The Unbearable Lightness of Being” (1985), Joseph McElroy’s “Women and Men” (1987), Antonia Byatt’s “Possession” (1990), Winfried Georg Sebald’s “Die Ausgewanderten” (1992).
Poetry was becoming more philosophical through works such as Joseph Brodsky’s “Stop in the Desert” (1970), Mario Luzi’s “Su Fondamenti Invisibili” (1971), Derek Walcott’s “Another Life” (1973), Edward-Kamau Brathwaite’s “The Arrivants” (1973), Giorgio Caproni’s “Il Muro della Terra” (1975), John Ashbery’s “Self-Portrait in a Convex Mirror” (1975), James Merrill’s “The Changing Light at Sandover” (1982).
By now, cinema was even more international than literature: John Boorman’s “Zardoz” (1973), Martin Scorsese’s “Mean Streets” (1973), Francis-Ford Coppola’s “The Godfather Part II” (1974), Robert Altman’s “Nashville” (1975), Theodoros Angelopoulos’s “Traveling Players” (1975), Bernardo Bertolucci’s “1900” (1976), Terrence Malick’s “Days of Heaven” (1978), Ermanno Olmi’s “L’Albero degli Zoccoli” (1978), Woody Allen’s “Manhattan” (1979), Andrej Tarkovskij’s “Stalker” (1979), Istvan Szabo’s “Mephisto” (1981), Peter Greenaway’s “The Draughtsman’s Contract” (1982), Ridley Scott’s “Blade Runner” (1982), Terry Gilliam’s “Brazil” (1985), Wim Wenders’s “Wings of Desire” (1987), Zhang Yimou’s “Hong Gaoliang” (1987), Aki Kaurismaki’s “Leningrad Cowboys go to America” (1989), Tsui Hark’s “Wong Fei-hung” (1991), Takeshi Kitano’s “Sonatine” (1993), Krzysztof Kieslowski’s “Rouge” (1994), Bela Tarr’s “Satantango” (1994), Quentin Tarantino’s “Pulp Fiction” (1994), Jean-Pierre Jeunet’s “City of Lost Children” (1995), Lars Von Trier’s “The Kingdom” (1995), Emir Kusturica’s “Underground” (1995), Jan Svankmajer’s “Conspirators of Pleasure” (1996), David Lynch’s “Lost Highway” (1997), Manoel de Oliveira’s “Viagem ao Principio do Mundo” (1997), Hirokazu Kore-eda’s “The Afterlife” (1998).
Physics was striving for grand unification theories. Both the “Big Bang” model and the theory of elementary particles had been successful examples of hybrid Quantum and Relativity theories, but, in reality, the quantum world and the relativistic world had little in common. One viewed the world as discrete, the other viewed it as continuous. One admitted indeterminacy, the other was rigidly deterministic. One interpreted the weak, strong and electromagnetic forces as exchanges of virtual particles, the other interpreted the gravitational force as the warping of space-time. Despite their great success in predicting the results of experiments, the two theories were surprisingly difficult to reconcile. Attempts to merge them (such as “Superstring Theory”) generally led to odd results.
Skepticism affected philosophers. Paul Feyerabend argued that the history of science proceeds by chance: science is a hodgepodge of more or less casual theories. And it is that way because the world is that way: the world does not consist of one homogeneous substance but of countless kinds, that cannot be “reduced” to one another. Feyerabend took the Science of his time literally: there is no evidence that the world has a single, coherent and complete nature.
Richard Rorty held that any theory is inevitably conditioned by the spirit of its age. The goal of philosophy and science is not to verify if our propositions agree with reality but to create a vocabulary to express what we think is reality. Facts do not exist independently of the way we describe them with words. Thus science and philosophy are only genres of literature.
Another sign that a new era had started was the decline of Structuralism. Jacques Derrida accused Structuralism of confusing “being” and “Being”, the code and the transcendental reality. Language is also a world in which we live. In fact, there are multiple legitimate interpretations of a text, multiple layers of meaning. Language is constantly shifting. He advocated deciphering the “archi-écriture” through “deconstruction” and the play of “différance”.
France after World War II provided the ideal stage for a frontal critique of the rationalist tradition founded by Descartes and publicized by the Enlightenment, the tradition that views reason as the source of knowledge and knowledge as the source of progress. “Modernism” had been based on the implicit postulate that progress founded on science is good, and that reason applied to society leads to a better (e.g. more egalitarian) social order. The pessimistic views of Friedrich Nietzsche, Arnold Toynbee, Oswald Spengler, Martin Heidegger and Ludwig Wittgenstein escalated as modern society revealed the dark sides of rapid economic growth, industrialization, urbanization, consumerism, of the multiplying forms of communication and information, of globalization. Technology and media on one hand democratize knowledge and culture but on the other hand introduce new forms of oppression. The earliest forms of reaction to modernism had manifested themselves in Bohemian lifestyles, in subcultures such as Dadaism, in anticapitalist ideologies, in phenomenology and existentialism. But it was becoming more and more self-evident that perception of the object by the subject is mediated by socially-constructed discourse, that heterogeneity and fragmentation make more sense than the totalization of culture attempted by modernism, and that the distinction between high culture and low culture was artificial. The post-modernist ethos was born: science and reason were no longer viewed as morally good; multiple sources of power and oppression were identified in capitalist society; education was no longer trusted as unbiased but seen as politicized; etc. Realizing that knowledge is power, the postmodernist generation engaged in political upheavals such as student riots (Berkeley 1964, Paris 1968), adopted mottos such as “power to the imagination” and identified the bourgeoisie as the problem. For postmodernism the signifier is more important than the signified; meaning is unstable (at any point in time the signified is merely a step in a never-ending process of signification); meaning is in fact socially constructed; there are no facts, only interpretations.
Guy Debord argued that the “society of the spectacle” masks a condition of alienation and oppression. Gilles Deleuze opted for “rhizomatic” thought (dynamic, heterogeneous, chaotic) over the “arborescent thought” (hierarchical, centralized, deterministic) of modernism.
Felix Guattari speculated that there is neither a subject nor an object of desire, just desire as the primordial force that shapes society and history; a micropolitics of desire that replaces Nietzsche’s concept of the “will to power”. In his bold synthesis of Freud, Marx and Nietzsche (“schizoanalysis”) the subject is a nomadic desiring machine.
Jean-Francois Lyotard was “incredulous” towards Metaphysics (towards “metanarratives”) because he viewed the rational self (capable of analyzing the world) as a mere fiction. The self, like language, is a layer of meanings that can be contradictory. Instead of “grand narratives”, which produce knowledge for its own sake, he preferred mini-narratives that are “provisional, contingent, temporary, and relative”; in other words, a fragmentation of beliefs and values instead of the grand unification theories of Metaphysics.
Jean Baudrillard painted the picture of a meaningless society of signs in which the real and the simulation are indistinguishable. The transformation from a “metallurgic” society to a “semiurgic” society (a society saturated with artificial signs) leads to an implosion in all directions, an implosion of boundaries (e.g. politics becomes entertainment). More importantly, the boundary between the real and the simulation becomes blurred. Technology, economics and the media create a world of simulacra. The simulation can even become more real than the real (the “hyper-real”). Post-modern society is replacing reality with a simulated reality of symbols and signs. At the same time meaning has been lost in a neutral sterile flow of information, entertainment and marketing. The postmodern person is the victim of an accelerating proliferation of signs that destroys meaning, of a global process of destruction of meaning. The postmodern world is meaningless: it is a reservoir of nihilism. In fact, the accelerating proliferation of goods has created a world in which objects rule subjects: “Things have found a way to elude the dialectic of meaning, a dialectic which bored them: they did this by infinite proliferation”. The only metaphysics that makes sense is a metaphysics of the absurd like Jarry’s pataphysics.
However, the topic that dominated intellectual life at the turn of the millennium and that fostered the first truly interdisciplinary research (involving Neurology, Psychology, Biology, Mathematics, Linguistics, Physics) was: the brain. It was Descartes’ mind-body problem recast in the age of the neuron: who are we? Where does our mind come from? Now that new techniques allowed scientists to study the minutiae of neural processes the ambition became to reconstruct how the brain produces behavior and how the brain produces consciousness. Consciousness became a veritable new frontier of science.
The fascination with consciousness could already be seen in Julian Jaynes’ theory that it was a relatively recent phenomenon, that ancient people did not “think” the way we think today. He argued that the characters in the oldest parts of the Homeric epics and of the Old Testament were largely “non-conscious”: their mind was “bicameral”, two minds that spoke to each other, as opposed to one mind being aware of itself. Those humans were guided by “hallucinations” (such as gods) that formed in the right hemisphere of the brain and that were communicated to the left hemisphere of the brain, which received them as commands. Language did not serve conscious thought: it served as communication between the two hemispheres of the brain. The bicameral mind began “breaking down” when the hallucinated voices no longer provided “automatic” guidance for survival. As humans lost faith in gods, they “invented” consciousness.
A major revolution in the understanding of the brain was started, indirectly, by the theory of the immune system advanced by Niels Jerne, which viewed the immune system as a Darwinian system. The immune system routinely manufactures all the antibodies it will ever need. When the body is attacked by foreign antigens the appropriate antibodies are “selected” and “rewarded” over the antibodies that are never used. Instead of an immune system that “designs” the appropriate antibody for the current invader, Jerne painted the picture of a passive repertory of antibodies that the environment selects. The environment is the actor. Jerne speculated that a similar paradigm might be applied to the mind: mind manufactures chaotic mental events that the environment orders into thought. The mind already knows the solution to all the problems that can occur in the environment in which it evolved over millions of years. The mind knows what to do, but it is the environment that selects what it actually does.
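Jerne’s selectional logic is easy to make concrete with a toy simulation (a minimal sketch of the general idea, not of his actual network theory; the alphabet, the affinity measure and all parameters below are invented for illustration): the repertoire is generated blindly before any antigen appears, and the antigen merely selects and amplifies its best pre-existing binders.

```python
import random
random.seed(0)

ALPHABET = "ABCD"

def affinity(antibody, antigen):
    # crude stand-in for binding strength: number of matching positions
    return sum(a == b for a, b in zip(antibody, antigen))

# the repertoire of antibodies exists *before* any antigen is encountered
counts = {"".join(random.choices(ALPHABET, k=8)): 1 for _ in range(300)}

def respond(antigen, top=5):
    # the antigen does not instruct the system to design a new antibody:
    # it merely "selects" the best pre-existing binders, whose clones are
    # rewarded (amplified) while unused antibodies decay
    ranked = sorted(counts, key=lambda ab: affinity(ab, antigen), reverse=True)
    for ab in ranked[:top]:
        counts[ab] *= 2
    for ab in ranked[top:]:
        counts[ab] = max(1, counts[ab] - 1)
    return ranked[0]

best = respond("AABBCCDD")
print("best binder:", best, "affinity:", affinity(best, "AABBCCDD"))
```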
Neurologists such as Michael Gazzaniga cast doubt on the role of consciousness. He observed that the brain seems to contain several independent brain systems working in parallel, possibly evolutionary additions to the nervous system. Basically, a mind is many minds that coexist in a confederation. A module located in the left hemisphere interprets the actions of the other modules and provides explanations for our behavior: that is what we feel as “consciousness”. If that is the case, then our “commands” do not precede action, they follow it. First our brain orders an action, then we become aware of having decided it. There are many “I”s, and there is one “I” that makes sense of what all the other “I”s are doing: we are aware only of this “verbal self”, but it is not the one in charge.
A similar picture was painted by Daniel Dennett, who believed the mind is due to competition between several parallel narrative “drafts”: at every point in time, there are many drafts active in the brain, and we are aware only of the one that is dominant at that point in time. There is, in fact, no single stream of consciousness. The continuity of consciousness is an illusion.
Jerne’s theory was further developed by Gerald Edelman, who noticed that the human genome alone cannot specify the complex structure of the brain, and that individual brains are wildly diverse. The reason, in his opinion, is that the brain develops by Darwinian competition: connections between neurons and neural groups are initially under-determined by the genetic instructions. As the brain is used to deal with the environment, connections are strengthened or weakened based on their success or failure in dealing with the world. Neural groups “compete” to respond to environmental stimuli (“Neural Darwinism”). Each brain is different because its ultimate configuration depends on the experiences that it encounters during its development. The brain is not a direct product of the information contained in the genome: it uses much more information than is available in the genome, i.e. information from the environment. As it lives, the brain continuously reorganizes itself. Thus brain processes are dynamic and stochastic. The brain is not an “instructional” system but a “selectional” system.
The scenario of many minds competing for control was further refined by William Calvin, who held that a Darwinian process in the brain finds the best thought among the many that are continuously produced. A neural pattern copies itself repeatedly around a region of the brain, in a more or less random manner. The ones that “survive” (that are adequate to act in the world) reproduce and mutate. “Thoughts” are created randomly, compete and evolve subconsciously. Our current thought is simply the dominant pattern in the copying competition. A “cerebral code” (the brain equivalent of the genetic code) drives reproduction, variation and selection of thoughts.
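A toy version of Calvin’s copying competition might look like the following sketch (a loose illustration under invented parameters, not his actual hexagonal-mosaic model; the “fitness” function is a stand-in for adequacy to act in the world): patterns copy themselves with occasional errors, and the “current thought” is simply whichever variant comes to dominate the population.

```python
import random, string
random.seed(1)

TARGET = "SURVIVE"   # stands in for adequacy to act in the world

def fitness(pattern):
    return sum(a == b for a, b in zip(pattern, TARGET))

def copy_with_errors(pattern, rate=0.05):
    # copying errors supply the variation that selection acts on
    return "".join(random.choice(string.ascii_uppercase)
                   if random.random() < rate else c for c in pattern)

# a region of the brain as a population of competing patterns
population = ["".join(random.choices(string.ascii_uppercase, k=len(TARGET)))
              for _ in range(200)]

for _ in range(20000):
    # two patterns compete; the fitter one overwrites the other
    i, j = random.sample(range(len(population)), 2)
    if fitness(population[i]) < fitness(population[j]):
        i, j = j, i
    population[j] = copy_with_errors(population[i])

dominant = max(set(population), key=population.count)
print("current 'thought':", dominant, "fitness:", fitness(dominant))
```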
Paul MacLean introduced the view of the human brain as three brains in one, each brain corresponding to a different stage of evolution: the “reptilian” brain for instinctive behavior (mostly the autonomic system), the “old mammalian” brain for emotions that are functional to survival (mostly the limbic system) and the “new mammalian” brain for higher cognitive functions (basically, the neo-cortex). Mechanical behavior, emotional behavior and rational behavior arose chronologically and now coexist and complement each other.
Merlin Donald viewed the development of the human mind in four stages: the “episodic” mind, which is limited to stimulus-response associations and cannot retrieve memories without environmental cues (it lives entirely in the present); the “mimetic” mind, capable of motor-based representations and of retrieving memories independently of environmental cues (it understands the world, communicates and makes tools); the “mythic” mind, which constructs narratives and creates myths; and the “theoretical” mind, capable of manipulating symbols.
Steven Mithen identified four “modules” in the brain, which evolved independently and represent four kinds of intelligence: social intelligence (the ability to deal with other humans), natural-history intelligence (the ability to deal with the environment), tool-using intelligence and linguistic intelligence. The hunter-gatherers of pre-history were experts in all these domains, but those different kinds of expertise did not mix. For thousands of years these different skillsets had been separated. Mithen speculated that the emergence of self-awareness caused the integration of these kinds of intelligence (“cognitive fluidity”), an integration that led to the cultural explosion of art, technology, farming and religion.
The role of a cognitive system in the environment was analyzed by Humberto Maturana and Francisco Varela. They believed that living beings are units of interaction, and that cognition is embodied action (or “enaction”). Organisms survive by “autopoiesis”, the process by which an organism continuously reorganizes its own structure to maintain a stable relationship with the environment. A living being cannot be understood independently of its environment, because it is that relationship that molds its cognitive life. Conversely, the world is “enacted” from the actions of living beings. Thus living beings and environment mutually specify each other. Life is an elegant dance between the organism and the environment, and the mind is “the tune of that dance”.
Edward Osborne Wilson, the founder of “sociobiology”, applied the principles of Darwinian evolution to behavior, believing that the social behavior of animals and humans can be explained from the viewpoint of evolution.
Richard Dawkins pointed out that one can imagine a Darwinian scenario also for the evolution of ideas, which he called “memes”. A meme is something that infects a mind (a tune, a slogan, an ideology, a religion) in such a way that the mind feels the urge to communicate it to other minds, thus contributing to spreading it. As memes migrate from mind to mind, they replicate, mutate and evolve. Memes are the cultural counterpart of genes. A meme is the unit of cultural evolution, just like a gene is the unit of biological evolution. Just like genes use bodies as vehicles to spread, so memes use minds as vehicles to spread. The mind is a machine for copying memes, just like the body is a machine for copying genes. Memes have created the mind, not the other way around.
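The meme scenario, too, lends itself to a toy simulation (my own illustration, not Dawkins’ formalism; the “catchiness” attribute and all parameters are invented): a meme that compels its hosts to retransmit it spreads through a population of minds, and imperfect copying lets more infectious variants take over.

```python
import random
random.seed(2)

N_MINDS = 200
memes = [None] * N_MINDS            # one meme slot per mind
memes[0] = ("earworm", 0.2)         # (content, catchiness): the seed meme

for _ in range(20000):
    src, dst = random.sample(range(N_MINDS), 2)
    if memes[src] is None:
        continue
    content, catchiness = memes[src]
    # the meme's "urge to be communicated" is its catchiness
    if random.random() < catchiness:
        # imperfect replication: occasionally the copy mutates
        if random.random() < 0.05:
            catchiness = min(1.0, max(0.0,
                             catchiness + random.uniform(-0.05, 0.1)))
        memes[dst] = (content, catchiness)

infected = [m for m in memes if m is not None]
print(f"{len(infected)}/{N_MINDS} minds infected; "
      f"max catchiness: {max(c for _, c in infected):.2f}")
```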
Dawkins held the view that Darwinian evolution was driven by genes, not by bodies. It is genes that want to live forever, and that use bodies for that purpose. To Dawkins, evolution is nothing but a very sophisticated strategy for genes to survive. What survives is not my body but my genes.
Dawkins also called attention to the fact that the border of a “body” (or, better, phenotype) is not so obvious: a spider would not exist without its cobweb. Dawkins’ “extended phenotype” includes the world that an organism interacts with. The organism alone is an oversimplification, and does not really have biological relevance. The control of an organism is never complete inside and null outside: there is a continuum of degrees of control, which allows partiality of control inside (e.g., parasites operate on the nervous system of their hosts) and an extension of control outside (as in the cobweb). What makes biological sense is an interactive system comprising the organism and its neighbors. The very genome of a cell can be viewed as a representation of the environment inside the cell.
Stuart Kauffman and others saw “self-organization” as a general property of the universe. Both living beings and brains are examples of self-organizing systems. Evolution is a process of self-organization. The spontaneous emergence of order, or self-organization of complex systems, is ubiquitous in nature. Kauffman argued that self-organization is the fundamental force that counteracts the universal drift towards disorder. Life was not only possible and probable, but almost inevitable.
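Kauffman’s classic models of self-organization were random Boolean networks; the sketch below (with arbitrary toy parameters) illustrates the phenomenon he emphasized: even with random wiring and random rules, such a system typically settles into a short, orderly cycle of states instead of wandering chaotically through its huge state space.

```python
import random
random.seed(3)

N, K = 12, 2   # N nodes, each reading K randomly chosen inputs

# random wiring and one random Boolean function (truth table) per node
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    # each node looks up its next value in its truth table,
    # indexed by the current values of its K inputs
    return tuple(tables[i][sum(state[src] << b
                               for b, src in enumerate(inputs[i]))]
                 for i in range(N))

state = tuple(random.randint(0, 1) for _ in range(N))
seen = {}
t = 0
while state not in seen:   # iterate until a state repeats: an attractor
    seen[state] = t
    state = step(state)
    t += 1
print(f"reached an attractor cycle of length {t - seen[state]} "
      f"after {seen[state]} transient steps")
```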
Linguistics focused on metaphor as more than a poetic tool. George Lakoff argued that language is grounded in our bodily experience, that language is “embodied”. Our bodily experience creates our concepts. Syntax is created by our bodily experience. The “universal” grammar shared by all humans is due to the fact that we all share roughly the same bodily experience. The process by which we create concepts out of bodily experience is metaphor, the process of experiencing something in terms of something else. The entire human conceptual system is metaphorical, because a concept can always be understood in terms of less abstract concepts, all the way down to our bodily experience. No surprise, then, that we understand the world through metaphors, and that we do so without any effort, automatically and unconsciously. Lakoff held that language was created to deal with physical objects, and later extended to non-physical objects by means of metaphors. Thus metaphor is biological: our brains are built for metaphorical thought.
Dreams continued to fascinate neurologists such as Allan Hobson and Jonathan Winson, as new data showed that the brain was “using” sleep to consolidate memories. Dreams are a window on the processing that goes on in the brain while we sleep. The brain is rapidly processing a huge amount of information, and our consciousness sees flashes of the bits that are being processed. The brain tries to interpret these bits as narratives, but, inevitably, they look “weird”. In reality, there is no story in a dream: it is just a parade of information that is being processed. During REM sleep the brain processes information that accumulated during the day. Dreams represent “practice sessions” in which animals refine their survival skills. Early mammals had to perform all their “reasoning” on the spot. Modern brains have invented a way to “postpone” processing sensory information.
Colin McGinn was skeptical that any of this could lead to an explanation of what consciousness is and how it is produced by the brain. He argued that we are not omnipotent: like any other organism, there may be things that we just cannot conceive. Maybe consciousness simply does not belong to the “cognitive closure” of our organism. In other words, understanding our consciousness may be beyond our cognitive capacities.
The search for consciousness inside the brain took an unexpected turn when a mysterious biorhythm of about 40 Hertz was detected inside the brain. The traditional model for consciousness was “space-based binding”: there must be a place inside the brain where perceptions, sensations, memories and so forth get integrated into the “feeling” of my consciousness.
Thus Gerald Edelman and Antonio Damasio hypothesized mechanisms by which regions of the brain could synthesize degrees of consciousness. Damasio realized that the “movie in the mind” generated by the flow of sensory inputs was not enough to explain self-awareness. He believed that “self” consciousness required a topography of the body and a topography of the environment, and that ultimately the “self” originated from its juxtaposition against the “non-self”. An “owner” and “observer” of the movie is created within a second-order narrative of the self interacting with the non-self. The self is continuously reconstructed via this interaction. The “I” is not telling the story: the “I” is created by stories told in the mind.
Francis Crick launched the opposite paradigm (“time-based binding”) when he speculated that synchronized firing (the 40 Hertz biorhythm) in the region connecting the thalamus and the cortex might “be” a person’s consciousness. Instead of looking for a “place” where the integration occurs, Crick and others started looking for integration in “time”. Maybe consciousness arises from the continuous dialogue between regions of the brain.
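Synchrony of this kind is often illustrated with coupled-oscillator toys; the sketch below uses the standard Kuramoto model (a textbook synchronization model, not Crick’s own proposal; all parameters are arbitrary) to show how units oscillating at slightly different rates around a 40 Hertz carrier can lock into a collective rhythm once they are coupled.

```python
import math, random
random.seed(4)

N, K, dt = 50, 4.0, 0.05
# work in a frame rotating at the common 40 Hz carrier: only the small
# frequency deviations (in rad/s) matter for synchronization
omega = [random.gauss(0, 1) for _ in range(N)]
theta = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def order_parameter(th):
    # r = 0 means incoherent firing, r = 1 means perfect synchrony
    re = sum(math.cos(t) for t in th) / len(th)
    im = sum(math.sin(t) for t in th) / len(th)
    return math.hypot(re, im), math.atan2(im, re)

r0, _ = order_parameter(theta)
for _ in range(1000):
    r, psi = order_parameter(theta)
    # each oscillator is pulled toward the mean phase with strength K*r
    theta = [t + dt * (w + K * r * math.sin(psi - t))
             for t, w in zip(theta, omega)]
r1, _ = order_parameter(theta)
print(f"coherence before coupling: {r0:.2f}, after: {r1:.2f}")
```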
Rodolfo Llinas noticed a possible implication of this viewpoint. It looks like neurons are active all the time. We do not control our neurons, any more than we control our blood circulation. In fact, neurons are always active, even when there are no inputs. Neurons operate at their own pace, regardless of the pace of information. A rhythmic system controls their activity, just like rhythmic systems control heartbeat or breathing. It seems that neurons are telling the body to move even when the body is not moving. Neurons generate behavior all the time, but only some behavior actually takes place. It sounds like Jerne’s model all over again: it is the environment that selects which movement the body will actually perform. Consciousness is a side-effect: the thalamus calls out to all the cortex cells that are active, and the response “is” consciousness.
How consciousness was produced by evolution was a fascinating mystery in itself. Graham Cairns-Smith turned the conventional model upside down when he claimed that emotions came first. A rudimentary system of feelings was born by accident during evolution. That system proved to be useful for survival, and therefore evolved. The organism was progressively flooded with emotions until a “stream of consciousness” appeared. Language made it possible to express that stream in sounds and thoughts instead of mere facial expressions. Then the conscious “I” was born.
The Berlin Wall fell in 1989 and the Soviet Union was dissolved in 1991, two years after withdrawing from Afghanistan (a lengthy and debilitating war). Most of its Eastern European satellites adopted the American model (democracy and capitalism) and applied for membership in both NATO (the USA-led military alliance) and the European Union (the economic union originally sponsored by Italy, France and Germany, which now included also Britain and Spain). From the point of view of the USA, not only had the enemy (the communist world) surrendered, but most of its allies had turned into friends of the USA. Almost overnight, the entire world was adopting the American model. The “domino” effect that the USA had feared would propagate communism took place in the opposite direction: the moment the Soviet Union fell, almost all the countries of the world abandoned communism and adopted the American economic and political model. Democratic reforms removed the dictators of Latin America, the Far East and (later) Africa. The exceptions were rare (Cuba in Latin America, Burma and North Korea in the Far East, Zimbabwe in subequatorial Africa). Two exceptions, though, were notable. Under the stewardship of Deng Xiaoping (who had seized power in 1978), China had embarked on pseudo-capitalistic economic reforms, but the one-party system remained in place and kept strict control over freedom of speech. And the Arab world, from Morocco to Iraq, from Syria to Yemen, together with its eastern neighbors Iran and Afghanistan, was ruled by probably the most totalitarian regimes on the planet.
Other than these exceptions, the world was largely being molded after the example of the USA. What had been a picturesque melting pot (mostly a demographic experiment) had become a highly efficient economic and military machine, now imitated throughout the planet.
The adoption of the same economic model favored the creation of several free-trade zones and the emergence of a “global village”.
The 1990s were largely a decade of economic boom and (relative) peace (Africa being the theater of most remaining wars).
The USA found itself at the helm of an odd structure. It was definitely not an “empire”, since each country maintained plenty of independence and every country became a fierce competitor of the USA; at the same time, the USA had assumed a revolutionary (not imperial) mission to spread liberal democracy around the world. It fought wars that were wars of liberation more than wars of expansion. Its enemies were the enemies of liberal democracy (Nazism, Fascism, Communism, Islamic fundamentalism). It was, first and foremost, an empire of Knowledge: 75% of all Nobel laureates in the sciences, economics, and medicine were doing research in the USA.
As the countries of the world were no longer forced to choose between the USA camp and the Soviet camp, some of them achieved enough independence to exert significant regional influence. The new regional powers included the European Union (which kept growing in size and ambition), China, India (the largest democracy in the world), Japan, Brazil, Nigeria and South Africa. Last, but not least, there was Russia, intent on rebuilding itself as a non-communist country.
There was a significant shift from the Atlantic Ocean to the Pacific Ocean, as Japan, China, South Korea, Indochina, Australia and India became more and more relevant, while Western Europe became less and less relevant. The combined gross product of the Asian-Pacific countries increased from 7.8% of world GDP in 1960 to 16.4% in 1982 and to over 20% in 2000.
Islamic fundamentalism was not easy to define as a political entity, but, benefiting from the example of Iran’s Islamic state and the funds pouring from the oil states, it managed to hijack a country, the very Afghanistan that had contributed to the fall of the Soviet Union.
After the oil crisis of the 1970s, which had proven to the whole world how crucial the supply of oil was for the world’s economy, the Middle East had become a strategic area above and beyond the scope of the Cold War. With the end of the Cold War, the Middle East became an even more dangerous place because its hidden dynamics became more evident: a deadly combination of totalitarian regimes, Islamic fundamentalism, the Palestinian-Israeli conflict and huge oil reserves. A practical demonstration came with the “Gulf War”, in which a large USA-led coalition repulsed an Iraqi invasion of Kuwait.
Western society was being dominated by automation, from the sphere of the household to the sphere of science. The main innovation of the 1990s was the Internet, which, created in 1985, became a new tool to communicate and spread knowledge, thanks to electronic mail (“e-mail”) and the “World-Wide Web”. This was a completely new landscape, not too dissimilar from the landscape that the explorers of the 16th century had to face. Suddenly, companies had to cope with computer viruses that spread over the Internet, and people could find virtually unlimited amounts of information about virtually any topic via the “search engines”. The effects of the Internet were also visible on the economy of the USA: it largely fueled the boom of the 1990s, including the bubble of the stock market (the “dot-com” bubble).
The other emerging technology was genetic engineering. Having explained how life works, humans proceeded to tamper with it. The first genetically-engineered animal was produced in 1988, followed in 1994 by the first genetically-engineered edible vegetable and, in 1997, by the first clone of a mammal. The human genome itself was deciphered by the Human Genome Project (a first draft was published in 2001).
Both ordinary folks and the intellectual elite had the feeling that they were living in a special time. It is not surprising that thinkers turned increasingly to interpreting history. Ironically, this autobiographical theme started when Francis Fukuyama declared the “end of history”, meaning that the ideological debate had ended with the triumph of liberal democracy.
John Ralston Saul criticized globalization, which he viewed as caused by a geopolitical vacuum: nation states had been replaced by transnational corporations. The problem is that natural resources and consumers live in real places.
Samuel Huntington interpreted the history of the world of the last centuries as a “Western Civil War”, in which the Christian powers of Europe fought each other anytime anywhere. The fall of Communism and the triumph of Capitalism ended the Western Civil War. Now the world was turning towards a “clash of civilizations” (Western, Islamic, Confucian, Japanese, Hindu, Slavic-Orthodox, Latin American, African).
Just as the Moon landing, which had seemed a good omen for the USA, turned out to open a decade of problems, so the fall of the Soviet Union, another apparently good omen for the USA, turned out to open a new category of problems. In 2001, hyper-terrorism staged its biggest success yet by crashing hijacked planes into two New York skyscrapers. The USA retaliated by invading (and democratizing) its home base, Afghanistan, and, for good measure, Iraq. Hyper-terrorism rapidly found new targets around the world, from Spain to the Arab countries themselves. Far from having fostered an era of peace, the fall of communism had opened a new can of worms. The second Iraqi war also caused the first crack among the European allies: Britain, Italy and Poland sided with the USA, while France and Germany strongly opposed the USA-led invasion.
Into the new millennium, Science is faced with several challenges: unifying Quantum and Relativity theories; discovering the missing mass of the universe that those theories have predicted; understanding how the brain manufactures consciousness; deciphering the genome; managing an ever larger community of knowledge workers; using genetics for medical and agricultural purposes; and resuming the exploration of the universe.
Appendix: The New Physics: The Ubiquitous Asymmetry (physics.doc), a chapter of Nature of Consciousness
Piero Scaruffi, December 2004
Bibliography
William McNeill: A History of the Human Community (1987)
Charles VanDoren: A History of Knowledge (1991)
Mark Kishlansky: Civilization In The West (1995)
Roberts: Ancient History (1976)
Arthur Cotterell: Penguin Encyclopedia of Ancient Civilizations (1980)
John Keegan: A History of Warfare (1993)
Bernard Comrie: The Atlas Of Languages (1996)
Henry Hodges: Technology in the Ancient World (1970)
Alberto Siliotti: The Dwellings of Eternity (2000)
Alan Segal: Life After Death (2004)
David Cooper: World Philosophies (1996)
Ian McGreal: Great Thinkers of the Eastern World (1995)
Richard Popkin: The Columbia History of Western Philosophy (1999)
Mircea Eliade: A History of Religious Ideas (1982)
Paul Johnson: Art, A New History (2003)
Ian Sutton: Western Architecture (1999)
Donald Grout: A History of Western Music (1960)
Geoffrey Hindley: Larousse Encyclopedia of Music (1971)
Michael Roaf: Mesopotamia and the Ancient Near East (1990)
Hans Nissen: The Early History of the Ancient Near East (1988)
Annie Caubet: The Ancient Near East (1997)
Trevor Bryce: The kingdom of the Hittites (1998)
Rosalie David: Handbook to Life in Ancient Egypt (1998)
Henri Stierlin: Pharaohs Master-builders (1992)
Glenn Moore: Phoenicians (2000)
Barry Cunliffe: The Ancient Celts (1997)
David Abulafia: The Mediterranean in History (2003)
Henri Stierlin: Hindu India (2002)
Hermann Goetz: The Art of India (1959)
Heinrich Zimmer: Philosophies of India (1951)
Surendranath Dasgupta: A History of Indian Philosophy (1988)
Gordon Johnson: Cultural Atlas of India (1996)
Jadunath Sinha: History of Indian Philosophy (1956)
Haridas Bhattacharyya: The Cultural Heritage of India (1937)
Charles Hucker: China’s Imperial Past (1975)
Sherman Lee: A History of Far Eastern Art (1973)
Wolfgang Bauer: China and the Search for Happiness (1976)
Joseph Needham: Science and Civilisation in China (1954)
John King Fairbank & Edwin Reischauer: East Asia: Tradition and Transformation (1989)
Penelope Mason: History Of Japanese Art (1993)
Paul Varley: Japanese Culture (1973)
Thomas Martin: Ancient Greece (1996)
Katerina Servi: Greek Mythology (1997)
Robin Sowerby: The Greeks (1995)
Peter Levi: The Greek World (1990)
Tomlinson: Greek And Roman Architecture (1995)
Bruno Snell: The Discovery of the Mind (1953)
Henri Stierlin: The Roman Empire (2002)
Georges Duby & Michelle Perrot: A History of Women in the West, vol. 1 (1992)
Giovanni Becatti: The Art of Ancient Greece and Rome (1968)
Marvin Tameanko: Monumental Coins (1999)
John Norwich: A short history of Byzantium (1995)
Kevin Butcher: Roman Syria (2003)
Bart Ehrman: Lost Scriptures (2003)
Elaine Pagels: The Origins Of Satan (1995)
Robert Eisenman: James the Just (1997)
Timothy Freke: The Jesus Mysteries (1999)
John Dominic Crossan: The Historical Jesus (1992)
Albert Hourani: A History of the Arab peoples (1991)
Bernard Lewis: The Middle East (1995)
John Esposito: History of Islam (1999)
Michael Jordan: Islam – An Illustrated History (2002)
Edgar Knobloch: Monuments of Central Asia (2001)
Huseyin Abiva & Noura Durkee: A History of Muslim Civilization (2003)
Vernon Egger: A History of the Muslim World to 1405 (2003)
David Banks: Images of the Other – Europe and the Muslim World Before 1700 (1997)
Reza Aslan: No God but God (2005)
Majid Fakhry: A History of Islamic Philosophy (1970)
Carter-Vaughn Findley: The Turks in World History (2005)
John Richards: The Mughal Empire (1995)
Bertold Spuler: The Mongol Period – History of the Muslim World (1994)
David Christian: A History of Russia, Central Asia and Mongolia (1999)
Graham Fuller: The Future of Political Islam (2003)
Norman Cantor: Civilization of the Middle Ages (1993)
Henri Pirenne: Histoire Economique de l’Occident Medieval (1951)
Robert Lopez: The Commercial Revolution of the Middle Ages (1976)
Will Durant: The Age of Faith (1950)
James Chambers: The Devil’s Horsemen (1979)
Henry Bamford Parkes: The Divine Order (1968)
Fernand Braudel: The Mediterranean (1949)
Lynn White: Medieval Technology and Social Change (1962)
Gerhard Dohrn-van Rossum: “History of the Hour” (1996)
Frances & Joseph Gies: Cathedral Forge and Waterwheel (1994)
Georges Duby: The Age of the Cathedrals (1981)
Gunther Binding: High Gothic Art (2002)
Xavier Barral: Art in the Early Middle Ages (2002)
Daniel Hall: A History of Southeast Asia (1955)
Geoffrey Hosking: Russia and the Russians (2001)
Simon Schama: A History of Britain (2000)
Will Durant: The Renaissance (1953)
John Ralston Saul: Voltaire’s Bastards (1993)
Joel Mokyr: Lever of Riches (1990)
Hugh Thomas: The Slave Trade (1997)
Peter Watson: Ideas (2005)
John Crow: The Epic of Latin America (1980)
David Fromkin: Europe’s Last Summer (2004)
Mary Beth Norton: A People And A Nation (1986)
John Steele Gordon: An Empire of Wealth (2004)
Daniel Yergin: The Prize (1991)
Lawrence James: Rise and Fall of the British Empire (1994)
Robert Jones Shafer: A History of Latin America (1978)
Paul Kennedy: The Rise and Fall of the Great Powers (1987)
Jonathan Spence: The Search for Modern China (1990)
Henry Kamen: Empire (2002)
Edward Kantowicz: The World In The 20th Century (1999)
Christian Delacampagne: A History of Philosophy in the 20th Century (1995)
Piero Scaruffi: Nature of Consciousness (2006)
Jacques Barzun: From Dawn to Decadence (2001)
Peter Hall: Cities in Civilization (1998)
Sheila Jones: The Quantum Ten (2008)
Orlando Figes: Natasha’s Dance – A Cultural History of Russia (2003)
Roger Penrose: The Emperor’s New Mind (1989)
Gerard Piel: The Age of Science (2001)
Paul Johnson: Modern Times (1983)
Tony Judt: Postwar – A History of Europe Since 1945 (2005)
John Lewis Gaddis: The Cold War (2005)
Stephen Kinzer: Overthrow – America’s Century of Regime Change (2007)
Piers Brendon: The Decline and Fall of the British Empire 1781-1997 (2007)
HH Arnason: History of Modern Art (1977)
Herbert Read: A Concise History of Modern Painting (1959)
Jonathan Glancey: 20th Century Architecture (1998)
MOCA: At The End of the Century (1998)
Eric Rhode: A History of the Cinema (1976)
Robert Sklar: Film (1993)
Eileen Southern: The Music of Black Americans (1971)
Ted Gioia: A History of Jazz (1997)
Mark Prendergast: The Ambient Century (2000)
Piero Scaruffi: A History of Jazz (2007)
Piero Scaruffi: History of Rock and Dance Music (2009)