As of today, we can say that there are four types of development in terms of dependence on other cultures. They can be called (i) Non-dependent Development; (ii) Dependent Development; (iii) Independent Development; and (iv) Inter-dependent Development. These are broadly historical in the sense that movement is from endogenous to exogenous with the passage of time; however, there is no rigidity with regard to their sequence, and their simultaneity is not denied.
Let us elaborate this.
- Non-dependent development: Changes in the pre-colonial era occurred from within; they were non-dependent on any outside agency. Such development can be called orthogenetic or endogenous, or even indigenous. When societies were relatively isolated, this was the only mode of development. But even now, some aspects of development are the products of internal effort. Simultaneous inventions in two cultures that were never in contact offer a good example. Some have referred to this as a consequence of ‘stimulus diffusion’. The Americans and the Russians non-dependently engaged in space research and developed their own means to reach the moon. The development of various scripts is another good example.
- Dependent development: The colonial period introduced changes brought from abroad via the single aperture of the colonizing country, and hence the changes that occurred during that period—generally called Westernization—were dependent on a single outside source and directed to reinforce colonial rule. It was a single aperture model.
- Independent development: With independence, the sources of outside influence multiplied, and the countries began exercising their choice in receiving outside elements—a process that can rightly be called independent development. Modernization seems the appropriate title for this process, as it does not restrict the flow of influence to the West, but allows it from different directions. It must be stressed that while the dependency syndrome continues in this phase, decisions to allow or disallow alien elements are taken, at least in theory, by the independent nation-states. It is this freedom to choose the sources of borrowing that distinguishes this model from the previous one, in which the country was totally dependent on the colonial master, both for decision-making and for the import of material and non-material culture. With the attainment of independence, choice was returned to the natives. It thus became a multiple aperture model.
- Inter-dependent development: As the world entered the twenty-first century, there was an acknowledgement of the new process influencing the entire world. This is Globalization, which emphasizes interdependence and mutual give and take, unlike previous processes, in which the developing societies were only at the receiving end. Thus, the pattern of heterogenetic changes has been altering. With reciprocity in exchange, this model of interaction has become a complex multiple aperture model.
We can divide the various periods into (i) the pre-colonial era, (ii) colonial era, and (iii) post-colonial era. The last can be divided into two phases: the initial phase, where the developed world was treated as the positive reference group for emulation and adoption of select cultural items—in the 4-M framework—and the current phase of globalization, where a mutual give and take has begun occurring. Globalization has brought the entire world in various ways to each country, and every country has begun registering its presence in the rest of the world. Today, it is not only the globe that has entered India, but India has also made its presence felt in the rest of the world. It is this duality, this reciprocity, that makes the process of globalization different from the previous processes of modernization and Westernization. It should, however, be emphasized that there could be countries that are still in the pre-globalization phase. And even countries that are now part of the global framework may differ in terms of their degree of dependence or interdependence.
Figures 19.1–19.5 illustrate this formulation:
Figure 19.1 The Pre-colonial Era: Isolated Societies in a Region
Figure 19.2 Relatively Insulated Societies in a Region
Figure 19.3 The Colonial Era: Linkage between Colonial Master and the Colonies
Figure 19.4 The Post-colonial Era: Multiplicity of Apertures
Figure 19.5 The New Context of Globalization
The ‘4-Ms’ of development that we mentioned earlier as part of the Western paradigm are still significant indicators of development, but their import from abroad is becoming restricted with the changing profile of developing countries. Many of these countries are now increasingly involved in the production of machines, are becoming monetarily self-reliant, have their own skilled manpower, and are developing their own systems of management and governance. In many respects, they act as equal partners with countries of the developed world. In fact, with the advances made in the fields of science and technology and with the advent of the information revolution, multinational companies are shifting their businesses and offices to countries of the developing world. This new phase of development is characterized by mutual dependence, that is, interdependence, while the earlier phases were characterized by dependent development. Those countries that have entered this phase of interdependence are contributing effectively to globalization. But there are several others who are lagging behind, and are thus experiencing the impact of globalization only vicariously.
Indian Experience in Development
The changes that have occurred over the past decade, with the onset of the twenty-first century, are so enormous that those of the 1960s and 1970s pale in comparison. It is, however, important to summarize the contributions that sociological research in countries like India has made to the understanding of change, and to the management of change, in the past decades.
Social science research carried out in developing countries on current issues and problems, as well as on new projects and institutional structures, has contributed a good deal to the understanding of social change, supplementing contributions from developed societies and also adding new dimensions. It is important to note that such research has been carried out not only by indigenous scholars, but also by scholars from the developed world.
We shall briefly summarize the key contributions relative to this aspect of change, focusing particularly on India.
The emerging new states in older societies seemed keen to step up the pace of development so as to catch up with the developed societies that served as their positive reference groups—be they socialist or Western capitalist societies. The accelerated pace of development encouraged students of society to prioritize the study of existing social structures before they underwent transformation. They focused on documenting the existing reality that was responding to forces of rapid change.
In India, for example, the 1950s and 1960s were marked by a remarkable number of studies of village communities—both by insiders and outsiders—the latter now hailing not only from Britain, but also from other countries, particularly the United States of America. These studies, done as ethnographic accounts, produced some excellent monographs on the village and the caste system.
One of the pioneers of village studies in India, Srinivas, cogently argued for an objective portrayal of Indian society to replace the then prevailing ‘book view’ and the ‘upper caste view’. He advocated the ‘field view’, based on first-hand observations by the researcher (see Madan, 2008).
Srinivas’ study of the Coorgs alerted him to the prevalent diversities of belief and custom, which led him to believe that the practice of Hinduism differed from region to region—a fact generally missed by the untrained eye. He went on to categorize Hinduism as All-India, Peninsular, Regional and Local. He hypothesized that ‘as the area of spread decreases, the number of ritual and cultural forms shared in common increases. Conversely, as the area increases, the common forms decrease’ (Srinivas, 1952: 213–14).
Srinivas also introduced the concept of Sanskritization to explain the changes occurring in the caste system. He elaborated this concept in a separate paper to explain the changes in the placement of individual castes in the ritual hierarchy. Castes belonging to the lower ranks in the ritual hierarchy, which tried to emulate the behaviour of the upper castes—by abandoning practices such as widow remarriage and non-vegetarianism, and by adopting teetotalism and other Sanskritic rituals—were seen to follow a path of Sanskritization. It was fieldwork that led Srinivas to suggest that castes should not be seen as unchanging entities. A change in the location of any caste in the hierarchy is a phenomenon to be reckoned with. Such change, according to him, was a consequence of changes in the internal behavioural patterns of a caste. Changes in the caste units led to changes in the pattern of inter-caste relations, or the caste system.
Srinivas argued that ‘the caste system is far from a rigid system in which the position of each component caste is fixed for all time’. Various pre-Aryan and non-Aryan people adopted practices of the Vedic Aryans, which changed their outlook and ways of life. In fact, the growth in the number of castes can be attributed to the continuous acceptance and assimilation of outsider groups into the Hindu fold. While these groups were accepted as Hindus, they retained their group identities determined by their birth. Acceptance also signified their location in the caste hierarchy, which, however, remained ambiguous and became fuzzier with the continued increase in the number of new groups joining the system. Most of these groups were placed in the middle rungs of the hierarchy, thus creating a horizontal stretch and disturbing the simple verticality of the original four Varnas. It is the change in location brought about by castes striving to move upwards in the local caste hierarchy that characterizes the process of Sanskritization.
In his 1966 lecture delivered at Delhi University, Srinivas defined the concept as:
… the process by which a ‘low’ caste or tribe or other group takes over the customs, ritual, beliefs, ideology and style of life of a high, and in particular, a ‘twice-born’ (dwija) caste. The Sanskritization of a group has usually the effect of improving its position in the local caste hierarchy. It usually presupposes either an improvement in the economic or political position of the group concerned, or a higher group self-consciousness resulting from its contact with a source of the ‘Great Tradition’ of Hinduism such as a pilgrim-centre or monastery or proselytizing sect.
This formulation challenged the prevalent view that castes are static and unchanging. This observation was supported by a large number of village studies (for example, Atal, 1979; Bailey, 1957) carried out in different parts of India. These studies indicated changes that were taking place both within castes and in inter-caste relations—in caste as a unit, and caste as a system. Such a distinction was not made in earlier treatises on caste.
Castes are constantly changing in number, composition, and vocational profile. The real castes of today do not conform closely to the descriptions or prescriptions of the old scriptures. Moreover, while castes are found all over India, the real functioning units operate at the regional level. Additionally, groups that moved out of the Hindu fold to other religions through conversion carried their caste with them, and continue to be identified with it. That is why in India a demand has been raised for the inclusion of ‘Dalit castes’ belonging to other religions in the ‘Scheduled Caste’ or ‘Backward Caste’ categories.
Scholars also dwelt on the extensions of a village that joined it with the indigenous civilization of India. This approach considered the village as part of a wider system and not as an isolated whole, although it could be isolated as a community. The village was described both in terms of its unity and its extensions. In this regard, McKim Marriott made a significant contribution through his seminal article published in Village India. Following Robert Redfield, Marriott tried to examine the linkages between the Great Tradition and the Little Traditions. He presented outlines of the twin processes of Universalization and Parochialization, through which the great and little traditions interacted and enriched themselves. Marriott defined Universalization as the upward evolution of local/parochial traditions—rituals, festivals, deities, etc.—and Parochialization as the downward devolution of elements of the Great Tradition (see Marriott, 1955). This interplay contributed to modifications in the profile of both the Great and the Little Traditions.
The operation of these processes of change was facilitated through improved means of transportation and communication. Carriers of the Great Tradition moved from one part of the country to another and spread its elements—the stories, rituals and value systems of Hinduism. These were accommodated within the local and regional cultures with appropriate modifications as part of the process of parochialization. Similarly, many of the elements of regional cultures travelled with migrants and gained acceptance among people of the other regions where the migrants settled.
This occurred not only in the realm of rituals and religious practices, but also with regard to culinary habits and dressing patterns. Idli and Dosa from the south, Tandoori dishes from Punjab, Rasgulla and other sweets from Bengal, Ladoo from Uttar Pradesh and Rajasthan, and Dhokla from Gujarat, for example, have become part of an all-India cuisine. The same can be seen with regard to styles of dressing among both men and women. Of course, in matters of dress, outside influences are quite prominent as well. But one finds a curious accommodation of the two: on formal and ceremonial occasions, people—both men and women—wear traditional clothes; on regular days and at workplaces, they wear Western clothes. The denim culture has spread far and wide among both men and women.
The most important feature of post-independence India was the initiation of programmes of ‘directed culture change’ in order to remove poverty and improve agriculture; naturally, the programme of directed change was focused on rural India, where the majority still lives. In October 1952, the Government of India launched a massive Programme of Community Development to ameliorate conditions in Village India. The Programme was designed to introduce changes in agriculture, rural infrastructure, and community life with the involvement of the local populace. The government felt the need to evaluate the on-going programmes of change and assess their impact. This provided an additional impetus to social scientists to shift their focus to change.
The programmes of directed culture change in developing countries provided a fresh impetus to study other aspects of change, namely planning strategies of change, analysing the failure of a programme and obstacles to change, and the role of various sectors of society in promoting change.
Studies in this field began with the analysis of the acceptance or rejection of an innovation. Some of the insights provided by research in the West, particularly that related to technological innovation, came in handy. A large number of village studies have been carried out by sociologists, social anthropologists, economists, psychologists, political scientists, and public administration specialists on various aspects of directed change under the massive Community Development Programme (CDP). The researchers produced detailed case studies of programmes of action and of the introduction of innovations in the fields of agriculture, health and education. Following the evaluation of the Community Development Programme, and on the recommendations of the Balwant Rai Mehta Committee, the Government of India introduced Panchayati Raj to initiate a process of democratic decentralization, which would involve local communities in decision-making and programme implementation.
The Community Development Programme was centrally planned, based on the Etawah Pilot Project initiated by an American, Albert Mayer. This served as the prototype for the country-wide programme. It was based on the assumption that the village should be treated as a ‘unit in neediness’, with complete disregard for the economic and social disparities prevalent in the village social structure. This approach to change was disputed by other scholars, who believed that disparities of status and power hamstring community activities, and that thorough-going structural changes should therefore precede programmes of rural development. Kusum Nair wrote a fascinating book after travelling to all parts of the country and personally witnessing the activities of the CDP for almost a year. She highlighted the absence of the human element in the Programme. Her book was aptly titled Blossoms in the Dust. She said: ‘… in planning for the farming community it is apparent that there cannot be any economics in isolation from sociology and social psychology.’ She made the following points:
- The relationship between purely economic factors and cultural conditions cannot be ignored or excluded from planning.
- While differences in physical conditions (for example the climate, man/land ratio, soil, irrigation) must be taken into consideration while framing schemes to increase agricultural production, of equal importance is the community’s attitude to work. Despite every material resource and better technology, a programme may fail to achieve its goals if the community is not motivated to initiate change. A high-yielding variety of grain with better nutritional value may be rejected because of its taste or appearance.
- In the absence of common valuations, a uniform response to common incentives and stimuli cannot be expected. It is wrong to assume that ‘given equal opportunity, financial incentives and resources, all communities will respond similarly, in their productive efforts and economic achievements’.
Kusum Nair concluded: ‘Development will not become a self-generating process with its own momentum unless the value system of the community, and the social structure containing it, are first altered and adjusted to be in harmony with the socio-economic objectives of planning.’
Nair’s conclusion turned out to be only partly prophetic, as it overstated the case. There were others—such as S. C. Dube (1958)—who advocated a middle path. Dube studied the functioning of community development in a block in the Rankhandi-Jaberan area of Uttar Pradesh, and provided ample evidence of the barriers to change latent in the traditional social structure, value orientations and cultural norms. But he was not in favour of dismissing the programme, as he believed that, despite its weaknesses and drawbacks, it had set in motion new trends that favoured planned development. It provided an alternative way of life, raised people’s aspirations to an extent, and also made a dent in the intra-community power structure.
However, the Community Development Programme died a premature death. In the following years, the Government of India, through its Five-Year Plans, continuously revised the agenda and initiated other programmes for rural development. Another programme, named Antyodaya (meaning ‘rise of the last’), received wide publicity. In recent years, there has been a nation-wide programme under the Mahatma Gandhi National Rural Employment Guarantee Act, or MGNREGA. Similarly, there is the National Rural Health Mission (NRHM), which was inaugurated in 2005.
This is not the place for a detailed discussion and analysis of all such programmes and their performances. They should be taken as illustrations of government-sponsored programmes of directed change. Empirical studies of such programmes in select areas, carried out by social scientists, have contributed to the understanding of the processes of change.
Sociological research grew in India in response to such initiatives and the changes that followed. From village studies, scholars moved to study towns and cities, and the processes of urbanization and industrialization. These studies paved the way for research into political participation (particularly voting, and social movements), and of the mass media of communication.
All these studies were directly or indirectly influenced by theories of modernization. Although most of them used the concepts and hypotheses associated with modernization theories, some came out with formulations that challenged established assumptions. For example, the tradition-modernity continuum was seriously questioned. Researchers discovered modernity in tradition, and tradition (Western religious values, for example) in prescriptions of modernity. In the heat of the freedom struggle, native leaders had picked up some traditions—values and institutions—and held them responsible for underdevelopment and the perpetuation of foreign rule. Research in independent India, however, led scholars to announce that traditions have some intrinsic value which makes them resilient, refusing to oblige those who were writing their obituaries. The concept of the Panchayat, for example, was picked up from the Indian social structure and made part of local-level administration.
From Modernity to Post-modernity
We have dwelt at length on the concept of modernization. The new trend is to think about other processes, namely globalization and post-modernity. We have also suggested the difference between modernization and globalization. It will now be useful to provide a capsule summary of what post-modernity is.
Peter L. Berger, in Facing Up to Modernity (1977), identified the following four characteristics of modernity:
- The decline of small, traditional communities.
- The expansion of personal choice.
- Increasing social diversity.
- Future orientation and growing awareness of time.
Implicit in all these elements, as in other formulations of modernization, is a movement between two polar types, in which the point of origin is regarded as simple, less developed, and tradition-bound. Whether we talk of Tönnies or Weber or Durkheim or Marx, we find in all of them a suggestion of such a movement of societies, from simpler forms to complex entities.
The development strategies adopted by newly independent countries were also built on the same premise. The implicit idea was that developing societies of the world would also become mass societies, in which personal kinship ties would weaken and individualism erode the spirit of communitarianism. Modernization was seen to pave the way for a homogenized world, destroying cultural diversity. All societies were supposed to move in the same direction, giving up their traditions and adopting a world culture.
However, there were doubts expressed about the emergence of such a scenario. These doubts were further reinforced by the manner in which developing societies confronted the winds of change coming from abroad. They refused to be swept away. Cultures demonstrated their resilience and selectively accepted outside innovations, imparting new meanings and functions to them. It is this behaviour of the so-called traditional cultures that prompted people to talk of post-modernity.
While the use of the term ‘post-modern’ is traced by some adumbrationists to the seventeenth century—when the term was used for anything that departed from what was then called ‘modern’—it was in the 1970s that ‘post-modern’ was first used in the field of architecture, for efforts that tried to combine classical forms with modern pragmatism and scientific engineering. Later, the term came to be used as an acknowledgement of the wholesale failure of modernity.
However, post-modernists were not against modernity. They did not argue for the rejection of modernity in toto; instead, they de-emphasized the wholesale rejection of traditional culture and redefined the emerging profile as a judicious mix of the old and the new. Post-modernism is in a way a rejection of the linear narrative, and it makes a strong case for the co-existence of tradition and modernity, albeit in changing equations. The new need not be built on the ashes of the old.
John J. Macionis, in his Sociology textbook (10th edition, 2006), rightly suggests that post-modernism is still in search of a universally acceptable definition. And yet scholars are in agreement about the following observations:
- In important respects, modernity has failed.
- The bright light of ‘progress’ is fading.
- Science no longer holds the answers.
- Cultural debates are intensifying.
- Social institutions are changing.
Some scholars believe that post-modernity has entered a second phase. The first phase overlapped the end of modernity. At that time, in the Western world, television became the primary source of news, and manufacturing decreased in importance, although trade volumes continued increasing. In 1967–69, the developed world witnessed a major cultural explosion when a series of demonstrations and acts of rebellion were reported; opposition on the part of the youth to prevailing practices, policies and perspectives became quite vocal and violent. This was also the time when feminist ideology began to take shape.
The second phase of post-modernity is characterized by digitality. This is demonstrated by the increasing power of digital means of communication: fax machines, modems, cable and high speed Internet, and the World Wide Web (www). This has led to the creation of a new information society.
Those holding this position argued that the ability to manipulate items of popular culture, the World Wide Web, the use of search engines to index knowledge, and telecommunications were producing a ‘convergence’, which would be marked by the rise of ‘participatory culture’ and the use of media appliances. Pauline Rosenau says that post-modernism ‘rejects epistemological assumptions, refutes methodological conventions, resists knowledge claims, obscures all visions of truth, and dismisses policy recommendations’ (1992: 3). This characterization, in Rosenau’s view, is also applicable to post-structuralism. According to her, the ‘post-modern impact in the fields of anthropology, law, women’s studies, planning, urban studies, geography, sociology, international relations, and political science has been greater than in the case of economics and psychology, where its development is slower’ (ibid.).
Post-modernism is a new academic paradigm.
In its most extreme formulation, post-modernism is revolutionary; it goes to the very core of what constitutes social science and radically dismisses it. In its more moderate proclamations, post-modernism encourages substantive re-definition and innovation. Post-modernism proposes to set itself up outside the modern paradigm, not to judge modernity by its own criteria but rather to contemplate and deconstruct it. Ironically, on occasion this flamboyant approach arrives at conclusions that merely reinforce those already evident in the social sciences …. [And] so post-modernism is not always as entirely original as it first appears (ibid.: 4–5).
What post-modernism does is challenge all-encompassing world views, be it Marxism, Islam, Christianity, feminism, liberal democracy, or secular humanism. They are dismissed as logocentric, transcendental, totalizing meta-narratives that anticipate all questions and provide predetermined answers. ‘All such systems of thought rest on assumptions no more or no less certain than those of witchcraft, astrology, or primitive cults’ (ibid.; see Shweder, 1986).
The debate on post-modernity has two distinct elements that are often confused: (i) the nature of contemporary society, and (ii) the nature of the critique of contemporary society. There are three principal analyses.
- Some theorists downplay the significance and extent of socio-economic changes and emphasize continuity with the past.
- Other theorists analyse the present as a development of the ‘modern’ project into a second, distinct phase that is nevertheless still ‘modernity’, variously called the ‘second’ or ‘risk’ society, ‘late’ or ‘high’ modernity, ‘liquid’ modernity, and the ‘network’ society.
- Then there are those who argue that contemporary society has moved into a literally post-modern phase that is distinct from modernity.
There is no simple definition of post-modernity. In fact, post-modern social science is still in its infancy, and ‘like many incipient paradigms, its overall shape and character is vague, its substantive contribution still shadowy and fragmentary, mixed and uneven’ (Rosenau, 1992: 169).
The so-called ‘Postmodern sociology’ has focused on the emerging scenario of the most industrialized nations in the late twentieth century, which were characterized by the ubiquity of mass media and mass production, the rise of a global economy, and a shift from manufacturing to service economies. The hallmark of post-modernity, according to some, is the rise of consumerism, where social connectedness and community feeling have become rare. All of this is ascribed to more rapid transportation, wider networks of communication, and the ability to abandon standardization of mass production, leading to a system that values a wider range of capital and allows value to be stored in a greater variety of forms. As indicators of post-modernity, scholars talk of the dominance of television and popular culture, increased accessibility of information and mass telecommunications, environmentalism, civil rights, feminism, multiculturalism, and the growing anti-war movement. Post-modernists express the hope that present trends will mark the end of individualism and lead to a rebirth of the tribal era.
The important point to note is that most of these developments relate to societies that claim to be developed and modernized. Crisis has engulfed the developed world, and thus there is a need to re-examine the strategies copied from the North for emulation in developing societies.
The crisis in the West runs so deep that Morris Berman came out with a book in 2000 with the suggestive title The Twilight of American Culture. The author made a prophetic statement regarding Western decline, caused through a process of ‘dumbing down’.
Berman has provided ample evidence of the collapse of American intelligence. He made the startling revelation that the number of people reading a daily newspaper in the US has halved since 1965. Many people do not even open the bulkier dailies they buy, and deposit them in wayside dustbins. For an outsider visiting the country, there is very little by way of news in newspapers; the only purpose they serve is to tell the reader about places where goods are on sale. The communication media in the US have insulated people from the wider world. The newspapers carry local news—where there is no North, no East, no West and no South, as indicated in the acronym NEWS.
Berman’s book shows how insulated American children are. This was his experience at the Charter High School for Black Teenagers in Washington D.C.:
… some of them, at age sixteen or seventeen, had never heard of the Atlantic Ocean, did not know what 1999 meant, historically speaking, or thought the Civil War had taken place in the 1960s. One student thought that Washington, D.C. was in the Midwest, and was not able to locate New York, Florida, or Texas on a sketchy map (2000: 11).
Berman does not regard this as mere ‘cultural deprivation; it is cultural massacre’ (ibid.).
To further cement his argument, he quotes other sources and reports that:
- Forty-two per cent of American adults cannot locate Japan on a world map, and according to Garrison Keillor (National Public Radio, 22 March 1997), another survey revealed that nearly 15 per cent couldn’t locate the United States!
- A survey taken in October 1996 revealed that one in ten voters did not know who the Republican or Democratic nominees for President were.
- A 1995 article in The New York Times reported a survey result that said 40 per cent of American adults (this could be upward of 70 million people) ‘did not know that Germany was our enemy in World War II’.
- A Roper Survey (1996) found that 84 per cent of American college seniors couldn’t say who was President at the start of the Korean War (Harry Truman).
- Fifty-eight per cent of American high school seniors cannot understand a newspaper editorial in any newspaper, and a US Department of Education survey of 22,000 students in 1995 revealed that 50 per cent were unaware of the Cold War, and that 60 per cent had no idea of how the United States came into existence (ibid.: 34–35).
Berman feels that all this is evidence of a ‘steady lobotomizing’ of American culture. Such lobotomization, which makes people sluggish, seems to be a real danger not only for American culture, but for every other culture.
A similar situation seems to be developing in India. The Times of India, in its 12 August 2001 edition, carried a front-page report of some of the answers given by ‘men and women armed with postgraduate degrees’ who appeared for an interview for teaching jobs in government schools in Madhya Pradesh. According to the TOI, ‘General Knowledge is not that general’, as is clear from the answers to simple questions about the respondents’ own states and districts. Here is a sample:
Q: Who is the President of India?
A: Digvijay Singh (who was the Chief Minister of the State).
Q: Name some of the major rivers of Madhya Pradesh.
A: Kshipra, Chambal, maybe even Vindhyachal (the last is a mountain, and in Hindi, the suffix Achal means a mountain).
Q: Where is Madhya Pradesh?
A: Er … I think in Bhopal (which is the capital of Madhya Pradesh).
Q: Who is the District Collector of Indore? (the city where interviews were held)
A: Someone whose name sounds like a film star’s. I think it is Suleman Khan. (The actual name was Mohammed Suleman.)
The point that both stories make is that there seems to be a global trend of dumbing down, exposing low levels of general knowledge. Citing titles such as Lewis Lapham’s Waiting for the Barbarians or Robert Kaplan’s An Empire Wilderness, Berman remarks: ‘the documentation that they provide—of crumbling school systems and widespread functional illiteracy, of violent crime and gross economic inequality, of apathy, cynicism, and what might be called “spiritual death”—is quite overwhelming… [The] system has lost its moorings, and, like ancient Rome, is drifting into an increasingly dysfunctional situation’ (2000: 2). Political scientist Benjamin Barber has termed this phase of American society ‘McWorld’—commercial corporate consumerism for its own sake.
Check out any TV ad for Nike or Pepsi and you’ll see that McWorld has tremendous vitality; it appears energetic and upbeat. The problem is that since this vitality celebrates nothing substantive beyond buying and owning things, it itself is the cultural decline …. The United States, as Robert Kaplan suggests, is evolving into a corporate oligarchy that merely wears the trappings of a democracy (ibid.: 3).
There are philosophical dimensions to this debate on post-modernism. However, we need not go into the details of esoteric ideas. We shall also refrain from discussing postmodernism as an ideology.
This brief allusion to the debate on post-modernity is intended to serve as a preface to the emerging area of research on change that focuses on the future.