Society and Higher Education Part 2
So far I have talked about “ideal” or “paradigmatic” conservatives and liberals.
In the United States now there are few of these left. Many conservatives today simply represent the interests of the rich and of corporations, with little care (beyond rhetoric) for working people or poor people. Many liberals claim to support working people and poor people but regularly sell them out for corporate contributions. In the meantime, over the last few decades, the rich have gotten richer, the poor have gotten poorer, and the middle class has remained stagnant in terms of wealth and imperiled in its very existence. In fact, the distribution of wealth in the United States is so bad now that we are in danger of social disruption and societal failure.
Faced with significant problems, we need reasoned and open debate between authentic conservative and liberal positions. We need this debate in order to understand our problems, to discover novel solutions to them, and to implement meaningful change.
Let me take a specific example, namely our schools. People on both the right and the left of the political spectrum believe our schools are broken, unfit as a 21st-century educational system. Neither tinkering nor engineering has worked so far, however. Over the years, many incremental, tinkering changes have been implemented, but the system stays largely the same. A major system-wide engineering change was made when legislation like No Child Left Behind brought in a form of federal control of schools and a regime of testing, accountability, and standardization. This large, system-wide, engineering sort of change was, ironically, championed by both conservative Republicans and liberal Democrats.
The system-wide change has not worked either. Teachers teach to the tests. School systems cheat on the tests and the scoring. Schools fail to teach in areas where there are no tests (like music, art, and civics). Classrooms are filled with skill-and-drill. The drop-out rate increases. Claimed progress on state-wide tests does not show up on reputable tests like NAEP in reading and mathematics (NAEP is given to samples of students). The United States falls further behind other countries on international tests in science and mathematics and in college graduation rates.
Since both tinkering and engineering have failed, what is required is real discussion between competing viewpoints. This discussion would of necessity involve what I will call “big questions”. These are questions like: What is the point of education? Should education be basically vocational and for whom? What is the role today of the liberal arts and humanities? What does society owe children as a right in education? What sorts of education should be left to markets and profit? What is the meaning of a public school and should we retain public schools? What do citizens in the 21st century need to know to participate in their societies? Can digital media transform schooling when other technologies have failed in the past (e.g., radio, television, and computers)? Should everyone go to college and what should college be about? What should equity mean in education?
Such questions cannot be answered by any one single area of academic or political expertise. They require joint collaborative discussion in which people juxtapose, compare, contrast, and integrate competing visions, points of view, types of knowledge, and methods of inquiry. Such big questions are the most crucial questions we face in today’s high-tech, science-driven, high-risk, global world, a world filled with complex and interacting systems. With our schools, and our whole educational system, it is clear that we need to back up and ask these big questions, see if we can gain consensus, and move forward in new directions.
Such questions quickly fall prey to ideology, party politics, and self-interest (usually now defined in the short term) in today’s public sphere. The university could have, should have, and maybe once was a place where such questions could have been asked and debated—and formulated into proposals—without falling totally prey to “politics” in the worst sense of the word. Of course, there has never been any site—certainly not universities—free of ideology and self-interest, but there have been and should be places more free of them than are Congress, corporations, and the media today.
The big question, “What is the purpose of schooling in a society?”, is a sub-question of the yet bigger question, “What is the role of knowledge in a human life?” (remember, Socrates said: “The unexamined life is not worth living,” Apology 38). These are the sorts of questions that have been at the heart of debate in society for centuries. Proposed answers to them have evolved over time and have sedimented into our social institutions and practices. As times change, different proposed answers come to the fore and sometimes change institutions and history.
Another big question is: “What should the relationship in society be between individual freedom, on the one hand, and authority and government, on the other?” St. Augustine argued that humans are basically fallible, weak, and morally depraved if left to their own devices. Thus, they need strong authority to control them. Indeed, for Augustine, even an evil government that controls people is better than the social chaos to which individual freedom would give rise. Augustine thought that even evil governments should be changed only with caution, lest we unleash still worse evil through unconstrained freedom.
Others in the course of Western history (not least French philosophers like Antoine Destutt de Tracy, influenced by the French Revolution) have argued that humans are basically good and moral if they are given good, rich, and nurturing environments. They are warped by institutions, power, and social hierarchies that seek to control and constrain them, much like feet that grow poorly when bound from birth. Thus, the role of government is to ensure nurturing environments for all and then to allow human worth and creativity to unfold in relative freedom.
This debate is eternally relevant to societies, including ours today. It is a question in which all people have a stake. As we humans learn more and invent more, we can give more nuanced answers to this question and build better systems based on those answers. With digital and social media today, questions about the relationship (and balance) between freedom and authority and between individuals and institutions have been reopened with new force and new possibilities for speaking to old problems.
What happens to a society that stops asking such big questions? What happens to a society that no longer leaves any place to ask them that is not prey to ideology, self-interest, profit, and party politics? What happens to a society in which conservatives and liberals will not debate each other, but only themselves, even when both approaches—as in our schools—have already failed? What happens when a society makes policy based not on thoughtful answers to these big questions, but solely on short-term gain, self-interest, or party politics? We may very well find out, and soon.
Universities and churches were two “off market”, non-profit-seeking, long-term-oriented institutions that were thought to be the key places for discussing big questions. But they are often no longer such key places, if they ever really were. In the university today, faculty and administrators too often seek profit, short-term gain, and ideological victory over the pursuit of truth and workable solutions to our shared problems.
Universities have, for some time now, sought to address education in three ways. One way is to offer undergraduate students “mini” versions of what they offer graduate students. Faculty members teach undergraduates a scaled down version of their disciplinary specialty, the specialty in which they more deeply train their graduate students and the specialty to which they devote their writing and research. These latter two tasks are their true priority and the basis on which they reap rewards in the university.
A second approach seeks to offer undergraduates “big ideas” from the history of thought in Western and other civilizations. This is a liberal arts approach. In most cases, the big ideas are cut off from any real world applications or projects. Thus, today, undergraduates often find the ideas “irrelevant”.
A third approach is to make undergraduate education relevant to the future work lives and vocations of students. Indeed, today, the largest major on most campuses is business. And, of course, many community colleges and for-profit colleges engage primarily in vocational education.
There is today a new approach. In this approach a college offers students exciting social interactions (often beer and bodies) and an environment full of amenities (good food and recreation facilities). Academic work is dumbed down and becomes a secondary concern to social interaction. College becomes camp. Ironically, there is a traditional version of this sort of approach at highly prestigious colleges. In such colleges the students often were—and still are sometimes today—offered status and social networking with other privileged students in lieu of any challenging, deep, or relevant education. In our status-driven society, colleges like Harvard and Princeton could water-board their students and still attract a great many applicants.
There is an inherent paradox in the whole notion of college in the modern world. College was, in some sense, meant to be an “elite” institution. It was supposed to be for those who desired it and were “intellectually fit” for it. It was meant to be the educational “big leagues”. Everyone was supposed to have the “right” to play in the educational minor leagues (elementary and high school), but not the educational big leagues.
Of course, in reality, until the end of WWII, colleges and universities were not necessarily filled by the intellectually fit, but by those who had the status and resources to be admitted. Colleges and universities sought out and admitted few working-class students. And, too, far from being the educational big leagues, colleges and universities often pandered to their privileged students, offering them a high-status degree without excessive intellectual demands. Nonetheless, in ideal terms, colleges and universities were meant to be for an “elite” defined in terms of commitment to knowledge and the intellect, not in terms of money and status.
After WWII, a great many working class people did start going to colleges and universities (thanks in part to the GI Bill). Some of them eventually even became faculty members. Eventually, too, society demanded that a college education be made widely (and sometimes freely) available to everyone who wanted a college education. Going to college became an equity “right”, just like elementary school and high school. A great many superb students, who would earlier not have had an opportunity to go to college, did so. A great many underprepared students came as well, many of them the victims of America’s segregated and unequal schools.
An institution cannot both be an “elite” institution for those who have risen to the top and an equity-based institution available to everyone as a matter of social justice. This dilemma was resolved via the status of colleges and universities. The higher status institutions remained elite. Lower status colleges—and community colleges and today’s for-profit colleges—rolled out the welcome mat to all.
Today, there is another paradox at the heart of colleges and universities. Our society has decided to make college a goal for all who want it. We have decided that college is a matter of social justice, especially since college graduates earn significantly more than do high school graduates across their lifetimes. For some time we backed up this goal with public colleges and universities that were free or inexpensive. I myself (a baby-boomer and the first person in my family to go to college) went to the University of California (Santa Barbara) free. I also went to graduate school at Stanford University on state taxpayer funds (coupled with a fellowship from Stanford). Thus, though I did not come from the middle class, I earned a BA, MA, and PhD without any debt whatsoever.
Today even public colleges—let alone private colleges—are expensive enough that many poorer students cannot go to college, even though we tell them they should get a college education as a matter of social justice and economic growth in the United States. In other words, we lie to them. Many other poorer students—and a great many middle class ones—leave college with mountains of debt.
This dilemma means that today, in many states, we seek forms of alternative cut-rate college education for poorer students (via distance learning and off-campus extension programs). These alternative forms usually amount to four more years of high school, at best. Indeed, in the face of many underprepared students, and of prepared students who resist traditional college work, many colleges and universities today are little more than a faux high school that students attend after an earlier, genuinely bad high school. Often a majority of the faculty are part-time, adjunct, non-tenure-track faculty with teaching loads as heavy as or heavier than those of high-school teachers.
The final paradox at the heart of colleges and universities is that what were meant to be “off market” institutions are now fast becoming market-driven institutions. When I first became an academic, there was much less emphasis on “making money”. Public subsidies and support for colleges and universities were much larger. Colleges and universities were somewhat insulated from market forces.
Today, colleges and universities are face-to-face with the market. They have to make money on tuition, new and expanded programs, grants, and gifts. Academic work and degrees are dumbed down and sometimes sold like indulgences in the Medieval Church. Deans, provosts, and college presidents spend most of their time raising and worrying about money. Faculty are pressured to raise money in any way they can. Areas that do not receive much grant money (e.g. disciplines in the Humanities) are ignored by administrators and students alike. There is a push for research that leads to money in the short run, not research that leads to knowledge in the long run.
For proponents of free markets this all seems good. Why not let the market decide which academic areas, research, and faculty should survive (because they make money) and which should not (because they do not)? Why should any college keep around money-losing fields or faculty whose research cannot garner grants?
The answer is—or has been in the past, at least—the same answer as to why we should keep biological diversity around even if we cannot make money on small owls and rare snakes. Diversity—including the stuff that seems useless—is a storehouse of possibilities for the future. We cannot know now (in the short run) what ideas or species may turn out to be crucial in the future, in the long run.
When Mendel played around with his pea plants in the monastery garden (because he had failed the state test to teach science), no sane institution at the time would have given him a grant. Years later, people came to realize that, in that garden, he had invented the basis of all of modern biology. Mendel was the first person in history to understand genetics. He was the only person who understood genetics in the 19th century (and this includes Darwin).
The theory was that, in colleges and universities, we should not drive out the Mendels just because they are not good for the bottom line. We should not be too quick to dismiss ideas and research as “useless” and “ridiculous”, because too often useless and ridiculous ideas have, in the long run, turned out to be important. But most do not, and this attitude means paying for and keeping around a good many crazy or useless ideas, research projects, and faculty. It is not cost effective in the short run by any means.
Phonology, the study of sound and sound systems in languages, is by no means a popular subject or one that leads to much grant money. During WWII the U.S. government brought a number of academics out of Germany and the German-occupied countries and, having nothing for them to do, let them lecture to each other at the New School in New York. Roman Jakobson, a Slavic linguist, gave lectures on phonology, a subject that any penny-pinching administrator would get rid of today.
Claude Levi-Strauss, a young French anthropologist, was inspired by these lectures to rethink his whole approach to anthropology and, in the act, created Structuralism, a major intellectual movement in the 20th century. Levi-Strauss invented the basis of modern anthropology. Piaget, inspired by Structuralism, invented the basis of modern psychology and child development. Structuralism spread to a great many other disciplines. Of course, today we have progressed beyond Structuralism in many ways, but a great deal of modern knowledge would be quite different today had Levi-Strauss not listened intently to Jakobson’s lectures on phonology.
In a world inspired by short-term gain and a plethora of business majors, few undergraduates will ever be in danger of hearing lectures on anything as boring and “useless” as phonology. Phonology is not relevant to much in the world unless students, like Levi-Strauss, bring creative minds to phonology lectures and sometimes see analogies and metaphors that lead to new ideas and even new areas of knowledge.
No one can tell a student what will for sure be relevant or irrelevant, important or unimportant, in the future, the future the student will live in. Students, on yesterday’s model of colleges and universities, were expected to expose themselves to various ideas and influences and take the risk of being bored or wasting their time in search of what would eventually inspire them, make them think deep thoughts, and make them deep people. Since no one can tell what is relevant or irrelevant, important or unimportant in the long run, markets cannot do so. They can, at best, tell us what is working in the short run. In a short-run culture, that is enough. It may not be enough, though, for the survival of human society.
No criticism of colleges has been more common over the last decade than diatribes against “the sage on the stage”. This criticism says that college teachers, in the guise of an expert or sage, lecture at students. In the act, they give them information that was once the professor’s preserve alone, but today is readily available via the Internet to anyone. So, on this view, the professor needs to stop lecturing (“pontificating”), since what the professor knows is now knowable by anyone without the effort of earning a PhD.
This criticism, though common, shows a lamentable lack of understanding about what college teaching, in the form of lectures, was meant to be about. It is hard to reform something if you never understood it in the first place. The criticism is lamentable, as well, because it grossly over-rates how useful information is in and of itself, however one gets it.
There were two reasons to give lectures. One was to give students information (“facts”) garnered from research in academic disciplines. If this was all a lecture was—and, indeed, that is all many an undergraduate lecture was—then the above criticism is fair, but rather pointless. It is pointless because this type of lecture was never worth much to begin with. It did not take the Internet to make it useless. Information from an academic discipline, given to people who will never be in it, has always been useless, because it is inert and largely irrelevant to most people.
Let me be clear here. There are probably facts that everyone needs to know. For example, people in the United States should surely know what decade the Civil War took place in (1860s). But no one, even prior to the Internet, needed a professional academic historian to tell them this. And professional academic history was never about the collection and dissemination of such dates anyway.
Facts all by themselves are largely irrelevant. They are good for creating a sense of shared cultural knowledge in a society or social group, and this is not a bad thing by any means. But even knowing what decade the Civil War occurred in is of no great use unless one knows how it connects to other facts to gain some degree of significance. For me, for instance, a person who lived in the 1960s, the fact that the Civil War had ended a hundred years earlier and that African-Americans were only really beginning to gain their rights as citizens in the 1960s was mind-blowing. And, today, the fact that we are still living out aspects of the Civil War in our current political divisions between North and South is also mind-blowing. It makes me meditate on what a hundred years means or doesn’t in human history. It makes me meditate, too, on what it means to say the Civil War ended when we say it did, and marvel at how historical events live on in the present in many different ways.
Now it is true that one can easily look up on the Internet when the Civil War happened and when it officially ended (1865). It is also possible to find a number of “stories” (like the one I just told) that connect these facts to others to give them some significance and to give one something important on which to meditate. However, one still has to know which stories are accurate and, perhaps even more importantly, develop some reason to care about and some desire to meditate on issues of importance. This, too, can be done on the Internet (and off it, of course), but takes much more effort. One can, for instance, become an active participant in an interest-driven community on the Internet devoted to the Civil War, Civil Rights, the North or the South, or any of many other interests related to the Civil War. Such interest-driven groups will often contain real expertise, but the experts on the site may well not be professional academics, but passionate amateurs—and that is all to the good.
A second reason to give a lecture had little directly to do with information or facts. The lecture was meant to show how a professional producer of knowledge (an academic) used information and facts, as well as other tools, to solve specific sorts of problems. We say academics (at least those who do research, a minority) “produce knowledge” (I just said it above), but in reality they propose and test solutions to problems or answers to questions. When the solutions appear to work, we say knowledge has been produced; when solutions change, we say knowledge has changed.
The problems that academics attempt to solve can be highly philosophical and abstract (e.g., “What is the nature of knowledge?”). They can be solidly empirical (e.g., “Why are there seasons?”). They can seem to be impractical (“What are the historical origins of the English definite article ‘the’?”) or highly practical (“What causes traffic in cities to back up when there is no accident or other blockage?”).
Academic disciplines attempt to solve problems. In the act, they create things that count as “facts”, at least until new discoveries are made. In turn, they use these facts to solve new problems, create new facts, and then again solve new problems. Academics think the facts they uncover are important and interesting, but they are not in business to repeat them, but to solve new problems (and challenge claims to have solved old ones) by using them as tools for inquiry.
In this regard, academics are like carpenters. A carpenter builds buildings, just as the academic solves problems. The buildings are an outcome of what the carpenter does, just as facts are the outcome of what the academic does. Knowing carpentry involves a great deal more than owning the right tools, just as knowing an academic discipline involves a lot more than knowing (“owning”) a bunch of facts. Knowing carpentry means knowing how to use tools to build good buildings. Knowing an academic discipline means knowing how to use facts, formulas, and technical devices as tools to solve problems of a specific sort.
So a second reason to give a lecture was to give a demonstration of a “master” craftsman at work. Just as we would want to learn carpentry with the help of a master carpenter showing us and explicating for us how to do it, we would want to learn an academic discipline, say linguistics, by having a master linguist show us and explicate for us how to do it.
Now, one will say, “But this too can be done via the Internet. Just record the best academics (or the best carpenters) and let everyone see them demonstrate their stuff”. But this is not remotely the same thing as a live demonstration. In a live lecture—as in a live carpentry lesson—students can ask questions and can be questioned. There can be interactive dialogue. The student can say to the linguist or the carpenter “I didn’t get that, could you show me again, maybe in a different way” or “What about this related problem, does it work the same way?”.
The whole point of lectures, in this second form as demonstration and not information delivery, was to engage with questions (with problems) and show what questions were worth asking, how they might be answered, how one could evaluate different answers, and how one question could give rise to the next. The Internet—and a textbook, as well—cannot ask or answer questions in real-life, moment-by-moment interaction.
In lectures, one of the most important learning moments is when students interrupt to ask a question, offer an answer, or ask for clarification. In my experience, lectures meant as teaching do not work well when this does not happen. Surely a lecture on the Internet from a great linguist is useful and better than no interaction at all. But a lecture in person from that same linguist, where questions can be asked, answered, and demonstrated live is more useful.
People who bemoan the “sage on the stage” are either bemoaning the first type of lecture, lectures as information delivery, or they are bemoaning lectures of the second type, lectures as demonstrations of the craft of problem solving, that are not interactive and responsive. I, too, bemoan both. I bemoaned them before the Internet existed. They were both based on a bad model of what education should be. It was not the Internet that rendered them problematic.
Let’s call lectures that demonstrate problem solving—demonstrate how to proceed in engaging with problem solving in a given academic discipline—and that are interactive and responsive “responsive craft demonstrations”. It would be a poor carpentry teacher indeed who sought to demonstrate skills but would not reply to the students’ questions and confusions. So, too, it is a poor academic teacher who demonstrates how to solve problems, but will not respond to the students’ questions and confusions.
Now an academic giving a responsive craft demonstration is no more a “sage on the stage” than is a father showing his child how to ride a bike. Both are introducing newcomers to how to do something and, then, too, to become something—either a bike rider or a linguist, say.
Responsive craft demonstrations are the heart and soul of training graduate students. Good graduate students do not learn to do linguistics or be a linguist (or any other discipline) by reading books or garnering information alone from classes. Rather, they learn these things by engaging with good linguists who demonstrate—in and out of class—how to do linguistics and be a linguist.
Now we reach the real problem. Such responsive craft demonstrations, while the heart and soul of graduate education, can be problematic for undergraduates. Most undergraduates are not going to become linguists or any other type of academic. Showing them how to solve problems and answer questions in an area that they do not care about—and that may indeed be irrelevant to them—is not a good college education. It is, though, a way of teaching with which good academics are often comfortable.
The only reason to engage in such responsive craft demonstrations with undergraduates would be if the questions being asked or the problems being solved are arguably important to people even if they have no intention of entering the specific academic discipline whose approach to problem solving is being taught. Arguably, of course, there are such questions, perhaps ones like “What are the origins of human beings?”, “How can cross-cultural communication succeed or fail?”, or “What is the role of genetics in one’s life?”. These and many others are probably questions that all people should care about, should know how to think about, and about which all people should understand how knowledge is produced.
The problem is this: For people who do not want to be academics, such important questions are not asked in disciplinary terms, as they are in academic disciplines. Academic disciplines work by a principle of divide and conquer. They take a big question and cut it into bits and let a different discipline or specialty handle different bits. The nature of life is dealt with in philosophy, biology, chemistry, computer science (e.g., “artificial life”), history, archeology, and a number of other disciplines. Each discipline takes a manageable sub-part of the concern the question is about. No one is responsible for putting all the sub-answers back together (or, at least, traditionally there has been no one until the growth of some aspects of complexity theory).
A college education for an undergraduate ought to be about big questions in all or many of their parts, with the answers to the different parts being put back together into a bigger picture. This means that being exposed to one academic specialty won’t do. It won’t do either to expose undergraduates to one specialty after another, since there is no one helping the undergraduate put all the pieces back into some meaningful whole that speaks to the lives we live when we are not being academics and where the problems we face do not come at us in nice manageable chunks.
So undergraduate education should be—and should have been—about big questions that cross the borders of academic disciplines, while being informed by the best resources of those disciplines, as well as some voice or voices that can put things back together again and gain a real purchase on the big picture—on the forest and not just the trees. Such an education should lead to helping people spend their lives thinking intelligently about big questions, not accepting answers in the present once and for all.
It is hard for colleges and universities to do this. In a research-based university, the default will be teaching specialized knowledge fit mostly for graduate students. In a teaching college, there may be too few faculty with active research lives who can inform proposed answers to big questions with the current knowledge and tools of inquiry of academic disciplines, however much this knowledge must be aggregated into a bigger picture. But, of course, at their best, colleges and universities do have faculty who have done an admirable job of involving their students in a life-long quest to think about big questions. Perhaps too few, though. And, today, under the pressure to make money, cut costs, and be relevant in the short term, such teaching may go the way of the dinosaurs in any case.
So, for me, the key issues or questions are these: 1) How should we replace information delivery lectures, which were never that useful and are surely even less useful in the Age of the Internet? 2) How can we eradicate non-responsive lectures of any sort, including craft demonstrations that are not responsive? 3) How can we ensure that graduate students can gain a good responsive craft-based education in an age of short-term profits and diminished demand for academics in many areas? 4) How can we build undergraduate education that is no longer dominated by information delivery and non-responsive lectures? 5) Should we—and how can we—build undergraduate education around big questions and a life-long pursuit of them? 6) Apart from academic disciplines for graduate students and big questions for undergraduates, what else—if anything—should be in the university? 7) Can our current short-term, short-sighted, ideologically driven (spin-driven) society sustain anything like real colleges or universities and not just vocational education institutions or “party schools” for the rich?
Let me close with some remarks about “discussion”, rather than lecturing, as a form of college teaching. Because of the animus against lectures, many reformers have turned to discussion (in large or small groups) as the preferred form of teaching in college. This, of course, misses the fact that a responsive craft demonstration was always a particular type of discussion, though not one between “equals” (any more than a carpentry master class is a discussion among equals).
A discussion among newcomers to big questions or an academic discipline must involve some model of what inquiry about the question or the discipline looks like. Often in college discussion groups this is a “reading”. But the newcomers cannot read this “reading” without some model of how people in the area of inquiry the reading comes from read (and write) in that area. Nor can they ask questions of the reading and get answers back from it. A teacher is still required, and is not useful if, in the guise of equality or of empowering students, he or she will not talk.
A discussion, if it is to be useful, must be a three-way dialogue among students, a text, and a teacher who can model how to raise, approach, and sometimes solve problems in a given area. In this case, the discussion is a form of what I called above a responsive craft demonstration, though one that rightly puts more emphasis on student-to-student interaction. Many good responsive craft demonstrations (“lectures”) did this in any case. They sought to move from teacher talk to dialogue with students as a group.
Discussions in which newcomers interact with each other and a text written in an area in which none of them are “experts” might be useful in some areas, especially if the text is written about an issue on which many people do have some background knowledge (e.g., a novel about love, an argument about whether relationships should take work or not, or an argument about current affairs). Discussions can also be useful when students have gained some shared knowledge and are, thus, no longer newcomers to a given area of inquiry or important question.
Discussions in which students share little knowledge, are newcomers, and have no useful modeling from a teacher who is not a newcomer are much like trying to learn French with no French speakers in the room and only a textbook or a recording of French. Such discussions are an evasion of teaching. In most cases, they are useless, no matter how “modern” or “progressive” they are held to be. They are a cheap and cost-effective way (given that the teacher, if present at all, need not know much beyond picking the textbook or other sorts of texts) to give people something that is an education in name only.
In the end, though it is not politically correct to say it, education—indeed most learning in or out of school—is not based on equality. Some years ago I became a video gamer. I would not be a gamer today if I had not, at the beginning, realized I needed to be guided by people who knew more about games than I did and could model for me how to play games and how to be and become a gamer.
Now it so happens that one of the masters I had to pay attention to in learning to game was my own young son. But I never thought that learning to be a gamer was a matter of equality, nor did I want primarily to discuss games with other non-gamers or people as poor at gaming as I was at the beginning, however much I did want to learn with them, practice with them, and gain their mutual support. I did not want a “sage on the stage” to help me become a gamer. But I did want a “sage”—in fact a number of “sages”—in the game and the gaming world. Most of these sages were decades younger than me. So what?