Society and Higher Education Part 3
Almost everyone—regardless of political ideology—agrees that college in the United States is broken.
At research universities, undergraduate education is often shortchanged in favor of graduate education and faculty research agendas. Every tenured or tenure-track faculty member at a research university knows well that status and rewards are tied to research and publication, not undergraduate teaching. Undergraduates at many colleges and universities are turned off by what they feel to be irrelevant lectures and an education tied to academic information more germane to graduate students. And, too, in the pursuit of students and tuition, college has in many places become dumbed down and more and more expensive.
The pursuit of money and short-term gain, leading to an intense competition for students and ever higher prices, is only one of the problems facing higher education. Another problem is that learning out of school—starting with very young children and extending to adults—is today organized in radically different ways than is learning in schools and colleges. Yet this out-of-school learning appears to be more popular, effective, equitable, and even more profitable. While it is true that this boom in out-of-school learning has been fueled by digital and social media, it is fueled also by a different theory of learning and teaching.
I will have more to say about today’s out-of-school learning later. Let me here give just one example. In the video game Civilization players have to think about how history works in terms of why things happened as they did and how they could have happened differently. They have to solve historical, geographical, political, and military problems for a civilization they build over long periods of historical time. They have available to them an encyclopedia of information about history and cultures across time. They can, if they wish, use software to modify the game and make up their own historical challenges (e.g., succeed across time without warfare). This is a far cry from a textbook of inert facts or a lecture full of dates and big names.
Colleges and universities have sought to tie into this out-of-school learning boom by offering more e-learning courses and programs. But in the vast majority of cases, this e-learning is just a digital version of what already goes on in higher education. It is recorded lectures, lots of texts, talking heads, and paper-and-pencil tests. Furthermore, such e-learning is often dumbed down even further than are regular college courses. Discussion usually isn't real-time or face-to-face, but a long record of text. In no way does this form of learning match the way the best learning works out of school today, either in theory or in practice. It is just a ruse to make money more effectively.
Let me give one example of how non-innovative much e-learning is. Outside of school, for video games and other digital media, there have been years of intense research on how to make good interfaces between the user and the software. These interfaces (think, for example, of a good interface in a real-time-strategy game like Rise of Nations or Age of Empires) are meant to be user-friendly, inviting, and customizable, and to facilitate effective and motivating communication and problem solving. In much school-based e-learning today, the interface is no better than opening a book, attending a lecture, or reading a long, rambling, and unedited discussion (with perhaps a few pictures or links).
Why has digital learning outside of school spent so much time and effort on good interfaces, while school e-learning has not? It is because, outside of school, people are actually motivated to learn and will not pay for learning that does not work well. However, in colleges and universities today, students are often paying for a credential and not for any learning they care much about or that works all that well for them. This may be a sustainable model when the credential is highly prestigious (as at Harvard or Yale). It is not a sustainable model when the credential is not particularly prestigious and when everyone knows the credential does not prepare students for (or correlate with) success at work or in the world.
One cannot reform higher education—nor talk about how it can face the challenges from out-of-school learning—without knowing what the purpose of higher education is. I think we can all agree that that purpose should NOT be to charge unmotivated undergraduates premium prices to be trained as mini-graduate students in academic disciplines they will never be in. Beyond that there is little agreement.
One cannot reform higher education either without knowing what the purposes and accomplishments of earlier stages of education should be. Too often today colleges and universities are just offering undergraduates another shot at a high school education to make up for the poor high school education they already received. But what should the purpose of higher education be if high schools actually achieved some real purpose of their own, and what should that purpose be?
We can ask the same question of high schools: What should high schools be if elementary schools did a good job at some real purpose we all accepted and high schools did not have to remedy or repeat bad earlier education, as they so often do today? We can ask the same question, too, of elementary schools: What should elementary education be if families did a good job at getting their children ready for school (which many families do not, in fact, do)? Finally, of course, we have to ask how we can help each level do well what it is supposed to do (once we determine what that is). I will take up this topic a good deal more later.
A discussion of the goals of education can quickly get highly general and boring, though such discussions do capture people’s competing and often conflicting value systems about life and society. I will try to keep this discussion as specific as I can.
Many people see a major divide between two possible goals for college. Some want college to prepare students for work. They want to see a vocational slant to college. Of course, professional schools like medicine, law, engineering, and business already have this slant (and business is the most popular major in U.S. colleges and universities). People who advocate for this focus point to the importance of a college education for upward mobility and the acquisition of a good job.
The vocational focus has a problem, however. First, in the United States and other developed countries, as well as in some emerging economies like China, there are too many college graduates for the sorts of jobs that require them. Furthermore, many jobs in the United States that say they require a college degree do not really need such a requirement. Companies simply use the requirement to ensure that they at least get the equivalent of a well-educated high school student. Making high schools better would lessen the need for college for this purpose. If we want people well prepared for work—whether high-status or low-status work—it would be better to skip college as we know it and simply have them attend vocational and professional schools that make no pretense to be anything else. There is no reason to call these colleges or universities.
Modern colleges were designed to give people a “higher education”, that is, an education that made people well versed in the historical accomplishments of civilization. They were designed to make people well versed in things like science, mathematics, philosophy, art, literature, and history, that is, in the “liberal arts”. The “liberal arts” are “liberal” precisely because they are not primarily focused on one’s work life, but on understanding the world and leading a life worth living or, at least, thinking about what makes a life worth living.
Universities were designed with this liberal arts goal as well, but also with the goal of engaging in research that leads to discovery, knowledge, and the improvements in society to which these can lead. In universities, of course, there is tension between the knowledge consumption goal for undergraduates and the knowledge production goal for graduate students and professors.
The point of this discussion is this: if the goal of school after high school is job training (whether for being a hairdresser or a lawyer), there is no reason to mix this with college. If this is all we want, we could just as well get rid of colleges as we know them and set up a myriad of well-designed vocational and professional schools. And, in fact, for the vast majority of work there is no need for any degree past high school (if the high school is any good), since the most productive job training, in most cases, is done on the job and most jobs do not, in reality, require a college degree.
So the real question about colleges is this: Do we want a “liberal arts” education for some people and, if so, for which people? The real question about universities is this: Do we want an institution that engages in research and undergraduate education at the same time? If not, then we should just have research institutions that engage in research and apprentice new researchers. There would be no need to call these institutions colleges or universities. They might just as well be called “research centers” or “think tanks”.
The debate about colleges and universities, in my view, is really a debate about whether society should continue to sponsor institutions that offer liberal arts education not focused on jobs or job training and, if so, who society should sponsor such an education for. Once one argues that colleges should be vocational (or if they just become fun camps for young people with money), then we are actually talking about something that need not be called a college at all. If we continue to call such things “college”, we have simply changed the meaning of the word.
There is a deeper question and problem here. If we replace college with vocational training (again, whether for hairdressers or lawyers), then we are in danger of leaving college as a liberal arts institution only to the rich, who do not need to worry about jobs too early. This is, in a sense, where we started. Until the GI Bill after WWII and the 1960s, college was largely the preserve of elites. We must, at the least, ask if this is the state to which we want to return.
In the end, then, I will simply set aside college as focused on vocational education. There is no reason why society cannot set up all sorts of vocational training at all sorts of levels and there is no reason to call any of this college. When I talk about goals for college and university education, I will be focusing on goals that are not directly related to job training of any sort. If someone thinks that no level of education should be devoted to learning that is not job related, then he or she should simply propose to close colleges or leave them to those rich enough to pay a premium price for “wasting their time”.
One way to think about the goals of college is to ask this question: Are there important things that everyone (or some set of people) need to know that they cannot have gotten in their K-12 schooling and that is not a form of job training? If you answer “yes” to this question, then whatever you take these things to be, that is your goal for college and undergraduate education in universities. If your answer is “no”, then you see no need for college as I am using the term (for at least some people).
We can readily see from this question that it is not answerable without also thinking about the goals of K-12 education. The goals of college are whatever important (non-job-training) learning K-12 has not accomplished. These goals should not be based on assuming earlier levels of schooling will do a bad job and so colleges need to make up for this situation. Colleges should not be remedial institutions making up for the sins of K-12 education. If we need such institutions, let’s just call them what they are: institutions for re-taking earlier levels of schooling that, for whatever reason, failed the first time (and this is something that could be handled well by good e-learning). Again, there is no need to call these institutions colleges.
Of course, there is lots of academic content that high schools cannot cover in the time they have. Some might say this “spill over” can be left for college. But if this spill over is just additional content in an academic discipline the student will never be in, then there is really no reason for an institution to teach it. No high school course covers the treatment of “parasitic gaps” in theoretical syntax (a branch of linguistics). But no one who is not going to be a linguist needs to know this in any case.
Historically there are three ways to claim truth. One is authority. Prior to the Renaissance, for centuries, truth was determined by authority. Things were true because an authority figure said they were. This authority figure might be an ancient thinker (e.g., Plato, Aristotle, or Galen), a priest or bishop, a king or an aristocrat, or a military or government authority. Authorities were often claimed to be smarter than others or to have some special access to truth (or God). Authority often implied force, in the sense that some institution or another (church or state) could and would enforce what authority said was true.
A second historical way to claim truth is ideology (what many today call “spin”). People have for all of human history sought to make claims that certain things were true because they wished they were or because they would be advantaged if they were. These people have engaged in lies, distortion, and self-deception to attempt to convince others (and sometimes themselves) that the world is the way they want it or need it to be. Often they make up convincing stories or sham arguments to engage in persuasion. Often such people represent not just themselves, but groups, cultures, nations, institutions, or causes. Often, too, they claim to have or to be related to some authority. Ideology and authority as claims to truth have often supported each other.
A third way to claim truth is observation. From the beginning humans have been able to test whether something is true (or “works”) by observation of the world and of other humans. For example, humans discovered a great many things about food safety, animal behavior, the natural world, and human psychology long, long ago. Such observation was often not just individual, but a matter of groups collaborating and sharing. In history, authority and ideology have mostly trumped observation when it comes to power, but observation has often trumped authority and ideology when it comes to survival (eating poisonous food does not work just because authority or ideology claims it does).
Centuries ago the Greek doctor Galen (born 131 AD) claimed that women, in mind and body, were just weaker versions of men. Men’s bodies and minds contained a type of “heat” related to the light of the stars. Women’s minds and bodies did not contain as much of this heat. This special heat shaped ideas and, as a result, women’s ideas were not “well formed”, while men’s were. Thus, men were smarter than women and fit to rule, while women were not. Galen derived these ideas from mistaken observations about the body (in an age before scientific anatomy) and from ideology (in Greece men were in charge and women were not).
For centuries what Galen said was claimed to be true based on his “authority” as an “ancient”. Furthermore, societies down through the ages were dominated by men and these men had self-interested reasons to believe and argue for Galen’s claims. His claims resonated with the ideological teachings of churches and governments.
Presumably for centuries some people—especially women—knew from their own observations that women did not have less well-formed ideas than men. Galen also claimed that women ejaculated sperm during intercourse and we can bet most women knew this was not true. But, in most cases, authority and ideology trumped observation, especially observation by the less powerful.
The Renaissance and the Enlightenment gave rise to a radical new approach to truth. Thinkers began to claim that truth should be determined not by authority or ideology, nor by everyday observations. They claimed that truth is to be found through what we might call “disciplined observation”.
Disciplined observation is like a game with rules. Claims must be made on the basis of observations of the world, not authority or ideology. These observations must be repeatable, accessible to others, and carefully managed to control for mistakes (managed via tools, technologies, and collaboration, for example). Repeated, accessible, carefully managed observations constitute evidence for claims. Evidence must be shared and made public. Others must be allowed to check the evidence for accuracy and consistency and to make the observations themselves. Others must be allowed to critique the claims and to try to falsify them. Someone who makes a claim must publicly acknowledge the evidence, how much evidence there is for the claim, and the quality of the evidence for the claim.
Disciplined observation is just a high-octane version of everyday observation. It is the basis of science, knowledge production, and discovery. But it can be the basis of art as well. Science and art are both forms of discovery. The artist claims to have insight into the world or human experience based on deep, repeatable, and accessible observations of the world or life that are publicly shared and open to critique. Artists need not claim their insights are based on outside authority (e.g., God) or ideology, and, if they do, their art does not fall into the category of what I am calling disciplined observation (any more than does “science” based on authority or ideology, which is not, of course, really science). Artists also often use tools to discipline observation, tools like paint, musical systems, and cameras.
Artists sometimes make claims about what is true. For example, Tolstoy’s War and Peace has a lot to say about how war and peace work. To give one example: Tolstoy sees frontline soldiers as the cause of victory or defeat in war based on the flow and ebb of events on the front lines, not the big plans and strategies of generals. A good general, for him, simply takes the credit or blame for what he knows happens largely outside his control, making people feel there is some larger workable plan. This serves, at least for me, as a metaphor for politics, as well as for management in many workplaces. This way of looking at the world can be compared and contrasted—tested against—a great many other observations by novelists, social scientists, military historians, and management consultants, as well as soldiers, citizens, and workers.
Often artists make claims about what is aesthetic (what humans find beautiful), emotionally meaningful, inspiring, or an insightful perspective on life or the world. Tolstoy’s insight qualifies in this way as well. So does Emily Dickinson’s poem “My Life Closed Twice”, where she claims that the real endings we suffer in life (“closings”) are the emotional sufferings we undergo from the loss of love, not the death of our physical bodies. A great many others have been struck by the fact that the human body can die only once, however radically you traumatize it, while the human heart or soul can be traumatized (pained) many times, giving us many “deaths”. And, yet, Dickinson claims it is this fact that gives us humans our “heavens” and “hells”, the sorrows and joys of our lives, and not religion.
Claims about what humans can find beautiful, meaningful, inspiring, or insightful for living life are, just like claims in science, claims to truth via disciplined, shared, and accessible observations. Many people have discovered new forms of beauty, meaning, inspiration, and insight from art. But these claims cease to be “disciplined observation” if one defers their assessment to authority (including religion) or ideology. Dickinson’s poem is insightful (“true to life”) not because it’s God’s view or any authority’s view. It is insightful because it is a view of life based on her own observations, observations that resonate with how others have experienced life.
In the United States the word “science” is used only for the physical, social, and natural sciences, and often only for the “hard sciences”. It is not used for literature, music, or art. In some other countries the term is used more widely, for something like what I am calling “disciplined observation”. In any case, I will use the term “disciplined observation” to include science and art and things in between.
At this point, probably, readers of all political stripes are nervous about how I am using the word “truth”. Truth is a much abused word. Almost everyone, regardless of their intellectual or political camp, is skeptical of “truth” when other people claim it, though rarely are they skeptical of their own claims to truth (myself included, of course). And, in a way, this is as it should be. It is to this dilemma that the idea of “disciplined observation” actually speaks. So let’s turn to truth.
Truth has taken a lot of hits from postmodern academics. They claim there is no fixed, eternal big “T” truth. Claims to such are just ideological “master narratives” meant to empower certain sorts of people and institutions. But “disciplined observation” is not about any such sense of the word “truth”. It is about a much more practical and mundane matter: namely, what works.
Humans are overly impressed by their own observations and ideas and liable to believe them when they are, in fact, false. Disciplined observation is the willingness to put one’s observations and ideas to the test of other people’s observations and ideas. These people often compose a group that develops tools and methods to test, critique, and debate observations and claims in a certain domain (e.g., biology or gardening). These tools and methods cannot rest on authority or ideology. Crucially, they must involve ways of confronting claims against the publicly shared world of human experience. If the world behaves in such a way as to consistently contradict a claim, the claim cannot (yet, at least) be claimed to be true. If the world behaves in such a way as to consistently support a claim, then that claim can be taken (at least for the time being) as true. In this sense “truth” is a move in the “game” of “disciplined observation”.
Astrology and astronomy are classic examples of two related areas where only one plays by the rules of disciplined observation. Astrological claims are often put forward to a social group that debates astrology. But such claims do not pass muster when tested against the world, while many astronomical claims do. Furthermore, people engaged in astrology base their claims on a shared ideology, not genuine critique, testing, and debate. Astrology is more like a religion than it is like disciplined observation.
There is a slightly stronger sense of the word “true” that is connected to the disciplined observation “game”. This game is played on the assumption that, as the game progresses over the long haul (until the end of time), far more of the claims it takes to be true will remain true than will turn out to be false.
Thousands of people have found Emily Dickinson’s claims about life, love, and the human heart true to their experience of love gained and lost. In fact, this is so much the case that many of us would not say her claims were false if many people came to deny them. We would say the nature of human beings or human life had radically changed.
Claims about evolution in biology are so well supported by the game of disciplined observation in biology that people who deny evolution (and not just mechanisms relevant to how it works) simply do not accept or want to play by the rules of disciplined observation. And, of course, that is their prerogative. People can play any game they like, but they cannot expect the world of nature or human experience to be so flexible as not to “bite back” against a great many of their claims that are not, in fact, true by the rules of disciplined observation.
Disciplined observation is not just a “game” played by academics. Nonetheless, academic disciplines have refined the game a great deal with many technical tools and methods. And they have devoted the game to quite specific, technical, specialized, and narrow questions in a myriad of specialties and sub-specialties. This effort has yielded lots of nonsense and fundamentally uninteresting and unimportant work. But it has also yielded lots of important, marvelous, essential discoveries. In fact, it seems that keeping the weak work around is the cost of getting the important discoveries, since often we cannot tell at the outset what will turn out to be important. In any case, these highly specialized academic areas are properly the preserve of graduate students and research professors. They are not the correct food for undergraduate education, however much undergraduate “majors” often feed undergraduates simplified versions of graduate disciplinary education.
Disciplined observation has always had a home outside of schools and more so today than ever. Today, via the Internet, people of all ages can join interest-driven groups to develop expertise in a great variety of areas. This phenomenon is so pervasive today that our time has been called the Age of Pro-Ams (Professional-Amateurs). People engage in amateur science, citizen journalism, fan-fiction writing, the design of all sorts of media, as well as engage at expert levels on topics like cats, health, autism, gardening, politics, environmental concerns, and almost any other topic one could think of.
Let me just take one specific example of this Pro-Am phenomenon. Players of the massively multiplayer (and massively popular) video game World of Warcraft can join Internet sites where they engage in “theory crafting”. On these sites players seek to discover, analyze, and improve the complex statistical model that underlies game play in World of Warcraft, where the result of every player move is determined by a large number of interacting variables. They engage with mathematics, statistics, probability, and game design to offer expert analyses and to critique the game and the company that makes it in the service of what they think would be better ways to design the game. They also make software tools (called “mods”) that incorporate their statistical insights and help other players better understand and use these insights as they play the game.
On such Pro-Am sites people make claims that must be backed up by disciplined observations, made publicly accessible, and open to critique, debate, and falsification. They cannot claim truth based on authority, ideology, or informal observations. In fact, often they have no degrees or expert credentials. They must, nonetheless, abide by the rules of the game of disciplined observation, though they are not academics and are not studying an academic topic directly, but a video game. Of course, their claims in mathematics and statistics are often grounded in, and accurate by the standards of, academic work.
There is, however, another way to engage with disciplined observation that has no secure home. This involves people playing the game of disciplined observation not about topics in academic specialties (like syntax theory in linguistics) or ones in popular culture (like theory crafting in World of Warcraft) but on what I will call “big questions”.
Big questions are questions that can only be answered by drawing on different sources of knowledge. They do not fall into any one discipline, but require pooling knowledge across different disciplines, knowledge domains, history, and people. Furthermore, they are questions whose answers help shape how we live, act, and value in the world.
Take one example. In 2008 the world experienced an economic meltdown and a deep recession brought on by a variety of factors, many of them centered in the United States, though the crisis eventually spread throughout the world. Lots of people have suffered and continue to suffer from this meltdown. It nearly shut down the global economy. Why did it happen? That question is a big question if approached in a certain way. Answering it requires melding and evaluating knowledge from economics, human psychology, sociology, history, politics, public policy, and ethics, as well as thinking about human nature, human interactions, values, environments, ideologies, and behaviors.
Answering this question is not just a prerequisite for asking how to fix the meltdown and avoid future ones, though this is important, of course. Answering it also leads to a deep understanding of how the world we live in works and how the claims we make about it and the actions we take in it can go wrong and lead to unfortunate consequences, even with good intentions (and, of course, without them as well). Answering it leads to some degree of protection against being harmed by human and institutional greed, ideology, stupidity, and mistakes. In this sense, the question is generative: it generates lots of further understandings and applications beyond the specific issue (the 2008 recession) that triggered it. Let’s call such questions not just “big questions” but “generative big questions”.
We can play the disciplined observation game with generative big questions, but this requires that we pool people with different types of knowledge, experience, and skills. It requires that we draw on knowledge from academic and non-academic sources. It requires that we reject authority, ideology, and anecdote as the ground for answers to questions that often impinge on people’s politics and values. And it requires an appeal to disciplined (repeated, well-managed, and publicly accessible) observations and claims based on them that are open to critique, debate, and refutation.
Generative big questions are ones that academic disciplines break down into smaller questions directed to specific academic specialties and sub-specialties. Unfortunately, in many cases, no discipline exists to put all the sub-questions and sub-answers together into a “big picture” answer that speaks to the larger question. Furthermore, generative big questions are important to all of us, not just academics, and we all have to be able to think about them if only to protect ourselves from stupidity and spin in a complex world.
And we cannot just passively accept credentialed experts’ answers to these big questions, since this becomes just a form of basing truth on authority. At the very least we need to know how and why the experts reached their conclusions and how trustworthy their conclusions are actually claimed to be within their home academic discipline or disciplines. At best, we need to engage in the “game” ourselves, along with credentialed and non-credentialed experts.
The home of disciplined observation played on questions embedded in academic specialties and sub-specialties is graduate education and academic research. The home of disciplined observation played by Pro-Ams is popular culture. But where is the home of disciplined observation played on generative big questions, on questions that sound “academic” but which impinge on everyone’s life and pursuit of happiness and meaning? Proactively approaching generative big questions requires that credentialed professionals, Pro-Ams, and “newbies”, each with different knowledge, experiences, perspectives, and values, collaborate. That sounds a lot like a vision for college.
Generative big questions should be the focus of people’s interactions in the public sphere. They should be the focus of media, politics, and larger societal debates. They should be the focus of our communal attempt, throughout our lives, to discuss with others what the purpose of life is, what makes a life worth living, what we owe each other, and how a just society should be organized. The discussion of such questions is, in fact, the heart and soul of, and a precondition for, a real public sphere. There was a time when we thought the purpose of a college education was to prepare one for this public sphere, to prepare one to participate in it and even to reform it.
And here we hit a major and perhaps insurmountable problem. Colleges and universities which focus undergraduate education on generative big questions require a true public sphere to sustain them and then, in turn, be nourished by them and their graduates. What, then, is a public sphere?
A public sphere is the space in any society in which people from different backgrounds (whether these be defined in terms of classes, races, ethnic groups, genders or sexual orientations, interests, values, or beliefs) seek to create together a “public”. People who are part of this public feel they should collaborate for the common good. They feel they have not just rights, but responsibilities to others in the public. These responsibilities are not based on kinship, friendship, shared backgrounds, wealth, politics, or shared lifestyles. They are impersonal.
Further, people in a public sphere agree to settle their differences and to come to agreements about the common good based on publicly accessible, free, and open argument, evidence, and universalizable (not sectarian) moral values. They do not base their arguments solely on ideology, self-interest, authority, short-term profits, deception (“spin”), fear mongering, or disdain for others. They respect each other as co-citizens in the public sphere.
Today in the United States and in some other countries there is no real public sphere, or only a quite diminished one. In the United States, wealthy and professional people share lifestyles, interests, and values with other wealthy and professional people across the globe. In a global world they can interact with people like themselves anywhere and come to see these people as their true “peer group”. They can come to feel little co-citizenship or co-membership in a public sphere with others in their own country who are less well off or different from themselves. They feel little responsibility to or for others in their own society and often resent paying taxes to sustain the common good.
Furthermore, in the United States today, we are deeply ideologically driven, constant victims of spin in the name of ideology or profit, and prone in business and politics to short-term thinking and planning for short-term gain. We see those who disagree with us as “traitors” or “pinheads”. We do not approach social problems pragmatically, asking what balance of conservative and liberal approaches is called for to reach a workable and fair solution; rather, we see the whole world through one ideological lens more closely connected to our own desires and self-interest than to any notion of a common good. And we are well aware that, no matter how much evidence of harm is available, the businesses that cause that harm will fight to the end to stop regulation meant to mitigate it, as the cigarette companies have long done and as any number of other corporations do today.
In the U.S. health care debate we heard about the perils of “rationed health care” (under which people would not be able to get high-risk, low-benefit, expensive operations) if the government sponsored health care so that everyone got coverage. However, many people showed little concern for the “rationed health care” already forced on thousands of people who had no health insurance and could not get even a basic operation that could save their lives.
Colleges and universities that have as their goal sustaining and nurturing a true public sphere—via open discussion of generative big questions—cannot function if there is no real public sphere. It is not surprising that in a society with a diminished public sphere, there is not a widespread desire to fund such a college education for the “public”.