
What Does Science Teach Us? Its Core Lesson is Humility

Summary

Through the centuries science teaches the lesson of humility, while its technological applications confer increasingly god-like power that seems to belie that lesson but actually reinforces it. Each scientific insight that further reveals the vastness of the universe and the complexity of life teaches anew the lessons of humility and relatedness. Meanwhile new applications of technological power paradoxically drive the lessons home by creating new and bigger problems, many of which require yet more technology and larger bureaucracies to manage in an endless spiral, a vicious circle. But we’re still not getting it: Western Civilization continues to revel in its technological prowess and individualism while ignoring history and turning a blind eye to core lessons of its own science. We need a passing grade on these core lessons of science and history—not just to live ethically with each other and with other life-forms on a finite planet, but to survive. That achievement, however, will signal, and require, a major cultural shift.

Part 1: What does science teach us?

We’ve had four or five hundred years of sometimes startling, paradigm-shifting advances in science. Science brings blessings, but also problems. It provides answers—which as often as not create controversy. People usually imagine science taking place in some rarefied atmospheric layer high above messy ground-level politics, or in sheltered laboratories behind ivy-covered walls; but more and more scientists find themselves embroiled in highly public political fistfights. At the same time many ordinary citizens, non-scientists, seem to have given up hope that science might provide answers for the really pressing problems of human life and society, and fear that it may in the end even make them worse. All in all, it seems like a good time to take stock and ask: “What have we learned, really? What does science teach us?”

I can see some readers shrugging off the question, though. Why bother? Let’s not bury our heads in the sand or lose them in the clouds! We have more pressing problems. Abstract questions may seem quite beside the point while Trump and his minions tromp and rage across the political landscape like storms fueled by climate change, doing more damage to democracy and the wellbeing of U.S. citizens than terrorists ever could. Why fiddle around with abstract science lessons while forests burn, Britain struggles with Brexit, real weather pounds our increasingly vulnerable infrastructure racking up unprecedented billions of dollars in damage, and in general the world goes to hell in a handbasket?

But Trump doesn’t think that science is irrelevant. His administration makes science a prime target of attack, as did the Bush administration before it. Corporate leaders intent on their bottom line spend vast sums both on science and on efforts to muzzle or divert it.

That tells us something. Could it be that rampant political craziness, greed, and heedless self-interest can’t stand reason? Political power grounded in ideology, especially when it leans toward totalitarianism, doesn’t like competing sources of authority that might constrain or direct it. Even as science is a target of political ire, it can also be a voice of reason standing against these kinds of madness.

Sand, Rocks, Water: Southern Oregon Coast, July 2017

So, then…. What does science teach us? We’ve learned so many different things from science that it’s hard to know where to begin. We know what the sun really is, what the Earth’s made of, how the sun and the Earth are related, what the stars are, how life evolved, how long (roughly) humans have been around, the causes of most disease, why fire is hot. We understand the mysteries of inheritance down to the molecular level (and we know about atoms and molecules).

We also have learned a great deal about ourselves and what makes us distinctively human. We know that humans evolved over aeons of time just like all other living things, and at the same time have a good handle on how we differ—on what makes us unique in the universe as far as we know it. We’ve come some way toward unraveling the mysteries of human society, language and culture. We know (and are still learning about) ourselves as uniquely cultural beings. We understand, at least in part, the world of symbols and their role as the foundation for human language, for self-reflective consciousness, and for human culture at large.

All that is just for illustration. Those findings suggest the scope of the scientifically-grounded understandings we have won; but such specifics still just scratch the surface. It’s like picking up some stones, or a handful of sand on the beach and letting it sift through our fingers, as evidence of the Earth under our feet. There is so much more to be said, to be seen, to be experienced. And all that knowledge is not irrelevant.  Just try to imagine not knowing what we’ve learned from science, including big underlying lessons we’ve absorbed or are still working on along with all the specifics.

 

Science is more than a series of discoveries and inventions.

Scientific findings lead somewhere; they tell us things. They are lessons of life—moral as well as practical. Some such lessons go beyond specifics about gravity and billiard balls, climate and pesticides, and so on. The immense world of scientific knowledge that we’ve built points irresistibly toward a few BIG conclusions about the universe, about life, about us. Such conclusions, once accepted and absorbed, acting both with and against other powerful cultural forces, help shape and reshape not just how we view things but our very selves.

Such big ideas from science—as they change our minds and reshape how and what we think about everyday matters—compete with already entrenched self-conceptions, beliefs, and practices that they must shift, remake, or replace. They also have to contend with even more deeply mythic psychic structures that define the “soul” of America (or any other “modern” nation).

Scientific discoveries, taken seriously, mandate often quite fundamental change in how we view things, and ourselves. They may challenge perspectives and ways of thinking rooted in distant pasts—beliefs and faiths that may or may not be fully conscious, yet still govern how we act and think.

For all these reasons key findings of science have been, and remain, controversial, taking decades or more to become widely enough accepted to reshape basic aspects of our social, cultural, and public life (if they ever do). If you grant that scientific explorations into the nature of things can produce big ideas that shake the tree of modern Western culture—and therefore each of our own personal identities—at its roots, what would you say they are?

And, a further question: What is one big idea, perhaps even the big idea, that science poses to us today, one that we’re working on presently and that compares to the big ideas that have so influentially shaped or reshaped our culture so far? Is there a central reality, a fundamental aspect of existence that scientific inquiry has uncovered, a core lesson that we struggle with today because it contradicts enduring, deeply entrenched stories and myths, beliefs and practices, that define our cultural identity?

If so, what would you say it is? A quick look at a few past big scientific discoveries might help us make our way toward a (or the) take-home lesson from science today.

A Few Things We Have Already Learned 

1.  The Solar System.

I see one big scientific discovery that most of us could readily agree on now. For that very reason it is perhaps not so interesting—except that historically it crucially helped define and shape who we’ve become. Modern science, according to historians and philosophers of science, began with this big conclusion: namely, Copernicus’s and Galileo’s discovery that we humans are not literally the center of the universe.

Empirically, with our limited human vision, the Earth seems stationary and more or less flat while the heavens—sun, moon, and stars—move across the dome of the sky around us. That’s what we empirically see. But now we know better than to believe our eyes; we have a more objective, a more scientific, a more theoretical understanding:

The Earth is not flat and stationary (with heaven above and hell below) while sun and stars circle around it.  Rather, we live on a large moving and spinning ball, the Earth, that follows the same physical laws as everything else as it circles its sun somewhere on the fringes of an inconceivable immensity of space.  We humans are not the center of God’s universe.  That was huge in its day—a paradigm shift at a cosmic level. (See earlier posts here and here.)

2.  Evolution.

Similarly, many (maybe most) of us now recognize that we humans are not the special creation of a Divine Being who fashioned us out of whole cloth in His image—at least, not in any simple literal direct way, as you yourself might model a figure out of clay. Nor did such a (humanized) being place us on the world He made just for us, and give us dominion over it and all its creatures so long as we obey a few of His basic rules (Commandments). Rather, we evolved naturally, and rather late, as one animal species among all the rest.

“All blessings shall come on thee, and overtake thee if thou shalt hearken unto the voice of the Lord thy God. …. And the Lord shall make thee the head, and not the tail: and thou shall be above only” (Deuteronomy 28: –. Quoted in Ronald Schenk, American Soul: A Cultural Narrative, n.50, pp. 46-47).

Darwin’s theory of evolution, this second big conclusion of science, remains more controversial than the heliocentric (sun-centered) theory of the solar system; but at least by now the opposing sides are entrenched and well-known. This makes the whole argument less interesting, less current—except for those waning few who remain heatedly engaged in it. Some people continue to reject evolution in favour of literal religious creationism; while dogmatists on the other side insist that evolution repudiates religious faith altogether. They can blather all they want, but they’re not going to change each other’s minds.

(On further reflection, I have to admit that I do find some of the arguments rather interesting after all; but the “I’m right, you’re wrong” absoluteness that often defines the extremes of the opposing sides strikes me as wrong-headed and arrogant, all things considered.)

Many others reconcile evolution and religious belief in various ways, primarily by forgoing literalism for a more interpretive or symbolic reading of religious texts. This seems like a reasonable approach given what we do know, and what we don’t. The important point is that, despite some ongoing controversy, evolutionary theory has changed how we see ourselves on Planet Earth. It has to be confronted or worked with. And on balance, it’s fair to say that most people (a small majority, at the very least) living in the day-to-day modern world have come to terms in one way or another with the basic truths of evolution as science reveals them.


According to a 2014 Gallup poll, a minority (but still a surprisingly large minority) of Americans “continue to believe that God created humans in their present form 10,000 years ago, a view that has changed little over the past three decades.”  The poll also finds, however, that “half of Americans believe humans evolved, with the majority of these saying God guided the evolutionary process.”  But “the percentage who say God was not involved is rising.” This seems roughly consistent with the Pew Research Poll I cited in an earlier post, which found a decline in religious affiliation in all denominations, and an increase in the numbers of people who consider themselves to be nonreligious. I’m pretty sure that in modern European nations, as compared to the U.S., relatively more people accept evolution.


Historical Downsides of Evolutionary Theory. I can’t leave the topic of evolution without noting its historical downside, and an important secondary lesson associated with it. Special interests, as they are sometimes called, and their intellectual apologists quickly seized on the idea of evolution to justify racism, hierarchy, and exploitation. The new theory of evolution, says one noted historian of science,

“seemed to offer, with a startling finality, a nearly religious vindication of the hierarchy of peoples long assumed by educated Europeans and the superiority of elites demanded by class systems. In America and Europe in this century, evolutionary thought helped to breed a nasty fascination with eugenics as social policy.”

After Darwin, well into the twentieth century, Europeans and Euroamericans justified slavery and continued racial oppression on the belief that Blacks were evolutionarily and genetically inferior, not fully human. Colonial regimes claimed “survival of the fittest” as their excuse to conquer and exploit others. The Darwinian theory of biological evolution inspired Social Darwinism, a reductionist idea that gave intellectual justification for the exploitative excesses of early capitalism. It all boils down to the maxim: knowledge is not wisdom—not necessarily, anyway.

New knowledge does not arise in a political vacuum, nor does it usually have clear, unambiguous implications for policy. It can mean different things in different contexts, and opposing interests fight each other to define it. The more consequential the idea, the more people fight over it. By and large (but by no means always) elites and rulers have the advantage in this regard. Thus, at first, evolutionary theory was used to justify inequality and hierarchy.

Science itself does, however, tend to self-correct over the long run, and these corrections find their way into the public sphere. In this case, later scientific discoveries—especially advances in population genetics and the theory of culture in the early and mid-20th century—repudiated scientific racism based on Darwinian theory. Racist beliefs no longer find any credible justification in science. Further scientific advances exposed racism as ignorance. Outbreaks of social Darwinist ideology still occur, but they don’t get much traction or spread very far. The ideas that backed up these seductive but reductive doctrines have become, for most of us, scientifically obsolete.

To summarize so far: Two big game-changing discoveries in the history of science, heliocentrism and evolution, forced fundamental shifts in perspective and re-defined our view of ourselves in the cosmos and on the Earth.  Both discoveries show us to be less the center of existence and more just one part of larger complex wholes.  Now there’s a third discovery that builds on evolutionary understanding but goes beyond it and provides grounds for correcting some of its excesses: namely, the discovery of culture.

3.  The Theory of Culture.

The third major, ground-shifting advance in scientific understanding, the theory of culture, both complements evolutionary theory and marks its limits. In the human world, culture, not genes, accounts for the wide range of different behaviors, customs, and beliefs. We humans are cultural beings, by nature.

Understanding culture freed us to better recognize the evolutionary truth that humankind in all its evident diversity and disarray consists of one species. Not even the most striking physical or behavioral distinctions among different peoples around the globe mark different species, or make any one group naturally better or more human than the next. Now culture, not biology, explains the often striking differences in customs, beliefs, level of technological development, and so on, that distinguish one human group from another.

Ethnographers and linguists who open-mindedly spent enough time living with different groups of people often found themselves amazed at the depth, sophistication, and beauty of the cultures they studied. They marvelled at the intricacy and appropriateness of their ecological adaptations. They also, of course, sometimes found violence, harmful superstitions, inequality, and other social ills. But, significantly, such ills were as often, or perhaps even more often, imposed by “civilized” colonialists as they were indigenous aspects of native cultures themselves.

The bottom line: Human physical differences, although often given great cultural importance, in fact do not matter when it comes to explaining why people act and believe as they do. The discovery of human beings as cultural beings gave the final kick needed to topple biological racism as a scientifically credible doctrine into the historical dustbin of failed ideas. These advances in scientific understanding transformed evolutionary theory from an idea that elevated elitism and excused hierarchy and exploitation into a lesson in humility—which, of course, many people still don’t like to hear.

At the same time, the idea of culture dislodges key premises about human nature that continue to prop up the American economic and political systems, and indeed American culture at large. Not surprisingly, much as with evolution, people misunderstand and misuse the idea of culture in political and social life—so much so that some anthropologists would abandon the theory of culture altogether. But that’s another debate for another time. I’ll just say here that I don’t see the culture idea going away, and such problems do tend to self-correct over time (as we saw with evolution). So our energies are better spent getting it right faster.

 

1, 2, 3: You can see in those three big conclusions from scientific study over the last half-millennium just how profoundly science has shaped how we now view ourselves in our world and in the cosmos. Despite the controversies that still surround them, it’s clear that these scientific achievements have shaped and moulded modern Western culture—our entire world-view, and thus who we are as human beings, each individual one of us.

Past science lessons are easier to see and describe—in part because they are past. We have a little distance on them—even as they still stir up shock-waves of controversy. Also, each of the three or four big discoveries I’ve mentioned has a well-known theory or idea associated with it, as well as the historical figure who is credited with the discovery, whose name virtually everyone knows.  All that makes it better known, more visible. If you bring up Galileo or Darwin in polite company, for instance, people will know what you’re talking about, even if they disagree with what you say.


(True, the most recent instance, the idea or theory of culture, is an exception here in that the names of those who developed the idea are not so well known [except within anthropology]. Is the theory of culture so at odds with underlying basic premises of Western culture that it remains essentially invisible as a major theoretical advance, even as the word “culture” is now everywhere? In any event, albeit simplified and often misunderstood, the idea of culture itself is out there. It has become pervasive in everyday thought and speech.  Where would we be without it?)


Past science lessons may be easier to grasp, but science didn’t just happen in the past. It is ongoing, and it continues to confound and challenge today’s cherished, ingrained beliefs just as it did those of earlier times. In that regard, in addition to the example of culture theory, two more recent discoveries are worth noting.

4 & 5.  Quantum Mechanics and Systems Theory

Two more recent breakthroughs in scientific understanding seem, so far, to have little affected the general cultural outlook, the world-view, of the modern West, or of America in particular. But they, too, in their own ways, teach precisely the same underlying lesson of humility as the major discoveries of the past.

The first, quantum theory in physics, says—directly contrary to classical Newtonian physics—that not sublime certainty (which only humans can comprehend) but rather fundamental uncertainty lies at the very heart of existence. The second, general systems theory, following closely on the heels of culture theory in the mid-twentieth century, shows, among other things, that connection and wholeness along with—maybe even more than—individualism and competition drive evolution and define the conditions of life in general, and human society in particular.

Part 2: Learning Humility

So, back to our original question: What does science teach us—what’s our take-home lesson so far? By this time, the underlying “lesson” of science that I have in mind—a lesson presented in each of the five or six major scientific advances I’ve mentioned—may be starting to come through. Especially so as a lengthening series of natural disasters, cascading economic crises, and even greater problems looming on the horizon all reinforce it.

I know that I’ve already given it away; but it may still surprise you that our BIG science lesson of all time is humility.  And with humility, respect.  Taking science’s lesson of humility to heart makes it more natural to respect the natural world that sustains us.  At the same time and equally important, we might learn greater respect for ourselves.

The lesson of humility, however, being common to each and every major scientific advance, does not have a single widely known, ground-breaking theory associated with it, nor a Copernicus or Galileo or Darwin to embody it. Nor is it represented by a prominent word, idea, or practice in American public or private life. Rather, it appears most often in traditional religious teachings, which too few people take seriously. Only rarely and relatively obscurely is humility noted as a major theme of scientific advance.

Perhaps American culture itself blinds us to this dimension of scientific learning. American political and economic life undervalues and even disdains humility. As today’s cumulative science lesson, humility challenges enduring dominant American values that emphasize individualism, accumulation, “getting ahead,” and unconstrained applications of technology and exploitation of the natural environment to generate wealth—just as earlier scientifically-grounded insights challenged prevailing world-views and values of their days. Its lack of fit with the dominant American ethos doubtless makes it easier to ignore, or simply not to see, all of the ways that science teaches lessons in humility.

Science: An “Outrage On Humanity’s Naïve Self-Love”

Freud perhaps most famously and dramatically, if somewhat negatively, pointed out the lesson in humility taught by scientific discoveries.  Freud, says Professor Arnold Krupat, speaks of

“psychoanalysis as the third wound to human narcissism, for its demonstration, after the Copernican and Darwinian wounds (e.g., that we are not only not the center of the universe, nor only a little lower than the angels), that we are also not even masters of our own minds.”

In Freud’s own words, Copernicus’s and Darwin’s theories, and his own discovery of the unconscious, are “great outrages” visited upon humanity’s “naïve self-love.” Key scientific advances, Freud said, delivered “bitter blows…[to the human] ego.” Consequently, each major discovery met with “the most violent opposition,” just as you’d expect.

While I take Freud’s point, it is only part of the story. It’s only one way to look at the humbling effects of scientific advances.

One might look at humility as taught by science from a more positive perspective that emphasizes relation, connection. Each advance of understanding shows humans to be less special, less separated than people thought we were, and more a part of and related to everything else. True, you might go with Freud and view all that as blows to the human ego. But it isn’t only that: it is also realistic and, in an important sense that needs to be better realized, empowering.

Finally then, in brief, that’s the answer as I see it to the question posed in this blog post.  The advance of science is a lesson in humility.  But we haven’t earned a good grade yet.  It’s a lesson that we’re still struggling to learn.

Although I know that I’ve already gone on at length here, I’d still like to expand on this theme a little more. Next time….


Ideas, Culture, & the Freedom to Err

Every great advance in human history—the use of fire, the wheel, agriculture, writing, our highest religious ideals, democracy, the internet—starts with a new idea, which in turn is based on earlier ideas. Humans live by ideas. What does it mean for us when the ideas we live by are in error?

Living Beyond our Genes

When humans attained culture, we moved beyond genes—or, rather, grew up into a new level of organization and evolution. That happened fifty or sixty thousand years ago, maybe earlier. Until then we evolved like other life forms, primarily by natural selection working on our physical bodies, instincts, and more or less fixed patterns of relating. But now, as cultural beings, our own ideas are the means by which human life advances, evolves, becomes ever more complex.

Ideas also can hold us back. Wrong or limiting ideas restrict what we do and who we are. Anthropologist Clifford Geertz said that humans live in webs of meaning that they themselves spin. Living in a world of ideas that we ourselves create gives humans degrees of freedom that no other being on Earth enjoys. But this includes, as I said in an earlier post, the freedom to make mistakes, to be fooled, to be played the sucker, and to wrap ourselves in straitjackets of limited thinking.

The human mind has freedom to divide, categorize and combine things in novel and creative ways.  But we also use this gift to build fences—ones that keep others out, and ones that confine our own minds.

Living according to mistaken ideas can work for a while, even get you ahead. And your living by mistaken ideas can get someone else ahead (and of course vice versa, depending on the circumstance). But more often, and always in the end, wrong ideas lead to unpleasant or even deadly consequences in the complex, interconnected ecologies in which we live and subsist.

Socrates (or was it Plato?) said the unexamined life is not worth living. For us, with our terrifying capabilities and the large scale of our works, the unexamined mind is becoming a dangerous way to live.

Waking up to both our freedom and our fallibility can be terrifying.  Or, more positively, it can bring us home to ourselves, help make us more real, caution us to live with respect on the fragile planet that sustains us.

Ideas We Don’t Know We Have

People have inborn tendencies to take—and sometimes mistake—their ideas for perceptions, their dogmas for truths.

People don’t generally realize that they’re in the grip of a limiting or mistaken idea until a new idea comes along and challenges it. We believe what we believe for as long as we can. The most powerful ideas or ideologies are those we don’t know we have, but rather take for granted as just the way things are. It’s a bit of a stretch and a simplification, but you could even think of a particular culture as one big idea, a world-view. Those big ideas that give form to our world most deeply define who we are, and are the hardest to grow out of, to change.

The Flammarion engraving (1888) depicts a traveler who arrives at the edge of a flat Earth and sticks his head through the firmament. By Anonymous – Camille Flammarion, L’Atmosphère: Météorologie Populaire (Paris, 1888), p. 163. Public Domain.

It’s easiest to “see” ideas we don’t have ourselves. Take the idea I mentioned in my last post: the flat Earth as the center of the universe, around which all the heavenly bodies revolve.

Setting aside those curious few who still believe the world is flat and the round-earth idea a conspiracy (a notion most people today rightly regard as pseudo-science nonsense), the rest of us readily recognize the idea of a flat Earth as a particular idea, even a peculiar idea, because it is not our idea. It belongs to different times, different cultures. We have distance from it. We don’t perceptually and conceptually inhabit a universe that idea described.

But in past times and distant places, without the benefit of Newtonian physics and photos from space, the idea of a flat Earth as the center of the cosmos could be and was compelling because it seemed to come directly from people’s own senses. It was not just what people thought, but what they experienced, as they looked out over the land and watched the sun rise and set, and saw the stars wheel their set ways across the dome of the night sky. They believed what they empirically, naïvely saw—not recognizing it as a belief—and constructed whole religious cosmologies around it. They inhabited that world. They lived within their idea of the world, and they viewed the world from within (from the perspective of) that idea. Therefore, how could they recognize it as an idea? It was not just an idea; it was who they were; it was simply truth; it was how things are.

Michelangelo, The Creation of Adam. By Jörg Bittner Unna – Own work, CC BY 3.0, via Wikimedia Commons.

In the West, that idea of the Earth (and of “Man,” created in the image of God) as the literal center of God’s universe wasn’t widely recognized as a particular idea subject to challenge until Copernicus and Galileo put it in question. At first, it was Copernicus’s notion that the Earth was not at the center of the universe, but rather that the Earth and the other planets orbited the sun, that seemed unbelievable. As evidence mounted and it became believable, Copernicus’s world-shifting idea ushered in another large new idea, that of science. And with science a whole new world-view arose as an alternative to the religious orthodoxy of the Middle Ages.

Science is our own big idea; it is how we think, who we are. And, as I said last time, and will also talk about next time, we’ve reached the point in our evolution of ideas where more people can, should, and in fact do think further in the round—think more consciously about what we think.

An ancient symbol in many cultures: The Uroboros—The Snake That Swallows Itself.

(Granted, that’s not a totally new idea either: In examining our thoughts, even making theories about our own minds, we circle in on the essence of being human as captured in the ancient image of the uroboros—the snake swallowing its own tail. But in the world we’re making, we need not just philosophers and mystics but more everyday people who vote to keep getting better at it.)

If we take on the task of thinking more consciously about what we think, then our big idea—namely science and technology, and not just science but our whole current world-view that got shaped in the ideas and images of earlier science—is the first course on our plate. But what’s a good angle to approach it from? Here are just a couple of suggestions. We can look critically at science and the modern world-view it shaped without falling into relativism. And we can see, based in part on later scientific discoveries, when and where early science gave right answers on particulars while larger “pictures” of our world and ourselves based on those particulars were incomplete and even wrong.

Science—Our Big Idea.  Is It Our Mistake?

The advent of modern science was in many ways a huge advance in human learning, human understanding, and human life. Lifting Western culture out of its “Dark Ages,” it became the scaffolding for what we, with some basis but also some hubris, call the Enlightenment. Early mechanistic science and the universe that it describes became our big idea, a foundation for our culture. It shaped the big idea of the universe and our place in it that we live by and live within.

But what, exactly, is that big idea? Who has appropriated it, shaped it, and used it for their own ends? It has different aspects that have changed through time. Two central ones today are that we can transcend culture and command nature.

 

Transcending Culture, Commanding Nature.  Anthropologist Sharon Traweek memorably and paradoxically describes the cultural world of high energy physics as “the culture of no culture.”  The phrase has a catchy ring to it because this is the world-view not just of a clique of specialists, but of science at large—or was until quite recently.  And it still is the general outlook of modern Western civilization that has grown up around that earlier scientific world-view and incorporates it in many of its day-to-day practices, institutions, and beliefs.  “Cultures” of various strange kinds are what others have.  But us—why we’re just folks.

In the minds of those who practice it, as recorded by Traweek and many others, scientific methodology works to transcend culture, to take all cultural influences—dogmas, assumptions, particular perspectives, opinions, prior beliefs, emotions, superstitions—out of its equations. In science, or so the story goes, we take ourselves out of our work, and out of the world we observe. By taking ourselves fully out of the natural world from which we emerged, we think, we become almost like gods, able to understand nature’s workings on its own terms—and control it on ours. That’s linear thinking. That’s Descartes’ Chasm—the Cartesian split at the heart of the modern world-view. What hubris! What a BIG mistake! We forgot, and now are having to relearn, that we’re always within and part of the systems we’re messing with.

An Ironic Error

Traweek’s wry characterization reveals an ironic error at the heart of the modern scientific world-view: the “culture of no culture” is itself a culture. We humans are by nature cultural beings. Culture is what defines and sustains us as human; we can no more lose culture and remain human than a cloud, say, can lose its water vapor and remain a cloud. Culture is our cat’s meow. Neither can we abstract ourselves from the natural world within which we humans evolved. We’re in the world we think we control, and what we do to it affects us (remember the uroboros).

Adding irony onto irony, it is the ongoing development of science itself that now more clearly reveals those errors. That in itself is unexceptional: it’s how science works, how knowledge advances, how new theory supplants or sublates (incorporates) earlier more limited theory.

But when new theoretical understanding supplants old theory on which a whole culture developed—when the advance of science challenges earlier scientific viewpoints that continue to underpin a civilization’s institutions, practices, and attitudes—things get more complicated.  

That is our circumstance today.


On Walking, Writing, Publishing, Telling Stories, and the Importance of Ideas

Walking & Talking.

On another misty-day hike in a local forest preserve, our little group winds around huge cedars and Douglas fir on muddy trails, stretching to step up rocky ascents or across small streams until we arrive on the shores of a quiet lake.  Ducks patrol the water’s edge, while farther out the glassy surface reflects trees and sky.  There we find a resting place to sit, talk, and munch energy bars or nuts before hitting the trail again back to our cars.

As we rest, a friend and fellow writer and I commiserate about the difficulties of getting our writing published in the popular press.  She writes historical novels and children’s stories.  In her excellent self-published novel, Sun Road, available on Amazon, a strong young Icelandic woman of the tenth century sets out on a series of adventures to a new world.  For my part, I am writing and want to publish a non-academic, non-fiction book on some recent ideas about human nature and culture.

How did I come to take on such a project?  Here’s a bit of background.  After graduate school I taught anthropology classes in university for a few years, and then had an opportunity, which I took, to work for an American Indian nation on practical problems it faced related to coal development in the Northern Plains region of the United States.  I subsequently spent most of my working career outside of academia as an applied social/cultural anthropologist.  I remain basically an academic at heart, however, even if I worked mostly in non-academic settings and am writing a non-academic book now.

Original research with culturally-diverse peoples and communities is central both to practicing anthropology in the world and to teaching anthropology in the classroom—as is writing. In my case, work outside the university in part involved preparing research-based, policy-related reports, several of which are book-length. I also wrote articles for academic journals. Even as a child I was fascinated by science in general; later I developed interests in law and social theory as well, and carried these interests through my academic and working career.

# #

Writing & Publishing in the World of Commerce

Been there, done that. Now I want to relate some of what I’ve learned to wider audiences in less specialized language and less restrictive settings. So, I got a book on how to publish popular nonfiction (as opposed to academic) books. There and in online sources I learned that typically the hopeful author writes a book proposal and submits it to numerous book agents until one finally agrees to represent it.  So I went to work. I wrote the proposal and six completed sample chapters.

Meantime, as I read more about non-academic publishing, and poked around the topic on the internet, I could see some parallels between the process of getting a popular or “trade” book published and academic publishing.  But overall they’re quite different and occur in different contexts.  Academic publishing relies on “blind” peer review, with contributions to disciplinary development or the general advancement of knowledge being key criteria. Here, in contrast, the agent’s and then the publisher’s assessment of the immediate commercial potential of your book drives the process.

So of course in the world of non-fiction trade book publishing, review is not “blind” as in academia: Who you are matters.  I’m sure that writing quality and intrinsic interest matter too. But being known matters more. It’s better if you’re a known quantity, a public figure, perhaps an already published author with successful books or best-sellers behind you.  All this becomes obvious, though it remains largely implicit and taken for granted.

# # #

The Entrepreneurial Author

After you write your proposal and sample chapters, you don’t submit that package. You put it on the shelf, and then start sending short, highly formulaic “query letters” to agents describing your project. (There is even a book just on how to write the one-page query letter.)

It may take months, at least, and many tens if not hundreds of queries, before you pique an agent’s interest (if you’re lucky) and he or she requests your full proposal.  If an agent likes your proposal and takes you on, she goes to work to sell your book to a publisher.  If she is successful and a publisher accepts your book, you may get an advance—and away you go.

But, that’s a lot of “ifs,” a lot of risk.  Why would anyone do that?  Why am I doing that? Why do we bother?  Why do novelists write excellent novels they have to spend their own money to self-publish?  Why are there so many non-fiction writers inundating agents’ offices with query letters, and manuscripts and proposals that will never see the light of day?

It takes hard, disciplined work to do research and to write—and a lot of time. And in the end, it looks like it’s all a hugely speculative venture. Writers often do it like some developers build houses, as it were, “on spec.” It is even more difficult if, like me and many others, you come to the task late without having cultivated a more public presence with commercial potential earlier. Why do it? Why are there poets who take “real jobs” on the side, and starving artists? It doesn’t make sense. There are far easier ways in our wealthy and highly commercialized society and culture to make money, to secure a good living.

# # # #

What Matters? What Lasts?

As we sat by the lake, surrounded by great trees, munching and talking, watching the ducks and thinking about our respective writing projects, that “Why?” question came up. Asked and answered: There is something inherently, deeply satisfying on the human level about telling a good story for others to learn from and enjoy; and the same is true for reading, developing and communicating ideas.

And, these things—ideas and stories—matter. They matter on a whole other level. In the larger scheme of things, they matter more than immediate financial gain. Not that financial gain doesn’t matter too. It feels good and right to be financially rewarded for the work we do. And, of course, necessity may enter in here as well: Everyone has to “make a living,” and here that means making money. But most people—perhaps not everyone, but most people—also look beyond survival and accumulation to other values.

One such value resides in ideas, in stories.  What is our whole cultural world made up of, in the end, but ideas?  Stories broaden our horizons, give us access to the lives of real people elsewhere or in the past, help us imagine our futures.  Every story carries ideas; and every significant idea gets woven into stories.

 

What Matters? What Lasts? What gives the greatest satisfaction? Is it the accumulation of things or the growth of ideas? Your answer, of course, depends on who you are—and also defines who you are.

A hard-headed realist (I’ve never actually met such a creature—though I have met a few people who imagine they’re one) might think that ideas, stories, mere theories, exist in some airy mental realm unconnected to “the real world.” In fact, ideas and stories largely make up our real world. The stories we tell ourselves, about ourselves, become the very lives we live. That’s as true for the self-proclaimed realist as for anyone else.

What lasts? If longevity or durability is a measure of “realness,” then nothing is more real for human beings than ideas. Our ideas, the stories that make up our cultural lives today, transcend time. Some key ideas have persisted and grown through the rise and fall of civilizations whose stone monuments and buildings long since crumbled into ruins. Great music, written down and performed by countless choirs and orchestras, heard live or in recordings generation by generation, becomes woven into a cultural tradition that is as durable as the massive stone cathedrals in which it was, and still is, so often played and enjoyed.

# # # # #

To sum it up:

Well, we didn’t solve anything as we sat there watching the ducks, talking about writing, and looking at the trees and sky. But it did bring up the thoughts I’m recording here.

In the human world ideas are as real and consequential as things—and often even more so. That’s what defines us as human. Our present economy and culture, however, tend to emphasize material things and try to commodify everything.

In that way our economy actually runs against the grain of human nature, perhaps even distorts it, rather than naturally growing out of human nature as mainstream economic theory still assumes. Focusing on self-interest—just one aspect of human nature (and not a very evolved or distinctively human one at that)—our present economy develops and channels that motivation into the scarcity- and fear-based, production-oriented, commercialized society and culture we have today.

But all that is changing as I write this. The “information age” that social commentators say we are transitioning into relentlessly moves ideas, information, and organization itself much closer to center stage in our cultural life. It de-emphasizes things, challenges the central place of physical property in our culture, and opens spaces for new ways to think about and value ideas on the most practical levels, as well as in their aesthetic and ethical dimensions.

It is hard to measure, divide, and quantify ideas.  Ideas are inherently harder to commodify and commercialize than material things.  Paul Mason, in his fascinating book, Postcapitalism, writes that this contradiction between the growing centrality of ideas and the fundamentally commercial foundations of modern Western culture sets up one of the major challenges of our present era.  My next post looks further at evolving ideas about who we are as human beings in light of this evolution of culture, and consciousness.
