Leadership Lessons from George Washington and Jesus

This hit close to home. The study of leadership, of leaders and what makes them tick, has fascinated me for years. I mean, who didn't get excited to hear the latest developments from Apple each time Steve Jobs took the stage and uttered his famous words, "… there's one more thing"? Or to learn about Elon Musk's empire of mass-produced electric cars and a space program rivaling NASA's? These two leaders are models of what popular leadership theories might label successes.

However, amidst the celebrity status and rock-star personas these kinds of leaders evoke, it is important to remember how easy and tempting it is to succumb to failure. This was evidenced in the 2008 global economic collapse, at whose helm stood the highly paid CEOs of financial institutions. The Great Recession brought calamity to so many. Men and women who had worked hard to save for retirement saw their futures disappear overnight. Homeowners who should never have qualified for loans were foreclosed on at alarming rates, causing a ripple effect through the national and global economy. It is staggering to think all this happened under the watchful eye of these Ivy League-educated men and women. What happened? Was their exclusive, high-priced education deficient? Was ethics not a required course for accountancy majors? Perhaps, but something else might be the cause.

Dennis Tourish, in The Dark Side of Transformational Leadership: A Critical Perspective, makes a startling claim that helps students of leadership understand some of its pitfalls. This is a much-needed reminder, especially for me, as I tend to practice leadership with an exclusive focus on practical results. Tourish points out that even well-meaning leaders fall prey to internal systems that create an environment tempting them to manipulate company results and incentivizing unwise risk-taking. Experts have argued that this kind of agency "undermines teamwork, encourages a short-term focus and leads people to believe that pay is not related to performance at all but to having the right relationship and an ingratiating personality."1

Apparently no system of leadership (Authentic Leadership, Servant-Leadership, etc.) is immune from this excess because, according to Tourish, the pull toward the "Superman" style of leadership (à la G.E.'s Jack Welch) is too great to resist. If the leadership styles of the Welches, Jobses and Musks of the world are to be viewed with reservation, to whom can we look for a better model worthy of emulation? What qualities are required in a leader when institutional needs and objectives constantly change? Two individuals come to mind.

The first is George Washington. According to Sam Walker, author of The Captain Class: A New Theory of Leadership,2 Washington felt responsible for and connected with his troops in victory and in defeat. After liberating Boston in 1776, for example, he reportedly stayed back in humility, sending his generals into the jubilant city instead. Soon after that same campaign, the British returned with more men and succeeded in flanking the American position. Washington led the evacuation himself, staying through the night until the last boat departed and saving 9,000 troops. He was known for persuading and encouraging his men in battle and appealing to their highest honor.

The second is obvious: our Lord Jesus. He is God Almighty, and yet he humbled himself to rescue sinful people. He upended prevailing models and theories of leadership by setting an example of humility.3 Jesus never considered it beneath his dignity to stoop down and wash other people's feet. He was present with Martha, Lazarus' sister, and was deeply moved to tears upon learning of Lazarus' death. When Jesus put together his team, he chose individuals who already had things in common: fishermen, brothers, and so on. This signifies a proactive approach to the relational component of leadership so lacking today.

Instead of the traditional view of leadership, in which a sole agent is responsible for an organization and its attendant challenges, Tourish refreshingly offers a better one. In the end he says:

“I advocate a nuanced view of leadership, in which leader agency is acknowledged to exist but in which it is balanced by a view which takes fuller account of the agency of other organizational actors and the degree to which this agency is complicit in the construction of leader agency and action.”4

Based on this assessment, it is not surprising that Jesus and Washington are considered great leaders: they included others in the process and shared victories as well as defeats. We ought to study them carefully to glean leadership principles that bring organizational vision to reality in a confused world seeking direction.

Culture: Blessing or Bane?

This is a tough one. On one hand we see a newness in celebrating diversity; on the other, the emergence of cultures appears to have been a form of punishment from God when he confused people's languages at the Tower of Babel.

Erin Meyer’s Culture Map

In the United States, celebrating significant cultural events (Cinco de Mayo, St. Patrick's Day, etc.) is part of who we are as a nation. Yet at the same time we bemoan and protest the fact that minority groups are still victims of discrimination. So which is it? More culture, less culture, or what?

I believe some clarity on the issue will help the confused. The book of Genesis gives us a full account of what happened. God created everything and assigned humankind responsible dominion over his creation. Adam and Eve had complete freedom to do anything they pleased except eat of the tree of the knowledge of good and evil. They succumbed to the serpent's temptation, ate the forbidden fruit, and God punished them.

Man continued in rebellion, which ultimately resulted in God destroying his creation save Noah, his family and everything on the ark. It did not take long before the lessons of the flood were forgotten. Instead of obeying God by "filling and subduing the earth," man decided to come together and make a name for himself.1 God could have punished them with flood-like results, but in an act of mercy he chose instead to thwart their plans by confusing their one language and dispersing them over the face of the earth. We have needed translation services ever since.

Understanding each other across cultural difference is a challenge.2 But it is a welcome challenge. And here lies the beauty of the Gospel. If this life, devoid of hope and restoration, were all we had, we would be doomed, hopeless in our human interactions. Frustration, civil unrest, chaos and all the accompanying social ills would be the norm in a post-truth society.

However, the good news, the thing we hope for, the answer to our prayer "thy kingdom come, thy will be done, on earth as it is in heaven," will be fulfilled in a curse that has been redeemed. William Edgar wisely points out that the access to the tree of life that was taken away from us because of sin is present again in the new earth, for the healing of the nations (Rev 22:2, 7; 22:14, 19).3 Further, it appears there will be a great celebration praising and worshipping God in the eschaton: "After this I looked, and behold, a great multitude that no one could number, from every nation, from all tribes and peoples and languages, standing before the throne and before the Lamb, clothed in white robes, with palm branches in their hands, and crying out with a loud voice, 'Salvation belongs to our God who sits on the throne, and to the Lamb!'"4 It is clear from John's choice of words, "nation," "tribes" and "languages," that what began as instruments of the curse in the book of Genesis is redeemed in Revelation. We start in the garden and find ourselves back in it, only this time enriched by each other's strengths and none of our weaknesses.5 That is the gospel message. Is culture a bane or a blessing? It's both.

Is An Economy Devoid of Morality?

Karl Polanyi's seminal work The Great Transformation: The Political and Economic Origins of Our Time is to be commended for pointing out the dire consequences of leaving market economies unchecked. In his time, the role the economy played in people's everyday lives was taken for granted. The idea then was simple: how does one make a lot of money? Making lots of money equated to progress, so the thinking went. Progress meant opportunity, stronger local and national economies, innovation, low unemployment and so on, but most of all it secured peace.1 Yet that peace was attained at the expense of liberty. When a people lose their sense of priorities, first things become second things and second things become first things, a sort of Faustian bargain.

The hundred-year peace preceding the Great War proved to many that whatever economic mechanisms were in place worked, and the bargain appeared to hold. However, it did not last. The forces sustaining peace broke down. What caused their demise? A cursory reading of Polanyi, even by experts, might suggest that it was the result of a cultural milieu in Western Europe bent on prosperity, in which markets were left completely autonomous. That human society accumulated wealth at the expense of other human needs is important to consider in this discussion. Polanyi blamed and bemoaned the fact that human society was subordinated2 to economics and not the other way around, as it ought to be. However, the charge that self-regulating markets caused societal decline leading to war is too hasty. I realize this is a minority position that needs defending. My three main arguments are:

  • Polanyi himself did not make that claim. He says so at the beginning of his book:

“But if the breakdown of our civilization was timed by the failure of world economy, it was certainly not caused by it. Its origins lay more than a hundred years back in that social and technological upheaval from which the idea of a self-regulating market system sprang in Western Europe.”3

This is honest. To say that two things are correlated but that one did not cause the other is an important and honorable distinction. For example, we all know the rooster does not cause the sun to rise even though we observe the two occurring together. The cause was something else, which segues into the second observation.

  • Greed and jealousy caused economic collapse which led to wars, not the proliferation of self-regulating markets.

Polanyi writes:

“British and French interests differed in Africa; the British and the Russians were competing with one another in Asia; the Concert, though lamely, continued to function; in spite of the Triple Alliance, there were still more than two independent Powers to watch one another jealously.”4

He adds that this led to the formation of alliances among the great powers, thereby threatening world peace.

The 19th century experienced a boom in new infrastructure. Railroads crisscrossed plains once desolate, the telegraph was invented, and the industrial age was in full swing, expanding the middle class. The upsurge in new technologies and innovation drove nations to compete against one another. World leaders began manipulating their currencies (now detached from the gold standard), resulting in imbalances of exports and imports. The degree to which nations dealt with these challenges determined the degree to which peace and prosperity were maintained or sacrificed.

  • Polanyi's discussion of the "balance of power" neglects the genius of the United States Constitution. His working definition appears to include only two potential powers: that of the government and that of the governed. This was the popular view in British history5 until 1776. Polanyi argued that because the balance of power shifted from the government to the people, sound economic policies were ignored, which supports his thesis that self-regulating markets do not work. However, this thinking becomes moot in light of the American Revolution. The balance-of-power principle as conceived in the British Empire did not go far enough. It was not enough to consider two opposing forces jockeying for their own interests; that setup only pitted people against their rulers. The American experiment took those powers and subdivided them even further. The revolutionaries wanted to ensure that no one person could dictate the affairs of the governed, and so established three branches of government, each watching over the affairs of the others, a safeguard the Weimar Republic lacked.

Finally, there is the myth of self-regulation that needs to be addressed. Polanyi was convinced that a self-regulating economy did not serve the public good. However, the more insightful question is: since when have we not had a self-regulating economy? And even if we had one, which I think we do, who would be responsible for regulating it? Is it the government or the governed who controls the economy? Ultimately it is the people who regulate it, because fascist regimes eventually get replaced by benevolent ones. Moreover, it is hopelessly difficult to differentiate the ideals of the government from those of the governed. This is made clear from the opening lines of the U.S. Constitution's Preamble, "We the people of the United States…": the people themselves are the government.

Self-regulation does happen. It happened in the 1930s, when the market was corrected after the Great Depression that began under Herbert Hoover, and again after the Great Recession of 2008, when Barack Obama signed the Dodd-Frank Wall Street Reform and Consumer Protection Act.6

The economy serves the people, and not the other way around, as Polanyi would assert. This is good. And by keeping first things first we will have learned that there is no place for greed and jealousy, for all they do is "stir up conflict."7

If It’s Broken, Fix It

The oft-quoted English aphorism "if it ain't broke, don't fix it" is a good rule of thumb. Conversely, if something is broken, fixing it is in most cases preferable to replacing it, though I highly doubt that saying will catch on. Is Evangelicalism in need of fixing, as pundits seem to suggest?1 Are we ready to give up on Evangelicalism to avoid public embarrassment? Or do we euphemistically refer to it as something else? Worse yet, do we relegate it to the spinmeisters so they can attach a new meaning to it, vis-à-vis tolerance?

A few months ago I found myself within earshot of a conversation about whether to distance a major private Christian university from Evangelicalism. My heart sank upon hearing it. It sounded like giving in to cultural pressure. This organization had stood firm in the recent past, when the issues were far greater. We must always be cautious and prudent not to throw the baby out with the bathwater.

British historian David W. Bebbington offers a helpful reminder. There is so much good that has remained in Evangelicalism, even today, and to think otherwise is simply intellectually and academically naive. What founded Evangelicalism in 18th-century Britain remains largely unchanged.2 Astonishingly (providentially), the following distinctives are still commonplace: conversionism, activism, biblicism and crucicentrism.

A quick look at the statements of faith of Evangelical churches reveals this to be the case. A brief survey of the mission and vision of the church I attend highlights these qualities. It might be a bold claim, but I think we would be hard-pressed to find large swaths of Evangelical organizations deviating from their original founding values. Sure, there are drifts here and there, but those are exceptions. Our very own George Fox University can confidently point to her core values3 as solidly Evangelical, values that have remained unchanged since her founding more than 125 years ago.

One of the reasons we find ourselves at risk of losing Evangelicalism is that we have simply taken it for granted, in the sense that we no longer think and talk in our respective communities about what it means to be bearers of the good news (evangel). For starters, it would be worthwhile for church leaders to remind one another of the etymological origins of the word evangelical. It comes from two Greek words: eu, meaning good, and angelos, meaning messenger. Put together, they reintroduce us to our special identity as those who have received the good news and been entrusted with spreading it.4

The Lord knows we experience spiritual amnesia. This is why, time and time again, he instructed his people to erect monuments as remembrances of his faithfulness and goodness. We find an example of this in Joshua 3, where God causes the Jordan River to stop flowing so the Israelites can cross on dry ground into the Promised Land. Festivals and sacrifices serve the same purpose.

Ben Reaoch, a pastor serving at Three Rivers Grace Church in Pennsylvania, wrote an article about the importance of remembering. He cleverly arranges his points as a mnemonic in alliterative form: Think, Thank, Tell, Traditions, Transcribe, Taste and See, briefly expounding on each one. For the purposes of this piece we will not delve into them all. Suffice it to say, we are living in times when we can no longer afford to take our faith for granted.

Is Evangelicalism in need of repair? Os Guinness, social critic and Oxford-educated author, believes that in this American hour it is time for reform. He reminds us that centuries before anyone on this continent started calling themselves Evangelical, the Reformers of the 16th century claimed the name for themselves. The spirit of Evangelicalism goes as far back as St. Athanasius and St. Augustine, who spoke of the principle of being "reformed to the image of God." After we have regained and reclaimed who we truly are, we must repent and humbly ask God, and those we have wronged, to forgive us, because we have betrayed our beliefs by our behavior. Only then can true reform begin.

Who Needs Theology?

The late, great R.C. Sproul published a book in 2000 titled The Consequences of Ideas, in which he traces, in survey fashion, the contours of Western thought through the ages and their resulting effects on culture. It is one of the best of its kind: accessibly written yet comprehensive in scope. He starts with the pre-Socratics and takes the reader through the big ideas of Realism, Idealism, Rationalism, Empiricism, Skepticism and Existentialism (the order is intentional); and yes, all these "-isms" have had their consequences.

There is no time or space here to elaborate on the evolution of these ideas. Suffice it to say, the need for theology is dire, for in the study of theology we find answers to humanity's deepest questions: the meaning of life, existence, God and so on. Going back to our progression of ideas above, somewhere between Skepticism and Existentialism something terrible happened in the world of ideas, for which Christianity is still suffering the consequences.

Consider the following segment in the historical development of ideas. In the 18th century Immanuel Kant, whom many consider the father of modern philosophy, struggled with whether we can know God. Rationalism and empiricism had failed to provide satisfying answers. As his ideas developed, he argued that our minds distinguish between the phenomenal and the noumenal1, the former being the things perceived by the five senses, the latter the things beyond the realm of knowledge, which included the knowledge of God. He did not set out to argue that objective reality does not exist. However, the wall dividing the phenomenal and the noumenal is too wide to go around, too deep to get under and too high to climb over. But since ethics, the second of Kant's concerns, is necessary for human flourishing, we must, Kant argues, live "as if" God exists.

A generation later Søren Kierkegaard, living in the wake of the Kantian revolution, developed ideas that ultimately laid the foundation for existentialism. Wracked with guilt over the decadence around him, and over his own father's adultery, Kierkegaard concluded that life only makes sense through pain and suffering. In one of his works he advances his notion of the three stages of life2, each with increasing moral standing. The first two, debased stages can be reached through deliberation; the Religious Stage, the highest of the three, cannot be reached by thought, according to Kierkegaard. Yet since it is desirable, one must act with pure subjective passion and take a "leap of faith" to the top. This is fodder for what Francis Schaeffer, writing in the 20th century, called the leap to the "second story." Facts and science (the phenomenal) are bolted to the first story, while anything religious (the noumenal) is sent to the second story, which one can only reach by leaping. It is in the second story that subjective feelings reside, a place where knowledge is impossible.

Ironically, Kant and Kierkegaard were theologians, yet their works have derailed Christianity in disastrous ways. Since the Enlightenment the church has struggled to regain her prominence and relevance in society. It was also in this cultural milieu that many Ivy League universities had their beginnings, with the mission of educating students in theology in order to evangelize the world. Unfortunately, many of these institutions capitulated to culture and lost their Christian moorings. Anyone today would be hard-pressed to find any vestiges of faith except in the dark corners of these places, relegated to speculation and mere opinion. The distinction and separation between the sacred and the secular was secured.

Some of these institutions remained strong in the faith but, accepting the sacred/secular divide, decided to separate themselves from culture. This is how fundamentalism began. It had great intentions: to stay unyielding to the ebb and flow of culture and to go back to basics, the fundamentals of the faith. However noble and courageous the efforts of the founders of these places of higher learning were, they missed one important thing: culture.

Stanley J. Grenz and Roger E. Olson, in Who Needs Theology?: An Invitation to the Study of God, remind us:

“Christians have always sought to articulate their faith within the context in which God calls them to live and minister. We share the same task. Like our forebears, we desire to set forth our beliefs in a manner that will assist us in being the people of God in our world. That is, we desire a theology that is not only biblical and Christian but also relevant.”3

Of all the theological tools available to us, culture, I believe, is where most of the work needs to happen. The cultural challenges of today (LGBT+ issues, political divisions, tolerance, racial tensions, etc.) are not incidental to our theology. They ought to be front and center. That is not to say we should allow culture to dictate our theology. But more often than not, the church turns a deaf ear to the issues that matter most to the folks in our communities. The transformative power of the Gospel is more than this, but definitely not less.

So, who needs theology? To echo Grenz and Olson, we all do.4 But let's add one more thing: as believers, let us also make it our goal to make theology attractive enough that all will want to study it.

Humble Leadership

Scripture tells us in Romans 12:3 to "…not think of yourself more highly than you ought, but rather think of yourself with sober judgment, in accordance with the faith God has distributed to each of you." This is not to say that we ought not to think of ourselves at all. Well-meaning people misunderstand the meaning of humility. Many, especially secular folks, think humility means being a doormat, being submissive in the sense of allowing others to dominate you. Our Lord was humble in his birth, in his earthly ministry and in his death. Jesus says in John 10, "I lay down my life that I may take it up again. No one takes it from me, but I lay it down of my own accord." Humility is active, not passive.

When we think of a leader, we picture the CEO type of a large corporation: charismatic, flamboyant, a person with a commanding presence. Our culture and media prop up men such as Steve Jobs and Elon Musk, and women like Meg Whitman and Marissa Mayer, and tell us these are the quintessential leaders to emulate because of their success in business. This may be so, but it is interesting that none of those names even come up in academic leadership research. What might be missing from their skill set? Do they have what leadership researcher J. Richard Hackman calls the "It" factor, or not?1 Are leaders more or less born with leadership traits, or are those traits, as Jennifer A. Chatman and Jessica A. Kennedy suggest, skills that are "inherently developmental"?2

Whatever counts as good leadership, there appear to be convincing studies that have surprised experts and dismantled our assumptions. Whereas we thought great leaders were dynamic presenters, clever financiers or luminaries in their fields, it turns out that one of the most significant predictors of a great leader is humility.3 Jim Collins, who has studied and taught leadership for decades at the highest levels, calls this "Level 5 Leadership," well documented in his book Good to Great, where he says:

"We were surprised, shocked really, to discover the type of leadership required for turning a good company into a great one. Compared to high-profile leaders with big personalities who make headlines and become celebrities, the good-to-great leaders seem to have come from Mars. Self-effacing, quiet, reserved, even shy—these leaders are a paradoxical blend of personal humility and professional will. They are more like Lincoln and Socrates than Patton or Caesar."4

Years later, still convinced about this “shocking” discovery he adds in a recent Harvard Business Review article that “the essential ingredient for taking a company to greatness is having a “Level 5” leader, an executive in whom extreme personal humility blends paradoxically with intense professional will.”5

This is rather significant and demands our attention, because this time he adds the intensifying adjective "extreme" to humility. The study affirms what Christians have known for ages: we have to lead as Jesus led if we expect lasting impact. I, for one, am convinced this is the only effective way forward as we lead our churches, non-profit organizations, corporations and more. But the bigger question is: will we be the kind of leaders who, by God's grace, seek to be humble?

Practice Makes Perfect

Constructivism, deconstructionism, structuralism, poststructuralism, modernity, modernism, postmodernism, postmodernity and the like are useful methodologies that help us understand human nature and the way humans situate themselves in the world. Habermas, Heidegger, Foucault, Derrida and Rorty are some of the familiar names who dominate these fields of knowledge. While studying some of these thinkers in Profiles in Contemporary Social Theory, edited by Anthony Elliott and Bryan S. Turner, I realized that two shared experiences perhaps shaped these men: Marxism and World War I. This paper will not delve into the intersection of these two narratives except to make a simple observation.

"The war to end all wars," a phrase originally idealistic and now used sardonically, marked a very dark time in human history. Most wars until then had been fought locally, tribe against tribe, nothing like multiple nations fighting one another under the banner of alliances. That war, as heinous as it was, exposed human depravity in ways so unthinkable that it was inconceivable to imagine anything worse.

Marxism, on the other hand, predated the great wars and appeared on the scene when no competing views of human development, structure and functioning (social theory) existed, or at least were not in vogue. Karl Marx, in the mid-19th century, succeeded in capturing our collective imagination, framing and mapping our shared experiences of the time. When the "war to end all wars" failed to end wars, it was no surprise that the thinkers who followed were forced to rethink their ideas about human nature. This, in my opinion, is what led to the interdisciplinary art and science of social theory.

Fast forward to today. James K.A. Smith, a professor of philosophy at Calvin College, has done a great job of introducing some of these postmodern thinkers to the project of discipleship and to new ways of thinking about apologetics, albeit in subtle forms. His avant-garde ideas about human flourishing and behavior are not without their critics.1 Nevertheless, he brings a fresh perspective to the conversation that can no longer be ignored.

Evangelicalism has been a strong force for Christianity since the Reformation, and deservedly so. Its adherents helped us focus our attention on the primacy of God's mission (Missio Dei), the Gospel, which literally means "good news." That focus has lost its meaning in recent days, and we as followers of Jesus must seek relevant ways to once again partner with God in his mission to save souls. For far too long we have imbibed the notion that all our actions result from a process of deliberation in our minds, choosing the best options for the eventual outcome of our behavior. This is wrongheaded. Here Smith is helpful, directing our attention to social theorists such as Maurice Merleau-Ponty and Pierre Bourdieu, who claim there is another way of knowing, and that behavior is not primarily located in or generated from the mind.

The claim that our behavior originates somewhere other than our minds is unpopular. But there appears to be an inchoate, tacit order to our actions that, we might say, literally resides in the very core of our bodies. Karen Rouggly wrote a pithy, reflective blog post explaining this phenomenon.2 It is important that Evangelicals warm up to the idea that we are not simply "brains on a stick," because it just might be the missing ingredient in our sanctification.

Here is something to consider. Devoted followers of Christ seek to be like him. We read books, pray, and attend conferences and conventions to better understand why we behave the way we do. If all we do is rely on the mind's ability to deliberate its way deductively to a conclusion that compels righteous action, then we are deluded. If right behavior were contingent on right belief alone, we would expect far greater sanctification in our personal lives than we have already experienced. The fact that this is not the case tells us there is another avenue we have not considered.

Perhaps we ought to consider Smith's project in his book Imagining the Kingdom, where he talks about a process of "deformation" and examines the "Christian perception of the world" by borrowing concepts such as Merleau-Ponty's "practognosia," a know-how absorbed through our bodies, and Pierre Bourdieu's "habitus,"3 an inculcation that works deeply, in pre-reflexive ways, to the glory of God. Since all truth is God's truth, Evangelicals should not be afraid, but should press forward with courage to use tools like social theory to usher in the next revival.

We are nowhere close to another world war, God forbid, but we must not take these ideas for granted. No one wants to endure terrible human suffering just to adjust our thinking.


Here's a fun one. JTB is an acronym affectionately known among philosophers as Justified True Belief. It is a theory of knowledge claiming that for anyone to know anything, one must believe something that is true and have good justification for it. For example, I have a belief that I am writing in English. Evaluating that belief through the JTB model, I conclude that I do possess knowledge: (1) I believe I am writing in English; (2) it is true that I am writing in English; and (3) I am justified, since you are reading this and have thereby demonstrated that I have written in English. This appears obvious and uncontroversial.

This method of knowing had a rich philosophical pedigree and appeared unassailable until Edmund Gettier, an American philosopher, challenged it in 1963. In philosophy, if one can present a valid counterexample to a proposition, then that proposition must be reevaluated for soundness. Gettier rocked the philosophical world by demonstrating that a person could have justified true belief and yet have no knowledge. His famous counterexample was that of Jones and Smith vying for the same job.

This is not the place to summarize his particular counterexample, but consider a similar one. Sally glances at the hands of the clock, which indicate that it is 5:00 PM; it’s been a long day and it’s time to go home. Does she have justified true belief that counts as knowledge? Our first thought might be yes. It truly was 5:00 PM, and the clock indicated the right time. But unbeknownst to Sally, the clock had stopped 12 hours earlier, so basically she got lucky. This may seem silly and petty, but philosophers spend a lot of time and resources contemplating and writing about these kinds of thought experiments.

I confess, a part of me enjoys getting together with friends who are way smarter than me to have conversations around challenging subjects. I do it not only to delight in fellowship but also to learn. Books do the same for me. God created us to be curious beings. He created everything and gave us responsible dominion over it. Knowledge is indispensable to proper governance of the things God has entrusted to us. But how do we acquire knowledge? 

For most of my life, my method of acquiring knowledge seemed to follow JTB. I was not aware of the term then, but throughout my schooling rationalism1 (the epistemological view that regards reason as the chief source and test of knowledge) seemed to be the unquestioned foundation and warrant for teaching and learning. I suspect this experience is true for many of us. Being rational is good; being a rationalist is not. Those are two different things.

I owe James K.A. Smith a debt of gratitude for helping me understand that we are not “primarily theorizers.”2 We are more than our brain. We are not “brains on a stick” as Smith is fond of saying. He is right on this. There is another legitimate way of knowing without using the rigid matrix of JTB.

Sarah Pink is another pioneer in helping us understand a different method of knowing. In her book Doing Visual Ethnography she introduces us to the world of field research and shows how one can legitimately collect data using photography, video, and similar technologies. This data collection process, however, does not locate its primary purpose in the artifacts being studied per se. Instead, the researcher is careful to step into the world of the people (ethno) and write about them (graphy). The images an ethnographer collects are not analyzed in a rational, traditional way. Rather, camera in hand, he or she steps in and is invited to be part of a conversation to learn a people’s norms and behaviors. Pink rejects the idea “that the written word is essentially a superior medium of ethnographic representation.”3

Ironically, as I write this, we remember and celebrate the time when Martin Luther challenged the church about the excesses in some of her practices. While we cannot overstate the significance of that momentous event, which has given rise to rich cultures and human flourishing, it improperly drove a wedge between art and word. Before the Reformation, images had their own inherent way of communicating truth. Since then, the word has been installed as master, a mediator, the only way to appreciate art. This idea of art appreciation has continued to govern our way of looking at art, especially among Christians.

A Hollywood writer shared a story with me about the time Mel Gibson was preparing to release The Passion to the public. To garner feedback, Gibson decided to preview the film for Evangelical leaders before releasing it to moviegoers. After the showing ended, they all praised him for his excellent work. That did not stop them, however, from suggesting that he add the words of John 3:16 just before the credits rolled, as if the message were not clear enough. Gibson was wise to ignore that advice. The film went on to be a huge success, becoming the highest-grossing R-rated movie of all time. The love of God was communicated in moving pictures, and the visceral reactions of moviegoers were enough to assure us knowledge was gained.

I Wonder

“I wonder…” Those words, shared by Dr. Jason Clark, were meant to convey a particular posture in how we study and learn. I forget the exact context, but it was one of his talks meant to encourage our cohort to hold our ideas, thoughts, and learnings loosely. I am still captivated by the moment when I heard those words afresh in the context of higher learning. They evoke a kind of child-like playfulness, a naiveté, an invitation to imagine something else. They gently invite me to a space in my mind where I slow down in my thinking, pause, and reflect on my thoughts.

There was a time in my life when I tried to place ideas in a sort of binary form. Every idea in my mind went through a filter: true or false, valid or invalid, warranted or unwarranted, and so on. My project was simply not to have gray areas in my thinking; there was no time and space for them. I had assumed that for anyone to be considered a good and clear thinker, he or she would have considered and settled the big ideas in philosophy and theology, i.e., the nature of time, determinism vs. free will, Calvinism vs. Arminianism, etc. After all, theology was the “queen of the sciences” and once ruled the many domains of human knowledge. If there was anything important to be known, it was in these disciplines. My academic training, I’m afraid, encouraged this attitude, but in the end it also served as a corrective.

Thank goodness I have sobered up. I no longer think this way. Besides, it was tiring and, in the end, intellectually dishonest. There were so many occasions when I exclaimed, “Wow! I once held belief X so passionately and now know it to be false,” that I had to stop and reevaluate how I had come to know anything. My ego could withstand only so many of these moments before I realized I am not omniscient. Pride has a way of hiding the obvious. I did not know it at the time, but I was being introduced to intellectual humility, the virtue Richard Paul and Linda Elder write about in their chapter on Essential Intellectual Traits.1 It became clear: if I could be wrong about something I had previously not doubted, then no belief of mine was impervious to further scrutiny.

Did I succumb to wrong impressions about humility, the twisted notion that to be humble requires being a doormat? By God’s grace, no. Like I said earlier, I felt freer. Philip Dow, in his book Virtuous Minds: Intellectual Character Development, writes that because “intellectually humble people value truth over their egos’ need to be right, they are freed up to admit the limits of their own knowledge.”2 As an amateur Christian apologist it is easy to feel inadequate when I cannot answer seekers’ questions. While that is no excuse to neglect the hard work of study, it is liberating to admit that I do not know certain things but can always find out.

A prideful person does not listen to advice because of the predetermination that he or she is right. Scripture supports intellectual humility in ways we might have overlooked. Consider Proverbs 12:15: “The way of fools seems right to them, but the wise listen to advice.” Abraham Lincoln is arguably one of the best presidents in American history, famous for saving the Union, but not many know about his humility. One historian tells of an occasion during the Civil War when Lincoln gave Edwin Stanton a direct order that was disobeyed. Given their differences, which were no secret, one would have expected some deference on Stanton’s part. Lincoln could easily have dismissed him for insubordination, but instead he chose to meet with him and listen to what others in Lincoln’s cabinet considered an insult. He said, “that is no insult, it is an expression of opinion; and what troubles me most about it is that Stanton said it and Stanton is usually right.”3

When I think about Lincoln’s example of humility, I am brought to new heights, because with the Holy Spirit’s help we can be like him. And yet at the same time I am brought to new lows, because not enough men and women today come close to this. The reflective ministry practitioner in me asks: Do I exhibit the kind of intellectual humility in my research that lets me ask my faculty advisor for advice without hesitation? Am I willing to go where truth leads and expose prejudices for what they are? Am I willing to listen to differing opinions with a predisposition to change my mind? I wonder.

Aha and Eureka Moments

For as long as I can remember, writing well has been a roadblock to pursuing advanced degrees—at least the kind of writing required to pass courses. Secondary education and my years in college did not prepare me well for the task of writing. Sure, we had English, Literature, and Grammar, but nothing on how to write. I do not remember spending class hours on the mechanics of writing, nor any of my teachers spending time helping us write better. What I do remember are the red marks in the margins critiquing style, broken rules, punctuation, etc.

By the time I got to college, a particular kind of writing was assumed. Again, no instructor in any of my classes gave us any clue as to what goals or practical ends our writing ought to serve. Our syllabi had writing assignments, and everyone assumed that by the due date we would turn in pages with writing on them. Then the red marks and grades came in. I never failed a writing assignment, but I never knew in advance whether my writing would earn an “A.” If I ended up with high marks, it was accidental. What criteria were used to evaluate our papers seemed a mystery to me, especially when one puts a lot of time and sincere effort into them.

The anxiety over whether I could succeed at writing at the doctoral level had for a long time ruled out the idea of pursuing another degree beyond an MA. It still paralyzes me to think that a much longer paper will be due by the end of Spring 2019, and I wonder if I will be up to the task. I know it sounds crazy to even bring this up, since we are not close to being done with this semester and I am already worried about the next one.

I start with this to frame and contrast the new things I am learning in the art and science of writing. To say that Derek Rowntree’s Learn How to Study: A Realistic Approach is groundbreaking is, for me, an understatement. How so? Because it destroys all the rules of writing I grew up learning. This is absolutely an aha and a eureka moment for me. I kid you not, I literally jumped out of my seat when I read the following from Rowntree:

The best one-sentence guide to effective writing I’ve ever heard is: ‘Write like you talk.’ In my own writing (e.g. as in this book), I try to put down on paper what I would say to my reader if he or she were sitting there in front of me. In other words, I aim for a style that is informal and fairly conversational — but without being matey or chatty. Whether such an approach would be acceptable to your tutors is something I leave you to decide.1

That was under the subheading “Writing simply and directly.” He continues with a list of tips that were, without exaggeration, the opposite of what I was taught in high school, e.g., it is okay and even preferred for writers to use personal pronouns such as “I,” to use everyday words, to use short and simple sentences, and so on. I do not know, nor can I explain, why I was taught the complete opposite of what this book teaches. It is not as if this resource was unavailable when I was going through high school. It was so perplexing that I asked my son, who is in the 10th grade and was sitting next to me when I had my writing epiphany, whether any of these things in the book were news to him. He basically said that they learn all the rules just like I did, but their teachers give them leeway in their writing assignments; more freedom to express themselves.

Listening to my son, I was reminded of the proper role rules play in writing, and for that matter in much of life. Rules are very much like fences or railings. They serve the purpose of keeping things from falling off the edge or into other dangers. It does not follow that just because they are there we should stay close to them. The space in the center provides safety and freedom to explore; in this case, more freedom to explore and express our ideas on paper. Indeed, the realization that writing has far fewer restrictions than I had been taught is liberating. I just hope and pray our tutors agree with Rowntree on this.