Sunday, December 22, 2013

The Nature of Human Nature II

[Johnathan Clayborn]

I know, I write about all of these weird philosophical topics a lot. Human nature is one that I find myself coming back to over and over. I have this deep-rooted desire to understand everything, especially why I am the way that I am and how I fit into the larger structure that is human civilization.

I have been reading several books lately; among them is How the Irish Saved Civilization, by Thomas Cahill. I've been enjoying it thoroughly so far. At one point in the book the author mentions something in passing that really struck a chord with me, and I immediately took it and applied it in a different context.

The author had mentioned how emotional the Irish were, and how the emotions they had back then have not changed; we still feel the same way today. That was an epiphany of sorts for me. This entire time I have been looking at human nature as a cognitive or behavioral function. But it occurs to me now that this is folly. Our behavior, as individuals and as a society, changes over time. Our thought processes, individually and collectively, also change over time as we learn new things. But our emotions remain constant. For thousands upon thousands of years the emotions of the human spectrum have not changed.

We are gifted with a wide range of emotions: angst, fear, love, hope, anxiety, despair, sorrow, joy, grief, sadness, pride, depression, fury, terror, jealousy, happiness, regret, euphoria, anger, faith, rage, remorse, pity, compassion, peace; a broad, wonderful, beautiful range of emotions. The despair and terror that we feel today are just as powerful and overwhelming as those felt by our ancestors who lived hundreds of thousands of years ago.

I suggest that human nature is not defined by specific actions that we take individually or collectively, but rather by the range of our emotions. Simply put: human nature is to feel. That is the curse and the blessing of the human condition. Whether the feelings be good or bad, slight or overwhelming, it is those very feelings that let us know that we are alive.

Monday, December 9, 2013

The Platonic Fallacy

[Johnathan Clayborn]

Those of you who know me well know that I am a voracious reader. I read lots of books on a wide variety of topics. I've been reading a fascinating book on Irish history (one that I lost and had to buy again) called "How the Irish Saved Civilization" by Thomas Cahill. It's a pretty good book, and quite insightful. One of the things that he mentions at one point is the Platonic Fallacy, which was something that I wanted to delve into a little more.

The Platonic Fallacy, of course, derives from the great Greek philosopher, Plato. The crux of the fallacy is the thought that "knowledge is virtue". This is something that I had encountered before as a youth and as a teenager, but I never really paid much attention to the deeper meaning of it until now, especially not in the context of what Plato actually meant.

According to Plato, knowledge is virtue. In his mind, the two words are synonymous and interchangeable. But are they really? The dictionary defines virtue as "behavior showing high moral standards", and lists synonyms: goodness, righteousness, morality, integrity, dignity, rectitude, honor, decency, respectability, purity, etc. Never once does the dictionary list knowledge as a literal synonym for virtue.

Knowledge is many things. As one adage expresses, knowledge is power. That might be true, but if knowledge is power, and if Plato was right, wouldn't that also make power equal virtue? Power is many things, but Virtue it is not. As another old adage says, "power corrupts, and absolute power corrupts absolutely". So, either this adage is wrong and power is virtue, or else Plato is wrong. Based on the words and actions of the people who are often in power, I would tend to believe that the adage is right and Plato is wrong.

Knowledge can be many things. Knowledge can be respectable. Knowledge can be pure (especially knowledge that is pursued for the sake of simply knowing and not trying to prove or disprove anything). Knowledge can also be dignified, to a point. But that is where I think the similarities between knowledge and virtue end. Am I saying that knowledge is an ill-gotten endeavor to be discredited and shunned? Hardly. I wouldn't be a PhD student if I felt that way. But I am saying that knowledge itself does not possess any intrinsic properties which make it morally just or righteous. Knowledge is a tool. And, like all tools, the purpose and intent of the tool is shaped and defined by the will of the person who wields it.

Let's think about this in more depth. There are ethics: rules by which society mandates we should abide. Then there are morals: rules by which our own personal beliefs and convictions mandate that we must abide. Knowledge may help shape and inform these different rules, but knowledge in and of itself is not the same as these rules.

To put this into a practical, philosophical example, let us speak of the digital age and computers. With practice, experience, and a little tutelage, anyone can learn to use computers, and learn to use them well. With copious amounts of practice, and knowledge gained through experience, some people can become quite skilled at modifying lines of computer code to change the way that the computer behaves. This is knowledge. By itself this knowledge is neither good nor evil. It simply exists in a state completely devoid of moral or ethical implications, as all knowledge does. But when a person decides to use this knowledge, it takes on the moral and ethical implications of their actions. For example, if a person decides to use the knowledge of changing lines of computer code to steal information, or forge fake identities, or steal money, then society would typically say that the knowledge of "hacking" is evil and shouldn't be learned. However, without understanding how hacking works, other people cannot use that same knowledge to build software and tools to protect people from those who would use it for ill.

Just because a person possesses knowledge does not automatically make them virtuous; their actions determine that. And there are certainly many people who perform virtuous actions without much education or knowledge. The amount of education that a person has doesn't automatically make them better than anyone else, or more important, and certainly not more virtuous. Virtue has one simple and unequivocal measure: the intent and actions of an individual. I have more to say on that topic, so maybe "intent" will be the topic of another post in the near future.

Sunday, December 8, 2013

The Nature of Human Nature

[Johnathan Clayborn]

As many of you who know me well know, I often volunteer my time for various activities and organizations. I've been part of Relay for Life committees, been on the public safety council for my city, and fed the homeless more times than I can count. One of the volunteer things that I do is answer questions on AllExperts.com. I volunteer both as an English language expert and in the "writer's block" category.

For the most part, the people who ask questions there are learning to speak English and they want to know if they've used the correct verb tense, or the correct adverb, etc. However, every now and then one of my regular, recurring questioners asks me something profound. Today I got an email from one such person in Poland who has emailed me several times. As they learn more about me and my background and experiences, a few of them sometimes use that channel to ask questions about philosophy or psychology. The last time I heard from this gentleman in Poland he was asking me to interpret what a particular author was saying about different kinds of love and whether or not I agreed. Here was today's question for me:

Hello Mr. Johnathan. 
Once again I'm not going to ask about grammar or writing, but rather about psychology. 

Last time I asked you about love. Today my question pertains to a similar subject, I think. 

Do you think that all people, by nature, need someone else to share their life with? It seems to me that most of the people cannot live alone, they feel they need someone and sooner or later find him or her. Of course there are monks/priests/nuns etc. who live in celibacy, but aren't they, in this way, going against their nature? 

I think I heard or read somewhere that we are, in a sense, (I'm not sure which word is correct here) imperfect or handicapped because we cannot live (or at least it's not comfortable for us to live) without the other person - of course I'm talking about a relationship between a man and a woman.

What are your thoughts on this?

I rather enjoy these more philosophical questions, as they break up the tedium of explaining verb-tense agreement and other such grammatical rules.

This is a very complex question, with many different factors to consider and many different possible philosophical lenses to view the question from. 

I'll start with the easier part of this question: whether or not people can live their lives alone. This, in and of itself, is a complex question. Intrinsically, humans are very social creatures. We like to belong to groups. As Professor Diamond explains in his book, "Guns, Germs, and Steel", this is a throwback of evolutionary psychology from our hunter-gatherer stage. Being part of a group meant that you would have a much better chance of survival. This component of human psychology has carried over to modern society. We, as a species, do prefer to be part of a group. We often find ways to associate ourselves with a group and we actually strive for and foster that "us versus them" mentality. When there are no ready-made groups available, we often invent some so that we can develop this sense of belonging, which is one of the steps in Maslow's Hierarchy of Needs. With the advent of the internet, this is much, much easier than ever before in history. Now we have the ability to divide ourselves along racial lines, nationalities, political lines, religious beliefs, sports teams, which games and shows and bands we like, etc.

Just because humans have a natural tendency for socialization and belonging doesn't mean that people as individuals are always well-suited for this. There was an article in a recent issue of Psychology Today magazine that talked about the growing demographic of deliberately unmarried single adults, people who have chosen to live a single lifestyle because they are happier being by themselves than with a partner (a concept that seems strange to me, personally). As humans I do think that we all want to belong, that we all want to love and be loved, and that we all want someone to share our joys, our achievements, and our heartaches with. But I do think that there are a number of people out there who are not only capable of living alone, but who actually prefer it. Many of these people date, and fulfill their need for intimate human contact with either friends who understand the boundaries of that relationship, or with people whom they meet randomly every so often. 

But this brings us to the question of whether or not this behavior is a violation of human nature. When I was younger I used to do a philosophy exercise. I would pick a question or a topic, and on one day I would make arguments in favor of that position; on the next day I would argue the opposite point. I would pick random things like "Can one person really make a difference?" and argue the point back and forth with myself for weeks. One of my questions was "Is there really such a thing as human nature at all?" To this day I am convinced that if there really is such a thing as human nature, it is not a cleanly-defined set of rules or parameters for behavior. I think it's much more complex and abstract than that. The reason I think this is that almost every type of human behavior you can think of has a counterpart that exists in dichotomous opposition to it. For example: there are people who would donate and give away most of the things that they own to help others, but there are others who would steal and hoard things away for themselves. There are those who would stand up and protect the weak and the infirm, but there are also those who would pick on them and prey on them. There are those who say that human nature is to believe in something greater (perhaps God), but there are also those who not only don't believe in religion but don't believe that there is anything greater. I could go on, but the point is: how can any of these behaviors be "human nature" if there exists another behavior that is the exact opposite? If it were human nature, then wouldn't all people be more or less compelled to be that way through the intrinsic properties of human DNA? So I propose to you that maybe human nature is to be in conflict, with ourselves and with each other, in such a way that there are no clearly-defined rules that define the human experience.

Tuesday, November 26, 2013

The Speed of Dark?

[Johnathan Clayborn]
Here's a fun philosophical/physics quandary for you, a conversation I had with several of my coworkers a few years ago: what is the speed of dark? This question isn't intended as a joke, but as a serious thought for discussion. Logically, this question only has two possible answers: either the speed of dark is 0, or the speed of dark is the same as the speed of light. As shorthand, this could be expressed as ds = 0 or ds = ls, where ds is the speed of dark and ls is the speed of light.

Answering this question requires a little bit of discussion about the nature of dark. What exactly is dark, anyway?

When this question first came up in the original discussion, one of my coworkers immediately fired back with the comment that there was "no such thing as dark". I found this to be a perplexing statement. No such thing as Dark? Really? When I questioned him he said that Dark was really just the absence of light, and therefore not a real construct. But if Dark is just the absence of light, and therefore not real, then what would we measure light against in order to assign it a lumens rating or candlepower? It could be argued that "Dark" is simply 0 on the lumens scale, but being that it is an integral part of that scale and the basis of all luminosity measurements, it certainly seems like it would be a real and valid construct.

Okay, so if we operate on the assumption that dark is a valid theoretical construct, how does that help us understand how fast this thing moves? The answer to that question lies in understanding the nature of the thing. How does it work? What is it comprised of? In the case of light, this is easily understood. Light is an energy wave that is comprised of photons. These photons have energy, but no mass. So light is not a solid thing. What about darkness?

Darkness, on the other hand, does not intrinsically contain photons, although it does absorb them. It then converts these photons into heat energy (which is why black cars are hotter than white cars). These heat-related properties are explained by many of the principles of thermodynamics.

To recap: light is an energy wave that contains photons and moves with momentum. Darkness is a construct that absorbs photon energy and re-emits it as heat. But does it move? What happens when light spills into a void where darkness once was? Does the darkness vacate that space at the same speed at which the light enters? Or does the darkness linger and allow the light to permeate through it?

Although not immediately obvious, the answer is that the speed of darkness is actually 0. Darkness doesn't flee before the light. Darkness absorbs the light until a specific threshold is reached and no more photons can be absorbed, and then it essentially becomes light.
How can we know that this is true? The answer has been in front of us the whole time: light and dark can occupy the same space at the same time. We know this because dimmer switches are possible. It is possible to adjust how much light floods a space by increasing the intensity of the light emissions. Once the light stops being emitted, the influx of photons ceases and the darkness bleeds off the photons it has absorbed and returns to being darkness once more.




Trusting Your Instincts

[Johnathan Clayborn]
Recently I've found myself in a situation that I would have never dreamed of finding myself in. It's a quandary with no real right answers one way or the other. It's not a situation that I would have asked for, and yet, despite that, it's one that I'm very glad that I'm in. And yes, I'm being deliberately vague, and no, I won't elaborate.

The point of this is that sometimes, you just have a feeling and know what you need to do. You may not want to do it, you might dread it or put it off, but you know what needs to happen. How do you know? Because you feel it in your gut. This undeniable inner voice from somewhere deep within your core speaks to you, telling you how it is.

It's kind of one of those situations where, had I explained the details, you would most likely consider me completely crazy. And yet, I know more surely than I have ever known anything that this is the right path to take, even though it's not the easy path. How do I know? Because my inner voice tells me so. My psyche, or my subconscious, or my soul, or whatever you want to call it, speaks, nay shouts, to be heard. To deny what my gut tells me is right causes me stress, anxiety, and emotional and physical discomfort.

But what if you don't have such a strong feeling? What if you are faced with a choice and you aren't sure how to proceed? Are there other ways to tell? There are lots of tricks to help you. One piece of advice from Lifehacker is to pretend that you are giving advice to a friend who is going through the same situation. This is easier said than done in most cases. I'm very good at giving advice. Many of my friends seek me out because of my wisdom and insight. However, even knowing what I would tell them in that situation, sometimes it's difficult to take your own advice. So what other options are there?

Lifehacker also suggests limiting the amount of information that you take in. Sometimes taking in too much information can cloud the decision making process because you get hung up on the trivial details. Of course, in some cases this is not always possible, so what else is there?

Making a list of pros and cons can help some people who are very analytical or overly-organized see "the big picture". I've certainly done this a few times myself, but it doesn't work in every situation.

Some research suggests that there are many different factors that might affect your decision making, including (but not limited to): time of day, how angry you are, your surroundings, the ambient music, etc. But if you are aware that these might be influencing your decision, they are easy enough to counteract.

One resource suggests something called the "10-10-10 rule" for decision making. Ask how you will feel in 10 minutes, 10 months, and 10 years about your choice. If you would still be happy with it, it's a good idea. The fallacy of this logic is that you don't know how you will feel in 10 months or 10 years; all you have to go on is how you feel right now.

Surprisingly, there is a lot of research suggesting that quick, snap decisions can be the best method. There was an article in Psychology Today that talked about this premise, although I can't seem to find it, so I'll point to what is essentially the same concept on Lifehacker. Basically, the PT article said to think about the question, close your eyes, push everything out of your head, and then, once you are calm, count to three and say your answer. Whatever you decide under pressure like that is generally better and will make you happier. The Lifehacker method was a bit more involved and included a piece of paper and a long distraction, but it relied on the same basic gut response. One poster who responded to the Lifehacker article had a brilliant idea: flip a coin, heads or tails, and then take note of which side you hoped it landed on. That would be the right decision.
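For the curious, the coin trick fits in a few lines of Python; a minimal sketch assuming a simple console prompt (the function name and example options are made up purely for illustration):

```python
import random

def coin_trick(option_heads, option_tails):
    # Flip the coin; the result itself is almost beside the point.
    flip = random.choice(["heads", "tails"])
    print(f"The coin landed on {flip}.")
    # The real answer is whichever side you caught yourself hoping for
    # while the coin was in the air. Assumes an answer of "heads" or "tails".
    hoped = input("Which side were you hoping for? ")
    return option_heads if hoped.strip().lower() == "heads" else option_tails

# Example: coin_trick("take the new job", "stay put")
```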

The point of all of this? Some decisions can seem impossibly hard on the surface, especially if you think about and analyze them with your cognitive brain. But sometimes it's just best to turn that off and trust your gut instincts.


The Necessary Art of Self-Deception

[Johnathan Clayborn]
I know that as a blogger I'm not doing a very good job of posting regular articles. But, to be perfectly honest, this pursuit is at best of quaternary importance to me, falling in line after school, family, and work obligations are met (not necessarily in that order). So it should come as no surprise that I haven't posted in a while. These last few months have kept me very busy, but on the upside I have officially completed the first year of my PhD program.

I had many interesting conversations and topics come up during the last couple of months, but didn't have the time to write about them. I'm going to use this opportunity to write about one of the more recent conversations that came up in school during my Culture and Ethics class.

As human beings we are amazingly good at the art of self-deception. We also have this amusingly frustrating ability to define things in the absolute. This is particularly true when it comes to questions and topics of a moral or ethical nature.

One thing that I've noticed of late is that humans are generally very quick to assign blame and to criticize or ridicule those whom they feel have violated their preconceived notions of morality. We have our own set of moral codes and beliefs that we staunchly and stoutly cling to, safely believing that our moral code is infallible and that we would adhere to these rules unquestioningly.

In truth, life is not that simple. Now, mind you, I'm not advocating the creative implementation of excuses to defy your moral code. I'm not even suggesting that you should actively seek opportunities to do so. But I am saying that when it comes to how we evaluate our actions and behavior, perhaps there is a little too much confirmation bias at play.

Often I have been in conversations with people who say things like "I would never steal anything". Really? Never? Never ever? That's an interesting lie that you're telling yourself. Now, for 1 person out of 1,000 this might be true, with emphasis on might. But the vast majority of us are engaging in self-deception. Does this mean that most people are thieves and crooks? No. But what I am suggesting is that there might exist an extenuating circumstance so grievous that the most logical course of action would be to break your moral code.

Take, for example, a natural disaster or a public crisis. Suppose that your city is in a panic. The power is out. Your refrigerator is no longer working. Instead of having 2 weeks of food on hand (which is the national average according to FEMA, by the way), your cold goods have spoiled and you have 5 days' worth of dry goods to eat. In a survival situation people should keep in mind the rule of 3's: people generally cannot survive more than 3 minutes without air, 3 days without water, or 3 weeks without food. Now, bearing this in mind, imagine that you are in this situation with no electricity and little food, and fast-forward 2 weeks to the point where you haven't eaten in days and you're starving. Maybe you have small children to think of as well. If you came across a store that had food inside, would you break in and steal it? Most people would when faced with that situation, even those who devoutly profess that they would never steal anything. What if someone was hoarding food? Would you physically attack them to try to take food for your own family? If they thought they could win the fight, or if they were desperate enough, most people would. Of course, it's easy to deny it now when everything is kosher and say that you're above all of that, but if you ever find yourself in such a situation, see what you think then.

So why is it that we do this? Why do we have this inherent need to lie to ourselves? The answer, simply, is because we have to. We have to believe that we are good people, incapable of egregious acts like the ones I described above. To admit to ourselves that we would, in fact, break many of our own moral codes if the situation mandated it presents us with a contradiction that our cognitive brains are not capable of reconciling. Granted, the tipping point for most people to cross their ethical and moral boundaries varies quite a bit. For some people it doesn't take much, but for others the situation has to be extraordinary. A lot of this self-deception is the fault of our own egos and super-egos, which are the psychological processes that our brains use to develop our self-image. We tend to be happier and healthier when we can picture ourselves as good people, incapable of any wrongdoing at any time.

Friday, November 1, 2013

Albert Bandura: Social Modeling Theory - fact or fiction?

[Johnathan Clayborn]
Question: Can classroom bullying be attributed to social modeling and/or vicarious learning?

To answer the question of whether or not classroom bullying can be attributed to Social Modeling Theory, we have to first examine if social modeling theory is viable and logical outside of the classroom.

Albert Bandura postulated that people learn behavior through modeling (Bandura, 1977). According to this theory, people learn how to behave by watching those around them. There are certainly some examples where this would seem to be the case, but as a general principle, I disagree with Bandura, at least with regards to violence. I do not believe that violence is a trait that is learned through social modeling at all. Violence, whether we like it or not, is an intrinsic part of being human.

Any day that you watch the news you will see evidence of violence: shootings, muggings, rape, murder. It happens every day all over the world. Much of the media is quick to blame violent video games, saying that the shooters were influenced by and learned how to do it from these video games, which would support Bandura's theory (ABC News, 2013). However, others, myself included, are quick to argue the fallacy of this logic in what has become a hotly debated topic (Atlantic Wire, 2013).

Regardless of which side of this debate you find yourself on, you cannot logically deny the fact that violence existed long before video games. There are many famous mass shootings that happened prior to the advent of violent video games: George Hennard - 1991, Patrick Henry Sherrill - 1986, James Huberty - 1984, George Banks - 1982, Charles Whitman - 1966, etc. (CNN, 2013). So, if social modeling through video games is to blame for today's modern killers, then what social modeling is to blame for these killers? Television? What about mass murders that happened prior to television? Did Jack the Ripper learn how to kill through social modeling in Victorian literature?

In parts of the world that are completely devoid of media, children still play violent games with each other (Diamond, 1999). One could make the argument that these children learned to be violent through social modeling by their parents and other adults in their communities. However, violence has been used as a family event and a form of entertainment for hundreds of years. During the "wild west" and the colonial era of American history, public executions were a common practice that included the whole family. This practice originated in medieval Europe, where it was preceded by the Roman practice of the gladiatorial games.

Probably the most damning challenge to Bandura's theory is Otzi the Iceman. Otzi is the oldest intact, complete set of human remains ever recovered anywhere in the world, his mummified corpse dating back 5,000 years (PBS, 2011). He is damning to Bandura's theory because, in addition to being the oldest intact human ever found, he's also the world's oldest cold-case murder victim; he was shot in the back with an arrow and cudgeled over the head (PBS, 2011). If Bandura's theory of social modeling is correct, and we learn our behaviors from those around us and our parents, then where did the people who came before learn how to be violent? One might argue that violence was learned and perpetuated from the earliest humans, but where did they learn it? Additionally, there are many examples of people who do the opposite of the behaviors that were socially modeled for them: pastors' children becoming involved in gangs or drugs, people who grew up in a gang family becoming police officers, etc. If social modeling is correct, then why do these people deviate from, and do the exact opposite of, the behaviors that were modeled for them?

To get back to the original question: can classroom bullying be attributed to social modeling and/or vicarious learning? No, I don't think so. There is a lot of evidence suggesting that this is not the reason why children bully each other. I would argue that there is something within our residual evolutionary psychology that makes children bully each other, rather than social modeling, and that peer pressure and the need for acceptance play a much bigger role.

Personally, I think that peer pressure and evolutionary psychology play a more important role than social modeling. Humans are an intrinsically social species. In our days as a hunter-gatherer society, ostracization from the group meant almost certain death (Diamond, 1999). This is one reason we still like to form groups and perpetuate an "us vs. them" mentality in everything we do, from religion to politics to sports. In a throwback from our hunter-gatherer days we still view strength as a desirable trait, and people are inclined to try to "prove" their strength by picking on those weaker than they are.

References:

ABC News (2013). How violent video games fit in with violent behavior. http://abcnews.go.com/Technology/navy-yard-shooter-played-military-style-videogames-relevant/story?id=20285169

Atlantic Wire (2013). Don't blame video games for Monday's mass shooting. http://www.theatlanticwire.com/entertainment/2013/09/dont-blame-violent-video-games-mondays-mass-shooting/69499/

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191-215.

CNN (2013). 25 deadliest mass shootings in U.S. history. http://www.cnn.com/2013/09/16/us/20-deadliest-mass-shootings-in-u-s-history-fast-facts/

Diamond, J. (1999). Guns, germs, and steel: The fates of human societies.

PBS (2011). Iceman murder mystery. http://www.pbs.org/wgbh/nova/ancient/iceman-murder-mystery.html

Sunday, August 18, 2013

Counting Creation

[Greg Bullock]
The Schwarzschild radius of a mass is the size that the mass must be compressed to for it to become a black hole. For our sun, the Schwarzschild radius is about 3 km, much smaller than the sun's radius, so the sun is not a black hole.
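As a quick sanity check, that figure falls out of the Schwarzschild formula r_s = 2GM/c^2; a minimal sketch in Python, with rounded values for the constants:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # mass of the sun, kg

# Schwarzschild radius: r_s = 2 * G * M / c^2
r_s = 2 * G * M_sun / c**2
print(f"{r_s / 1000:.2f} km")   # ~2.95 km, the "about 3 km" quoted above
```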

Black holes are formed when stars much larger than our sun collapse under their own gravity until they are compressed to a size smaller than their Schwarzschild radius. At that point, the event horizon of the black hole forms at the radius. Our own universe appears to contain enough matter so that it is effectively in the interior of a black hole, with the event horizon occurring at the universe's cosmological horizon.

There is an upper limit to the amount of information contained in a black hole, and it is determined not by the volume of the black hole, but by the area of the two-dimensional surface of its event horizon. For our universe, this upper limit is estimated to be 10^122 bits. An incomprehensibly powerful computer (located outside of our universe) could store the state of the universe at a particular instant by recording the state of these 10^122 bits.
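A rough sketch of where a number that size comes from, assuming (for illustration) the Hubble radius c/H0 as a stand-in for the cosmological horizon, and the standard holographic bound of one bit per 4 ln 2 Planck areas:

```python
import math

c = 2.998e8       # speed of light, m/s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34  # reduced Planck constant, J*s
H0 = 2.2e-18      # Hubble constant, 1/s (~68 km/s/Mpc)

l_p = math.sqrt(hbar * G / c**3)  # Planck length, ~1.6e-35 m
R = c / H0                        # Hubble radius, ~1.4e26 m
A = 4 * math.pi * R**2            # area of the horizon, m^2

bits = A / (4 * l_p**2 * math.log(2))  # holographic bound, in bits
print(f"{bits:.1e}")                   # ~3e122, on the order of 10^122
```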

Time is not infinitely divisible. The smallest unit of time that has any meaning is Planck time, the time it takes light to travel a distance of one Planck length. Storing the state of the universe at each interval of Planck time would enable the simulation of the universe, by playing back the states in order. Constructing such a computer would be tantamount to constructing the universe.
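For concreteness, a minimal sketch of that arithmetic (the 13.8-billion-year age of the universe is an assumption brought in for illustration, not a figure from the text above):

```python
l_p = 1.616e-35   # Planck length, m
c = 2.998e8       # speed of light, m/s

t_p = l_p / c     # Planck time: light crossing one Planck length
print(f"{t_p:.2e} s")   # ~5.39e-44 s

# Snapshots needed to "film" the universe so far, one per Planck time:
age = 13.8e9 * 3.156e7  # age of the universe in seconds
print(f"{age / t_p:.1e} snapshots")  # ~8e60 states to record
```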

It is not possible to build such a computer, because the state of the universe cannot be completely determined, due to the Heisenberg Uncertainty Principle. There is a way, however, to build a computer to accomplish the same thing without having to determine or store any configurations of the universe.

All that is necessary is to build a computer that counts in binary. Given an infinite amount of time, the computer will eventually count up to a configuration state of our universe. In fact, the computer will eventually count to a vast number that represents all successive configurations of our universe, from the first instant of creation to the last evaporating black hole. In the process of counting to infinity, it will not only calculate our entire universe's full history, but all possible universes and their histories.
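A toy version of that counting machine is short enough to write down. The sketch below enumerates every configuration of a 3-bit "universe"; the essay's version would need 10^122 bits per state and unbounded time, but the algorithm is exactly the same:

```python
def enumerate_states(num_bits):
    """Count in binary, yielding every possible num_bits-bit configuration."""
    for n in range(2 ** num_bits):
        yield format(n, f"0{num_bits}b")

# A 3-bit universe: plain counting eventually visits all 8 configurations.
for state in enumerate_states(3):
    print(state)
```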

It may be that everything we perceive in the universe, as well as we who are doing the perceiving are nothing more than numbers reached by counting—no different from the counting we learned to do when we were small children, first realizing there was no limit to how high we could count.

Friday, April 26, 2013

Mature Politics

[Greg Bullock]
As people age, they are said to become more "mature." The term implies not only increasing chronological age, but also changes in temperament and outlook that typically, though not invariably, come with age.

The traits associated with maturity are those such as self-control, generosity, and the ability to plan for the future. These sorts of characteristics are conspicuously lacking in young children, and to a greater or lesser degree in immature adults. Although the attributes of maturity seem to have little to do with one another aside from being correlated with chronological age, they do share a common feature: increasingly broad horizons of perspective.

People who are mature do not focus on themselves alone or on the immediate at the expense of the future. They are able to understand the perspectives of others and exhibit sympathy. They make sacrifices over the long term to secure a better future. They exhibit emotional self-control, because they understand other viewpoints and the implications of irrational behavior. They can deal with setbacks and frustrations, because they can draw on experience to anticipate that things will eventually get better.

Maturity seems to be increasingly lacking in our society. In fact, a lack of maturity is the central problem in politics today.

People cannot engage in civil discourse because they are incapable of understanding others' points of view. They pursue policies that benefit themselves regardless of the consequences for others. They even favor policies that benefit themselves in the short run at the expense of harm in the long run. The mature not only care about their own immediate future, but also the well being of future generations. They are willing to compromise because they appreciate the concerns of others and don't feel the need to vilify opponents as evil or ignorant.

The maturity of the founders of this country is palpable when you read what they wrote. Ironically, as our country has matured chronologically, it has regressed emotionally. It is difficult to see a way forward, but another trait shared by the mature is persistence motivated by hope and a determination to move forward one step at a time rather than giving in to cynicism or despair.

Tuesday, April 16, 2013

The woes of standardized testing

[Johnathan Clayborn]
I've been extremely busy lately and haven't had much time to write. But this week's discussion question in my Psych 8760 - Ed Psych class proved to be worthy of a post here. This week we're talking about standardized testing and its role in the school system. Honestly, I've sort of been waiting for this discussion to come up. I have rather strong opinions about this topic, as I've witnessed this phenomenon firsthand.

One must understand the potential benefits of standardized tests, of which there are many. First, standardized tests can be used effectively in the formation of summative assessments that can potentially be used as a guide to determine how much information a student has retained (note: this does not necessarily prove what they have understood). To this end, summative assessments and standardized testing can also be used to uncover flaws in the curriculum itself (a technique I employ frequently). If 80% of the students in a given class miss the exact same question on the test, there are only two logical possibilities: the question is poorly written and confusing, or the information pertaining to that question was not effectively delivered. Standardized testing can also be used in a secondary capacity to measure the teaching ability of a teacher; all things being equal (classes made up of the same demographic, teachers with access to the same curriculum and the same resources), the scores of multiple classes should, in theory, be within a few points of each other. Any deviation above or below the median scores by a significant amount would warrant a closer look to determine what's going on.
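A minimal sketch of that item-analysis technique in Python (the answer key and student responses below are hypothetical, invented purely for illustration):

```python
from collections import Counter

# Hypothetical answer key and student responses for a 5-question test.
answer_key = {1: "B", 2: "D", 3: "A", 4: "C", 5: "B"}
responses = [
    {1: "B", 2: "D", 3: "C", 4: "C", 5: "B"},
    {1: "B", 2: "A", 3: "C", 4: "C", 5: "B"},
    {1: "A", 2: "D", 3: "C", 4: "C", 5: "B"},
    {1: "B", 2: "D", 3: "C", 4: "A", 5: "B"},
    {1: "B", 2: "D", 3: "C", 4: "C", 5: "B"},
]

# Count how many students missed each question.
misses = Counter()
for student in responses:
    for q, answer in student.items():
        if answer != answer_key[q]:
            misses[q] += 1

# Flag any question missed by 80% or more of the class: either the item
# is badly written or the material behind it wasn't delivered effectively.
threshold = 0.8 * len(responses)
for q in sorted(answer_key):
    if misses[q] >= threshold:
        print(f"Question {q}: missed by {misses[q]} of {len(responses)} students")
```

Here question 3 gets flagged; deciding which of the two explanations applies still takes a human looking at the item itself.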


The biggest problem that I have, personally, with standardized tests is that they are over-hyped and over-emphasized by morons in politics. Bean counters in Washington and at the state and county levels wanted a way to easily track and compare data about how much a student has learned and determine whether a school is being "effective" or not, which in turn directly affects how much money the school receives. Reading essay questions would take too much time and be impractical, so they forced standardized testing down the school system's collective throat. People who have little to no background in education are dictating policy on how the education system should be run; a recipe for disaster. Now they are even beginning to tie teachers' annual pay raises (and even their continued employment) to how well their classes do on these tests! Is it any wonder that 50% of all new teachers quit within the first 5 years and that education is quickly becoming one of the professions with the highest turnover rates?

I say that I've seen what's going on first-hand because I have experienced it in other areas. In the IT industry, in which I have worked professionally for many years, the certification that everyone wants/needs is called the "A+ Cert". It is a standardized test consisting of about 120 questions. There are literally "boot camps" you can attend where, for a nominal fee, they will cram the test answers into your head through rote memorization long enough for you to pass the test. Hooray! Now you have your A+ certification on paper, but you quickly forget the material, and worse yet, you have no idea what any of it means. You know that the answer to question 27 is B, but you have no idea why the answer is B. Under those conditions, is that certification actually even worth the paper it is printed on? I don't think so. But that's all that matters to the companies who seek out such candidates. Our school systems are quickly emulating these behavior patterns, with more and more teachers "teaching to the test" because they, and their districts, have a vested financial interest in how well the students do on them.

Because the very system that we use to measure our students is flawed, every decision that is based on that information is also flawed as a result. Our school systems are performing poorly, you say? Well, let's just throw more money at the problem blindly, because clearly if we just push enough money at the problem it will simply go away. Yet annual studies by the US Department of Education comparing students from the G8 countries of the same age and grade reveal that while the US has the single most expensive primary and secondary education system in the entire world, our test scores and student performance are only middle-of-the-pack. Other countries are performing better with less money. Clearly, a change in our methodology is required to bring about effective changes in our scores. In that regard, I propose that teaching is just as much an art form as it is a science. In much the same way that you could not adequately measure the ability of a painter with a standardized test, standardized testing should not be the end-all, be-all of educational measurement either. Which would you rather have: the caricature painter at the fair who can crank out 100 paintings in a day, or Leonardo da Vinci, who paints only one painting in a week? The quality of the students' cognition should be of paramount importance, more so than the quantity of their recollection. I would rather see students who are taught how to think, to reason, to research, to experiment and explore than see students blindly parrot back the same information that I told them verbatim, devoid of intellect, curiosity, or desire to learn.

But, as good as standardized tests can be, there are also many drawbacks. One drawback is that they make poor formative assessments. Another is that they cannot measure how much a student understands about a subject. They also do not necessarily translate well to cultures, languages, or ethnic groups that are non-English speaking (a critique also shared by the Stanford-Binet IQ tests).


With regards to what manner is best for measuring students, I think it has to be a mixed approach; one that utilizes both standardized testing and cognitive assessments of reasoning ability. This is the same solution that was posited by Sanders and Horn (1995). 


References:
Sanders, W., & Horn, S. (1995). Educational assessment reassessed: The usefulness of standardized and alternative measures of student achievement as indicators for the assessment of educational outcomes. Education Policy Analysis Archives, 3(6). Retrieved from http://epaa.asu.edu/ojs/article/view/649/771

Thursday, March 28, 2013

What is a Superhero?

[Johnathan Clayborn]
I was at work the other day on a conference call. During the start of the call the host wanted us to identify ourselves and had us do an ice-breaker by stating who our favorite superhero is. Most people had answers to this question right away, and I gave an answer too, but as the meeting progressed I really started thinking about the question in a lot more detail. What IS a superhero anyway?

To many the obvious answer would be someone with super powers. Some of the iconic superheroes of modern times, like Superman, for example, were born with their powers. Others, like Spiderman and the Fantastic 4, went through an event that changed their physiology to turn them into superheroes. Still others, like the X-Men, are a combination of both methods, but their powers are the result of genetic mutations rather than some kind of magical force. But, is that really what defines a superhero?

Let's examine this for a moment. Superheroes have super powers, sure. But, then again, so do super villains. Look at Magneto, for example. He has super powers; he just chooses to use them for different purposes. So, clearly being a superhero is more than just having awesome abilities. A lot of it boils down to the choices that you make: what do you use your abilities for? If you use your abilities for evil or simply to further your own personal gains, then by most people's definitions you would not be a superhero. But if you used your powers for the betterment of mankind and to combat evil, then surely that would qualify you as a superhero.

So, with that concept in mind, let's re-examine our opening criteria: what is a superhero anyway? It's someone who has super powers and uses their abilities for the betterment of the greater good. Right? Well, what about Batman? Here's a superhero who defies that logic. He's just a regular person; granted, he's a rich, intellectual person, but he has no super powers. All he has is cool gadgets. He chooses to spend his immeasurable fortune on developing the things he needs to be able to fight crime and make a positive difference in the world. What about Iron Man? Sure, he has a power reactor built into his chest, but he's in the same category as Batman; basically, he's just a regular dude who happens to have tons of money and intellect. But, like Batman, he uses his fortune to help others. Without his suit, like Batman, he's very vulnerable. So, based on these two immensely popular superheroes, clearly having super powers isn't necessarily a prerequisite to being a superhero.

So, based on that information, is it possible that a superhero is simply someone who sacrifices their own safety, well-being, and monetary gain to make a positive difference in the world for the betterment of the greater good? If that's true, then it seems to me that the world is full of real-life superheroes who are overlooked and ignored: paramedics, firefighters, police officers, military personnel. Wouldn't they qualify for superhero status also? Or are the gadgets that they use not high-tech enough?

Saturday, January 19, 2013

The Atomic Fallacy

[Greg Bullock]
Wikipedia cites the following speech to illustrate the meaning and origins of the so-called “If-by-whiskey” fallacy [http://en.wikipedia.org/wiki/If-by-whiskey]:

My friends, I had not intended to discuss this controversial subject at this particular time. However, I want you to know that I do not shun controversy. On the contrary, I will take a stand on any issue at any time, regardless of how fraught with controversy it might be. You have asked me how I feel about whiskey. All right, here is how I feel about whiskey:

If when you say whiskey you mean the devil's brew, the poison scourge, the bloody monster, that defiles innocence, dethrones reason, destroys the home, creates misery and poverty, yea, literally takes the bread from the mouths of little children; if you mean the evil drink that topples the Christian man and woman from the pinnacle of righteous, gracious living into the bottomless pit of degradation, and despair, and shame and helplessness, and hopelessness, then certainly I am against it.

But, if when you say whiskey you mean the oil of conversation, the philosophic wine, the ale that is consumed when good fellows get together, that puts a song in their hearts and laughter on their lips, and the warm glow of contentment in their eyes; if you mean Christmas cheer; if you mean the stimulating drink that puts the spring in the old gentleman's step on a frosty, crispy morning; if you mean the drink which enables a man to magnify his joy, and his happiness, and to forget, if only for a little while, life's great tragedies, and heartaches, and sorrows; if you mean that drink, the sale of which pours into our treasuries untold millions of dollars, which are used to provide tender care for our little crippled children, our blind, our deaf, our dumb, our pitiful aged and infirm; to build highways and hospitals and schools, then certainly I am for it.

This is my stand. I will not retreat from it. I will not compromise.

I believe that the original speech from which the term is derived may have been misinterpreted, and that calling the "if-by-whiskey" concept a fallacy is itself fallacious.

The lawmaker’s speech may equally well have been intended as a sardonic mockery of what I would term the “atomic fallacy”: the idea that a concept represented by the conventional shorthand term for a controversial topic is an atomic proposition, which a rational person is logically bound to be wholly “for” or “against.”

Contemporary examples of such terms would be “global warming” or “gun control.” Because discourse on a controversy tends to be dominated by its most zealous ideologues, debate reinforces the idea that these types of terms refer to a simple atomic proposition, rather than a collection of propositions. In fact, a concept such as “global warming” enfolds many separate propositions, propositions about which intelligent people could, with logical consistency and good sense, hold multiple divergent opinions.

The multiple propositions implied by the term “global warming” include: humans are able to detect and quantify the average temperature of the earth; the earth's average temperature is increasing; increased CO2 is the cause; human activity is the cause of increased CO2; the causal activity consists of oxidation of carbon fuels; CO2 is the most significant contributor to warming; the combined effects of warming are on balance negative; warming is not likely to be countered by offsetting natural effects; humans should take action to reduce CO2; humans can take action to reduce CO2; the actions taken should center on reducing oxidation of carbon fuels; the actions that can be taken can be significant enough to reduce warming; the actions will not have unintended effects worse than warming; the effects will not fail to materialize due to unanticipated counter-effects; and record high temperatures are evidence for global warming.

The last point is related to why advocates of the global warming concept have switched to the term “climate change.” Because of the attention directed to record high temperatures, people might naively think that extreme examples of cold temperatures would imply evidence against global warming. But since increasing temperatures can cause shifts in ocean currents that in turn cause cold weather, average warming could plausibly produce cold effects. The change in terminology is intended to avoid counting cold temperatures as evidence against average temperature increases.

Most areas of public policy debate are discussed using convenient shorthand terms. In many cases, the advocates for a cause attempt to control which terms are used. In the case of abortion, the terms used were historically different for each side, with proponents of one side using “pro-choice,” and the other using “pro-life.” In these cases, the terms were intentionally selected to draw the mind towards an atomic, single concept and imply that any other view is opposed to that concept. Few people discussing a more abstract topic would consider themselves opposed to “choice” or “life.” The debate over the terminology demonstrates the power that the atomic fallacy gives to the term that comes to be used when discussing a topic.

The atomic fallacy impedes constructive discussion of controversial topics. Failing to examine the deeper multiple realities behind a politically charged term may lead people to believe that compromise is not possible and to view those on the opposed side as closed-minded and evil. The result is polarization, gridlock, and conflict.

Seeing through the fallacy enables people to find common ground. They may even be able to make progress on some of the contributing causes to a problem when the ultimate controversies are not resolvable. Discovering the reality of a controversy becomes less of an intractable problem, because people can pursue specific lines of questioning, in contrast to the hurling of arbitrary “facts” and slogans into which debates tend to degenerate. Most importantly, awareness of the tendency to think in atomic terms and the deliberate exploitation of this tendency for propaganda purposes can allow someone to take a more rational and deliberative approach to both thinking and personal action.