Friday, September 9, 2011

Real and imagined mathematics

Do we discover mathematics, or do we create it? Though this question is an old one, it has fascinated me lately. Firstly, an operational definition of mathematics appears necessary to discuss the subject. I would define mathematics (and I believe reasonably so) as the investigation of the properties of logical systems. These logical systems have several attributes: definitions (the meanings of various terms and operations in mathematics, e.g. axis, addition), axioms (statements we take as givens, e.g. the shortest distance between two points is a straight line), and theorems (implications of the axioms). The job of the mathematician is to deduce conclusions (on significant occasions labelled 'theorems') from the definitions and axioms. Seeing as such conclusions follow directly from the axioms and definitions via a logical progression (e.g. axiom: the shortest path between two points is a straight line; conclusion: a curve is not the shortest path between two points), are we discovering mathematics? It would seem reasonable to say we are discovering the implications of the definitions and axioms, as those inevitably ensue within any system once its definitions and axioms are given. However, seeing as the axioms and definitions were decided upon by us, one might also argue that we create the definitions and axioms and therefore create mathematics.
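To make the deductive step concrete, here is a minimal sketch in the Lean proof assistant. The definition of evenness and the little theorem about 4 are my own illustrative choices, not anything from the discussion above, but they show how a conclusion is simply there to be exhibited once a definition has been fixed.

-- Illustrative only: one definition, and one conclusion deduced from it.
-- Once 'IsEven' is written down, the evenness of 4 is not invented;
-- it is a consequence waiting to be deduced.
def IsEven (n : Nat) : Prop := ∃ k, n = 2 * k

theorem four_is_even : IsEven 4 := ⟨2, rfl⟩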
The issue with the latter line of reasoning (that we create mathematics), it seems to me, is that while we may agree on certain axioms and definitions, I don't believe this implies we've created them. We have simply arrived upon certain ideas and agreed upon them. Before man defined what a line was, did the concept not exist? I think it's important to specify that I am not talking about real-world approximations to the concept of an ideal line, which undoubtedly existed well beforehand, but rather the concept of an ideal line itself. Is it fair to say that we arrived upon this concept and accepted it, or that we invented it? Indeed, to say we invented it would seem to imply that the definition of a line depends entirely upon the mental capacity of human beings. This might imply, by extension, that if our mental processes functioned differently, the definition of a line would change, a notion that seems utterly wrong to me. We might name it differently, and refer to it in different ways, but the notion of a line remains the same, in my opinion, irrespective of human existence.
If we examine the other line of reasoning (the one to which I am more inclined), that we discover mathematics, this would imply that mathematical concepts have an independent existence of their own, thereby allowing us to discover them. The immediate question here is: if mathematical concepts have an independent existence, where do they exist? Here we find ourselves returning to Platonic ideas: the theory of forms. The theory of forms states that abstract concepts such as mathematical ones have an independent and immaterial existence outside any parameters we associate with the physical world, such as space and time. This is a hard notion to accept, and one for which, by definition, we can never produce empirical evidence. Nonetheless, it works for me.

Thursday, June 2, 2011

Metapost: The End of Junior Year

This final quarter, my blogging has changed in its structure and approach. Rather than dive straight into my opinions on an issue, I have organized my posts to explain the issue objectively at first and provide all the necessary information before expressing my stance on the subject. This change, I believe, has been brought about by my work on my Junior Theme, in which I was required to explain an issue pertaining to the USA without expressing any opinions on it at all. The Junior Theme assignment taught me the value of empathy for the reader: assuming the reader does not know the subject and therefore explaining it to them adequately. If I do not inform my reader on the subject, it is useless for me to express my opinion on it at all.
This change in my blogging was important this quarter as I addressed some complicated subjects which required initial explanation. My post "The singularity or not?" addressed the ideas of the futurist Ray Kurzweil. Rather than arguing for or against his ideas immediately, I took time to clarify what they were and the concepts surrounding them before expressing my opinion on them: "Before continuing further, it is important to define the meaning of artificial intelligence. Currently it is split into two kinds: weak A.I., which has already been achieved to an extent, and strong A.I., which has not... ." In this first paragraph of my post, I define the difference between weak and strong A.I. (artificial intelligence), so that the reader knows exactly what it is that I am critiquing. My criticism turns out not to be of A.I. as a whole, but of the idea of strong A.I. My prior explanations were important to making my post clear.
In my most recent post, "Down the rabbit hole", I once again make sure to explain the issue I am discussing. The post discusses the constant state of flux scientific ideas are in, with particular reference to the recent experimental evidence against supersymmetry. To clarify my ideas, I make sure to provide an explanation of supersymmetry and its potential importance as a theory: "Super-symmetry is a theory that relates elementary particles of one spin to other particles that differ by half a unit of spin and are called 'super-partners'. This symmetry, if proved, would have the potential to help eliminate many known problems and quandaries in modern-day physics." This brief explanation of supersymmetry ensures that the reader understands its relevance to my post.
While I believe my empathy for the reader has improved, I have room to strengthen my arguments via more frequent reference to articles and sources. My post "Dreamers", which discusses the 'American dream' and its applicability to rich people, could have benefited from a reference to a dictionary definition of the 'American dream', to support my argument that its definition prevents it from being available to rich people. Likewise, my earlier posts on the progress of my Junior Theme could have been made more meaningful to readers with links to the sources I was using for the project. Without links to my sources, my posts were less effective in showing the reasons behind my thought processes. Overall, however, I believe my blogging has progressed well since the beginning of the year.

Monday, May 30, 2011

Down the rabbit hole

We've come a long way since Copernicus dared to propose a sun-centered solar system. We've moved past Newtonian mechanics. We've arrived at Einstein's theory of relativity. We've probed even further into the workings of quantum particles. And today we find ourselves trying to tie it all together into one grand and unified theory. Unfortunately, whenever we think we've arrived there, or believe we've come close, we haven't. New data always comes along and shatters our previously "solid" ideas. Today's standard model of particle physics is known to be incomplete. It fails to explain phenomena such as gravity and the abundance of matter over anti-matter. For many years the strongest contending theory proposed to solve such issues was 'super-symmetry'. Super-symmetry is a theory that relates elementary particles of one spin to other particles that differ by half a unit of spin and are called 'super-partners'. This symmetry, if proved, would have the potential to help eliminate many known problems and quandaries in modern-day physics.
However, supersymmetry, which predicted the shape of the electron to be slightly egg-like, has received little experimental support, as recent experiments have measured the electron to be (as far as we can tell) spherical: http://www.bbc.co.uk/news/science-environment-13545453. Seeing as data has appeared which contradicts a prediction of supersymmetry, physicists may find themselves having to rethink the theory. It would seem we are back to the drawing board again. Our worldview and scientific understanding never seem to cease changing. Will they ever? Will we ever understand the universe completely and come to the end of mankind's scientific exploration? I doubt it. The designer, if any, sure made it a challenge.

Tuesday, May 24, 2011

Dreamers

The term 'the American dream' brings many images to mind. Chiefly, however, it signifies the rise from rags to riches. It is the possibility of this phenomenon which supposedly marks the USA as unique: the opportunity for the poor to rise to wealth. This is the definition of the 'American dream', and it ought not to be confused with the general dreams that Americans have.
A dream implies a fantasy about that which one does not have, and therefore to experience a dream, one cannot already be in possession of the object (material or not). To experience the 'American dream', which refers to the dream of obtaining wealth, one cannot already be in possession of wealth, as it would then no longer be a dream but a reality. Therefore, those born into wealth in the upper class cannot experience the American dream. They are already rich, and wealth is no longer a dream. They are certainly able to experience and pursue other dreams, but not the dream coined as the 'American dream'.
People's perceptions of the 'American dream' may vary. Some may view it as the pursuit of wealth, others of happiness, and others of success. However, this variation in perceptions does not change the true definition of the term 'American dream'. The term coins one specific dream only, irrespective of perception, and its definition is in itself not open to interpretation. The dreams of American society, however, are. Seeing as the American dream has a precise definition, it is therefore open only to those born into less-than-wealthy conditions.

Wednesday, April 13, 2011

The singularity or not?

Several weeks ago in Time magazine, I read an article which gripped me and set my mind working intensely for days on end. The article focused on the visions of a respected science writer and inventor, Ray Kurzweil, who is closely affiliated with a movement known as the 'singularitarians'. Singularitarians predict the arrival, in about thirty years, of a technological revolution called the 'singularity' which will change human society as we know it. This revolution will arrive in the form of artificial intelligence. The article is right here: http://www.time.com/time/health/article/0,8599,2048138-1,00.html.
Before continuing further, it is important to define the meaning of artificial intelligence. Currently it is split into two kinds: weak A.I., which has already been achieved to an extent, and strong A.I., which has not, and, many including myself would argue, never will be. Weak A.I. involves the creation of computers and robots which can replicate human behavior: our speech, motion, and the like. Strong A.I. refers to the creation of machines which not only mimic the physical actions of humans, but which are self-aware and think just as we do. Ray Kurzweil bases his prediction of the coming of strong A.I. in 2045 on the idea of exponential growth. He examines the rate of progress in computing power over the past hundred years or so, which has grown exponentially (that is, the rate of improvement has itself kept accelerating), and, extrapolating from this growing rate of progress, predicts that by 2045 computers will be built which fully replicate the structure of the brain and are therefore conscious just as we are. He believes, in fact, that they will surpass human intelligence.
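As a rough illustration of the kind of extrapolation this involves, here is a short Python sketch; the starting year, the doubling time, and the size of the gap between today's machines and the brain are numbers I have invented for the sake of the example, not Kurzweil's own figures.

import math

# Purely illustrative numbers -- not Kurzweil's actual data.
start_year = 2011
doubling_time_years = 2       # assumed doubling time for computing power
gap_to_brain = 100_000        # assumed factor separating current machines from the brain

# Under exponential growth, closing a gap of this size takes
# doubling_time * log2(gap) years.
years_needed = doubling_time_years * math.log2(gap_to_brain)
print(round(start_year + years_needed))  # prints 2044, in the neighborhood of Kurzweil's 2045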

Predictions like Kurzweil's make for fantastic headlines which draw readers magnetically. There is something quite mind-churning and provocative about the idea of conscious machines. Singularitarians such as Kurzweil, however, are computer scientists and engineers, and coming from such backgrounds they ground their quest for A.I. in a fundamental philosophical assumption which they fail to truly examine themselves: the belief in a metaphysically materialistic worldview. By materialism I do not refer to anything relating to consumerism, but to the metaphysical assumption that the foundation of all reality is physical matter, and that everything, including the human mind, is a product of physical matter and its interactions. Proponents of strong A.I. believe that the human mind, or 'essence', if you will, is wholly reducible to the interaction of the brain's material parts. It can therefore, they conclude, be recreated with robotics.

Many casual readers of the Time article would likely fail to question this assumption, or would simply look away, repulsed by the concept of conscious machines, yet unable to provide a logically coherent argument against its possibility. My stance against materialism and the possibility of strong A.I. arises from what many philosophers of mind have termed the 'hard problem of consciousness'. The hard problem of consciousness refers to the explanatory gap between describing all the working parts of a physical structure, such as the human brain, and explaining how, through their physical interactions, those parts begin to feel and become self-aware. Neuroscience is able to find the brain's neural correlates of various feelings and thoughts, but is unable to explain how these neural processes actually cause conscious experience. If one takes any arrangement of matter, and then proceeds to make it interact in any way, the laws of physics tell us nothing more than the physical interactions which should then ensue. There is nothing in any law of physics which tells us when matter should begin to have conscious experiences. There is no 'Newton's fourth law'. Moreover, any progress in physics will also yield only new laws of physical interactions, which will still leave this explanatory gap: why is it that when physical matter is arranged in a certain way, consciousness, as if from nowhere, arises? The philosopher of mind David Chalmers is perhaps the strongest proponent of this problem. Here is one article of his in which he summarizes the issue; it is long and tough to read, but well worth the time: http://consc.net/papers/facing.pdf. I think it is important to mention that the hard problem of consciousness is recognized not just by philosophers but by a number of notable quantum physicists today. The esteemed American physicist Henry Stapp sees this problem as the Achilles' heel of materialism as a metaphysical worldview. He is himself a proponent of mind-body dualism: the metaphysical belief that the mind is separate from the body and, in his theory, acts through the brain.

Dualism is an old philosophy formulated by the French philosopher Descartes. He is best known for the words "I think, therefore I am", alluding to the idea that since he was able to think, he could not doubt that his mind existed. He could, however, doubt that the physical reality around him existed, as his senses of sight, hearing, and so on could all be illusions. He theorized that the mind was an immaterial substance, or soul, which controlled the material body. Dualism fell out of fashion with the development of Newtonian mechanics. Following this development, the world was believed to be completely deterministic: all future events were caused directly by past ones, and so could in principle be predicted indefinitely far ahead. If some immaterial entity were to act upon the physical world, it would violate this fundamental postulate of Newtonian physics. With the coming of the twentieth century, this view was shattered. Indeed, all twentieth-century physics is so far removed from everyday experience that it is hard to accept if one is not acquainted with the field. Einstein showed, astoundingly, that time is not absolute, and passes at different rates for different observers. Subsequently, with the advent of quantum mechanics, the concept of determinism was overthrown. At the quantum mechanical level, events are inherently probabilistic and cannot be predicted for sure, but only given a probability of happening. That is to say, nothing at the subatomic level of physical reality is set in stone to happen; there are only probabilities. This opens the door to dualism again. Consciousness, if a separate entity, could perhaps influence this randomness. This is the angle physicists such as Henry Stapp take. He is not the only one: the noted Oxford University physicist Roger Penrose has devised his own theory in which consciousness is not the product of the brain but a fundamental feature of the universe itself. His ideas are heavily criticized by his materialist peers, but seeing as materialism is a common prejudice among scientists brought up with such a worldview, and scientists are only beginning to investigate the true nature of consciousness, all bets are off at this point.

Am I a proponent of dualism? Perhaps; I haven't quite decided. I personally find the view of neutral monism more compelling: the idea that matter and mind are not separate, nor the same, but two different properties of some other, more fundamental reality (the famous twentieth-century physicist David Bohm and the seventeenth-century philosopher Spinoza took similar stances). One thing I am sure of, though, is that I am not a proponent of materialism, and I do not think I am a robot, as proponents of the quest for strong A.I. would have me believe. The study of consciousness is in its infancy, and I believe singularitarians, among others, should take a second to question, though not necessarily reject, their materialistic worldview. True scientific understanding cannot be advanced when prejudices go unexamined. I'm sorry this post was so long, but I had a lot of ideas to put out there.

Thursday, March 24, 2011

My junior theme and materialism

For my Junior Theme I have decided to explore the topic of American materialism. However, my initially proposed question, "why is American society so driven by materialistic gain?", was too broad in scope and too vaguely defined. I needed to find a specific aspect of America's materialistic culture from which to approach the issue. I chose advertising. The statistics for American advertising expenditures are startling, with spending rising to $131 billion in 2010. A figure of this size raises the question of why it is so large. In exploring the reasons for such a figure, my Junior Theme will focus on America's consumer culture, which drives such high levels of advertising. Other factors, including Americans' time spent in front of televisions and other forms of entertainment, will be examined, but such pastimes really fall under the scope of American materialism as well, allowing my Junior Theme to explore the topic extensively.

Monday, March 21, 2011

Determining my junior theme

I am currently considering two possible directions for my Junior Theme. The first involves an examination of American consumerism and the USA's materialistic values. I will try to answer the following question: why is American society so motivated by materialistic gain? Though the question may need to be defined more specifically, answering it will require examining, among other things, the history of the USA before and after the Second World War and the changes in philosophy the USA underwent. I will look at the 'American dream', the dream of rising from humble origins to wealth that is supposedly the defining dream of American citizens, and examine reasons for its purely materialistic nature.

The other possibility I am considering for my Junior Theme entails examining the causes of the USA's current economic crisis. Although the answer is seemingly obvious (too much was borrowed, leaving the country in great debt), I will look at the government's tax policies and main areas of spending which drove it to borrow so much. In doing so I would hope to answer why the USA is in its current economic situation. The weakness of this topic as a Junior Theme question is that its historical scope is small. At the end of Clinton's administration, the US government ran a budget surplus, and so the reasons for the economic crisis can be traced back only to the beginning of the Bush administration, which was for the most part responsible. One way to overcome this shortcoming might be to examine the USA's history of debt and borrowing as a whole, and compare it with the actions of George Bush's administration in more recent years.

Sunday, March 20, 2011

A nuclear future or not?

The recent earthquake and tsunami in Japan have left over 10,000 people dead, tens of thousands injured, and a further 500,000 homeless. The figures are devastating and horrific to any who read them. However, the issues raised by these natural disasters unfortunately extend further. The Fukushima Daiichi nuclear plant, located 150 miles north of Tokyo, had holes blown in two of its six reactor buildings and its cooling systems knocked out. Among the greatest risks posed by the nuclear crisis is that of radiation leakage, and the Japanese government's attempts to contain high-level radiation are under close watch by the rest of the world. Escaped radiation is deadly and can result in fatal radiation sickness for those overexposed to it. The government has already evacuated all those living within 12 miles of the plant, and urged those within 18 miles to stay indoors. Should the radiation spread, though, much more of Japan may be affected. Such a nuclear disaster has inevitably raised great concerns throughout the world for nations developing nuclear programs of their own. Should a similar disaster befall them, they would be at the same deadly risk as Japan is now.
Recently, Venezuelan president Hugo Chavez called off his nation's plans for nuclear energy development in response to Japan's crisis: http://www.bbc.co.uk/news/world-latin-america-12768148. In his eyes, the risks are too high, especially considering Japan's high technological proficiency and safety measures, all of which failed to prevent the disaster. In Chile, President Sebastian Pinera nonetheless decided to proceed with nuclear development plans, despite Chile's location on a dangerous belt of seismic activity known as the 'Ring of Fire'. Similar debates are under way in the UK, where the CEO of EDF Energy, Vincent De Rivaz, has argued that the UK's nuclear development programs must go ahead whilst taking into account lessons learned from Japan's disaster: http://www.bbc.co.uk/news/business-12799322. He sees no alternative that could satisfy Britain's growing energy demands.
Despite its obvious risks, one must wonder whether forgoing nuclear development is an option at all. Japan's disaster was a tragedy, yet in the face of fossil fuel depletion, rising energy consumption, and an increasingly mechanized world, there seems to be little choice. Alternative energy sources predominantly feature solar and wind power. Neither of these is able to provide anywhere close to the energy output required to sustain developed countries such as the USA. While they may be used to augment other solutions, a stronger base of energy production is required, currently provided by fossil fuels. But what will provide this energy base when fossil fuels, given their non-renewable nature, run out? Nuclear power is the only source as yet able to provide an adequate energy supply in the coming future. Unfortunately, given its high risks and instability, it also seems like a 'pact with the devil', so to speak. Unless we wish to live on a dangerous nuclear-powered planet, finding a strong and sufficient alternative energy source should be amongst the world's highest priorities.

Sunday, March 6, 2011

Tokenism

This week in American Studies our class focused on the issue of television tokenism. TV tokenism may be defined as the use of minority actors for supporting character roles which receive little screen time and are often relegated to subplots. Usually these minority characters are inserted into the show to satisfy network demands for cast diversity. The main character is almost always Caucasian. Cast photos from network dramas exemplify this phenomenon.
This photo is of the cast from the TV show "House", a mature network drama following the story of a genius yet arrogant doctor whose unorthodox ways of thinking allow him to solve challenging medical mysteries in the hospital. Among the members of his team is a black doctor, Dr. Foreman, who worked his way up from poor origins. Foreman's parents are strongly Christian, and he is portrayed throughout the series as a resilient and tough character. Foreman's character is depicted with the common stereotypes held of African Americans. The cast needed diversity, and "House" fulfilled the requirement. The leading character, House, is of course Caucasian.

While House is only one example of TV tokenism, the phenomenon is common in other shows and occasionally movies as well. However, shows are businesses. They sell us what we consume. If television tokenism is common, it is only because we as an audience demand it. Shows are required to feature diverse casts so their audiences do not jump to conclusions of racism if the cast lacks minority characters. Leading characters are commonly Caucasian because we as an audience relate more easily to such characters and demand that this be the case. If it were otherwise, shows would assign leading parts to minority actors. If television tokenism is to be addressed and changed, our own attitudes must change. Businesses strive to satisfy the customer, and if customer demands change, then so will the business.

Friday, February 18, 2011

Gay marriage and change

This week in class we discussed the emergence of the gay marriage campaign in recent years, and reasons for the timing of its appearance. In the past century American society has experienced the civil rights movement, suffrage, and now campaigns for gay marriage. Are they linked, or is their proximity coincidental? Linking the emergence of the gay marriage movement with the preceding civil rights movement may appear a stretch; however, I believe history shows that rights movements are more likely to emerge in the wake of recent significant change.
In Europe, the fall of the Roman Empire was followed by a long, largely stagnant dark age for human progress, during which the Catholic Church's dogma held full control. However, after the Reformation, the Protestant break from the Catholic Church, the scientific revolution quickly followed. It is no coincidence that the move to scientific reasoning as the dominant worldview closely followed the breaking of the Catholic Church's control. Questioning of the Catholic Church quickly led to questioning of the church as a whole, and so the Reformation set the stage for the scientific revolution. When one significant change is achieved, it would appear the changed environment opens the scope of people's views to other potential change, and a period of rapid change follows. This pattern was reflected in England's fear, following the success of the French Revolution, that such a revolution would occur in its own country. The fear stemmed from the worry that change in France would open English minds to change in their own nation.
Within the USA, the suffrage movement and the civil rights movement also emerged in relatively close proximity to one another. As soon as the unequal status of one group was questioned, people were opened up to questioning the unequal status of another group (women's suffrage, in fact, preceded the civil rights movement by several decades). The past century has seen a dense succession of changes, ranging from women's rights, to African American rights, and now campaigns for the right of gays to marry. It seems likely that these changes follow the same pattern as others have done: long periods of slow change intermingled with shorter periods of rapid change. The gay marriage campaign, I believe, was sparked by the significant progress achieved by the civil rights movement. Change, it would seem, is somewhat viral and, for better or worse, follows on from itself quickly.

Monday, January 17, 2011

Metapost: another quarter of blogging

This second quarter my blogging has improved upon the weaknesses of my first quarter's posts, while still leaving room for improvement in arguing for my points of view. Last quarter I criticized how slowly my posts developed a clear thesis concerning my view on the subject matter. This quarter, from the outset, my first blog "Freedom's boundaries" makes my position clear on the management of civil liberties in wartime: "I do not believe anyone should be allowed to preach violence against fellow americans in the name of a supposedly 'greater cause', religious or not." This articulation of my thesis is present in each of my blogs this second quarter.
Improving on my first quarter's blogging, I have made sure to include source links more frequently in my posts. Posts which address current affairs, such as "Virtual worlds", reference an article in order to factually support my arguments. Posts which address theoretical discussions, e.g. "Adulthood, adolescence, and the in between", do not. While such posts address theoretical questions, such as the definition of adulthood, the issues discussed still apply to modern-day society, and so references to news articles could have helped strengthen my argument by showing its relevance today, e.g. how the behavior of today's adolescents affects the way we ought to define adulthood.
Since the start of the quarter my blogs have progressed in the elaboration of my arguments. In one of my earlier blogs, "The enemy", I state my opinion without truly showing the thought process behind it: "While this was my initial reaction, my thoughts changed quickly. While wars are a clash of force between two countries to settle a conflict, how we conduct wars determines our moral status." I describe my thoughts without the reasoning that led me to them. In my final post, "Right and wrong", I provide the reasoning behind my opinion: "If others held opposing viewpoints, by which infidelity was perfectly acceptable, there would be no apparent logical process by which I could prove them wrong. The issue would come down to a clash of viewpoints..." In this post I show the reasoning behind my viewpoint that an absolute standard for morals is undefinable. The argument is developed in greater depth further on in the post.
Although my development of argument has progressed, this coming semester I should take greater care to address the opposing side of the argument, something I have for the most part failed to do; doing so can only lend stronger credibility to my own arguments.


My best post: "Right and wrong"

Sunday, January 16, 2011

Right and wrong

This week in class we examined Kohlberg's stages of human development, which describe people's motivations to follow rules and laws at different stages of life. The motivations begin with fear of punishment, progress to individual gain, then to good relationships with others, later to maintaining social order, then to viewing laws as a social contract enabling individual rights, and finally to the recognition of universal principles. Admittedly, not all individuals necessarily reach the final stage or stages, but Kohlberg maps out the moral development most people undergo from infancy to old age. The varied motivations to act justly, progressing from individual, to societal, to universal moral perspectives, raised an issue in my mind: the ability or inability to justify any set of morals.
How do we define right from wrong? Most of us are raised with a cultural viewpoint on the definition of moral behavior, but is it possible to justify our morals beyond mere individual or cultural perspective? For instance, if two people debate the moral status of adultery, how can the correct viewpoint be determined? I myself am undoubtedly against it, but if asked to justify my view, I struggle to think what I could say beyond the obvious statement (to me) that cheating on a marital partner is wrong. If others held opposing viewpoints, by which infidelity was perfectly acceptable, there would be no apparent logical process by which I could prove them wrong. The issue would come down to a clash of viewpoints, both of which were likely determined by our cultural upbringings, in my case a Western one which supports the concept of fidelity. Other cultures around the world, though, see no problem with polygamy.
If morals are all defined by perspective, how can standards for moral behavior be set and imposed on others? A BBC article caught my attention not long ago concerning the issue of stoning in Iran: http://www.bbc.co.uk/news/world-middle-east-10956520. The article described the case of a woman sentenced to death by stoning (though the sentence may have been changed to hanging) for adultery. This provoked understandable outrage amongst human rights groups. The issue that arose in my mind, though, was this: if Iranian culture sees the stoning of unfaithful women as morally acceptable, what higher authority, if any, can such human rights groups appeal to in order to prove the moral error of this Iranian practice (which I certainly consider outrageous, in case I am misunderstood)? Many may turn to God and religion as the source of their morals, but for those who lack such beliefs, a vacuum is left in which an absolute standard for morals is seemingly undefinable and arguably unjustifiable. One can no more argue for the morality of human equality than for the immorality of monarchy beyond one's own personal conviction that one is right and the other is wrong (which is my moral conviction nonetheless). Whilst one person might believe in equal rights for all, another might believe in the divine right of kings to rule, which was the common belief in medieval Europe and still is in some countries.
In the absence of a certain method by which to define human morals, it would appear all we can do is define our own country's morals as the majority sees fit. As soon as we start trying to impose our own definitions of right and wrong elsewhere, we have entered into murky waters in which no issue or debate can be resolved with certainty.

Wednesday, January 12, 2011

Adulthood, adolescence, and the in between

This week we examined the theme of transition from childhood to adulthood, seeking to find a definition for the transition. Is it an arbitrary age dictated by the law, i.e. 18? Is it a state of mind? Or is it financial security? In examining each of these propositions, none of them seemed right to me. The age of 18 has nothing special about it, at least not in my eyes; most eighteen-year-olds are far from emotional maturity. By that age two years have passed since education was mandatory, and 16 is certainly too early an age to be considered adulthood. One does receive most of one's rights at eighteen; however, the validity of this legally chosen age must be questioned, as the issue in my mind is what ought to constitute adulthood, not what is legally considered adulthood.
Many may consider adulthood to be the reaching of some mindset or maturity level. The problem with this, however, is that one then has the trouble of defining the required mindset. Is it intellectual? Is it ambitious or spiritual? It seems impossible to me to define the adult mindset when all adults have different mindsets and often different opinions on what constitutes maturity. One might define adulthood as the mindset of considering oneself an adult; however, this suggests that one can never be mistaken in considering oneself an adult too early. I think adolescents often consider themselves fully grown up before they actually are. They often, for example, consume drugs without the readiness or desire to fully consider the implications of their decision. They make rash decisions in an attempt to establish their independence, while being unwilling to accept the consequences.
In my view, accepting the full consequences of one's actions defines adulthood. Those still supported financially by their parents cannot be considered adults, as their college fees are paid for by their parents, shielding them from the financial consequences of unpaid tuition. To attain adulthood, one must take full legal and moral responsibility for all of one's actions, without protection from the consequences by the law or one's parents. Legal adulthood is therefore only the first step, representing full legal accountability for one's actions. The next stage is the cessation of any reliance on one's parents to help deal with those consequences. A true adult is responsible in all ways for himself or herself and the state of his or her life. Only when one receives the full force of one's decisions can one be considered an adult.

Saturday, January 1, 2011

Virtual worlds

This vacation I saw the movie Tron: Legacy, a sequel to the original movie depicting the story of a computer scientist who accomplishes his dream of inserting himself physically into the world of the computer and creating a virtual world inside it. He considers this virtual world perfection, and the future of the human race, until the programs start creating themselves and turning against him. While the movie is fun science fiction, it sends a message about the increasing time today's generation spends immersed in the world of computers and the internet.
Today it seems adolescents spend almost as much time online as they do communicating in person with each other. The online world is a key component in the lives of users of Facebook, Twitter, MySpace, and other social networking sites. Such sites are perhaps almost as important to them as interactions in the real world: http://mashable.com/2010/08/02/stats-time-spent-online/. Social networking has gradually increased its share of users' days, until it now dominates almost all other forms of communication. Perhaps future generations will slowly replace personal interactions with online activity.
Gradually we may find ourselves living in a virtual world, if we let our fascination with the internet and its countless vaults of information dominate our time. And perhaps, as in Tron: Legacy, that world will seem desirable to begin with, but prove ultimately unpleasant.