I have been reflecting on regret for some time now, and on what makes it rational to regret something. Imagine the following scenario: a wife has a very stable job. She and her husband decide that he should quit his job and dedicate three years exclusively to taking care of their newborn child. The husband quits his job. A week later, against all odds, the wife loses hers. Now both are unemployed.
Should they regret the decision about the husband’s job? It certainly put them in a situation they would rather not be in, and deciding differently would have left them better off now. Nonetheless, I argue that it is not rational for them to regret their decision.
The loss of the wife’s job was an unforeseeable event, so it makes no sense for them to resent their failure to anticipate it. We must assess decisions based on the knowledge we had at the time of making them, not on information we acquired afterwards. Of course, if you know your information may be insufficient, have the resources to become better informed before deciding, and still choose not to, then you do have grounds to regret your failure to study your options better.
But the couple above did take into account the possibility of the wife losing her job; they simply judged it statistically insignificant. They weighed the facts and made their decision. There is no reason for them to second-guess their decision-making ability. We constantly make statistical decisions and have to choose a compromise. I am less likely to die travelling by public transport than by driving. However, the likelihood of my dying whilst driving responsibly is small enough that my comfort requirements outweigh my fear of death in this case. Therefore it is reasonable to choose the comfort of driving even though it is statistically more dangerous than the alternative.
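The trade-off above can be sketched as a small expected-utility calculation. All the numbers here are invented for illustration only (the risk probabilities, the disutility of death, and the comfort bonus are assumptions, not real statistics); the point is merely the shape of the reasoning: a riskier option can still be the rational choice once all the stakes are weighed.

```python
# Hypothetical, illustrative numbers only.
P_DEATH_DRIVING = 1e-4       # assumed yearly fatality risk while driving responsibly
P_DEATH_TRANSIT = 1e-5       # assumed yearly fatality risk on public transport
COST_OF_DEATH = 1_000_000    # subjective disutility units assigned to death
COMFORT_BONUS_DRIVING = 500  # subjective utility of driving's extra comfort

def expected_utility(p_death: float, comfort: float) -> float:
    """Comfort gained minus the risk-weighted cost of the bad outcome."""
    return comfort - p_death * COST_OF_DEATH

driving = expected_utility(P_DEATH_DRIVING, COMFORT_BONUS_DRIVING)
transit = expected_utility(P_DEATH_TRANSIT, 0.0)

# Under these made-up numbers, driving comes out ahead despite its
# tenfold higher raw risk, because the comfort term dominates.
print(driving, transit)
```

If the bad outcome then actually occurs, that does not retroactively make the calculation wrong; the decision was sound given the probabilities known at the time.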
In the end, my criterion for regrets is the following:
Did I make the best decision I could with the information I had available at the time?
If I did, there is no reason for regretting it, even if the outcome was unfortunate. There are more things outside of our control than we could ever know.
Obviously, don’t use this to justify lousy decision making! It is our duty to assess everything that is at stake, come up with possible alternatives, and have contingency plans for worst-case scenarios. The more important the decision, the more elaborate and rigorous your reasoning must be. If you did all of that, treat new circumstances for what they are: new circumstances. Feeling guilty when you made a fine choice will only undermine your future decision making.
This more elaborate view of regret allows me to better assess my past choices, which leads to a more judicious improvement of my decision-making process and greater confidence in my verdicts (even when the outcome is unexpected). It also frees me from the undue emotional burden of irrational regrets, relieving my cognitive load and allowing for better future decisions.
Like millions before me, I am terrible at waking up early. I’m also really bad at sticking to a habit or a set routine. I still often make a life-changing resolution, something like cycling at least 30 minutes every day, and then start slacking until I quit after a week or so.
Sucking at these things is not an exclusive privilege of mine. From what I’ve read, apparently you may do that too. Actually, if you are anything like me, you are reading this very text knowing you should be doing something else but you have no willpower whatsoever. Never fear, there is a way out.
Even though I’m bad at it, I’ve had daily habits and routines for the last 8 years or so. Lately I started sucking at it even less by understanding a bit of the science behind it. Basically, the greatest enemy of habit is Willpower Depletion.
To make certain decisions we need willpower, but willpower is not an infinite resource. Exerting self-control consumes a lot of glucose, and when we run low on glucose our willpower gets depleted (Baumeister et al., 2007). This means that if something demands a lot of willpower, we can easily run out of it and give in.
There is a cute experiment in which researchers placed kids in a room with a marshmallow, told them “If you wait until I come back, you will have two marshmallows”, and then left. It illustrates a simple fact of life: if you can defer gratification, you will have a greater reward later. In this case the kids were alone in an empty room with a marshmallow; not eating it required heroic effort.
So, which kids won? Were they the most genetically optimised for this situation? Nope. Many kids lost in the very first seconds. The most successful kids were the ones who distracted themselves by doing something else. They didn’t need to exert much willpower because their minds were focused elsewhere.
The solution for waking up early, sticking to a diet or committing to a habit is to set things up so that you need to exert the least willpower possible. Change your environment so it nudges you towards keeping the habit.
The classic example: if you are on a diet, instead of resisting the ice cream in the freezer, simply don’t buy ice cream, or quickly throw away the tub that is there. That way no willpower is required to resist the temptation. When waking up early, get out of bed as quickly as possible so that you don’t spend 20 minutes deciding not to go back to sleep. I like putting my alarm away from the bed so I have to get up to stop it.
And that’s how you stick to your habits: by changing the environment so you don’t have to decide to do them every time. Religiously dedicating some time to reading every day is quite easy when you have a 40-minute train commute to and from work.
In 2011 the phenomenon of Massive Open Online Courses (MOOCs) emerged with great visibility, promising access to state-of-the-art knowledge at low cost and in a flexible format. In a society with ever-higher educational needs, MOOCs seem to provide a solution that disregards financial, geographical and academic requirements, having the student’s willingness to learn as their only prerequisite. Today MOOCs are growing in size and number, unveiling a new educational paradigm for the time-conscious information era. As their stark difference from high-cost, comparatively inflexible university education becomes more evident, the question of the future of universities and the effect of this newfound model on their curricula becomes more pressing.
In my presentation I aim to demonstrate that university education is recognised as more than a collection of modules, and that the new horizons opened by MOOCs demonstrate an effective and very efficient model for student engagement and information retention that must not be neglected. I will argue that MOOCs will not replace universities, and that the discussions they raise go beyond distance versus in-person education, addressing the learning and engagement patterns of a constantly connected generation. The insights provided by MOOCs cannot be ignored and will inevitably permeate universities’ modus operandi, as they in fact already have: aspects of these courses, such as anytime, anywhere access to courseware, are already gaining ground within UWL, emulated by tools such as Blackboard, UWL Replay and access to Lynda.com. It will be shown that this phenomenon need not raise the worry that loosely trained professionals may erode academic relevance; the future of academia will not be threatened by these courses but, on the contrary, enhanced through the appropriation of their insights. Once more, educational institutions are called to rethink their methods to best serve society’s needs in both the transmission and the furtherance of knowledge.
MOOCs and the University Curriculum
In 2008 Stephen Downes and George Siemens offered the course “Connectivism and Connective Knowledge” online from the University of Manitoba, Canada. They made it available to whoever wanted to take part, and an impressive 2,200 people subscribed to the course (Johnson et al., n.d.). The next notable event was in 2011, when the online version of the “Artificial Intelligence” module from Stanford University enrolled 160,000 students; the following year, “Circuits and Electronics” was offered openly online by MIT and gathered 155,000 students (Milheim, 2013). These are early examples of what is now known as Massive Open Online Courses, or MOOCs. Teachers of some of those courses, amazed by what happened when they made them available online, founded their own companies dedicated to producing courses in this very specific online format; that is how Coursera and Udacity came to life. After them, Harvard and MIT formed edX, and the Open University started FutureLearn.
For those who do not know what a MOOC is like: it is typically divided into weeks, each week containing a series of video lectures, reading materials and a multiple-choice test. The video lectures usually feature a whiteboard with a hand writing on it and a teacher talking as if they were right beside you. The video pauses from time to time to ask a question about what was just said, which every student must answer. This way, everyone answers the question, not just the one quick student in the room. At the end of the course there is usually a peer-reviewed assignment.
The founders and partners of these companies have since gone out to evangelise about the benefits of MOOCs and the revolution they were bringing about. The impact of some of those courses was truly impressive. For instance, the first Artificial Intelligence course from Stanford University, mentioned above, was completed by 23,000 students, more than all other students of this subject area in the world combined (How free online courses are changing traditional education, 2013). This is because MOOCs are characteristically open: they are free and have no strict academic requirements, which means you can sign up regardless of your background. Being online, they also place no geographical constraints on their students, and people from around the world can access them. Additionally, MOOCs have been given by the best universities on the planet and promise to deliver education of the same level their on-campus students receive. It is a promise of state-of-the-art knowledge taught by star professors, for free.
This makes a stark contrast with UK higher education, and indeed with post-secondary education almost anywhere, which inevitably has academic requirements for enrolment, a high cost, and a geographical limitation in requiring students to attend in person. This may raise the question of whether making MOOCs the future of higher education is an offer you can’t refuse: good education, for anyone, anywhere, for free.
But not so fast. MOOCs also have a series of disadvantages, which may or may not be a problem depending on what you expect a MOOC to accomplish. Firstly, because there are so many students, teachers cannot coach them or assess their work. Students having difficulties have to look for help in the course forum, and examinations are either peer-assessed or automatically evaluated by a program. This leaves room for plagiarism, as there is no way to know whether students really did their own work or paid someone else to do it, given that no one knows the student. The assessment question is of crucial importance: since there is no reliable way to assess students, MOOCs cannot provide credit-bearing qualifications, only certificates of course completion. Although solutions are being sought, this means that, at least for now, MOOCs cannot substitute for university education.
But assessments are not the only reason for that. Among the other disadvantages of MOOCs are the difficulty of contextualising the content for the students’ environments, the unfeasibility of adequate group work in an online setting, and, a frequent criticism, their high drop-out rate. In the “Circuits and Electronics” course from MIT mentioned above, for example, only 9,000 of the 155,000 students completed the course (Milheim, 2013). This must not be mistaken for evidence that the course was bad or too difficult; there are many reasons people take these freely available courses apart from earning a certificate. Interestingly, many people who take them already have a degree, and may be doing so to refresh their memory or out of interest in particular sections of the course.
Nonetheless, a critical problem with MOOCs is that they are currently unable to attract a stream of revenue capable of funding themselves or the organisations that make them available. Companies like Coursera, Udacity and edX are not making money; they are losing substantial amounts of it.
But even with these losses and drawbacks, these companies are still receiving funding, and more and more universities are engaging with the idea and contributing courses. Why? Because MOOCs are being massively adopted and people are really learning. It is hard to tell exactly who learned and who didn’t, but those who apply themselves learn well and enjoy the process, because engagement is strong with this one. These courses have gained such great visibility because they are able to capture students’ attention and engage with them in a novel and effective manner.
Now, the novelty of the method is not the product of any new principle or insight into how learning and teaching work. On the contrary, it is the application of known learning paradigms in a technological context. We already know what works and what does not in education; we have all heard of “active learning,” “peer learning,” “flipping the lecture,” and so on. MOOCs are simply an online implementation of what works. The reason MOOCs are so popular, however, is not solely their active search for an appropriate implementation of this common knowledge about teaching; their impact is magnified by higher education institutions’ passive omission in doing the same. Although we know how learning occurs and which methods teach effectively, the truth is that academic pedagogy is often poor, and classes can amount to a lecturer delivering a monologue, even though we know that is not an effective way to achieve information retention (Vardi, 2012). With MOOCs, learning is now more fun and, very importantly, quicker.
This constitutes a better way to achieve student engagement and knowledge retention, and it could only be achieved through technology. Technology is intrinsically disruptive: it seeks better ways of doing things that were already being done, in order to then achieve new things; its goal is to make things obsolete. One of the reasons some lecturers do not like MOOCs is the fear that they themselves may be made obsolete. And that is a very reasonable fear, as resource-hungry institutions may start to offer some modules online to save money, hiring fewer PhDs and making graduate or undergraduate students course advisors (How free online courses are changing traditional education, 2013). So, will lecturers be made obsolete or not? That is the question.
Technology’s solutions have advantages and disadvantages that must be weighed. Søren Kierkegaard, the Danish philosopher, spoke of a character he called the ironist: a purely negative figure who would not posit anything but only negate. The ironist would undermine the present state of things, bringing it to a crisis, and out of this crisis a new paradigm would emerge and establish itself as the new standard. The ironist, however, had no idea what the new paradigm would be; he neither proposed nor expected it, aiming only to bring about the crisis and undermine the status quo; he was merely destructive. That is technology. It exposes the points of lack and omission in the current educational system, pointing out what is wrong, but does not propose a viable solution; it is now the role of the university to plan a new structure to support these findings. Once a paradigm has been undermined, we cannot go back to it; we must face its shortcomings and seek a new model. When technology progresses we cannot ignore it; we must appropriate it, make it ours and progress from that point. There was an epoch when calculations were done by hand. When electronic calculators came about, there was no point in ignoring them for fear of losing one’s job; people had to make use of them and improve their roles. Likewise, universities must not cling to their current state but embrace this new advance and make good use of it.
The time has passed when the retrieval of information was hard, time-consuming and expensive; when the teacher was the sole bearer of the content and of information about where to find it: which books you should read, which authors are doing the best work. It is hard to search for that through books alone. You had to go to class and make the most of what you got there, because you could not get it anywhere else but from your teacher in the classroom. Then the internet came and gave everyone access to content: not only during a lecture, but at any time; not only at university, but anywhere. Information was finally within everyone’s reach, but the internet still did not provide the means through which people could assimilate the knowledge contained in that information; that is, it did not teach the content it made available. It is as if you were given loads of parts and told that you could now build an incredible robot, but no one gave you the instruction manual. You had the parts, but you could never finish building the robot. That is the relevance of MOOCs and the way in which they are revolutionary: they bridge this gap; they introduce an effective way to teach people the content that was made available to them. They can teach at any time, anywhere, teach well, and in an engaging manner.
So will they at some point replace universities, then? Well, with great power comes great responsibility. Professor Susan Holmes of Stanford University said: “I don’t think that online courses can give you a Stanford education, just as I don’t think that Facebook can give you a social life.” (How free online courses are changing traditional education, 2013). To answer this question we must understand that a university education is much more than a collection of completed modules. More than inculcating knowledge into a student’s head, a university aims to provide a liberal education that forms a critically thinking person, someone able to write and express themselves well. It presents students with a new environment, that of academia and research. For undergraduates, for example, it is nowadays a pivotal part of life, representing the emergence from underneath the parents’ wings and the first, or final, steps towards independence.
It is thus clear that MOOCs, although excellent at conveying information, cannot emulate the social aspects of universities. For universities, the impact of MOOCs is not an evacuation caused by a stampede to online courses; the impact of MOOCs is the impact of their methodology. And that is the main point of my presentation: the impact of MOOCs consists in the popularisation of their methodology and what it can accomplish. This is where technology was disruptive and undermined the former state of things; it now demands a new teaching paradigm. Why should I commute somewhere to hear someone soliloquising live, when it could be recorded so that I can watch it wherever I want? Why pretend we still live in a time when there is no way to convey this information other than telling it to me in person? There are now good, effective ways to convey information online, sometimes better than in person: videos interleaved with questions; lectures cut into pieces of about 10 minutes to avoid loss of attention; courseware available 24 hours a day. The university must embrace and appropriate these methods, understanding that they are very good at conveying information but that this conveyance is not relational. Asking questions, for example, does not work well in online learning. Universities must find ways to strengthen the social aspect of classroom time, the interrelational part of learning. Calling people into a room for a one-hour monologue is not good enough anymore. Making lectures available online will require classes to be better thought through, and will lead to better teaching (Rakera Tiree, 2015).
The integration of these technologies is inexorable: as time passes, standards grow higher and higher, and anything below them will sooner or later stop being tolerated. The University of West London is already adopting some of these technologies, such as 24-hour access to courseware through Blackboard and online access to classes with UWL Replay; I had one module in which content revision consisted of completing courses on Codecademy. This methodology will thus penetrate academic teaching, and a better use of time will be achieved through a more effective transference of knowledge, with the shortcomings of online education, such as assessment and the social aspect, supplemented by the university’s physical resources. There is still much room for improvement in the use of online tools and the format of the classroom. Partnerships with companies that offer MOOCs should be considered, with a view to offering additional modules online and thus creating a broader, more diverse curriculum.
It is now the responsibility of the university and its lecturers to build a solid way to integrate this fantastic tool into academic teaching, recognising that, in a time of continuous improvement, things will never be the same again.
I would like to read some words of Karl Marx on the revolutionising of production, which fit very well with the evolution of teaching:
Constant revolutionising of production, uninterrupted disturbance of all social conditions, everlasting uncertainty and agitation distinguish the bourgeois epoch from all earlier ones. All fixed, fast-frozen relations, with their train of ancient and venerable prejudices and opinions, are swept away, all new-formed ones become antiquated before they can ossify. All that is solid melts into air, all that is holy is profaned, and man is at last compelled to face with sober senses his real conditions of life, and his relations with his kind. (Marx and Engels, 2005)
As new, more effective and efficient ways of doing things are proposed, the unsettledness of change may corner us into preferring the stability of the good old-fashioned way of doing things. But we must not be afraid of change, for it will happen regardless of our fears; and even if we try to ignore it, it will come back.
I would like to end on the note that the issue under discussion must not be misunderstood to be that of distance education versus classroom education. The Open University has shown the feasibility of quality distance education already. It is ranked as one of the top 5 UK universities in student satisfaction and in the latest Research Assessment it was ranked in the top third of UK higher education institutions with 14% of its research as world leading. (“The Open University,” n.d.)
Bibliography and References
Chen, X., Barnett, D.R., Stephens, C., 2013. Fad or future: The advantages and challenges of massive open online courses (MOOCs), in: Research-to Practice Conference in Adult and Higher Education. pp. 20–21.
Daniel, S.J., 2013. Making Sense of MOOCs: Musings in a Maze of Myth, Paradox and Possibility [J]. Open Educ. Res. 3, 006.
How free online courses are changing traditional education, 2013. PBS NewsHour.
Johnson, D.D., LeCounte, J., Valentin, C., Valentin, M.A., n.d. The Origins of MOOCs: The Beginning of the Revolution of All At Once-Ness.
The early 19th century was heavily populated by definitive answers, a result of the Enlightenment’s relentless focus on reason. On the other hand, it was also populated by the thought that answers are never definitive, a result of Romanticism and its focus on subjectivity. Amid this clash of opposites stood Søren Aabye Kierkegaard, who chose not to choose a side in this contradictory framework and, although in favour of the law of the excluded middle, went to neither of the extremes of his times. Having to choose between being a Romantic and being a rationalist, Kierkegaard chose to be neither; nothing could be more suitable than a paradoxical position for an author whose life was dedicated to contradiction. Taking Socrates as his model, Kierkegaard seized the Greek philosopher’s task and made it his own life’s mission to use contradiction as a tool for the critique of actuality, a tool that is not to be resolved but appropriated. Kierkegaard’s stirring of a rethinking of the religious paradigm of his day provides us with a model for the application, and a demonstration of the continuing validity, of Socratic irony as a tool for the critique of our age’s objectively established conceptions, and represents a plea to accept that contradiction is a constituent part of life.
Kierkegaard’s life can be compared to the throwing of a bowling ball. Before the ball is thrown, before any movement is made, there is thorough preparation and careful consideration, a significant dedication of time and effort to mentally plotting its trajectory; once thrown, the ball follows its pre-established path to the end, even if the end happens to be farther than expected. Similarly, Kierkegaard’s life work was largely defined during his early years. In his master’s thesis, The Concept of Irony, Kierkegaard developed the mindset he would hold throughout the entirety of his career. The thesis explored Socrates’ character and life in the light of numerous philosophical trends of the early 19th century. This study had a constitutive role in Kierkegaard’s life, as is evidenced when, in his last years, he writes: “The only analogy I have before me is Socrates; my task is a Socratic task…” (M, 341). The Concept of Irony was thus Kierkegaard’s establishment of Socrates, and the negativity he represented, as the model for his philosophy.
A major influence on Kierkegaard’s thinking were the ideas of the German philosopher Georg Wilhelm Friedrich Hegel, which he studied meticulously. Hegel’s exposition of Socrates identified in him the conception of a hitherto unimagined philosophical concept that provoked an irreversible change of paradigm in the Greek worldview. In Sophocles’ play Antigone, Hegel saw a portrait of the pre-Socratic understanding of ethics and morals in the main character’s disregard for a newly established edict on the grounds of its incongruence with a higher, objective principle. Socrates’ mesmerising insight was thus the shift from this predominant culture of objective truths (given by deities or established in tradition) to the introspective world of subjective freedom, where the subject “must attain to truth through himself” (Lectures on the History of Philosophy, vol. 1, p. 399). However, this advance was not consummated in Socrates, who attributed responsibility for the judgement of important matters to a personal daimon, but had to be developed further throughout the following ages. Socrates did not achieve this change by establishing novel dogmas; Hegel saw him as a purely negative figure, the outcome of his work being not the institution of positive doctrines but solely the negation of existing ones. Drawing from this point, Kierkegaard constructs his idea of “pure irony” (CI, 253) as infinitely negative: that which is uniquely destructive and refuses to propose anything positive. By not actually suggesting any positive assertion, irony frees the ironist from any restraint on his position; hence “…the salient feature of irony is the subjective freedom” (CI, 253), the freedom the subject possesses to take any position he sees fit to fulfil irony’s purpose.
Johan Ludvig Heiberg, one of Hegel’s followers, building on Hegel’s theory of history and taking Socrates’ case as his matrix, identified a pattern connecting ironists and paradigmatic shifts in humanity. Heiberg holds that when scientific or philosophical advances undermine a society’s understanding of the world, a period of anxiety and uncertainty sets in until the old “actuality is displaced by another actuality” (CI, 260). This period is brought about by the ironist, for whom “actuality has lost its validity” (CI, 259). Not being bound to custom and tradition, the ironist enjoys complete subjective freedom and assumes as his role the spreading of awareness of the present crisis; Kierkegaard names such figures prophetic individuals. These individuals do not create the new actuality, for they do not posit anything, but only precipitate the crisis. Being ahead of their time, they are commonly misjudged and end up becoming a “sacrifice that the world process demands” (CI, 261).
Still following Hegel, Kierkegaard adopts the German philosopher’s view of Johann Gottlieb Fichte’s theory of subjectivity. Fichte, drawing on Kant’s philosophy of the subject, placed a strong emphasis on the subject and his sensorial and emotional perceptions, alleging these to be the key attributes affirming one’s individuality. Fichte’s theory was the basis for German Romanticism and its focus on subjectivity, which placed individual interpretation above collective judgement in all instances. Hegel was swift to point out the flaw in Fichte’s theory, noting its failure to perceive rationality as the uniting feature of the human race. Bearing that in mind, Hegel opposed the Romantics’ attempts to establish arbitrary rules based on their individual understanding; he believed the subject should arrive at the truth by himself, but that this truth is not contingent on the subject but universal: men’s shared rationality would warrant the arrival at a common truth. Following from that, Hegel saw philosophy’s goal as the revelation of that truth, and censured Socrates for stopping at the aporetic stage. At this point Kierkegaard’s and Hegel’s views begin to diverge; Kierkegaard saw Socrates’ function as that of the pure ironist, the prophetic individual, positing nothing and uniquely negating. For this reason, Kierkegaard was highly critical of Hans Lassen Martensen, a Hegelian professor of his who encouraged his students to doubt everything and then go a step further than Socrates by stating a positive concept. Kierkegaard considered this preposterous, for Martensen was establishing ideas as empty as those he had just disestablished, and also dangerous, for Martensen’s indiscriminate doubting led to the deconstruction of sound concepts whose absence would lead to despair. Kierkegaard’s critique took the form of a satirical novel, De omnibus dubitandum est, and he also identified the same pattern in Goethe’s Faust and in the Greek Sophists.
These are the leading ideas of Kierkegaard’s life work, guiding it from start to finish. During his first authorial period he wrote a series of signed and pseudonymous books, both groups dealing with similar themes, the former simpler and aimed at the common citizen, the latter scholarly in tone and focused on highly educated readers. His first publication under a pseudonym was Either/Or, a critique of Romanticism’s eminent irony, the indiscriminate critique of all society which “necessarily ends in an absence of all content, in a moral nihilism” (Poul Martin Møller, “Om Begrebet Ironie”). By juxtaposing two drastically different characters and offering no conclusion about what should be made of it, Kierkegaard made a clear allusion to Socrates’ aporia. Afterwards, in Fear and Trembling, Kierkegaard discusses faith and states the profoundly paradoxical character of Christian belief. It was the first step towards his goal of establishing contradiction as a necessary attribute of Christianity and using it as a tool for the critique of religious institutions. Fear and Trembling was followed by The Concept of Anxiety, where the idea of appropriation was introduced: “to translate the achievement of scientific scholarship into personal life, to appropriate it personally” (CA, 328). Then came Prefaces, where, just like Socrates, the author claims to know nothing and uses this to criticise the philosophical views of the day. It was followed by Stages on Life’s Way, a problematisation of the forgiveness of sin, which again wound up simply posing a contradiction without proposing any solution to it. During this time a series of signed works was published alongside the pseudonymous ones, and it all culminated in the Concluding Unscientific Postscript.
The Postscript revealed Kierkegaard’s intention with all his works, explaining that it was not his will to construct dogmas but only to point out the contradictory character immanent to Christianity and to foster the individual quest for what is true for oneself. It marked the end of Kierkegaard’s first authorial period. His second authorial period was dedicated to a severe attack on the Danish church and, just as before, he used the Socratic concepts of irony, ignorance, negation, aporia and maieutics, and Socrates’ role as the gadfly, to effect this critique.
Kierkegaard saw in Socrates an example of the prophetic individual and saw contradiction and paradox as the result of his ironic action. He identifies himself with Socrates and labours to bring to light the inherent contradictions of life. However, it must be noted that Kierkegaard’s paradoxes are different from the contradictions Socrates faced. Socrates saw that proposed definitions did not represent a concept accurately –excluding instances of it or including unrelated cases– and that established the contradiction; Socrates’ contradictions were discrepancies between a concept and its translation into language in a fixed definition. Thus, contradiction only appeared with the positing of a definition. The aporetic result of the dialogues was an attempt to avoid being contradictory by accepting the failure to find this perfect definition. In Kierkegaard’s themes, contradiction appears not only between the concept and its worded representation but also within the concept itself. Socrates was concerned with ideas like beauty, piety, justice and virtue, which, although difficult to explain, are notions of which one has a reasonably plain perception. Conversely, Christianity is contradictory an sich (in itself). The infinite in the finite, the eternal in the temporal, sin as a disregard for the universal and pious faith as a teleological suspension of it –all this evidences the inherently paradoxical character of Christianity, which establishes a contradiction even before the consideration of any positive description of it. Kierkegaard’s greatest appeal is for his readers not to try to solve the contradiction but to embrace it and keep it in mind as a never-ending critique of actuality.
More than a hundred and fifty years have passed since Kierkegaard’s attack on the church, and his insight holds its validity now more than ever before. In an age where change occurs at an unprecedented speed, Heiberg’s pattern of the substitution of actuality can be felt by individuals in their own lives. As actualities rapidly pass away and new ones come in their stead, holding onto a firm set of convictions becomes ever harder, and, without these, individuals are placed in imminent danger of falling prey to a state of everlasting alienation, being constantly stripped of fundamental principles. To avoid this despairing condition, one must take a stand and adopt a philosophy that holds true for oneself. However, this same condition creates a wariness of new ideals –that they will reveal themselves to be as empty as the former ones– and that is exactly where keeping in perspective the contradictions intrinsic to concepts performs its critical role. The paradox will serve to judge actuality; through it ideas will be accepted or disregarded, and in this process the individual will construct his own understanding of the beliefs shared in the community of our rationality. It is with this paradoxical criticism that we –like Kierkegaard in his time– face the uncertain future with boldness and the assurance that, if our thinking is right, like a thrown bowling ball we will stay on track even if the end happens to be farther than we expect.
Concept of Irony – CI
Either/Or – EO
Fear & Trembling – FT
Concept of Anxiety – CA
Philosophical Fragments – PF
Concluding Unscientific Postscript – CUP
Stages on Life’s Way – SLW
The Corsair Affair – TCA
Prefaces – P
Journals – JJ
Sickness unto Death – SUD
Practice in Christianity – PC
The Moment – M
History of Philosophy, Vol. 1-3 – HP(v)
A house is supposed to be a place you run to, not a place you run from. However, sometimes things can go a bit off track and, with time, drift astray to a point where one’s house can hardly be considered home anymore. One of the reasons for that is ontological: humans are never quite finished; we are eternally becoming. That means people change with time. Unnoticeably, this change can move the foundations of people’s thinking in opposite directions, leading to a point where, inside the same house, the core way in which life is seen and tackled differs so much that a dialogue between the worldviews is rendered almost impossible; and when this results in conflicting behaviours, living together is rendered impossible. That’s when home becomes just a house.
The main difference between a Democracy and a Republic is the existence of a Constitution. A Constitution is a document that states inalienable rights of individuals within a society. That means that even if 99% of the population wants to do something, if it infringes the rights of the other 1% they cannot do it. Just as in a state, in private life a house should not be a dictatorship. If many people live in the same place, they all must be able to express their individuality, thus everyone must have a say. Decisions must be made democratically, taking into account everyone’s opinions and preferences. As Mr Marx worded it, “the free development of each is a condition for the free development of all.” However, as in any democracy, majority rule can be oppressive for a minority. And that’s where the constitution comes in. Almost as a declaration of principles and values, a house’s “constitution” will be a guide to behaviour and decision-making, aiming to assure that no one –be it man, woman, visitor, a senior or a junior member of the community– will abuse anyone’s rights, combating despotism and arbitrariness of conduct, and making sure everyone’s behavioural foundations are clear and firm, keeping home home for everyone.
Because, as I heard somewhere, “when you are hurt and in trouble and you aren’t sure where to go, you go to the place where your heart calls home.”
Having that in mind, here is the “constitution” I would set up; these are my house’s values and rules (still under construction, new clauses are still being added):
Everyone must be treated with respect. The following paragraphs explain what it means to treat people with respect.
People must treat each other as they would like to be treated, while also following all clauses of the house’s constitution.
Men and women are intellectually equal and must treat each other as such.
Prejudice of any kind –be it based on skin colour, nationality, religion, sexuality, social status or anything else– is recognised to be the fruit of ignorance and is unacceptable.
Generalisations are seen with enormous suspicion, bordering on prejudice and therefore ignorance. Thus, they should be avoided.
Justice is understood to be possible only out of forgiveness. If there is no forgiveness it is not justice, it is vengeance.
Correction of any type aims at the establishment of justice. Correction’s goal is rehabilitation, never sheer punishment. The individual being corrected must be made aware of what was done wrongly, why it was wrong and what should have been done instead.
It is understood that actions are physical expressions of mental concepts; therefore, corrections are aimed at the thinking process that originated the behaviour, not blindly at case-specific attitudes.
Screaming, verbal aggression and physical aggression are considered despotic means of coercion that rely on fear and authoritarian abuse. They are understood to be primitive behaviour resorted to in the face of one’s inability to express oneself logically. This behaviour is a violation of the main idea of respect and is therefore intolerable.
Gossip is unwelcome.
Any action that does harm or refuses to help is considered wrong.
Senior members of the family shall be treated with deference –which means common respect plus acknowledgement of their historical importance in the family.
Polite treatment is part of being respectful. “Please”, “Thank you”, “Good morning” and “Good night” are not optional.
No one is obliged to do anything unless specified in this document.
Children are obliged to obey their parents as long as the parents’ orders do not go against this document.
Amendments to this document must be decided democratically among everyone 16 years old or over that lives in the house, taking into account children’s opinions.
There is no specific distribution of house chores based on sex. Chores are assigned according to one’s ability to perform them. Everyone who lives in the house must contribute to the house chores.
Everyone’s sleep is sacred. Whenever someone is sleeping, everyone awake must respect it and do their utmost not to disturb it.
Sharing is vehemently encouraged; however, it must be a personal decision. People have the right to individual, non-shared property –except for items of communal use like food, hygiene products, etc.
Once I heard that three things cause depression: loss, injustice and uncertainty. Being uncertain of life, or of your consciousness, is without a doubt scary enough to qualify, but finding out that a life is in your hands is game-changing as not many things can be. Monday night I went out with my church’s medical van to provide minimally invasive medical treatment to London’s homeless. I am not a doctor myself and my knowledge of the human body is pretty much limited to the names of our body parts, but, nonetheless, I play a part in the team. While doctors are taking care of people inside the van, some friends and I are talking to the ones waiting to be seen. My task is to ‘build relationships’. C’mon, I don’t give any medicine, I don’t influence diagnoses, I don’t sign any papers, I don’t even drive the van; I just walk around and talk to people about random stuff. Who are we fooling? I don’t really produce very much. It can be regarded as social work, but while I am there, productive would not be the most suitable adjective to describe me. At least that is what I honestly thought. Until this Monday.
A homeless family of about nine people was being seen, one by one, at the medical van. First a daughter, then the mother, the mother’s boyfriend and then the uncle. The uncle was taking quite long. After some time the van’s door opened and the doctor called the mother over. We could see through the open door that the uncle had a very sad face and was on the edge of crying. The mother went in, the mother came out, a daughter went in, the daughter came out, the family became agitated, small arguments erupted and were rapidly contained, everyone began to light their cigarettes. It didn’t take long for the ambulance to arrive. They took him to the hospital.
What happened? He was having a heart attack. He had gone to the van because of a blood circulation problem in his leg due to his excess weight. While waiting, he commented that he had been feeling some chest pains. We asked if he was feeling them at that moment. He said no. He was a man of few words. We spoke a little more until he was called inside the van. By the time he went inside, he already felt confident enough and opened up to the doctors. They checked him, called the ambulance, and saved his life.
He would not have been honest with the doctors had he not felt secure. Being homeless you learn you are on your own and you can trust no one. However, while waiting he met David, Peter, Ajay, Ken and me, the unproductive relationship building team. We spoke to him, listened to him, paid attention to what he said, we did our job, showed him what we are there for. We are there to help; we are there because we care. If it wasn’t for that he would probably be dead by now.
Then that raises the question: were we being productive while doing our social work, or was it just a beautiful coincidence worth blogging about that happened while we were doing something to calm our consciences and stroke our egos, building up an image of benevolence? I think productivity is doing things that improve our lives. We try to improve our lives by working so we get more money and eventually more comfort; we read and study a lot so we are more knowledgeable and cultured; we exercise! No one wants to be fat or ill. We spend all this time doing all that so we can live better and have others live better too. It reminds me of my brother when he was very little. He wanted to kick the ball as hard as possible, so he took some distance. Then he took more distance. Then, if he could only take a bit more distance, the kick would be even harder. He was already quite far, but he was sure that getting a little farther would make a difference. He spent so much energy trying to get more distance that he practically forgot about the kick.
We are always doing stuff so we can live better. Very often we forget the living part. Brian had his life saved because he felt we placed value on him. That is something no one had done for him for a long time. He felt like living again. He felt alive, and that saved his life. And there I was, trying to be productive. I forgot Brian was not a number (the fourth, third, fifth person to go on the van). I forgot living is about people. I got distracted taking the distance and forgot about the kick. Brian was finally living, right there with me, but I wasn’t living along.
After reading a biography, I usually end up with words in my mind that express the heart of what that person’s life was all about. Warren Buffett is about integrity and diligence. Jack Welch is definitely about hard work. I thought Nelson Mandela’s life would be summarised in words like ‘nonconformity’, ‘fight for freedom’, ‘endurance’ and ‘perseverance’. It was unquestionably about all these things, but even though Mandela calls himself a freedom fighter, what most strikingly stands out –the idea that is woven through every line, that leads the plot and guides his actions– is ‘Reconciliation’. Nelson Mandela’s entire life shouts it out.
I won’t go deep into South African history, but you probably know it was a racist country with racist policies that culminated in the institution of apartheid in 1948, depriving all non-whites of democratic rights – like voting and the right to come and go – and of the chance to go up the economic ladder. It created severe segregation of races and institutionalised the view that white people were superior to all other races, blacks in particular. Growing up in this environment, Nelson Mandela’s nonconformity with the racist system grew together with his ever-increasing political engagement.
Mandela fought the government with all possible weapons: legal actions, mass demonstrations in non-violent protests and even violent enterprises. He became a prominent leader in the African National Congress, the black party, and endured nearly three decades of imprisonment for the sake of the cause he believed in.
In a racist environment, being constantly harassed by a specific ethnic group, it is very easy for racism to get you. Nonetheless, Mandela never forgot his fight was against prejudice and oppression, not whites. In passage after passage of the book, you see Nelson Mandela trying to ‘evangelise’ his oppressors, teaching them about his struggle – and succeeding in a number of his attempts. He didn’t only reconcile white with black; Mandela used his incisiveness to bring together enemy parties and enemy politicians, overcame his own pride to take the step of reconciling his organisation with the government, and in the end reconciled his country with itself.
Nelson Mandela’s biography has a moving end, with the sight of so much hatred and oppression being overcome by justice; a culmination to a happy ending in a real-world tale. I cannot finish this text with my own words, and who better than a freedom fighter to comment on what freedom really is? Note that even in this simple paragraph or two his words exude reconciliation.
I am no more virtuous or self-sacrificing than the next man, but I found that I could not even enjoy the poor and limited freedoms I was allowed when I knew my people were not free. Freedom is indivisible; the chains on any one of my people were the chains on all of them, the chains on all of my people were the chains on me.
It was during those long and lonely years [in prison] that my hunger for the freedom of my own people became a hunger for the freedom of all people, white and black. I knew as well as I knew anything that the oppressor must be liberated just as surely as the oppressed. A man who takes away another man’s freedom is a prisoner of hatred, he is locked behind the bars of prejudice and narrow-mindedness. I am not truly free if I am taking away someone else’s freedom, just as surely as I am not free when my freedom is taken from me. The oppressed and the oppressor alike are robbed of their humanity.
“I don’t like drama.” My girlfriend was denying liking the very movie genre I regard as the best. It just couldn’t enter my mind how someone could not like drama. Poetry, music, paintings, photography – all arts inspire a sense of empathy and compassion that is, after all, what captivates us. It is when we partake of the emotions disclosed in the verses of an unrequited lover, immersing ourselves in his story as if it were our own, that we truly admire the beauty of his composition. All art shares this characteristic, the attempt to convey emotions. And which works are better at it than those classified as drama? Which are deeper, or bear a greater metaphor for our own reality?
I want to watch a movie called Poetry. It is the story of a lady who discovers she has Alzheimer’s and decides to study poetry as her awareness of the world fades away. How sad that is, and how big an inspiration to make us see the world around us differently. We have the privilege to remember and to move forward, whilst she was doomed to move increasingly and relentlessly backwards. No action movie can invite you so strongly to a review of your preconceptions. It is a mind-changing experience to have someone else’s tears streaming from your eyes. It is more than entertainment; it is the creation of fellowship in the suffering of a person.
“I just don’t like to feel sad.” Absurd! What a preposterous assertion! Sadness is an integral part of the world – a much bigger part for some than for others. Watching a drama, we have the opportunity to get a taste of what the character was going through. And then I stopped. You know, the character probably would rather not have gone through the sad situation; how wrong would it be, then, to not want to take part in something that was wished not to exist? “Oh, but we have to be aware of other people’s realities”, you may say. But tasting people’s pains is not a requirement for active compassion to come about. Doctors do not need to contract illnesses to be compassionate or willing to help patients. And by all means, drama is still only an entertainment genre: it will entertain the consumer and do nothing to aid those living the dramatic situation shown.
In the end I realised, as usual, that the fool was me. She was right to want her mood lifted up and not brought down by her entertainment. After all, the life we all aspire to is a painless one. My girlfriend’s remarks made me realise how curious it is that we enjoy being amused by experiencing the very things we avoid. Dale Carnegie said the thing we love to talk about the most is ourselves. Maybe we are attracted to drama because it is in some way talking about us. Maybe it is because we identify with it. Maybe the tears we are crying are not the character’s but our own.
Whatever you are planning to do, just do it. Sometimes you just plan too much. It would be great to always have the best of everything: to study at the best university, do the best postgraduate course, write the best essays, create the best product, write the best book, kiss the prettiest girl and so on. However, I realised that you can make an impression with much less than that. To be the author of a bad book is better than authoring no book at all.
The more you read,
The more you know.
The more you know,
The smarter you grow.
The smarter you grow,
The stronger your voice,
When speaking your mind
or making your choice.
A poem for kids I read on the wall of a friend’s house. Nothing special in it. No hard words or novelty of any kind, just a simple incentive to read, with rhymes in it. Nonetheless, I can’t get it out of my head. I find myself reciting it over and over again during the day. I am convinced it is a good and effective tool to encourage children to read. The only thing you have to do is keep repeating it to the kid.
Simple but effective. The secret is in the repetition, not necessarily in perfection. Repeat something ordinary enough and BOOM, it magically becomes extraordinary. There was a series of advertisements by a bank in Brazil that always stated at the end, “I am Brazilian and I never give up”. People saw the ads so many times a day that they began to say it as a joke. “That girl said no to me, but I’m Brazilian and I never give up”; “I’m really full but there is still food on my plate, but I am Brazilian and I never give up.” After some time, that became one of the attributes of the Brazilian people’s character in the mind of the nation. It became an encouragement in times of adversity. When natural disasters plagued the country, the people worked together through the situation with the mindset that we are Brazilians and that’s what we do.
The key to your extraordinary future is in an ordinary daily habit. What is the one thing that, if you do it every day, will yield you a game-changing future?
I write on this blog. I haven’t written here for three months now. I was planning to do a mind-blowing, brain-exploding, paradigm-changing post, which never came out. I wanted to write but I kept thinking, ‘oh that’s not good enough’. Then I realised I don’t have to change the world all at once. The magic is in keeping writing. The magic is in keeping doing, keeping marching, keeping rowing. So, whatever you are planning to do, just do it. You can improve it later.
It is the one simple thing that doesn’t need to be perfect, that you do over and over, that will give you extraordinary results at the end of the journey.
I was sitting in the passenger seat. We were about to have a serious conversation about what I was about to do. “Some people take what is built and established and want to deconstruct it. They want to take everything apart and see if they can build something different out of it. By doing so they completely distort the paramount characteristics of the thing, ending up with something that opposes what it was at first.” I couldn’t disagree more. It is easy to stick to what we are used to doing, but change is sometimes hard to accept — especially when it is a super radical change.
There is a song in Portuguese that says, “I’d rather be this walking metamorphosis than have that old formed opinion about everything.” Isn’t that great? Isn’t that hard? He says he’d rather change his mind over and over again than hold that same old opinion that may be based on things that made sense back then but do not make all that sense anymore. But he goes on to say the most daring thing: “I want to say the opposite of what I said before.” As if it were not hard enough to change your mind, standing up and proclaiming the exact opposite of what you previously so surely stated takes a lot of guts.
I invited a friend of mine to write on the blog. I told her, “posts must have more than 600 words. Preferably, more than a thousand.” She accepted, but three months later she has still written nothing. My blog posts are so long not even I have the patience to proofread what I write. It’s time to change! I want to say the opposite of what I said before. New rule on the blog: from now on, no post will ever have more than 500 words! It doesn’t matter how great, interesting, complex, or exciting the subject is, no ultra-long posts anymore — I know no one has half an hour to read posts on random subjects anyway.
Also, there is another new rule. In past posts I tried to say a lot of stuff in one single text; that’s not happening anymore. From now on, each post will transmit one simple idea. Dr Mike Murdock always says, “when you are sick and tired of repeating something, then people are beginning to understand.” Nothing personal, guys, I don’t think you are dumb — actually, if you read this blog, chances are you are amongst the most intelligent independent thinkers on the entire internet — it’s just our human brain doing its information selection process.
Be ready for a new It’s All About Focus: a faster, more flexible and concise website. I’m working out my thinking and making some 180-degree changes. I don’t know everything, and when I discover something new I may have to review some concepts. What about you? Can you say the opposite of what you said before on something?