My last post (on some US university curricula becoming more applied) reminded me that there is quite a difference in approach between European and US universities.
European Universities, a category which by extension includes many Commonwealth countries (including New Zealand, where I have experience), tend to specialise far more quickly in a chosen major, often requiring students to strategically choose papers relevant to that major in their second, or even first, year. This tends to favour ‘narrow and deep’ degrees.
On the other hand, the philosophy of US Universities (I hope I can represent them accurately) is more that you are not likely, or even expected, to be prepared for your career by the end of your undergraduate degree. Therefore you are more likely to get a ‘broad yet shallow’ (e.g. liberal arts) degree that gives you the potential to work in a wider variety of careers after further training – either postgraduate study or on-the-job training.
I am not sure which approach is best. On one hand, I have always tended to have broad interests (taking calculus, computing and philosophy papers during my Biochemistry degree while pursuing business, music and film extracurricularly), and I strongly believe this gives you valuable multi-disciplinary perspectives on whatever you choose to focus on. However, I also believe your undergrad degree provides a valuable opportunity to dig deeper into a topic, start applying it, and see if it is a good fit for you.
In some ways, this seems confusing, and almost contradictory. I think the problem comes from misinterpreting specialisation as being the same as application. On closer inspection, they are not – I could know everything there is to know (of what knowledge is currently available) about a given, specialised topic, yet have never applied it – either to test a theory learned, to acquire new knowledge, or to tackle a real-world problem. Therefore, maybe both the European and US University philosophies are ‘wrong’, as neither typically focuses enough on the application of knowledge, or even on understanding how it can be applied. I don’t mean doing homework assignments where you apply your knowledge to theoretical real-world examples (although this is a good start) – I mean getting out into the real world, understanding what it is like to apply this knowledge as a career, and understanding the impacts this has downstream. I guess this is partially addressed by work experience, but even that is too narrow, and the last time I experienced it was in my mid-teens.
I have come to think of this kind of ‘knowledge application’/’work experience’ as a painfully obvious form of risk minimisation. If you think of achieving your degree as a form of investment (of time and resources), then isn’t it basic due diligence to dedicate even a fraction of that investment to ensuring it will generate a satisfactory return? In other words, ensuring that the time and resources put in result in a return of wealth and satisfaction. Assuming you spend 40 hours a week for 48 weeks a year over 3 years on a degree, that is 5760 hours, not including tuition fees. I would propose that spending even 3 weeks – only about 2% of your investment – investigating what it would be like to apply that knowledge would be immensely valuable, and could save you a lot of pain in the long term. I suspect the ideal figure is actually considerably higher than 2%.
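For what it’s worth, here is the back-of-the-envelope calculation behind those figures as a small Python sketch. The hours, weeks and years are simply the assumptions from the paragraph above, not hard data:

```python
# Rough numbers only - these inputs are the assumptions stated above.
hours_per_week = 40
weeks_per_year = 48
years = 3

degree_hours = hours_per_week * weeks_per_year * years  # 5760 hours
investigation_hours = 3 * hours_per_week                # 3 weeks ~ 120 hours

share = investigation_hours / degree_hours
print(f"Degree investment: {degree_hours} hours")
print(f"Investigation time: {investigation_hours} hours ({share:.1%} of the total)")
```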
Of course it is too much to ask for college students to ‘find’ this extra time, and to structure and pursue these investigations on their own. Therefore, it would make sense for Universities and Colleges to do this as part of their degrees. I am sure a few already do some of this, but I also know that a lot more could be done, and that a small extra investment up front could have a huge return just a few years down the track.