Ask HN: Our college is teaching us outdated technology
17 points by ponyous on April 5, 2014 | 40 comments
One of my colleagues just sent me this quote saying "I think this is how our college works": "At the end of your four years of torture in university, there is the tendency to think that "I’ve learned all there is to learn"… only to come out into the real world and realize, "I’ve been learning outdated technology!""

We are basically learning stuff like XML, XSD, XSLT, JSP... I'm not saying any of this is useless, but I think priorities should be somewhat different, like learning JSON instead of XML, PHP instead of XML+XSLT...

I wonder what the right way is to tell professors that we are learning things that won't get us a job as "easily" as PHP/JavaScript would. Are there any other changes that I, as a student, could propose to improve our college program?



First of all, college isn't vocational training. So get over the whole "get us a job easily" thing.

Your employment is dependent on the effort you make towards your chosen career.

The true purpose of college is to teach you how to learn, to expose you to many new ideas, and to give you the opportunity to focus your life on thinking with the least amount of distraction.

If the curriculum is different than what you desire, then write your own curriculum, enroll in independent study classes and discuss with your advisor how you can achieve your goals.

If you approach it from your personal perspective you'll have much more success than if you approach it in the way you state "tell professors..." or "improve our college program". Those are bureaucratic battles, and who wants to learn about the gnarly dark underbelly of academia? You can go that route, but to win you'll end up spending your nights and days formulating presentations to administrative decision makers. YAWN. Institutional change is hard in all the wrong ways and dull in every way. And you know what the most likely outcome will be? A dean will agree with you and promise to look into adding a JS class in the 2015-16 academic year. You'll feel victorious but the dean will forget about it in 10 minutes.

By taking personal responsibility for your own curriculum you can achieve something greater than a college degree. You will learn and exhibit two of the most important attributes in the software industry: initiative and independent self education. If you master those skills you're well on your way to a great career in engineering.


I was taught about XML, XSD and XSLT. When I got into industry, it made me wonder why we stuck with unstable JSON APIs and hand-coded validation instead of using schemas for validation.
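
For anyone who hasn't run into it, the schema-driven version is only a few lines. Here's a rough sketch using Python and lxml (a third-party library; the file names are made up for the illustration):

  from lxml import etree  # third-party: pip install lxml

  # order.xsd and order.xml are hypothetical files for this sketch
  schema = etree.XMLSchema(etree.parse("order.xsd"))
  doc = etree.parse("order.xml")

  if not schema.validate(doc):
      for error in schema.error_log:
          # every violation is reported, with no hand-written checks
          print(error.line, error.message)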

I was taught about SQL. When I got into industry and started using MongoDB, I wondered why data would go missing, and then realised it's because transactions are often a very good thing to have.
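
For contrast, here's a minimal sketch of what a transaction buys you, using Python's built-in sqlite3 (the database, table and amounts are made up for the illustration):

  import sqlite3

  conn = sqlite3.connect("bank.db")  # hypothetical database with an accounts table
  try:
      with conn:  # the block commits as a whole, or rolls back on any exception
          conn.execute("UPDATE accounts SET balance = balance - 100 WHERE id = ?", (1,))
          conn.execute("UPDATE accounts SET balance = balance + 100 WHERE id = ?", (2,))
  except sqlite3.Error:
      print("transfer aborted; neither account was touched")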

I learned some Modula 2, in 2011. Not because we were writing Modula 2, but because we were writing a compiler for it.

I've found quite a few times that while the technology we learn is old, there are many beneficial lessons to take away from it. Not just general theories and processes, not just learning how to learn, but there are actually some seriously good ideas in software development that new technologies haven't got, and often we in the startup/SV scene either haven't learnt them, or don't think they are important because they're 'old'.

I have several times found engineers who proudly 'didn't go to university' or taught themselves to be hugely behind in terms of knowledge, and they just end up re-inventing things which have been around for decades, usually in Javascript. Re-invention isn't always a bad thing, it often injects new ideas, and the new versions can be really great to use, unlike some older technologies, but I think a significant number of people who think colleges are teaching outdated technology probably need to go to college.

Disclaimer though, this is based on my experience at one university, in the UK. I understand it might be different in the US and at other universities. I just find it frustrating when I know that my degree has taught me a huge amount of very useful stuff that I know some people in industry are lacking.

As far as getting a job goes, if you actually look at what most places want, it's Java, XML, C#, etc. Maybe not in the startup world, but the enterprise market is considerably larger, and ultimately where many people end up working.


>to be hugely behind in terms of knowledge, and they just end up re-inventing things which have been around for decades, usually in Javascript.

I have taken a few university courses, but have primarily taught myself and would greatly appreciate if you could expand on this.


A good University will teach you things you would almost never come into contact with on your own.

I will give you some random examples from my own degree. Some concepts are quite practical and others are theoretical, but all of the examples below taught me -- as someone who had started programming when he was 12, back in the DOS era -- something new.

- Data structures. I had played around with them myself before I went to university, but being formally introduced to the concept -- lists (Queues, Linked Lists, Stacks); trees (Binary, B-Trees, Heaps, etc.); the performance characteristics of each and how you would apply them -- was new to me. In my university it was first taught with mathematical notation followed by practical implementations in Pascal.

- Learning Prolog in my first year. An entirely declarative language that uses backtracking to 'solve' relations based on rules and facts. I loved this class. Prolog's not terribly practical but it taught me a lot about abstract data types and 'recursion'. The idea that a list is defined as a head (the first item in the list) and the tail (the rest of the list) which can be iterated over recursively was a major eye-opener to me, an until-then hobbyist coder (there's a small sketch of the idea after this list).

- Haskell. A stark contrast to Prolog and Pascal: here I learnt the power of composability and the ability to capture complex ideas with very powerful constructs and a very strong type system. The language is lazy, so you can operate on infinite 'streams' of items, which is another interesting way of thinking about the theory of computation.

- We were taught how to specify a PDP-11 (RAM banks, CPU, I/O) using an academic language based on "rewriting logic". It was a very theoretical yet practical way of specifying a computer using only mathematical principles. The idea that the number 0 and the successor function are all you need to define the natural numbers -- and, combined with recursion, basic arithmetic -- was again the sort of thing you never ever do outside University.

- Compilers. I wrote compilers using lex and yacc. I have used that theory -- and the theory of regular languages -- to great effect since then to formally construct grammars and parsers for tasks I needed later on in life. And I could do it effectively and efficiently without reinventing the wheel or coding myself into a corner -- which is easy to do with compilers.

- Computer graphics. Basic convolution filters; image processing and the general theory of 3d graphics.

- Graph theory. When I first took it, it seemed utterly "useless", but it has since been one of the most useful things I know. I grasped the concept of Git right away thanks to that knowledge.

- Operating systems. Networking. Basic security. Complexity theory (the study of algorithm complexity) -- and the list goes on.
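
To make the head/tail point from the Prolog item above concrete, here is a minimal sketch -- in Python rather than Prolog, purely as an illustration of structural recursion over a list:

  # a list is either empty (base case) or a head plus a tail (the rest of the list)
  def total(xs):
      if not xs:
          return 0
      head, tail = xs[0], xs[1:]
      return head + total(tail)

  print(total([3, 1, 4, 1, 5]))  # 14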

I remember maybe 30% of what I learnt but I can quickly jog my memory or pick up where I left off. I consider my degree invaluable from an academic sense...

... but it did nothing to make me a better developer. That, unfortunately, is something you have to work hard at. I got lucky: I had already been programming for 10 years by the time I graduated.


There's also the fact that older doesn't necessarily mean outdated or unused. There are still many jobs out there that require XML, XSD, XSLT, JSP, etc.

Organisations have software that's been around for many years, in many instances decades. Curiously they don't rewrite everything just because COBOL or Java or C++ or whatever has become unfashionable among some portions of the dev world ;-)


As I wrote in the original post: "I'm not saying any of this is useless, but I think priorities should be somewhat different"


Why?

Genuine question ;-)

What utility value will you get out of PHP rather than Java? Why should priorities favour the PHP over Java, or C, or Lisp, or whatever?

I still have this vain hope that folk will get back to teaching programming - rather than specific programming languages. During my degree back in 1988-91 we built non-trivial programs in all of: Pop-11 (yes - nobody has heard of this ;-), Prolog, Lisp, ML, Modula-2, C, plus some stack-based assembler whose name escapes me at the moment. Not to mention trivial play with Smalltalk, Occam, shell scripting and probably others that I've forgotten. And this wasn't even a straight CS degree!

Sometime between then and now, the universities started churning out people who had been taught just Java, or Python, or some other single language.

Sigh.

I seem to have ranted a little off-topic... I'll shush now ;-)



I'd worry more about the general topics that you're learning, rather than the technology that you're using to learn them.

Are you in a Computer Science class or a bogus "how to use technology X" class?

Are you learning Discrete Math, Big O, how to write a compiler, assembly language (doesn't really matter what platform)?

If you're learning CS, then you will have the tools you need to tackle any job using whatever tools you have at hand.

The "huge" differences between JSON and XML don't really matter if you can write a parser...
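
To make that concrete, here is a rough sketch of a recursive-descent parser for a tiny JSON-like subset (strings without escapes, integers, and lists) in Python. It's an illustration of the idea, not production code:

  def parse(text):
      value, _ = parse_value(text, 0)
      return value

  def skip_ws(s, i):
      while i < len(s) and s[i].isspace():
          i += 1
      return i

  def parse_value(s, i):
      i = skip_ws(s, i)
      if s[i] == '"':                      # string (no escape handling)
          end = s.index('"', i + 1)
          return s[i + 1:end], end + 1
      if s[i] == '[':                      # list
          items, i = [], i + 1
          while True:
              i = skip_ws(s, i)
              if s[i] == ']':
                  return items, i + 1
              value, i = parse_value(s, i)
              items.append(value)
              i = skip_ws(s, i)
              if s[i] == ',':
                  i += 1
      j = i                                # otherwise: an integer
      while j < len(s) and (s[j] == '-' or s[j].isdigit()):
          j += 1
      return int(s[i:j]), j

  print(parse('["xml", ["json", 1], 2]'))  # ['xml', ['json', 1], 2]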


> We are basically learning stuff like XML

I find this hilarious because before I can go further into hacking on fancy cutting-edge deep learning models, I have to process years and years of XML data from an external party.

As someone else said, the only purpose of University is to teach how to learn. Everything else is up to you. Again, there are places that teach you specific skills that are job relevant now (Codeacademy/Bootcamps). You could very well do them and get a high paying job too.

However.

Figure out what you want to eventually be in life. Do you want to be a highly paid janitor who can be easily replaced? Or do you want to be the guy who can see through all the bullshit frameworks and design solutions that are robust? The guy who remains relevant at 40.


If you feel uneasy with college, and you already have the skills you need, skip a quarter, semester or whichever time measure your college uses.

Your college should have a head professor; talk with him, explain your concerns, and scrutinize the curriculum with him. He should give you more insight into where you might not be as good as you believe, and you will get valuable hints about where to start learning on your own.

Take the time you skipped to experiment with what it is to be a self-educated person: try to research on your own, read books, attend online courses, build projects, and do everything else you are comfortable with to learn at your own pace.

If you are completely sure that you don't need college anymore, and you feel comfortable being a self-educated person, drop out.

EDIT: In the country where I live (Mexico), there is an institution named CENEVAL[1], where you can graduate based on knowledge acquired through job experience and self-education. Maybe there is a similar institution in the country where you live.

[1] http://www.ceneval.edu.mx/ceneval-web/content.do?page=1927


The stuff you're learning isn't outdated. It just might be more career-focused rather than front-page-hacker-news-or-reddit-worthy.

The example I like to bring up over and over again on HN is Union Pacific, because it is a great example. You will probably never see an article on the top of HN about railroad .NET or Java code, but it helps drive the US economy. (And they pay well. I have younger friends out of college who made $55K immediately and more experienced devs making $100K, which is insanely great compared to California given the cost of living differences. You're talking about owning the equivalent of a million-dollar CA home at 25 with cash to spare.)


You're going to have to live with this - the point of college is to teach you how to learn.

Longer version:

For the last two years I've worked for a major university - one of the top fifty. Prior to working for the university, I spent thirty years in industry. During my time in the commercial market, I thought of universities as ivory towers where everything is the latest and greatest ... because reading research papers only shows you the latest and greatest.

When I got here, I was shocked to find that many of our systems were 15, 20 or even 25 years behind those we were using in industry. Not that they weren't stable, but since our primary job is to "educate the youngin's", working systems are valued and there is no competitive advantage to replacing them.

My son is a super-senior in college, and I've noticed that the information he's learning is (on average) five to ten years old. And this actually makes sense. Research is bleeding edge, but much of it isn't commercially valuable. Since undergrad degrees are supposed to produce graduates who can obtain jobs in industry, industry generally has to adopt a technology before it's likely to be used in education.

Once industry adopts a technology, a professor (or department) has to recognize that adoption, make sure it's not simply a technology fad, verify that it aligns with the theoretical approaches that are valid and develop a curriculum.

Your university is actually providing you a valuable education by focusing on the technologies that are valued in industry, albeit a little behind the curve. Why? Because JSP is still used by a huge number of corporations and as a mark-up representation it's a pretty good example of many comparable technologies. Furthermore, JSF has been strengthened in the latest JavaEE releases and many large corporations have adopted these changes.

What if you don't want to work for a big stodgy corporation? Then you probably don't need to, but I question whether you're really following the technology trends very well. "PHP instead of XML+XSLT" sounds like a horrible way to do ETL, even for a start-up. Why wouldn't you pick NodeJS in that instance?
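
For what it's worth, applying an XSLT stylesheet is only a couple of lines of glue code once the stylesheet exists. A rough sketch with Python and lxml (third-party, and the file names are invented for the illustration):

  from lxml import etree  # third-party: pip install lxml

  # orders.xml and orders_to_csv.xsl are hypothetical inputs for this sketch
  transform = etree.XSLT(etree.parse("orders_to_csv.xsl"))
  result = transform(etree.parse("orders.xml"))
  print(str(result))  # the stylesheet carries all of the mapping logic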

There are thousands of technology stacks and it's not practical for your university to teach all of them to you. How invested are you in learning the ones you're curious about outside of your classwork? You do realize you're responsible for your education and career, right? If you don't learn new technologies on your own in the corporate world, you'll quickly become outdated.

So finally, my suggestion: Learn one of the technologies you're interested in, then go have a discussion with the professor about how that technology compares with what he's teaching you. I'm hoping he'll be intellectually curious and engage deeply in a conversation like this - recognizing that he can also learn something in the process. If not, your professor has indeed failed.


I think what you say makes a lot of sense, but the "learning to learn" idea has always bothered me a bit. All else being equal, why not learn something applicable while learning to learn?


Thanks for your reply, it makes sense.

This fellow student also said something along these lines: "Next year I'm dropping out, I can learn 10 times more on my own than here...". What do you think of this? Is this a good decision?

A few of you mentioned "The point of college is to teach you how to learn", which seems interesting, but it doesn't seem useful for me anymore since I already know a lot of stuff/tech. Should I also drop out? What is your suggestion for a student who is amongst the best in college? (I already had a few job offers from start-ups in my country, I have good references, I'm researching on my own, I won a few computer science competitions...) Currently I'm working for a startup and completing the first year of college, but the replies here got me thinking - I might drop out next year.

I wrote PHP just as an example. I actually had Node written there, but then I changed my mind and wrote PHP instead of Node because it's more popular. You can replace PHP with anything more popular/new/trendy/useful there.


I could learn 10 times more in my own time, about Node.js and MongoDB.

It would take me 10 times longer, if I ever got around to it, to learn about theories of computation, languages/grammars/parsers, operating systems, discrete mathematics, evolutionary algorithms, cryptography...

There's more depth to the subject than the technologies that are popular right now.


Asking "Should I drop out?" is a dangerous question in this field. You're going to get both answers, the college grads will tell you that you absolutely should stay in school, and the non-grads will tell you that they never went to college and they turned out just fine.

Coming from the former camp, I'll tell you what I personally thought was most important, based on my very, very limited few months in industry. In addition to teaching you how to learn, college, especially a CS degree, teaches you how to think. I'm not using the things that I learned in college at my job. We use different languages and different technologies. I'm not writing up search algorithms and I'm not doing math problems to figure out if my code is fast enough. But all the learning that I did in college prepared me to learn all the things on the job that I needed to learn.

Your friend is saying that they can learn 10 times more on their own. Great. They should do that. And so should you. Teach yourself as much as you can. You're going to learn older, more stable things in college, because those languages/technologies make it easier to get the underlying concepts across. But if you have the drive to teach yourself things on your own, then you'll be better off when you graduate. It shows initiative to future employers and you'll develop great skills while you're at it.


I had been programming before I got into university, and for me, there would have been a lot of stuff I wouldn't have gone on to learn on my own if I hadn't heard about it in university. Going to university exposed me to new ideas I could further explore deeply in my own time. It also helped formalise some of my knowledge, for example the language theory behind regular expressions, as well as what OOP means exactly. More ways to describe concepts in my head, I think, has helped me compose bigger ideas.

Today I'm working as a web developer using Django in Python. I'd imagine if it wasn't for having watched my professor demonstrate a CGI web server written in C, I wouldn't have gone on to explore the rabbit holes of PHP, Rails, App Engine, and finally Django in my own time, and I suppose rather than working as a web developer (and doing "real engineering" using version control, unit testing, integration testing, etc... all of which I first heard about in university and then further explored in my own time) I'd be slaving away in a company which works without version control and wouldn't know any better. And yes, don't laugh, I have indeed worked part time in a company without version control, before I learnt about it in university; when I did learn about it I introduced it to the company, and then quit. Let me tell you, they were very grateful I told them about it. Nowadays they no longer merge code by hand.

    "Next year I'm dropping out, I can learn 10 times more on my own than here..."
I have found that to be generally true, i.e. I learn 10 times faster on my own time than by attending lectures, doing tutorials and doing assignments, all of which were dumbed down for the average student. But in spite of that, university has still been worth it for me.


This seems to be a good perspective on college, but I still don't think college has more upsides than downsides for me or anyone else who follows sources where the latest tech is exposed (HN, reddit, ...). All the things you have mentioned (version control, unit/integration testing, ...) I have already heard of or already know - I'm not saying there isn't something else, but I think I will get to know more and more things as I learn at work, by developing side projects, or just by reading about them somewhere.

Damn, I'm desperate, wasting my time with folks who are totally uninterested in learning something on their own, learning something that probably won't help me in the future...


> More ways to describe concepts in my head, I think, has helped me compose bigger ideas.

This is the most valuable thing you will get out of any learning you do (in college or otherwise).

To quote Joel Spolsky [1], "... the best programmers all have an easy aptitude for dealing with multiple levels of abstraction simultaneously."

[1] http://www.joelonsoftware.com/articles/GuerrillaInterviewing...


I think dropping out would be a mistake and I can tell you from experience that I discounted all the "Basic Degree Requirement" and "GenEd" classes I was required to take. My advice is going to sound quite strange but ... I'd work extra hard on Sociology, Psychology and the Humanities.

The topics I hated most in college are those that I now find so very useful in understanding my fellow human beings. And as a software architect/developer, I've found that most projects fail because they don't understand their users, the problems their users are facing and why the existing solutions may have evolved. Work on classes that will help you understand the human factors (I suspect you have no problems with the technology).

On the other hand, I never recommend getting degrees in those disciplines (with the exception of perhaps a minor to go along with your STEM degree). Another useful area of study is business process modeling and business logistics as you'll understand a lot more about what problems software solves in the commercial markets.


My main dilemma in school: I loved the humanities, but I also wanted something a little more practical. For this very reason I majored in Cognitive Science at UCLA (with a focus on AI). I was required to take the CS courses and I also was given a chance to take courses under the realm of bio, philosophy, psychology, etc. I think education should require a balance. A well rounded individual doesn't necessarily make the best programmer, but does make the best member of society.


I agree with you now ... but you're much smarter than I was as a 17 year-old! (or perhaps the right word is wiser)


I didn't go to college, and am gainfully employed as a UI/UX designer and full-stack developer.

I've heard mixed reviews on whether college gives you enough "real world" value. From what I gather, it sounds like in college you'll learn more traditional "computer science" stuff, which is great if that's not your background. You'll learn classical patterns, some stuff about how to construct algorithms, memory management, big O time complexity, data structures, etc. I had the benefit of learning most of that in high school (they offered C and C++ courses and I had a wonderful teacher), and filled in the gaps by teaching myself (most of that can be learned in a few days if you have access to the internet).

There's a lot of practical knowledge that you'll need if you want to do web development somewhere that requires you to be good at your job. Things like source control (I've never heard of a university that teaches Git), quoting/estimating tasks (something that can only be learned with experience, no way around it), balancing business/tech needs, tooling/workflow optimization, how to choose an appropriate technology stack, and anything on the front-end more advanced than basic CSS, just for starters.

In short, college will NOT prepare you for the web industry. Expect to spend at least a year getting up to the level where you can have a development-oriented conversation with a real web developer. I can't speak to the rest of the industry, since my experience is pretty exclusive to the web.

It sounds like you're doing well just learning on your own, and thankfully we work in an industry where universities can't hoard all the knowledge: anything they can teach you can be found online. They just offer a structured environment in which to consume that knowledge, for those that need some extra hand-holding (probably because the high school education system is an absolute travesty, but that's a rant for another day).

One caveat: if your schooling is paid for (scholarship, parents, etc) then by all means take full advantage of the free ride. I was lucky enough to have a fiance that supported me while I spent about 6 months learning everything I needed to gain employment at a web development agency (and now I support her with that job while she's going to cosmetology school); I can't imagine how hard it would have been to learn all that while having to worry about bills. So, blow off your classes as much as you can without getting kicked out, learn everything you can about real web development, and once you have a job lined up, decide whether you really want to jump ship. Maybe start with contracting or part-time work at a dev shop somewhere.


tl;dr Use your college life to build cool stuff, just because you can. You won't have that liberty once you come out

> I might drop out next year.

My guess is that you are serious about this. As a student myself, having been thinking along similar lines for three of the four years of my college, my opinion is - finish what you started. It is absolutely true that what you learn in college is outdated. But what you get in college, and don't get once you come out, is the sandboxed environment. In college, you are free to explore at will, do whatever you want (even find co founders, if you are enterpreneural); there is no pressure. The pressure palpably increases once you come out. You can do that (do cool stuff for fun) even after college, but it's infinitely harder.

> since I already know a lot of stuff/tech

You might be the best in your college, but never let it get to your head (I'm not saying you are).

PS: It became a long reply in the end. My apologies


I think I have this luck, that in my country I can repeat one year of college (free of charge), so I will probably drop out (fail one year intentionally) next year and experiment for a year - hopefully I will be as successful as I plan.

"even find co founders, if you are enterpreneural" - I'm trying this the whole time not just in college but with everyone with IT interests. People in my country are just not that ambitious - they just want a safe job and an average salary. This one student to whom I've been talking too seems to be someone who I should hang out with.


If you don't mind me asking, what country are you in?


Slovenia (Southern part of Central Europe)


Everybody has gone through this - it's a staple of college. You're there to learn the basics and to learn how to learn.

If you aren't working on your thing and researching your own technology stack during college you're doing it wrong. Of course you're going to come out of there knowing nothing but old tech.


Personally, I would be way more interested in someone who came to me and said "Here is a class project originally written in X that I rewrote in Y because your job description included Y and not X" than in someone who had originally learned Y in school.


The most valuable lesson you will learn in college is "how to learn." Put the attitude away and start learning.


Also, fix your damn title. "Our college is learning us outdated technology". Really?


I almost commented on the title, but I was guessing the poster might not be a native English speaker. Since you've brought it up, one correct way of phrasing the title would be "Our college is teaching us outdated technology".


I refrained from commenting on it for as long as I could. I weighed the non-native-speaker possibility too, but finally old George Bush quotes pushed me over the edge.

;)


George Bush was the decider!


As smoyer said, I'm not a native English speaker. I fixed it (hopefully).


Part of college is learning to deal with jerks who attack you for no good reason, and to not be intimidated intimidated by them.

Good job learning multiple languages, you are way ahead of most of us.


"and to not be intimidated intimidated by them"

I'm not going to fault Codhisattva for calling out the mistake in the title, or even the fact that it irritated him. I'd just like to exhort everyone to also offer corrections with comments like these. If I'm attempting to communicate in another language, having people who are comfortable correcting me (assuming they know what I mean at all) is important.

And I'd also like to point out that even people writing in their native language aren't perfect. We type fast, get interrupted a lot and have local dialects, so it's unrealistic to expect perfection.

P.S. I sure hope I don't find mistakes in this post ;)


I was not intimidated (what made you think that?). I'm actually happy that someone corrected me :)


It wasn't meant to be rude. My apologies for the ambiguity between tongue in cheek and rudeness.

Were this wikipedia or Stackoverflow I'd have just fixed it - I knew what you meant, of course.

The tongue in cheekiness comes from a (famous) quote from President Bush, "Is our children learning?" that the original title invoked.


I sympathize with how you feel, and I think some of these responses miss the point and are kind of inappropriately harsh for someone asking a perfectly legitimate question. I know plenty of CS students/grads feel the same way. I certainly do.

I definitely agree that college is partly about exposure and teaching you how to learn, but it's ridiculous to say that it's not their responsibility to teach you technologies that you'll be using professionally. Yes you should take it upon yourself to learn what you want or need to know. But with the cost of tuition - they should definitely be doing more than they are.

I know at my college (graduated in 2011), there seemed to be a disproportionate offering of courses that one would use if they wanted to become a video-game developer or an enterprise software engineer. They taught us very little about how to actually program for the web.

Getting back to your question... here's a couple of things to keep in mind:

- It's not the same everywhere. There's a range of curriculums depending on what school you're at (and even within the same school). Some specialize in different things. Some have different philosophies. And some are better than others.

- The truth is, there are so many languages, libraries, frameworks, and technologies - and they're expanding in all directions faster than anyone can keep up. The only thing you can do is try to pick the ones that matter to you (based on what your aspirations are) and specialize. Anyway, given all this - imagine how hard it is for universities to keep up themselves while designing a curriculum that can fit all of their students.

As far as what you can do - I'm not sure there's a whole lot you can do within the confines of your university, unless you were super adamant to the point of organizing events/rallies or pestering the hell out of your professors and school until they made some changes. And even then, who knows if it would work, and in all the time you'd spend you could probably have taught yourself a few of the things you're pushing to have them teach you.

Here's a couple other solutions:

- Find a professor/TA you like who knows some things you want to learn, and try to get them to tutor/mentor you. (In my experience, this one is hard because everyone's busy.)

- Transfer schools. But do your research first to make sure you don't wind up in the same situation.

- Adapt by teaching yourself the things you want on the side. Tackle small projects, each with one or two new things you want to learn. (This is what the other responses were advocating, and I think you need to get in the habit of doing this regardless.)

Hope this was somewhat helpful.



