The information you learn is outdated by the time you graduate.
However, the core of any computing field is understanding change and gaining the ability to learn quickly. For example, in Software Engineering you will learn pseudocode, a language-independent way of describing algorithms. With pseudocode you essentially create a template that can be used to implement the algorithm in any language (C, Java, OCaml, VB, etc.), as in the sketch below.
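For instance, here is a tiny sketch of my own (a toy example, not from any particular textbook): pseudocode for a binary search, followed by a direct translation into C#.

    using System;

    class Demo
    {
        // Pseudocode template:
        //   function BinarySearch(A, target):
        //       low <- 0; high <- length(A) - 1
        //       while low <= high:
        //           mid <- floor((low + high) / 2)
        //           if A[mid] == target: return mid
        //           if A[mid] < target:  low <- mid + 1
        //           else:                high <- mid - 1
        //       return NOT_FOUND

        // The same template written out in C#.
        static int BinarySearch(int[] a, int target)
        {
            int low = 0, high = a.Length - 1;
            while (low <= high)
            {
                int mid = low + (high - low) / 2; // same idea as the pseudocode, but avoids integer overflow
                if (a[mid] == target) return mid;
                if (a[mid] < target) low = mid + 1;
                else high = mid - 1;
            }
            return -1; // not found
        }

        static void Main()
        {
            int[] sorted = { 1, 3, 5, 7, 9 };
            Console.WriteLine(BinarySearch(sorted, 7)); // prints 3
        }
    }

You could translate that same pseudocode into C, Java, or OCaml just as mechanically; the template is what carries over, not the syntax.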
You will also learn historical algorithms and ways of thinking, and then their more modern versions, which teaches you what kinds of changes actually take place.
Like any science field, computer science has two sub-fields: applied and research. Those who work in the research field are right at the forefront, building new languages and operating systems and all kinds of other magnificent constructs. Those who work in the applied field focus on writing code and building applications. Every program you see was built by people in the applied field, on top of techniques created by people in the research field. (Yes, I simplified that; it is easier to explain this way.)
If you aren't a researcher, then you really don't have to worry too much. I mean, C#, which I would say is the most widespread language for applications, has been around since 2001 as part of the .NET Framework. So if you had been a C# programmer since, say, 2003, you would have noticed only incremental changes: the addition of anonymous types, shorthand syntax, lambda expressions, the null-coalescing operator, and a few other nifty things that make your life easier.
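To give a feel for what those additions look like, here is a quick toy sketch of my own showing three of them (lambdas, anonymous types, and the ?? null-coalescing operator):

    using System;
    using System.Linq;

    class Demo
    {
        static void Main()
        {
            int[] numbers = { 1, 2, 3, 4, 5 };

            // Lambda expression (C# 3.0): pass a small function inline.
            var evens = numbers.Where(n => n % 2 == 0);

            // Anonymous type (C# 3.0): an ad-hoc object, no class declaration needed.
            var point = new { X = 1, Y = 2 };

            // Null-coalescing operator (C# 2.0): fall back to a default when null.
            string name = null;
            string display = name ?? "anonymous";

            Console.WriteLine(string.Join(", ", evens)); // 2, 4
            Console.WriteLine(point.X + ", " + point.Y); // 1, 2
            Console.WriteLine(display);                  // anonymous
        }
    }

Notice that all of these are conveniences layered on top of a language that already worked; none of them forced anyone to relearn how to program.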
Computer science is more volatile than other science fields, but if you love change and new challenges, then it's a great field.
Isn't it exciting that in a decade you will still have new things to learn and new challenges to overcome?
Computing is invading every part of our lives, and investing in knowledge of how it works is the best possible investment.