...or any unmanaged programming, for that matter, apart from a small amount of assembler in year 1. It looks like the dedicated academics have won out over the practical people.
I was involved in a conversation on CodeProject about college courses (a US-based individual noted that their college offered COBOL!) and wanted to link to my old BSc Computing Science course (which I graduated from back in 2001). Then I noticed something horrible...
Java has taken over, at least above first year (Introduction to Systematic Programming is still taught in Ada). Object-Oriented Programming is now taught in Java (when I took the course, it was C++) and a new course, Internet Programming with Java, takes the place of the C course, Software for Systems Programming.
Now I'll admit that a lot of my fellow students didn't 'get' OOP because they got bogged down in the ugly mechanics of C++. The lecturer wasn't too hot, IMO, and wasn't properly focused on the OOP mechanisms of C++. Also, at the time, the course compiler was GCC 2.x, which didn't handle templates or the standard C++ library very well, so we were using a lot of the C subset of the language.
It looks like the Computer Graphics course might still be taught in C, but that raises another problem: time spent teaching the language is time lost from the actual meat of the course.
But how can you send out new graduates with no practical knowledge of low-level systems programming? If they have to break out of the cosy managed environment, how will they cope? Someone still has to write the VM and runtime, after all, and the system dependencies. Are we failing to train our software engineers to write the next generation of VMs?