Why do universities teach outdated, useless stuff? My guesses:
- For the university, it fills out offerings and takes up credit hours, keeps the tuition dollars flowing
- For the teacher, it's something they already know how to teach, so it doesn't require nearly as much effort to teach as something more useful but maybe less familiar
- Universities are trusted with the decisions of what to teach and don't face much short-term accountability, so there's no real downside to teaching a useless course for another year
- They probably don't know it's useless. (They also don't care to find out because of the aforementioned points)
My graduate program did not teach UML, but I did learn it during my undergrad program. It was a relatively small part of the major software engineering course. It introduced the idea of formally specifying software, and it forced me to reexamine, visualize, and otherwise integrate what I was simultaneously learning about things like interfaces and inheritance. It was presented as an educational tool, not as something we would be using in industry. Far from useless or out of date in an educational context.
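That mapping between diagram and language feature is the educational payoff. As an illustration (my own hypothetical example, not from the original coursework), a UML generalization arrow like `Shape <|-- Circle` corresponds directly to code such as:

```python
from abc import ABC, abstractmethod
import math

class Shape(ABC):
    """Abstract supertype -- the class the UML arrowhead points at."""
    @abstractmethod
    def area(self) -> float: ...

class Circle(Shape):
    """Concrete subtype -- the tail end of the generalization arrow."""
    def __init__(self, radius: float):
        self.radius = radius

    def area(self) -> float:
        return math.pi * self.radius ** 2
```

Drawing the arrow first and then writing the classes (or vice versa) is exactly the kind of back-and-forth that cements what inheritance and abstract interfaces actually mean.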
There's a bureaucratic reason too. Changing curricula is not fast and can take years (depending on the institution of course).
My department profs actually got in a bit of trouble with the university because they took the course number reserved for undergrad/graduate mentoring (something like a "special studies in XXX" placeholder, usually for independent or small-group study that was special enough for course credit) and used it to create their own courses, outside the review process for approving a new course with a distinct number and credit count.
The reason they did it in the first place was that changing the curricula for existing courses, removing course numbers, or creating new ones would take 4-8 semesters of bureaucracy to get approved and completed, by which time some of the material was obsolete. One of the profs got fired and the rest quit, eventually.
"Useless" is a strong word. Definitely a bit less useful than intended though.
Before UML, it's hard to capture the state of corporate software development that allowed the insanity to take place. I mean, UML was the marriage of two different approaches to drawing object models that were locked in a battle: OMT and the Booch method. There weren't tons of open forums for discussion and debate like the internet has now; there were conferences and such, and these guys were basically trying to create formal methods for objects in a vacuum.
It was kind of existential stuff for a lot of the smaller players in the industry; everyone saw value in this newer approach to building software. "Reusable components" seemed huge. Tooling was expensive, training was expensive. Microsoft was moving at a scary rate; hitch your cart to the wrong horse and it could cost you the company... On some of the usenet forums, about the most open discussion there was at the time, I read debates about the virtue of C++-style multiple inheritance vs single inheritance, and there were product matrices for programming tools that had check boxes for crap like that. C++ and CLOS both supported multiple inheritance, so to the casual observer they were "better." Now, I've never seen serious industrial software written in CLOS, or anyone even considering it, but it "had the features." It was just a different and crazy time. Kind of amazing how open source/free/libre has altered things; the entire culture of building software is different and probably more healthy.
I’m on the advisory board of my school’s comp sci department. Each of us advisors has our own experience and perspectives on what useful things the students should learn. Sometimes I’m arguing that no, they probably don’t need to learn RPG, just because that’s what one of my colleagues sees a lot in their branch of industry. In turn, they argue that some of my recommendations are more useful at SF tech startups than in long-term positions in the companies local to the school.
Without those various perspectives, you end up with students learning all kinds of goofy things just because no one said, nah, they’re probably not going to need that.
That was my take on it. How do you get students to practice the process of thinking conceptually first, then evaluate how thorough the planning is, without some tool like UML? It's one thing to lecture your students on the necessity of planning ahead. It's quite another to evaluate whether or not they know how to plan ahead.
Granted, UML is hardly used anywhere, but I must've learned 30 different specific software tools in college that I never used outside of college. However, I've used something like each one of them. I took a technical drawing class in high school and I still use some of the techniques I learned in it, even though absolutely no one uses t-squares, triangles, and actual paper in modern technical drawing today.
There's a fair argument re: how much planning and conceptualizing should be done ahead of starting the Agile process, and also a fair argument re: what that planning should look like (crude flowchart? UML-compliant class diagram?). But in rejecting the UML tool, are we also rejecting the idea of advance planning? Like, how completely do you need to reject advance planning before it takes the Agile loop to reveal that the customer actually needs software that uses an observer pattern?
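For anyone who hasn't run into it, the observer pattern mentioned above is small enough to sketch in a few lines (a minimal illustration with invented names, not tied to any particular UML notation or codebase):

```python
class Subject:
    """Holds a list of observer callbacks and notifies each on change."""
    def __init__(self):
        self._observers = []

    def attach(self, callback):
        """Register an observer; it will be called on every notify()."""
        self._observers.append(callback)

    def notify(self, event):
        """Push the event to every registered observer, in order."""
        for callback in self._observers:
            callback(event)

# Hypothetical usage: two independent parties react to one event source.
log = []
orders = Subject()
orders.attach(lambda e: log.append(f"email: {e}"))
orders.attach(lambda e: log.append(f"audit: {e}"))
orders.notify("order shipped")
```

The point of the rhetorical question stands either way: a one-box-and-two-arrows diagram of this relationship takes about a minute to draw, which is cheap insurance compared to discovering the need for it mid-sprint.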