Thoughts on IT education - the good, bad, and the ugly
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
So, with that background, how do all of you feel about the "state of IT education" and what can be done, if anything, to make it better?
The state is bad, no getting around that. Too little is taught, and what is taught is often outdated or wrong. The pace of the educational process is often far too slow. And there is the insanely large challenge that half of incoming students don't know even the most basic tenets of the subject while the other half already have a mastery far in excess of the program.
Something that I see a lot is schools leaning towards teaching too much button pushing, which becomes outdated practically overnight and should not need to be taught, and too little teaching of fundamentals that are essentially timeless.
Like I often say, because I actually learned the MCSE material in the 1990s, I can do nearly all IT today. Nothing has really changed. But people who just memorized answers at the time are totally lost today, because the buttons to push are completely different even though the underlying technologies and factors have barely changed at all.
-
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
IT education has to be dated, by some amount. If education was bleeding edge, the person teaching the course would be learning the material with the class.
You have to do that to be an IT pro. If the professor isn't learning with the class, you've got a big problem.
Um... read that again. The professor at least needs to know what is going on. Learning is good, but they should be learning the material before the rest of the class.
It's why they are the professor.
Not really; if you need the class to hold back for you to catch up, you are the opposite of a professor.
This is a scott-ism. The point of being in the class is that you want to learn from an expert on the course material, not that you want to learn with the professor.
No, the goal is to learn. Period. Not to learn from an expert on the course material. That's both silly (who cares where you learn from, as long as you learn?) and impossible (colleges simply don't have subject-matter experts). It's not realistic for nearly any field, let alone IT.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
... how to incorporate Azure, AWS, or even Docker into the program.
Into a system admin program? Pretty casually. That's platform admin work, and it doesn't require a lot of know-how.
This is where I would think something like a course on architecture would be good. Something that teaches platforms, cloud, systems, and more holistic views of IT would make more sense. IT isn't as compartmentalized as it sounds (and as I often make it sound). Job roles might be very tight, but they all work together; they are just cogs. You need to teach the machine at some point, probably before teaching much about the cogs.
-
@jaredbusch said in Thoughts on IT education - the good, bad, and the ugly:
Also, mods add tags.
Done
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
At some point I'm being disruptive to the status quo, which adds to my peers' workload and diminishes the necessary academic consistency.
Consistency only has value if it is consistently good. Something that is consistently bad would benefit from being inconsistent.
-
@scottalanmiller I want to get stuck in the weeds! The irony is that one of our staff IT managers was doing a tour of the campuses to see our setups (we maintain a separate network and MDF on all our "cyber center" campuses), and when I mentioned the push to allow for online (synchronous webcast, not asynchronous) classes in Server Admin - and the pushback from my peers - he was quick to point out that he had a team of six IT admins that NEVER touch the racks.
My peers on the curriculum committee are adamant that we have to be a "hands-on" program, and I agree to a point. But that's only to the point that you can remote in and work on your server! After that, all I care about is that students have real-time interactions with their professors during set class times instead of randomly checking in to their online classes.
You'd be surprised how hard it is to change the name of a program, even though over the 15 years I've been here we've gone from the Computer Information Systems program under the Business School, to the CIS/Computer Information Technology program, to our own School as "Computing and Informatics" to now just the School of IT. But, when I propose we go from Server Administration to Systems Administration, it goes over like a fart in church... sigh...
-
@scottalanmiller Agreed.
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
IT education has to be dated, by some amount. If education was bleeding edge, the person teaching the course would be learning the material with the class.
You have to do that to be an IT pro. If the professor isn't learning with the class, you've got a big problem.
Um... read that again. The professor at least needs to know what is going on. Learning is good, but they should be learning the material before the rest of the class.
It's why they are the professor.
Not really; if you need the class to hold back for you to catch up, you are the opposite of a professor.
To put my two cents in: there are times you bring "deep knowledge" to a new class you're teaching, and other times necessity puts you in the position of staying a week ahead of the students, so to speak. One of my mentors once said, "if you ever want to really learn something, teach it." For instance, I have multiple degrees and industry certs, but I really think of myself as a professional educator, because that's what I've actually been doing as a day-to-day job for a decade and a half.
-
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
IT education has to be dated, by some amount. If education was bleeding edge, the person teaching the course would be learning the material with the class.
You have to do that to be an IT pro. If the professor isn't learning with the class, you've got a big problem.
Um... read that again. The professor at least needs to know what is going on. Learning is good, but they should be learning the material before the rest of the class.
It's why they are the professor.
Not really; if you need the class to hold back for you to catch up, you are the opposite of a professor.
This is a scott-ism. The point of being in the class is that you want to learn from an expert on the course material, not that you want to learn with the professor.
It's weird. I agree with you both! For instance, we require our faculty to be certified in classes that lead to a cert, so if you teach the class that corresponds to the first MCSA exam (70-410 right now) you have to have that cert, but you don't have to have passed the 411 and 412 to teach it. Should that change? I'm not sure. Also, we used to build the cert into the course, but we now have the "didactic" 3 credit hour classes that lead to a 1 credit hour "workforce preparation" class, so should we only put the cert requirement on the 1 credit hour classes so we're not short of qualified faculty to teach them? This is one of the reasons I'm on these boards now. I see SAM talk about LANless futures and what the "true" definition of IT is, and I realize that at the very least I'll see a side I might be insulated from through your perspectives. Thank you for that!
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dashrender said in Thoughts on IT education - the good, bad, and the ugly:
@jaredbusch said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@jaredbusch said in Thoughts on IT education - the good, bad, and the ugly:
Also, mods add tags.
Sorry about that.
Meh, it is not the most obvious thing in the world.
You can add tags by editing your initial post if you want, or a mod will get to it.
LOL - while not IT related - the widespread use of tags seems to be a pretty new thing. I know they have been around just about forever, but their actual use is pretty light. A good thing to toss into an intro class somewhere.
I was studying taxonomic classification in 2003; it was a major topic at the time because folksonomy of the web was the hot thing of the era, and that's where a lot of research and thought in IT was going. This was long before these kinds of communities were arising. Taxonomy and folksonomy would be good topics for an IT curriculum.
Interesting. I'll look into it. For instance, I'm teaching an intro class in Informatics this semester, and I can see where this would be a good topic.
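To make the taxonomy/folksonomy contrast concrete for an intro class, here's a minimal sketch (my own illustration, not anything from a curriculum) of a folksonomy-style tag index, where free-form user tags accumulate and structure emerges from co-occurrence; all tags and item names are invented:

```python
# Minimal sketch of a folksonomy-style tag index. Unlike a taxonomy
# (a curated hierarchy of categories), a folksonomy lets any user attach
# free-form tags, and structure emerges from usage. All tags and item
# names here are invented for illustration.
from collections import defaultdict

class TagIndex:
    def __init__(self):
        self._items_by_tag = defaultdict(set)   # tag  -> items carrying it
        self._tags_by_item = defaultdict(set)   # item -> tags applied to it

    def tag(self, item, *tags):
        for t in tags:
            t = t.strip().lower()               # normalize, as most tag systems do
            self._items_by_tag[t].add(item)
            self._tags_by_item[item].add(t)

    def items(self, tag):
        return self._items_by_tag[tag.lower()]

    def related_tags(self, tag):
        # Tags that co-occur with the given tag on at least one item.
        out = set()
        for item in self.items(tag):
            out |= self._tags_by_item[item]
        return out - {tag.lower()}

index = TagIndex()
index.tag("thread-42", "education", "IT", "curriculum")
index.tag("thread-57", "education", "certifications")
print(index.items("education"))         # both threads
print(index.related_tags("education"))  # {'it', 'curriculum', 'certifications'}
```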
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller I want to get stuck in the weeds! The irony is that one of our staff IT managers was doing a tour of the campuses to see our setups (we maintain a separate network and MDF on all our "cyber center" campuses), and when I mentioned the push to allow for online (synchronous webcast, not asynchronous) classes in Server Admin - and the pushback from my peers - he was quick to point out that he had a team of six IT admins that NEVER touch the racks.
That's correct. A system admin would never "touch" a server. In the enterprise space, or the modern anything space, the idea that a server would even be plausible to touch is absurd. Servers live in data centers managed by highly secured bench techs, and system admins do the IT work. IT work has no hands-on; bench work has hands-on. Lots of IT in the SMB still touches things because they can't afford to bring on bench staff for the rare hands-on task, but that's IT doing side work. IT work is logical, not physical, 99.9% of the time.
For "real" system admins, it is trivial to go an entire career and never see your servers (or maybe any server) in person.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
My peers on the curriculum committee are adamant that we have to be a "hands-on" program, and I agree to a point. But that's only to the point that you can remote in and work on your server! After that, all I care about is that students have real-time interactions with their professors during set class times instead of randomly checking in to their online classes.
Having done both, I'm not sure that I agree. I too have chaired that kind of program for a state university (ThanksAJ was actually a student in one of my programs, lol - coincidence there). Getting face-to-face time with a professor has some value, if the professor has a lot of value to bring. But it comes with staggering costs, like the student wasting time commuting and sitting in class - all things that take away from their time to learn. It's a huge gamble that the professors will be so insanely valuable just by having a presence around students that it will overshadow the insane costs in time involved in doing so.
Making students sit in class trains them to attend meetings as low-level unskilled labour; making them be hands-on with servers trains them for work at Best Buy. These things are distractions from learning or, worse, actively teach bad things. Do I want to hire students who know useless bench skills instead of IT? Do I want to hire ones with so little concern for their own value that they spend it commuting to class instead of learning good skills? It teaches bad financial decision making and demonstrates a lack of critical thinking - things that I find most college students struggle to explain in interviews. "Why did you choose to use your time and money in this way instead of..." and they have no answers.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
You'd be surprised how hard it is to change the name of a program, even though over the 15 years I've been here we've gone from the Computer Information Systems program under the Business School, to the CIS/Computer Information Technology program, to our own School as "Computing and Informatics" to now just the School of IT. But, when I propose we go from Server Administration to Systems Administration, it goes over like a fart in church... sigh...
Oh I know. I was the guy pushing for those changes at a college. That it is so hard to make class names mirror what they teach is one of the things that really highlights how much universities struggle with relevance.
Show me a kid that took a "server admin" class, and I'll show you a student not getting an interview. "You say you learned about IT, but your curriculum clearly says otherwise... is it that your professors have never seen IT, that the school was teaching IT by mistake when it didn't mean to, or did you take classes in the wrong field and never figure it out?"
Those are the kinds of interview questions the school is setting students up for, if they get into IT interviews. In reality, it's setting them up to not get the interviews.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
IT education has to be dated, by some amount. If education was bleeding edge, the person teaching the course would be learning the material with the class.
You have to do that to be an IT pro. If the professor isn't learning with the class, you've got a big problem.
Um... read that again. The professor at least needs to know what is going on. Learning is good, but they should be learning the material before the rest of the class.
It's why they are the professor.
Not really; if you need the class to hold back for you to catch up, you are the opposite of a professor.
To put my two cents in: there are times you bring "deep knowledge" to a new class you're teaching, and other times necessity puts you in the position of staying a week ahead of the students, so to speak. One of my mentors once said, "if you ever want to really learn something, teach it." For instance, I have multiple degrees and industry certs, but I really think of myself as a professional educator, because that's what I've actually been doing as a day-to-day job for a decade and a half.
In the university IT space, no professor has deep knowledge. It's unheard of. It's also completely unnecessary. Someone with deep knowledge would be worth way too much in the industry and likely be a bad educator (it's really rare to be really good at two things so different) and that deep knowledge would be of little to no use in a classroom.
You can be the best Linux kernel tuner in the world, but no college ever teaches Linux at that level. Not Caltech, not MIT, not RIT, no one. So having a professor with very expensive and very focused skills like that would be totally wasted and would make the classes unnecessarily expensive without adding value.
Having enough knowledge to not get the material wrong is really all that makes sense. Enough to answer incoming questions, of course, is needed. But good foundational knowledge with little bits of "up to date" is way more appropriate.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
I was studying taxonomic classification in 2003; it was a major topic at the time because folksonomy of the web was the hot thing of the era, and that's where a lot of research and thought in IT was going. This was long before these kinds of communities were arising. Taxonomy and folksonomy would be good topics for an IT curriculum.
Interesting. I'll look into it. For instance, I'm teaching an intro class in Informatics this semester, and I can see where this would be a good topic.
One other thought on this, slightly off-topic: do you think that AI might change tagging requirements in the future? In other words, even if we still teach taxonomic classification, would that only be as background for what AI starts doing "automagically"? I know that keyword analysis is getting a lot of attention in "meta-research" and topic survey papers.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
My peers on the curriculum committee are adamant that we have to be a "hands-on" program, and I agree to a point. But that's only to the point that you can remote in and work on your server! After that, all I care about is that students have real-time interactions with their professors during set class times instead of randomly checking in to their online classes.
This is old school thinking. The idea that people need to be locked up in cubicles all day to make sure things are getting done.
As mentioned by Scott, there is little reason that an IT person would ever need to occupy a space in an office building; they can do their work from darn near anywhere. But managers have this idea that if they can't see their people, then their people must not be working. This same idea kills a lot of MSP work, and heck, it kills good IT departments too. When everything is good and there are no problems, management starts to question whether IT is even needed. Of course this is crazy. If your car is running great, does that mean you shouldn't change the oil?
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
IT education has to be dated, by some amount. If education was bleeding edge, the person teaching the course would be learning the material with the class.
You have to do that to be an IT pro. If the professor isn't learning with the class, you've got a big problem.
Um... read that again. The professor at least needs to know what is going on. Learning is good, but they should be learning the material before the rest of the class.
It's why they are the professor.
Not really; if you need the class to hold back for you to catch up, you are the opposite of a professor.
This is a scott-ism. The point of being in the class is that you want to learn from an expert on the course material, not that you want to learn with the professor.
It's weird. I agree with you both! For instance, we require our faculty to be certified in classes that lead to a cert, so if you teach the class that corresponds to the first MCSA exam (70-410 right now) you have to have that cert, but you don't have to have passed the 411 and 412 to teach it. Should that change? I'm not sure. Also, we used to build the cert into the course, but we now have the "didactic" 3 credit hour classes that lead to a 1 credit hour "workforce preparation" class, so should we only put the cert requirement on the 1 credit hour classes so we're not short of qualified faculty to teach them? This is one of the reasons I'm on these boards now. I see SAM talk about LANless futures and what the "true" definition of IT is, and I realize that at the very least I'll see a side I might be insulated from through your perspectives. Thank you for that!
Having an MCSE to teach an MCSA class would not be bad, especially as the MCSA is quite broad and could cover almost anything in the MCSE. Although requiring the professors to be certified rather than knowing the material... personally, I would not care at all. Both are very junior certs. Hopefully anyone teaching is so far beyond what is in those as to make the process silly. But, to be fair, I've never had a professor in that position, so requiring the certs might be a good compromise for the real world.
But having an MCSE doesn't come close to making someone an expert on Windows administration; that's pretty close to entry level. Not quite, but you expect juniors in the field to get MCSEs on the job. By the time you get to mid-level or senior admins, those certs are long since trivialized. An expert on Windows administration wouldn't be in a position of being judged by having certs.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
I was studying taxonomic classification in 2003; it was a major topic at the time because folksonomy of the web was the hot thing of the era, and that's where a lot of research and thought in IT was going. This was long before these kinds of communities were arising. Taxonomy and folksonomy would be good topics for an IT curriculum.
Interesting. I'll look into it. For instance, I'm teaching an intro class in Informatics this semester, and I can see where this would be a good topic.
One other thought on this, slightly off-topic: do you think that AI might change tagging requirements in the future? In other words, even if we still teach taxonomic classification, would that only be as background for what AI starts doing "automagically"? I know that keyword analysis is getting a lot of attention in "meta-research" and topic survey papers.
Sure, that would be expected. AI is just replacing the humans in that case.
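To give a rough sense of what "AI replacing the humans" can look like at the simplest end of tagging, here's a minimal sketch of automatic tag suggestion via TF-IDF keyword scoring with scikit-learn; the sample posts are made up, and a real system would use far richer models plus human review:

```python
# Minimal sketch: suggesting tags for forum posts automatically using
# plain TF-IDF term scoring via scikit-learn. The sample posts are
# invented; production systems would go well beyond this.
from sklearn.feature_extraction.text import TfidfVectorizer

posts = [
    "Our server admin curriculum should cover cloud platforms like AWS.",
    "Students learn Windows administration and MCSA exam material.",
    "Remote system administration means never touching the racks.",
]

vectorizer = TfidfVectorizer(stop_words="english")
scores = vectorizer.fit_transform(posts)
terms = vectorizer.get_feature_names_out()

for post, row in zip(posts, scores.toarray()):
    # Take the three highest-scoring terms in each post as suggested tags.
    top = sorted(zip(row, terms), reverse=True)[:3]
    tags = [term for score, term in top if score > 0]
    print(f"{post[:45]}... -> suggested tags: {tags}")
```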
-
@scottalanmiller Yeah, I put the scare quotes around "deep knowledge" for a good reason, I think. If I had to characterize myself, at any rate, I would say "foundational knowledge" is more descriptive of what I bring, personally.
-
@dashrender said in Thoughts on IT education - the good, bad, and the ugly:
This is old school thinking. The idea that people need to be locked up in cubicles all day to make sure things are getting done.
I'll pull out my usual "this isn't old school thinking" point. Old people weren't stupid; this is just bad thinking. Old school good thinking would have fixed this the same as modern good thinking does.
It's common to associate "old school" with "bad decisions", but they are different. People who could work remotely did, even fifty or a hundred years ago. It's just that the ability to work remotely has enabled larger percentages of people to do so.
Old school decision making would have led to the same work-from-home rates as today. It was old school factors, like not having computers, that made people have to sit in offices all day. Not bad decisions.