Thoughts on IT education - the good, bad, and the ugly
-
@irj said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 So this is one of those get-certified-while-getting-a-degree schools, like WGU?
Yes and no. I don't think my college is going to start employing "course facilitators" instead of professors, and simply point students to the material and expect them to grind through it. On the other hand, as a 2 year college we're not diving too deep into theory and abstracted concepts because of the time scale we're at. Does that clarify it? I do know one of our graduates is doing the WGU thing right now as part of a BS and is getting their MCSA as part of it. Personally, I think we use the certs as external validation that we're staying relevant, but when I see the A+ and other certs not keeping up (the latest A+ cert finally eliminated floppy drive questions!) I worry we're slipping behind as well.
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
Back to curriculum... one thing that might be well suited for an AS focused world (which I believe that you said was the program) is simply going to "Topics in IT 1" and "Topics in IT 2" and so forth. Make them so general that the department simply then gets to decide the order and content so as to build up a core understanding by the end.
That is a component, for sure. We're doing an Associate of Applied Science, to be clear, so keep in mind our degrees correspond to HVAC, Electronics, Supply and Logistics, and similar levels of more-than-just-training. We're NOT ITT, for instance, and even as we're trying to meet demand for our students, we will not promise more than we can deliver or become a diploma or training mill.
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
The reality is, teaching IT as a series of singular, long, focused disciplines doesn't work. That class in systems administration would really, really benefit from some discussion around networking. That networking class makes no sense without a systems class. That programming class would have been good before that systems automation class. That database class depends on a certain amount of system knowledge. That systems class would have benefited from people understanding database workloads and so forth.
Why teach them separately when you can teach them together?
We do, but we have separate programs within the School, so you can get a degree (AS or AAS) in Database Management, but not without courses in Network Communications, "A+" Hardware/Software, Computing Logic, Systems Analysis and Design... you get the idea. I think the idea of "blended" classes deserves more attention, but as you probably know, the "silo" architecture is ever-present.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller We're doing 2 year degrees, so getting students into intro jobs is just fine for us, to put it in perspective. What I can't do and won't promote is becoming a cert mill. Our industry-led advisory boards are very clear: students need to get soft skills from us as much as the technical skills. That, IMHO, is why college is still relevant in IT. If anything, a college graduate has learned how to learn, and the speech and history and English classes produce well-rounded applicants at the very least.
Right, certs should be totally uncoupled from a collegiate academic process.
http://www.smbitjournal.com/2016/12/legitimate-university-programs-are-not-certification-training/
If a job candidate who had gotten certs as part of a degree program got through screening, they'd be fired for dishonesty if we found out.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@irj said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 So this is one of those get-certified-while-getting-a-degree schools, like WGU?
Yes and no. I don't think my college is going to start employing "course facilitators" instead of professors, and simply point students to the material and expect them to grind through it. On the other hand, as a 2 year college we're not diving too deep into theory and abstracted concepts because of the time scale we're at. Does that clarify it? I do know one of our graduates is doing the WGU thing right now as part of a BS and is getting their MCSA as part of it. Personally, I think we use the certs as external validation that we're staying relevant, but when I see the A+ and other certs not keeping up (the latest A+ cert finally eliminated floppy drive questions!) I worry we're slipping behind as well.
Certs are not in any way a validation that you are relevant, and certainly not ones that are not even in the right field. Certs have a place, a good one, but they are VENDOR TOOLS, not industry ones. It's not appropriate to use them in an academic setting in any way unless, as you had originally stated, they are used as a guide to the "level" of knowledge, but never as a guide to the actual knowledge.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
The reality is, teaching IT as a series of singular, long, focused disciplines doesn't work. That class in systems administration would really, really benefit from some discussion around networking. That networking class makes no sense without a systems class. That programming class would have been good before that systems automation class. That database class depends on a certain amount of system knowledge. That systems class would have benefited from people understanding database workloads and so forth.
Why teach them separately when you can teach them together?
We do, but we have separate programs within the School, so you can get a degree (AS or AAS) in Database Management, but not without courses in Network Communications, "A+" Hardware/Software, Computing Logic, Systems Analysis and Design... you get the idea. I think the idea of "blended" classes deserves more attention, but as you probably know, the "silo" architecture is ever-present.
As long as there is a silo at that level, I'd put the relevance at "zero". What good is a system admin that doesn't have a foundation? What good is a DBA that doesn't know systems basics? They are all useless.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
"A+" Hardware/Software
You have a college program in "middle school level education with a pointless certification for minimum wage labour at Best Buy"?
What is the purpose of a curriculum like that? All of the jobs are minimum wage or similar. And all are as open to high school students as to graduates.
This is like requiring basic typing for a degree.
-
@scottalanmiller I can assure you that we never lose sight that these are vendor tools. In fact we simply use them as tools to get our students what we believe is a good academic education. Having said that, this is why I'm here with all of you, to make sure we keep our eyes on the ball.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller I can assure you that we never lose sight that these are vendor tools. In fact we simply use them as tools to get our students what we believe is a good academic education. Having said that, this is why I'm here with all of you, to make sure we keep our eyes on the ball.
I think that system administration as a concept carries a lot more value than teaching closely to any cert track. Using the cert tracks as guides to currency (not relevance) is important, but I would set them as a low bar for the total level (an MCSE is what, six months of work, not two years?), as only one piece of the puzzle, and as overly vendor-nostic (I made that up). I think a more agnostic approach with more industry principles would be more valuable, and students would then be well prepared to get the right certs for them afterwards (or during, on their own time).
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
"A+" Hardware/Software
You have a college program in "middle school level education with a pointless certification for minimum wage labour at Best Buy"?
What is the purpose of a curriculum like that? All of the jobs are minimum wage or similar. And all are as open to high school students as to graduates.
This is like requiring basic typing for a degree.
Exactly. We eliminated typing as a class for a business admin degree over a decade ago, for instance, and I think we're at the point now where the A+ is headed that way also. Even so, with open enrollment we have to have certain barrier classes to see if our students are serious about what we teach, and if they wash out of an A+ class, then it's a good thing for them as well as us, right?
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
Exactly. We eliminated typing as a class for a business admin degree over a decade ago, for instance, and I think we're at the point now where the A+ is headed that way also.
A+ should never have been there. I took the A+ on version 1 in the 1990s and it was offensive, to say the least. It's not just unprofessional, but outright incorrect and deceptive. It is required for entry level bench work at the worst (but largest) shops like Best Buy and Staples. But that is its sole value. No program, ever, should have used it in any way. If anything, it's actually better today, not worse. That's how bad it has been.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
Even so, with open enrollment we have to have certain barrier classes to see if our students are serious about what we teach, and if they wash out of an A+ class, then it's a good thing for them as well as us, right?
I get this point, but I don't agree. I'm 100% for open enrollment; I believe that this is the only good way to stay competitive as a university. I take open enrollment schools more seriously than others because they focus more on actual academic competitiveness rather than on arbitrary and artificial factors.
But having a useless class that might actually teach things that need to be untaught, or that, at the very least, wastes time in an already short curriculum, time that is needed for actual skills, is not a good idea. Have students wash out of useful classes. Test their technical ability and interest. An A+ class would wash me out, and I'm a seven-figure person in the industry. I might not be everyone's favourite student (actually, I often am), but I'd drop out of college if I took an A+ class and thought that that was what IT was going to be about. Instead of testing their academic ability, you are testing their patience in some cases.
No need for that. Make them take a real class with real material. If they wash out due to lack of ability, fine, you have lost nothing (compared to now). But if they don't wash out because real material held their interest, you've won not only a student you would have lost, but possibly your best student.
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
Even so, with open enrollment we have to have certain barrier classes to see if our students are serious about what we teach, and if they wash out of an A+ class, then it's a good thing for them as well as us, right?
I get this point, but I don't agree. I'm 100% for open enrollment; I believe that this is the only good way to stay competitive as a university. I take open enrollment schools more seriously than others because they focus more on actual academic competitiveness rather than on arbitrary and artificial factors.
But having a useless class that might actually teach things that need to be untaught, or that, at the very least, wastes time in an already short curriculum, time that is needed for actual skills, is not a good idea. Have students wash out of useful classes. Test their technical ability and interest. An A+ class would wash me out, and I'm a seven-figure person in the industry. I might not be everyone's favourite student (actually, I often am), but I'd drop out of college if I took an A+ class and thought that that was what IT was going to be about. Instead of testing their academic ability, you are testing their patience in some cases.
No need for that. Make them take a real class with real material. If they wash out due to lack of ability, fine, you have lost nothing (compared to now). But if they don't wash out because real material held their interest, you've won not only a student you would have lost, but possibly your best student.
I concur that A+ doesn't have anything to do with actual IT. On the "wash out" comment, my thought process was more along the lines of: if they can't handle that material, then they're really going to struggle with classes further up the line. I commonly tell my students that A+ isn't anything more than a "baby cert", too.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
I concur that A+ doesn't have anything to do with actual IT. On the "wash out" comment, my thought process was more along the lines of: if they can't handle that material, then they're really going to struggle with classes further up the line. I commonly tell my students that A+ isn't anything more than a "baby cert", too.
Right, I get the wash out bit. But they will fail with whatever class you give instead of the A+, right? So the A+ would not add any value there.
Example, instead of A+ material, you offer Net+ material. Slightly more advanced, way more applicable. They will wash out at the exact same point, but you won't have to waste a whole class for other students in order to do it.
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
Example, instead of A+ material, you offer Net+ material. Slightly more advanced, way more applicable. They will wash out at the exact same point, but you won't have to waste a whole class for other students in order to do it.
Actually, we had a Network+ class but traded it for a Cisco I class. Even so, the A+ has been there the whole time.
-
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
IT education has to be dated, by some amount. If education were bleeding edge, the person teaching the course would be learning the material with the class.
You have to do that to be an IT pro. If the professor isn't learning with the class, you've got a big problem.
Um... read that again. The professor at least needs to know what is going on. Learning is good, but they should be learning the material before the rest of the class.
It's why they are the professor.
My thinking is this: how is a professor supposed to be an expert on a new technology? It's new for everyone, so everyone will be learning it together. You can't postpone teaching something for 10 years to give professors a chance to become experts on it.
I probably learned Server 2016 professionally before professors even considered it for course material... (just as a simple example)
-
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@irj said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 So this is one of those get-certified-while-getting-a-degree schools, like WGU?
Yes and no. I don't think my college is going to start employing "course facilitators" instead of professors, and simply point students to the material and expect them to grind through it. On the other hand, as a 2 year college we're not diving too deep into theory and abstracted concepts because of the time scale we're at. Does that clarify it? I do know one of our graduates is doing the WGU thing right now as part of a BS and is getting their MCSA as part of it. Personally, I think we use the certs as external validation that we're staying relevant, but when I see the A+ and other certs not keeping up (the latest A+ cert finally eliminated floppy drive questions!) I worry we're slipping behind as well.
Certs are not in any way a validation that you are relevant, and certainly not ones that are not even in the right field. Certs have a place, a good one, but they are VENDOR TOOLS, not industry ones. It's not appropriate to use them in an academic setting in any way unless, as you had originally stated, they are used as a guide to the "level" of knowledge, but never as a guide to the actual knowledge.
How do you teach IT or Systems Administration without teaching students about any technologies they would be using on the job? You can't administer a System (which is from a vendor) if you don't know anything about it.
So if a course wants to teach Linux or Windows Server administration... Well surely covering many of the things the "vendor tool" covers is a great start... Competencies, measured skills, etc.
-
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
Example, instead of A+ material, you offer Net+ material. Slightly more advanced, way more applicable. They will wash out at the exact same point, but you won't have to waste a whole class for other students in order to do it.
Actually, we had a Network+ class but traded it for a Cisco I class. Even so, the A+ has been there the whole time.
See, this I feel is quite bad. Net+ leans towards being academic in nature - it is a general-purpose cert. A Cisco class is, I feel, totally inappropriate in a collegiate setting.
The difference might seem subtle, but it is dramatic. The idea of the Net+ is around fundamental IT skills. It's general purpose. It is for all people in IT. It's "understanding how things work."
A Cisco class means you leave this world; it is a mix of advertising / vendor promotion and trade skill around that vendor. A trade class instead of a foundation. No longer academic or collegiate in nature. Not appropriate for a university to teach; this is for the ITTs of the world.
-
@tim_g said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@dustinb3403 said in Thoughts on IT education - the good, bad, and the ugly:
IT education has to be dated, by some amount. If education were bleeding edge, the person teaching the course would be learning the material with the class.
You have to do that to be an IT pro. If the professor isn't learning with the class, you've got a big problem.
Um... read that again. The professor at least needs to know what is going on. Learning is good, but they should be learning the material before the rest of the class.
It's why they are the professor.
My thinking is this: how is a professor supposed to be an expert on a new technology? It's new for everyone, so everyone will be learning it together. You can't postpone teaching something for 10 years to give professors a chance to become experts on it.
I probably learned Server 2016 professionally before professors even considered it for course material... (just as a simple example)
Exactly. If you want your professors to have lots of experience and time on something, you guarantee that that stuff is ancient by the time the students graduate.
-
@tim_g said in Thoughts on IT education - the good, bad, and the ugly:
@scottalanmiller said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 said in Thoughts on IT education - the good, bad, and the ugly:
@irj said in Thoughts on IT education - the good, bad, and the ugly:
@worden2 So this is one of those get-certified-while-getting-a-degree schools, like WGU?
Yes and no. I don't think my college is going to start employing "course facilitators" instead of professors, and simply point students to the material and expect them to grind through it. On the other hand, as a 2 year college we're not diving too deep into theory and abstracted concepts because of the time scale we're at. Does that clarify it? I do know one of our graduates is doing the WGU thing right now as part of a BS and is getting their MCSA as part of it. Personally, I think we use the certs as external validation that we're staying relevant, but when I see the A+ and other certs not keeping up (the latest A+ cert finally eliminated floppy drive questions!) I worry we're slipping behind as well.
Certs are not in any way a validation that you are relevant, and certainly not ones that are not even in the right field. Certs have a place, a good one, but they are VENDOR TOOLS, not industry ones. It's not appropriate to use them in an academic setting in any way unless, as you had originally stated, they are used as a guide to the "level" of knowledge, but never as a guide to the actual knowledge.
How do you teach IT or Systems Administration without teaching students about any technologies they would be using on the job? You can't administer a System (which is from a vendor) if you don't know anything about it.
So if a course wants to teach Linux or Windows Server administration... Well surely covering many of the things the "vendor tool" covers is a great start... Competencies, measured skills, etc.
Well, the first thing is that a course in college should not be teaching Linux or Windows administration; that's a trade school's job. They should be teaching concepts of administration. Now, that said, they need operating systems to use for that. But teaching concepts instead of specifics is the core of academic work and is very different from teaching to a vendor cert.
Remember, collegiate academic work isn't for the purpose of teaching on-the-job skills, but to teach someone the fundamentals and concepts so that those specific skills will make sense. You aren't teaching them which button to push, but why a button like it needs to be pushed.
Example... you don't learn the details of NTFS and ReFS, but you do learn file system concepts so that when someone tells you the details of NTFS and ReFS you can immediately understand them, and understand other IT concepts when the market changes.
This is a problem I see with most college grads. Instead of learning IT concepts, they just memorize the motions to go through to accomplish a task. They are only trained to follow a script; they don't understand why they do things or what those things mean.