As someone who has seen more than a few years since high school and college days, I look around at the generations that have come along and wonder what is happening to our educational system.
I have had my own travails with an educational system that I feel has failed, and is failing, my children, and I cannot understand what has happened in the years since I graduated.
It is also not lost on me that I sound like many of another generation who swore that the young were naïve and unknowledgeable. The difference between those people and me is that my opinion is shared by a much greater number of people, and objective testing, along with graduation rates, attests that my assumption is correct.
Yesterday, news of a survey was released about the suitability of applicants for many IT positions, and it revealed that many were considered woefully inadequate in the preparation they had been given.
Almost forty percent of the 376 organizations surveyed reported hiring individuals who either lacked the skills needed to perform their jobs or had gaps in their skill sets that compromised their performance. In the same survey, only 8 percent of respondents said that the graduates were well trained and ready for work.
Looking further into the problem, it is apparent that those doing the hiring have a clearly different idea of what is needed than the institutions where people are educated.
Is it the difference in ages between the hiring managers and the graduates, or is it really the educational disadvantages? As I grow older, I know I forget that I know things that younger people don’t, simply because those things are not really important to many (though I feel they should be).
I remember the first computer class I took at junior college, which started out with the history of computing. Many may believe that this is not of value, but I believe that in any bit of learning, there are things picked up that will be useful somewhere, at some point in life. The class then went into the structure and performance of the 6502 chip, which powered the first Commodore machines, and then the Intel 8088, so that we knew more than a bit about hardware. We went through all the subsystems of the computer; they are somewhat different now, but since many machines still use the same instruction set, modified to be sure, that knowledge should still be relevant.
My son took the same beginning computing class at the same college a couple of years ago, and there was very little about the inner workings of the computer. The furthest he got into the architecture of the PC was identifying the motherboard and daughter boards in one. The bulk of his first class in computer science (not general computing) concentrated mostly on using Microsoft Office, which may be helpful in his further endeavors but is sadly lacking for someone wanting to learn about computers and their operation from a physical standpoint.
It is much like my outlook on programming. Most outlines for a degree in programming that I have seen in the past 5 years (admittedly only from colleges in California) show nothing of the languages that were at the start of it all, and I believe that leaving out some of the basics is not a good thing. How successful a programmer can you be if you have never seen how much more compact code can be when written in assembler rather than produced by a C compiler? How well can you write code if you have no clue how it interacts with the hardware? (I’ve heard many say that knowledge of hardware is totally unnecessary for a software programmer.)
The surveyed organizations seem to be looking for more of the kind of things that I studied, as the article from InfoWorld showed that they want better preparation in many areas -
- 77 percent want schools to provide programming skills
- 82 percent seek database skills
- 76 percent would like schools to provide analysis and architectural skills
- 80 percent seek problem solving and technical skills
This skills gap apparently doesn’t stop organizations from hiring professionals with little, if any, previous experience. The survey found that nearly half of the responding companies hire new IT employees straight out of school. Two-thirds of organizations do require at least some college internship experience among their hires, according to the study.
This appears to run counter to the ideas put forth earlier in the survey, as the skills gap should not be a problem if two-thirds of hires are getting work experience as interns, unless all they are doing is grabbing snacks and coffee (or is that pizza and Jolt cola?) for the staff. Whatever experience the new hires are receiving is evidently not being recognized as relevant to the tasks they need to accomplish.
Still, it is clear that the skills that are wanted, and deemed needed, are not coming from the formal education offered. That is too bad, and it should be remedied, because fixing it may very well answer those who believe that United States corporations should not be hiring IT staff from other nations.
So who do we blame for the problems of the IT departments of today and tomorrow? Is it the fault of the students, for not grabbing with gusto the information needed, whether part of the college curriculum or not? Is it the fault of the colleges, for being those ivory towers that the political right believes they are, not having much grasp on reality? Does it go further into the past, in those primary or secondary schools that failed to plant the seeds of knowledge, and the love for the knowledge itself?
Or is it simply wishful thinking from those doing the hiring, expecting the people they bring on to be more than ready on day one, practically jumping into the chair at the beginning of each workday?