Educational institutions are not what they used to be. They no longer produce graduates who are "industry ready." This is something all of us in academia have heard often. But what is meant by "industry readiness"?
We are often told that we must update our curriculum regularly to include technologies that the industry is currently working on. Since our faculty may not be able to update themselves so quickly, we should invite working professionals into our classrooms. Students should be encouraged to work on "live projects" (whatever that means). All this is supposed to ensure that graduates hit the ground running when they join a company. Companies currently incur a lot of cost on training, and if that cost can be saved, our industry would be able to compete better in the global market.
But I still don't understand what will make students "industry ready."
In various industry forums, I ask a simple question. Will the top 50 companies that generally hire graduates of the same discipline (say, Computer Science or Information Technology) come together and tell academia which programming language they want graduates to know, and promise that four years later, when these students graduate knowing that language, they will recruit them and assign them to projects where they are required to write programs in that particular language? (And programming language is just the most basic skill. We can ask the same question about other knowledge elements and skills.)
I don't think any company can promise today that four years from now it will need only these technologies and not others. In such a situation, does it make sense to chase the dream of a graduate being ready to contribute to a project on the day of joining?
When I pose such questions, some experienced industry veterans point out that industry readiness is not about removing the training requirement completely but about reducing it substantially. Can the graduate learn on the job, picking up a new skill or a new technology in a couple of weeks? Industry readiness, as per these experts, is about having the skills to learn on one's own.
This revised definition makes sense to me. And thankfully, it is possible to train students to be industry ready as per this definition. But the folks who visit colleges for campus placements and those who attend these industry-academia workshops don't seem to articulate this definition, and therefore there is utter confusion in academia.
The usual reply to this is that industry asks for the graduate to be ready on day 1 in the hope that academia will provide graduates who are ready within a month of joining. So the day 1 demand is a negotiating position, and they are willing to settle for day 31.
And herein lies the problem: industry's lack of understanding of academia. If an academic institution has to make its graduates ready for day 1, the curriculum and pedagogy will be very different than if it has to make its graduates ready for quick learning. So it is not a matter of negotiation, since the two situations are very far apart. To make a student ready for day 1, an academic institution will have to select a few roles that it wants to prepare students for and design a curriculum that includes all the technologies and skills needed for those roles. But to make students ready for quick learning, an academic institution will have to focus more deeply on the basics, ensure that students can apply knowledge from multiple courses (hence large projects), and ensure that students can learn some things on their own (through online courses or otherwise). After this, one can be reasonably sure that the student is ready for self-learning and will pick up any new knowledge or skills in 30 days.
So day 1 readiness means a narrower focus of education, which is good neither for industry nor for the career of the student. If industry really needs people who can learn things quickly, why not articulate that need clearly?
I am seeing some changes in industry already. For the last few years, it has become common for job interviewers to ask what students have learned outside the curriculum. This is to see whether students have attempted self-learning, which is an indication of whether they will be able to pick up new knowledge and skills quickly.
If a company really wants an academic institution to prepare its graduates with specific skills and knowledge, it should recruit students very early on (say, after two years or even earlier), start paying them a salary (treat them as employees), ask them to take specific electives and do projects and internships as desired by the employer, and even ask them to take a semester off from academics to work and then come back and complete the degree. If some students are willing to sign up for that, it would be fine. But demanding that all academic institutions teach a specific technology to all their students is not in the long-term interest of students or even of industry.
Of course, all this discussion is only about 20 percent of academic institutions. The other 80 percent would not be able to prepare their graduates for day 1, day 31, or day 101, irrespective of which definition of industry readiness is used.
7 comments:
A very helpful and insightful post shared by you, sir.
Sir, the question you raised is definitely "THE" question that all educational institutions need to find an answer to.
Industry readiness is often misinterpreted. It is not only about knowing the corporate culture, dressing, good knowledge outside the curriculum, or maybe confidence; it is beyond that. I think it is the art of using knowledge in the real world with a human touch, the emotional readiness to handle real-world situations, developing flexibility in accepting ideas and people, adjusting in adverse situations, and, on top of it all, becoming a real human being.
It is not that universities and educational institutions are not taking pains to make students industry ready. Almost all of them have introduced soft-skills and personality-development programs, but these have not been able to meet the growing needs of students and industry. If we try to find the reason why they are not that effective, it is because of the program delivery, the very short duration, and the fact that at times they look like just a formality.
My suggestion is to design an intensive learning program that runs through the entire course duration; that could meet the need of the hour. For effective delivery and execution of this program, educational institutes can outsource it or hire a knowledge partner.
I agree with your overall point; it is definitely not academia's job to create students who are ready on Day 1. However, for Day 31, I would like to point out some areas that are ignored by academia (as far as I can tell), are really required by industry, and could be considered basics (in the sense that they won't go out of fashion in four years):
1. Writing working programs: the ability to write a working program in a language of your choice. 60-80% of students manage to graduate with a CS/IT bachelor's degree without the ability to write a FizzBuzz-level program that compiles and runs (see the sketch below)
2. Unit tests: Know what a unit test is, and know what and how many tests should be written for a given program to give some confidence that the program doesn't have obvious bugs
3. Tools: Ability to use a debugger; ability to use a source code control system
4. Existing codebases: Ability to make small modifications in a large program (as opposed to the ability to write a small program from scratch)
The better students have ability #1. It is rare for any student to have even one of #2, #3, and #4.
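To make #1 and #2 concrete, here is a minimal sketch, in Python, of a FizzBuzz-level program together with a small unit test. The function and test names are purely illustrative; they are not taken from any particular curriculum or company requirement, and any language would do.

```python
# A minimal FizzBuzz sketch with a unit test (illustrative only).
import unittest


def fizzbuzz(n: int) -> str:
    """Return 'Fizz' for multiples of 3, 'Buzz' for multiples of 5,
    'FizzBuzz' for multiples of both, and the number itself otherwise."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)


class TestFizzBuzz(unittest.TestCase):
    def test_representative_cases(self):
        # One case per branch gives some confidence there are no obvious bugs.
        self.assertEqual(fizzbuzz(3), "Fizz")
        self.assertEqual(fizzbuzz(5), "Buzz")
        self.assertEqual(fizzbuzz(15), "FizzBuzz")
        self.assertEqual(fizzbuzz(7), "7")


if __name__ == "__main__":
    # Print 1..20, then run the tests.
    print(", ".join(fizzbuzz(i) for i in range(1, 21)))
    unittest.main(argv=["ignored"], exit=False)
```

The test is deliberately tiny: it exercises each branch once, which is roughly the level of judgment about "what and how many tests to write" that item #2 asks for.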
Is this something academia should be teaching? If yes, does the capability to do so exist?
@Navin, Thanks for your comments. I completely agree. Does the capability exist in academia to do this? Of course, yes. Will they do it? Well, it depends on how industry articulates it, particularly what those who come for campus placements tell the college folks. Today, too many of those coming to campuses talk about a specific language, a specific tool, a specific technology. In fact, based on my own perception from talking to many employers who have come to my current and previous colleges, companies looking for a somewhat higher-end role (with a CTC higher than 10 lakh) usually look for strong basics, good projects (which would involve some testing and debugging experience), and evidence of self-learning (doing online courses outside the curriculum, for example). But companies looking for people at 3-6 lakh per annum ask for specific technologies, languages, etc. And the latter far outnumber the former, which defines the narrative of industry.
In fact, based on this perception (I don't know how to do a scientific study to validate this), I have advised some colleges that to move their students' median placement CTC from 3.5-4 LPA to a much higher level, they should start focusing more on basics and put a very strong focus on projects. Projects should get much more faculty time, with students being checked not just at the end of the semester but at least every week, if not twice a week. Students should not be able to finish projects by copy-pasting after a Google search. Faculty time is expensive but very necessary if we want our students to get jobs with a higher CTC. At JKLU, we started this new focus with the 2018-2022 batch, and I can say that our median CTC more than doubled compared with the 2017-2021 batch. Note that I am talking of the median, not the highest or the average. So a lot of students got a much higher CTC.
@Dheeraj, Thanks for your detailed response. Regarding the 3-6L CTC companies, I agree that they need to do a better job of understanding what they want from colleges and then articulating it.
Regarding the increase in median CTC you've achieved via a focus on projects: this is really a striking result. Have you written, or are you planning to write, a more detailed report on this journey? I've long advocated a stronger focus on projects, but so far the argument I used was "because they'll learn better," and I haven't seen too many results out of that. However, "they'll get better salaries" is a much more convincing argument. This makes "the right thing to do" converge with "the profitable thing to do," which looks like a winning proposition.
@Navin, I will be writing soon about all the changes we made for the 2018-22 batch. Thanks.