College Doesn’t Prepare the Workforce (Also, That’s Not Its Job)

My pal AK writes about higher ed topics. I’m behind on my RSS, and just now spotted his piece Cut the Bull: the Demise of the Baccalaureate has been Greatly Exaggerated, a critique of an Inside Higher Ed story by Ray Schroeder about how colleges could change to better support workforce readiness. I was pulled in; this is a topic that never fails to hold my interest. I thought I’d respond to a few of the original author’s points, as well as some of AK’s critiques.

In his piece, Schroeder frames the problem this way,

Enrollments at American colleges and universities have been on a decade-long skid. This past year, enrollments dropped by 600,000 or 3.5 percent. While some of those drops may have been prompted by the pandemic, the trend is clear—fewer and fewer students are entering college.

I’m not sure 3.5% is a lot, let alone a lot during a pandemic, given all the restrictions placed on education right now in the interest of public health. But I read this and ask myself: how is the issue of college affordability and student debt not even acknowledged in this statement?

AK starts his critique with the following passage,

Ray asks (and answers): “Are we teaching the competencies and emphases that will be required to thrive in 2025? I fear not”

My answer is “colleges probably believe they do,” and that they compete for students on how well their curricula prepare students for the future. Whether they’re achieving that goal is something a healthy institution should constantly debate.

AK goes on to address Schroeder’s theme of moving academic records onto the blockchain,

There is nothing out there that prevents people from creating their own learner record (“transcript”). In fact, things like LinkedIn, CVs, and other means already exist to document your learning. We don’t need blockchain for it.

The thing blockchain could bring here is verifiability. No employer wants to call a university to verify that someone actually holds a degree from it. So they mostly trust the résumé, and applicants have been known to falsify résumés.
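To make the verifiability point concrete, here’s a minimal sketch of the underlying idea rather than any particular blockchain product: the issuing school anchors a hash of each credential to a public, tamper-evident ledger, and an employer recomputes the hash from whatever the applicant presents and checks it against the ledger, no phone call required. The ledger below is simulated with a plain Python set, and all names and fields are made up.

```python
import hashlib
import json

def credential_hash(credential: dict) -> str:
    """Hash a credential record deterministically (canonical JSON, sorted keys)."""
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Simulated public, tamper-evident ledger of hashes the issuer has anchored.
published_hashes = set()

# Issuer side: the school anchors the credential's hash at graduation time.
issued = {
    "student": "Jane Doe",
    "degree": "BA, Design",
    "institution": "Example State University",
    "year": 2021,
}
published_hashes.add(credential_hash(issued))

# Employer side: recompute the hash from the presented credential and check it.
claimed = dict(issued)
print(credential_hash(claimed) in published_hashes)   # True: matches the anchor

# Any alteration (say, an inflated degree) produces a different hash and fails.
claimed["degree"] = "MS, Computer Science"
print(credential_hash(claimed) in published_hashes)   # False: doesn't verify
```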

That said, I mostly see blockchain as a solution looking for a problem… though it is a solution with tremendous promise once it matures past this Wild West stage, and especially once it sheds its horrific environmental footprint.

AK shifts to job requirements,

The idea that a college degree isn’t required at high-tech companies (and by extension companies in general) is a myth.

It’s increasingly common to see “BA or equivalent experience” in job listings for design and development roles at tech companies. I know successful developers who did not complete a college program and have advanced their careers solely on the strength of their professional work. The trick is: how do you get industry experience without getting it in school? Some applicants complete online coursework and take on personal, pro bono, and solo consulting projects to demonstrate that they can perform the role.

Finally, Ray says: “It seems that the “clients” of higher education—both the students and the employers—recognize that the baccalaureate is too long and all too often teaches dated material rather than preparing students for the future […]”

I think this conflates multiple things, each with a grain of truth. Based on what I’ve observed in tech and design, colleges typically cannot keep their courses current with the latest industry skills. Last I heard, the fastest a college course can have its curriculum adjusted and approved is two years. For a developer, that’s stale knowledge.

But, there is an enormous body of work that is evergreen, and I think colleges should focus on that—more on this in a moment.

[Schroeder] Shorter, just-in-time sequences of courses could better address the emerging needs

One thing colleges could consider for degrees that serve faster-moving industries is to fit the evergreen core of the BA into three years, then offer a fourth year modeled on the “tech bootcamp,” which goes all-in on industry-specific skill building. Perhaps the bootcamp is on campus, perhaps it’s in another location. Perhaps it is co-run with industry partners.

I cannot handle people equating a college degree with job training. Whenever you join a company, it will have particular methods that new employees need to learn. No school will produce a new hire who is ready to contribute at 100% within their first month of employment. That is not the intent of college.

My team recently hired an experienced designer who will start with us soon. We have been planning future projects with his contributions in mind, and we cannot expect him to perform to his full potential for… a while! We are fooling ourselves if we think recent graduates should be immediately impactful. My company has what appears to be a fantastic internal “boot camp” specific to the industries we serve and the toolset we work with. On our team, we have an “early career professional” (the IBM jargon for someone in their first professional role after college) who went through this six-week intensive, and he had high praise for the program.

Many, perhaps most, companies are not set up to train new hires at this level, so you may see unreasonable requests from hiring managers. Companies are wired to avoid spending money wherever they can, so they will try to externalize those costs. That said, I think students would benefit from getting all the life-improving benefits of a traditional college experience while also arriving at that first gig with some experience in the latest trends in their field.

Perhaps we could split the difference: businesses hire students who show promise after three years, and pay for their fourth year if the student focuses on a role the employer needs.

To tie it together, I think college is still a relevant and special experience. There’s a reason it continues to exist after hundreds of years. We should do whatever we can to make sure every student for whom college is the right choice has the opportunity to attend without financial penalty.

I think some people who are too heads-down in the high-tech industry (and perhaps other industries? I can only speak to my observations) undervalue the non-industry-specific skills a student learns in college… composition, scientific thinking, presentation skills, working independently on deadlines, and a general understanding of society. Students aren’t getting a lot of ethics courses in high school, for example. Goodness knows, with AI, data mining, “free services,” and myriad privacy concerns, we could use more people in tech who understand ethics. This is not to say college is the only way for every person to find their way to the type of career I have. Fortunately, Ray concedes this point in his piece,

There is still room for the liberal arts in developing critical perspectives, thought processes and essential skills and abilities.

Thanks, Ray. In closing, I agree that we should always be examining the relevance of coursework in higher ed, and we should change with the times. But a good retrospective also identifies what we should keep doing, and I think there’s still a lot of good happening in higher ed.
