How Colleges Fail Computer Science Students
Cybersecurity is an emerging field with huge potential for any university graduate or computer science autodidact, driven by companies' increased reliance on computer systems and the worldwide proliferation of smart devices. According to Cybercrime Magazine, there are 3.5 million cybersecurity job openings globally, 500,000 of them in the United States alone. This need will not be going away soon, either. The Bureau of Labor Statistics predicts a 28% increase in demand for information security analysts between 2016 and 2026, an increase of 28,500 jobs. Now let's explore why colleges are not preparing undergraduates for these amazing opportunities.
How Did We Get Here?
The technological age has revolutionized society, and the possibilities of what the world can build now seem endless. However, those endless possibilities also translate into an equally diverse range of cyber threats.
1970s - The First Computer Worm
Researcher Bob Thomas created Creeper, a program that could move across the ARPANET, leaving a trail as it went; it is generally regarded as the first-ever ‘computer worm’. Ray Tomlinson, the inventor of email, wrote the program Reaper, which chased and deleted Creeper, making it the very first example of antivirus software (2).
1980s - Birth of the Commercial Antivirus
Andreas Lüning and Kai Figge released their first antivirus product for the Atari ST, a platform that also saw the release of Ultimate Virus Killer in 1987 (2).
1990s - The World Is Now Online
The world soon went online. Organized crime groups saw this digitalization as a potential source of revenue and started to steal data from individuals and governments via the web. By the mid-1990s, network security threats had increased exponentially, and firewalls and antivirus programs had to be mass-produced and integrated to protect the public (2).
2000s - Threats Increase
Cyber attacks became more prevalent. Malicious attackers exploited the openings they saw as governments started to crack down on cybercrime. Information security professionals were suddenly in high demand and had to step up to the new challenge.
Present
Cyber attacks are growing at a rapid pace, and every day more companies are putting their operations under the control of technology. This creates huge security risks if those systems are not properly managed.
Skills Employers Require
A basic understanding of application architecture (front end, back end, database), systems administration and management, networking, and virtualization software
General programming and software development concepts, plus software analytics skills
Programming languages - Java, C/C++, and scripting languages such as PHP, Python, Perl, or Shell (see the sketch after this list)
A bachelor's degree in Computer Science, Information Systems, or a related field
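To make the scripting baseline behind that list concrete, here is a minimal, entirely hypothetical Python sketch of the kind of small security task an entry-level hire might be asked to automate: hashing a file for integrity checking and testing whether a TCP port is reachable. The file name, host, and port are placeholders rather than real targets, and production tooling would add logging and error handling.

# Minimal sketch of two routine security checks: a file integrity hash
# and a TCP reachability test. Uses only the Python standard library.
import hashlib
import socket

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print(sha256_of_file("example.bin"))     # placeholder file name
    print(port_is_open("example.com", 443))  # placeholder host and port

The point is less about the language than the habit: the skills employers list above come down to being able to automate small, well-defined checks like these quickly and correctly.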
Generally, employers are unwilling to bring in candidates who want to learn and train them on the job. This creates a problem because many skilled and experienced workers gravitate toward emerging technologies they find more interesting: AI, machine learning, data science, and so on.
Employer Hypocrisy
There were no prestigious cybersecurity courses or YouTube tutorials when the world first started seeing cyber threats. In the early days, cybersecurity was called “communications security”, a name later amended to “information assurance”. Many of the workers in these fields did not have IA or computer science degrees.
“The history of cyber and its development could be likened to the development of the medical field. As technology improved, adversaries mutated and techniques developed to beat adversaries, the roles of cyber developed. Before the internet and cloud, we had mainframes, desktops, POTS lines and ISDN lines. The way people attacked those were more specific to the technology. We didn’t have the interconnectivity or speeds we have now. OSes were different and less intertwined with us as people,” says Anthony Dipietro, an IT specialist at the National Security Agency.
Jumping forward, workers in the ’90s and early 2000s were assessed more on their ability to learn on the job, adapt, and analyze than on their graduate degrees. It became important for employers to see their employees’ strengths in action through demonstrations of skill.
Today, employers expect students to already have experience before giving them entry-level jobs. On top of this, colleges are not giving students the hands-on experience that would allow them to “immediately contribute solid code to a team after graduation”, says Michael Taylor, applications and product development lead at Rook Security.
Colleges Fail to Meet the Times
There is an enormous IT security skills gap, and the research proves it. According to a study by CloudPassage, “not one of the top 10 U.S. computer science programs (as ranked by the U.S. News & World Report in 2015) requires a single cybersecurity course for graduation”. The results get even more frightening as you move down the rankings: among the top 36 computer science programs, only one school (the University of Michigan) requires computer science students to complete at least one cybersecurity course before graduation.
With the plethora of job opportunities awaiting college graduates, it seems odd that colleges and universities still treat cybersecurity as an afterthought in education rather than a priority. With cyber threats escalating every day, we need to train developers at the beginning of their careers to build security into their skill set and their project development.
Have an interest in cybersecurity and want to learn more? Let’s talk.