Do certifications matter?

Technology has long been a permanent staple of people’s lives. An entire generation has been born and matured without ever knowing what it’s like not to have a computer at home. A newer generation is growing up not even seeing the need for computers, viewing handheld devices like cell phones and tablets as more than sufficient for everyday life. No website nowadays is viewable only on a PC, and popular content management systems, like WordPress and Joomla, automatically create a mobile version of every site.

When technology advances happened every 3-5 years, it was possible to ‘keep up’. Technology was complex, and its terminology stayed out of everyday life. People heard of ‘Oracle’ and thought of a prophet, not a relational database. ‘Basic’ meant ‘simple’, not a programming language. When Microsoft beta-tested a new language at Boeing, one capable of being visually alluring on a desktop, and called it ‘Visual Basic’, most people didn’t realize that their lives had changed. When the internet became commercialized by the mid-90s, it helped spread technology. Phone and telecommunications companies saw an opportunity to make money and began offering faster internet connections: at first through basic modems at speeds higher than 9,600 bits per second, then DSL and T-1s, and later through newer technologies like FiOS and internet delivered over cable lines instead of telephone lines.

The internet became properly commercialized through the ‘.com boom’ of the late 90s, and the average consumer found practical uses for getting online, be it buying something through an e-commerce site or finding information via a search engine. Suddenly the world of technology became drastically more complex. Until then there had been just a few languages in use: Java, C++, Visual Basic, and perhaps Perl. Basic got replaced by VB; RPG was specific to the AS/400, which never properly replaced Unix; and Cobol and JCL belonged to mainframe systems and largely became obsolete after Y2K. There were a few others, like MUMPS, LISP and FORTRAN, capable of greater things, but they never evolved and died out as well (except for Caché, which carried MUMPS forward to modest popularity).

Suddenly, technology companies completely revolutionized technology itself, and n-tier architecture was key to that. Microsoft got tired of selling its technology one piece at a time. At first it was about writing ActiveX controls, creating COM objects, making them distributed, and, if you were super cool, perhaps engaging Microsoft Transaction Server. Then Bill Gates gave us the .NET Framework, and everything became n-tiered automatically. Around the same time, in 2000, Sun dropped the Java 2 Enterprise Edition. Java Beans became EJBs; JSPs, and later Struts, became a must-have in Java web development. Hibernate revolutionized the way Java connected to databases. Then Spring came. Before too long, REST APIs became the agreed-upon standard; commercial relational databases lost their bragging rights to MySQL, thanks to Linux’s popularity; and later NoSQL (Mongo, I’m talking about the likes of you) offered a focus on the data itself instead of the complexity of tables.

As recently as 20 years ago, an A+ certification guaranteed you a job with decent pay and career growth. A J2EE certification ensured that whether you needed a visa sponsored or wanted to telecommute full time, you’d still have your way. MCSEs were making $150/h on 3-year contracts at major investment banks. A CCIE was a golden ticket to pretty much rule the world and live to brag about it. Knowledge vs. experience has always been a tough call, but when it came to new technologies, a certification was a fair indication that you ‘knew your stuff’. It wasn’t long before consulting companies caught on and began shipping would-be Americans from India, China, and Russia to the US through the H-1B program, after getting them a valuable certification. ‘Point and click’ developers flooded the market. Missing the architecture behind their development efforts, and largely ignoring the importance of the QA process, they gave certifications a bad name. Most hiring managers who have ever hired a developer have gone through hundreds of resumes with all the right buzzwords and even valuable certifications, only to be shocked at how little the owners of those resumes understood about what they did.

The next advance in technology, the DevOps model, only simplified this further. It was no longer about the autocomplete feature in a development editor; the whole purpose was to ‘automate everything’. Mindless developer wannabes familiarized themselves with a certain set of steps and a few basic cause-and-effect concepts, and began to call themselves ‘DevOps experts’. In an age when quants basically got replaced by AI/NLP techies, and HackerRank made even the initial phone screen unnecessary, companies stopped caring about their employees’ personalities, desires, and interests, and made it all about ‘can you do x, y, and z?’ Firms’ kitchenettes got stocked with coffee in various flavors, snacks, and flavorful ice cream, and when someone complained about the lack of knowledge in the development process, they got labeled a ‘dinosaur’.

Certifications stopped offering any advantage in the hiring process, short of a few exceptions here and there. A certification certainly still demonstrates dedication to certain technologies and makes it easier to identify a more or less relevant short-term consultant. Yet, in an age when even a college degree doesn’t matter that much, why care about a certification? The U.S. Bureau of Labor Statistics reports that more than a quarter of IT workers in the United States do not hold a bachelor’s degree or higher. About half of all IT job postings don’t require a degree and pay on average $83,000 per year. Long gone are the days when people marveled at Bill Gates building Microsoft without a college degree. Odds are, you’re working with an IT specialist at your firm who doesn’t have a degree, and you’ve never had a single complaint about that person.

Look at the costs: the exams are generally cheap. Associate-level AWS certification exams cost $150, and the professional-level exams run $300. A regular J2EE certification exam is $50. An MCSD certification exam fee is $165. Even a CCIE will cost you only $315 for the written exam ($1,400 for the lab exam, though). Plenty of free training is available on the internet, either on the sites of the firms offering the certifications or through open-source tech forums. Even calling tech support is no longer a good use of your time, as Googling the question is far faster than trying to navigate Dell’s phone prompts.

The only thing that matters when looking for a job is experience. In a world of rapid and frequent technology advances, a shortage of genuine technical talent, and often senseless interview processes, nobody cares how pretty the certification logo looks on your PDF resume. It’s a new era, and employers are mimicking the Apostle Thomas: they won’t believe it until they see it for themselves.