Why do software developers decide to invent new programming languages? Why do existing languages evolve over time? According to Rob Curtis, Senior Manager at CloudHealth by VMware and a speaker at the recent +Tech Literacy Download event, programming languages evolve to support the needs of the rapidly changing technology industry. In his session, Evolution of Coding Languages, Curtis detailed how programming languages and technologies have been created and modified over recent decades to optimize performance and allow developers to focus on business-specific logic.
The Early Years
The first widely used programming languages were Fortran and COBOL, which were revolutionary in that they abstracted away much of the machine-level code developers previously had to write by hand. Today, both languages survive mainly in legacy systems, maintained by a shrinking number of programmers.
The next significant programming language to come along was C, developed in the early 1970s at Bell Labs. Today, C remains one of the most popular languages according to the TIOBE index. While C delivers excellent performance and direct low-level memory access, it lacks the organizational structure that developers were starting to need in the 1980s. This led to the development of C++, a language built on C that brought object-oriented programming (OOP) into the mainstream. OOP is now a staple of software development: it lets developers organize their code into reusable classes, each of which bundles its own data and methods.
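To make that idea concrete, here is a minimal sketch of a reusable class (written in Python for brevity rather than C++; the BankAccount example is invented for illustration, not taken from Curtis's session):

```python
# Illustrative only: a small class showing the core OOP idea of bundling
# data (the balance) with the methods that operate on it.
class BankAccount:
    def __init__(self, owner, balance=0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount


# Each instance carries its own state, and the same class can be reused
# anywhere an account is needed.
account = BankAccount("Ada")
account.deposit(100)
account.withdraw(40)
print(account.balance)  # 60
```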
Web 1.0
Web 1.0 is the name given to the early web. In the early 1990s, the Internet consisted primarily of simple, static web pages joined together by hyperlinks. As the web matured and gained users, the need for more complex control flow, visuals, and forms grew. This led to the development of PHP, a scripting language for web development that allowed for more dynamic page content. During this time, languages like Java and Python, which are still incredibly popular today, were also developed. Both languages are high-level, meaning that many of the machine-level details are abstracted away, so developers can focus on their specific business logic instead of low-level details.
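As a rough illustration of what "high-level" means in practice, the short sketch below (a made-up Python example) counts word frequencies without the developer ever touching memory allocation or pointers; the language runtime handles those details:

```python
# Illustrative sketch: counting word frequencies in a few lines.
# Allocating, resizing, and freeing the underlying memory are all handled
# by the runtime, so the code reads as business logic only.
from collections import Counter

text = "the quick brown fox jumps over the lazy dog the fox"
word_counts = Counter(text.split())

print(word_counts.most_common(3))  # [('the', 3), ('fox', 2), ...]
```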
Web 2.0
In the early 2000s, the industry again changed dramatically. Social media sites like MySpace and Facebook were growing in popularity, and web pages were becoming much more dynamic and richer in content. These changes drove far more sophisticated use of front-end languages like JavaScript. On the back end, implementations were similarly becoming much more complex than in decades prior. Developers needed more structure to maintain increasingly complicated systems, which led to the rise of software patterns like Model-View-Controller (MVC) to separate front-end interfaces from back-end databases and business logic. MVC frameworks like Ruby on Rails and Django allow developers to focus on their differentiated implementation without worrying about writing and maintaining boilerplate code.
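As a loose sketch of that separation of concerns (plain Python invented for illustration, not an actual Rails or Django application):

```python
# Illustrative MVC sketch: the model holds the data, the view decides how it
# is presented, and the controller connects a request to the other two.

class Article:                      # Model: the data and any business rules
    def __init__(self, title, body):
        self.title = title
        self.body = body


def render_article(article):        # View: presentation only
    return f"<h1>{article.title}</h1><p>{article.body}</p>"


class ArticleController:            # Controller: handles the request flow
    def __init__(self, store):
        self.store = store          # a real framework would use a database

    def show(self, article_id):
        article = self.store[article_id]
        return render_article(article)


# A framework like Rails or Django supplies the routing, database access,
# and templating boilerplate around this same structure.
store = {1: Article("Hello", "MVC keeps concerns separate.")}
print(ArticleController(store).show(1))
```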
Trends of Today
The technologies and programming languages in use today are, once again, shaped by industry requirements. The development of public clouds like Amazon Web Services (AWS) and Microsoft Azure was a response to companies' growing need to scale services cheaply and rapidly, without maintaining physical infrastructure. The cloud also enables faster release cycles, with some companies deploying hundreds of changes per day. This allows users to receive new features and bug fixes quickly, rather than waiting for an annual update as in decades past.