In recent years, it’s become popular to call the act of programming computers “coding.” Some people claim that there are differences, that there are no differences, that it depends on the level of the language used, or that coding implies informality and therefore is less thoughtful or skilled than programming. Wikipedia seems to be trying to parse that difference in its definition of computer programming.
My personal experience, having worked in software development through the period when this vocabulary shift happened, is that both the acts and the terms slowly merged. When I started programming (back in the stone age), HTML and websites did not exist. My job title was software engineer and my job was programming computers. The term “coding” simply didn’t exist.
Programming a computer meant designing algorithms and creating the machine instructions that would react to the real world, do complex math or data manipulation, and output results. This applied equally to programming jet navigation software and to programming games. (And I did both!)
After the web and HTML appeared, people were hired in technical positions to make websites. HTML is a markup language, not a programming language. HTML “marks up” the text, just like a human editor does, and controls how text is displayed, like making certain words bold. Way back when, it was pretty simple and making websites was called scripting or coding.
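To illustrate the markup idea, here is a minimal, made-up snippet of the kind of HTML those early websites used, with tags that only control how text is displayed:

```html
<!-- The tags annotate the text's presentation; nothing here computes anything -->
<p>Welcome to my site. This sentence has some <b>bold</b> words in it.</p>
```

The browser decides how to render the marked-up text, which is part of why writing it felt different from programming.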
You programmed computers—you coded websites. I can’t say that in EVERY job in every industry this was true, but in my world at that time this was a big distinction in hiring, job descriptions, and pay.
As time went on, websites and the languages used to create them became more complex. Websites are no longer passive, simple text manipulation. The line between the network and computer became less distinct, and the functions, tools, and practices merged.
There was never one day when people said, OK, coding now equals programming; it just happened. Coding or programming? Whichever you choose, it’s a vocabulary shift that is here to stay.