While a computer is simply a machine that takes in a series of instructions and executes them in order, most people have a hard time grasping this fact. As Arthur C. Clarke wrote:
Any sufficiently advanced technology is indistinguishable from magic.
My own experience stands as evidence of that statement.
As a seasoned programmer, I find my own experience of learning to program instructive. I grew up with a small computer, the ZX Spectrum, which had a BASIC interpreter built in. While BASIC is outdated today, back then it was a true beginner's language, and the command line was king. Comparing that to the graphical interfaces of today, I am not surprised when people shy away from the command line and opt to point and click instead. As computers became more powerful, our understanding of them became weaker.
Growing up with computers, I learned their true power: not that they are smart, but that they are dumb and fast. Human beings tire of repetitive tasks, but computers follow the tiniest instructions to the letter. The disadvantage is that we must specify these instructions in great detail. Take the example of preparing a mug of hot cocoa. For a human, the recipe is simple: heat 1 cup of milk, add 1 tbsp of cocoa and 2 tsp of sugar, and stir. A computer, on the other hand, needs the instructions spelled out: take mug, take milk jug, open milk jug, pour milk into mug, close milk jug, return milk jug to fridge, and so on. Even these are simplified, and they still do not get us as far as a mug of hot milk. People find it hard to specify instructions in such detail, and this is probably one of the prime reasons programming seems so hard.
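To make the level of detail concrete, here is a minimal sketch in Python of the cocoa example. The step names are hypothetical, purely for illustration: each "instruction" is recorded one at a time, the way a computer would execute them.

```python
# A toy sketch of the cocoa example: every explicit step is one
# "instruction". The step names here are invented for illustration.

steps = []

def do(step):
    """Record one explicit instruction, as a computer would execute it."""
    steps.append(step)

do("take mug")
do("take milk jug")
do("open milk jug")
do("pour milk into mug")
do("close milk jug")
do("return milk jug to fridge")
# ...and we still have not heated the milk, let alone added the cocoa.

print(len(steps), "instructions just to get milk into the mug")
```

Six instructions in, and the milk is still cold; a full recipe at this granularity would run to dozens of steps.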
Early computers were even more literal: they had to be programmed by setting
switches; other machines used punched cards to provide the instructions. There
was no human-readable "language" to speak of, only a series of numbers, or more
accurately, a pattern of on and off, that told the computer what these
instructions were. Later, we developed assembly language, which was a way to
describe these sequences of on and off in a more human-readable fashion.
Instructions like add b or mul c were common, and each corresponded directly to the on-off pattern that executed the matching add or multiply operation.
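The mapping from mnemonic to bit pattern can be illustrated with a small Python sketch. The opcodes below are invented for illustration; real instruction encodings vary by machine.

```python
# A toy illustration: assembly mnemonics are just human-readable names
# for fixed bit patterns. These opcode values are invented, not from
# any real instruction set.
OPCODES = {
    "add b": 0b10000000,
    "mul c": 0b10010001,
}

for mnemonic, opcode in OPCODES.items():
    # Show the mnemonic next to the on-off pattern it stands for.
    print(f"{mnemonic!r} -> {opcode:08b}")
```

An assembler is, at its core, a table lookup like this one: it translates each mnemonic into the bit pattern the machine actually executes.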
As time progressed, we developed high-level languages. Each line in these languages corresponds to several low-level instructions, so that we can now say print "Hello" instead of save string "Hello", copy string to output buffer, write output buffer to screen. Languages like Python make the task of programming much simpler, allowing the programmer to focus on the problem, not on the details.
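The contrast can be shown in Python itself. The sketch below is simplified (the real implementation of print does more), and uses an in-memory buffer as a stand-in for the output buffer:

```python
import io

# High level: one line, and the language handles everything.
print("Hello")

# Lower level: a rough sketch of the steps print performs for us.
buffer = io.BytesIO()                   # stand-in for the output buffer
data = "Hello".encode("utf-8")          # convert the string to bytes
buffer.write(data)                      # copy the bytes into the buffer
buffer.write(b"\n")                     # print appends a newline for us
assert buffer.getvalue() == b"Hello\n"  # the buffer holds what print wrote
```

One high-level line stands in for the whole sequence of encode, copy, and write steps; that compression is exactly what a high-level language buys us.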
Note: This is an essay I wrote for the Programming for Everyone course back in February.