I remember back in the day I used to think computers were some magical entity that could do all kinds of stuff and ran on pure magic, and that only an incredible genius could ever hope to understand how they really work. Now that I'm not 8 years old and am in college learning electrical engineering, computers don't really seem all that complicated or difficult to understand anymore (at least not the basic principles).
Basically, as was stated before, computers and people interface using what's known as an "ISA", or Instruction Set Architecture; x86 is an example of one. On the human side what you see is a low level language like assembly, or a high level language like C++; either way it can be 'translated' into machine code (I guess originally people did this by hand, but now computers do it for you). The machine code is something the computer can understand, and it's just binary. In a simple example you might think of an instruction as having 4 binary fields like the following:
1. instruction to perform (like add, subtract, etc...)
2. first operand
3. second operand
4. place to store the answer
So, the machine code is agreed ahead of time to have these 4 fields in order. If you know anything about digital logic then you understand that you could take a digital code like "0100" and have it correspond to a given unit, like an adder. So if we use the code "0100-1000,0111,0110" given our ISA (and let's assume 0100 means add), we could write it in words as "add memory location '1000' to location '0111', and store the result in memory location '0110'". On the chip itself the codes could be used to activate the different parts: in the rest state everything is turned off, and when the chip gets the code it activates the read ports for the two operands, then activates the write port for the answer location, and then activates the digital adder. I dunno if that really helps or not, but it sorta shows how humans and computers can interface. Once you know how to get the computer to do what you want, you can introduce another layer of abstraction (like Windows) to make the computer more user friendly. So instead of typing in a million lines of code, you just click a button on your mouse, and the memory on the computer already has stored all the instructions it should execute when you do that.
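To make that concrete, here's a tiny Python sketch of the made-up ISA above. Everything in it is an assumption for illustration: the opcode "0100" meaning add, the 4-bit fields, and the starting memory values are invented for this example, not any real machine's encoding:

```python
# Toy ISA: an instruction is four 4-bit binary fields, separated by
# spaces here for readability: opcode, operand 1, operand 2, destination.
# These encodings are made up for illustration only.

memory = {0b1000: 5, 0b0111: 3, 0b0110: 0}  # assumed initial contents

def execute(instruction):
    opcode, src1, src2, dest = instruction.split()
    if opcode == "0100":  # our agreed-upon code for "add"
        a = memory[int(src1, 2)]  # read port 1
        b = memory[int(src2, 2)]  # read port 2
        memory[int(dest, 2)] = a + b  # write port + adder

execute("0100 1000 0111 0110")
print(memory[0b0110])  # 8, since locations 1000 and 0111 held 5 and 3
```

A real chip does the same decoding in hardware rather than software: wires driven by the opcode bits activate the adder, and the address bits select which read and write ports get turned on.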
I think its probably best to try to learn it from one side to the other, so you can see how the layers of abstraction are built. For example, from the bottom up:
1. first understand how the physical properties of semiconductors allow you to create electrical devices called transistors
2. understand how transistors can be used to create logic gates (NAND, NOR)
3. understand how logic gates can be strung together to make functional units (registers, decoders, adders, caches)
4. understand how these units can be combined to create a functioning processor
5. learn how to control this processor using machine code (binary)
6. learn how to abstract binary machine code to higher level languages like assembly, then even higher to C++, Java, etc...
7. learn how these high level languages can be written to create programs people can use without having to know any of the stuff mentioned before.
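As a taste of steps 2 and 3, here's a minimal Python sketch that builds the common gates out of nothing but NAND, then wires them into a one-bit full adder (a piece of the adder unit mentioned above). The gate definitions and full-adder wiring are textbook-standard, but the code itself is just a toy model, not how real hardware gets designed:

```python
# Step 2: everything below is built from a single primitive gate, NAND.
def NAND(a, b):
    return 1 - (a & b)

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def OR(a, b):
    return NAND(NOT(a), NOT(b))

def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

# Step 3: gates strung together into a functional unit, a one-bit
# full adder: adds bits a and b plus a carry-in, produces sum + carry-out.
def full_adder(a, b, carry_in):
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10
```

Chain a few of these together (each one's carry-out feeding the next one's carry-in) and you have a multi-bit adder, which is exactly the kind of unit the "0100" opcode in the earlier example would activate.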