How does the Turing Machine form the basis of our computers?


yhelothar

Lifer
Dec 11, 2002
18,408
39
91
I'm trying to understand the big picture of how a CPU works so universally that it can do unlimited kinds of calculations.

The Turing Machine is supposed to be the most basic, prototypical version of this. It seems so simple that I'm having difficulty understanding how such a simple machine can do such complex things.

So it can read, write, and store symbols, and it has a table of instructions telling it how to interpret the input.

I suppose the magic should be in the table, but reading the wiki page on it is totally eluding me right now with all of its esoteric-looking symbols.

Can anyone explain this to me like I'm five?
 

Murloc

Diamond Member
Jun 24, 2008
5,382
65
91
In the basic computer architecture course I took, the Turing machine wasn't even mentioned. We covered algorithms first, then how von Neumann's machine does the same job (with a basic logic scheme of the CPU first, and a bus-based CPU architecture after). We also skimmed the control unit inside the CPU, the part that actually executes the steps, looking at both microprogrammed and hardwired designs. So now I have more or less an idea of how things actually get done, at least in a super-simple, nonexistent CPU.

This page pretty much explains how it actually executes instructions:
http://www.cdf.toronto.edu/~ajr/258/notes/micro/one-bus.html
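
If it helps, here's a toy fetch-decode-execute loop in Python (the instruction set is made up purely for illustration, nothing like a real ISA), which is roughly the cycle that control unit drives:

Code:
# Toy fetch-decode-execute loop: the core cycle of a von Neumann machine.
# The instruction set here is invented purely for illustration.
memory = [("LOAD", 5), ("ADD", 7), ("STORE", 0), ("HALT", None)]
acc, pc, running = 0, 0, True          # accumulator, program counter

while running:
    opcode, operand = memory[pc]       # fetch the next instruction
    pc += 1
    if opcode == "LOAD":               # decode and execute
        acc = operand
    elif opcode == "ADD":
        acc += operand
    elif opcode == "STORE":
        memory[operand] = acc          # write the result back to memory
    elif opcode == "HALT":
        running = False

print(memory[0])                       # 12: the stored result of 5 + 7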
 

iCyborg

Golden Member
Aug 8, 2008
1,327
52
91
I'm trying to understand the big picture of how a CPU works so universally that it can do unlimited kinds of calculations.
If you're going to mention theoretical concepts like the Turing machine, then I have to point out that theoretically, and in a strong sense, CPUs can do only very limited kinds of calculations. Just google "computable function".
To give you some idea: consider just integer functions (functions from integers to integers). There are uncountably many of these, i.e. they are far more numerous than the integers; it's the exact same relationship as between the cardinalities of the real numbers and the integers.
You can also prove that the set of computable functions is only countably infinite. This was quite surprising in the early-to-mid 20th century, but with the widespread use of computers it seems almost obvious today: think of a computer program. It's stored as a binary file, which is just a bunch of 0s and 1s, so you can look at it as one (gigantic) integer. Every program maps to some integer, which means that the number of "kinds of calculations" a CPU can do is countably infinite. So, theoretically, almost all functions aren't computable, i.e. for almost every function there's no algorithm that computes it.
See also "halting problem" if you're interested in this.
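
To make the encoding concrete, here's a tiny Python sketch (the filename is just a placeholder) that views any binary file as one gigantic integer:

Code:
# Any compiled program is just bytes, and bytes are just one big number.
# "a.out" is a placeholder; any binary file works the same way.
with open("a.out", "rb") as f:
    data = f.read()

as_integer = int.from_bytes(data, byteorder="big")
print(as_integer)  # the whole program, viewed as a single (gigantic) integer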

The Turing Machine is supposed to be the most basic, prototypical version of this. It seems so simple that I'm having difficulty understanding how such a simple machine can do such complex things.

So it can read, write, and store symbols, and it has a table of instructions telling it how to interpret the input.

I suppose the magic should be in the table, but reading the wiki page on it is totally eluding me right now with all of its esoteric-looking symbols.

Can anyone explain this to me like I'm five?
Another surprising result of the 1930s was that several different models of these abstract machines were proposed: Turing machines, register machines, lambda calculus, etc., and all of them turned out to be equivalent.

Yes, the Turing machine may seem a bit esoteric, but think of it this way: all you have to do to prove that, say, a modern x86 CPU is equivalent to a Turing machine is take the x86 instructions one by one and translate each into a corresponding Turing machine program. I remember we did that for register machines, which are conceptually quite similar to modern computers. "Equivalent" here means it can do anything an x86 CPU can; it has nothing to do with whether it's practical or fast. And this is not that hard once you get the hang of Turing machines; most college exam problems are more difficult than, say, using a Turing machine to do XOR, addition, or multiplication.
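
If it helps to see the "table" actually doing work, here's a minimal Python sketch of a Turing machine (my own toy example) that flips every bit of a binary string. Each table entry maps (state, symbol read) to (symbol to write, head move, next state):

Code:
# A Turing machine is just this table plus a tape and a read/write head.
# Entry: (state, symbol read) -> (symbol to write, head move, next state)
table = {
    ("scan", "0"): ("1", +1, "scan"),   # flip 0 to 1, move right
    ("scan", "1"): ("0", +1, "scan"),   # flip 1 to 0, move right
    ("scan", " "): (" ",  0, "halt"),   # hit a blank: we're done
}

def run(tape_input):
    tape = dict(enumerate(tape_input))  # sparse tape: position -> symbol
    head, state = 0, "scan"
    while state != "halt":
        symbol = tape.get(head, " ")    # unwritten cells read as blank
        write, move, state = table[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).rstrip()

print(run("10110"))  # prints 01001

Swap in a different table and you get a completely different machine; the "hardware" (tape, head, state register) never changes. That's where the magic you're asking about lives.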

It's probably not an explanation for a five-year-old, but one cannot expect a textbook's worth of material to be conveyed in a forum post. There's a reason these things are studied in college, and usually not in the first year or two...
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Starting with something as abstract as a Turing machine can actually be a very difficult way to learn how computers work. Do you know any programming languages (even BASIC or JavaScript will do)? If not, I think it's easier to learn a simple programming language (for example, write a program to make a ball bounce around the screen) and then learn how a computer runs those programs. This site uses an approach I really like, and covers everything from transistors up.
 

tcsenter

Lifer
Sep 7, 2001
18,418
293
126
Starting with something as abstract as a Turing machine can actually be a very difficult way to learn how contemporary computers work.
FTFY. It was a great way to learn how computers from the 1960s worked.

See, things build upon things, until the thing no longer resembles much of the thing.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
CTho9305 said:
Starting with something as abstract as a Turing machine can actually be a very difficult way to learn how contemporary computers work.

FTFY. It was a great way to learn how computers from the 1960s worked.

See, things build upon things, until the thing no longer resembles much of the thing.

I don't think that's really correct. The IBM System 360 was from 1964, and as far as I can tell, it is much closer to a modern computer than to a Turing machine. The Universal Turing machine Wikipedia article contains this quote from the mid-1950s:
"Turing's theory of computable functions antedated but has not much influenced the extensive actual construction of digital computers. These two aspects of theory and practice have been developed almost entirely independently of each other. The main reason is undoubtedly that logicians are interested in questions radically different from those with which the applied mathematicians and electrical engineers are primarily concerned. It cannot, however, fail to strike one as rather strange that often the same concepts are expressed by very different terms in the two developments." (Wang 1954, 1957:63)

So, I don't believe your statement is correct. That said, my knowledge of 50-plus-year-old computers comes from reading some books many years ago, and I haven't thought about Turing machines in any detail since college... so if I'm wrong, I'd be interested in learning more about it.
 

_Rick_

Diamond Member
Apr 20, 2012
3,937
69
91
The Turing machine is of purely theoretical interest.
It serves as a typical representative of a class of computers that are sufficiently general to solve certain classes of problems, subject to certain limitations.

A transistor/flip-flop-register-based CPU is an equivalent machine, just implemented in a completely different manner.

If you want to learn about computability, complexity, problem classes and similar theoretical constructs, a Turing machine is a convenient tool to formally work with these notions.

On the other hand, a CPU is for the most part an exercise in electrical engineering, where transistors are arranged such that they perform specific functions: registers, memory, arithmetic and logic units, floating-point units, and many more.

As for why a Turing machine can solve the classes of problems it can solve, and cannot solve the classes it cannot, well, that's about a semester or two of reading.

Make sure you understand how a Turing machine differs from a deterministic finite automaton and from a pushdown automaton, to understand why it is more powerful than those. Then you only need to understand why there is no computing paradigm more powerful than a Turing machine.
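
A quick sketch of that power gap (my own toy example): checking balanced brackets is impossible for a finite automaton, since nesting depth is unbounded and it only has finitely many states, but it's trivial once you add a counter, which is what a pushdown automaton's stack gives you:

Code:
# Balanced-bracket check: one counter stands in for a pushdown stack.
# No finite automaton can do this, because nesting depth is unbounded.
def balanced(s):
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1          # "push"
        elif ch == ")":
            depth -= 1          # "pop"
            if depth < 0:       # closed a bracket we never opened
                return False
    return depth == 0

print(balanced("(()())"))  # True
print(balanced("(()"))     # False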

Theoretical computer science can be really interesting and exciting, but you need to have a taste for mathematical levels of formality.

Also look at concepts like quantum Turing machines, to understand the value of the Turing machine as a tool for describing the capabilities of a class of computers.
 

iCyborg

Golden Member
Aug 8, 2008
1,327
52
91
So, I don't believe your statement is correct. That said, my knowledge of 50-plus-year-old computers comes from reading some books many years ago, and I haven't thought about Turing machines in any detail since college... so if I'm wrong, I'd be interested in learning more about it.
I agree with this too: Turing machines are purely theoretical and abstract, and outside of some academic experiments (if that), no practical machine is based on a TM.
I mentioned in my post above that register machines are the closest to the von Neumann architecture that most electronic computers use. The Wikipedia article lists four subclasses, and it's one of these subclasses, called RASP, that is actually the closest. From the wiki:
wikipedia said:
The RASP is closest of all the abstract models to the common notion of computer.
 

SecurityTheatre

Senior member
Aug 14, 2011
672
0
0
To answer the question without going down the rabbit hole of which architectures are named what and what decade they were designed in, here is a basic rundown.

1) Transistors can be arranged into various logic gates, such as AND, OR, XOR, and NOT gates.

2) Combinations of these gates can be arranged to do arithmetic (addition, etc.). For example, a series of AND and XOR gates can do bitwise addition and can be cascaded to an arbitrary width (16 bits, 32 bits, etc.) to add very large numbers.

3) CPUs are designed to do a number of operations, ranging from ADD, SUBTRACT, LOAD, STORE, SHIFT, etc., based on instructions. Very simple CPUs essentially perform ALL of these operations at once, and the opcode determines which result is selected for output. More complex CPUs (with pipelining, etc.) have more intelligence there.

When you start putting together algorithms out of multiple ADD, SHIFT, LOAD, and STORE operations, you can do complex math like multiplication or floating-point arithmetic.
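
To make point 2 concrete, here's a rough Python sketch of a ripple-carry adder built from nothing but AND/XOR/OR bit operations (the function names are just illustrative):

Code:
# One full adder: two XORs make the sum bit; ANDs and an OR make the carry.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

# Cascade ("ripple") the carry through N full adders to add N-bit numbers.
def add_bits(x_bits, y_bits):
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):    # least significant bit first
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 3 + 5, with bits listed least significant first:
print(add_bits([1, 1, 0, 0], [1, 0, 1, 0]))  # [0, 0, 0, 1, 0] -> 8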

Is that the root of the question?
 

serpretetsky

Senior member
Jan 7, 2012
642
26
101
To answer the question without going down the rabbit hole of which architectures are named what and what decade they were designed in, here is a basic rundown.

1) Transistors can be arranged into various logic gates, such as AND, OR, XOR, and NOT gates.

2) Combinations of these gates can be arranged to do arithmetic (addition, etc.). For example, a series of AND and XOR gates can do bitwise addition and can be cascaded to an arbitrary width (16 bits, 32 bits, etc.) to add very large numbers.

3) CPUs are designed to do a number of operations, ranging from ADD, SUBTRACT, LOAD, STORE, SHIFT, etc., based on instructions. Very simple CPUs essentially perform ALL of these operations at once, and the opcode determines which result is selected for output. More complex CPUs (with pipelining, etc.) have more intelligence there.

When you start putting together algorithms out of multiple ADD, SHIFT, LOAD, and STORE operations, you can do complex math like multiplication or floating-point arithmetic.

Is that the root of the question?
I think that's pretty good. I would then add that

4) The processor is clocked, and so can perform those operations sequentially, step by step, driven by a timed clock signal.

5) The processor has some method of feeding the output of one clocked operation into the input of a clocked operation later in time (usually by storing the results in some form of memory, like registers, cache, etc.).
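
A toy Python sketch of points 4 and 5 together (purely illustrative): each loop iteration stands in for one clock tick, and a register carries the result of one operation into the next:

Code:
# Each iteration is one "clock tick"; the register latches each result
# so it can feed the next operation.
register = 0
program = [("ADD", 3), ("ADD", 4), ("SHIFT", 1)]   # computes (3 + 4) << 1

for opcode, operand in program:        # one operation per tick
    if opcode == "ADD":
        register = register + operand
    elif opcode == "SHIFT":
        register = register << operand
    print(f"tick: {opcode} {operand} -> register = {register}")  # 3, 7, 14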
 

tcsenter

Lifer
Sep 7, 2001
18,418
293
126
I don't think that's really correct. The IBM System 360 was from 1964, and as far as I can tell, it is much closer to a modern computer than a Turing machine.
Yes, I tend to agree after reviewing; my understanding of a Turing machine was a bit off. Thanks for that.
 

Bootleg Betty

Member
Oct 28, 2010
99
0
0
Actually, for current computers, I'd go for PRAM as a model.

I know deterministic finite state machines can recognize only regular languages, so they are weaker than a Turing machine. Nondeterministic FSMs are the same strength as deterministic ones, if you don't care about efficiency. And pushdown (stack) automata are only one step stronger on Chomsky's (that anarchist!) hierarchy, so they should be weaker than a Turing machine too, am I right?

I wonder what strength Petri nets are.
 

yhelothar

Lifer
Dec 11, 2002
18,408
39
91
Thanks for all the great responses. I think reading about how a few basic logic gates make up some of the more basic instructions, and how they can do Boolean algebra, gave me an overall sense of how CPUs work.
 

TuxDave

Lifer
Oct 8, 2002
10,572
3
71
Thanks for all the great responses. I think reading about how a few basic logic gates make up some of the more basic instructions, and how they can do Boolean algebra, gave me an overall sense of how CPUs work.

You could probably make a "Turing machine" out of real life. You could reduce the motions turn left / turn right / move forward / move backward to just turn left and move forward; the other two are just derived from those. Keep reducing all the actions you can do to a few very basic operations. That's essentially the idea of the Turing machine.
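
A tiny Python sketch of that reduction (a toy example): with only turn_left and forward as primitives, the other moves fall out as compositions:

Code:
# Only two primitives; every other move is built out of them.
def turn_left(state):
    x, y, (dx, dy) = state
    return (x, y, (-dy, dx))           # rotate heading 90 degrees CCW

def forward(state):
    x, y, (dx, dy) = state
    return (x + dx, y + dy, (dx, dy))

def turn_right(state):                 # three lefts make a right
    return turn_left(turn_left(turn_left(state)))

def backward(state):                   # about-face, step, about-face
    state = turn_right(turn_right(state))
    state = forward(state)
    return turn_right(turn_right(state))

state = (0, 0, (0, 1))                 # at the origin, facing "north"
print(backward(state))                 # (0, -1, (0, 1)): stepped back one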
 