It's not exactly the class of CFLs, because the non-context-free language {0^n 1^n 2^n : n ≥ 0} is recognized by a TM in this model.
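The crossing-off idea behind recognizing {0^n 1^n 2^n} without leaving the input cells can be sketched in Python. This is an illustrative sketch, assuming the alphabet {0, 1, 2}: the input itself is the only workspace, and marked cells are overwritten in place with 'X', just as the bounded machine would overwrite tape cells.

```python
# Crossing-off recognizer for {0^n 1^n 2^n : n >= 0}, mirroring a TM that
# never moves past the input: the input cells are the only workspace.

def accepts_0n1n2n(s):
    tape = list(s)
    if tape != sorted(tape):              # one pass: all 0s, then 1s, then 2s
        return False
    while '0' in tape:                    # one sweep per remaining 0
        for ch in '012':
            if ch not in tape:            # no matching 1 or 2 left: counts differ
                return False
            tape[tape.index(ch)] = 'X'    # cross off the leftmost unmarked ch
    return '1' not in tape and '2' not in tape   # leftover 1s or 2s: reject
```

Each sweep marks one 0, one 1, and one 2; the machine accepts exactly when the symbols appear in order and all three counts agree.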

We get all of the CFLs as well (perhaps via a CFG in Chomsky normal form, whose derivations of a length-n string never produce sentential forms longer than n, so a derivation fits within the input cells). However, we do not get all of the decidable languages. Define the language A = {<B> | B is a TM in this model and <B> is not in L(B)}.

Suppose A were decided by some machine M in this model. If <M> ∈ L(M), then <M> ∈ A, which by the definition of A means that <M> is not in L(M). Conversely, if <M> is not in L(M), then <M> is not in A, which means that <M> ∈ L(M). Either way, <M> ∈ L(M) if and only if <M> is not in L(M).

This contradiction shows that A is not decided by any TM in this model.

At first glance this may seem to contradict what happens for normal TMs, where the analogous diagonal language is undecidable. However, a normal TM can decide A as follows. Given <B>, simulate B on input <B>, rejecting if the tape head ever moves past the end of where the input is. Suppose B has q states and k tape symbols, and the input has length n. A configuration is determined by the current state, the head position, and the tape contents, so the total number of configurations is nqk^n. If on the (nqk^n + 1)-th transition the machine has not accepted or rejected yet, then by the pigeonhole principle it has repeated a configuration, and hence is in an infinite loop, so the simulation can stop and report non-acceptance. Therefore, we can decide the acceptance problem for this model, and with it the language A: accept <B> exactly when B does not accept <B>.

Randomness is very useful, but normal TMs don't have access to it. So, let a Random Turing Machine (RTM) be a machine model like a normal TM, except that each transition has a certain probability of being executed. For a given state and tape symbol read, there may be several possible transitions; their probabilities always sum to 1, and exactly one of them is executed, chosen at random according to those probabilities. For example, we could have two transitions from the same state on the same tape symbol read, one with probability 0.7 and the other with 0.3; the first will be executed with 70% probability, and the other with 30%. Of course, if we feed the same input to the machine several times, the answer of whether it accepts or not may change each time. We say that a string is accepted by an RTM if there is at least one computation that accepts it, and the language of the RTM is the set of all strings it accepts. How do the languages that RTMs recognize compare to those of the standard model?