Suppose we are given an NFA with δ(q0, ε) = {q1}, δ(q0, a) = {q2}, δ(q1, ε) = {q1}, δ(q2, ε) = {q3}, δ(q3, a) = {q1}, δ(q3, ε) = {q4}. What is E({q0, q2})?

Randomness is very useful, but ordinary TMs have no access to it. So, let a Random Turing Machine (RTM) be a machine model like a standard TM in which each transition has a certain probability of being executed. For a given state and tape symbol, there may be several possible transitions; the sum of their probabilities is always 1, and exactly one of them is executed, picked randomly according to those probabilities. For example, we could have two transitions from the same state on the same tape symbol, one with probability 0.7 and the other with 0.3; the first is executed with 70% probability and the second with 30%. Of course, if we feed the same input to the machine several times, the answer of whether it accepts may change from run to run. We say that a string is accepted by an RTM if there is at least one computation that accepts it, and the language recognized by the RTM is the set of all strings it accepts. How do the languages that RTMs recognize compare to those of the standard model?
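Note that under this acceptance criterion, deciding whether a string is in the language means asking whether *some* sequence of nonzero-probability transitions leads to acceptance. A minimal sketch of that check, assuming a simplified single-tape machine, a small hypothetical transition table `delta`, and a step bound `max_steps` (all illustrative assumptions, not part of the problem):

```python
# Toy RTM for illustration (hypothetical machine, not from the problem).
# Transitions map (state, symbol) -> list of (probability, next_state,
# symbol_to_write, head_move); probabilities in each list sum to 1.
ACCEPT, REJECT = "qacc", "qrej"
delta = {
    ("q0", "0"): [(0.7, "q0", "0", +1), (0.3, "qacc", "0", +1)],
    ("q0", "1"): [(1.0, "qrej", "1", +1)],
    ("q0", "_"): [(1.0, "qrej", "_", +1)],
}

def accepts(delta, w, max_steps=20):
    """True iff at least one computation accepts w within max_steps.

    The probability values play no role in membership -- only which
    transitions have nonzero probability matters, so this is a plain
    depth-first search over all possible computations.
    """
    tape = tuple(w) + ("_",)
    # Each configuration: (state, tape, head position, steps used so far)
    stack = [("q0", tape, 0, 0)]
    while stack:
        state, tape, head, steps = stack.pop()
        if state == ACCEPT:
            return True
        if state == REJECT or steps >= max_steps:
            continue
        sym = tape[head]
        for prob, nxt, write, move in delta.get((state, sym), []):
            if prob == 0:
                continue  # impossible transition, never taken
            t = list(tape)
            t[head] = write
            new_head = max(0, head + move)  # stay put at the left end
            if new_head >= len(t):
                t.append("_")  # extend the tape with blanks as needed
            stack.append((nxt, tuple(t), new_head, steps + 1))
    return False
```

To simulate one *random* run instead of exploring all of them, one could instead pick a single transition at each step with `random.choices`, using the probabilities as weights.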