G : \qty {0,1}^{\log n} \to \qty {0,1}^{n}
Naively doing this certainly doesn't result in a uniform output distribution, because \\(G\\) is wildly not [surjective]({{< relref "KBhsurjectivity.md" >}}) --- there are \\(2^{\log n} = n\\) possible input values, but \\(2^{n}\\) possible outputs.
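To make the counting argument concrete, here is a quick sketch (with a hypothetical small \\(n = 16\\)) of how few of the \\(2^{n}\\) possible output strings the \\(n\\) seeds can ever reach:

```python
import math

# Hypothetical toy size: n-bit outputs from (log n)-bit seeds.
n = 16
seed_bits = int(math.log2(n))     # log n = 4 input bits
num_seeds = 2 ** seed_bits        # 2^{log n} = n = 16 possible seeds
num_outputs = 2 ** n              # 2^n = 65536 possible n-bit strings

# G can hit at most one output per seed, so almost all
# n-bit strings are unreachable.
fraction_reachable = num_seeds / num_outputs
print(num_seeds, num_outputs, fraction_reachable)
```

Even at this toy scale, only 16 of 65536 strings (about 0.02%) are reachable, so the output distribution is nowhere near uniform over \\(\qty{0,1}^{n}\\).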

We want \\(M\\) to not be able to tell the difference; in particular, we want \\(G\\) to be scrambly enough that \\(M\\) can't tell the difference between these two pictures.

We can't achieve this exactly, but we can construct \\(G: \qty {0,1}^{\log n} \to \qty {0,1}^{n}\\) with...

1. \\(G\\) is computable in poly\\(\qty(n)\\) time (in particular \\(n^{3}\\) time---i.e. more time than the underlying Turing machine)
2. for every randomized \\(n^{2}\\)-time Turing machine \\(M\\) and all \\(x \in \qty {0,1}^{\*}\\), we desire:

\begin{equation}
Pr\_{r \sim \qty {0,1}^{n}} \qty[M\qty(x,r) \text{ accepts}] = Pr\_{s \sim \qty {0,1}^{\log n}} \qty[M\qty(x, G\qty(s)) \text{ accepts}] \pm 0.1
\end{equation}

i.e. \\(M\\) is "fooled" by \\(G\\).

**insight**: randomness derives from \\(M\\)'s inability to compute for longer than \\(n^{2}\\) time. "Randomness is in the eye of the beholder."
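One payoff of such a \\(G\\) is seed enumeration: since there are only \\(n\\) seeds, a deterministic machine can run \\(M\qty(x, G\qty(s))\\) for every seed and take a majority vote, in roughly \\(n \cdot n^{2} = n^{3}\\) time. A toy sketch (both `G` and `M` here are hypothetical placeholders, not the actual construction --- `G` is just Python's seeded RNG standing in for a real generator):

```python
import random

def G(seed, n):
    # Hypothetical stand-in generator: deterministically expand a
    # short seed into n pseudorandom bits.
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n)]

def M(x, r):
    # Hypothetical placeholder randomized decider: its answer depends
    # on the "random" bits r it is handed.
    return r[0] == 0

def derandomized_M(x, n):
    # Enumerate all n seeds, run M on each expansion, majority-vote.
    accepts = sum(M(x, G(s, n)) for s in range(n))
    return accepts * 2 > n

result = derandomized_M([1, 0, 1], 8)
```

The simulation is fully deterministic: calling `derandomized_M` twice on the same input always gives the same answer, which is the whole point of the construction.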
