Randomness vs Time: Derandomization under a Uniform Assumption
Authors: Russell Impagliazzo, Avi Wigderson
- Publisher: Elsevier Science
- Year: 2001
- Language: English
- File size: 182 KB
- Volume: 63
- Category: Article
- ISSN: 0022-0000
Synopsis
We prove that if BPP ≠ EXP, then every problem in BPP can be solved deterministically in subexponential time on almost every input (on every samplable ensemble, for infinitely many input sizes). This is the first derandomization result for BPP based on uniform, noncryptographic hardness assumptions. It implies the following gap in the deterministic average-case complexities of problems in BPP: either these complexities are always subexponential, or they contain arbitrarily large exponential functions. We use a construction of a small "pseudorandom" set of strings from a "hard function" in EXP which is identical to that used in the analogous nonuniform results in L.
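The derandomization described in the synopsis works by replacing an algorithm's truly random bits with the strings of a small, explicitly enumerable "pseudorandom" set, then taking a majority vote over all of them. The following is a minimal sketch of that outer loop; the toy generator below is a hypothetical stand-in for illustration only, not the paper's construction from a hard function in EXP.

```python
# Sketch: deterministic simulation of a randomized (BPP-style) algorithm by
# majority vote over a small explicit set of strings, instead of sampling.
# `toy_pseudorandom_set` is a placeholder: it merely repeats seed bits, whereas
# the paper builds the set from a hard function so that it fools the algorithm.

def toy_pseudorandom_set(seed_len, out_len):
    """Enumerate all 2^seed_len seeds and expand each to out_len bits.

    Toy expansion: repeat the seed bits cyclically. (Assumption for
    illustration; a real construction expands via a hard function.)
    """
    for s in range(2 ** seed_len):
        seed_bits = [(s >> i) & 1 for i in range(seed_len)]
        yield [seed_bits[i % seed_len] for i in range(out_len)]

def derandomize(randomized_alg, x, seed_len, out_len):
    """Run randomized_alg on input x with every string in the set;
    output the majority answer. If the set fools the algorithm, this
    matches the answer the algorithm gives with high probability."""
    votes = sum(bool(randomized_alg(x, r))
                for r in toy_pseudorandom_set(seed_len, out_len))
    return 2 * votes > 2 ** seed_len

# Usage: a trivial "randomized" evenness test that ignores its random bits,
# so the majority vote is unanimous either way.
is_even = lambda x, r: x % 2 == 0
print(derandomize(is_even, 10, 4, 16))  # True
```

Since the set has only 2^seed_len strings, the whole simulation is deterministic and its running time is the algorithm's time multiplied by 2^seed_len; with a short enough seed this yields the subexponential-time simulation claimed in the synopsis.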