
05595712
j
2009e.00220
Cobb, George W.
The introductory statistics course: a Ptolemaic curriculum?
Technol. Innov. Stat. Educ. 1, No. 1, 15 p., electronic only (2007).
2007
California Digital Library (CDL), University of California, Oakland, CA; University of California, Los Angeles (UCLA), Department of Statistics, Los Angeles, CA
EN
D35
curriculum
randomization
permutation
computing
sampling distribution
normal distribution
exact inference
Summary: As we begin the 21st century, the introductory statistics course appears healthy, with its emphasis on real examples, data production, and graphics for exploration and assumption-checking. Without doubt this emphasis marks a major improvement over introductory courses of the 1960s, an improvement made possible by the vaunted ``computer revolution''. Nevertheless, I argue that despite broad acceptance and rapid growth in enrollments, the consensus curriculum is still an unwitting prisoner of history. What we teach is largely the technical machinery of numerical approximations based on the normal distribution and its many subsidiary cogs. This machinery was once necessary, because the conceptually simpler alternative based on permutations was computationally beyond our reach. Before computers, statisticians had no choice. These days we have no excuse. Randomization-based inference makes a direct connection between data production and the logic of inference that deserves to be at the core of every introductory course. Technology allows us to do more with less: more ideas, less technique. We need to recognize that the computer revolution in statistics education is far from over.
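The randomization-based inference the summary advocates can be illustrated with a minimal sketch of a two-sample permutation test (not taken from the paper itself; the function name, data, and permutation count are illustrative assumptions): rather than appealing to a normal approximation, we repeatedly shuffle the group labels and ask how often the shuffled difference in means is at least as extreme as the observed one.

```python
import random

def permutation_test(group_a, group_b, n_perm=10000, seed=0):
    """Approximate two-sided permutation test for a difference in means.

    Returns the proportion of label shufflings whose absolute mean
    difference is at least as extreme as the observed difference.
    """
    rng = random.Random(seed)
    observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # re-randomize the group assignment
        diff = (sum(pooled[:n_a]) / n_a
                - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if abs(diff) >= abs(observed):
            count += 1
    return count / n_perm

# Example: clearly separated groups yield a small p-value.
p = permutation_test([1, 2, 3, 4, 5], [6, 7, 8, 9, 10])
```

The logic mirrors the data-production step directly: the null hypothesis is simulated by literally re-running the randomization, which is the pedagogical connection the author argues for.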