Statistics: The Compass for Navigating a Data-Centric World
Marie Davidian
Department of Statistics, North Carolina State University
January 11, 2013
Statistics2013 Video
Available at http://statistics2013.org
Triumph of the geeks
Nate Silver predicted the outcome of the 2012 US presidential election in all 50 states using . . .
Statistics
http://fivethirtyeight.blogs.nytimes.com/
Silver used a statistical model to combine the results of state-by-state polls, weighting them according to their previous accuracy, and to simulate many elections and estimate probabilities of the outcome
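The approach described above can be sketched in a few lines of Python. The poll margins, states, and accuracy weights below are invented for illustration and are not Silver's actual data or weighting scheme:

```python
import random

# Hypothetical state polls: (margin for candidate A in points, accuracy weight)
polls = {
    "Ohio": [(2.0, 0.8), (1.0, 0.5), (3.0, 0.7)],
    "Florida": [(-1.0, 0.6), (0.5, 0.9)],
}

def weighted_margin(state_polls):
    """Combine a state's polls into one estimate, weighting by past accuracy."""
    total_w = sum(w for _, w in state_polls)
    return sum(m * w for m, w in state_polls) / total_w

def simulate(n_sims=10000, poll_error=3.0, seed=1):
    """Simulate many elections and estimate each state's win probability."""
    rng = random.Random(seed)
    wins = {s: 0 for s in polls}
    for _ in range(n_sims):
        for state, sp in polls.items():
            # Perturb the combined margin by assumed polling error
            margin = weighted_margin(sp) + rng.gauss(0, poll_error)
            if margin > 0:
                wins[state] += 1
    return {s: w / n_sims for s, w in wins.items()}

print(simulate())
```

The point of the simulation step is that it turns a point estimate (a combined poll margin) into a probability statement about the outcome.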
Triumph of the geeks
Others did, too. . .
“Dynamic Bayesian forecasting of presidential elections in the states,” by Drew A. Linzer, Journal of the American Statistical Association, in press
Triumph of the geeks
“Nate Silver-led statistics men crush pundits in election” – Bloomberg Businessweek
“Nate Silver has made statistics sexy again” – Associated Press
“Drew Linzer: The stats man who predicted Obama’s win” – BBC News Magazine
“The allure of the statistics field grows” – Boston Globe
But the interest in statistics didn’t start with the US elections. . .
Statistics in the news
New York Times, August 6, 2009
“I keep saying that the sexy job in the next 10 years will be statisticians” – Hal Varian, Chief Economist, Google
Statistics in the news
New York Times, January 26, 2012
“I went to parties and heard a little groan when people heard what I did. Now they’re all excited to meet me” – Rob Tibshirani, Department of Statistics, Stanford University
Statistics in the news
New York Times, February 11, 2012
“Statistics are interesting and fun. It’s cool now” – Andrew Gelman, Department of Statistics, Columbia University
Statistics in the news
The Wall Street Journal, December 28, 2012
Carl Bialik, The Numbers Guy
Data, data, and more data
Why is there so much talk of statistics and statisticians?
Data
• Administrative (e.g., tax records), government surveys
• Genomic, meteorological, air quality, seismic, . . .
• Electronic medical records, health care databases
• Credit card transactions, point-of-sale, mobile phone
• Online search, social networks
• Polls, voter registration records
A veritable tsunami/deluge/avalanche of data
Demand
2011 McKinsey Global Institute report:
Big data: The next frontier for innovation, competition, and productivity
“A significant constraint. . . will be a shortage of . . . people with deep expertise in statistics and data mining. . . a talent gap of 140K - 190K positions in 2018 (in the US)”
http://www.mckinsey.com/insights/mgi/research/technology_and_innovation/big_data_the_next_frontier_for_innovation
Opportunities and challenges
• Our ability to collect, store, access, and manipulate vast and complex data is ever-improving
• The potential benefits to science and society of learning from these data are enormous
• However, Big Data does not automatically mean Big Information
• Science, decision-making, and policy formulation require not only prediction and finding associations and patterns, but uncovering causal relationships
• Which, as we’ll discuss later, is not so easy. . .
Perils
From “The Age of Big Data”
With huge data sets and fine-grained measurement, . . . there is increased risk of “false discoveries.” The trouble with seeking a meaningful needle in massive haystacks of data, says Trevor Hastie, a statistics professor at Stanford, is that “many bits of straw look like needles.”
Big Data also supplies more raw material for statistical shenanigans and biased fact-finding excursions. It offers a high-tech twist on an old trick: I know the facts, now let’s find ’em. That is, says Rebecca Goldin, a mathematician at George Mason University, “one of the most pernicious uses of data.”
Critical need
Sound, objective methods for modeling, analysis, and interpretation
Statistics
While Big Data have inspired considerable current interest in statistics, statistics has been fundamental in numerous areas of science, business, and government for decades
Roadmap
• A brief history
• Statistical stories
• Our data-rich future
What is statistics?
Statistics: The science of learning from data and of measuring, controlling, and communicating uncertainty
The path to what is now the formal discipline of statistical science is long and winding. . .
Origins – pre-1700
• Sporadic accounts of measurement and data collection and interpretation date back as early as 5 B.C.
• But it was not until the mid-1600s that the mathematical notions of probability began to be developed by (mainly) mathematicians and physicists (e.g., Blaise Pascal), often inspired by games of chance
• The first formal attempt to summarize and learn from data was by John Graunt, who created a precursor to modern life tables used in demography
• Christiaan Huygens was among the first to connect such data analysis to probability
Origins – 1700-1750
• From 1700 to 1750, many key results in classical probability that underlie statistical theory were derived
• Jakob Bernoulli – law of large numbers, the Bernoulli and binomial probability distributions
• Abraham de Moivre – The Doctrine of Chances, precursor to the central limit theorem
• Daniel Bernoulli – expected utility, applications of probability to measurement problems in astronomy
Milestone events – 1750-1820
• Thomas Bayes’ 1763 An essay towards solving a problem in the Doctrine of Chances presented a special case of Bayes’ theorem (posthumously)
• Adrien-Marie Legendre described the method of least squares in 1805
Milestone events – 1750-1820
• Carl Friedrich Gauss connected least squares to Bayes’ theorem in 1809
• Pierre-Simon Laplace derived the central limit theorem and connected the normal probability distribution to least squares in 1810
More milestones – 1820-1900
• Adolphe Quetelet pioneered the statistical analysis of social science data – the “average man” (1835) and the normal distribution as a model for measurements (1842)
• The Royal Statistical Society (1834) and American Statistical Association (1839) were founded
• Francis Galton introduced regression analysis (1885) and correlation (1888)
• Karl Pearson established the field of biometry and developed fundamental methods, and founded the first statistical journal, Biometrika (1901)
Modern statistics – 1900-1950s
The modern discipline of statistics was really established only in the twentieth century
• William Gosset (“Student”), a brewer for Guinness in Dublin, derived the Student’s t distribution in 1908
• In the 1920s, Ronald Fisher developed many fundamental concepts, including the ideas of statistical models and randomization, theory of experimental design, the method of analysis of variance, and tests of significance
• In the 1930s, Jerzy Neyman and Egon Pearson developed the theory of sampling, the competing approach of hypothesis testing, and the concept of confidence intervals
• Experimental design became a mainstay of agricultural research
Modern statistics – 1900-1950s
• Fisher/Neyman-Pearson established the paradigm of frequentist statistical inference that is used today
• Also in the 1930s, Bayesian statistical inference was developed by Bruno de Finetti and others
• In the 1940s, many departments of statistics were established at universities in the US and Europe
• And fundamental theory of statistical inference was pursued by Wald, Cramér, Rao, and many others
Modern statistics to the present
From the 1950s on, there were numerous advances in theory, methods, and application
• The advent of medical statistics and epidemiological methods (Richard Doll, Austin Bradford Hill)
• The development of methods for analysis of censored time-to-event data (Paul Meier, D. R. Cox)
• The use of the theory of sampling to design surveys and the US census (Jerzy Neyman, Morris Hansen)
• The adoption of statistical quality control and experimental design in industry (W. Edwards Deming, George Box)
• Exploratory data analysis (John Tukey)
• And many, many more. . .
Modern statistics to the present
Computing fundamentally altered the field of statistics forever
• Complex calculations became feasible
• Much larger and more complicated data sets could be created and analyzed
• Sophisticated models and methods could be applied
• Statistical software implementing popular methods became widespread (e.g., SAS, developed at NC State in the 1960s/70s)
• Simulation to investigate performance of statistical methods became possible
• Bayesian statistical methods became feasible in complex settings (Markov chain Monte Carlo – MCMC)
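As a small illustration of the simulation idea, the sketch below estimates the actual coverage of a nominal 95% confidence interval for a mean. The setup is assumed for illustration: normally distributed data, a z-based interval, and a sample size of 30:

```python
import random
import statistics

def ci_covers(rng, n=30, mu=0.0, sigma=1.0, z=1.96):
    """Draw one sample and check whether the 95% CI for the mean covers mu."""
    sample = [rng.gauss(mu, sigma) for _ in range(n)]
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    return mean - z * se <= mu <= mean + z * se

def coverage(n_sims=5000, seed=2):
    """Fraction of simulated samples whose interval covers the true mean."""
    rng = random.Random(seed)
    return sum(ci_covers(rng) for _ in range(n_sims)) / n_sims

print(coverage())  # nominal level is 0.95; the simulation reveals actual coverage
```

Repeating the experiment thousands of times is exactly what was infeasible before cheap computing.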
Today
Statistical methods are used routinely in science, industry/business, and government
• Pharmaceutical companies employ statisticians, who work in all stages of drug development
• Statisticians are ubiquitous in medical and public health research, working with health sciences researchers to design studies, analyze data, and draw conclusions
• Google, Facebook, LinkedIn, credit card companies, and global retailers employ statisticians to develop and implement methods to mine their vast data
• Government science, regulatory, and statistical agencies employ statisticians to design surveys, make forecasts, develop estimates of income, review new drug applications, assess evidence of health effects of pollutants, . . .
Statistical stories
Some diverse examples where statistics and statisticians are essential. . .
The controlled clinical trial
The gold standard study for comparison of treatments (a question of cause and effect)
• An experiment designed to compare a new treatment to a control treatment
• Subjects are randomized to receive one treatment or the other ⇒ unbiased, fair comparison using statistical methods (hypothesis testing)
• In addition, blinding, placebo
• The first such clinical trial was conducted in the UK by the Medical Research Council in 1948, comparing streptomycin + bed rest to bed rest alone in tuberculosis
• In 1954, 800K children in the US were randomized to the Salk polio vaccine or placebo to assess the vaccine’s effectiveness in preventing paralytic polio
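The randomized comparison can be illustrated with a standard two-sample test of proportions. The event counts below are hypothetical, not data from either historical trial:

```python
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    """Two-sample z-test for a difference in event proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 = p2
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, Phi(x) = 0.5*(1 + erf(x/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical trial: 20/1000 events on treatment vs 45/1000 on control
z, p = two_prop_z(20, 1000, 45, 1000)
print(z, p)
```

Randomization is what licenses reading a small p-value here as evidence of a treatment effect rather than of some confounding difference between the groups.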
The controlled clinical trial
• In 1969, evidence from a randomized clinical trial became mandatory for a new product to receive approval from the US Food and Drug Administration (FDA)
• Because a trial involves only a sample of patients from the entire population, the results are subject to uncertainty
• Statistical methods are critical for determining the sample size required to ensure that a real difference can be detected with a specified degree of confidence
• Which is why regulatory bodies like the FDA employ 100s of statisticians
• In the last 4 decades, statisticians have developed new methods to handle ethical and practical considerations
• E.g., group sequential trials that allow interim analyses at which the trial can be stopped early without compromising the ability to make a valid comparison
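The sample-size determination mentioned above can be sketched with the standard normal-approximation formula for comparing two means; the effect size, standard deviation, significance level, and power below are assumed for illustration:

```python
from math import ceil

# Standard normal quantiles for common design choices (assumed values)
Z_ALPHA_2 = 1.96   # two-sided significance level 0.05
Z_BETA = 0.84      # power 0.80

def n_per_arm(delta, sigma, z_a=Z_ALPHA_2, z_b=Z_BETA):
    """Patients per arm needed to detect a mean difference delta,
    given outcome standard deviation sigma, with the chosen alpha and power."""
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

print(n_per_arm(delta=5.0, sigma=10.0))  # 63 per arm
```

Note how the required size grows with the square of sigma/delta: halving the detectable difference roughly quadruples the trial.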
National forest inventory
Next stop, Bhutan
• The Kingdom of Bhutan, in South Asia, transitioned to a constitutional democracy in 2008
• The new constitution mandates that Bhutan maintain 60% forest cover in perpetuity
• A National Forest Inventory was called for. . .
• My friend Tim Gregoire of Yale University, an expert in forest biometry, was consulted to help plan and implement Bhutan’s comprehensive NFI
National forest inventory
An NFI is an assessment based on statistical sampling and estimation of the forest resources of a nation
• Set policy on forest resource management
• Monitor biodiversity, habitat type and extent, land conversion rates
• Measure quantity/quality of wood fiber for commodities
• Measure non-wood forest products
• Measure carbon storage and change
• Reference spatially where resources are located
Statistics is critical to developing the sampling plan for both remote sensing and field data and to estimation of abundance of resources based on 100s of measurements
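The estimation step can be illustrated with the classic expansion estimator for a forest total from a simple random sample of equal-size plots. The plot values and population size below are toy numbers; real NFIs use far more elaborate designs:

```python
import statistics

def estimate_total(plot_values, n_plots_total):
    """Expansion estimator of a population total, with its standard error,
    under simple random sampling of equal-size plots."""
    n = len(plot_values)
    mean = statistics.fmean(plot_values)
    var = statistics.variance(plot_values)  # sample variance
    total = n_plots_total * mean
    # Standard error of the estimated total, with finite-population correction
    se = n_plots_total * ((1 - n / n_plots_total) * var / n) ** 0.5
    return total, se

# Hypothetical: wood volume (m^3) measured on 5 of 1000 plots
total, se = estimate_total([12.0, 8.5, 15.2, 10.1, 9.7], 1000)
print(total, se)
```

The standard error is what lets an inventory report not just an estimated total but the uncertainty attached to it.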
Pharmacokinetics
What’s behind a drug label?
• A drug should be safe and effective
• Labeling provides guidance on dose, conditions under which a drug should/should not be taken
• Partly behind this – pharmacokinetics (PK), the science of “what the body does to the drug”
• Key: Understanding Absorption, Distribution, Metabolism, Excretion in the population and how these processes vary across patients and are altered by conditions
• Statistical modeling is an integral part of the science
Pharmacokinetics
A hierarchical statistical model that allows these processes to vary across patients and conditions is fitted to drug concentration–time data
Conc(t) = [ka · Dose / {V (ka − Cl/V)}] · [exp{−(Cl/V) t} − exp(−ka t)]

ka = absorption rate, V = volume of distribution, Cl = clearance
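The concentration curve above (the standard one-compartment, first-order absorption model) can be evaluated directly in code. The parameter values below are purely illustrative, not from any fitted model; in the hierarchical setting, each patient would have their own ka, V, and Cl:

```python
import math

def conc(t, dose, ka, V, Cl):
    """One-compartment oral-dose model: drug concentration at time t after a single dose."""
    ke = Cl / V  # elimination rate constant = clearance / volume of distribution
    return ka * dose / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Illustrative values: dose 100 mg, ka = 1.5 /h, V = 30 L, Cl = 6 L/h
times = [0, 1, 2, 4, 8, 12]
curve = [conc(t, 100, 1.5, 30.0, 6.0) for t in times]
```

The curve starts at zero, rises while absorption dominates, and then decays at the elimination rate Cl/V, which is the shape a drug label's dosing interval is built around.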
Forensic science
An area where statisticians and better statistics are desperately needed!
• Fingerprints, DNA analysis, bite marks, firearm toolmarks, hair specimens, writing samples, toxicological analysis, . . .
• Laboratory- or expert interpretation-based
• 2009 US National Academy of Sciences report
• The report cites examples of lack of sufficient recognition of sources of variability and their effects on uncertainties in many types of forensic science analyses. . .
Forensic science
"With the exception of nuclear DNA analysis, however, no forensic method has been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source."

"A body of research is required to establish the limits and measures of performance and to address the impact of sources of variability and potential bias."

"The development of quantifiable measures of uncertainty in the conclusions of forensic analyses . . . and of quantifiable measures of the reliability and accuracy of forensic analyses (are needed)."
Basically, the report recommends that current and new forensic practices should be developed and assessed using properly designed experiments and statistical methods!
The hazards of haphazard data
When data are simply observed and collected, without a principled design and randomization, be wary!
• Investigations of causal relationships can be compromised by confounding
• E.g., comparison of the effects of competing treatments
• When individual patients and their providers decide which treatment to take, there may be factors that are associated with both the choice of treatment and the outcome
• Failure to recognize/identify such confounding factors can lead to misleading conclusions
Simpson's paradox

Data on 2 treatments from a healthcare database

[Figure: average outcome for Treatments A and B, shown overall (Avg Trt A, Avg Trt B) and separately for males and females (Trt A, Trt B within each sex); the treatment groups differ in composition (A: 80%/20% M/F, B: 20%/80% M/F), so the overall comparison reverses the within-sex comparisons]
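A minimal numeric illustration of the reversal, using hypothetical counts and outcomes (higher outcome taken as better): treatment A beats B within each sex, yet looks worse overall, because A was given mostly to males and B mostly to females, and sex is strongly associated with the outcome:

```python
# Hypothetical counts and mean outcomes (higher = better); sex is the confounder.
# Treatment A: 80% male, 20% female; Treatment B: 20% male, 80% female.
data = {
    ("A", "male"):   (80, 50.0),   # (number of patients, mean outcome)
    ("A", "female"): (20, 90.0),
    ("B", "male"):   (20, 45.0),
    ("B", "female"): (80, 85.0),
}

def overall_mean(trt):
    """Pooled mean outcome for a treatment, ignoring sex."""
    cells = [(n, m) for (t, _), (n, m) in data.items() if t == trt]
    return sum(n * m for n, m in cells) / sum(n for n, _ in cells)

# Within each sex, A beats B...
assert data[("A", "male")][1] > data[("B", "male")][1]      # 50 > 45
assert data[("A", "female")][1] > data[("B", "female")][1]  # 90 > 85
# ...yet pooled over sex, B looks better: Simpson's paradox.
assert overall_mean("B") > overall_mean("A")
```

The pooled comparison is misleading precisely because the sex mix differs between treatment groups; comparing within strata (or reweighting them) removes the reversal.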
Confounding and other threats
• Statistical methods are available to take confounding appropriately into account
• . . . but the confounding factors must be recorded in the database!

Other threats
• Missing information – why are some factors not recorded for some individuals?
• Drop out – sicker patients may disappear sooner in a longitudinal study
• Etc.

Comparative effectiveness research, which strives to recommend best uses for existing treatments through analyses of such databases, requires statistics!
Confronting our data-rich future
I hope I have convinced you that statistics and statisticians areessential to our data-rich future!
Big Data have enormous potential for generating new knowledge and improving human welfare. However, Big Data without statistics have enormous potential to mislead.

"The future demands that scientists, policy-makers, and the public be able to interpret increasingly complex information and recognize both the benefits and pitfalls of statistical analysis. Embedding statistics in science and society will pave the route to a data-informed future, and statisticians must lead this charge."
– Davidian and Louis, Science, April 6, 2012
2013 – the International Year of Statistics
A celebration of the contributions of statistics is long overdue!
http://statistics2013.org
References and further reading
Aldrich, J. Figures from the history of probability and statistics.http://www.economics.soton.ac.uk/staff/aldrich/Figures.htm
Davidian, M. and Louis, T.A. (2012). Why statistics? Science, 336, 12.
Fienberg, S.E. (1992). A brief history of statistics in three and one-half chapters: A review essay. Statistical Science, 7, 208–225.
Stigler, S.M. (1986). The History of Statistics: The Measurement ofUncertainty Before 1900. Harvard University Press.