{"id":740,"date":"2016-04-21T22:43:36","date_gmt":"2016-04-21T22:43:36","guid":{"rendered":"https:\/\/courses.lumenlearning.com\/introstats1xmaster\/?post_type=chapter&#038;p=740"},"modified":"2021-06-24T20:48:27","modified_gmt":"2021-06-24T20:48:27","slug":"the-f-distribution-and-the-f-ratio","status":"publish","type":"chapter","link":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/chapter\/the-f-distribution-and-the-f-ratio\/","title":{"raw":"13.2: The F Distribution and the F-Ratio","rendered":"13.2: The F Distribution and the F-Ratio"},"content":{"raw":"<div class=\"textbox learning-objectives\">\r\n<h3>Learning Outcomes<\/h3>\r\n<section>\r\n<ul id=\"fs-idp124304720\">\r\n \t<li>Interpret the <em data-effect=\"italics\">F<\/em> probability distribution as the number of groups and the sample size change<\/li>\r\n<\/ul>\r\n<\/section><\/div>\r\nThe distribution used for the hypothesis test is a new one. It is called the\u00a0<em>F<\/em> distribution, named after Sir Ronald Fisher, an English statistician. The <em>F<\/em> statistic is a ratio (a fraction). There are two sets of degrees of freedom: one for the numerator and one for the denominator.\r\n\r\nFor example, if <em>F<\/em> follows an <em>F<\/em> distribution, the number of degrees of freedom for the numerator is four, and the number of degrees of freedom for the denominator is ten, then <em>F<\/em> ~ <em>F<sub data-redactor-tag=\"sub\">4,10<\/sub><\/em>.\r\n\r\n<hr \/>\r\n\r\n<h4>Note<\/h4>\r\nThe\u00a0<em>F<\/em> distribution is closely related to the Student's t-distribution: when the numerator has one degree of freedom, the values of the <em>F<\/em> distribution are the squares of the corresponding values of the <em>t<\/em>-distribution. One-Way ANOVA extends the <em>t<\/em>-test to comparing more than two groups. 
The scope of that derivation is beyond the level of this course.\r\n\r\n<hr \/>\r\n\r\nTo calculate the\u00a0<em>F<\/em> ratio, two estimates of the variance are made.\r\n<ol>\r\n \t<li><strong>Variance between samples:<\/strong> An estimate of <em>\u03c3<\/em><sup>2<\/sup> that is the variance of the sample means multiplied by <em>n<\/em> (when the sample sizes are the same). If the samples are different sizes, the variance between samples is weighted to account for the different sample sizes. This variance is also called <strong>variation due to treatment or explained variation<\/strong>.<\/li>\r\n \t<li><strong>Variance within samples:<\/strong> An estimate of <em>\u03c3<\/em><sup>2<\/sup> that is the average of the sample variances (also known as a pooled variance). When the sample sizes are different, the variance within samples is weighted. This variance is also called the <strong>variation due to error or unexplained variation<\/strong>.<\/li>\r\n<\/ol>\r\n<ul>\r\n \t<li><em>SS<\/em><sub>between<\/sub> = the sum of squares that represents the variation among the different samples<\/li>\r\n \t<li><em>SS<\/em><sub>within<\/sub> = the sum of squares that represents the variation within samples that is due to chance<\/li>\r\n<\/ul>\r\nTo find a \"sum of squares\" means to add together squared quantities that, in some cases, may be weighted.\r\n\r\n<em>MS<\/em> means \"<strong>mean square<\/strong>.\" <em>MS<\/em><sub>between<\/sub> is the variance between groups, and <em>MS<\/em><sub>within<\/sub> is the variance within groups.\r\n<h3>Calculation of Sum of Squares and Mean Square<\/h3>\r\n<em>k<\/em> = the number of different groups\r\n\r\n<em>n<\/em><sub>j<\/sub> = the size of the <em>j<\/em>th group\r\n\r\n<em>s<\/em><sub>j<\/sub> = the sum of the values in the <em>j<\/em>th group\r\n\r\n<em>n<\/em> = total number of all the values combined (total sample size: \u2211<em>n<sub data-redactor-tag=\"sub\">j<\/sub><\/em>)\r\n\r\n<em>x<\/em> = one value: \u2211<em>x<\/em> = \u2211<em>s<sub data-redactor-tag=\"sub\">j<\/sub><\/em>\r\n\r\nSum of squares of all values from every group combined: \u2211<em>x<\/em><sup>2<\/sup>\r\n\r\nTotal sum of squares:\r\n<em>SS<\/em><sub>total<\/sub> = [latex]\\displaystyle\\sum{{x}^{{2}}}-\\frac{{{(\\sum{x})}^{{2}}}}{{n}}[\/latex]\r\n\r\nExplained variation: sum of squares representing variation among the different samples:\r\n[latex]\\displaystyle{S}{S}_{{\\text{between}}}=\\sum{[\\frac{{({s}_{{j}})}^{{2}}}{{n}_{{j}}}]}-\\frac{{(\\sum{s}_{{j}})}^{{2}}}{{n}}[\/latex]\r\n\r\nUnexplained variation: sum of squares representing variation within samples due to chance:\r\n[latex]\\displaystyle{S}{S}_{{\\text{within}}}={S}{S}_{{\\text{total}}}-{S}{S}_{{\\text{between}}}[\/latex]\r\n\r\nDegrees of freedom for the different groups (<em>df<\/em> for the numerator): <em>df<\/em><sub>between<\/sub> = <em>k<\/em> \u2013 1\r\n\r\nDegrees of freedom for errors within samples (<em>df<\/em> for the denominator):\r\n<p style=\"text-align: center;\"><em>df<\/em><sub>within<\/sub> = <em>n<\/em> \u2013 <em>k<\/em><\/p>\r\nMean square (variance estimate) explained by the different groups:\r\n[latex]\\displaystyle{M}{S}_{{\\text{between}}}=\\frac{{{S}{S}_{{\\text{between}}}}}{{{d}{f}_{{\\text{between}}}}}[\/latex]\r\n\r\nMean square (variance estimate) that is due to chance (unexplained):\r\n[latex]\\displaystyle{M}{S}_{{\\text{within}}}=\\frac{{{S}{S}_{{\\text{within}}}}}{{{d}{f}_{{\\text{within}}}}}[\/latex]\r\n\r\n<em>MS<\/em><sub>between<\/sub> and <em>MS<\/em><sub>within<\/sub> can be written as follows:\r\n<ul>\r\n \t<li>[latex]\\displaystyle{M}{S}_{{\\text{between}}}=\\frac{{{S}{S}_{{\\text{between}}}}}{{{d}{f}_{{\\text{between}}}}}=\\frac{{{S}{S}_{{\\text{between}}}}}{{{k}-{1}}}[\/latex]<\/li>\r\n 
\t<li>[latex]\\displaystyle{M}{S}_{{\\text{within}}}=\\frac{{{S}{S}_{{\\text{within}}}}}{{{d}{f}_{{\\text{within}}}}}=\\frac{{{S}{S}_{{\\text{within}}}}}{{{n}-{k}}}[\/latex]<\/li>\r\n<\/ul>\r\nThe one-way ANOVA test depends on the fact that <em>MS<\/em><sub>between<\/sub> can be influenced by population differences among the means of the several groups. Since <em>MS<\/em><sub>within<\/sub> compares values of each group to its own group mean, the fact that group means might be different does not affect <em>MS<\/em><sub>within<\/sub>.\r\n\r\nThe null hypothesis says that all groups are samples from populations having the same normal distribution. The alternate hypothesis says that at least two of the sample groups come from populations with different normal distributions. If the null hypothesis is true, <em>MS<\/em><sub>between<\/sub> and <em>MS<\/em><sub>within<\/sub> should both estimate the same value.\r\n\r\n<hr \/>\r\n\r\n<h4>Note<\/h4>\r\nThe null hypothesis says that all the group population means are equal. The hypothesis of equal means implies that the populations have the same normal distribution, because it is assumed that the populations are normal and that they have equal variances.\r\n\r\n<hr \/>\r\n\r\n<h3>F-Ratio or F Statistic<\/h3>\r\n[latex]\\displaystyle{F}=\\frac{{{M}{S}_{{\\text{between}}}}}{{{M}{S}_{{\\text{within}}}}}[\/latex]\r\n\r\nIf <em>MS<\/em><sub>between<\/sub> and <em>MS<\/em><sub>within<\/sub> estimate the same value (as they should when <em>H<\/em><sub>0<\/sub> is true), then the <em>F<\/em>-ratio should be approximately equal to one; only sampling error would cause it to vary from one. As it turns out, <em>MS<\/em><sub>between<\/sub> consists of the population variance plus a variance produced by the differences between the samples. <em>MS<\/em><sub>within<\/sub> is an estimate of the population variance. 
Since variances are always positive, if the null hypothesis is false, <em>MS<\/em><sub>between<\/sub> will generally be larger than <em>MS<\/em><sub>within<\/sub>. Then the <em>F<\/em>-ratio will be larger than one. However, if the population effect is small, it is not unlikely that <em>MS<\/em><sub>within<\/sub> will be larger in a given sample.\r\n\r\nThe foregoing calculations were done with groups of different sizes. If the groups are the same size, the calculations simplify somewhat and the <em>F<\/em>-ratio can be written as:\r\n<h4>F-Ratio Formula when the groups are the same size<\/h4>\r\n[latex]\\displaystyle{F}=\\frac{{{n}\\cdot{{s}_{\\overline{{x}}}^{{ {2}}}}}}{{{{s}_{{\\text{pooled}}}^{{2}}}}}[\/latex]\r\n\r\nwhere:\r\n<ul>\r\n \t<li><em>n<\/em> = the common size of each sample (group)<\/li>\r\n \t<li><em>df<\/em><sub>numerator<\/sub> = <em>k<\/em> \u2013 1<\/li>\r\n \t<li><em>df<\/em><sub>denominator<\/sub> = <em>n<\/em> \u2013 <em>k<\/em> (here <em>n<\/em> is the total number of observations)<\/li>\r\n \t<li><em>s<\/em><sup>2<\/sup><sub>pooled<\/sub> = the mean of the sample variances (pooled variance)<\/li>\r\n \t<li>[latex]\\displaystyle{{s}_{\\overline{{x}}}^{{ {2}}}}[\/latex] = the variance of the sample means<\/li>\r\n<\/ul>\r\nData are typically put into a table for easy viewing. 
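The equal-size formula above can be checked against the general mean-square definition. The sketch below (hypothetical made-up groups, using only Python's standard statistics module; not part of the original lesson) computes the F-ratio both ways and confirms they agree.

```python
# Sketch: for k groups of a common size n, F = n * s2_xbar / s2_pooled
# should equal MS_between / MS_within built from the sums of squares.
# The three groups below are hypothetical illustration data.
from statistics import mean, variance

groups = [[5, 4.5, 4, 3.5], [3.5, 7, 4.5, 6], [8, 4, 3.5, 5]]
n = len(groups[0])   # common group size
k = len(groups)      # number of groups

# Shortcut form (valid only when all groups have the same size)
s2_xbar = variance([mean(g) for g in groups])    # variance of the sample means
s2_pooled = mean([variance(g) for g in groups])  # mean of the sample variances
F_shortcut = n * s2_xbar / s2_pooled

# General form: MS_between / MS_within from the sum-of-squares formulas
all_vals = [x for g in groups for x in g]
N = len(all_vals)                                # total number of observations
ss_total = sum(x**2 for x in all_vals) - sum(all_vals)**2 / N
ss_between = sum(sum(g)**2 / len(g) for g in groups) - sum(all_vals)**2 / N
ss_within = ss_total - ss_between
F_general = (ss_between / (k - 1)) / (ss_within / (N - k))

assert abs(F_shortcut - F_general) < 1e-9
print(round(F_general, 4))  # → 0.5166
```

The shortcut works because, for equal group sizes, SS<sub>between</sub> equals n(k − 1) times the variance of the group means, and MS<sub>within</sub> reduces to the plain average of the group variances.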
One-Way ANOVA results are often displayed in this manner by computer software.\r\n<table>\r\n<thead>\r\n<tr>\r\n<th>Source of Variation<\/th>\r\n<th>Sum of Squares (<em>SS<\/em>)<\/th>\r\n<th>Degrees of Freedom (<em>df<\/em>)<\/th>\r\n<th>Mean Square (<em>MS<\/em>)<\/th>\r\n<th><em>F<\/em><\/th>\r\n<\/tr>\r\n<\/thead>\r\n<tbody>\r\n<tr>\r\n<td>Factor (Between)<\/td>\r\n<td><em>SS<\/em>(Factor)<\/td>\r\n<td><em>k<\/em> \u2013 1<\/td>\r\n<td><em>MS<\/em>(Factor) = <em>SS<\/em>(Factor)\/(<em>k<\/em> \u2013 1)<\/td>\r\n<td><em>F<\/em> = <em>MS<\/em>(Factor)\/<em>MS<\/em>(Error)<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>Error (Within)<\/td>\r\n<td><em>SS<\/em>(Error)<\/td>\r\n<td><em>n<\/em> \u2013 <em>k<\/em><\/td>\r\n<td><em>MS<\/em>(Error) = <em>SS<\/em>(Error)\/(<em>n<\/em> \u2013 <em>k<\/em>)<\/td>\r\n<td><\/td>\r\n<\/tr>\r\n<tr>\r\n<td>Total<\/td>\r\n<td><em>SS<\/em>(Total)<\/td>\r\n<td><em>n<\/em> \u2013 1<\/td>\r\n<td><\/td>\r\n<td><\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n<div class=\"textbox exercises\">\r\n<h3>Example<\/h3>\r\nThree different diet plans are to be tested for mean weight loss. The entries in the table are the weight losses for the different plans. The one-way ANOVA results are shown in the table below.\r\n<table>\r\n<thead>\r\n<tr>\r\n<th>Plan 1: <em>n<\/em><sub>1<\/sub> = 4<\/th>\r\n<th>Plan 2: <em>n<\/em><sub>2<\/sub> = 3<\/th>\r\n<th>Plan 3: <em>n<\/em><sub>3<\/sub> = 3<\/th>\r\n<\/tr>\r\n<\/thead>\r\n<tbody>\r\n<tr>\r\n<td>5<\/td>\r\n<td>3.5<\/td>\r\n<td>8<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>4.5<\/td>\r\n<td>7<\/td>\r\n<td>4<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>4<\/td>\r\n<td><\/td>\r\n<td>3.5<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>3<\/td>\r\n<td>4.5<\/td>\r\n<td><\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\nThe group sums are <em>s<\/em><sub>1<\/sub> = 16.5, <em>s<\/em><sub>2<\/sub> = 15, <em>s<\/em><sub>3<\/sub> = 15.5\r\n\r\nFollowing are the calculations needed to fill in the one-way ANOVA table. 
The table is used to conduct a hypothesis test.\r\n\r\n[latex]\\displaystyle{{S}{S}}_{{\\text{between}}}=\\sum{\\left[\\frac{{{({s}_{j})}^{2}}}{{{n}_{j}}}\\right]}-\\frac{{(\\sum{{s}_{j})}^{2}}}{{n}}[\/latex]\r\n\r\n[latex]\\displaystyle=\\frac{{{{s}_{1}}^{2}}}{{4}}+\\frac{{{{s}_{2}}^{2}}}{{3}}+\\frac{{{{s}_{3}}^{2}}}{{3}}-\\frac{{{({s}_{1}+{s}_{2}+{s}_{3})}^{2}}}{{10}}[\/latex]\r\n\r\nwhere\r\n\r\n<em>n<\/em><sub>1<\/sub> = 4, <em>n<\/em><sub>2<\/sub> = 3, <em>n<\/em><sub>3<\/sub> = 3 and <em>n<\/em> = <em>n<\/em><sub>1<\/sub> + <em>n<\/em><sub>2<\/sub> + <em>n<\/em><sub>3<\/sub> = 10\r\n\r\n[latex]\\displaystyle=\\frac{{({16.5})^{2}}}{{4}}+\\frac{{({15})^{2}}}{{3}}+\\frac{{ ({15.5})^{2}}}{{3}}-\\frac{{ {({16.5}+{15}+{15.5})}^{2}}}{{10}}[\/latex]\r\n\r\n[latex]\\displaystyle{{S}{S}}_{{\\text{between}}}={2.2458}[\/latex]\r\n\r\n[latex]\\displaystyle{{S}{S}}_{{\\text{total}}}=\\sum{x}^{2}-\\frac{{{(\\sum{x})}^{2}}}{{n}}[\/latex]\r\n\r\n[latex]\\displaystyle=\\left({5}^{2}+{4.5}^{2}+{4}^{2}+{3}^{2}+{3.5}^{2}+{7}^{2}+{4.5}^{2}+{8}^{2}+{4}^{2}+{3.5}^{2}\\right)[\/latex]\r\n\r\n[latex]\\displaystyle{-}\\frac{{{\\left({5}+{4.5}+{4}+{3}+{3.5}+{7}+{4.5}+{8}+{4}+{3.5}\\right)}^{2}}}{{10}}[\/latex]\r\n\r\n[latex]\\displaystyle={244}-\\frac{{{47}^{2}}}{{10}}={244}-{220.9}={23.1}[\/latex]\r\n\r\n[latex]\\displaystyle{{S}{S}}_{{\\text{within}}}={{S}{S}}_{{\\text{total}}}-{{S}{S}}_{{\\text{between}}}={23.1}-{2.2458}={20.8542}[\/latex]\r\n\r\n<\/div>\r\n<h4>Using a Calculator<\/h4>\r\nOne-Way ANOVA Table: The formulas for <em>SS<\/em>(Total), <em>SS<\/em>(Factor) = <em>SS<\/em>(Between) and <em>SS<\/em>(Error) = <em>SS<\/em>(Within) are as shown previously.\r\n\r\nThe same information is provided by the TI calculator hypothesis test function ANOVA in STAT TESTS (syntax is ANOVA(L1, L2, L3) where L1, L2, L3 have the data from Plan 1, Plan 2, Plan 3 respectively).\r\n<table>\r\n<thead>\r\n<tr>\r\n<th>Source of Variation<\/th>\r\n<th>Sum of Squares (<em>SS<\/em>)<\/th>\r\n<th>Degrees of Freedom (<em>df<\/em>)<\/th>\r\n<th>Mean Square (<em>MS<\/em>)<\/th>\r\n<th><em>F<\/em><\/th>\r\n<\/tr>\r\n<\/thead>\r\n<tbody>\r\n<tr>\r\n<td>Factor (Between)<\/td>\r\n<td><em>SS<\/em>(Factor) = <em>SS<\/em>(Between) = 
2.2458<\/td>\r\n<td><em>k<\/em> \u2013 1 = 3 groups \u2013 1 = 2<\/td>\r\n<td><em>MS<\/em>(Factor) = <em>SS<\/em>(Factor)\/(<em>k<\/em> \u2013 1) = 2.2458\/2 = 1.1229<\/td>\r\n<td><em>F<\/em> = <em>MS<\/em>(Factor)\/<em>MS<\/em>(Error) = 1.1229\/2.9792 = 0.3769<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>Error (Within)<\/td>\r\n<td><em>SS<\/em>(Error) = <em>SS<\/em>(Within) = 20.8542<\/td>\r\n<td><em>n<\/em> \u2013 <em>k<\/em> = 10 total data \u2013 3 groups = 7<\/td>\r\n<td><em>MS<\/em>(Error) = <em>SS<\/em>(Error)\/(<em>n<\/em> \u2013 <em>k<\/em>) = 20.8542\/7 = 2.9792<\/td>\r\n<td><\/td>\r\n<\/tr>\r\n<tr>\r\n<td>Total<\/td>\r\n<td><em>SS<\/em>(Total) = 2.2458 + 20.8542 = 23.1<\/td>\r\n<td><em>n<\/em> \u2013 1 = 10 total data \u2013 1 = 9<\/td>\r\n<td><\/td>\r\n<td><\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n\r\n<hr \/>\r\n\r\n<div class=\"textbox key-takeaways\">\r\n<h3>Try it<\/h3>\r\nAs part of an experiment to see how different types of soil cover would affect slicing tomato production, Marist College students grew tomato plants under different soil cover conditions. Groups of three plants each had one of the following treatments:\r\n<ul>\r\n \t<li>bare soil<\/li>\r\n \t<li>a commercial ground cover<\/li>\r\n \t<li>black plastic<\/li>\r\n \t<li>straw<\/li>\r\n \t<li>compost<\/li>\r\n<\/ul>\r\nAll plants grew under the same conditions and were the same variety. 
Students recorded the weight (in grams) of tomatoes produced by each of the\u00a0<em>n<\/em> = 15 plants:\r\n<table>\r\n<thead>\r\n<tr>\r\n<th>Bare: <em>n<\/em><sub>1<\/sub> = 3<\/th>\r\n<th>Ground Cover: <em>n<\/em><sub>2<\/sub> = 3<\/th>\r\n<th>Plastic: <em>n<\/em><sub>3<\/sub> = 3<\/th>\r\n<th>Straw: <em>n<\/em><sub>4<\/sub> = 3<\/th>\r\n<th>Compost: <em>n<\/em><sub>5<\/sub> = 3<\/th>\r\n<\/tr>\r\n<\/thead>\r\n<tbody>\r\n<tr>\r\n<td>2,625<\/td>\r\n<td>5,348<\/td>\r\n<td>6,583<\/td>\r\n<td>7,285<\/td>\r\n<td>6,277<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>2,997<\/td>\r\n<td>5,682<\/td>\r\n<td>8,560<\/td>\r\n<td>6,897<\/td>\r\n<td>7,818<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>4,915<\/td>\r\n<td>5,482<\/td>\r\n<td>3,830<\/td>\r\n<td>9,230<\/td>\r\n<td>8,677<\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\nCreate the one-way ANOVA table.\r\n\r\nEnter the data into lists L1, L2, L3, L4 and L5. Press STAT and arrow over to TESTS. Arrow down to ANOVA. Press ENTER and enter L1, L2, L3, L4, L5. Press ENTER. The table below was filled in with the results from the calculator.\r\n\r\n<\/div>\r\nOne-Way ANOVA table:\r\n<table>\r\n<thead>\r\n<tr>\r\n<th>Source of Variation<\/th>\r\n<th>Sum of Squares (<em>SS<\/em>)<\/th>\r\n<th>Degrees of Freedom (<em>df<\/em>)<\/th>\r\n<th>Mean Square (<em>MS<\/em>)<\/th>\r\n<th><em>F<\/em><\/th>\r\n<\/tr>\r\n<\/thead>\r\n<tbody>\r\n<tr>\r\n<td>Factor (Between)<\/td>\r\n<td>36,648,561<\/td>\r\n<td>5 \u2013 1 = 4<\/td>\r\n<td>[latex]\\displaystyle\\frac{{{36},{648},{561}}}{{4}}={9},{162},{140}[\/latex]<\/td>\r\n<td>[latex]\\displaystyle\\frac{{{9},{162},{140}}}{{{2},{044},{672.6}}}={4.4810}[\/latex]<\/td>\r\n<\/tr>\r\n<tr>\r\n<td>Error (Within)<\/td>\r\n<td>20,446,726<\/td>\r\n<td>15 \u2013 5 = 10<\/td>\r\n<td>[latex]\\displaystyle\\frac{{{20},{446},{726}}}{{10}}={2},{044},{672.6}[\/latex]<\/td>\r\n<td><\/td>\r\n<\/tr>\r\n<tr>\r\n<td>Total<\/td>\r\n<td>57,095,287<\/td>\r\n<td>15 \u2013 1 = 14<\/td>\r\n<td><\/td>\r\n<td><\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\nThe one-way ANOVA hypothesis test is always right-tailed because 
larger <em>F<\/em>-values lie far out in the right tail of the <em>F<\/em>-distribution curve and tend to make us reject <em>H<sub data-redactor-tag=\"sub\">0<\/sub><\/em>.\r\n<h2>Notation<\/h2>\r\nThe notation for the\u00a0<em>F<\/em> distribution is <em>F<\/em> ~ <em>F<\/em><sub><em data-redactor-tag=\"em\">df<\/em>(<em>num<\/em>),<em>df<\/em>(<em>denom<\/em>)<\/sub>\r\n\r\nwhere\u00a0<em>df<\/em>(<em>num<\/em>) = <em>df<\/em><sub>between<\/sub> and <em>df<\/em>(<em>denom<\/em>) = <em>df<\/em><sub>within<\/sub>\r\n\r\nThe mean for the\u00a0<em>F<\/em> distribution is [latex]\\displaystyle\\mu=\\frac{{{d}{f}{(\\text{denom})}}}{{{d}{f}{(\\text{denom})}-{2}}}[\/latex] (for <em>df<\/em>(<em>denom<\/em>) > 2)\r\n<h2>References<\/h2>\r\nTomato Data, Marist College School of Science (unpublished student research)\r\n<h2>Concept Review<\/h2>\r\nAnalysis of variance compares the means of a response variable for several groups. ANOVA compares the variation between the group means to the variation within the groups. The ratio of these two (variation between divided by variation within) is the <em>F<\/em> statistic from an <em>F<\/em> distribution with (number of groups \u2013 1) as the numerator degrees of freedom and (number of observations \u2013 number of groups) as the denominator degrees of freedom. 
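As a numerical check of these definitions, the entries in the tomato ANOVA table can be reproduced directly from the raw data in the Try It table. This Python sketch (not part of the original lesson) rebuilds the sums of squares and the F statistic:

```python
# Rebuild the tomato one-way ANOVA table entries from the raw data, using
# SS_between = sum(s_j^2 / n_j) - (sum s_j)^2 / n  and
# SS_total   = sum(x^2)        - (sum x)^2  / n   from the section above.
groups = {
    "bare":    [2625, 2997, 4915],
    "cover":   [5348, 5682, 5482],
    "plastic": [6583, 8560, 3830],
    "straw":   [7285, 6897, 9230],
    "compost": [6277, 7818, 8677],
}
all_vals = [x for g in groups.values() for x in g]
n, k = len(all_vals), len(groups)        # n = 15 observations, k = 5 groups

ss_total = sum(x**2 for x in all_vals) - sum(all_vals)**2 / n
ss_between = sum(sum(g)**2 / len(g) for g in groups.values()) - sum(all_vals)**2 / n
ss_within = ss_total - ss_between

ms_between = ss_between / (k - 1)        # df(num) = k - 1 = 4
ms_within = ss_within / (n - k)          # df(denom) = n - k = 10
F = ms_between / ms_within

print(round(ss_between), round(ss_within), round(F, 4))  # → 36648561 20446726 4.481
```

The printed values match the table filled in by the calculator (36,648,561; 20,446,726; F ≈ 4.4810).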
These statistics are summarized in the ANOVA table.\r\n<h2>Formula Review<\/h2>\r\n[latex]\\displaystyle{S}{S}_{{\\text{between}}}=\\sum{[\\frac{{({s}{j})}^{{2}}}{{n}_{{j}}}]}-\\frac{{(\\sum{s}_{{j}})}^{{2}}}{{n}}[\/latex]\r\n\r\n<em>SS<\/em><sub>total<\/sub> = [latex]\\displaystyle\\sum{{x}^{{2}}}-\\frac{{\\sum{x}^{{2}}}}{{n}}[\/latex]\r\n\r\n[latex]\\displaystyle{S}{S}_{{\\text{within}}}={S}{S}_{{\\text{total}}}-{S}{S}_{{\\text{between}}}[\/latex]\r\n\r\ndf<sub>between<\/sub> = df(num) = k \u2013 1\r\n\r\n<em>df<\/em><sub>within<\/sub> = <em>df(denom)<\/em> = <em>n<\/em> \u2013 <em>k<\/em>\r\n\r\n[latex]\\displaystyle{M}{S}_{{\\text{between}}}=\\frac{{{S}{S}_{{\\text{between}}}}}{{{d}{f}_{{\\text{between}}}}}[\/latex]\r\n\r\n[latex]\\displaystyle{M}{S}_{{\\text{within}}}=\\frac{{{S}{S}_{{\\text{within}}}}}{{{d}{f}_{{\\text{within}}}}}[\/latex]\r\n\r\n[latex]\\displaystyle{F}=\\frac{{{M}{S}_{{\\text{between}}}}}{{{M}{S}_{{\\text{within}}}}}[\/latex]\r\n\r\n<em>F<\/em> ratio when the groups are the same size: [latex]\\displaystyle{F}=\\frac{{{n}{{s}_{\\overline{{x}}}^{{ {2}}}}}}{{{s}_{{\\text{pooled}}}^{{2}}}}[\/latex]\r\n\r\nMean of the\u00a0<em>F<\/em> distribution:[latex]\\displaystyle\\mu=\\frac{{{d}{f}{(\\text{num})}}}{{{d}{f}{(\\text{denom})}}}-{1}[\/latex]\r\n\r\nwhere:\r\n<em>k<\/em> = the number of groups <em>n<sub data-redactor-tag=\"sub\">j<\/sub><\/em> = the size of the <em>jth<\/em> group <em>s<sub data-redactor-tag=\"sub\">j<\/sub><\/em> = the sum of the values in the <em>jth<\/em> group <em>n<\/em> = the total number of all values (observations) combined <em>x <\/em>= one value (one observation) from the data [latex]\\displaystyle{{s}_{\\overline{{x}}}^{{ {2}}}}[\/latex] = the mean of the sample variances (pooled variance)","rendered":"<div class=\"textbox learning-objectives\">\n<h3>Learning Outcomes<\/h3>\n<section>\n<ul id=\"fs-idp124304720\">\n<li>Interpret the <em data-effect=\"italics\">F<\/em> probability distribution as the number of groups 
and the sample size change<\/li>\n<\/ul>\n<\/section>\n<\/div>\n<p>The distribution used for the hypothesis test is a new one. It is called the\u00a0<em>F <\/em>distribution, named after Sir Ronald Fisher, an English statistician. The <em>F<\/em> statistic is a ratio (a fraction). There are two sets of degrees of freedom; one for the numerator and one for the denominator.<\/p>\n<p>For example, if \u00a0<em>F<\/em> follows an <em>F<\/em> distribution and the number of degrees of freedom for the numerator is four, and the number of degrees of freedom for the denominator is ten, then <em>F<\/em> ~ <em>F<sub data-redactor-tag=\"sub\">4,10<\/sub><\/em>.<\/p>\n<hr \/>\n<h4>Note<\/h4>\n<p>The\u00a0<em>F<\/em> distribution is derived from the Student&#8217;s t-distribution. The values of the <em>F<\/em> distribution are squares of the corresponding values of the <em>t<\/em>-distribution. One-Way ANOVA expands the <em>t<\/em>-test for comparing more than two groups. The scope of that derivation is beyond the level of this course.<\/p>\n<hr \/>\n<p>To calculate the\u00a0<em>F<\/em> ratio, two estimates of the variance are made.<\/p>\n<ol>\n<li><strong>Variance between samples:<\/strong> An estimate of <em>\u03c3<\/em><sup>2<\/sup> that is the variance of the sample means multiplied by <em>n<\/em> (when the sample sizes are the same.). If the samples are different sizes, the variance between samples is weighted to account for the different sample sizes. The variance is also called <strong>variation due to treatment or explained variation<\/strong>.<\/li>\n<li><strong>Variance within samples:<\/strong> An estimate of <em>\u03c3<\/em><sup>2<\/sup> that is the average of the sample variances (also known as a pooled variance). When the sample sizes are different, the variance within samples is weighted. 
The variance is also called the <strong>variation due to error or unexplained variation<\/strong>.<\/li>\n<\/ol>\n<ul>\n<li><em>SS<\/em><sub>between<\/sub> = the sum of squares that represents the variation among the different samples<\/li>\n<li><em>SS<\/em><sub>within<\/sub> = the sum of squares that represents the variation within samples that is due to chance.<\/li>\n<\/ul>\n<p>To find a &#8220;sum of squares&#8221; means to add together squared quantities that, in some cases, may be weighted.<\/p>\n<p><em>MS<\/em> means &#8220;<strong>mean square<\/strong>.&#8221; <em>MS<\/em><sub>between<\/sub> is the variance between groups, and <em>MS<\/em><sub>within<\/sub> is the variance within groups.<\/p>\n<h3>Calculation of Sum of Squares and Mean Square<\/h3>\n<p><em>k<\/em> = the number of different groups<\/p>\n<p><em>nj<\/em> = the size of the <em>jth<\/em> group<\/p>\n<p><em>sj<\/em> = the sum of the values in the <em>jth<\/em> group<\/p>\n<p><em>n<\/em> = total number of all the values combined (total sample size: \u2211<em>n<sub data-redactor-tag=\"sub\">j<\/sub><\/em>)<\/p>\n<p><em>x<\/em> = one value: \u2211<em>x<\/em> = \u2211<em>s<sub data-redactor-tag=\"sub\">j<\/sub><\/em><\/p>\n<p>Sum of squares of all values from every group combined: \u2211<br \/>\n<em>x<\/em><sup>2<\/sup><\/p>\n<p>Between group variability:<br \/>\n<em>SS<\/em><sub>total<\/sub> = [latex]\\displaystyle\\sum{{x}^{{2}}}-\\frac{{\\sum{x}^{{2}}}}{{n}}[\/latex]<\/p>\n<p>Total sum of squares:<br \/>\n[latex]\\displaystyle\\sum{x}^{{2}}-\\frac{{{(\\sum{x})}^{{2}}}}{{n}}[\/latex]<\/p>\n<p>Explained variation: sum of squares representing variation among the different samples:<br \/>\n[latex]\\displaystyle{S}{S}_{{\\text{between}}}=\\sum{[\\frac{{({s}{j})}^{{2}}}{{n}_{{j}}}]}-\\frac{{(\\sum{s}_{{j}})}^{{2}}}{{n}}[\/latex]<\/p>\n<p>Unexplained variation: sum of squares representing variation within samples due to chance:<br 
\/>\n[latex]\\displaystyle{S}{S}_{{\\text{within}}}={S}{S}_{{\\text{total}}}-{S}{S}_{{\\text{between}}}[\/latex]<\/p>\n<p><em>df<\/em>&#8216;s for different groups (<em>df<\/em>&#8216;s for the numerator): <em>df<\/em> = <em>k<\/em> \u2013 1<\/p>\n<p>Equation for errors within samples (<em>df<\/em>&#8216;s for the denominator):<\/p>\n<p style=\"text-align: center;\"><em>df<\/em><sub>within<\/sub> = <em>n<\/em> \u2013 <em>k<\/em><\/p>\n<p>Mean square (variance estimate) explained by the different groups:<br \/>\n[latex]\\displaystyle{M}{S}_{{\\text{between}}}=\\frac{{{S}{S}_{{\\text{between}}}}}{{{d}{f}_{{\\text{between}}}}}[\/latex]<\/p>\n<p>Mean square (variance estimate) that is due to chance (unexplained):<br \/>\n[latex]\\displaystyle{M}{S}_{{\\text{within}}}=\\frac{{{S}{S}_{{\\text{within}}}}}{{{d}{f}_{{\\text{within}}}}}[\/latex]<\/p>\n<p><em>MS<\/em><sub>between<\/sub> and <em>MS<\/em><sub>within<\/sub> can be written as follows:<\/p>\n<ul>\n<li>[latex]\\displaystyle{M}{S}_{{\\text{between}}}=\\frac{{{S}{S}_{{\\text{between}}}}}{{{d}{f}_{{\\text{between}}}}}=\\frac{{{S}{S}_{{\\text{between}}}}}{{{k}-{1}}}[\/latex]<\/li>\n<li>[latex]\\displaystyle{M}{S}_{{\\text{within}}}=\\frac{{{S}{S}_{{\\text{within}}}}}{{{d}{f}_{{\\text{within}}}}}=\\frac{{{S}{S}_{{\\text{within}}}}}{{{n}-{k}}}[\/latex]<\/li>\n<\/ul>\n<p>The one-way ANOVA test depends on the fact that<br \/>\n<em>MS<\/em><sub>between<\/sub> can be influenced by population differences among means of the several groups. Since <em>MS<\/em><sub>within<\/sub> compares values of each group to its own group mean, the fact that group means might be different does not affect <em>MS<\/em><sub>within<\/sub>.<\/p>\n<p>The null hypothesis says that all groups are samples from populations having the same normal distribution. The alternate hypothesis says that at least two of the sample groups come from populations with different normal distributions. 
If the null hypothesis is true,<br \/>\n<em>MS<\/em><sub>between<\/sub> and <em>MS<\/em><sub>within<\/sub> should both estimate the same value.<\/p>\n<hr \/>\n<h4>Note<\/h4>\n<p>The null hypothesis says that all the group population means are equal. The hypothesis of equal means implies that the populations have the same normal distribution, because it is assumed that the populations are normal and that they have equal variances.<\/p>\n<hr \/>\n<h3>F-Ratio or F Statistic<\/h3>\n<p>[latex]\\displaystyle{F}=\\frac{{{M}{S}_{{\\text{between}}}}}{{{M}{S}_{{\\text{within}}}}}[\/latex]<\/p>\n<p>If<br \/>\n<em>MS<\/em><sub>between<\/sub> and <em>MS<\/em><sub>within<\/sub> estimate the same value (following the belief that <em>H0<\/em> is true), then the <em>F<\/em>-ratio should be approximately equal to one. Mostly, just sampling errors would contribute to variations away from one. As it turns out, <em>MS<\/em><sub>between<\/sub> consists of the population variance plus a variance produced from the differences between the samples. <em>MS<\/em><sub>within<\/sub> is an estimate of the population variance. Since variances are always positive, if the null hypothesis is false, <em>MS<\/em><sub>between<\/sub> will generally be larger than <em>MS<\/em><sub>within<\/sub>.Then the <em>F<\/em>-ratio will be larger than one. However, if the population effect is small, it is not unlikely that <em>MS<\/em><sub>within<\/sub> will be larger in a given sample.<\/p>\n<p>The foregoing calculations were done with groups of different sizes. 
If the groups are the same size, the calculations simplify somewhat and the<br \/>\n<em>F<\/em>-ratio can be written as:<\/p>\n<h4>F-Ratio Formula when the groups are the same size<\/h4>\n<p>[latex]\\displaystyle{F}=\\frac{{{n}\\cdot{{s}_{\\overline{{x}}}^{{ {2}}}}}}{{{{s}_{{\\text{pooled}}}^{{2}}}}}[\/latex]<\/p>\n<p>where &#8230;<\/p>\n<ul>\n<li><em>n<\/em> = the sample size<\/li>\n<li><em>df<\/em><sub>numerator<\/sub> = <em>k<\/em> \u2013 1<\/li>\n<li><em>df<\/em><sub>denominator<\/sub> = <em>n<\/em> \u2013 <em>k<\/em><\/li>\n<li><em>s<\/em><sup>2<\/sup><sub>pooled<\/sub> = the mean of the sample variances (pooled variance)<\/li>\n<li>[latex]\\displaystyle{{s}_{\\overline{{x}}}^{{ {2}}}}[\/latex] = the variance of the sample means<\/li>\n<\/ul>\n<p>Data are typically put into a table for easy viewing. One-Way ANOVA results are often displayed in this manner by computer software.<\/p>\n<p>&nbsp;<\/p>\n<table>\n<thead>\n<tr>\n<th>Source of Variation<\/th>\n<th>Sum of Squares (<br \/>\n<em>SS<\/em>)<\/th>\n<th>Degrees of Freedom (<br \/>\n<em>df<\/em>)<\/th>\n<th>Mean Square (<br \/>\n<em>MS<\/em>)<\/th>\n<th><em>F<\/em><\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>Factor(Between)<\/td>\n<td><em>SS<\/em>(Factor)<\/td>\n<td><em>k<\/em> \u2013 1<\/td>\n<td><em>MS<\/em>(Factor) =<em>SS<\/em>(Factor)\/(<em>k<\/em> \u2013 1)<\/td>\n<td><em>F<\/em> =<em>MS<\/em>(Factor)\/<em>MS<\/em>(Error)<\/td>\n<\/tr>\n<tr>\n<td>Error(Within)<\/td>\n<td><em>SS<\/em>(Error)<\/td>\n<td><em>n<\/em> \u2013 <em>k<\/em><\/td>\n<td><em>MS<\/em>(Error) =<em>SS<\/em>(Error)\/(<em>n<\/em> \u2013<em>k<\/em>)<\/td>\n<\/tr>\n<tr>\n<td>Total<\/td>\n<td><em>SS<\/em>(Total)<\/td>\n<td><em>n<\/em> \u2013 1<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<div class=\"textbox exercises\">\n<h3>Example<\/h3>\n<p>Three different diet plans are to be tested for mean weight loss. The entries in the table are the weight losses for the different plans. 
The one-way ANOVA results are shown in in the table here.<\/p>\n<p>3.5<\/p>\n<table>\n<thead>\n<tr>\n<th>Plan 1:<br \/>\n<em>n<\/em><sub>1<\/sub> = 4<\/th>\n<th>Plan 2:<br \/>\n<em>n<\/em><sub>2<\/sub> = 3<\/th>\n<th>Plan 3:<br \/>\n<em>n<\/em><sub>3<\/sub> = 3<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>5<\/td>\n<td>3.5<\/td>\n<td>8<\/td>\n<\/tr>\n<tr>\n<td>4.5<\/td>\n<td>7<\/td>\n<td>4<\/td>\n<\/tr>\n<tr>\n<td>4<\/td>\n<\/tr>\n<tr>\n<td>3<\/td>\n<td>4.5<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p><em>s<\/em><sub>1<\/sub> = 16.5, <em>s<\/em><sub>2<\/sub> =15, <em>s<\/em><sub>3<\/sub> = 15.7<\/p>\n<p>Following are the calculations needed to fill in the one-way ANOVA table. The table is used to conduct a hypothesis test.<\/p>\n<p>[latex]\\displaystyle{{S}{S}}_{{\\text{between}}}=\\sum{\\left[\\frac{{{({s}_{j})}^{2}}}{{{n}_{j}}}\\right]}-\\frac{{(\\sum{{s}_{j})}^{2}}}{{n}}[\/latex]<\/p>\n<p>[latex]\\displaystyle=\\frac{{{{s}_{1}}^{2}}}{{4}}+\\frac{{{{s}_{2}}^{2}}}{{3}}+\\frac{{{{s}_{3}}^{2}}}{{3}}[\/latex]<\/p>\n<p>where<\/p>\n<p><em>n<\/em><sub>1<\/sub> = 4, <em>n<\/em><sub>2<\/sub> = 3, <em>n<\/em><sub>3<\/sub> = 3 and <em>n<\/em> = <em>n<\/em><sub>1<\/sub> + <em>n<\/em><sub>2<\/sub> + <em>n<\/em><sub>3<\/sub> = 10<\/p>\n<p>[latex]\\displaystyle=\\frac{{({16.5})^{2}}}{{4}}+\\frac{{({15})^{2}}}{{3}}+\\frac{{ ({5.5})^{2}}}{{3}}-\\frac{{ {({16.5}+{15}+{15.5})}^{2}}}{{10}}[\/latex]<\/p>\n<p>[latex]\\displaystyle{{S}{S}}_{{\\text{between}}}={2.2458}{S}_{{\\text{total}}}=\\sum{x}^{2}-\\frac{{{(\\sum{x})}^{2}}}{{n}}[\/latex]<\/p>\n<p>[latex]\\displaystyle=\\left({5}^{2}+{4.5}^{2}+{4}^{2}+{3}^{2}+{3.5}^{2}+{7}^{2}+{4.5}^{2}+{8}^{2}+{4}^{2}+{3.5}^{2}\\right)[\/latex]<\/p>\n<p>[latex]\\displaystyle{-}\\frac{{{\\left({5}+{4.5}+{4}+{3}+{3.5}+{7}+{4.5}+{8}+{4}+{3.5}\\right)}^{2}}}{{10}}[\/latex]<\/p>\n<p>[latex]\\displaystyle={244}-\\frac{{{47}^{2}}}{{10}}={244}-{220.9}[\/latex]<\/p>\n<\/div>\n<h4>Using a Calculator<\/h4>\n<p>One-Way ANOVA Table: The formulas for<br 
\/>\n<em>SS<\/em>(Total), <em>SS<\/em>(Factor) = <em>SS<\/em>(Between) and<em>SS<\/em>(Error) = <em>SS<\/em>(Within) as shown previously.<\/p>\n<p>The same information is provided by the TI calculator hypothesis test function ANOVA in STAT TESTS (syntax is ANOVA(L1, L2, L3) where L1, L2, L3 have the data from Plan 1, Plan 2, Plan 3 respectively).<\/p>\n<table>\n<thead>\n<tr>\n<th>Source of Variation<\/th>\n<th>Sum of Squares (<br \/>\n<em>SS<\/em>)<\/th>\n<th>Degrees of Freedom (<br \/>\n<em>df<\/em>)<\/th>\n<th>Mean Square (<br \/>\n<em>MS<\/em>)<\/th>\n<th><em>F<\/em><\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>Factor(Between)<\/td>\n<td><em>SS<\/em>(Factor)=<em>SS<\/em>(Between)= 2.2458<\/td>\n<td><em>k<\/em> \u2013 1= 3 groups \u2013 1= 2<\/td>\n<td><em>MS<\/em>(Factor)=<em>SS<\/em>(Factor)\/(<em>k<\/em>\u2013 1)= 2.2458\/2= 1.1229<\/td>\n<td><em>F<\/em> =<em>MS<\/em>(Factor)\/<em>MS<\/em>(Error)= 1.1229\/2.9792= 0.3769<\/td>\n<\/tr>\n<tr>\n<td>Error(Within)<\/td>\n<td><em>SS<\/em>(Error)= <em>SS<\/em>(Within)= 20.8542<\/td>\n<td><em>n<\/em> \u2013 <em>k<\/em>= 10 total data \u2013 3 groups= 7<\/td>\n<td><em>MS<\/em>(Error)=<em>SS<\/em>(Error)\/(<em>n<\/em>\u2013 <em>k<\/em>)= 20.8542\/7= 2.9792<\/td>\n<\/tr>\n<tr>\n<td>Total<\/td>\n<td><em>SS<\/em>(Total)= 2.2458 + 20.8542= 23.1<\/td>\n<td><em>n<\/em> \u2013 1= 10 total data \u2013 1= 9<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<hr \/>\n<div class=\"textbox key-takeaways\">\n<h3>Try it<\/h3>\n<p>As part of an experiment to see how different types of soil cover would affect slicing tomato production, Marist College students grew tomato plants under different soil cover conditions. Groups of three plants each had one of the following treatments<\/p>\n<ul>\n<li>bare soil<\/li>\n<li>a commercial ground cover<\/li>\n<li>black plastic<\/li>\n<li>straw<\/li>\n<li>compost<\/li>\n<\/ul>\n<p>All plants grew under the same conditions and were the same variety. 
Students recorded the weight (in grams) of tomatoes produced by each of the\u00a0<em>n<\/em> = 15 plants:<\/p>\n<table>\n<thead>\n<tr>\n<th>Bare:<br \/>\n<em>n<\/em><sub>1<\/sub> = 3<\/th>\n<th>Ground Cover:<br \/>\n<em>n<\/em><sub>2<\/sub> = 3<\/th>\n<th>Plastic:<br \/>\n<em>n<\/em><sub>3<\/sub> = 3<\/th>\n<th>Straw:<br \/>\n<em>n<\/em><sub>4<\/sub> = 3<\/th>\n<th>Compost:<br \/>\n<em>n<\/em><sub>5<\/sub> = 3<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>2,625<\/td>\n<td>5,348<\/td>\n<td>6,583<\/td>\n<td>7,285<\/td>\n<td>6,277<\/td>\n<\/tr>\n<tr>\n<td>2,997<\/td>\n<td>5,682<\/td>\n<td>8,560<\/td>\n<td>6,897<\/td>\n<td>7,818<\/td>\n<\/tr>\n<tr>\n<td>4,915<\/td>\n<td>5,482<\/td>\n<td>3,830<\/td>\n<td>9,230<\/td>\n<td>8,677<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>Create the one-way ANOVA table.<\/p>\n<p>Enter the data into lists L1, L2, L3, L4 and L5. Press STAT and arrow over to TESTS. Arrow down to ANOVA. Press ENTER and enter L1, L2, L3, L4, L5). Press ENTER. The table below was filled in with the results from the calculator.<\/p>\n<\/div>\n<p>One-Way ANOVA table:<\/p>\n<table>\n<thead>\n<tr>\n<th>Source of Variation<\/th>\n<th>Sum of Squares (<em>SS<\/em>)<\/th>\n<th>Degrees of Freedom (<em>df<\/em>)<\/th>\n<th>Mean Square (<em>MS<\/em>)<\/th>\n<th><em>F<\/em><\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>Factor (Between)<\/td>\n<td>36,648,561<\/td>\n<td>5 \u2013 1 = 4<\/td>\n<td>[latex]\displaystyle\frac{{{36},{648},{561}}}{{4}}={9},{162},{140}[\/latex]<\/td>\n<td>[latex]\displaystyle\frac{{{9},{162},{140}}}{{{2},{044},{672.6}}}={4.4810}[\/latex]<\/td>\n<\/tr>\n<tr>\n<td>Error (Within)<\/td>\n<td>20,446,726<\/td>\n<td>15 \u2013 5 = 10<\/td>\n<td>[latex]\displaystyle\frac{{{20},{446},{726}}}{{10}}={2},{044},{672.6}[\/latex]<\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>Total<\/td>\n<td>57,095,287<\/td>\n<td>15 \u2013 1 = 14<\/td>\n<td><\/td>\n<td><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>The one-way ANOVA hypothesis test is always right-tailed because larger <em>F<\/em>-values are way out in the right tail of the
<em>F<\/em>-distribution curve and tend to make us reject <em>H<sub data-redactor-tag=\"sub\">0<\/sub><\/em>.<\/p>\n<h2>Notation<\/h2>\n<p>The notation for the\u00a0<em>F<\/em> distribution is <em>F<\/em> ~ <em>F<\/em><sub><em data-redactor-tag=\"em\">df<\/em>(<em>num<\/em>),<em>df<\/em>(<em>denom<\/em>)<\/sub><\/p>\n<p>where\u00a0<em>df<\/em>(<em>num<\/em>) = <em>df<\/em><sub>between<\/sub> and <em>df<\/em>(<em>denom<\/em>) = <em>df<\/em><sub>within<\/sub><\/p>\n<p>The mean for the\u00a0<em>F<\/em> distribution is [latex]\displaystyle\mu=\frac{{{d}{f}{(\text{denom})}}}{{{d}{f}{(\text{denom})}-{2}}}[\/latex], provided <em>df<\/em>(<em>denom<\/em>) > 2.<\/p>\n<h2>References<\/h2>\n<p>Tomato Data, Marist College School of Science (unpublished student research)<\/p>\n<h2>Concept Review<\/h2>\n<p>Analysis of variance compares the means of a response variable for several groups. ANOVA compares the variation between the group means to the variation within the groups. The ratio of these two (between over within) is the <em>F<\/em> statistic from an <em>F<\/em> distribution with (number of groups \u2013 1) as the numerator degrees of freedom and (number of observations \u2013 number of groups) as the denominator degrees of freedom.
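<\/p>
<p>The whole computation can be run end to end with a minimal Python sketch (the function name <em>one_way_anova<\/em> is ours, not from the text), checked here against the tomato-cover exercise above:<\/p>

```python
# A minimal one-way ANOVA sketch for any number of groups,
# following the SS / df / MS / F formulas in this section.
def one_way_anova(groups):
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups)  # sum of all values
    ss_between = sum(sum(g) ** 2 / len(g) for g in groups) - grand ** 2 / n
    ss_total = sum(x ** 2 for g in groups for x in g) - grand ** 2 / n
    ss_within = ss_total - ss_between
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return {
        "df": (k - 1, n - k),
        "SS": (ss_between, ss_within),
        "MS": (ms_between, ms_within),
        "F": ms_between / ms_within,
    }

# Tomato weights (grams) from the Try It exercise:
tomato = one_way_anova([
    [2625, 2997, 4915],   # bare soil
    [5348, 5682, 5482],   # ground cover
    [6583, 8560, 3830],   # black plastic
    [7285, 6897, 9230],   # straw
    [6277, 7818, 8677],   # compost
])
print(tomato["df"], round(tomato["F"], 4))  # (4, 10) 4.481
```

<p>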
These statistics are summarized in the ANOVA table.<\/p>\n<h2>Formula Review<\/h2>\n<p>[latex]\displaystyle{S}{S}_{{\text{between}}}=\sum{\left[\frac{{({s}_{j})}^{{2}}}{{n}_{{j}}}\right]}-\frac{{(\sum{s}_{{j}})}^{{2}}}{{n}}[\/latex]<\/p>\n<p><em>SS<\/em><sub>total<\/sub> = [latex]\displaystyle\sum{{x}^{{2}}}-\frac{{(\sum{x})}^{{2}}}{{n}}[\/latex]<\/p>\n<p>[latex]\displaystyle{S}{S}_{{\text{within}}}={S}{S}_{{\text{total}}}-{S}{S}_{{\text{between}}}[\/latex]<\/p>\n<p><em>df<\/em><sub>between<\/sub> = <em>df<\/em>(<em>num<\/em>) = <em>k<\/em> \u2013 1<\/p>\n<p><em>df<\/em><sub>within<\/sub> = <em>df<\/em>(<em>denom<\/em>) = <em>n<\/em> \u2013 <em>k<\/em><\/p>\n<p>[latex]\displaystyle{M}{S}_{{\text{between}}}=\frac{{{S}{S}_{{\text{between}}}}}{{{d}{f}_{{\text{between}}}}}[\/latex]<\/p>\n<p>[latex]\displaystyle{M}{S}_{{\text{within}}}=\frac{{{S}{S}_{{\text{within}}}}}{{{d}{f}_{{\text{within}}}}}[\/latex]<\/p>\n<p>[latex]\displaystyle{F}=\frac{{{M}{S}_{{\text{between}}}}}{{{M}{S}_{{\text{within}}}}}[\/latex]<\/p>\n<p><em>F<\/em> ratio when the groups are the same size: [latex]\displaystyle{F}=\frac{{{n}{{s}_{\overline{{x}}}^{{ {2}}}}}}{{{s}_{{\text{pooled}}}^{{2}}}}[\/latex]<\/p>\n<p>Mean of the\u00a0<em>F<\/em> distribution: [latex]\displaystyle\mu=\frac{{{d}{f}{(\text{denom})}}}{{{d}{f}{(\text{denom})}-{2}}}[\/latex], provided <em>df<\/em>(<em>denom<\/em>) > 2<\/p>\n<p>where:<br \/>\n<em>k<\/em> = the number of groups<br \/>\n<em>n<sub data-redactor-tag=\"sub\">j<\/sub><\/em> = the size of the <em>j<\/em>th group<br \/>\n<em>s<sub data-redactor-tag=\"sub\">j<\/sub><\/em> = the sum of the values in the <em>j<\/em>th group<br \/>\n<em>n<\/em> = the total number of all values (observations) combined<br \/>\n<em>x<\/em> = one value (one observation) from the data<br \/>\n[latex]\displaystyle{{s}_{\overline{{x}}}^{{ {2}}}}[\/latex] = the variance of the sample means<br \/>\n[latex]\displaystyle{{s}_{{\text{pooled}}}^{{2}}}[\/latex] = the mean of the sample variances (pooled variance)<\/p>\n\n\t\t\t <section class=\"citations-section\" role=\"contentinfo\">\n\t\t\t <h3>Candela Citations<\/h3>\n\t\t\t\t\t <div>\n\t\t\t\t\t\t <div id=\"citation-list-740\">\n\t\t\t\t\t\t\t <div 
class=\"licensing\"><div class=\"license-attribution-dropdown-subheading\">CC licensed content, Shared previously<\/div><ul class=\"citation-list\"><li>OpenStax, Statistics, The F Distribution and the F-Ratio. <strong>Located at<\/strong>: <a target=\"_blank\" href=\"\"><\/a>. <strong>License<\/strong>: <em><a target=\"_blank\" rel=\"license\" href=\"https:\/\/creativecommons.org\/licenses\/by\/4.0\/\">CC BY: Attribution<\/a><\/em><\/li><li>Introductory Statistics . <strong>Authored by<\/strong>: Barbara Illowski, Susan Dean. <strong>Provided by<\/strong>: Open Stax. <strong>Located at<\/strong>: <a target=\"_blank\" href=\"http:\/\/cnx.org\/contents\/30189442-6998-4686-ac05-ed152b91b9de@17.44\">http:\/\/cnx.org\/contents\/30189442-6998-4686-ac05-ed152b91b9de@17.44<\/a>. <strong>License<\/strong>: <em><a target=\"_blank\" rel=\"license\" href=\"https:\/\/creativecommons.org\/licenses\/by\/4.0\/\">CC BY: Attribution<\/a><\/em>. <strong>License Terms<\/strong>: Download for free at http:\/\/cnx.org\/contents\/30189442-6998-4686-ac05-ed152b91b9de@17.44<\/li><\/ul><\/div>\n\t\t\t\t\t\t <\/div>\n\t\t\t\t\t <\/div>\n\t\t\t <\/section>","protected":false},"author":21,"menu_order":3,"template":"","meta":{"_candela_citation":"[{\"type\":\"cc\",\"description\":\"OpenStax, Statistics, The F Distribution and the F-Ratio\",\"author\":\"\",\"organization\":\"\",\"url\":\"Download for free at http:\/\/cnx.org\/contents\/30189442-6998-4686-ac05-ed152b91b9de@17.44\",\"project\":\"\",\"license\":\"cc-by\",\"license_terms\":\"\"},{\"type\":\"cc\",\"description\":\"Introductory Statistics \",\"author\":\"Barbara Illowski, Susan Dean\",\"organization\":\"Open Stax\",\"url\":\"http:\/\/cnx.org\/contents\/30189442-6998-4686-ac05-ed152b91b9de@17.44\",\"project\":\"\",\"license\":\"cc-by\",\"license_terms\":\"Download for free at 
http:\/\/cnx.org\/contents\/30189442-6998-4686-ac05-ed152b91b9de@17.44\"}]","CANDELA_OUTCOMES_GUID":"","pb_show_title":"on","pb_short_title":"","pb_subtitle":"","pb_authors":[],"pb_section_license":""},"chapter-type":[],"contributor":[],"license":[],"class_list":["post-740","chapter","type-chapter","status-publish","hentry"],"part":733,"_links":{"self":[{"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/pressbooks\/v2\/chapters\/740","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/pressbooks\/v2\/chapters"}],"about":[{"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/wp\/v2\/types\/chapter"}],"author":[{"embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/wp\/v2\/users\/21"}],"version-history":[{"count":6,"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/pressbooks\/v2\/chapters\/740\/revisions"}],"predecessor-version":[{"id":2043,"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/pressbooks\/v2\/chapters\/740\/revisions\/2043"}],"part":[{"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/pressbooks\/v2\/parts\/733"}],"metadata":[{"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/pressbooks\/v2\/chapters\/740\/metadata\/"}],"wp:attachment":[{"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/wp\/v2\/media?parent=740"}],"wp:term":[{"taxonomy":"chapter-type","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/pressbooks\/v2\/chapter-type?post=740"},{"taxonomy":"contributor","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/wp\/v2\/contributor?post=740"},{"taxonomy":"license","embeddable":true,"href":"https:\/\/courses.lumenlearning.com\/ntcc-introstats1\/wp-json\/wp\/v2\/license?post=740"}],"curies":[{"name":"wp","href":"https:\/\/api.
w.org\/{rel}","templated":true}]}}