{VERSION 4 0 "IBM INTEL NT" "4.0" }
{USTYLETAB {CSTYLE "Maple Input" -1 0 "Courier" 0 1 255 0 0 1 0 1 0 0
1 0 0 0 0 1 }{CSTYLE "2D Math" -1 2 "Times" 0 1 0 0 0 0 0 0 2 0 0 0 0
0 0 1 }{CSTYLE "2D Comment" 2 18 "" 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 }
{CSTYLE "" -1 256 "" 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 1 }{CSTYLE "" -1
257 "" 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 1 }{CSTYLE "" -1 258 "Geneva" 1
10 0 0 0 1 0 0 0 0 0 0 0 0 0 1 }{CSTYLE "" -1 259 "Geneva" 1 10 0 0 0
1 0 0 0 0 0 0 0 0 0 1 }{CSTYLE "" -1 260 "Geneva" 1 10 0 0 0 1 0 0 0
0 0 0 0 0 0 1 }{CSTYLE "" -1 261 "Geneva" 1 10 0 0 0 1 0 0 0 0 0 0 0
0 0 1 }{CSTYLE "" -1 262 "Geneva" 1 10 0 0 0 1 0 0 0 0 0 0 0 0 0 1 }
{CSTYLE "" -1 263 "Geneva" 1 10 0 0 0 1 0 0 0 0 0 0 0 0 0 1 }{CSTYLE "
" -1 264 "Geneva" 1 10 0 0 0 1 0 0 0 0 0 0 0 0 0 1 }{CSTYLE "" -1 265
"Geneva" 1 10 0 0 0 1 0 0 0 0 0 0 0 0 0 1 }{CSTYLE "" -1 266 "" 0 1 0
0 0 0 1 0 0 0 0 0 0 0 0 1 }{PSTYLE "Normal" -1 0 1 {CSTYLE "" -1 -1 "
" 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 }0 0 0 -1 -1 -1 0 0 0 0 0 0 -1 0 }
{PSTYLE "Heading 1" 0 3 1 {CSTYLE "" -1 -1 "" 1 18 0 0 0 0 0 1 0 0 0
0 0 0 0 1 }1 0 0 0 6 6 0 0 0 0 0 0 -1 0 }{PSTYLE "" 3 256 1 {CSTYLE "
" -1 -1 "" 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 }3 0 0 -1 -1 -1 0 0 0 0 0
0 -1 0 }{PSTYLE "" 0 257 1 {CSTYLE "" -1 -1 "" 0 1 0 0 0 0 0 0 0 0 0
0 0 0 0 1 }3 0 0 -1 -1 -1 0 0 0 0 0 0 -1 0 }{PSTYLE "" 0 258 1
{CSTYLE "" -1 -1 "" 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 }3 0 0 -1 -1 -1 0
0 0 0 0 0 -1 0 }{PSTYLE "" 0 259 1 {CSTYLE "" -1 -1 "" 0 1 0 0 0 0 0
0 0 0 0 0 0 0 0 1 }3 0 0 -1 -1 -1 0 0 0 0 0 0 -1 0 }{PSTYLE "" 0 260
1 {CSTYLE "" -1 -1 "" 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 }3 0 0 -1 -1 -1
0 0 0 0 0 0 -1 0 }{PSTYLE "R3 Font 0" -1 261 1 {CSTYLE "" -1 -1 "Monac
o" 1 9 127 0 255 1 2 2 2 0 0 0 0 0 0 1 }0 0 0 -1 -1 -1 0 0 0 0 0 0 -1
0 }}
{SECT 0 {PARA 256 "" 0 "" {TEXT -1 23 "Exponential or Logistic" }}
{PARA 257 "" 0 "" {TEXT -1 9 "Jim Herod" }}{PARA 258 "" 0 "" {TEXT -1
21 "School of Mathematics" }}{PARA 259 "" 0 "" {TEXT -1 12 "Georgia Te
ch" }}{PARA 260 "" 0 "" {TEXT -1 21 "herod@math.gatech.edu" }}{PARA 0
"" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 "
" {TEXT -1 168 " We consider the U. S. Census Data during the 20th
Century and examine whether the data can be approximated better by an
exponential fit or a logistic fit. With an " }{TEXT 256 15 "exponenti
al fit" }{TEXT -1 123 ", we expect to find numbers c and r so that the
 data is approximated by solutions for y ' = r y, with y(1900) = c.  W
ith a " }{TEXT 257 12 "logistic fit" }{TEXT -1 38 ", we expect to find
numbers c, r, and " }{XPPEDIT 18 0 "alpha" "6#%&alphaG" }{TEXT -1 52
" so that the data is approximated by solutions for " }{XPPEDIT 18 0
"y" "6#%\"yG" }{TEXT -1 5 " ' = " }{XPPEDIT 18 0 "r*y-alpha*y^2" "6#,&
*&%\"rG\"\"\"%\"yGF&F&*&%&alphaGF&*$F'\"\"#F&!\"\"" }{TEXT -1 19 ", wi
th y(1900) = c." }}{PARA 0 "" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 ""
{TEXT -1 77 " Here is the 20th Century data which we wish to appro
ximate. (Reference: " }{TEXT 266 50 "Differential Equations and Boundary Value Problems" }{TEXT -1 67 ", Edwards and Penney, Prentice Hall (ISBN: 0-13-382094-7), page 79." }}{PARA 0 "" 0 "" {TEXT -1 0 "" }
}{PARA 0 "" 0 "" {TEXT -1 69 " Year Millions \+
Year Millions" }}{PARA 0 "" 0 "" {TEXT -1 70 " \+
1900 76.2 1950 151.3 " }}
{PARA 0 "" 0 "" {TEXT -1 68 " 1910 92.2 \+
1960 179.3" }}{PARA 0 "" 0 "" {TEXT -1 67 " \+
1920 106.0 1970 203.3" }}{PARA 0 "
" 0 "" {TEXT -1 67 " 1930 123.2 \+
1980 225.6" }}{PARA 0 "" 0 "" {TEXT -1 67 " 1940 \+
132.2 1990 248.7" }}{PARA 0 "" 0 ""
{TEXT -1 0 "" }}{SECT 1 {PARA 3 "" 0 "" {TEXT -1 59 "An Exponential Fi
t for this Century's U. S. Population Data" }}{PARA 0 "" 0 "" {TEXT
-1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 105 "    We seek numbers r and c so that we can approximate the above data with a function which has the \+
form" }}{PARA 0 "" 0 "" {TEXT -1 31 " "
}{XPPEDIT 18 0 "y(t) = c*exp(r*t)" "6#/-%\"yG6#%\"tG*&%\"cG\"\"\"-%$ex
pG6#*&%\"rGF*F'F*F*" }{TEXT -1 1 "." }}{PARA 0 "" 0 "" {TEXT -1 267 "First, we examine the data to see that it has the characteristic growth commonly referred to as exponential growth. To keep the numbers small, we take t = 0 at 1900 and plot the data in decades. That is, 1910 and 1920 are represented as t = 1 and t = 2, respectively." }}{PARA 0 "
" 0 "" {TEXT -1 0 "" }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 91 "T:=[0
,1,2,3,4,5,6,7,8,9];\npop:=[76.2,92.2,106.0,123.2,132.2,151.3,179.3,20
3.3,225.6,248.7];" }}}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 38 "points
1:=[seq([T[i],pop[i]],i=1..10)];" }}}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 26 "plot(points1,style=POINT);" }}}{PARA 0 "" 0 "" {TEXT
-1 145 "     It is striking to note a re-alignment of the data at abo
ut 1940. It is easy to speculate on why this apparent break in the pat
tern occurred." }}{PARA 0 "" 0 "" {TEXT -1 102 " To make a fit for
the data in the form specified above, we take the logarithm of both s
ides. Thus" }}{PARA 0 "" 0 "" {TEXT -1 54 " \+
ln( y(t) ) = r t + ln(c)." }}{PARA 0 "" 0 "" {TEXT -1 281 "Hence, if
we take the logarithm of the census data, we can find a candidate for
r and ln(c) from a linear, least squares fit for the logarithm of the
data. Here, we take the logarithm of the data, and plot the results i
n order to see that a linear fit for this data is reasonable. " }}
{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 19 "lnpop:=map(ln,pop);" }}}
{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 38 "lnpts:=[seq([T[i],lnpop[i]],
i=1..10)];" }}}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 24 "plot(lnpts,st
yle=POINT);" }}}{PARA 0 "" 0 "" {TEXT -1 46 " We get a least squar
es fit for this data." }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 12 "wit
h(stats):" }}}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 43 "fit[leastsquar
e[[t,y],y=r*t+b]]([T,lnpop]);" }}}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1
0 21 "z:=unapply(rhs(%),t);" }}{PARA 0 "> " 0 "" {MPLTEXT 1 0 0 "" }}
{PARA 0 "> " 0 "" {MPLTEXT 1 0 0 "" }}}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 106 "with(plots):\n J:=plot(exp(z(t)),t=0..11):\n K:=pl
ot(points1,style=POINT,symbol=CIRCLE):\n display(\{J,K\});" }}}
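{PARA 0 "" 0 "" {TEXT -1 139 "    As a quick check, we can read off the fitted constants: the slope of z is r, and c = exp(z(0)). The names rfit and cfit below are ours." }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 33 "rfit:=z(1)-z(0);\ncfit:=exp(z(0));" }}}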
{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 0 "" }}}}{PARA 0 "" 0 "" {TEXT
-1 0 "" }}{SECT 1 {PARA 3 "" 0 "" {TEXT -1 55 "A Logistic Fit for this
Century's U. S. Population Data" }}{PARA 0 "" 0 "" {TEXT -1 0 "" }}
{PARA 0 "" 0 "" {TEXT -1 78 " For the purpose of making this secti
on complete, we repeat the data here." }}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 91 "T:=[0,1,2,3,4,5,6,7,8,9];\npop:=[76.2,92.2,106.0,123.
2,132.2,151.3,179.3,203.3,225.6,248.7];" }}}{PARA 0 "" 0 "" {TEXT -1
0 "" }}{PARA 0 "" 0 "" {TEXT -1 106 "     To begin the analysis of a logistic fit for this data, observe with Edwards and Penney, pages 77-78, " }{TEXT 258 95 "that y'/y should be nearly a straight line. After all, the logistic differential equation gives" }}{PARA 0 "" 0 "" {TEXT
-1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 56 " \+
y' / y = " }{XPPEDIT 18 0 "r-alpha*y" "6#,&%\"rG\"\"\"
*&%&alphaGF%%\"yGF%!\"\"" }{TEXT -1 1 "." }}{PARA 0 "" 0 "" {TEXT -1
0 "" }}{PARA 0 "" 0 "" {TEXT 262 38 " Also, argue that the calculation
for " }{XPPEDIT 18 0 "y[i]" "6#&%\"yG6#&%\"iG6\"" }{TEXT -1 3 "' " }
{TEXT 259 15 "can be taken as" }}{PARA 0 "" 0 "" {TEXT -1 0 "" }}
{PARA 0 "" 0 "" {TEXT 263 41 " \+
" }{XPPEDIT 18 0 "(y[i+1]-y[i-1])/2" "6#*&,&&%\"yG6#,&%\"iG\"\"\"F*F*F
*&F&6#,&F)F*F*!\"\"F.F*\"\"#F." }{TEXT -1 2 ". " }{TEXT 260 2 " " }}
{PARA 0 "" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 "" {TEXT 264 13 "For exam
ple, " }{XPPEDIT 18 0 "y[2]" "6#&%\"yG6#\"\"#" }{TEXT -1 3 "' " }
{TEXT 261 10 " should be" }}{PARA 0 "" 0 "" {TEXT 265 50 " \+
" }{XPPEDIT 18 0 "(pop[3]-pop[1]
)/2" "6#*&,&&%$popG6#\"\"$\"\"\"&F&6#F)!\"\"F)\"\"#F," }{TEXT -1 2 ". \+
" }}{PARA 0 "" 0 "" {TEXT -1 13 "This leads to" }}{PARA 0 "" 0 ""
{TEXT -1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 37 " \+
" }{XPPEDIT 18 0 "y[2]" "6#&%\"yG6#\"\"#" }{TEXT -1 5 " ' \+
/ " }{XPPEDIT 18 0 "y[2]" "6#&%\"yG6#\"\"#" }{TEXT -1 26 " is appro
ximated by       " }{XPPEDIT 18 0 "(pop[3]-pop[1])/(2*pop[2])" "6#*&,&&%$p
opG6#\"\"$\"\"\"&F&6#F)!\"\"F)*&\"\"#F)&F&6#F.F)F," }{TEXT -1 1 "." }}
{PARA 0 "" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 86 "In order \+
to be sure we get the quotients computed correctly below, we make this
check." }}{PARA 0 "" 0 "" {TEXT -1 0 "" }}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 27 "(pop[3]-pop[1])/(2*pop[2]);" }}}{PARA 0 "" 0 ""
{TEXT -1 177 "Now, we define the sequence of quotients, and check that the first one is as computed above. Note that there must be two fewer \+
points in this set than the number of data points." }}{PARA 0 "" 0 ""
{TEXT -1 0 "" }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 52 "diffpop:=seq
((pop[i+1]-pop[i-1])/(2*pop[i]),i=2..9);" }}}{PARA 0 "" 0 "" {TEXT -1
0 "" }}{PARA 0 "" 0 "" {TEXT -1 186 "     This last is the data for y  ' / y. This should be nearly a straight line as a function of y. To see \+
the extent to which this is true, we plot this data for [ y(t), y '(t
) / y(t) ]." }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 45 "points2:=[seq
([pop[n+1],diffpop[n]],n=1..8)];" }}}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 48 "plot(points2,style=POINT,view=[90..250,0..0.5]);" }}}
{PARA 0 "" 0 "" {TEXT -1 257 "Here is a computation of the linear leas
t squares fit for these computed data points. To compute the least squ
ares fit, we need the data in two sets: the first is the sequence of f
irst coordinates and the second is the value of the function at these \+
points." }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 12 "with(stats):" }}}
{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 64 "first:=[seq(pop[n+1],n=1..8)
];\nsecond:=[seq(diffpop[n],n=1..8)];" }}}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 48 "fit[leastsquare[[y,q],q=m*y+b]]([first,second]);" }}}
{PARA 0 "" 0 "" {TEXT -1 22 "We now identify r and " }{XPPEDIT 18 0 "a
lpha" "6#%&alphaG" }{TEXT -1 3 ". " }}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 40 "r:=op(2,rhs(%));\nalpha:=op(1,rhs(%%))/y;" }}}{PARA
0 "" 0 "" {TEXT -1 44 "We see how the resulting line fits the data." }
}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 152 "with(plots):\n J:=plot(al
pha*y+r,y=90..250,view=[90..250,0..0.5]):\n K:=plot(points2,style=POI
NT,symbol=CIRCLE,view=[90..250,0..0.5]):\n display(\{J,K\});" }}}
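{PARA 0 "" 0 "" {TEXT -1 128 "    As a check of our own, note that the fitted line already predicts a limiting population: y ' / y vanishes when y = -r/alpha." }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 16 "evalf(-r/alpha);" }}}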
{PARA 0 "" 0 "" {TEXT -1 112 "     Now that we have the coefficien
ts for the differential equation, we can solve it and compare with the
data." }}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 58 "dsolve(\{diff(y(t)
,t)=y(t)*(r+alpha*y(t)),y(0)=76.2\},y(t));" }}}{EXCHG {PARA 0 "> " 0 "
" {MPLTEXT 1 0 26 "ylogis:=unapply(rhs(%),t);" }}{PARA 0 "> " 0 ""
{MPLTEXT 1 0 0 "" }}}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 106 "with(p
lots):\n J:=plot(ylogis(t),t=0..10):\n K:=plot(points1,style=POINT,s
ymbol=CIRCLE):\n  display(\{J,K\});" }}}{PARA 0 "" 0 "" {TEXT -1 183 "One quality of the solution for the logistic equation that did not hold true for the exponential fit is that the solution for the logistic \+
equation predicts a finite population level." }}{EXCHG {PARA 0 "> " 0
"" {MPLTEXT 1 0 28 "limit(ylogis(t),t=infinity);" }}}{EXCHG {PARA 0 ">
" 0 "" {MPLTEXT 1 0 0 "" }}}}{PARA 0 "" 0 "" {TEXT -1 0 "" }}{SECT 1
{PARA 3 "" 0 "" {TEXT -1 29 "A Comparison of the Two Fits." }}{PARA 0
"" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 193 "In working throu
gh this study, you likely have computed both fits for the U. S. popula
tion data for this century. You might be interested in a standard comp
arison of the error for the two fits." }}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 50 "errexp:=sqrt(sum((pop[i]-exp(z(i-1)))^2,i=1..10));" }
}}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT 1 0 57 "errlog:=evalf(sqrt(sum((po
p[i]-ylogis(i-1))^2,i=1..10)));" }}}{EXCHG {PARA 0 "> " 0 "" {MPLTEXT
1 0 30 "evalf(ylogis(10));\nexp(z(10));" }}}{EXCHG {PARA 0 "> " 0 ""
{MPLTEXT 1 0 0 "" }}}}{PARA 0 "" 0 "" {TEXT -1 0 "" }}{SECT 1 {PARA 3
"" 0 "" {TEXT -1 23 "Exercise for the reader" }}{PARA 0 "" 0 "" {TEXT
-1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 89 "1. Repeat this study, except ta
ke the U. S. population data since the American Civil War." }}{PARA 0
"" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 69 " Year
Millions Year Millions" }}{PARA 0 "" 0 "
" {TEXT -1 69 " 1860 31.4 1880
50.2 " }}{PARA 0 "" 0 "" {TEXT -1 67 " 1870 \+
39.8 1890 62.9" }}{PARA 0 "" 0 "" {TEXT
-1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 93 "2. Using the logistic fit for t
his expanded data, estimate the count of the Year 2000 Census." }}}
{PARA 0 "" 0 "" {TEXT -1 0 "" }}{PARA 0 "" 0 "" {TEXT -1 0 "" }}}
{MARK "0 0" 0 }{VIEWOPTS 1 1 0 1 1 1803 1 1 1 1 }{PAGENUMBERS 0 1 2
33 1 1 }