# An amount of $100 being increased every day by 1%, in how many days will it reach $1000 and how did you calculate this?

### 2 Answers

- Roger (Lv 7, 4 weeks ago)
$1,000 = $100 * (1 + 0.01)^d, where d is the number of days

10 = (1.01)^d

log 10 = d * (log 1.01)

1 = d * (0.00432)

d ≈ 231.4, so the amount first reaches $1000 at the end of day 232
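This closed-form calculation can be checked directly (a quick Python sketch; the variable name `d` follows the answer above):

```python
import math

# Solve 100 * (1.01)**d = 1000 for d.
# Dividing both sides by 100 and taking logs gives d = log(10) / log(1.01).
d = math.log(10) / math.log(1.01)
print(round(d, 2))  # → 231.41
```

Since d is not a whole number, 231 full days of growth fall just short of $1000; the 232nd day pushes the balance over.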

- husoski (Lv 7, 4 weeks ago)
An increase of 1% is the same as multiplying by 1.01, so the value after n days is:

A_n = 100 * 1.01^n

You want to find the value of n that makes A_n equal to 1000:

100 * 1.01^n = 1000

1.01^n = 10 ... (divide both sides by 100)

log (1.01^n) = log 10 ... (take logarithms)

n * (log 1.01) = log 10 ... (use the log(a^b) = b * log a property)

n = (log 10) / (log 1.01) ... (divide by log 1.01)

That works for any logarithm base, so long as you use the same base throughout. If "log" is a base-ten ("common") logarithm, then log 10 = 1 and the answer simplifies to 1/(log 1.01).

My calculator says that makes n about 231.41, so the first time the amount gets to $1000 or more is at the end of the 232nd day.
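The "end of the 232nd day" conclusion can also be verified by simulating the growth one day at a time, with no logarithms at all (a small sketch; the variable names are my own):

```python
# Grow $100 by 1% per day until it reaches $1000.
amount = 100.0
days = 0
while amount < 1000:
    amount *= 1.01  # one day's 1% increase
    days += 1

print(days)  # → 232, the first day the balance is $1000 or more
```

After 231 days the balance is about $995.96, still short of the target, which is why the fractional answer 231.41 rounds *up* to 232 days.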