
Use the Mean Value Theorem to show that sqrt(1 + x) < 1 + (1/2)x if x > 0.

Use the Mean Value Theorem to show that sqrt(1 + x) < 1 + (1/2)x if x > 0. (Hint: Apply the Mean Value Theorem to the function f(x) = sqrt(1 + x).)
asked Mar 27, 2015 in CALCULUS by anonymous

2 Answers


Step 1:

The function is f(x) = sqrt(1 + x).

Mean Value Theorem:

Let f be a function that satisfies the following two hypotheses:

1. f is continuous on the closed interval [a, b].

2. f is differentiable on the open interval (a, b).

Then there is a number c in (a, b) such that f'(c) = (f(b) - f(a))/(b - a).
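
A concrete illustration (an addition of mine, not part of the original answer): on the interval [0, 3] the average rate of change of f(x) = sqrt(1 + x) is (f(3) - f(0))/(3 - 0) = (2 - 1)/3 = 1/3, so the theorem guarantees some c in (0, 3) with f'(c) = 1/3. A short Python sketch finds that c, assuming f'(x) = 1/(2 sqrt(1 + x)) as derived in Step 2 below:

import math

a, b = 0.0, 3.0
f = lambda t: math.sqrt(1 + t)

avg_rate = (f(b) - f(a)) / (b - a)      # (2 - 1)/3 = 1/3
# Solve f'(c) = 1/(2*sqrt(1 + c)) = avg_rate for c:
c = (1 / (2 * avg_rate)) ** 2 - 1       # gives c = 1.25, which lies in (0, 3)
print(avg_rate, c)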

Step 2:

Let x > 0 and apply the Mean Value Theorem to f on the closed interval [0, x].

Let us consider a number c in the open interval (0, x).

Since the hypotheses of the Mean Value Theorem are satisfied on [0, x],

we get f'(c) = (f(x) - f(0))/(x - 0).

Differentiate f(x) = sqrt(1 + x) with respect to x:

f'(x) = 1/(2 sqrt(1 + x)), so f'(c) = 1/(2 sqrt(1 + c)).
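
The derivative can also be checked symbolically; this is an optional aside, not part of the original answer, and assumes SymPy is installed:

import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.sqrt(1 + x)
print(sp.diff(f, x))                    # expected output: 1/(2*sqrt(x + 1))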

answered Mar 27, 2015 by yamin_math Mentor

(Continued from the previous answer.)

Step 3:

Substitute f'(c) = 1/(2 sqrt(1 + c)), f(x) = sqrt(1 + x), and f(0) = sqrt(1 + 0) = 1 into the Mean Value Theorem:

1/(2 sqrt(1 + c)) = (sqrt(1 + x) - 1)/(x - 0), so sqrt(1 + x) - 1 = x/(2 sqrt(1 + c)).

If x > 0, then c > 0 and, therefore, sqrt(1 + c) > 1, so 1/(2 sqrt(1 + c)) < 1/2.

So sqrt(1 + x) - 1 = x/(2 sqrt(1 + c)) < (1/2)x.

Therefore sqrt(1 + x) < 1 + (1/2)x.

Solution:

sqrt(1 + x) < 1 + (1/2)x for all x > 0.
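
As a final numerical sanity check (again my own addition, not part of the original answer), the inequality can be tested at a few positive values of x:

import math

for x in [0.01, 0.5, 1.0, 10.0, 100.0]:
    lhs = math.sqrt(1 + x)
    rhs = 1 + 0.5 * x
    print(x, lhs, rhs, lhs < rhs)       # prints True for every x > 0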

answered Mar 27, 2015 by yamin_math Mentor
