Limit of derivative as x goes to infinity
- Thread starter Adorno
Homework Statement
Suppose that [itex]f[/itex] and [itex]f'[/itex] are continuous functions on [itex]\mathbb{R}[/itex], and that [itex]\displaystyle\lim_{x\to\infty}f(x)[/itex] and [itex]\displaystyle\lim_{x\to\infty}f'(x)[/itex] exist. Show that [itex]\displaystyle\lim_{x\to\infty}f'(x) = 0[/itex].
Homework Equations
Definition of derivative: [itex]f'(x) = \displaystyle\lim_{h\to0}\frac{f(x+h) - f(x)}{h}[/itex]
Fundamental theorem of calculus: [itex]f(x) = \frac{d}{dx}\displaystyle\int^x_a f(t)dt[/itex]
The Attempt at a Solution
At first I just wrote it in terms of the definition of the derivative:[tex]\displaystyle\lim_{x\to\infty}f'(x) = \displaystyle\lim_{x\to\infty}(\displaystyle\lim_{h\to0}\frac{f(x+h) - f(x)}{h})[/tex] Then I thought that you could change the order of the limits (since both limits exist and the function [itex]\frac{f(x+h) - f(x)}{h}[/itex] is continuous, right?):
[tex]\displaystyle\lim_{x\to\infty}f'(x) = \displaystyle\lim_{h\to0}( \displaystyle\lim_{x\to\infty} \frac{f(x+h) - f(x)}{h} )[/tex] And then since [itex]h[/itex] is just a constant it should follow that [itex]\displaystyle\lim_{x\to\infty}f(x+h) = \displaystyle\lim_{x\to\infty}f(x) = c[/itex], so that [itex]\displaystyle\lim_{x\to\infty}(f(x+h) - f(x)) = c - c = 0[/itex]. Then we have [tex]\displaystyle\lim_{x\to\infty}f'(x) = \displaystyle\lim_{h\to0}0 = 0.[/tex] I'm not sure about this though. It seems a little too simple and doesn't seem to use all of the information given. Also, I'm not sure if I'm allowed to change the order of the limits, so maybe this doesn't work at all. Could anyone help?
Answers and Replies
Then I thought that you could change the order of the limits (since both limits exist and the function [itex]\frac{f(x+h) - f(x)}{h}[/itex] is continuous right?)
This seems like a mighty big leap of logic. Maybe it's valid, maybe it's not, but either way it's not obvious. Can you say exactly what theorem you are using?
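For reference, here is a standard example (unrelated to this particular [itex]f[/itex]) showing that iterated limits cannot be swapped without some extra hypothesis: take [itex]a_{m,n} = \frac{m}{m+n}[/itex]. Then
[tex]\lim_{m\to\infty}\left(\lim_{n\to\infty}\frac{m}{m+n}\right) = \lim_{m\to\infty}0 = 0, \qquad \lim_{n\to\infty}\left(\lim_{m\to\infty}\frac{m}{m+n}\right) = \lim_{n\to\infty}1 = 1.[/tex]
Both iterated limits exist, yet they differ, so the existence of the two single limits alone does not justify the interchange; one usually needs something like uniform convergence in one of the variables.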
Offhand I would think that you should start with the fact that
[tex]\lim_{x \rightarrow \infty} f'(x)[/tex]
exists. Call the limit L. Then see if you can obtain a contradiction if you assume that L > 0 or L < 0.
The assumption that [tex]\lim_{x\rightarrow +\infty}{f^\prime(x)}[/tex] exists is necessary. If it does not hold, then
[tex]f(x)=\frac{\sin(x^3)}{x}[/tex]
is a counterexample. I'm only mentioning this because finding that very counterexample was once one of my exam questions, and I thought you might find it interesting.
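For completeness, a quick check of why that works (my own working): [itex]\left|\frac{\sin(x^3)}{x}\right| \leq \frac{1}{x} \to 0[/itex], so [itex]\displaystyle\lim_{x\to\infty}f(x) = 0[/itex] exists, while
[tex]f'(x) = 3x\cos(x^3) - \frac{\sin(x^3)}{x^2},[/tex]
whose [itex]3x\cos(x^3)[/itex] term oscillates with growing amplitude, so [itex]\displaystyle\lim_{x\to\infty}f'(x)[/itex] does not exist.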
TylerH and jbunniii: Yeah, I thought about that as well, but I'm not sure where the contradiction would come from. As for a definition of convergence, I take it you mean this: if [itex]\displaystyle\lim_{x\to\infty}f(x) = L[/itex] then given any [itex]\epsilon > 0[/itex] there exists some [itex]N[/itex] such that [itex]|f(x) - L| < \epsilon[/itex] [itex]\forall x > N[/itex]. I don't know about using this because it doesn't involve the derivative at all.
Also, yes, I think my initial method was totally wrong.
Suppose that [tex]\lim_{x \rightarrow \infty} f'(x) = L[/tex].
If L > 0, then there is some N such that
[tex]f'(x) > L/2[/tex]
for all x > N.
Now what happens if you integrate? Can you get a contradiction?
I see. So if we integrate we get something like [itex] f(x) > (L/2)x + c[/itex]. And the right-hand side goes to infinity as [itex] x \to \infty [/itex], which is a contradiction since [itex]\displaystyle\lim_{x \to \infty}f(x)[/itex] was assumed to exist. Is that right?
A small remark: integrating only gives
[tex]\int_0^x{f'(t)\,dt}\geq\int_0^x{\frac{L}{2}\,dt},[/tex]
that is, [itex]\geq[/itex] rather than the strict inequality. But your proof still holds...
I see. So if we integrate we get something like [itex] f(x) > (L/2)x + c[/itex]. And the right-hand side goes to infinity as [itex] x \to \infty [/itex], which is a contradiction since [itex]\displaystyle\lim_{x \to \infty}f(x)[/itex] was assumed to exist. Is that right?
Yes, more or less. Just be careful about the interval over which you integrate. The inequality only holds for x > N.
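Written out carefully over the right interval, the step would look something like this: for [itex]x > N[/itex], the fundamental theorem of calculus gives
[tex]f(x) - f(N) = \int_N^x f'(t)\,dt \geq \int_N^x \frac{L}{2}\,dt = \frac{L}{2}(x - N),[/tex]
so [itex]f(x) \geq f(N) + \frac{L}{2}(x - N) \to \infty[/itex] as [itex]x \to \infty[/itex], contradicting the assumption that [itex]\displaystyle\lim_{x\to\infty}f(x)[/itex] exists.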
Note that you use the continuity of [itex]f'[/itex] in order to integrate it and to invoke the fundamental theorem of calculus.
By the way, I don't think the continuity of [itex]f'[/itex] is required, as long as all the other assumptions are satisfied. Integrability of [itex]f'[/itex] and the fact that [itex]f'[/itex] has [itex]f[/itex] as an antiderivative should suffice. You also don't need the assumption that [itex]f[/itex] is continuous, as it's automatically true given the existence of [itex]f'[/itex] (at least for sufficiently large x).
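For completeness, the case [itex]L < 0[/itex] is symmetric: there is an [itex]N[/itex] with [itex]f'(x) < L/2[/itex] for all [itex]x > N[/itex], and the same integration gives
[tex]f(x) \leq f(N) + \frac{L}{2}(x - N) \to -\infty,[/tex]
again contradicting the existence of [itex]\displaystyle\lim_{x\to\infty}f(x)[/itex]. That rules out both [itex]L > 0[/itex] and [itex]L < 0[/itex], leaving [itex]L = 0[/itex], which is what was to be shown.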
Source: https://www.physicsforums.com/threads/limit-of-derivative-as-x-goes-to-infinity.502972/