We have to determine the indefinite integral of the given function.

To begin, we rewrite the function as an algebraic sum of elementary fractions:

2x/[(x+1)(x^2+1)] = A/(x+1) + (Bx + C)/(x^2 + 1)
Multiplying both sides by (x+1)(x^2+1), we get:

2x = A(x^2 + 1) + (Bx + C)(x + 1) = Ax^2 + A + Bx^2 + Bx + Cx + C

2x = x^2(A + B) + x(B + C) + (A + C)
Comparing coefficients on both sides, we get:

A + B = 0 => A = -B (1)
B + C = 2 (2)
A + C = 0 <=> -B + C = 0 (3)

(2) + (3) => 2C = 2 => C = 1 => A = -1 => B = 1
2x/[(x+1)(x^2+1)] = -1/(x+1) + (x + 1)/(x^2 + 1)
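
As a quick cross-check (a sketch added here, not part of the original solution, assuming the SymPy Python library is available), the same decomposition can be obtained with apart:

from sympy import symbols, apart

x = symbols('x')

# Partial fraction decomposition of the integrand 2x/[(x+1)(x^2+1)]
integrand = 2*x / ((x + 1) * (x**2 + 1))
print(apart(integrand, x))   # (x + 1)/(x**2 + 1) - 1/(x + 1), matching A = -1, B = 1, C = 1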
Int f(x)dx = -Int dx/(x+1) + Int x dx/(x^2 + 1) + Int dx/(x^2 + 1)
-Int dx/(x+1) = -ln|x+1| + C
For the second integral, we use the substitution:

x^2 + 1 = t
2x dx = dt
x dx = dt/2

Int x dx/(x^2 + 1) = Int dt/(2t) = ln|t|/2 + C = ln(x^2 + 1)/2 + C
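
The substitution step can also be checked directly (again a SymPy sketch, not part of the original solution; SymPy drops the absolute value since x^2 + 1 > 0):

from sympy import symbols, integrate, log, simplify

x = symbols('x')

# Integral of x/(x^2 + 1): SymPy returns log(x**2 + 1)/2, which is ln|t|/2 with t = x^2 + 1
result = integrate(x / (x**2 + 1), x)
print(result)                               # log(x**2 + 1)/2
print(simplify(result - log(x**2 + 1)/2))   # 0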
Int dx/(x^2 + 1) = arctan x + C
Substituting back t = x^2 + 1:

Int f(x)dx = -ln|x+1| + ln(x^2 + 1)/2 + arctan x + C

The antiderivative of the given function is: Int f(x)dx = -ln|x+1| + ln(x^2 + 1)/2 + arctan x + C.
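
Finally, the whole antiderivative can be verified by differentiating it and comparing with the original integrand (a SymPy sketch under the assumption x > -1, so the absolute value in ln|x+1| is dropped):

from sympy import symbols, log, atan, diff, simplify

x = symbols('x')

# Antiderivative found above, with ln|x+1| written as log(x + 1) for x > -1
F = -log(x + 1) + log(x**2 + 1)/2 + atan(x)

# Differentiating F and subtracting the original integrand should give 0
print(simplify(diff(F, x) - 2*x / ((x + 1) * (x**2 + 1))))   # 0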