□

Since $E = L_2(\Omega)$, then $\left\|\sum_{k=1}^n X_k\right\|_E \ge d\,n^{1/2}$ for all integers $n$, where $d > 0$ is a constant. On the other hand, using (11), (2) and (3), we obtain for the r.v.s $\{X_k\}_{k=1}^n$ and $n$ large enough $B_{E_1,\ell_1,E_2,\ell_2} \le n^{1/p}$, where $1/\infty = 0$. From here and (5), $\left\|\sum_{k=1}^n X_k\right\|_E \le b_2 n^{1/p}$. Since $2 < p \le \infty$, this bound contradicts the previous one. □
Proposition 6. Let the conditions of Theorem 2 hold, $\ell_1 = l_\infty$ and $\ell_2 = l_2$. Then $E_1 \supset E_2 = E = L_2(\Omega)$.
Proof: Using (3), (5) and the arguments of the proof of Lemma 2, we get
$$b_1\max\left\{\phi_{E_1}\!\left(\frac{t}{n}\right),\ n^{1/2}\phi_{E_2}\!\left(\frac{t}{n}\right)\right\} \le \phi_E(t) \le b_2\max\left\{\phi_{E_1}\!\left(\frac{t}{n}\right),\ n^{1/2}\phi_{E_2}\!\left(\frac{t}{n}\right)\right\}, \qquad (20)$$
where $0 < t \le 1$, $n = 1, 2, \ldots$ and $b_1$, $b_2$ are positive constants. Since $E_1 \in K$, then $E_1 \ne L_\infty(\Omega)$ and $\phi_{E_1}(\tau) \to 0$ as $\tau \to 0$ [48], which gives us
$$b_1\limsup_{n\to\infty} n^{1/2}\phi_{E_2}\!\left(\frac{t}{n}\right) \le \phi_E(t) \le b_2\liminf_{n\to\infty} n^{1/2}\phi_{E_2}\!\left(\frac{t}{n}\right).$$
Rearrangement Invariant Spaces
Putting $t = 1$, we obtain $c_1 n^{-1/2} \le \phi_{E_2}(1/n) \le c_2 n^{-1/2}$, where $c_1$ and $c_2$ are positive constants. Using (8) and reasoning as in the proof of Lemma 2, we get
$$d_1 t^{1/2} \le \phi_{E_2}(t) \le d_2 t^{1/2} \qquad (0 < t \le 1), \qquad (21)$$
where the positive constants $d_1$ and $d_2$ are independent of $t$.
Let $X$ be defined by the formula (9) and let $h_{j,k}$, where $1 \le j \le m$ and $1 \le k \le n$, be mutually disjoint sets such that
$$\bigcup_{j=1}^m h_{j,k} = h_k, \qquad P(h_{j,k}) = m^{-1}P(h_k).$$
Then
$$X = \sum_{j=1}^m\sum_{k=1}^n a_k 1_{h_{j,k}}.$$
According to (2) and (21), we have for $\{a_k 1_{h_{j,k}}\}$
$$d_1\|X\|_{L_2(\Omega)} \le A_{E_2,l_2} \le d_2\|X\|_{L_2(\Omega)}.$$
In addition, $A_{E_1,l_\infty} \le 1$ for sufficiently large $m$. Theorem 1.1 yields that estimates of the type (5) are true for mutually disjoint r.v.s, which implies $v_1\|X\|_{L_2(\Omega)} \le \|X\|_E \le v_2\|X\|_{L_2(\Omega)}$ for some positive constants $v_1$ and $v_2$. So, $E = L_2(\Omega)$.

According to Proposition 5, $E_2 \subset E_1$. Hence $\|X\|_{E_1} \le C\|X\|_{E_2}$ for each $X \in E_2$, where $C$ is a constant. From (2), (3) and (5),
$$b_1\|X\|_{E_2} \le \|X\|_E \le b_2(1+C)\|X\|_{E_2}$$
for all $X \in L_\infty(\Omega)$. So $E_2 = E = L_2(\Omega)$. □
Proof of Lemma 3: We may assume that $\ell_1 = l_p$ ($2 < p \le \infty$) and $\ell_2 = l_2$. If $p = \infty$, then Proposition 6 yields $E_2 \subset E_1$. Since $l_2 \subset l_\infty$, then $(E_2, \ell_2) \subset (E_1, \ell_1)$, which contradicts the conditions of Theorem 2. So, $p < \infty$. □
Lemma 4. Let the conditions of Theorem 2 hold and $\ell_1 = l_p$ ($2 < p < \infty$), $\ell_2 = l_2$. Then there are positive constants $u$, $v$, $u_1$, $v_1$, $v_2$ such that for $0 < t \le 1$
$$u\,t^{1/p} \le \phi_E(t) \le v\,t^{1/p}, \qquad (22)$$
$$u_1 t^{1/p} \le \phi_{E_1}(t) \le v_1 t^{1/p}, \qquad (23)$$
$$\phi_{E_2}(t) \le v_2 t^{1/2}. \qquad (24)$$
The proof is based on some auxiliary assertions.
2. Inequalities for sums of independent random variables
Proposition 7. There exist positive constants $w_1$ and $w_2$ such that for all $0 < r \le t \le 1/2$
$$w_1\max\left\{\left(\frac{t}{2r}\right)^{1/p}\phi_{E_1}(r),\ \left(\frac{t}{2r}\right)^{1/2}\phi_{E_2}(r)\right\} \le \phi_E(t) \le w_2\max\left\{\left(\frac{t}{r}\right)^{1/p}\phi_{E_1}(2r),\ \left(\frac{t}{r}\right)^{1/2}\phi_{E_2}(2r)\right\}. \qquad (25)$$
Proof: For 0 < r < t < 1/2 there is an integer n such that r < t/n < 2r < 1. From here t/(2r) < n < t/r. Using (8) and the arguments of the proof of Lemma 2 we obtain (25).
Proposition 8. If
$$\lim_{t\to 0} t^{-1/p}\phi_E(t) = 0,$$
then $E = L_2(\Omega)$.
Proof: First we obtain some bounds. We may choose $t_n \searrow 0$ such that $t_n^{-1/p}\phi_E(t_n) \to 0$. From (25), for $0 < r \le t_n$ and $n = 1, 2, \ldots$
$$r^{-1/p}\phi_{E_1}(r) \le \frac{2^{1/p}}{w_1}\,t_n^{-1/p}\phi_E(t_n),$$
which yields
$$\lim_{r\to 0} r^{-1/p}\phi_{E_1}(r) = 0. \qquad (26)$$
Putting $t = 1/2$ and letting $r$ tend to zero, we obtain from (25) and (26)
$$\liminf_{r\to 0} r^{-1/2}\phi_{E_2}(r) > 0.$$
It also follows from (25) that $\phi_{E_2}(r) \le C r^{1/2}$ for $0 < r \le 1/2$ and some constant $C$. So, we get (21).

Let $r \to 0$. Then (21), (25) and (26) imply $c_1 t^{1/2} \le \phi_E(t) \le c_2 t^{1/2}$ for some positive constants $c_1$ and $c_2$. Putting $t = r$ in (25), we get $\phi_{E_1}(t) \le (2^{1/p}/w_1)\phi_E(t) \le w\,t^{1/2}$, where $w$ is a constant.
Now we may show that $E = L_2(\Omega)$. Let the r.v. $X$ be defined by the formula (9). For the r.v.s $\{a_k 1_{h_k}\}_{k=1}^n$ the following estimates are true:
$$A_{E_1,l_p} = \left(\sum_{k=1}^n |a_k|^p\,\phi_{E_1}(P(h_k))^p\right)^{1/p} \le \left(\sum_{k=1}^n a_k^2\,\phi_{E_1}(P(h_k))^2\right)^{1/2} \le \left(\sum_{k=1}^n a_k^2\,w^2 P(h_k)\right)^{1/2} = w\,\|X\|_{L_2(\Omega)}.$$
From (21), $d_1\|X\|_{L_2(\Omega)} \le A_{E_2,l_2} \le d_2\|X\|_{L_2(\Omega)}$. Using (3), we obtain
$$d_1\|X\|_{L_2(\Omega)} \le B_{E_1,l_p,E_2,l_2} \le (d_2 + w)\|X\|_{L_2(\Omega)}.$$
Theorem 1.1 (see the section 1.6) and the last bounds imply $E = L_2(\Omega)$. □
Proof of Lemma 4: First we obtain the upper estimates. Putting $t = 1/2$ in (25), we get (24) and the right-hand side inequality in (23). Hence
$$\phi_E(t) \le w_2\max\left\{2^{1/p}v_1 t^{1/p},\ 2^{1/2}v_2 t^{1/2}\right\}.$$
Since $p > 2$, the upper estimate in (22) is proved. We have
$$\lim_{t\to 0} t^{-1/p}\phi_E(t) > 0. \qquad (27)$$
For if not, then according to Proposition 8, $E = L_2(\Omega)$. Proposition 5 yields $E_1 \supset E_2$. Since $l_p \supset l_2$ for $p > 2$, this contradicts the assumptions of Theorem 2.

The relation (27) yields the left-hand side of (22). Putting $t = r$ in (25) and using (22) and (24), we obtain
$$u_1 t^{1/p} \le w_2\max\left\{\phi_{E_1}(2t),\ 2^{1/2}v_2 t^{1/2}\right\}.$$
Since $p > 2$, then
$$\liminf_{t\to 0} t^{-1/p}\phi_{E_1}(t) \ge \frac{u_1}{2^{1/p}w_2} > 0,$$
which implies the left-hand side of (23). □
5. Proof of Theorem 2. We may assume $\ell_1 = l_p$ ($2 < p < \infty$) and $\ell_2 = l_2$. First we show that $E = L_p(\Omega)$. Let $X$ be defined by the formula (9). Applying (23) and (2), we obtain for the r.v.s $\{a_k 1_{h_k}\}_{k=1}^n$
$$C_1\|X\|_{L_p(\Omega)} \le A_{E_1,l_p} \le C_2\|X\|_{L_p(\Omega)},$$
where $C_1$ and $C_2$ are positive constants. From (24),
$$A_{E_2,l_2} \le D\|X\|_{L_2(\Omega)} \qquad (D = \mathrm{const}).$$
Since $p > 2$, then according to (3)
$$C_1\|X\|_{L_p(\Omega)} \le B_{E_1,l_p,E_2,l_2} \le (C_2 + D)\|X\|_{L_p(\Omega)}.$$
Using Theorem 1.1 and (5), we conclude, as above, that
$$D_1\|X\|_{L_p(\Omega)} \le \|X\|_E \le D_2\|X\|_{L_p(\Omega)},$$
where $D_1$ and $D_2$ are positive constants. So, $E = L_p(\Omega)$.

Now we show that $E_1 \subset E_2$. If $X \in E_1$, then for all $x > 0$
$$\|X\|_{E_1} \ge \left\|x\,I_{\{|X|\ge x\}}\right\|_{E_1} = x\,\phi_{E_1}\big(P\{|X| \ge x\}\big).$$
From here and (23),
$$P\{|X| \ge x\} \le C\,\|X\|_{E_1}^p\,x^{-p} \qquad (28)$$
for some constant $C$. Using (24) and the condition $p > 2$, we get
$$\int_0^\infty \phi_{E_2}\big(P\{|X| \ge x\}\big)\,dx < \infty.$$
According to Proposition 1.6, $X \in E_2$. Hence $E_1 \subset E_2$ and $\|X\|_{E_2} \le B\|X\|_{E_1}$ for a constant $B$. Applying (5) to one r.v. $X$, we get $b_1\|X\|_{E_1} \le \|X\|_E \le b_2(B+1)\|X\|_{E_1}$.
Therefore $E_1 = E = L_p(\Omega)$. Now we show that $E_2 = L_2(\Omega)$. Let $X \in E_2$ and let $\{X_k\}_{k=1}^\infty$ be independent r.v.s equidistributed with $X$. Put $S_n = n^{-1/2}\sum_{k=1}^n X_k$ and
$$D_n = \max\left\{n^{1/p-1/2}\|X\|_{L_p(\Omega)},\ \|X\|_{E_2}\right\}.$$
From (3) and (5),
$$b_1 D_n \le \|S_n\|_{L_p(\Omega)} \le b_2 D_n \qquad (n = 1, 2, \ldots).$$
Let $\sigma^2 = EX^2$ and let the r.v. $Z$ have the normal distribution with the parameters $(0, 1)$. According to the well known Bernstein's theorem [3],
$$\|S_n\|_{L_p(\Omega)} \to \|\sigma Z\|_{L_p(\Omega)} = \|X\|_{L_2(\Omega)}\,\|Z\|_{L_p(\Omega)}$$
as $n \to \infty$. Since $p > 2$, then $D_n \to \|X\|_{E_2}$. Letting $n \to \infty$, we get
$$b_1\|X\|_{E_2} \le \|X\|_{L_2(\Omega)}\,\|Z\|_{L_p(\Omega)} \le b_2\|X\|_{E_2}.$$
Hence $E_2 = L_2(\Omega)$. □
2. Estimates of the von Bahr-Esseen type

1. Introduction and results. Von Bahr and Esseen proved the following inequality [2]. Let $1 \le p \le 2$ and let $\{X_k\}_{k=1}^n \subset L_p(\Omega)$ be independent r.v.s, $EX_k = 0$ ($1 \le k \le n$). Then
$$E\left|\sum_{k=1}^n X_k\right|^p \le 2\sum_{k=1}^n E|X_k|^p.$$
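Since the inequality is stated for arbitrary independent mean-zero summands, it can be sanity-checked exactly on symmetric two-point r.v.s $X_k = \pm c_k$ by enumerating all sign patterns; a small sketch (the function name is ours, not from the text):

```python
import itertools

def von_bahr_esseen_gap(scales, p):
    """E|sum X_k|^p and 2 * sum E|X_k|^p, computed exactly for
    independent X_k = +-scales[k], each sign with probability 1/2."""
    n = len(scales)
    lhs = 0.0
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        s = sum(sig * c for sig, c in zip(signs, scales))
        lhs += abs(s) ** p / 2 ** n      # each sign pattern has probability 2^-n
    rhs = 2.0 * sum(c ** p for c in scales)  # E|X_k|^p = scales[k]^p
    return lhs, rhs
```

For $p = 2$ the left side equals $\sum c_k^2$ by independence, so the factor 2 is visible directly.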
Definition 5. We say that a r.i. space $E$ has the von Bahr and Esseen $p$-property ($E \in (BE)_p$) if
$$\left\|\sum_{k=1}^n X_k\right\|_E \le B\left(\sum_{k=1}^n \|X_k\|_E^p\right)^{1/p} \qquad (29)$$
for all independent r.v.s $\{X_k\}_{k=1}^n \subset E$, $EX_k = 0$, where $B$ doesn't depend on $X_k$.

The estimate (29) may be fulfilled for $p \le 2$ only. Indeed, let's consider independent r.v.s $\{U_k\}_{k=1}^\infty$ with the symmetric Bernoulli distribution. Paley and Zygmund's inequality yields $\left\|\sum_{k=1}^n U_k\right\|_E \ge C_1 n^{1/2}$, where $C_1$ is a positive constant. Using (29), we obtain $\left\|\sum_{k=1}^n U_k\right\|_E \le C n^{1/p}$ ($n = 1, 2, \ldots$). So, $p \le 2$.

If $p = 1$, then (29) turns into the triangle inequality and we suppose in the sequel that $1 < p \le 2$.
Theorem 3. Let $1 < p < 2$ and $E \in K$. Then $E \in (BE)_p$ if and only if $E$ satisfies the upper $p$-estimate.

Theorem 4. Let $E \in K$. Then $E \in (BE)_2$ if and only if $E$ satisfies the upper 2-estimate and $E \subset L_2(\Omega)$.

The condition $E \in K$ is essential. Indeed, $L_\infty(\Omega)$ satisfies the upper $p$-estimate for all $p \ge 1$ and $L_\infty(\Omega) \notin K$. One may easily verify that $L_\infty(\Omega) \notin (BE)_p$ if $p > 1$.
2. Some estimates.

Proposition 9. Suppose $E$ satisfies the upper $p$-estimate. Then $\phi_E(t) \ge a\,t^{1/p}$ ($0 < t \le 1$), where $a$ is a positive constant.

Proof: Let $0 < x \le 1$. For an integer $n$ we may choose mutually disjoint sets $\{h_k\}_{k=1}^n$ such that $P(h_k) = x/n$. We have
$$\phi_E(x) = \left\|\sum_{k=1}^n 1_{h_k}\right\|_E \le A\left(\sum_{k=1}^n \|1_{h_k}\|_E^p\right)^{1/p} = A\,n^{1/p}\phi_E\!\left(\frac{x}{n}\right),$$
where $A$ is a constant. For every $t \in (0, 1)$ there is an integer $n$ such that $1/2 \le tn \le 1$. Putting $x = tn$, we get
$$\phi_E(t) \ge A^{-1}n^{-1/p}\phi_E(x) \ge A^{-1}\phi_E(1/2)\,t^{1/p}. \qquad \square$$
Using (28), we obtain the following statement.
Proposition 10. Under the assumptions of Proposition 9,
$$P\{|X| > x\} \le b\,x^{-p}\,\|X\|_E^p$$
for every $X \in E$ and all $x > 0$, where $b$ is a constant.
Proposition 11. Suppose $|X| \le a$ and $1 \le p < 2$. Then
$$EX^2 \le \frac{2\,a^{2-p}}{2-p}\,\|X\|_{p,\infty}^p.$$

Proof: Put $F(x) = P\{X \le x\}$. Since $|X| \le a$, then
$$EX^2 = \int_{-a}^a x^2\,dF(x) = -\int_0^a x^2\,d\big(1 - F(x) + F(-x)\big).$$
Integrating by parts and taking into account that the term outside of the integral is non-positive, we obtain
$$EX^2 \le 2\int_0^a x\,P\{|X| > x\}\,dx.$$
According to (1.2), $P\{|X| > x\} \le \|X\|_{p,\infty}^p\,x^{-p}$. From here and the previous the needed inequality follows. □
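Proposition 11 can be checked in closed form for $X$ uniform on $[-a, a]$, where $EX^2 = a^2/3$ and the weak-$L_p$ quantity $\sup_x x^p P\{|X| > x\}$ is approximated on a grid. This is a numerical sketch under the constant $2a^{2-p}/(2-p)$ as reconstructed above; the function names and the choice of law are ours:

```python
def weak_lp_pnorm_uniform(p, a, grid=100000):
    """Approximate ||X||_{p,inf}^p = sup_x x^p * P{|X| > x}
    for X uniform on [-a, a], where P{|X| > x} = 1 - x/a on [0, a]."""
    return max(((i * a / grid) ** p) * (1.0 - i / grid) for i in range(grid + 1))

def check_prop11(p, a):
    ex2 = a * a / 3.0                                    # EX^2 for the uniform law
    bound = 2.0 / (2.0 - p) * a ** (2.0 - p) * weak_lp_pnorm_uniform(p, a)
    return ex2, bound
```

For $p = 1$, $a = 1$ the bound evaluates to $1/2$ against $EX^2 = 1/3$, so the inequality holds with room to spare.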
Lemma 5. Let $\{U_k\}_{k=1}^n$ be independent r.v.s such that $|U_k| \le a$ and $EU_k = 0$ ($1 \le k \le n$). Let a r.i. space $E$ satisfy the upper $p$-estimate ($1 < p \le 2$) and, if $p = 2$, let $E \subset L_2(\Omega)$. Put $u = \sum_{k=1}^n \|U_k\|_E^p$. Then for every $x > 0$
$$P\left\{\left|\sum_{k=1}^n U_k\right| > x\right\} \le 2\exp\left(-\frac{x}{2a}\log\left(1 + \frac{\gamma x}{u}\right)\right), \qquad (30)$$
where $\gamma > 0$ depends on $a$, $p$ and $E$ only.

Proof: Let's put $\sigma^2 = \sum_{k=1}^n EU_k^2$. If $p < 2$, then from (1.2) and Proposition 10, $\|X\|_{p,\infty} \le b^{1/p}\|X\|_E$ for each $X \in E$. Proposition 11 then yields $\sigma^2 \le 2b\,a^{2-p}u/(2-p)$. If $p = 2$, then $E \subset L_2(\Omega)$. Hence $EX^2 \le A\|X\|_E^2$ for some constant $A$, which yields $\sigma^2 \le Au$. Now we apply Prokhorov's "arcsinh" inequality (see the section 1.4). We have $\operatorname{arcsinh}(t) \ge \log(1+t)$, which yields $\operatorname{arcsinh}\left(ax/(2\sigma^2)\right) \ge \log(1 + \gamma x/u)$, where $\gamma = a/(2A)$ for $p = 2$ and $\gamma = a^{p-1}(2-p)/(4b)$ if $1 < p < 2$. From here (30) follows. □

3. Proof of Theorem 3. Sufficiency. First we prove (29) for symmetric r.v.s. Let $\{X_k\}_{k=1}^n \subset E$ be independent symmetric r.v.s. We may assume
$$\sum_{k=1}^n \|X_k\|_E^p = 1. \qquad (31)$$
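The elementary step $\operatorname{arcsinh}(t) \ge \log(1+t)$ used in the proof of Lemma 5 follows from $\sqrt{1+t^2} \ge 1$; a one-line numerical confirmation (the function name is ours):

```python
import math

# the step used in Lemma 5: arcsinh(t) >= log(1 + t) for t >= 0,
# since arcsinh(t) = log(t + sqrt(1 + t^2)) and sqrt(1 + t^2) >= 1
def arcsinh_minus_log1p(t):
    return math.asinh(t) - math.log1p(t)
```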
Let's put
$$a = b^{1/p}, \qquad (32)$$
where $b$ is the constant from Proposition 10, and
$$U_k = X_k I_{\{|X_k| \le a\}}, \qquad V_k = X_k - U_k. \qquad (33)$$
Then
$$\left\|\sum_{k=1}^n X_k\right\|_E \le \left\|\sum_{k=1}^n U_k\right\|_E + \left\|\sum_{k=1}^n V_k\right\|_E. \qquad (34)$$
We estimate each term in the right-hand side.
Let $Y$ be a r.v. with the Poisson distribution. The condition $E \in K$ implies $Y \in E$. It is well known that $P\{Y > x\} \ge C\exp(-x\log(1+x))$, where $C > 0$ is a constant. Lemma 5, (31) and (32) imply (30) with $u = 1$. Hence there is a constant $v > 0$ such that for all $x > 0$ the estimate $P\left\{\left|\sum_{k=1}^n U_k\right| > x\right\} \le P\{vY > x\}$ holds. From here and Proposition 1.2,
$$\left\|\sum_{k=1}^n U_k\right\|_E \le v\,\|Y\|_E. \qquad (35)$$
Proposition 10 and the relations (31)-(33) imply
$$\sum_{k=1}^n P\{V_k \ne 0\} = \sum_{k=1}^n P\{|X_k| > a\} \le b\,a^{-p}\sum_{k=1}^n \|X_k\|_E^p = 1. \qquad (36)$$
From here and Lemma 1.4,
$$\left\|\sum_{k=1}^n V_k\right\|_E \le B(E)\left\|\sum_{k=1}^n \bar V_k\right\|_E,$$
where $\{\bar V_k\}_{k=1}^n$ are mutually disjoint r.v.s equimeasurable with $\{V_k\}_{k=1}^n$. Since $E$ satisfies the upper $p$-estimate, the term in the right-hand side is not greater than
$$A\,B(E)\left(\sum_{k=1}^n \|V_k\|_E^p\right)^{1/p} \le A\,B(E)\left(\sum_{k=1}^n \|X_k\|_E^p\right)^{1/p} = A\,B(E),$$
where $A$ is a constant. So,
$$\left\|\sum_{k=1}^n X_k\right\|_E \le H, \qquad (37)$$
where $H$ depends on $E$ and $p$ only.
Now we consider the general case. Let $\{Y_k\}_{k=1}^n$ be an independent copy of $\{X_k\}_{k=1}^n$ and $Z_k = X_k - Y_k$. Then $\|Z_k\|_E \le \|X_k\|_E + \|Y_k\|_E = 2\|X_k\|_E$ and Proposition 1.11 gives us
$$\left\|\sum_{k=1}^n Z_k\right\|_E \ge C(E)\left\|\sum_{k=1}^n X_k\right\|_E.$$
From here (29) follows with $B = 2H/C(E)$. □

4. Proof of Theorem 3. Necessity. Let $E \in K$ and $E \in (BE)_p$. Let $\{X_k\}_{k=1}^n \subset E$ be mutually disjoint r.v.s. We may suppose $X_k$ to be symmetric. Applying (29) and Theorem 1.1, we get the desired statement. □

5. Proof of Theorem 4. The sufficiency is proved as above. The inequality (35) follows from the condition $E \subset L_2(\Omega)$ and the bound (30) for $p = 2$.

Now we turn to the necessity. The upper 2-estimate is proved as above. To show $E \subset L_2(\Omega)$ we need the next assertions [18].
Lemma 6. Let $\{X_k\}_{k=1}^\infty$ be equidistributed independent r.v.s such that
$$C = \sup_n E\left|n^{-1/2}\sum_{k=1}^n X_k\right| < \infty. \qquad (38)$$
Then $EX_1 = 0$ and $EX_1^2 < \infty$.
Proof: We have $Cn^{1/2} \ge E\left|\sum_{k=1}^n X_k\right| \ge \left|E\sum_{k=1}^n X_k\right| = n\,|EX_1|$. From here $EX_1 = 0$. Let's put $Y_k = X_{2k} - X_{2k-1}$. The r.v.s $\{Y_k\}_{k=1}^\infty$ are symmetric and independent. For $a > 0$ we put
$$Y_{k,a} = Y_k I_{\{|Y_k| \le a\}}, \qquad U_{k,a} = Y_k - 2Y_{k,a}.$$
From symmetry $U_{k,a} \stackrel{d}{=} Y_k$ and, therefore,
$$E\left|\sum_{k=1}^n Y_{k,a}\right| = \frac{1}{2}\,E\left|\sum_{k=1}^n (Y_k - U_{k,a})\right| \le E\left|\sum_{k=1}^n Y_k\right|.$$
Hence for $\{Y_{k,a}\}_{k=1}^\infty$ the estimate (38) holds with the same constant. Put $\sigma^2(a) = EY_{1,a}^2$ and let $Z$ be a r.v. with the normal distribution and the parameters $(0, 1)$. The Central Limit Theorem [35] yields
$$\lim_{n\to\infty} E\left|n^{-1/2}\sum_{k=1}^n Y_{k,a}\right| = \sigma(a)\,E|Z|.$$
Therefore $\sigma(a)\,E|Z| \le C$ for each $a > 0$. Letting $a \to \infty$, we obtain $EY_1^2 < \infty$, which implies $EX_1^2 < \infty$. □

We continue to prove Theorem 4. Let $X \in E$, $EX = 0$ and let $\{X_k\}_{k=1}^\infty$ be independent r.v.s equidistributed with $X$. Since $E \in (BE)_2$, the estimate (29) implies
$$\left\|n^{-1/2}\sum_{k=1}^n X_k\right\|_E \le B\left(n^{-1}\sum_{k=1}^n \|X_k\|_E^2\right)^{1/2} = B\|X\|_E,$$
which together with Proposition 1.1 gives (38). According to Lemma 6, $X \in L_2(\Omega)$ and $E \subset L_2(\Omega)$. □
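The proof of Lemma 6 rests on the convergence $E\left|n^{-1/2}\sum Y_{k,a}\right| \to \sigma(a)E|Z|$ with $E|Z| = \sqrt{2/\pi}$. As an illustration (our choice of law, not from the text), the normalized first absolute moment of a Rademacher sum can be computed exactly and seen to approach $\sqrt{2/\pi} \approx 0.7979$:

```python
from math import comb, sqrt, pi

def mean_abs_rademacher_sum(n):
    """E|e_1 + ... + e_n| for iid Rademacher signs, computed exactly:
    j plus-signs give the sum 2j - n with probability C(n, j) / 2^n."""
    total = sum(comb(n, j) * abs(2 * j - n) for j in range(n + 1))
    return total / 2 ** n
```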
3. Upper estimates of the Rosenthal type

1. Introduction and results. Here we'll study the question of the upper estimate in (5). We assume that $E = E_1$, $E_2 = L_2(\Omega)$, $\ell_1 = l_p$ and $\ell_2 = l_2$, where $2 < p < \infty$. For r.v.s $\{X_k\}_{k=1}^n \subset E \cap L_2(\Omega)$ we put
$$B_{E,p} = \max\left\{\left(\sum_{k=1}^n \|X_k\|_E^p\right)^{1/p},\ \left(\sum_{k=1}^n EX_k^2\right)^{1/2}\right\}. \qquad (39)$$
Definition 6. We say that a r.i. space $E$ has the weak Rosenthal $p$-property ($E \in (WR)_p$) if
$$\left\|\sum_{k=1}^n X_k\right\|_E \le D\,B_{E,p} \qquad (40)$$
for all independent r.v.s $\{X_k\}_{k=1}^n \subset E \cap L_2(\Omega)$ with mean zero and some constant $D$ depending on $E$ and $p$ only.
Theorem 5. Suppose $E \in K$ and $E$ satisfies the upper $p$-estimate. Then $E \in (WR)_p$.

The condition $E \in K$ is essential, since the space $L_\infty(\Omega)$ satisfies the upper $p$-estimate for every $p \ge 1$ and $L_\infty(\Omega) \notin (WR)_p$ for $p > 2$. The reverse assertion is not true. Indeed, $L_2(\Omega) \in (WR)_p$ and $L_2(\Omega)$ does not satisfy the upper $p$-estimate for $p > 2$. We recall that $\beta(E)$ is the upper Boyd index of $E$ (see the section 1.2).
Theorem 6. Let $E \in K$, $\beta(E) < 1/2$ and $E \in (WR)_p$, where $p > 2$. Then $E$ satisfies the upper $p$-estimate.
Let's consider the case $p = 2$. If $E \supset L_2(\Omega)$, then $E \in (WR)_2$. Indeed, this assumption yields $\|X\|_E \le C\,\|X\|_{L_2(\Omega)}$
for each $X \in L_2(\Omega)$, where $C$ is independent of $X$. From here (40) follows.
Now we show that if $E \subset L_2(\Omega)$, then the conditions $E \in (BE)_2$ and $E \in (WR)_2$ are equivalent. It is obvious that the first of them implies the second one. Let $E \in (WR)_2$. Since $E \subset L_2(\Omega)$, then
$$\|X\|_{L_2(\Omega)} \le D\,\|X\|_E, \qquad (41)$$
where $D$ is a constant. Therefore
$$B_{E,2} \le \max\{1, D\}\left(\sum_{k=1}^n \|X_k\|_E^2\right)^{1/2}$$
and $E \in (BE)_2$. Applying Theorems 3-6 to the Lorentz spaces, which have the Kruglov property (Theorem 1.3), we obtain the following result.
Theorem 7. Let $1 \le p, q < \infty$ and $r = \min\{p, q\}$. Then
1) if $r < 2$, then $L_{p,q}(\Omega) \in (BE)_r$ and $L_{p,q}(\Omega) \notin (BE)_s$ for $s > r$;
2) if $r > 2$, then $L_{p,q}(\Omega) \in (WR)_r$ and $L_{p,q}(\Omega) \notin (WR)_s$ for $s > r$;
3) if $q = 2 < p$, then $L_{p,q}(\Omega) \in (BE)_2$ and $L_{p,q}(\Omega) \notin (WR)_s$ for $s > 2$;
4) if $p = 2 < q$, then $L_{p,q}(\Omega) \notin (BE)_2$ and $L_{p,q}(\Omega) \in (BE)_s$ for $s < 2$.
2. Proof of Theorem 5. Let $\{X_k\}_{k=1}^n \subset E \cap L_2(\Omega)$ be independent r.v.s such that
$$B_{E,p} = 1. \qquad (42)$$
We reason the way we did for the proof of Theorem 3. Let the r.v.s $U_k$ and $V_k$ be defined by (33). From (39) and (42), $\sigma^2 = \sum_{k=1}^n EU_k^2 \le 1$. Using Prokhorov's "arcsinh" inequality (see the section 1.4), we obtain (30) with $u = 1$, which implies (35). Applying the upper $p$-estimate, we get the bound for the second term in the right-hand side of (34). So, $\left\|\sum_{k=1}^n X_k\right\|_E \le H$, where $H$ depends on $E$ only, which yields (40). □
3. Proof of Theorem 6. We begin with an auxiliary inequality.

Proposition 12. Let $E \subset L_2(\Omega)$, $X \in E$ and $P\{X \ne 0\} = r$. Then
$$\|X\|_{L_2(\Omega)} \le D\,r^{1/2}\,\gamma_E\!\left(\frac{1}{r}\right)\|X\|_E, \qquad (43)$$
where $D = D(E)$ is a constant.
Proof: Let $b > 0$. For a r.v. $Y$ denote by $Y^{(b)}$ a r.v. such that for every $x > 0$
$$P\{Y^{(b)} > x\} = b\,P\{Y > x\}, \qquad P\{Y^{(b)} < -x\} = b\,P\{Y < -x\}.$$
It is easy to see that $Y^{(b)}$ exists if and only if $P\{Y \ne 0\} \le 1/b$. Since $r \le 1$, then $X \stackrel{d}{=} (X^{(1/r)})^{(r)}$. From here and (41),
$$\|X\|_{L_2(\Omega)} = r^{1/2}\left\|X^{(1/r)}\right\|_{L_2(\Omega)} \le D\,r^{1/2}\left\|X^{(1/r)}\right\|_E \le D\,r^{1/2}\gamma_E(1/r)\,\|X\|_E. \qquad \square$$
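The identity $\|X\|_{L_2(\Omega)}^2 = r\,\|X^{(1/r)}\|_{L_2(\Omega)}^2$ used above can be verified on a discrete law, where the passage to $X^{(1/r)}$ simply multiplies the mass of every nonzero value by $1/r$; a small sketch (function names and the example law are ours):

```python
def amplify(dist, b):
    """Given dist mapping value -> probability for a r.v. Y with an atom
    at 0, return the law of Y^{(b)}: every nonzero mass multiplied by b."""
    out = {v: b * p for v, p in dist.items() if v != 0}
    mass = sum(out.values())
    assert mass <= 1 + 1e-12, "Y^{(b)} exists only if P{Y != 0} <= 1/b"
    out[0.0] = 1 - mass
    return out

def second_moment(dist):
    return sum(p * v * v for v, p in dist.items())
```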
Let's turn to the proof of Theorem 6. Suppose $\beta(E) < 1/2$, $E \in K$ and $E \in (WR)_p$, but $E$ does not satisfy the upper $p$-estimate. Then for every sequence $a_n \nearrow \infty$ we may choose r.v.s $\{X_{k,n}\}_{k=1}^{m(n)}$ which are mutually disjoint for each $n \in \mathbb{N}$ and such that
$$\left\|\sum_{k=1}^{m(n)} X_{k,n}\right\|_E^p > a_n\sum_{k=1}^{m(n)} \|X_{k,n}\|_E^p.$$
Let $0 < r_n \le 1$. There are mutually disjoint r.v.s $\{Z_{k,n}\}_{k=1}^{m(n)}$ for which
$$Z_{k,n} \stackrel{d}{=} X_{k,n}^{(r_n)} \qquad (1 \le k \le m(n)).$$
It is easy to verify that
$$\sum_{k=1}^{m(n)} Z_{k,n} \stackrel{d}{=} \left(\sum_{k=1}^{m(n)} X_{k,n}\right)^{(r_n)}.$$
If $0 < r \le 1$, then $X \stackrel{d}{=} (X^{(r)})^{(1/r)}$. Using Proposition 1.7, we get $\|X\|_E \le \gamma_E(1/r)\left\|X^{(r)}\right\|_E \le r^{-1}\left\|X^{(r)}\right\|_E$, which yields
$$\left\|\sum_{k=1}^{m(n)} Z_{k,n}\right\|_E^p \ge r_n^p\left\|\sum_{k=1}^{m(n)} X_{k,n}\right\|_E^p > a_n r_n^p\sum_{k=1}^{m(n)} \|X_{k,n}\|_E^p \ge a_n r_n^p\sum_{k=1}^{m(n)} \|Z_{k,n}\|_E^p. \qquad (44)$$
Here we use the inequality $\|X_{k,n}\|_E \ge \|Z_{k,n}\|_E$, which follows from the condition $r_n \le 1$.
Since $\beta(E) < 1/2$, then Proposition 1.9 implies $E \subset L_2(\Omega)$. According to Proposition 1.8, there are constants $\mu \in (\beta(E), 1/2)$ and $v > 0$ such that $\gamma_E(t) \le t^\mu$ for $t \ge v$. Choose $r_n$ under the condition
$$r_n^{-1} \ge \max\{1, v\}. \qquad (45)$$
Then (43) implies
$$\|Z_{k,n}\|_{L_2(\Omega)} \le D\,r_n^{1/2-\mu}\,\|Z_{k,n}\|_E. \qquad (46)$$
Without loss of generality we may assume the r.v.s $Z_{k,n}$ to be symmetric. Let $Y_{k,n}$ be independent r.v.s, $Y_{k,n} \stackrel{d}{=} Z_{k,n}$. According to (46), we have
$$B_{E,p} \le \max\left\{\left(\sum_{k=1}^{m(n)} \|Y_{k,n}\|_E^p\right)^{1/p},\ D\,r_n^{1/2-\mu}\left(\sum_{k=1}^{m(n)} \|Y_{k,n}\|_E^2\right)^{1/2}\right\},$$
where $B_{E,p}$ is determined by the r.v.s $\{Y_{k,n}\}_{k=1}^{m(n)}$. Let's put
$$c_n = \sup\frac{\left(\sum_{k=1}^{m(n)} x_k^2\right)^{1/2}}{\left(\sum_{k=1}^{m(n)} |x_k|^p\right)^{1/p}},$$
where the supremum is taken over all non-zero vectors in $\mathbb{R}^{m(n)}$. Then
$$B_{E,p} \le \max\left\{1,\ c_n D\,r_n^{1/2-\mu}\right\}\left(\sum_{k=1}^{m(n)} \|Y_{k,n}\|_E^p\right)^{1/p}.$$
Theorem 1.1 and (44) give us
$$\left\|\sum_{k=1}^{m(n)} Y_{k,n}\right\|_E^p \ge \frac{1}{4}\left\|\sum_{k=1}^{m(n)} Z_{k,n}\right\|_E^p \ge \frac{a_n r_n^p}{4}\sum_{k=1}^{m(n)} \|Z_{k,n}\|_E^p = \frac{a_n r_n^p}{4}\sum_{k=1}^{m(n)} \|Y_{k,n}\|_E^p.$$
Therefore
$$\left\|\sum_{k=1}^{m(n)} Y_{k,n}\right\|_E \ge b_n B_{E,p}, \qquad (47)$$
where
$$b_n = r_n\left(\frac{a_n}{4}\right)^{1/p}\min\left\{1,\ r_n^{\mu-1/2}(Dc_n)^{-1}\right\}.$$
Let's choose $r_n$ under the conditions (45) and $r_n^{\mu-1/2}(Dc_n)^{-1} \ge 1$. Since $\mu < 1/2$, this is possible. Putting $a_n = 4(n/r_n)^p$, we get $b_n = n$, and (47) gives the bound
$$\left\|\sum_{k=1}^{m(n)} Y_{k,n}\right\|_E \ge n\,B_{E,p} \qquad (n = 1, 2, \ldots).$$
This contradicts the condition $E \in (WR)_p$ and proves the theorem. □
4. Proof of Theorem 7. Since
$$\alpha(L_{p,q}) = \beta(L_{p,q}) = \frac{1}{p} > 0, \qquad (48)$$
then $L_{p,q}(\Omega) \in K$. Theorem 3 and Proposition 1.12 imply 1). If $r = \min\{p, q\} > 2$, then from (48)
$$\beta(L_{p,q}) < \frac{1}{2}. \qquad (49)$$
Using Theorems 5 and 6 and Proposition 1.12, we obtain 2). For $q = 2 < p$ we have $L_{p,q}(\Omega) \subset L_2(\Omega)$. Applying Proposition 1.12 and Theorem 4, we get $L_{p,q}(\Omega) \in (BE)_2$. If $q = 2 < p$, then, according to Proposition 1.12, the space $L_{p,q}(\Omega)$ does not satisfy the upper $s$-estimate for $s > 2$. So, (49) and Theorem 6 yield $L_{p,q}(\Omega) \notin (WR)_s$.

Now let's consider the case $p = 2 < q$. Then the space $L_{p,q}(\Omega)$ is not contained in $L_2(\Omega)$. Theorem 4 gives us $L_{p,q}(\Omega) \notin (BE)_2$. According to Proposition 1.12, $L_{p,q}(\Omega)$ satisfies the upper 2-estimate, which yields the upper $s$-estimate for $s < 2$. Theorem 3 implies that $L_{p,q}(\Omega) \in (BE)_s$. □
4. Estimates in exponential Orlicz spaces

1. Results. Let's put for $p > 0$
$$N_p(x) = \exp(|x|^p) - \sum_{k=0}^{[1/p]}\frac{|x|^{kp}}{k!},$$
where $[x]$ denotes the integer part of $x$. This function is convex and even, $N_p(0) = 0$. Hence it determines the Orlicz space $L_{N_p}(\Omega)$. In this section we'll consider the question of the von Bahr-Esseen and Rosenthal properties for these spaces.
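A direct implementation of $N_p$ lets one confirm numerically that $N_p(0) = 0$ and that the function is midpoint-convex on a grid (a sanity check, not a proof; the function name is ours):

```python
import math

def N(p, x):
    """Orlicz function N_p(x) = exp(|x|^p) - sum_{k=0}^{[1/p]} |x|^{kp} / k!."""
    m = int(1.0 / p)                       # integer part of 1/p
    return math.exp(abs(x) ** p) - sum(abs(x) ** (k * p) / math.factorial(k)
                                       for k in range(m + 1))
```

For $p > 1$ the sum reduces to its $k = 0$ term, so $N_p(x) = \exp(|x|^p) - 1$.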
If $0 < p \le 1$, then Kruglov's Theorem (see the section 1.6) gives us $L_{N_p}(\Omega) \in K$. Using the results of the sections 2 and 3 and Proposition 1.13, we conclude that $L_{N_p}(\Omega) \in (WR)_q$ for every $q > 2$.

If $p > 1$, then the space $L_{N_p}(\Omega)$ does not have the Kruglov property. Indeed, let $F$ be the distribution of the r.v. $X \equiv 1$. Then $H(F)$ is the Poisson distribution with the parameter $\lambda = 1$. It is easy to verify that $L_{N_p}(\Omega)$ ($p > 1$) does not contain a r.v. $Y \in C(H(F))$, which yields $L_{N_p}(\Omega) \notin K$. Thus, the results of the previous sections cannot be applied to the spaces in question.

We denote the norm on the space $L_{N_p}(\Omega)$ by $\|X\|_{(p)}$. As usual, $p' = p/(p-1)$ for $p > 1$. For $\{X_k\}_{k=1}^n \subset L_{N_p}(\Omega)$ we put
$$A^{(p)} = \left(\sum_{k=1}^n \|X_k\|_{(p)}^{p'}\right)^{1/p'}, \qquad (50)$$
$$B^{(p)} = \max\left\{A^{(p)},\ \left(\sum_{k=1}^n EX_k^2\right)^{1/2}\right\} \qquad (51)$$
and
$$H^{(p)} = \begin{cases} B^{(p)} & \text{if } 1 < p \le 2,\\ A^{(p)} & \text{if } p > 2. \end{cases} \qquad (52)$$

Theorem 8. There exist positive constants $C(p)$ and $D(p)$ such that for every independent r.v.s $\{X_k\}_{k=1}^n \subset L_{N_p}(\Omega)$ with mean zero and all $x > 0$
$$P\left\{\left|\sum_{k=1}^n X_k\right| > x\,H^{(p)}\right\} \le C(p)\exp\left(-D(p)\,x^p\right). \qquad (53)$$
From here and Proposition 1.2 the next statement follows.
Theorem 9. If $1 < p \le 2$, then $L_{N_p}(\Omega) \in (WR)_{p'}$, and if $p > 2$, then $L_{N_p}(\Omega) \in (BE)_{p'}$.

Let's consider independent symmetric r.v.s $\{Y_k\}_{k=1}^\infty$ such that for every $x > 0$ and $k \in \mathbb{N}$
$$P\{|Y_k| > x\} = \exp(-x^p), \qquad (54)$$
where $p > 0$. It is clear that $Y_k \in L_{N_p}(\Omega)$. Put $r(p) = 2$ if $0 < p \le 2$, and $r(p) = p'$ if $p > 2$.

Theorem 10. There exist constants $C_j = C_j(p) > 0$ ($j = 1, 2$) such that for all $a_k \in \mathbb{R}$ and $n \in \mathbb{N}$
$$C_1\left(\sum_{k=1}^n |a_k|^{r(p)}\right)^{1/r(p)} \le \left\|\sum_{k=1}^n a_k Y_k\right\|_{(p)} \le C_2\left(\sum_{k=1}^n |a_k|^{r(p)}\right)^{1/r(p)}.$$
According to Theorem 10, $L_{N_p}(\Omega) \notin (BE)_s$ for $p > 2$ and $s > p'$. To prove these results we need some lemmas.
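R.v.s with the tail (54) are easy to simulate by inverse transform, since $P\{(-\log U)^{1/p} > x\} = \exp(-x^p)$ for $U$ uniform on $(0, 1]$; a sampling sketch (the function name is ours):

```python
import math
import random

def sample_Y(p, rng):
    """One draw of a symmetric r.v. with P{|Y| > x} = exp(-x^p),
    via inverse transform on the magnitude."""
    u = 1.0 - rng.random()          # uniform on (0, 1], avoids log(0)
    mag = (-math.log(u)) ** (1.0 / p)
    return mag if rng.random() < 0.5 else -mag
```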
2. Estimates for characteristic functions. Let $X$ be a r.v. and
$$P\{|X| > x\} \le b\exp(-c\,x^p) \qquad (55)$$
for every $x > 0$, where $p > 1$ and $b, c > 0$ are constants. Then the corresponding characteristic function extends to an entire function (see [34]). The following assertion is well known [34].

Lemma 7. The estimate (55) holds if and only if there are positive constants $\beta$ and $\gamma$ such that
$$|f(z)| \le \exp\left(\beta|z|^{p'}\right) \qquad (56)$$
for all complex $z$ with $|z| \ge \gamma$. The constants $\beta$ and $\gamma$ are determined by $b$, $c$ and $p$ and vice versa.

If $X \in L_{N_p}(\Omega)$ and $\|X\|_{(p)} = 1$, then one may easily verify that (55) holds with the constants $b = 2$ and $c = 2^{1/p}$. Hence the related characteristic function $f(z)$ is entire. Let's put for $m \in \mathbb{N}$
$$Q_m(X, z) = \sum_{j=1}^m \frac{(iz)^j\,EX^j}{j!}. \qquad (57)$$
Lemma 8. Let $X \in L_{N_p}(\Omega)$, $\|X\|_{(p)} = 1$ and $m = [p']$. Let $\gamma$ be the related number from Lemma 7. Then the corresponding characteristic function can be represented in the form
$$f(z) = 1 + Q_m(X, z) + v(z)\,|z|^{\max\{2,\,p'\}},$$
where $\sup\{|v(z)| : |z| \le \gamma,\ z \in \mathbb{C}\} \le A(p, \gamma) < \infty$ and $A(p, \gamma)$ depends on $p$ and $\gamma$ only.
Proof: Taylor's formula and the well known equality $EX^k = i^{-k}f^{(k)}(0)$ give us $f(z) = 1 + Q_m(X, z) + T(z)$. The remainder term is represented in the form
$$T(z) = \frac{f^{(m+1)}(u(z))\,z^{m+1}}{(m+1)!},$$
where $u(z)$ belongs to the segment joining $0$ and $z$. Applying Lemma 7, the formula
$$f^{(m+1)}(u) = i^{m+1}\int_{-\infty}^\infty x^{m+1}e^{iux}\,dF(x),$$
where $F(x) = P\{X \le x\}$, and the bound (55) with $b = 2$ and $c = 2^{1/p}$, we obtain the estimate
$$\left|f^{(m+1)}(u(z))\right| \le \int_{-\infty}^\infty |x|^{m+1}e^{\gamma|x|}\,dF(x) \le B(p, \gamma) < \infty,$$
where $|z| \le \gamma$ and $B(p, \gamma)$ depends on $p$ and $\gamma$ only. Putting
$$v(z) = T(z)\,|z|^{-\max\{2,\,p'\}},$$
we obtain the needed representation. □
Let $0 < r_1 < \ldots < r_n < \infty$. Then
$$\sum_{k=1}^n t^{r_k} \le C\left(t^{r_1} + t^{r_n}\right)$$
for all $t > 0$, where $C$ depends on these exponents only; hence the inequality
$$\sum_{k=1}^n E|X|^{r_k} \le C\left(E|X|^{r_1} + E|X|^{r_n}\right)$$
holds as well.
Lemma 9. Let $X \in L_{N_p}(\Omega)$, $EX = 0$ and $1 < p \le 2$. Then
$$|f(z)| \le \exp\left(C(p)\left(|z|^2\,EX^2 + |z|^{p'}\,\|X\|_{(p)}^{p'}\right)\right)$$
for every complex $z$. If $p > 2$, then
$$|f(z)| \le \exp\left(C(p)\min\left\{|z|^2\,\|X\|_{(p)}^2,\ |z|^{p'}\,\|X\|_{(p)}^{p'}\right\}\right).$$
Proof: Assume $\|X\|_{(p)} = 1$. Then (55) holds, where $b = 2$ and $c = 2^{1/p}$. According to Lemma 7 there are positive constants $\beta(p)$ and $\gamma(p)$ such that (56) holds if $|z| \ge \gamma(p)$. Let $1 < p \le 2$. Since $EX = 0$, we get from the previous
$$|Q_m(X, z)| \le \sum_{j=2}^m \frac{|z|^j\,E|X|^j}{j!} + E|zX|^{p'}.$$
There is a constant $D = D(p)$ such that $E|Y|^{p'} \le D\,\|Y\|_{(p)}^{p'}$ for every $Y \in L_{N_p}(\Omega)$. Hence
$$|Q_m(X, z)| \le C_1\left(|z|^2\,E|X|^2 + |z|^{p'}\,\|X\|_{(p)}^{p'}\right).$$
The condition $\|X\|_{(p)} = 1$, Lemma 8 and the inequality $1 + x \le \exp(x)$ imply
$$|f(z)| \le \exp\left(C_2(p)\left(|z|^2\,EX^2 + |z|^{p'}\right)\right)$$
for $|z| \le \gamma(p)$. From here and Lemma 7 the needed bound follows.

If $p > 2$, then $m = 1$. Since $EX = 0$, we get $Q_m(X, z) \equiv 0$. From Lemma 8, if $|z| \le \gamma(p)$, then $|f(z)| \le 1 + A(p)|z|^2 \le \exp\left(A(p)|z|^2\right)$ and, since $p' < 2$, we obtain from here and (56)
$$|f(z)| \le \exp\left(C(p)\min\{|z|^2,\ |z|^{p'}\}\right).$$
Now we remove the assumption $\|X\|_{(p)} = 1$. Let's put $t = \|X\|_{(p)}$ and $Y = t^{-1}X$. Denoting the characteristic function of $Y$ by $g(z)$, we have $g(z) = f(z/t)$. Using the obtained estimates for $g(z)$, we get the needed estimates for $f(z)$. □
Lemma 10. Let a r.v. $X$ have the entire characteristic function $f(z)$. Suppose there are positive constants $a$ and $b$ such that $P\{|X| > x\} \ge b\exp(-ax^p)$ for $x \ge b$. Then there exist positive constants $\alpha$ and $\gamma$ such that $f(-it) \ge \exp\left(\alpha|t|^{p'}\right)$ if $|t| \ge \gamma$.
Proof: We have for $t > 0$
$$f(-it) = \int_{-\infty}^\infty e^{tx}\,dF(x) \ge \int_{v\,t^{p'-1}}^\infty e^{tx}\,dF(x) \ge \exp\left(v\,t^{p'}\right)P\left\{X \ge v\,t^{p'-1}\right\},$$
where $F(x) = P\{X \le x\}$ and $v > 0$ will be chosen later. From the assumptions of the lemma
$$f(-it) \ge b\exp\left(v\,t^{p'} - a\left(v\,t^{p'-1}\right)^p\right).$$
Since $p(p'-1) = p'$, then
$$f(-it) \ge \exp\left(v\,t^{p'}\left(1 - a\,v^{p-1}\right) + \log b\right).$$
Let's choose $v$ under the condition $1 - a\,v^{p-1} > 0$. Then, if $|t|$ is sufficiently large, the needed estimate holds. □
3. Proof of Theorem 8. Let the r.v.s $\{X_k\}_{k=1}^n \subset L_{N_p}(\Omega)$ be independent and $EX_k = 0$ ($1 \le k \le n$). We may assume without loss of generality that
$$H^{(p)} = 1. \qquad (58)$$
Let's denote $\lambda_k = \|X_k\|_{(p)}$ and $Y_k = \lambda_k^{-1}X_k$. Let $f_k(z)$ be the characteristic function of $Y_k$. Then the sum $S = \sum_{k=1}^n X_k$ has the characteristic function
$$f(z) = \prod_{k=1}^n f_k(\lambda_k z). \qquad (59)$$
Suppose $1 < p \le 2$. Applying Lemma 9 to every function $f_k(z)$ and using (52) and (58), we obtain
$$|f(z)| \le \exp\left(C(p)\left(\left(|z|H^{(p)}\right)^2 + \left(|z|H^{(p)}\right)^{p'}\right)\right) = \exp\left(C(p)\left(|z|^2 + |z|^{p'}\right)\right).$$
Since $p' \ge 2$, then $|z|^2 \le |z|^{p'}$ if $|z| \ge 1$. From here and Lemma 7 the needed estimate follows.

Let $p > 2$. According to (59) and Lemma 9,
$$|f(z)| \le \exp\left(C(p)\sum_{k=1}^n\min\left\{|\lambda_k z|^2,\ |\lambda_k z|^{p'}\right\}\right).$$
From (58) and (52), $\lambda_k \le 1$. Since $p' < 2$, then $\lambda_k^2 \le \lambda_k^{p'}$. Hence
$$\sum_{k=1}^n\min\left\{|\lambda_k z|^2,\ |\lambda_k z|^{p'}\right\} \le \sum_{k=1}^n \lambda_k^{p'}\min\left\{|z|^2,\ |z|^{p'}\right\} \le \min\left\{|z|^2,\ |z|^{p'}\right\}.$$
This estimate and (58) imply that if $|z| \ge 1$, then
$$|f(z)| \le \exp\left(C(p)\min\{|z|^2,\ |z|^{p'}\}\right) = \exp\left(C(p)|z|^{p'}\right).$$
Applying Lemma 7, we obtain Theorem 8. □
4. Proof of Theorem 10. Let $0 < p \le 1$. It was mentioned above that $L_{N_p}(\Omega) \in (BE)_2$ in this case. If $1 < p \le 2$, then according to Theorem 9, $L_{N_p}(\Omega) \in (WR)_{p'}$. Since $p' \ge 2$, then $L_{N_p}(\Omega) \in (BE)_2$. So,
$$\left\|\sum_{k=1}^n a_k Y_k\right\|_{(p)} \le C_2\left(\sum_{k=1}^n a_k^2\right)^{1/2}$$
for $0 < p \le 2$, where $C_2 = C_2(p)$ is a constant. Since $L_{N_p}(\Omega) \subset L_2(\Omega)$, then
$$\left\|\sum_{k=1}^n a_k Y_k\right\|_{(p)} \ge D\left\|\sum_{k=1}^n a_k Y_k\right\|_{L_2(\Omega)} = D\,\|Y_1\|_{L_2(\Omega)}\left(\sum_{k=1}^n a_k^2\right)^{1/2},$$
where $D = D(p) > 0$ is a constant. Therefore for $0 < p \le 2$ Theorem 10 is proved.
Let p > 2. Theorem 9 implies the upper estimate. To prove the lower one we need the following auxiliary statement.
Lemma 11. Let the conditions of Theorem 10 be fulfilled. Then for every positive $b$ and $c$ there is a positive $D = D(b, c, p)$ with the following property: if for every $x > 0$
$$P\left\{\left|\sum_{k=1}^n a_k Y_k\right| > x\right\} \le b\exp\left(-c\,x^p\right), \qquad (60)$$
then
$$\sum_{k=1}^n |a_k|^{r(p)} \le D.$$

Proof: Let $p > 2$. Let's denote the characteristic function of the r.v. $Y_1$ by $f(z)$. Since $Y_1$ is symmetric, then
$$f(z) = 1 - \frac{EY_1^2}{2}\,z^2 + o(|z|^2)$$
as $z \to 0$. Hence $f(-it) \ge \exp(a_0 t^2)$ for sufficiently small $t \in \mathbb{R}$ and some positive constant $a_0$. It is easy to verify that from (54) the strict inequality $f(-it) > 1$ follows for each real non-zero $t$. Applying Lemma 10, we obtain the estimate
$$f(-it) \ge \exp\left(b_1\min\left\{t^2,\ |t|^{p'}\right\}\right),$$
which holds for all $t \in \mathbb{R}$ and a positive constant $b_1$. Suppose (60) holds. Let's denote the characteristic function of the sum $S = \sum_{k=1}^n a_k Y_k$ by $g(z)$. We have
$$g(-it) = \prod_{k=1}^n f(-ia_k t) \ge \exp\left(b_1\sum_{k=1}^n\min\left\{(a_k t)^2,\ |a_k t|^{p'}\right\}\right).$$
Lemma 7 and (60) imply that there are positive constants $\beta$ and $\gamma$, depending on $b$ and $c$ only, such that
$$|g(z)| \le \exp\left(\beta|z|^{p'}\right)$$
if $|z| \ge \gamma$. From the last relations
$$b_1\sum_{k=1}^n\min\left\{(a_k t)^2,\ |a_k t|^{p'}\right\} \le \beta|t|^{p'}$$
if $|t| \ge \gamma$. Since $r(p) = p' < 2$, then, letting $t \to \infty$, we obtain
$$\sum_{k=1}^n |a_k|^{r(p)} \le \frac{\beta}{b_1},$$
and in the case $p > 2$ the proof is complete. If $0 < p \le 2$, then (60) gives us
$$\sum_{k=1}^n a_k^2\,EY_1^2 = E\left(\sum_{k=1}^n a_k Y_k\right)^2 \le C(b, c) < \infty,$$
and the Lemma follows. □
We continue to prove Theorem 10. Suppose that for $p > 2$ the lower estimate is not true. Then there are real numbers $\{a_{k,j}\}_{k=1}^{n(j)}$ ($j = 1, 2, \ldots$) such that
$$\sum_{k=1}^{n(j)} |a_{k,j}|^{r(p)} = 1, \qquad \left\|\sum_{k=1}^{n(j)} a_{k,j}\,Y_k\right\|_{(p)} \le 2^{-j}. \qquad (61)$$
Put $m(0) = 0$, $m(j) = n(1) + \ldots + n(j)$ and consider the sums
$$S_i = \sum_{j=1}^i\sum_{k=1}^{n(j)} a_{k,j}\,Y_{m(j-1)+k}. \qquad (62)$$
According to (61), $\|S_i\|_{(p)} \le 1$ for all $i \in \mathbb{N}$, which implies the estimate (55) for $S_i$ with the constants $b = 2$ and $c = 2^{1/p}$. Applying Lemma 11, we obtain
$$\sum_{j=1}^i\sum_{k=1}^{n(j)} |a_{k,j}|^{r(p)} \le D,$$
where $D$ does not depend on $i$. But according to (61), the left-hand side is equal to $i$. Hence the last inequality cannot be true for all $i$. This contradiction proves Theorem 10. □
CHAPTER III
LINEAR COMBINATIONS OF INDEPENDENT RANDOM VARIABLES IN REARRANGEMENT INVARIANT SPACES
1. $l_q$-estimates ($q \ne 2$)
1. Introduction and results. Throughout this chapter $\{X_k\}_{k=1}^\infty$ is a sequence of independent identically distributed random variables (i.i.d.r.v.s), $E$ is a r.i. space and $q$ is a fixed positive number. We suppose that $\{X_k\}_{k=1}^\infty \subset E$ and consider estimates of the types
$$\left\|\sum_{k=1}^n a_k X_k\right\|_E \le C\,\|a\|_q$$
and
$$\left\|\sum_{k=1}^n a_k X_k\right\|_E \ge B\,\|a\|_q,$$
where $a = \{a_k\}_{k=1}^n$ and $B$ and $C$ are positive constants independent of $n$ and $a_k$. We call these estimates the upper and lower $l_q$-estimates respectively.
Theorem 1. Suppose there exist positive constants $C_1$ and $C_2$ such that for every $n \in \mathbb{N}$
$$C_1 n^{1/q} \le \left\|\sum_{k=1}^n X_k\right\|_E \le C_2 n^{1/q}. \qquad (1)$$
Then $1 \le q \le 2$. If $q > 1$, then $EX_1 = 0$.

Let $Z_q$ be a symmetric r.v. such that for all $x > 0$
$$P\{|Z_q| > x\} = \min\{x^{-q},\ 1\}. \qquad (2)$$
Theorem 2. Let $1 \le q < 2$, $E \supset L_{q,\infty}(\Omega)$ and
$$\lim_{t\to 0} t^{-1/q}\phi_E(t) = 0. \qquad (3)$$
Suppose (1) is fulfilled. Then $EX_1 = 0$ and there exist positive constants $a$, $b$ and $c$ such that for $x \ge c$
$$a\,x^{-q} \le P\{|X_1| > x\} \le b\,x^{-q}. \qquad (4)$$
In this connection the upper estimate in (1) implies the same estimate in (4) and the condition $EX_1 = 0$.
Theorem 3. Suppose $1 \le q < 2$, $E \supset L_{q,\infty}(\Omega)$, $EX_1 = 0$ and (4) holds. Then for all $a_k \in \mathbb{R}$ and integers $n$
$$D_1\left(\sum_{k=1}^n |a_k|^q\right)^{1/q} \le \left\|\sum_{k=1}^n a_k X_k\right\|_E \le D_2\left(\sum_{k=1}^n |a_k|^q\right)^{1/q}, \qquad (5)$$
where $D_1$ and $D_2$ are positive constants depending on $E$ and the constants from (4) only. In addition, the right-hand side of (5) follows from the right-hand side of (4).
The next result shows that the condition (3) is essential. Indeed, for the space $L_{q,\infty}(\Omega)$ this condition doesn't hold.
Theorem 4. Let $1 \le q < 2$. There are i.i.d.r.v.s $\{X_k\}_{k=1}^\infty \subset L_{q,\infty}(\Omega)$ such that (5) holds and the lower estimate in (4) doesn't hold.

The condition $E \supset L_{q,\infty}(\Omega)$ is also essential. Consider the Orlicz space $L_{N_p}(\Omega)$ and i.i.d.r.v.s $\{Y_k\}_{k=1}^\infty$ for which (2.54) is fulfilled. If $p > 2$, then according to Theorem 2.10, the relation (5) holds, where $E = L_{N_p}(\Omega)$ and $q = p'$. But the lower estimate in (4) does not hold. It is clear that $L_{N_p}(\Omega)$ does not contain the space $L_{q,\infty}(\Omega)$.

One may easily verify that for every sequence of i.i.d.r.v.s $\{X_k\}_{k=1}^\infty \subset L_\infty(\Omega)$ such that $EX_1 = 0$, the inequality (5) holds with $q = 1$. Conversely, if a r.i. space $E$ has such a property, then $E = L_\infty(\Omega)$ (see [48]).

Now we consider the spaces $L_p(\Omega)$ for $0 < p < \infty$. If $p < 1$, the space $L_p(\Omega)$ is not a normed space and therefore Theorem 1 cannot be applied.

Theorem 5. Let $p, q > 0$, $q \ne 2$, and let $\{X_k\}_{k=1}^\infty$ be i.i.d.r.v.s. Suppose (1) holds for $E = L_p(\Omega)$ and, if $q = 1$,
$$\inf_n\left\|n^{-1}\sum_{k=1}^n X_k\right\|_{L_p(\Omega)} > 0.$$
Then
1) $0 < q < 2$;
2) the estimates (4) hold;
3) if $q > 1$, then $EX_1 = 0$;
4) if $q = 1$, then $\sup_n\left|E X_1 I_{\{|X_1| \le n\}}\right| < \infty$.
In addition, the upper estimate in (1) and the condition 1) imply the upper estimate in (4). The supplementary condition to (1) is essential. Indeed, let $X_k \equiv 1$, $q = 1$ and $p > 0$. Then (1) holds, but the lower estimate in (4) is not true.
Theorem 6. Let the conditions 1)-4) of Theorem 5 be fulfilled. Then for $E = L_p(\Omega)$ the estimate (5) holds. In this connection the upper estimate in (4) implies the same estimate in (5).

It should be noted that under the assumptions $E = L_p(\Omega)$ and $0 < p < q < 2$ the equivalence of the upper estimates in (1) and (4) was proved in [18].
2. Proof of Theorem 1. First we prove an auxiliary result.

Lemma 1. Let $E$ be a r.i. space, let $\{X_k\}_{k=1}^\infty \subset E$ be i.i.d.r.v.s and $X_1 \ne 0$. Then there is a constant $C > 0$, depending on $E$ and the distribution of the r.v. $X_1$ only, such that
$$\left\|\sum_{k=1}^n a_k X_k\right\|_E \ge C\left(\sum_{k=1}^n a_k^2\right)^{1/2} \qquad (6)$$
for all $a_k \in \mathbb{R}$ and integers $n$.

Proof: Put $Y_k = X_{2k} - X_{2k-1}$ and for $b > 0$
$$Y_{k,b} = Y_k I_{\{|Y_k| \le b\}}, \qquad U_{k,b} = 2Y_{k,b} - Y_k.$$
In view of symmetry, $U_{k,b} \stackrel{d}{=} Y_k$. Since $(U_{k,b} + Y_k)/2 = Y_{k,b}$, then
$$\left\|\sum_{k=1}^n a_k Y_{k,b}\right\|_E \le \frac{1}{2}\left(\left\|\sum_{k=1}^n a_k U_{k,b}\right\|_E + \left\|\sum_{k=1}^n a_k Y_k\right\|_E\right) = \left\|\sum_{k=1}^n a_k Y_k\right\|_E.$$
Choosing b > 0 so that Yk.,b # 0 and applying Paley and Zygmund's inequality (see section 1.4), we obtain n
(1:
Yk,b
2
k=1
> 17
k=1 ak/
where it > 0 is independent of it and ak . From here n
1/2
tt.fYk,b
k=1
E
k=1
where D > 0 is a constant. Taking into account the above, we obtain (6) for {Yk}k 1. Since the r.v.s {Xk.}o° 1 are identically distributed, then
<2 E
k=1
E
3. Linear combinations of independent random variables
which implies the needed estimate.
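The Paley–Zygmund step above can be checked numerically. The following sketch (ours, not part of the original text) verifies the inequality P{Z > θ·EZ} ≥ (1 − θ)²(EZ)²/E(Z²) by Monte Carlo for Z = (Σ a_k Y_k)² with bounded symmetric Y_k, which is the situation of the proof; the weights and sample sizes are arbitrary choices of ours.

```python
import random

# Numerical sanity check (not from the original text) of the
# Paley-Zygmund inequality used above: for a nonnegative r.v. Z and
# 0 < theta < 1,
#     P{Z > theta*EZ} >= (1 - theta)^2 * (EZ)^2 / E(Z^2).
# Here Z = (sum_k a_k Y_k)^2 with bounded symmetric Y_k = +-1.
random.seed(0)
theta = 0.5
a = [1.0, -2.0, 0.5, 3.0, -1.0, 2.0, -0.5, 1.5]
trials = 20000

samples = []
for _ in range(trials):
    s = sum(ak * random.choice([-1.0, 1.0]) for ak in a)  # symmetric, bounded summands
    samples.append(s * s)

ez = sum(samples) / trials
ez2 = sum(z * z for z in samples) / trials
frac = sum(1 for z in samples if z > theta * ez) / trials
bound = (1 - theta) ** 2 * ez ** 2 / ez2
assert frac >= bound
```

The point of the inequality in the proof is exactly this: the probability that the square of the sum exceeds a fixed fraction of its mean is bounded below by a constant depending only on moment ratios, not on n or the a_k.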
which implies the needed estimate. □

Let's turn to the proof of Theorem 1. According to (1),

    C_1 n^{1/q} ≤ || Σ_{k=1}^n X_k ||_E ≤ n ||X_1||_E

for every integer n. Therefore q ≥ 1. Using (1) and (6) we get

    C n^{1/2} ≤ || Σ_{k=1}^n X_k ||_E ≤ C_2 n^{1/q},

which implies q ≤ 2. Proposition 1.1 and (1) give us

    C_2 n^{1/q} ≥ || Σ_{k=1}^n X_k ||_E ≥ A || Σ_{k=1}^n X_k ||_{L_1(Ω)} ≥ A | E Σ_{k=1}^n X_k | = A n |EX_1|,

where A > 0 depends on E only. If q > 1, it follows that EX_1 = 0.
3. Some estimates for characteristic functions. To prove the results mentioned above we use characteristic functions. Here we find out how (4) affects the behaviour of the related characteristic function near zero.

Lemma 2. Let f(t) be the characteristic function corresponding to the r.v. X and 0 < q < 2. The condition (4) holds if and only if there are positive constants α, β and γ such that

    α |t|^q ≤ 1 − Re f(t) ≤ β |t|^q    (7)

for |t| ≤ γ. The constants α, β and γ are determined by a, b and c, and vice versa. The upper estimate in (4) is equivalent to the same estimate in (7).

First we prove the following auxiliary bound.
Proposition 1. Let the upper estimate in (4) hold and denote F(x) = P{X ≤ x}. Then for every r > q and t > 0

    ∫_{-1/t}^{1/t} |x|^r dF(x) ≤ (r/(r − q)) max{b, c^q} t^{q-r}.    (8)

Proof: It is obvious that

    ∫_{-1/t}^{1/t} |x|^r dF(x) = − ∫_0^{1/t} x^r d(1 − F(x) + F(−x)).

We integrate by parts. Outside the integral we have

    − t^{-r} (1 − F(1/t) + F(−1/t)) ≤ 0

for all t > 0. Hence

    ∫_{-1/t}^{1/t} |x|^r dF(x) ≤ r ∫_0^{1/t} x^{r-1} (1 − F(x) + F(−x)) dx.

According to (4),

    1 − F(x) + F(−x) ≤ P{|X| ≥ x} ≤ b x^{-q}

for x ≥ c. If 0 < x < c, then 1 − F(x) + F(−x) ≤ 1 ≤ c^q x^{-q}. The last inequalities imply the needed bound. □
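Proposition 1 can be tested against a law where the truncated moment is explicit; the following check (ours, not from the original) uses a Pareto-type tail, for which the upper estimate in (4) holds with b = c = 1.

```python
# Sanity check (ours) of Proposition 1 on a concrete law: let |X| be
# Pareto with P{|X| > x} = x^{-q} for x >= 1, so the upper estimate in
# (4) holds with b = c = 1.  The truncated moment is then explicit,
#     E|X|^r I{|X| <= M} = q/(r - q) * (M^{r-q} - 1),   M = 1/t,
# and Proposition 1 bounds it by (r/(r - q)) * max{b, c^q} * t^{q-r}.
q, r = 1.2, 2.0
for M in (2.0, 10.0, 100.0, 1000.0):
    truncated = q / (r - q) * (M ** (r - q) - 1.0)
    bound = r / (r - q) * M ** (r - q)       # here max{b, c^q} = 1
    assert 0.0 < truncated <= bound
```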
Proof of Lemma 2. Implication (4) ⇒ (7): We have

    1 − Re f(t) = ∫_{-∞}^{∞} (1 − cos(tx)) dF(x),    (9)

where F(x) is the same as above. Since 0 ≤ 1 − cos(x) ≤ min{2, x²}, then for t ≠ 0

    1 − Re f(t) ≤ (t²/2) ∫_{|x| ≤ 1/|t|} x² dF(x) + 2 ∫_{|x| > 1/|t|} dF(x).

We estimate the first integral using Proposition 1 with r = 2. The second integral equals 2 P{|X| > |t|^{-1}}. According to (4) it is not greater than 2b|t|^q if |t| ≤ 1/c. Therefore, if |t| ≤ 1/c = γ, then the upper estimate in (7) holds, where

    β = 2b + max{b, c^q}/(2 − q).    (10)

Our arguments show that this estimate follows from the upper one in (4).
Let's turn to the lower estimate. Applying (9) and the well known inequality 1 − cos(x) ≥ x²/3 (|x| ≤ 1), we obtain

    1 − Re f(t) ≥ (t²/3) ∫_{δ/|t| ≤ |x| ≤ 1/|t|} x² dF(x)    (11)

for all t ≠ 0 and every δ ∈ (0, 1). From here

    1 − Re f(t) ≥ (t²/3)(δ/|t|)² ( F(1/|t|) − F(δ/|t|) + F(−δ/|t|) − F(−1/|t|) )
                = (δ²/3) ( P{|X| > δ/|t|} − P{|X| > 1/|t|} ).

Using the estimates (4), we get for 0 < |t| ≤ δ/c

    1 − Re f(t) ≥ (δ²/3)(a δ^{-q} − b) |t|^q.

Choosing δ ∈ (0, 1) under the condition a δ^{-q} − b > 0 and putting

    α = (δ²/3)(a δ^{-q} − b),  γ = δ/c,

we get the lower estimate in (7).

Implication (7) ⇒ (4): First we prove the upper estimate. The well known inequality [35]

    P{|X| ≥ 1/t} ≤ (7/t) ∫_0^t (1 − Re f(u)) du

together with the upper estimate in (7) gives

    P{|X| ≥ 1/t} ≤ 7β t^q/(1 + q)

for 0 < t ≤ γ. Putting x = 1/t, c = 1/γ and b = 7β/(1 + q), we obtain the right-hand side inequality in (4).
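The correspondence between the tail exponent in (4) and the behaviour of 1 − Re f(t) in (7) can be watched numerically. The sketch below (ours, not from the original) takes a symmetric r.v. with P{|X| > x} = x^{-q} for x ≥ 1 and evaluates 1 − Re f(t) = ∫_1^∞ (1 − cos(tx)) q x^{-q-1} dx by crude quadrature; the grid parameters are arbitrary choices of ours.

```python
import math

# Crude numerical illustration (ours) of Lemma 2 for a symmetric r.v.
# with P{|X| > x} = x^{-q}, x >= 1: the tail exponent reappears in
#     1 - Re f(t) = int_1^inf (1 - cos(t x)) q x^{-q-1} dx ~ const * |t|^q.
q = 1.5

def one_minus_re_f(t, xmax=1.0e6, steps=100000):
    total, ratio, x = 0.0, xmax ** (1.0 / steps), 1.0
    for _ in range(steps):           # midpoint rule on a logarithmic grid
        x2 = x * ratio
        mid = 0.5 * (x + x2)
        total += (1.0 - math.cos(t * mid)) * q * mid ** (-q - 1.0) * (x2 - x)
        x = x2
    return total

vals = [one_minus_re_f(t) / t ** q for t in (0.1, 0.05, 0.02)]
# the normalised values stay between two positive constants alpha, beta
assert all(1.0 < v < 4.0 for v in vals)
assert max(vals) / min(vals) < 2.0
```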
Let's turn to the lower estimate. Putting δ = 0 in (11) and taking into account (7), we get for 0 < τ ≤ γ

    ∫_{|x| ≤ 1/τ} x² dF(x) ≤ 3β τ^{q-2}.

Let t = δτ and 0 < δ < 1. From (7), (9) and the last inequality

    α t^q ≤ 1 − Re f(t) ≤ (t²/2) ∫_{|x| ≤ δ/t} x² dF(x) + 2 P{|X| > δ/t} ≤ (3β/2) δ^{2-q} t^q + 2 P{|X| > δ/t},

where 0 < t ≤ γδ. So,

    P{|X| > δ/t} ≥ (α/2 − (3β/4) δ^{2-q}) t^q

for such t. Let's put

    δ = (α/(3β))^{1/(2-q)},

so that the right-hand side is at least (α/4) t^q. Since 0 < q < 2 and 0 < α ≤ β, then 0 < δ < 1. Let x = δ/t. The last relations imply that if x ≥ c = 1/γ, then

    P{|X| > x} ≥ (α δ^q/4) x^{-q},

i.e. the lower estimate in (4) holds. Similar results were proved in [4] and [5].

Now we consider estimates for the imaginary part of the characteristic function. We study conditions under which there are positive constants μ and ν such that

    |Im f(t)| ≤ μ |t|^q  if  |t| ≤ ν.    (12)

Lemma 3. Let f(t) be the characteristic function of the r.v. X and 0 < q < 2. Suppose the upper estimate in (4) holds. Then the following hold:
1) if 0 < q < 1, then (12) is true;
2) if q = 1, then (12) holds if and only if sup_{x>0} |E X I{|X| ≤ x}| < ∞;
3) if 1 < q < 2, then (12) holds if and only if EX = 0.
Proof: We have

    Im f(t) = ∫_{-∞}^{∞} sin(tx) dF(x),    (13)

where F(x) is the corresponding distribution function. Let's put

    J_1(t) = ∫_{-1/|t|}^{1/|t|} (sin(tx) − tx) dF(x),  J_2(t) = t ∫_{-1/|t|}^{1/|t|} x dF(x),
    J_3(t) = ∫_{|x| ≥ 1/|t|} sin(tx) dF(x),

where t ≠ 0. Then Im f(t) = J_1(t) + J_2(t) + J_3(t). It is well known that |sin(x) − x| ≤ |x|³/6 if |x| ≤ 1. Using Proposition 1 with r = 3, we get

    |J_1(t)| ≤ (max{b, c^q}/(6 − 2q)) |t|^q.

As |sin(x)| ≤ 1, the upper estimate in (4) implies the inequality |J_3(t)| ≤ b|t|^q for |t| ≤ 1/c. It follows from the last relations that (12) holds if and only if

    J_2(t) = O(|t|^q)  (t → 0).    (14)

Let 0 < q < 1. Applying Proposition 1 for r = 1 > q, we get (14). From the definition of J_2(t), for q = 1 the condition (14) is equivalent to

    sup_{t ≠ 0} | ∫_{-1/|t|}^{1/|t|} x dF(x) | = sup_{x > 0} |E X I{|X| ≤ x}| < ∞.

Now we consider the case 1 < q < 2. Suppose (14) holds. Then J_2(t) = o(|t|^r) as t → 0 for all r ∈ (1, q). Therefore

    EX = lim_{t → 0} ∫_{-1/|t|}^{1/|t|} x dF(x) = lim_{t → 0} t^{-1} J_2(t) = 0.

Let now EX = 0. Then for t ≠ 0

    ∫_{-1/|t|}^{1/|t|} x dF(x) = − ∫_{|x| > 1/|t|} x dF(x).

Integration by parts and the upper estimate in (4) give us

    | ∫_{|x| > 1/|t|} x dF(x) | ≤ |t|^{-1} P{|X| > 1/|t|} + ∫_{1/|t|}^{∞} P{|X| > x} dx.

Applying the upper estimate in (4) once more, we conclude that the right-hand side is O(|t|^{q-1}) as t → 0, which implies (14). □
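The necessity of centering in part 3) can be seen numerically. The sketch below (ours, not from the original) takes a one-sided Pareto law with 1 < q < 2, so that EX = q/(q−1) ≠ 0, and observes that Im f(t)/t^q blows up like t^{1-q} as t decreases, so no bound of the form (12) can hold; the quadrature parameters are arbitrary choices of ours.

```python
import math

# Illustration (ours) of part 3) of Lemma 3: for 1 < q < 2 the bound
# |Im f(t)| <= mu |t|^q forces EX = 0.  For a one-sided Pareto r.v.
# (density q x^{-q-1} on [1, oo), EX = q/(q-1) != 0) one has
# Im f(t) ~ t EX, hence Im f(t)/t^q ~ EX * t^{1-q} -> oo as t -> 0.
q = 1.5

def im_f(t, xmax=1.0e6, steps=100000):
    total, ratio, x = 0.0, xmax ** (1.0 / steps), 1.0
    for _ in range(steps):           # midpoint rule on a logarithmic grid
        x2 = x * ratio
        mid = 0.5 * (x + x2)
        total += math.sin(t * mid) * q * mid ** (-q - 1.0) * (x2 - x)
        x = x2
    return total

r = [im_f(t) / t ** q for t in (0.1, 0.01)]
assert r[0] > 0.0
assert r[1] > 2.0 * r[0]   # grows roughly like t^{1-q} = t^{-1/2}
```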
4. Estimates for the distributions of the sums. The result of this subsection will be used to prove Theorems 2 and 3, but it is also of independent interest.

Lemma 4. Let {X_k}_{k=1}^∞ be i.i.d.r.v.s and 0 < q < 2. Suppose (4) holds, EX_1 = 0 if q > 1 and

    sup_{x > 0} |E X_1 I{|X_1| ≤ x}| < ∞

if q = 1. Then there are positive constants u, v and w such that for every a_k ∈ R, n ∈ N and x ≥ w

    u x^{-q} ≤ P { ( Σ_{k=1}^n |a_k|^q )^{-1/q} | Σ_{k=1}^n a_k X_k | > x } ≤ v x^{-q}.    (15)
In addition, the upper estimate in (4) together with the other conditions of the lemma implies the upper estimate in (15).

We break the proof into several steps. Without loss of generality we may assume

    Σ_{k=1}^n |a_k|^q = 1.    (16)

Let f(t) be the characteristic function corresponding to the r.v. X_1. Then the sum Σ_{k=1}^n a_k X_k has the characteristic function

    g(t) = Π_{k=1}^n f(a_k t).    (17)

We show that (7) holds for g(t) with constants independent of a_k and n, which together with Lemma 2 implies (15).

Proposition 2. Let α_k, β_k ∈ R and |α_k| ≤ 1 (1 ≤ k ≤ n). Then

    | 1 − Π_{k=1}^n (α_k + iβ_k) | ≤ | 1 − Π_{k=1}^n α_k | + exp( Σ_{k=1}^n |β_k| ) − 1.

Proof: We have

    | 1 − Π_{k=1}^n (α_k + iβ_k) | ≤ | 1 − Π_{k=1}^n α_k | + | Π_{k=1}^n (α_k + iβ_k) − Π_{k=1}^n α_k |.

Expanding the parentheses, we get

    Π_{k=1}^n (α_k + iβ_k) − Π_{k=1}^n α_k = iβ_n Π_{k=1}^{n-1} (α_k + iβ_k) + iα_n β_{n-1} Π_{k=1}^{n-2} (α_k + iβ_k) + … + iα_n α_{n-1} ⋯ α_2 β_1.

Since |α_k| ≤ 1, it follows that

    | Π_{k=1}^n (α_k + iβ_k) − Π_{k=1}^n α_k | ≤ Σ_{k=1}^n |β_{n-k+1}| Π_{j=1}^{n-k} (1 + |β_j|) = Π_{k=1}^n (1 + |β_k|) − 1.

As 1 + |x| ≤ exp(|x|), then

    Π_{k=1}^n (1 + |β_k|) ≤ exp( Σ_{k=1}^n |β_k| ).

This implies the needed inequality. □
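Proposition 2 is a purely algebraic inequality and is easy to check by random trials; the sketch below (ours, not from the original) does exactly that, with arbitrarily chosen ranges for the α_k and β_k.

```python
import math
import random

# Randomised check (ours) of the inequality of Proposition 2: for real
# alpha_k, beta_k with |alpha_k| <= 1,
#   |1 - prod_k (alpha_k + i*beta_k)|
#       <= |1 - prod_k alpha_k| + exp(sum_k |beta_k|) - 1.
random.seed(1)
for _ in range(1000):
    n = random.randint(1, 10)
    alphas = [random.uniform(-1.0, 1.0) for _ in range(n)]
    betas = [random.uniform(-0.5, 0.5) for _ in range(n)]
    prod_c, prod_a = complex(1.0, 0.0), 1.0
    for al, be in zip(alphas, betas):
        prod_c *= complex(al, be)
        prod_a *= al
    lhs = abs(1.0 - prod_c)
    rhs = abs(1.0 - prod_a) + math.exp(sum(abs(be) for be in betas)) - 1.0
    assert lhs <= rhs + 1e-12
```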
Proposition 3. Let (7) and (12) be fulfilled. Then there are positive constants α_1, β_1 and γ_1 such that if |t| ≤ γ_1, then

    exp(−β_1 |t|^q) ≤ Re f(t) ≤ |f(t)| ≤ exp(−α_1 |t|^q).    (18)

In addition, the upper estimate in (7) implies the lower estimate in (18).

Proof: It follows from (7) and the well known inequality 1 − x ≤ exp(−x) ≤ 1 − x/2, where 0 ≤ x ≤ 1, that there is δ > 0 such that

    exp(−2β |t|^q) ≤ Re f(t) ≤ exp(−α |t|^q)

for |t| ≤ δ, where α and β are the constants from (7). This together with (12) gives us the estimate

    |f(t)| ≤ ( exp(−2α |t|^q) + μ² |t|^{2q} )^{1/2},

where |t| ≤ min{δ, ν}. If h > 0, then

    exp(−x) + h x² ≤ 1 − x/2 + h x² ≤ exp(−x/4)

for small enough positive x. Therefore there exist positive constants α_1 and δ_1 such that exp(−2α |t|^q) + μ² |t|^{2q} ≤ exp(−2α_1 |t|^q) for |t| ≤ δ_1. This and the above imply the needed inequalities. One can see that the lower estimate in (18) follows from the upper estimate in (7). □
Proof of Lemma 4: According to Lemmas 2 and 3, the conditions of Lemma 4 imply the estimates (7) and (12). Suppose that g(t) is defined by the formula (17) and (16) holds. Then |a_k| ≤ 1 and we obtain, using (18),

    1 − Re g(t) ≥ 1 − |g(t)| = 1 − Π_{k=1}^n |f(a_k t)| ≥ 1 − exp( −α_1 Σ_{k=1}^n |a_k t|^q ) = 1 − exp(−α_1 |t|^q),    (19)

where |t| ≤ γ_1.

Denote c(t) = Re f(t) and d(t) = Im f(t). According to (17) and Proposition 2,

    1 − Re g(t) ≤ |1 − g(t)| = | 1 − Π_{k=1}^n (c(a_k t) + i d(a_k t)) | ≤ | 1 − Π_{k=1}^n c(a_k t) | + exp( Σ_{k=1}^n |d(a_k t)| ) − 1.

Since |c(t)| ≤ 1, then from (18) and (16)

    0 ≤ 1 − Π_{k=1}^n c(a_k t) ≤ 1 − Π_{k=1}^n exp(−β_1 |a_k t|^q) = 1 − exp(−β_1 |t|^q).

Applying (12), we get for |t| ≤ ν

    Σ_{k=1}^n |d(a_k t)| ≤ μ Σ_{k=1}^n |a_k t|^q = μ |t|^q.

The last estimates and (19) yield that there exist positive constants α_2, β_2 and γ_2, independent of n and a_k, such that

    α_2 |t|^q ≤ 1 − Re g(t) ≤ β_2 |t|^q    (20)

if |t| ≤ γ_2. These estimates and Lemma 2 give us (15). According to Lemma 2 and Proposition 3, the upper estimate in (4) implies the same estimate in (7) and the lower bound in (18). Reasoning as above, we get the right-hand side inequality in (20). Using Lemma 2 once more, we obtain the same estimate in (15). □
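The uniformity over the weights asserted in (15) can be illustrated by simulation. The sketch below (ours, not from the original) draws symmetric Pareto(q) variables, forms two differently weighted normalised sums, and checks that both tails at a fixed level x_0 are comparable with x_0^{-q}; all constants and sample sizes are arbitrary choices of ours.

```python
import random

# Monte-Carlo illustration (ours) of the two-sided estimate (15): for
# symmetric i.i.d. X_k with P{|X_k| > x} = x^{-q} (x >= 1) and arbitrary
# weights a_k, the normalised sum (sum |a_k|^q)^{-1/q} * sum a_k X_k has
# a tail of order x^{-q}, with constants independent of the weights.
random.seed(2)
q, trials, x0 = 1.5, 40000, 10.0
for a in ([1.0] * 5, [1.0, -3.0, 0.5, 2.0, -1.5]):
    norm = sum(abs(ak) ** q for ak in a) ** (1.0 / q)
    hits = 0
    for _ in range(trials):
        s = 0.0
        for ak in a:
            mag = (1.0 - random.random()) ** (-1.0 / q)   # Pareto(q) magnitude
            s += ak * mag * random.choice([-1.0, 1.0])    # symmetrise
        hits += abs(s) / norm > x0
    p = hits / trials
    assert 0.1 * x0 ** -q < p < 10.0 * x0 ** -q
```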
5. Convergence of the norms.

Lemma 5. Let the condition (3) be fulfilled, 1 < q < 2, and let the r.v.s Y_n be weakly convergent to the r.v. Y. Suppose there exists a constant b > 0 such that for all x > 0 and n ∈ N

    P{|Y_n| > x} ≤ b x^{-q}.    (21)

Then Y, Y_n ∈ E and ||Y_n||_E → ||Y||_E.

Proof: The assumptions of the lemma imply that the r.v. Y satisfies (21) with the same constant. Since Z_q ∈ E, Proposition 1.2 gives that Y_n and Y are contained in the space E. Let a > 0. The estimate (21) and Proposition 1.2 give us

    || Y_n I{|Y_n| > a} ||_E ≤ B || Z_q I{|Z_q| > a} ||_E,

where B is independent of a and n. From here and (3)

    lim_{a → ∞} sup_n || Y_n I{|Y_n| > a} ||_E = 0.

So, for fixed ε > 0 there exists a > 0 such that || Y_n I{|Y_n| > a} ||_E ≤ ε for all n ∈ N, and we may conclude that for all integers n

    ||Y_n||_E − ε ≤ || Y_n I{|Y_n| ≤ a} ||_E ≤ ||Y_n||_E + ε.

Since the r.v. Y satisfies (21), the same inequality is true for Y.

The condition (3) implies E ≠ L_∞(Ω). Hence φ_E(t) → 0 as t → 0 (see [48]). Using this, it is not difficult to verify that

    lim_{n → ∞} || Y_n I{|Y_n| ≤ a} ||_E = || Y I{|Y| ≤ a} ||_E.

From here and the previous

    ||Y||_E − ε ≤ liminf_{n → ∞} ||Y_n||_E ≤ limsup_{n → ∞} ||Y_n||_E ≤ ||Y||_E + ε.

Letting ε → 0, we obtain the needed relation. □
6. Some more inequalities for characteristic functions. Here we show that (1) implies (7).

Lemma 6. Let {X_k}_{k=1}^∞ be i.i.d.r.v.s, f(t) be the corresponding characteristic function and let 0 < p < q < 2 be fixed. Suppose

    C_{p,q} = sup_n E | n^{-1/q} Σ_{k=1}^n X_k |^p < ∞.    (22)

Then (12) and the upper estimate in (7) are true.

First we prove some auxiliary results. Since f(t) is continuous and f(0) = 1, log f(t) exists near zero. We consider the branch of the logarithm for which log(1) = 0 and put

    φ(t) = − log f(t) / |t|^q.

Therefore, there is δ > 0 such that

    f(t) = exp( −|t|^q φ(t) )    (23)

for |t| ≤ δ, t ≠ 0. To prove Lemma 6 we have to bound φ(t) near zero. Let's denote

    S_n = n^{-1/q} Σ_{k=1}^n X_k.    (24)
This sum has the characteristic function

    f_n(t) = ( f(t n^{-1/q}) )^n.    (25)

The condition (22) yields that the sequence {S_n}_{n=1}^∞ is weakly compact (see [35]). Let's denote the collection of all characteristic functions related to the limit distributions of {S_n}_{n=1}^∞ by Λ. Put

    Ψ = Λ ∪ {f_n}_{n=1}^∞.

By Y_h we denote a r.v. with the characteristic function h(t). If h ∈ Ψ, then (22) and the well-known properties of absolute moments (see [35]) imply that E|Y_h|^p ≤ C_{p,q}. Therefore the collection {Y_h : h ∈ Ψ} is also weakly compact.

Proposition 4. Let (22) be fulfilled. Then there is ε > 0 for which

    ν(ε) = inf { Re h(t) : |t| ≤ ε, h ∈ Ψ } > 1/2.

Proof: Suppose the contrary holds. Then there are t_k ↘ 0 and h_k ∈ Ψ such that Re h_k(t_k) ≤ 1/2 (k ∈ N). By virtue of the compactness we may choose integers k(n) ↗ ∞ with the property h_{k(n)}(t) → g(t) for every t ∈ R, where g(t) is a characteristic function. From here (see [35])

    lim_{n → ∞} h_{k(n)}(t_{k(n)}) = g(0) = 1.

This relation contradicts the above. □

Choosing ε > 0 under the condition ν(ε) > 1/2, we have Re h(t) > 1/2 for all h ∈ Ψ and t ∈ (−ε, ε). So, for every h ∈ Ψ the function

    φ_h(t) = − log h(t) / |t|^q

is defined on (−ε, ε)\{0} and on this set the formula

    h(t) = exp( −|t|^q φ_h(t) )    (26)

takes place.

Proposition 5. Let h ∈ Ψ and f_{k(n)}(t) → h(t) for each t ∈ R. Then φ(t k(n)^{-1/q}) → φ_h(t) uniformly on each set {t : 0 < a ≤ |t| ≤ σ < ε}, where φ(t) is the function from (23).

Proof: It follows from (23) and (25) that

    f_n(t) = exp( −|t|^q φ(n^{-1/q} t) )    (27)
if 0 < |t| < ε. From here

    | φ(n^{-1/q} t) − φ(m^{-1/q} t) | = |log f_n(t) − log f_m(t)| / |t|^q ≤ |f_n(t) − f_m(t)| / ( |t|^q |d_{m,n}(t)| ),

where d_{m,n}(t) is a point contained in the linear segment joining the points f_m(t) and f_n(t). According to Proposition 4, Re f_n(t) > 1/2 if |t| ≤ ε (n ∈ N). Hence |d_{m,n}(t)| > 1/2. It is well known that the convergence of characteristic functions is uniform on each finite segment (see [35]). Therefore the last estimates yield the needed assertion. □

Proof of Lemma 6: According to (23), the function φ(t) is defined on (−ε, ε)\{0}. Proposition 5 and Arzelà–Ascoli's theorem (see [16]) give us

    C = sup { |φ(t n^{-1/q})| : ε 2^{-1/q} ≤ |t| ≤ ε, n ∈ N } < ∞.

We show that |φ(t)| ≤ C for every t ∈ (−ε, ε)\{0}. It is not difficult to verify that for such t there is an integer n with the property ε 2^{-1/q} ≤ |t n^{1/q}| ≤ ε. From this and the above

    |φ(t)| = | φ( n^{-1/q} (t n^{1/q}) ) | ≤ C.

This bound and (23) give us 1 − f(t) = O(|t|^q) as t → 0, which implies Lemma 6. □
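The stabilisation of φ along the functions f_n from (25) can be watched by simulation. The sketch below (ours, not from the original) uses symmetric Pareto(q) summands and estimates f_n(t) empirically for two values of n; the ratio −log f_n(t)/|t|^q should settle near a constant, the exponent of the stable limit. Sample sizes are arbitrary choices of ours.

```python
import math
import random

# Numerical illustration (ours) of (25) and Proposition 5: for symmetric
# Pareto(q) summands, f_n(t) = f(t n^{-1/q})^n stabilises as n grows, so
# phi_n(t) = -log f_n(t) / |t|^q settles near a constant.
random.seed(3)
q, t, trials = 1.5, 1.0, 10000

def phi_n(n):
    acc = 0.0
    for _ in range(trials):
        s = sum(random.choice([-1.0, 1.0]) * (1.0 - random.random()) ** (-1.0 / q)
                for _ in range(n)) * n ** (-1.0 / q)
        acc += math.cos(t * s)
    re_fn = acc / trials          # f_n(t) is real by symmetry
    return -math.log(re_fn) / abs(t) ** q

p = [phi_n(20), phi_n(40)]
assert min(p) > 0.0
assert abs(p[0] - p[1]) < 0.3 * max(p)
```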
Lemma 7. Let one of the following conditions be fulfilled:
(a) 1 < q < 2 and (3) holds;
(b) 0 < p < q < 2.
Let {X_k}_{k=1}^∞ ⊂ E be a sequence of i.i.d.r.v.s such that (1) holds and, if E = L_p(Ω) and p < q = 1,

    inf_n n^{-1} || Σ_{k=1}^n (X_{2k} − X_{2k-1}) ||_E > 0.    (28)

Then for the characteristic function f(t) corresponding to X_1 the estimates (7) and (12) are true.

It follows from (1) and Proposition 1.1 that in the case (a) the condition (22) holds, where 1 = p < q < 2. This condition follows from (1) in the case (b). According to Lemma 6, the upper estimates in (7) and (12) are true. Hence we have only to prove the lower estimate in (7).
Proposition 6. Suppose h_n ∈ Ψ and Y_{h_n} are weakly convergent to Y. Then Y ∈ E and

    lim_{n → ∞} || Y_{h_n} ||_E = ||Y||_E.

Proof: The upper estimates in (7) and (12) give us the conditions of Lemma 4 (see Lemmas 2 and 3). So, for the sum (24) the inequality (21) holds with a constant independent of n. According to the definition of the collection Ψ, the same estimate is true for every r.v. Y_h, h ∈ Ψ. Suppose (a) holds. Then the needed assertion follows directly from Lemma 5. Let (b) be fulfilled. According to (21),

    sup_{h ∈ Ψ} E |Y_h|^r < ∞    (29)

for every r ∈ (p, q). From here and the theorem on the convergence of moments (see [35]) the needed relation follows. □
Proposition 7. Let h ∈ Ψ and {Y_k}_{k=1}^∞ be independent r.v.s equidistributed with Y_h. Then these r.v.s satisfy the conditions of Lemma 7.

Proof: For each h ∈ Ψ there are integers n(j) ↗ ∞ such that S_{n(j)} weakly converges to Y_h, where S_n is determined by the formula (24). Let's fix an integer m and consider

    U_m = m^{-1/q} Σ_{k=1}^m Y_k.    (30)

We have

    S_{m n(j)} = m^{-1/q} Σ_{k=1}^m [ n(j)^{-1/q} Σ_{i=1}^{n(j)} X_{(k-1)n(j)+i} ].

Hence S_{m n(j)} → U_m weakly as j → ∞, and according to Proposition 6 ||S_{m n(j)}||_E → ||U_m||_E. From (1)

    C_1 ≤ || S_{m n(j)} ||_E ≤ C_2

for all j. So, the same estimate holds for U_m, which is equivalent to (1) for {Y_k}.

In the case p < q = 1 it is similarly proved that (28) holds for the sequence in question. □

As usual we call a r.v. X and the corresponding characteristic function degenerate if P{X = a} = 1 for some a ∈ R.
Proposition 8. Suppose that for some 0 < μ ≤ ν and for the functions f_n determined by (25)

    sup { |f_n(t)| : μ ≤ t ≤ ν, n ∈ N } = 1.

Then there exist a non-degenerate h ∈ Ψ and t_0 ∈ [μ, ν] such that |h(t_0)| = 1.

Proof: There are t_k ∈ [μ, ν] and integers n(k) ↗ ∞ such that |f_{n(k)}(t_k)| → 1. As mentioned above, (29) yields the weak compactness of {Y_h : h ∈ Ψ}. Passing to a subsequence, we may assume that t_k → t_0 ∈ [μ, ν] and f_{n(k)}(t) → h(t) for all t ∈ R and some h ∈ Ψ. From here (see [35])

    |h(t_0)| = lim_{k → ∞} |f_{n(k)}(t_k)| = 1.

Suppose Y_h is degenerate and consider independent r.v.s {Y_k}_{k=1}^∞ equidistributed with Y_h. Then each Y_k equals a constant almost surely. But it is not difficult to verify that then (1) does not hold if q ≠ 1, and (28) is not true if q = 1. This contradicts Proposition 7, so Y_h is non-degenerate. □

Let's denote

    J_m = [ (2m)^{-1/q}, m^{-1/q} ].    (31)

Proposition 9. There is an integer m such that

    sup { |f_n(t)| : t ∈ J_m, n ∈ N } = γ < 1.

Proof: Suppose the contrary holds. Then, according to Proposition 8, for each m there are a non-degenerate h_m ∈ Ψ and t_m ∈ J_m such that |h_m(t_m)| = 1. Hence h_m corresponds to a lattice distribution. If a_m is the maximal step of this distribution, then (see [43], Ch. 1) t_m ≥ 2π/a_m, hence a_m ≥ 2π/t_m ≥ 2π m^{1/q}. As the collection {Y_h : h ∈ Ψ} is weakly compact, there are integers m(k) ↗ ∞ for which h_{m(k)}(t) → h(t) ∈ Ψ for every real t. It is not difficult to prove that a_m → ∞ implies |h(t)| ≡ 1. So the r.v. Y_h is degenerate. As in the proof of Proposition 8, a contradiction follows. □
The next assertion is easily verified.

Proposition 10. For every 0 < μ < ν and integer j there is ε > 0 such that

    (0, ε) ⊂ ∪_{n ≥ j} [ μ n^{-1/q}, ν n^{-1/q} ].
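Proposition 10 in the concrete case needed below reduces to an overlap condition, which the following check (ours, not from the original) verifies: consecutive segments [μ n^{-1/q}, ν n^{-1/q}] overlap as soon as ν/μ ≥ ((n+1)/n)^{1/q}, and for the segments J_m one has ν/μ = 2^{1/q}, so this holds for every n ≥ 1.

```python
# Verification (ours) of the overlap behind Proposition 10 in the case
# of the segments J_m, where nu/mu = 2^{1/q}: segment n+1 reaches up to
# nu*(n+1)^{-1/q}, which must be >= the lower end mu*n^{-1/q} of
# segment n, i.e. nu/mu >= ((n+1)/n)^{1/q}; this holds for all n >= 1.
q = 1.5
mu, nu = 1.0, 2.0 ** (1.0 / q)
for n in range(1, 10000):
    assert nu * (n + 1) ** (-1.0 / q) >= mu * n ** (-1.0 / q)
```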
Proof of Lemma 7: Let's denote u(t) = Re f(t). According to (25) and Proposition 9, for some integer m and every t ∈ J_m

    | u(t n^{-1/q}) |^n ≤ | f(t n^{-1/q}) |^n = |f_n(t)| ≤ γ < 1.

Since u(0) = 1, there is δ > 0 such that u(x) > 0 if |x| ≤ δ. In addition, for some integer j the inequality t n^{-1/q} ≤ δ holds if n ≥ j and t ∈ J_m, which yields

    n log( u(t n^{-1/q}) ) ≤ log(γ) < 0

for such n and t. From here and (31)

    − (t n^{-1/q})^{-q} log( u(t n^{-1/q}) ) ≥ − t^{-q} log(γ) ≥ − m log(γ) = a > 0.

Therefore u(t n^{-1/q}) ≤ exp(−a t^q n^{-1}). Proposition 10 implies that the union of the segments { t n^{-1/q} : t ∈ J_m }, n ≥ j, contains some interval (0, ε). If t ∈ (0, ε), then t n^{1/q} ∈ J_m for some n ≥ j. Taking into account the previous, we get

    u(t) = u( n^{-1/q} (t n^{1/q}) ) ≤ exp( −a (t n^{1/q})^q n^{-1} ) = exp(−a t^q).

It follows from (9) that u(t) is even. Hence the last inequality yields the lower estimate in (7), since 1 − u(t) ≥ 1 − exp(−a |t|^q) ≥ (a/2) |t|^q for small |t|. □
7. Proof of Theorems 2 and 3. Let the conditions of Theorem 2 be fulfilled. According to Lemma 7, the estimates (7) and (12) are true. From Lemmas 2 and 3, EX_1 = 0 and (4) holds. If the right-hand side of (1) holds, then Proposition 1.1 implies (22), where p = 1 < q < 2. Using Lemmas 2 and 3 we get the second part of Theorem 2. Theorem 3 follows immediately from Lemma 4.
8. Proof of Theorem 4. Let {Y_k}_{k=1}^∞ be symmetric i.i.d.r.v.s with the q-stable distribution. It follows from (1.18) that Y_1 satisfies the estimate (4) for some positive constants a, b and c. We choose numbers α_j, β_j and x_j from (c, ∞) under the conditions

    x_j > 4 j^{1-1/q} α_j,    (32)

    β_j = (2bj)^{1/q} x_j,    (33)

    lim_{j → ∞} β_j / α_{j+1} = 0.    (34)

Let's put

    X_k = Σ_{j=1}^∞ Y_k I{α_j ≤ |Y_k| < β_j}.    (35)

It is clear that {X_k}_{k=1}^∞ are symmetric i.i.d.r.v.s contained in the space L_{q,∞}(Ω). From (34) and (35), P{|X_k| > β_j} = P{|X_k| ≥ α_{j+1}}. The formula (35) gives us the estimate P{|X_k| > x} ≤ P{|Y_k| > x}, which holds for every x > 0. From here and (4)

    P{|X_k| > β_j}