Research Article | Open Access

Mingzhou Xu, Kun Cheng, "Equivalent Conditions of Complete th Moment Convergence for Weighted Sums of I. I. D. Random Variables under Sublinear Expectations", *Discrete Dynamics in Nature and Society*, vol. 2021, Article ID 7471550, 12 pages, 2021. https://doi.org/10.1155/2021/7471550

# Equivalent Conditions of Complete th Moment Convergence for Weighted Sums of I. I. D. Random Variables under Sublinear Expectations

**Academic Editor:** Ya Jia

#### Abstract

We investigate complete th moment convergence for weighted sums of independent, identically distributed random variables under a sublinear expectation space. Using a moment inequality and truncation methods, we prove equivalent conditions for complete th moment convergence of weighted sums of independent, identically distributed random variables under a sublinear expectation space, which complement the corresponding results obtained in Guo and Shan (2020).

#### 1. Introduction

Peng [1, 2] introduced the concept of a sublinear expectation space to study uncertainty in probabilities and distributions. The seminal work of Peng [1, 2] has encouraged many researchers to study limit theorems under sublinear expectations. Zhang [3–5] proved results including exponential inequalities, Rosenthal’s inequalities, and Donsker’s invariance principle under sublinear expectations. Wu [6] obtained precise asymptotics for complete integral convergence. Xu and Cheng [7] studied precise asymptotics in the law of the iterated logarithm under sublinear expectations. The interested reader may refer to Xu and Zhang [8, 9], Chen [10], Gao and Xu [11], Fang et al. [12], Hu et al. [13], Hu and Yang [14], Huang and Wu [15], Kuczmaszewska [16], Ma and Wu [17], Wang and Wu [18], Wu and Jiang [19], Yu and Wu [20], Zhang [21], Zhong and Wu [22], and the references therein for more limit theorems under sublinear expectations.

Recently, Guo and Shan [23] studied equivalent conditions of complete th moment convergence for weighted sums of sequences of negatively orthant dependent random variables. Xu and Cheng [24] obtained equivalent conditions of complete convergence for weighted sums of sequences of i.i.d. random variables under sublinear expectations. Motivated by the work of Guo and Shan [23] and Xu and Cheng [24], here we prove equivalent conditions of complete th moment convergence for weighted sums of independent, identically distributed random variables under a sublinear expectation space; these complement the corresponding results obtained in Guo and Shan [23] and also extend the results in Xu and Cheng [24] from complete convergence to complete th moment convergence.

The rest of this paper is organized as follows. In Section 2, we give the necessary basic notions, concepts, and relevant properties and present the necessary lemmas under sublinear expectations. In Section 3, we state our main results, Theorems 1–4, whose proofs are presented in Section 4.

#### 2. Preliminaries

As in Xu and Cheng [24], we adopt notations similar to those in the work of Peng [2] and Chen [10]. Suppose that is a given measurable space. We assume that is a subset of all random variables on such that (cf. [10]), where , and implies , for each , where denotes the linear space of (local Lipschitz) functions satisfying for some and depending on .
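For the reader's convenience, we recall the standard local Lipschitz growth condition from Peng's framework [2]; the constants $C$ and $m$ below play the role of the constants "depending on" the test function mentioned above (this is the standard form, which the condition here is understood to take):

```latex
|\varphi(x) - \varphi(y)| \;\le\; C\,\bigl(1 + |x|^m + |y|^m\bigr)\,|x - y|,
\qquad \forall\, x, y,
```

for some constants $C > 0$ and $m \in \mathbb{N}$ depending on $\varphi$.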

*Definition 1. *A sublinear expectation on is a functional satisfying the following properties: for all , we have

(a) Monotonicity: if , then 
(b) Constant preserving: , 
(c) Positive homogeneity: , 
(d) Subadditivity: whenever is not of the form or 
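Writing $\hat{\mathbb{E}}$ for the sublinear expectation and $X, Y$ for elements of its domain, the four properties above take the following standard form (cf. Peng [2]):

```latex
\begin{aligned}
&\text{(a) Monotonicity: } X \le Y \ \Rightarrow\ \hat{\mathbb{E}}[X] \le \hat{\mathbb{E}}[Y];\\
&\text{(b) Constant preserving: } \hat{\mathbb{E}}[c] = c \quad \text{for all } c \in \mathbb{R};\\
&\text{(c) Positive homogeneity: } \hat{\mathbb{E}}[\lambda X] = \lambda\,\hat{\mathbb{E}}[X] \quad \text{for all } \lambda \ge 0;\\
&\text{(d) Subadditivity: } \hat{\mathbb{E}}[X + Y] \le \hat{\mathbb{E}}[X] + \hat{\mathbb{E}}[Y],\\
&\qquad \text{whenever the right-hand side is not of the form } +\infty - \infty \text{ or } -\infty + \infty.
\end{aligned}
```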

A set function is said to be a capacity if it obeys

(a) and .
(b) , and .

Moreover, if is continuous, then should satisfy

(c) if .
(d) if .

A capacity is called subadditive if , .
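In standard notation, with $V$ a set function on the subsets of $\Omega$, the capacity axioms and the subadditivity property above read as follows (this is the customary formulation, cf. Zhang [4]):

```latex
\begin{aligned}
&\text{(a) } V(\varnothing) = 0, \quad V(\Omega) = 1;\\
&\text{(b) } V(A) \le V(B) \quad \text{whenever } A \subset B;\\
&\text{(c) } V(A_n) \uparrow V(A) \ \text{ if } A_n \uparrow A \quad (\text{continuity from below});\\
&\text{(d) } V(A_n) \downarrow V(A) \ \text{ if } A_n \downarrow A \quad (\text{continuity from above});\\
&\text{subadditivity: } V(A \cup B) \le V(A) + V(B).
\end{aligned}
```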

In this paper, given a sublinear expectation space , set , (see (8) and the definitions of above (8) in [4]). Clearly, is a subadditive capacity. Denote the Choquet expectations by
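Recall that, for a capacity $V$, the Choquet expectation of a random variable $X$ is customarily defined by the following integral (this is the standard definition in this setting):

```latex
C_V(X) \;:=\; \int_0^{\infty} V(X > x)\,\mathrm{d}x
\;+\; \int_{-\infty}^{0} \bigl( V(X > x) - 1 \bigr)\,\mathrm{d}x .
```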

Suppose that , , and , , are two random vectors on . is said to be independent of if, for each Borel-measurable function on with , for each , we have whenever for each and (see Definition 2.5 in [10]). is called a sequence of independent random variables if is independent of for each .

Assume that and are two -dimensional random vectors defined, respectively, in sublinear expectation spaces and . They are said to be identically distributed if, for every Borel-measurable function such that , whenever the sublinear expectations are finite. is said to be identically distributed if, for each , and are identically distributed.

In the sequel, we suppose that is countably subadditive, i.e., , whenever , , and , , . Let denote a positive constant which may differ from line to line. or represents the indicator function of , means that there exists a constant such that for sufficiently large , and means that and . We use for .
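In symbols, the countable subadditivity assumed on the capacity $\mathbb{V}$ here is the standard one:

```latex
\mathbb{V}\!\left( \bigcup_{n=1}^{\infty} A_n \right)
\;\le\; \sum_{n=1}^{\infty} \mathbb{V}(A_n),
\qquad A_n \subset \Omega, \ n \ge 1 .
```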

We first present several necessary lemmas to prove our main results. By using Corollary 2.2, Theorem 2.3 in [5], the proofs of Theorem 3 in [25], and Minkowski’s inequality under sublinear expectations, we see that the following lemma holds.

Lemma 1 (cf. Lemma 2.3 in [24]). *Suppose that is a sequence of independent random variables under sublinear expectation space with , , , and . Then,where depends on only.*

*Proof. *For readers’ convenience, we give a complete proof here. First, for and , set and , and where depends on , determined as in (2.8) of Corollary 2.2 (b) in [5]. By Corollary 2.2 in [5], we know that, for all and , Obviously, for and ,

Set , and for , , where is the integer part of . Then implies that . Now, it is enough to prove that, for and ,

As in the proof of Theorem 4 of Móricz [25], let be given. Obviously, or . For and , we see that whence, for such ’s, Since, for , we have , it follows that, for ,

Thus, and by Minkowski’s inequality under sublinear expectations (see (4.10) in Proposition 4.2 of Chapter I of [2]),

Suppose now that (8) holds for . Then, by the choice of , we see that By these two inequalities and (7), we see that Finally, (6) implies

By (13)–(16), we conclude that which implies the result. Hence, by (6), the conclusion of (8) is true for . By induction, (8) holds for all . The proof is complete.
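For orientation, a Rosenthal-type maximal inequality of the kind provided by Theorem 2.3 in [5] can be stated, up to the precise value of the constant $C_p$, in the following standard form, for $p \ge 2$ and independent $X_1, \dots, X_n$ with partial sums $S_k = \sum_{i=1}^{k} X_i$ (a sketch of the standard statement, not the exact display above; note that the last term vanishes when $\hat{\mathbb{E}}[X_i] = \hat{\mathbb{E}}[-X_i] = 0$):

```latex
\hat{\mathbb{E}}\Bigl[ \max_{1 \le k \le n} |S_k|^p \Bigr]
\;\le\; C_p \Biggl\{ \sum_{i=1}^{n} \hat{\mathbb{E}}\bigl[ |X_i|^p \bigr]
\;+\; \Bigl( \sum_{i=1}^{n} \hat{\mathbb{E}}\bigl[ X_i^2 \bigr] \Bigr)^{p/2}
\;+\; \Bigl( \sum_{i=1}^{n} \bigl( |\hat{\mathbb{E}}[X_i]| \vee |\hat{\mathbb{E}}[-X_i]| \bigr) \Bigr)^{p} \Biggr\}.
```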

Lemma 2 (see Lemma 2.4 in [24, 26]). *Let be a sequence of independent random variables under sublinear expectation space . Then, for all and ,*

*Remark 1. *In the proofs of Lemma 2.4 in [24, 26], by independence of , for constant and (Definition 2.5 in [10]), we see that which implies that Lemma 2 is valid. The difference between sublinear expectations and linear expectations is revealed by the subtle observation that which is not necessarily equal to .

Lemma 3. *Let be a random variable under sublinear expectation space , , , and . Then, the following are equivalent:*(i)*(ii)*

*Proof. *The proof is finished.
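Equivalences between Choquet moment conditions and series of capacities, such as those in Lemmas 3 and 4, typically rest on the Choquet-integral definition together with a comparison of the integral with its sum over integer intervals; schematically, for a generic exponent $\alpha > 0$ (the exact exponents and weights in (i) and (ii) may differ from this sketch),

```latex
C_{\mathbb{V}}\bigl( |X|^{\alpha} \bigr)
= \int_0^{\infty} \mathbb{V}\bigl( |X|^{\alpha} > t \bigr)\,\mathrm{d}t
< \infty
\quad \Longleftrightarrow \quad
\sum_{n=1}^{\infty} \mathbb{V}\bigl( |X| > n^{1/\alpha} \bigr) < \infty .
```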

Lemma 4. *Let be a random variable under sublinear expectation space , , , and . Then, the following are equivalent:*(i)*(ii)*

*Proof. *This finishes the proof.

#### 3. Main Results

We state our main results, the proofs of which will be given in Section 4.

Theorem 1. *Let be a sequence of independent random variables, identically distributed as under sublinear expectation space . Assume that , , and , and suppose that for . Suppose that is a triangular array of real numbers. Then, the following are equivalent:*(i)*(ii)*

*Remark 2. *If is a classic probability space, then Theorem 1 recovers Theorem 7 in [23] in the case in which is a sequence of independent random variables, identically distributed as . As pointed out in Hossein and Nezakati [27], complete moment convergence is a more general notion than complete convergence, in both theory and practice, which is why complete th moment convergence is of interest under sublinear expectations; the interested reader may also refer to the first study of complete moment convergence, by Chow [28].

*Remark 3. *Under the same conditions as in Theorem 1, if (28) holds, then by the same proof as that of Remark 3.2 of Hossein and Nezakati [27], we see that, for all , Moreover, if is continuous, then by (29), we see that In (30), if , , we conclude that , which is similar to Theorem 1 of Zhang and Lin [29].

Theorem 2. *Let be a sequence of independent random variables, identically distributed as under sublinear expectation space . Assume that , , and , and suppose that for . Suppose that is a triangular array of real numbers. Then, (28) is equivalent to*

Theorem 3. *Let be a sequence of independent random variables, identically distributed as under sublinear expectation space . Assume that , , and , and suppose that for . Suppose that is a triangular array of real numbers. Then, (28) is equivalent to*

By arguments similar to the proofs of Theorems 1–3, we obtain the following corollary.

Corollary 1. *Let be a sequence of independent random variables, identically distributed as under sublinear expectation space . Assume that , , and , and suppose that , for . Suppose that is a triangular array of real numbers. Then,*

(i) *(27) is equivalent to*
(ii) *(31) is equivalent to (33) when .*
(iii) *(32) is equivalent to (33) when .*

The following theorem establishes complete th moment convergence for Cesàro summation of independent, identically distributed random variables under sublinear expectations.

Theorem 4. *Let be a sequence of independent random variables, identically distributed as under sublinear expectation space . Assume that , , , and , and suppose that , for . Suppose , and . Then,*

(i) *(27) is equivalent to when .*
(ii) *(31) is equivalent to (34) when .*
(iii) *(34) is equivalent to when .*

*Remark 4. *If is a classic probability space, then Theorems 1–3 and Corollary 1 recover, respectively, Theorems 10, 11, and 14 and Corollary 13 in Guo and Shan [23] in the case in which is a sequence of independent random variables, identically distributed as .

#### 4. Proofs of the Main Results

*Proof of Theorem 1. *We first prove that (27) implies (28). Notice that

From Xu and Cheng [24] and the fact that (27) implies , we see that . We next establish . Choose , with sufficiently small and integer large enough. For every , , we note that is sufficiently large to guarantee . Since the first finitely many terms of a series do not affect its convergence, without loss of generality, we may assume that the definitions of below are meaningful. Write

Observe that . Notice that

Consequently, to prove (28), we only need to prove that

From the definition of , we deduce that

Observe that implies . Therefore,

From , we see that

Hence, from Lemma 3 and (27), we obtain . From the definition of , we conclude that . From the subadditivity of the capacity and Definition 2.5 in [10], it follows that

From , we obtain . Since (27) implies , by Markov’s inequality under sublinear expectations (cf. (9) in Hu and Wu [30]) and (43), we conclude that

Notice that and ; we may choose sufficiently small and integer large enough such that and . Hence, from (44), we obtain . Similarly, we can get . In order to estimate , we first prove that

Observe that (27), Lemma 4.5 in Zhang [4], and Hölder’s inequality under sublinear expectations imply that and . When , observing that and , by Hölder’s inequality, we see that

Observing that , by (46), for any , we obtain

When , noticing that and choosing sufficiently small such that , we see that