A Robust Moment System Based on Absolute Deviations and Quantile Slicing

Elsayed A. H. Elamir
Department of Management and Marketing, College of Business Administration, Kingdom of Bahrain
Email: shabib@uob.edu.bh

Abstract
This study develops two robust, quantile-sliced moment systems, mean and median absolute deviation (MAD and MedAD) moments, to serve as foundational tools in parametric modelling, statistical inference, and the description of distributional location, scale, skewness, and tail behaviour in settings where classical moments and L-moments fail. MAD moments use block-wise absolute deviations around the median and exist whenever the mean is finite, while MedAD moments replace expectations with medians, ensuring existence for all distributions, including heavy-tailed cases with undefined mean or variance. The systems exhibit strong consistency, slice-based robustness, and bounded influence. The results show that MAD and L-moment ratios are efficient for light to moderate tails, whereas MedAD ratios remain uniquely stable when higher moments do not exist. Application to Cauchy parameter estimation highlights the practical value of MedAD estimators as simple, fully robust alternatives to likelihood-based approaches. Together, these systems offer a unified, median-anchored framework for reliable distributional inference under heavy tails and contamination.

Keywords: heavy-tailed distributions; median-based estimation; MAD; robust estimation; quantile slicing.
Subject classification code: 62G30 and 62G32
ORCID: 0000-0002-9430-072X

1 Introduction
The mean absolute deviation (MAD) avoids squaring deviations, which ensures greater stability under heavy-tailed distributions and reduced sensitivity to outliers relative to variance-based measures. Its robustness has led to widespread application across diverse fields (Gorard, 2005; Pham-Gia and Hung, 2001; Elamir, 2012).
In finance, MAD is used in volatility modelling and in mean absolute deviation portfolio optimization, offering risk measures that are less sensitive to extreme returns (Konno et al., 1993; Hosseini et al., 2023). In machine learning, the analogue of MAD is widely used as a loss function and in clustering methods (Shafizadeh et al., 2022; Holste et al., 2025). Across these and other disciplines, the MAD combination of robustness, interpretability, and broad applicability has made it a foundational tool for characterizing dispersion where classical variance-based methods perform poorly, particularly in heavy-tailed or contaminated data environments. The moment-based approach is one of the principal routes to estimating distributional characteristics. The oldest and most prominent is the method of moments (MoM; Pearson, 1936), joined more recently by L-moments (Hosking, 1990). Both frameworks aim to summarize distributional shape (location, scale, skewness, and kurtosis) through functionals computed from data, and both serve as foundational tools in parametric modelling and statistical inference (Hazelton, 2025; Breitung et al., 2022; Shin et al., 2025). The classical method of moments equates population moments with their empirical counterparts and solves the resulting system for the model parameters. Because these moments involve powers of deviations from the mean, MoM is inherently tied to the existence of finite central moments and is highly affected by outliers. Even when moments exist, the higher-order terms amplify the influence of extreme observations. To overcome the instability of central moments, L-moments were introduced as linear combinations of order statistics. The $r$-th population L-moment is given by
$$\lambda_r = \frac{1}{r} \sum_{j=0}^{r-1} (-1)^j \binom{r-1}{j} E[X_{r-j:r}],$$
where $X_{j:r}$ is the $j$-th order statistic of a conceptual sample of size $r$. The first L-moment is $\lambda_1 = E[X_{1:1}]$; this is the population mean, from a conceptual sample of size 1.
The second L-moment is $\lambda_2 = \tfrac{1}{2} E[X_{2:2} - X_{1:2}]$; this is a robust scale measure reflecting the expected distance within a conceptual sample of size 2. The third L-moment, $\lambda_3 = \tfrac{1}{3} E[X_{3:3} - 2X_{2:3} + X_{1:3}]$, compares the left and right outer values with the median in a conceptual sample of size 3. The fourth L-moment, $\lambda_4 = \tfrac{1}{4} E[X_{4:4} - 3X_{3:4} + 3X_{2:4} - X_{1:4}]$, compares the two middle parts with the two outer parts in a conceptual sample of size 4 (Elamir and Seheult, 2003). L-moment estimators exhibit smaller sampling variability and often smaller bias than classical moment estimators, making them effective even when sample sizes are limited. Despite these improvements, L-moments retain a critical dependence on the population mean and remain tail-sensitive (Sankarasubramanian and Srinivasan, 1999; Ulrych et al., 2000; Rychlik and Szymkowiak, 2025). These limitations leave a methodological gap for characterizing distributional shape when classical moments and L-moments either do not exist or behave unreliably. This study addresses that gap by developing MAD and MedAD moments, two families of quantile-localized, absolute-deviation-based shape descriptors. Built around the median rather than the mean, these moments remain stable under heavy tails, resistant to outliers, and applicable in settings where standard approaches fail. We establish a comprehensive theoretical framework for these robust alternatives, including their definitions, properties, sampling behaviour, and standardized forms, thereby providing a unified and fully applicable moment system for all distributions, especially those for which conventional methods break down. This study is organized as follows. Section 2 introduces the proposed MAD moment system. Section 3 presents estimation of MAD moments and their sampling distribution. Section 4 develops the corresponding median absolute deviation moment system. Section 5 presents estimation of median absolute deviation moments, the breakdown point, and the influence curve. Section 6 demonstrates the practical utility of the proposed methods through Cauchy parameter estimation.
The conclusion is given in Section 7.

2 The proposed MAD Moments
Let $X_1, \ldots, X_n$ be an independent and identically distributed random sample drawn from a continuous population with distribution function $F$, density $f$, and quantile function $Q$. Let the population mean be $\mu$, the population median be $m = Q(1/2)$, and the standard deviation be $\sigma$. Denote by $I(\cdot)$ the indicator function, which equals 1 if its argument holds and 0 otherwise. The corresponding order statistics of the sample are written as $X_{1:n} \le \cdots \le X_{n:n}$. Muñoz-Pérez and Sánchez-Gómez (1990) provide a key theoretical foundation through the dispersion function. Building on this, we introduce a structured, quantile-localized decomposition of the mean absolute deviation evaluated at the median. This representation treats the MAD as a special case of the dispersion function and incorporates alternating signs across quantile slices. For a continuous distribution, the $r$-th MAD moment is defined by aggregating, with alternating signs, the expected absolute deviations $|X - m|$ within quantile slices, where $m$ is the population median and the indicator term restricts contributions to observations whose quantile ranks fall within the corresponding slice of the distribution. This construction parallels L-moments but replaces linear combinations of order statistics with localized absolute deviations, yielding a family of robust, quantile-anchored measures of location, scale, and higher-order shape. By alternating signs and aggregating across quantile blocks, MAD moments capture subtle distributional asymmetries and peripheral behaviours. For $r \ge 2$, the construction partitions the distribution into $r-1$ adjacent quantile intervals of equal probability mass, specifically the intervals $[Q((j-1)/(r-1)),\, Q(j/(r-1))]$ for $j = 1, \ldots, r-1$. Within each slice, the expectation of the absolute deviation is computed. By working with absolute deviations around the median, the MAD moments remain naturally centred on a robust location measure, the median.
Thus, using the median gives MAD moments a clear, meaningful interpretation ("average distance from the middle") and ensures that the resulting shape descriptors reflect the dominant structure of the distribution. The first MAD moment is the median, the well-known robust measure of central location. The second moment is the mean absolute deviation about the median, $E|X - m|$. Unlike the standard deviation, it is not inflated by extreme values and is particularly informative for heavy-tailed distributions. The third moment uses the two quantile halves: it compares average absolute deviations below the median to those above it. A distribution with larger deviations on the right side produces a positive value (right-skew), while a distribution with heavier left-side deviations produces a negative value (left-skew). Because the measure relies on absolute deviations rather than cubic terms, it provides an interpretable sense of "which side spreads further". The fourth moment divides the distribution into three equal quantile slices, capturing how strongly the centre slice deviates relative to the periphery. If it is positive, middle deviations dominate (flatter centre / lighter periphery); if it is negative, the outer slices dominate. Unlike classical kurtosis, it achieves this without raising deviations to the fourth power, making it robust and more interpretable. It is interesting to note that, because the median is the centre of these moments, the middle term can be split at the median.

Theorem 1. The MAD moments of a real-valued random variable $X$ exist if and only if $X$ has a finite mean.
Proof. This follows from Muñoz-Pérez and Sánchez-Gómez (1990): every slice expectation of $|X - m|$ is bounded by $E|X - m|$, which is finite exactly when the mean is finite.

A natural extension of the MAD moment framework is obtained by standardizing each higher-order moment by the robust scale measure $E|X - m|$. The resulting ratio produces a dimensionless index of distributional shape that is directly comparable across datasets and measurement units.
Because the denominator represents the mean absolute deviation around the median, dividing by it ensures that the ratio reflects pure shape information rather than differences in dispersion. For $r = 3$, the standardized skewness measure compares the magnitude of deviations in the upper half of the distribution to those in the lower half. A positive value indicates greater spread above the median (right-skewness), while a negative value reflects heavier dispersion below the median (left-skewness). For $r = 4$, the standardized measure assesses the balance of deviations between the centre and the periphery; it can be called a peripheral-central measure. If it is positive, deviations in the central slice dominate (flatter centre / lighter periphery); if it is negative, deviations in the peripheral slices dominate (more peaked / heavier-periphery shape); if it is zero, peripheral and central deviations are balanced (shape symmetry in dispersion).

Theorem 2. Let $X$ be a non-degenerate random variable with finite mean. The MAD moment ratios are bounded.
Proof. Since each higher-order MAD moment is an alternating-sign combination over slices that partition the support, its magnitude cannot exceed the overall mean absolute deviation, so the ratio is bounded.

Theorem 3. MAD moments satisfy (a) location invariance and (b) scale equivariance.
Proof. Regarding location invariance, if $Y = X + c$, then quantile ranks are unchanged by translation, so slice membership does not change, and the absolute deviations $|Y - m_Y| = |X - m_X|$ are unchanged; thus all MAD moments of order $r \ge 2$ are unchanged. Regarding scale equivariance, for $Y = cX$ with $c > 0$, means scale linearly, so each summand is multiplied by $c$, yielding MAD moments that are $c$ times those of $X$.

Table 1. MAD moments and standardized shape measures for several distributions: uniform on $(a, b)$; normal with mean $\mu$ and standard deviation $\sigma$; logistic with location and scale; Laplace with location and scale $b > 0$; Cauchy with location and scale; exponential; and Pareto.

Table 1 shows the MAD moments for these distributions. The values of Γ₃ and Γ₄ vary according to how strongly peripheral deviations outweigh central ones.
Lighter-periphery distributions such as the uniform give the least negative Γ₄, whereas heavier-periphery cases such as the Laplace produce more negative values. The Cauchy has undefined MAD moments beyond the first due to its extreme heavy-tailed behaviour.

3 Estimation of MAD Moments
3.1 Sample MAD moments
Let $X_1, \ldots, X_n$ be an independent sample drawn from a continuous distribution and let $\hat{m}$ denote the sample median. The sample version of the MAD moments provides a fully empirical analogue of the population definitions based on absolute deviations and quantile-slice partitioning. For integers $r \ge 2$, the $r$-th sample MAD moment is computed by partitioning the sample into $r-1$ adjacent quantile blocks of equal probability, evaluating the block-wise mean absolute deviations from the sample median, and aggregating these deviations using the alternating-sign scheme; the indicator function restricts each average to observations in the corresponding empirical quantile slice. As in the population case, the resulting statistic provides a robust measure of distributional shape, with the first sample moment estimating the median, the second giving the empirical mean absolute deviation about the median, and higher orders describing asymmetry and peripheral behaviour in a manner resilient to outliers and heavy-tailed observations. Analogous to the standardized population MAD moments, each higher-order empirical MAD moment may be normalized by the sample scale measure, the empirical mean absolute deviation about the median. Because this is a robust estimator of scale, dividing by it produces dimensionless indices of distributional shape that are directly comparable across samples, measurement units, or distinct datasets. The standardized third sample MAD moment serves as a robust measure of skewness, comparing deviations above and below the sample median.
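As a concrete illustration, the sample MAD moments and their standardized ratios can be computed as follows. This is a minimal Python sketch: the equal-count slice boundaries, the alternating-sign convention, and the 1/(r-1) normalization are illustrative assumptions reconstructed from the prose, not the author's own code.

```python
import random
import statistics

def sample_mad_moment(x, r):
    """Sketch of the r-th sample MAD moment: r = 1 gives the median,
    r = 2 the mean absolute deviation about the median, and r >= 3 an
    alternating-sign combination of block-wise mean absolute deviations
    over r - 1 equal-count quantile blocks (sign convention assumed)."""
    m = statistics.median(x)
    if r == 1:
        return m
    dev = [abs(v - m) for v in x]
    if r == 2:
        return sum(dev) / len(dev)
    xs = sorted(x)
    k, n = r - 1, len(xs)
    total = 0.0
    for j in range(k):
        block = xs[j * n // k:(j + 1) * n // k]
        sign = (-1) ** (k - 1 - j)        # uppermost block taken positive
        total += sign * sum(abs(v - m) for v in block) / len(block)
    return total / k

random.seed(1)
sym = [random.gauss(0.0, 1.0) for _ in range(50_000)]   # symmetric sample
skw = [random.expovariate(1.0) for _ in range(50_000)]  # right-skewed sample

# standardized third MAD moment (skewness-type ratio)
g3_sym = sample_mad_moment(sym, 3) / sample_mad_moment(sym, 2)
g3_skw = sample_mad_moment(skw, 3) / sample_mad_moment(skw, 2)
```

Under this convention the symmetric normal sample gives a ratio near zero, while the right-skewed exponential sample gives a clearly positive ratio, matching the interpretation in the text.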
A positive value indicates greater spread in the upper half of the sample, while a negative value indicates heavier dispersion in the lower half. Similarly, the standardized fourth sample MAD moment provides a robust measure reflecting the balance of deviations between the centre portion of the sample and the periphery. Positive values indicate relatively broader central deviations (flatter centre), while negative values indicate stronger peripheral deviations. Standardization therefore isolates shape features independently of sample scale, preserving the interpretability and robustness of the underlying MAD-based framework. To define a kurtosis measure based on mean absolute deviation, Pinsky (2024) adopts a different approach that depends on computing two specific reference values and building distributions with these reference values as their averages.

3.2 Sampling distribution of sample MAD moments
The asymptotic behaviour of the sample MAD moments follows from their representation as linear combinations of block-wise empirical averages of the truncated deviation functions. Because each block average is the sample mean of an i.i.d. sequence, the classical multivariate central limit theorem ensures joint asymptotic normality of the block averages around their population expectations. The $r$-th sample MAD moment is a fixed alternating-sign linear combination of these slice means.

Theorem 4. Let $d_r$ denote the $r$-th sample MAD moment, built from the block-wise averages of $|X - m|$ over the quantile slices, where $m$ is the population median, and let $\Delta_r$ be the corresponding population moment. Then, as $n \to \infty$, $\sqrt{n}\,(d_r - \Delta_r)$ converges in distribution to a normal limit whose variance is the quadratic form of the alternating-sign coefficients in the covariance matrix of the slice averages.
Proof. For each slice, define the truncated deviation variable; its sample counterpart is simply the mean of i.i.d. variables. Thus, for each fixed slice, the block average is asymptotically normal by the classical central limit theorem. Because the vector of slice averages consists of averages of i.i.d.
vectors, the multivariate CLT gives joint asymptotic normality of the slice averages with some covariance matrix. The sample MAD moment can be written as a fixed alternating-sign linear combination of these averages, and the population moment is the same combination of their expectations; hence the estimation error is the same linear combination of the individual errors. Because any fixed linear combination of a multivariate normal limit is itself normally distributed, the asymptotic normality of the sample MAD moment follows, with variance given by the corresponding quadratic form; see Wasserman (2004).

Theorem 5. Let $X_1, \ldots, X_n$ be i.i.d. from a continuous distribution with population median $m$. Let $d_r$ be the $r$-th sample MAD moment, defined using the sample median and empirical quantile slices. If the population MAD moment $\Delta_r$ is finite, then $d_r \to \Delta_r$ almost surely.
Proof. Each slice-based term in $d_r$ is an empirical average and converges almost surely to its population expectation because: (i) the sample median converges to $m$ a.s.; (ii) empirical quantiles converge to population quantiles; and (iii) each truncated deviation is integrable. By the strong law of large numbers, each slice average converges a.s. to its expectation. Since $d_r$ is a finite alternating-sign linear combination of these terms, it also converges almost surely to $\Delta_r$. Hence $d_r$ is strongly consistent. The same conclusion holds for the standardized sample MAD moments, because the numerator and denominator each converge a.s. and the continuous mapping theorem applies (Casella and Berger, 2024; Wasserman, 2004).

4 Median Absolute Deviation Moments (MedAD Moments)
A natural and robust extension of the MAD-moment framework is obtained by replacing the expected value operator in the original definitions with the median functional. This leads to a new family of distributional descriptors that we refer to as median absolute deviation moments (MedAD moments). While the classical MAD moments require the existence of finite expectations, the median exists for every probability distribution on the real line, making the MedAD moment framework applicable to an even broader class of distributions, including those lacking a finite mean or finite first absolute moment, such as the Cauchy distribution (Rousseeuw and Croux, 1993; Arachchige and Prendergast, 2026; Falk, 1997; Arachchige et al., 2022). Let $m$ denote the population median.
For integers $r \ge 2$, the $r$-th MedAD moment is obtained by partitioning the support of the distribution into $r-1$ equiprobable quantile blocks, computing the median absolute deviation from $m$ within each block, and aggregating these block-wise medians using an alternating-sign scheme that parallels the construction of the classical MAD moments and L-moments. The structure of the MedAD moments closely mirrors that of the MAD moments defined earlier, with one crucial distinction: the expectation operator is replaced everywhere by the median functional. This substitution significantly increases robustness. Medians are unaffected by extreme values, unlike expectations, and they are well defined even for distributions without finite moments. Consequently, MedAD moments inherit the spirit of quantile-based shape characterization while avoiding the moment-existence restrictions inherent in the MAD moment framework. The first four MedAD moments can be described as follows. Both systems begin with the population median, so the two frameworks share the same robust location measure. The second moment measures the median absolute deviation about the median. Because the median is unaffected by extreme outliers, it provides an even more robust scale measure and is always well defined, regardless of tail behaviour; it is well known in the literature as the median absolute deviation, studied by many authors, such as Rousseeuw and Croux (1993). The third moment reflects median imbalance between the halves, making it less sensitive to long, heavy-tailed distributions and applicable even when the mean does not exist. The fourth moment measures peripheral-versus-centre deviation using three equal-probability blocks, yielding a measure that is robust to extreme tail mass and well defined for all distributions.
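As a quick numerical check of the second MedAD moment (the classical median absolute deviation), the following Python sketch computes it as the median of |x - median|; for normal data it converges to about 0.6745σ, since Φ⁻¹(0.75) ≈ 0.6745.

```python
import random
import statistics

random.seed(2)
x = [random.gauss(10.0, 2.0) for _ in range(200_000)]   # N(10, 2^2) sample

lam1 = statistics.median(x)                             # first MedAD moment: median
lam2 = statistics.median(abs(v - lam1) for v in x)      # second MedAD moment: the MedAD
```

With σ = 2, lam2 settles near 0.6745 × 2 ≈ 1.349, while lam1 estimates the location 10; both are unaffected by replacing a minority of observations with arbitrarily large outliers.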
MedAD moments retain the interpretability and quantile-based decomposition of the original MAD moments but achieve greater robustness and universal existence. Standardizing each higher-order MedAD moment by the second moment produces a dimensionless index of distributional shape that is directly comparable across datasets and measurement units. For example, the ratio at $r = 3$ is a skewness measure, and the ratio at $r = 4$ serves as a peripheral-central measure, quantifying how absolute deviations in the outer quantile slices compare with those in the central slice. If it is positive, deviations in the central slice dominate (lighter periphery); if it is negative, deviations in the peripheral slices dominate (heavier-periphery shape); if it is zero, peripheral and central deviations are balanced. Note that the standardized MedAD moments are unbounded. Therefore, the MedAD moment system provides a fully robust, quantile-localized hierarchy of location, scale, and higher-order distributional shape descriptors. Because medians are always defined, the MedAD moment approach extends the applicability of absolute-deviation-based moment methods to all probability distributions.

Theorem 6. Every MedAD moment exists and is finite for all $r$ and for every probability distribution on the real line.
Proof. Each MedAD moment is defined as either the median of $X$, the median of $|X - m|$, or a finite sum of medians of the truncated variables restricted to quantile slices. Since the median exists for every real-valued distribution, the first two moments always exist. For $r \ge 3$, each quantile slice has positive probability mass, so the truncated variable is well defined on its slice; its conditional distribution is proper, and therefore its median is finite. Because a finite sum of finite medians is also finite, all MedAD moments exist and are finite for every distribution.

Theorem 7. MedAD moments satisfy (a) location invariance and (b) scale equivariance.
Proof.
With respect to location invariance, if $Y = X + c$, then quantile ranks are unchanged by translation, so slice membership does not change, and the absolute deviations are unchanged; thus all MedAD moments of order $r \ge 2$ are unchanged. Regarding scale equivariance, for $Y = cX$ with $c > 0$, medians scale linearly, so each summand is multiplied by $c$, yielding MedAD moments that are $c$ times those of $X$. Obtaining MedAD moments depends on the distribution of $T = |X - m|$, the absolute deviation of $X$ from its median. The distribution of $T$ plays a central role in MedAD moments, since all higher-order deviations are computed through block-wise statistics of $T$ restricted to quantile slices. Since $\{T \le t\} = \{m - t \le X \le m + t\}$, the CDF of $T$ is
$$F_T(t) = F(m + t) - F(m - t), \qquad t \ge 0.$$
This representation is valid for all distributions with a well-defined median. If $X$ admits a density $f$, then $T$ also has a density given by
$$f_T(t) = f(m + t) + f(m - t).$$
If $f$ is symmetric around $m$, then $f(m - t) = f(m + t)$, and the density reduces to
$$f_T(t) = 2 f(m + t),$$
showing that the distribution of absolute deviations is simply a right-half folding of the parent density around the median. Therefore, the MedAD moments can be computed as follows.
1. First moment: the median $m$, solving $F(m) = 1/2$.
2. Second moment: the median of $T$, solving $F_T(t) = 1/2$.
3. Higher moments ($r \ge 3$): partition the distribution into $r-1$ quantile slices. For each slice, the truncated deviation variable is $T$ restricted to that slice; its conditional CDF inside the slice is the renormalized restriction of the parent probabilities to the slice, and the slice median is the value at which this conditional CDF equals $1/2$. The $r$-th MedAD moment is the alternating-sign combination of these slice medians.

Example (uniform distribution). The uniform distribution on $(a, b)$ has $F(x) = (x - a)/(b - a)$ and median $m = (a + b)/2$. Therefore:
1. The first MedAD moment is $(a + b)/2$.
2. For a uniform distribution, the folded distribution of $T = |X - m|$ is linear on $[0, (b - a)/2]$; solving $F_T(t) = 1/2$ gives the second MedAD moment $(b - a)/4$.
3. For $r = 3$: partition into two slices, below and above the median. Both slices are symmetric reflections of each other around $m$, so the deviation distribution in each slice is identical and each slice median is the same; combining with alternating signs gives a third MedAD moment of zero.
4. For $r = 4$: partition into three slices, each of probability $1/3$. Because the uniform distribution is linear inside each slice, the absolute deviation inside each slice grows linearly from its value at the slice boundary nearest the median.
The two outer slice medians are equal in magnitude by symmetry, and combining the three slice medians with the alternating-sign pattern yields the fourth MedAD moment. These results match the expected symmetry properties: zero skewness and a negative kurtosis-type value, indicating a light periphery and a flat top relative to symmetric heavy-tailed distributions.

5 Estimation of MedAD Moments
5.1 Sample MedAD moments
For an independent sample drawn from a continuous distribution, the sample MedAD moments provide empirical counterparts to the population quantities defined through quantile-localized medians. Let $\hat{m}$ denote the sample median and let the empirical quantiles determine the slice boundaries for $j = 1, \ldots, r-1$. The $r$-th sample MedAD moment is then the alternating-sign combination of the slice-specific sample medians of the absolute deviations $|X_i - \hat{m}|$. Thus, each component is computed as a slice-specific sample median of absolute deviations, and the final estimator is obtained by aggregating these components using the same alternating-sign scheme as in the population definition. Because the sample median and empirical quantiles are strongly consistent, each slice-wise truncated median is a strongly consistent estimator of its population counterpart. Consequently, the sample MedAD moments inherit consistency, and their standardized forms provide dimensionless, robust estimators of distributional shape. These estimators require no existence of moments and remain well defined under heavy-tailed or contaminated distributions, reflecting the fully robust nature of the MedAD framework. The third-order statistic captures median imbalance between the upper and lower halves of the distribution, taking positive values when deviations above the median dominate and negative values when lower-side deviations are larger. Similarly, the fourth-order statistic compares the typical deviation within the central slice to that in the two outer slices, yielding a robust indicator of peripheral heaviness versus central concentration.
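The slice-wise estimator just described can be sketched in Python and checked against the uniform example above, for which the first two MedAD moments are (a+b)/2 and (b-a)/4 and the third is zero by symmetry. The slice boundaries and sign convention below are illustrative assumptions, not the author's code.

```python
import random
import statistics

def sample_medad_moment(x, r):
    """Sketch of the r-th sample MedAD moment: median (r = 1), median
    absolute deviation about the median (r = 2), or an alternating-sign
    combination of slice-wise medians of |x - median| over r - 1
    equal-count quantile slices (r >= 3; sign convention assumed)."""
    m = statistics.median(x)
    if r == 1:
        return m
    if r == 2:
        return statistics.median(abs(v - m) for v in x)
    xs = sorted(x)
    k, n = r - 1, len(xs)
    total = 0.0
    for j in range(k):
        block = xs[j * n // k:(j + 1) * n // k]
        sign = (-1) ** (k - 1 - j)        # uppermost slice taken positive
        total += sign * statistics.median(abs(v - m) for v in block)
    return total / k

random.seed(3)
u = [random.uniform(2.0, 6.0) for _ in range(100_000)]  # Uniform(2, 6)

lam1 = sample_medad_moment(u, 1)   # population value: (2 + 6) / 2 = 4
lam2 = sample_medad_moment(u, 2)   # population value: (6 - 2) / 4 = 1
lam3 = sample_medad_moment(u, 3)   # population value: 0 (symmetry)
```

The three estimates land close to the population values 4, 1, and 0, illustrating the strong consistency claimed for the slice-wise construction.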
Because both measures are based on medians within quantile slices, they remain stable under contamination and heavy peripheries, offering interpretable shape descriptors even when classical moments do not exist.

Theorem 8. Let $\Lambda_r$ denote the $r$-th population MedAD moment and let $\hat{\Lambda}_r$ be the corresponding sample MedAD moment defined using the sample median and empirical quantile slices. For every $r$, $\hat{\Lambda}_r \to \Lambda_r$ almost surely as $n \to \infty$. That is, sample MedAD moments are strongly consistent estimators of the population MedAD moments.
Proof. The sample median and empirical quantiles converge almost surely to their population counterparts. The truncated variables converge pointwise to the population slice deviations, and the slice-wise sample medians converge almost surely to the population slice medians. Because each MedAD moment is a finite alternating sum of these slice medians, the sample moment converges almost surely to the population moment. Hence $\hat{\Lambda}_r \to \Lambda_r$ a.s. for all $r$.

Figure 1. Histograms and Q-Q plots for standardized third and fourth moments based on MAD, MedAD, L-moment, and MoM estimators under a heavy-tailed t-distribution.

Because the data come from a t-distribution with more than two degrees of freedom, the variance is finite but the distribution still has very heavy tails. With a large sample size, the stable estimators, particularly the MedAD, MAD, and L-moment ratios, approximate normality for lower-order moments; however, at the fourth order, the MAD and L-moment estimators become less stable compared with the MedAD-based measure. While the L-moment and MAD estimators are tighter than the MedAD measures, classical moments suffer from severe distortions, including wide spreads, heavy tails, and curved Q-Q plots.

5.2 Breakdown point
For $r \ge 3$, the statistic is a finite alternating sum of slice-wise medians. Each slice contains approximately $n/(r-1)$ observations. The median inside a slice breaks when half of the slice values are replaced by arbitrarily large contamination.
Thus, the smallest number of contaminated observations needed to make the statistic diverge is roughly half the observations in a single slice, about $n/(2(r-1))$. Since the statistic is an alternating sum of the slice medians, if any slice median diverges, then the entire statistic diverges. Thus, the minimal contamination needed to break the $r$-th sample MedAD moment is exactly the minimal contamination needed to break one slice, and the finite-sample breakdown point (proportion of contaminated data) is approximately $1/(2(r-1))$. For the first two MedAD moments, which do not use quantile slices, the breakdown point is $1/2$. This result shows the natural trade-off between robustness and resolution: the more slices a MedAD moment uses, the smaller the breakdown point, because each slice median depends on fewer observations (Hampel, 1985; Hekimoglu, 1997).

Figure 2 compares the sampling distributions and normal Q-Q plots of standardized shape estimators under a distribution with undefined variance. Because the variance does not exist, the MAD-, L-moment-, and classical-moment-based measures behave poorly and cannot be reliably computed, leaving the MedAD ratios as the only suitable estimators for this setting. This highlights the unique value of MedAD moments: they remain fully defined and applicable even when all other moment-based methods fail.

Figure 2. Histograms and Q-Q plots for standardized third and fourth moments based on MAD, MedAD, L-moment, and MoM estimators under a distribution with undefined variance.

5.3 Influence function
Because the influence function is a population concept, we derive the IF for the population MedAD moments, which in turn determines the asymptotic influence of the sample estimator. Let $F$ be a distribution with median $m$; the $r$-th population MedAD moment is an alternating sum of slice medians of $|X - m|$ over quantile slices of $F$. Let $F_\epsilon = (1 - \epsilon)F + \epsilon\,\delta_x$ denote the contaminated distribution; the influence function is the Gateaux derivative of the functional at $F$ in the direction of the point mass at $x$ (Hampel, 1974; Ruppert, 1987; Hekimoglu, 1997). For the median functional, the classical IF is
$$\mathrm{IF}(x; m, F) = \frac{\operatorname{sign}(x - m)}{2 f(m)},$$
where $f(m)$ is the density at the median.
For slice $j$, define the slice-restricted deviation variable and let its population slice median be $m_j$. Because the slice-restricted deviation is a univariate variable, its median has the standard median-type influence, bounded and piecewise constant in $x$, with jumps governed by the conditional density at the slice median. Since each slice median also depends on the centre $m$, a perturbation at the point $x$ affects the slice median through the shift of the centre as well; the Gateaux derivative therefore adds a centre-shift term, and the influence function for slice $j$ is the sum of the direct median term and this centre-shift contribution. The full MedAD moment is a linear combination of slice medians, so its IF is the same alternating sum of the slice IF functions. The sample MedAD estimator is an empirical plug-in estimator (Law, 1986); by standard M-estimator theory, its asymptotic influence is obtained by evaluating the plug-in estimate of the IF at a point.

Figure 3. Empirical influence functions for the L-moment skewness estimator and the MedAD skewness estimator, using data from a normal distribution.

Figure 3 compares the empirical influence functions of the L-moment skewness estimator and the MedAD skewness estimator under point contamination. The influence function of the L-moment estimator displays a smooth, continuous, and ultimately unbounded response. As the contaminating value moves into the extreme tails, the IF grows without limit, indicating that the estimator remains sensitive to sufficiently large outliers. In contrast, the MedAD estimator exhibits a piecewise, step-shaped influence function with clear flat regions and abrupt transitions when the contaminating value crosses the quantile-slice boundaries used in its construction. Note that R code for computing the empirical influence is available from the author upon request.

6 Applications
Estimating the parameters of the Cauchy distribution is challenging because its mean and variance do not exist, making classical moment-based methods unusable. As a result, most established approaches rely on likelihood, quantiles, or robust statistics. For the Cauchy distribution with location $\theta$ and scale $s$, the first two MedAD moments are $\Lambda_1 = \theta$ and $\Lambda_2 = s$, since $F(\theta + s) = 3/4$; therefore $\hat{\theta}$ is the sample median and $\hat{s}$ is the sample MedAD. To produce the results shown in Table 2, we perform a simulation study comparing four estimators of the Cauchy distribution parameters $\theta$ and $s$: MLE, MedAD, Quantile, and MGOF.
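The MedAD route to Cauchy fitting is short enough to state in full: the location estimate is the sample median and the scale estimate is the sample MedAD, since for a Cauchy(θ, s) distribution F(θ + s) = 3/4 and hence the population MedAD equals s. The sketch below (inverse-transform sampling; illustrative, not the author's simulation code) recovers both parameters from heavy-tailed data that defeat moment-based methods.

```python
import math
import random
import statistics

def rcauchy(n, theta, s, rng):
    """Cauchy(theta, s) draws via the inverse CDF
    Q(u) = theta + s * tan(pi * (u - 1/2))."""
    return [theta + s * math.tan(math.pi * (rng.random() - 0.5))
            for _ in range(n)]

def medad_cauchy_fit(x):
    """MedAD estimators: theta_hat = sample median,
    s_hat = sample median of |x - theta_hat| (the MedAD)."""
    theta_hat = statistics.median(x)
    s_hat = statistics.median(abs(v - theta_hat) for v in x)
    return theta_hat, s_hat

rng = random.Random(4)
x = rcauchy(200_000, theta=3.0, s=2.0, rng=rng)
theta_hat, s_hat = medad_cauchy_fit(x)     # recovers roughly (3, 2)
```

Both estimators are bounded-influence and need no tuning, which is the practical appeal highlighted by the simulation study that follows.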
The motivation is that the Cauchy distribution lacks a finite mean and variance, making moment-based estimators unusable, while MedAD remains well defined with $\hat{\theta}$ equal to the sample median and $\hat{s}$ equal to the sample MedAD. For each scenario, samples are drawn from a Cauchy($\theta$, $s$) distribution. Three sample sizes are considered, n = 25, 50, and 100, as used in Table 2. The estimators compared are:
• MLE: efficient under clean data but sensitive to outliers,
• MedAD: $\hat{\theta}$ = sample median, $\hat{s}$ = sample MedAD, fully robust,
• Quantile estimator: based on sample quantiles,
• MGOF: minimum goodness-of-fit estimator, with weaker performance under heavy tails.
For each sample size, we generate B = 10,000 independent samples from Cauchy($\theta$, $s$), and all four estimators are computed for each sample. The bias is the average of $\hat{\theta} - \theta$ and the MSE is the average of $(\hat{\theta} - \theta)^2$ over replications, and similarly for $s$. Table 2 summarizes the bias and mean squared error (MSE) for estimating the location ($\theta$) and scale ($s$) parameters of the Cauchy distribution using the four competing methods (Cane, 1974; McCullagh, 1993; Pekasiewicz, 2014; Delignette-Muller and Dutang, 2015; R Core Team, 2026). The simulation results clearly demonstrate the classical robustness-efficiency trade-off. MLE remains the most efficient estimator in uncontaminated Cauchy samples, achieving the lowest MSE for both $\theta$ and $s$ at all sample sizes. In contrast, the MedAD and quantile estimators offer superior robustness at the cost of a modest efficiency loss, which is consistent with their median-based construction and bounded influence properties. The MGOF estimator is the least competitive, particularly under small-sample, heavy-tailed conditions. These findings reinforce the role of MedAD and similar quantile-based procedures as valuable alternatives when robustness to outliers or contamination is desired, while MLE remains optimal under strict model validity.

Table 2.
Table 2. Bias and MSE for Cauchy parameter estimation based on different methods

              n = 25             n = 50             n = 100
              Bias     MSE       Bias     MSE       Bias     MSE
Location θ
  MLE         0.001    0.0895   -0.003    0.0421    0.002    0.0196
  MedAD       0.004    0.1040   -0.003    0.0512    0.001    0.0245
  Quantile    0.004    0.1040   -0.003    0.0512    0.001    0.0245
  MGOF        0.005    0.1260   -0.002    0.0589    0.002    0.0280
  MedAD       0.004    0.1040   -0.003    0.0512    0.001    0.0245
Scale s
  MLE        -0.003    0.0926    0.000    0.0405    0.000    0.0206
  MedAD       0.026    0.1352    0.018    0.0527    0.005    0.0262
  Quantile    0.026    0.1352    0.018    0.0527    0.005    0.0262
  MGOF        0.057    0.1164    0.028    0.0471    0.014    0.0220
  MedAD       0.031    0.1212    0.017    0.0489    0.006    0.0252

7 Conclusion

This study develops two quantile-sliced absolute deviation moment systems, MAD moments and MedAD moments, as robust alternatives to classical moments and L-moments, to serve as tools for parametric modelling and for summarizing location, scale, skewness, and tail behaviour. MAD moments retain existence under a finite mean and achieve robustness through median-centred, alternating-sign, block-wise absolute-deviation aggregation. Their standardized forms provide dimensionless shape descriptors with strong consistency and an asymptotically normal distribution. To address distributions lacking finite first absolute moments, the MedAD system replaces expectations with slice-wise medians, ensuring existence for all distributions, including those with undefined mean or variance. Breakdown-point analysis of MedAD moments demonstrates a clear robustness-resolution trade-off: the first two MedAD moments achieve a 50% breakdown point, while higher orders achieve a smaller breakdown point. The applicability of the MedAD framework is further demonstrated through Cauchy parameter estimation. In this case, the MedAD moments yield simple, fully robust estimators that remain well-defined for all sample sizes and tail configurations.
Simulation experiments confirm the robustness-efficiency trade-off: maximum likelihood estimation achieves lower MSE under uncontaminated data, but MedAD and quantile-based estimators maintain stability and bounded influence under heavy-tailed or contaminated samples, while minimum goodness-of-fit methods degrade substantially. Therefore, the MAD and MedAD frameworks form a unified, quantile-structured, median-anchored family of robust moments that remain valid across moment-non-existent settings. Future research may extend MAD and MedAD moments to the estimation of other distributions, such as the Pareto and Weibull distributions, and compare them with other methods.

Conflict of interest: The author does not have any conflict of interest.
Financial support: No financial support.

References

Arachchige, C. N., & Prendergast, L. A. (2026). Confidence intervals for median absolute deviations. Communications in Statistics-Simulation and Computation, 55(1), 13-22.
Arachchige, C. N., Prendergast, L. A., & Staudte, R. G. (2022). Robust analogs to the coefficient of variation. Journal of Applied Statistics, 49(2), 268-290.
Breitung, J., Kripfganz, S., & Hayakawa, K. (2022). Bias-corrected method of moments estimators for dynamic panel data models. Econometrics and Statistics, 24, 116-132.
Cane, G. J. (1974). Linear estimation of parameters of the Cauchy distribution based on sample quantiles. Journal of the American Statistical Association, 69(345), 243-245.
Casella, G., & Berger, R. (2024). Statistical inference. Chapman and Hall/CRC.
Elamir, E. A. (2012). On uses of mean absolute deviation: decomposition, skewness and correlation coefficients. METRON, 70(2), 145-164.
Elamir, E. A., & Seheult, A. H. (2003). Trimmed L-moments. Computational Statistics & Data Analysis, 43(3), 299-314.
Falk, M. (1997). Asymptotic independence of median and MAD. Statistics & Probability Letters, 34(4), 341-345.
Gorard, S. (2005).
Revisiting a measure of dispersion: The mean absolute deviation about the median. British Journal of Educational Studies, 53(4), 417-430.
Hazelton, M. L. (2025). Methods of moments estimation. In International Encyclopedia of Statistical Science (pp. 1464-1465). Berlin, Heidelberg: Springer Berlin Heidelberg.
Hampel, F. R. (1974). The influence curve and its role in robust estimation. Journal of the American Statistical Association, 69(346), 383-393.
Hampel, F. R. (1985). The breakdown points of the mean combined with some rejection rules. Technometrics, 27(2), 95-107.
Hekimoglu, S. (1997). Finite sample breakdown points of outlier detection procedures. Journal of Surveying Engineering, 123(1), 15-31.
Holste, G., Oikonomou, E. K., Tokodi, M., Kovács, A., Wang, Z., & Khera, R. (2025). Complete AI-enabled echocardiography interpretation with multitask deep learning. JAMA, 334(4), 306-318.
Hosking, J. R. (1990). L-moments: analysis and estimation of distributions using linear combinations of order statistics. Journal of the Royal Statistical Society Series B: Statistical Methodology, 52(1), 105-124.
Hosseini-Nodeh, Z., Khanjani-Shiraz, R., & Pardalos, P. M. (2023). Portfolio optimization using robust mean absolute deviation model: Wasserstein metric approach. Finance Research Letters, 54, 103735.
Konno, H., Shirakawa, H., & Yamazaki, H. (1993). A mean-absolute deviation-skewness portfolio optimization model. Annals of Operations Research, 45(1), 205-220.
Law, J. (1986). Robust statistics: the approach based on influence functions. Wiley Online Library.
Delignette-Muller, M. L., & Dutang, C. (2015). fitdistrplus: An R package for fitting distributions. Journal of Statistical Software, 64(4), 1-34. DOI 10.18637/jss.v064.i04.
McCullagh, P. (1993). On the distribution of the Cauchy maximum-likelihood estimator. Proceedings of the Royal Society of London.
Series A: Mathematical and Physical Sciences, 440(1909), 475-479.
Munoz-Perez, J., & Sanchez-Gomez (1990). A characterization of the distribution function: the dispersion function. Statistics and Probability Letters, 10, 235-239.
Pekasiewicz, D. (2014). Application of quantile methods to estimation of Cauchy distribution parameters. Statistics in Transition. New Series, 15(1), 133-144.
Pham-Gia, T., & Hung, T. (2001). The mean absolute deviation about the mean. Journal of Statistical Planning and Inference, 93(1-2), 1-11.
Pearson, K. (1936). Method of moments and method of maximum likelihood. Biometrika, 28(1/2), 34-59.
Pinsky, E. (2024). Mean absolute deviation (about mean) metric for kurtosis.
R Core Team (2026). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL https://www.R-project.org/.
Rousseeuw, P. J., & Croux, C. (1993). Alternatives to the median absolute deviation. Journal of the American Statistical Association, 88(424), 1273-1283.
Ruppert, D. (1987). What is kurtosis? An influence function approach. The American Statistician, 41(1), 1-5.
Rychlik, T., & Szymkowiak, M. (2025). Extreme values of scaled L-moments. Statistics & Probability Letters, 110626.
Sankarasubramanian, A., & Srinivasan, K. (1999). Investigation and comparison of sampling properties of L-moments and conventional moments. Journal of Hydrology, 218(1-2), 13-34.
Shafizadeh, A., Shahbeig, H., Nadian, M. H., Mobli, H., Dowlati, M., Gupta, V. K., ... & Aghbashlo, M. (2022). Machine learning predicts and optimizes hydrothermal liquefaction of biomass. Chemical Engineering Journal, 445, 136579.
Shin, Y., Shin, Y., & Park, J. S. (2025). Building nonstationary extreme value model using L-moments. Journal of the Korean Statistical Society, 54(4), 947-970.
Ulrych, T. J., Velis, D. R., Woodbury, A. D., & Sacchi, M. D. (2000). L-moments and C-moments. Stochastic Environmental Research and Risk Assessment, 14(1), 50-68.
Wasserman, L. (2004). All of statistics: a concise course in statistical inference (Vol. 26, p. 86). New York: Springer.