Showing 1 to 10 of 2603 matching Articles
By
Mignani, Stefania; Cagnone, Silvia; Casadei, Giorgio; Carbonaro, Antonella
The aim of this paper is to evaluate student learning in Computer Science subjects. A questionnaire based on ordinal-scored items was administered to the students through a computer-automated system. The collected data were analyzed using a latent variable model for ordinal data within the Item Response Theory framework. The scores obtained from the model allow the students to be classified according to the competence they have reached.
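As a very reduced illustration of this setting (not the paper's IRT model), one can simulate ordinal item responses driven by a latent competence and classify students by quartiles of the total score; the four-level response scale, the cutpoints, and the total-score proxy for the latent trait are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n_students, n_items = 120, 10
ability = rng.normal(size=n_students)           # latent competence (simulated)

# ordinal responses 0..3, stochastically increasing in ability:
# a response exceeds each cutpoint the latent ability + noise clears
cutpoints = np.array([-1.0, 0.0, 1.0])
responses = (ability[:, None] + rng.normal(size=(n_students, n_items))
             > cutpoints[:, None, None]).sum(axis=0)

total = responses.sum(axis=1)                   # crude stand-in for IRT scores
# classify students into four competence bands by rank of the total score
bands = np.argsort(np.argsort(total)) * 4 // n_students
```

A real analysis would replace the total score with person-level estimates from a graded response model; the quartile classification step would stay the same.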
By
Amendola, Alessandra; Storti, Giuseppe
This paper proposes a modified approach to the combination of forecasts from multivariate volatility models, in which the combination is performed over a restricted subset including only the best-performing models. This subset is identified over a rolling window by means of the Model Confidence Set (MCS) approach. The analysis is carried out using different combination schemes, both linear and non-linear, and considering different loss functions for the evaluation of forecasting performance. An application to a vast-dimensional portfolio of 50 NYSE stocks shows that (a) in non-extreme volatility periods the use of forecast combinations improves on the predictive accuracy of the single candidate models, and (b) performing the combination over the subset of most accurate models does not significantly reduce the accuracy of the combined predictor.
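The selection-then-combination step can be sketched as follows; the equal-weight linear scheme, the QLIKE loss, and the simple "keep the m best" rule used as a stand-in for the full MCS procedure are all assumptions, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
T, M = 300, 5                       # time points, candidate models
true_var = 0.5 + 0.3 * np.abs(np.sin(np.arange(T) / 20.0))
# synthetic forecasts: each "model" is the truth plus model-specific noise
fcst = true_var[:, None] + rng.normal(0.0, 0.05 * (1 + np.arange(M)), (T, M))
fcst = np.clip(fcst, 0.05, None)    # keep variance forecasts positive
realized = true_var + rng.normal(0.0, 0.02, T)   # realized-variance proxy

def qlike(f, r):
    """QLIKE loss, a standard loss for volatility forecast evaluation."""
    return r / f - np.log(r / f) - 1.0

window, keep = 100, 2               # rolling window length, subset size
combined = np.full(T, np.nan)
for t in range(window, T):
    # average loss of each candidate over the rolling window
    avg_loss = qlike(fcst[t - window:t], realized[t - window:t, None]).mean(axis=0)
    best = np.argsort(avg_loss)[:keep]      # crude stand-in for the MCS step
    combined[t] = fcst[t, best].mean()      # equal-weight linear combination
```

The paper's non-linear schemes and alternative loss functions would replace the last two lines inside the loop; the rolling selection structure is the same.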
By
Raffinetti, Emanuela; Giudici, Paolo
2 Citations
Theoretical contributions to "good" taxation have focused on the relation between efficiency and vertical equity without considering the notion of "horizontal equity": only recently have measures of the equity (or inequity) of a taxation scheme been introduced in the literature. There, the taxation problem is limited to the study of two quantitative characters; however, the concordance problem can be extended to a more general context, as we present in the following sections. In particular, the aim of this contribution is to define concordance indexes, as dependence measures, in a multivariate context. For this purpose a k-variate (k > 2) concordance index is provided, resorting to statistical tools such as a rank-based approach and the multiple linear regression function. All the theoretical topics involved are illustrated through a practical example.
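One plausible reading of such a rank-based k-variate index, sketched here as an assumption rather than the authors' exact definition, is the multiple correlation coefficient obtained by regressing the ranks of one variable on the ranks of the others; for k = 2 this reduces to the absolute value of Spearman's rho:

```python
import numpy as np

def ranks(v):
    """Ranks 1..n (ties not handled; enough for a sketch)."""
    return np.argsort(np.argsort(v)) + 1.0

def concordance_index(y, X):
    """Multiple correlation of rank(y) on the ranks of the columns of X."""
    ry = ranks(y)
    R = np.column_stack([np.ones(len(y))] + [ranks(c) for c in X.T])
    beta, *_ = np.linalg.lstsq(R, ry, rcond=None)
    ss_res = np.sum((ry - R @ beta) ** 2)
    ss_tot = np.sum((ry - ry.mean()) ** 2)
    return np.sqrt(max(0.0, 1.0 - ss_res / ss_tot))

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
y = x1 + 0.5 * x2 + rng.normal(scale=0.3, size=200)
c = concordance_index(y, np.column_stack([x1, x2]))  # near 1: strong concordance
```

The index lies in [0, 1] and equals 1 exactly when the ranks of y are a perfect linear function of the ranks of the other variables.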
By
Bianchi, Annamaria
Economic indicators need to be estimated at the regional level. Small area estimation based on M-quantile regression was introduced by Chambers and Tzavidis (Biometrika 93:255–268, 2006) and has proved to be a valid alternative to traditional methods. Thus far, however, the method has only been applied to cross-sectional data, and it is well known that the use of panel data may provide significant gains in the efficiency of the estimators. This paper explores possible extensions of M-quantile-based small area estimators to the panel data context. A model-based simulation study is presented.
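The cross-sectional M-quantile device can be sketched roughly as follows, with ordinary linear quantile regression standing in for M-quantile regression and the panel extension left aside; all data and area effects below are simulated:

```python
import numpy as np
from scipy.optimize import minimize

def qreg(X, y, q):
    """Linear quantile regression by direct pinball-loss minimization."""
    def pinball(b):
        r = y - X @ b
        return np.sum(np.maximum(q * r, (q - 1.0) * r))
    return minimize(pinball, np.zeros(X.shape[1]), method="Nelder-Mead").x

rng = np.random.default_rng(2)
n, n_areas = 300, 3
area = rng.integers(0, n_areas, n)
x = rng.normal(size=n)
# area effects shift the response; the unit-level q-values should pick this up
y = 1.0 + 2.0 * x + np.array([-0.5, 0.0, 0.5])[area] + rng.normal(scale=0.3, size=n)
X = np.column_stack([np.ones(n), x])

qs = np.linspace(0.1, 0.9, 17)
fits = np.array([X @ qreg(X, y, q) for q in qs])       # fitted values per q
# unit-level q: grid value whose fitted quantile line is closest to the unit
unit_q = qs[np.abs(fits - y).argmin(axis=0)]
# area-level q: mean of its units' q-values (the Chambers-Tzavidis device)
area_q = np.array([unit_q[area == a].mean() for a in range(n_areas)])
```

Areas with a negative effect end up with an average q below 0.5 and areas with a positive effect above it; the area-level estimator then plugs `area_q` back into the fitted quantile function.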
By
Ingrassia, Salvatore; Minotti, Simona C.; Vittadini, Giorgio
49 Citations
Cluster-weighted modeling (CWM) is a mixture approach to modeling the joint probability of data coming from a heterogeneous population. Under Gaussian assumptions, we investigate the statistical properties of CWM from both a theoretical and a numerical point of view; in particular, we show that Gaussian CWM includes mixtures of distributions and mixtures of regressions as special cases. Further, we introduce CWM based on Student-t distributions, which provides a more robust fit for groups of observations with longer-than-normal tails or noise data. The theoretical results are illustrated by empirical studies on both simulated and real data. Some generalizations of these models are also outlined.
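A minimal one-dimensional Gaussian CWM fitted by EM illustrates the model class (two groups with local linear regressions; the Student-t variant and the multivariate case are not attempted, and the data are simulated):

```python
import numpy as np

rng = np.random.default_rng(3)
n, G = 400, 2
grp = rng.random(n) < 0.5
x = np.where(grp, rng.normal(-2, 1, n), rng.normal(2, 1, n))
y = np.where(grp, 1.0 + x, -1.0 - x) + rng.normal(0, 0.3, n)

def normpdf(v, m, s2):
    return np.exp(-(v - m) ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

pi = np.full(G, 1.0 / G)                            # mixing weights
mu, s2x = np.array([-1.0, 1.0]), np.ones(G)         # covariate means/variances
b0, b1, s2y = np.zeros(G), np.zeros(G), np.ones(G)  # local regressions
A = np.column_stack([np.ones(n), x])
for _ in range(50):
    # E step: responsibilities from the joint density of (x, y)
    dens = np.stack([pi[g] * normpdf(x, mu[g], s2x[g])
                     * normpdf(y, b0[g] + b1[g] * x, s2y[g]) for g in range(G)])
    w = dens / dens.sum(axis=0)
    # M step: weighted moments for x, weighted least squares for y | x
    for g in range(G):
        wg, sw = w[g], w[g].sum()
        pi[g] = sw / n
        mu[g] = (wg * x).sum() / sw
        s2x[g] = (wg * (x - mu[g]) ** 2).sum() / sw
        beta = np.linalg.solve((A * wg[:, None]).T @ A, (A * wg[:, None]).T @ y)
        b0[g], b1[g] = beta
        s2y[g] = (wg * (y - b0[g] - b1[g] * x) ** 2).sum() / sw
```

The point the abstract makes is visible in the E step: because the responsibilities involve the density of x as well as the regression of y on x, CWM reduces to a mixture of regressions when the x-densities are equal across groups, and to a plain mixture of distributions when the regressions are constant.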
By
Giordano, Francesco; Niglio, Marcella; Restaino, Marialuisa
The crisis of the first decade of the 21st century has definitely changed the approaches used to analyze data originating from financial markets. This break, together with the growing availability of information, has led to a revision of the methodologies traditionally used to model and evaluate phenomena related to financial institutions. In this context we focus on the estimation of bank defaults: a large literature has addressed modeling the binary dependent variable that characterizes this empirical domain, and promising results have been obtained from regression methods based on extreme value theory. Here we consider, as dependent variable, a strongly asymmetric binary variable whose probabilistic structure can be related to the Generalized Extreme Value (GEV) distribution. Further, we propose to select the independent variables through proper penalty procedures and appropriate data screenings, which can be of great interest in the presence of large datasets.
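A GEV-type binary regression can be sketched in its Gumbel limit (shape parameter equal to zero), where the link is P(Y = 1 | x) = exp(−exp(−x′β)); the penalized variable-selection step described in the abstract is omitted, and all data below are simulated:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)
true_b0, true_b1 = -1.0, 1.0                   # low intercept -> rare-ish event
p = np.exp(-np.exp(-(true_b0 + true_b1 * x)))  # Gumbel-limit GEV link
y = rng.random(n) < p                          # asymmetric binary outcome

def negloglik(b):
    pr = np.exp(-np.exp(-(b[0] + b[1] * x)))
    pr = np.clip(pr, 1e-12, 1.0 - 1e-12)       # guard the logs
    return -np.sum(np.where(y, np.log(pr), np.log1p(-pr)))

fit = minimize(negloglik, np.zeros(2), method="BFGS")
b0_hat, b1_hat = fit.x
```

Unlike the symmetric logit link, this link approaches 0 and 1 at different rates, which is the feature the abstract exploits for strongly asymmetric default data; a nonzero GEV shape parameter would add one more parameter to the likelihood.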
By
Muliere, Pietro; Secchi, Piercesare
Summary
Let {X_{n}} be a sequence of random variables that are conditionally independent and identically distributed given the random variable Θ. The aim of this paper is to show that in many interesting situations the conditional distribution of Θ, given (X_{1},…,X_{n}), can be approximated by means of the bootstrap procedure proposed by Efron, applied to a statistic T_{n}(X_{1},…,X_{n}) that is sufficient for predictive purposes. It will also be shown that, from the predictive point of view, this is consistent with the results obtained following a common Bayesian approach.
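The claim can be checked numerically in the simplest exchangeable setting, Bernoulli observations with a uniform prior on Θ, where the sample mean is a sufficient statistic and the exact posterior is Beta; the sample sizes below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(5)
n, theta = 500, 0.3
x = (rng.random(n) < theta).astype(float)
s = int(x.sum())

# Efron bootstrap of the sufficient statistic T_n = sample mean
boot = np.array([rng.choice(x, size=n, replace=True).mean()
                 for _ in range(4000)])

# exact Bayesian posterior under a uniform prior: Beta(s + 1, n - s + 1)
post_mean = (s + 1) / (n + 2)
post_sd = np.sqrt((s + 1) * (n - s + 1) / ((n + 2) ** 2 * (n + 3)))
# boot.mean() and boot.std() should approximate post_mean and post_sd
```

For moderate n the bootstrap distribution of the sample mean and the Beta posterior have nearly the same location and spread, which is the kind of agreement the paper establishes in general.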
By
Giordano, Francesco; Coretto, Pietro
Signal-to-noise ratio (SNR) statistics play a central role in many applications. A common setting in which the SNR is studied is a continuous-time signal sampled at a fixed frequency with some noise in the background. While estimation methods exist, little is known about the distribution of the SNR statistic when the noise is not weakly stationary. In this paper we develop a nonparametric method to estimate the distribution of an SNR statistic when the noise belongs to a fairly general class of stochastic processes encompassing both short- and long-range dependence, as well as nonlinearities. The method is based on a combination of smoothing and subsampling techniques. Computations are carried out only at the subsample level, which makes it possible to handle the typically enormous sample sizes produced by modern data acquisition technologies. We derive asymptotic guarantees for the proposed method and show its finite-sample performance in numerical experiments. Finally, we propose an application to electroencephalography data.
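The subsampling idea can be sketched as follows; the particular SNR definition (smoothed-signal variance over residual variance), the AR(1) noise, and the block length are all illustrative assumptions, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(6)
T = 20000
t = np.arange(T)
signal = np.sin(2 * np.pi * t / 200.0)      # sampled continuous-time signal
# AR(1) noise: dependent, so an i.i.d. bootstrap would be inappropriate
noise = np.empty(T)
noise[0] = rng.normal()
for i in range(1, T):
    noise[i] = 0.7 * noise[i - 1] + rng.normal(scale=0.3)
x = signal + noise

def snr(block):
    """Crude SNR: moving-average 'signal' variance over residual variance."""
    k = 21
    sm = np.convolve(block, np.ones(k) / k, mode="valid")
    resid = block[k // 2: k // 2 + len(sm)] - sm
    return sm.var() / resid.var()

b = 2000                                    # subsample (block) length
starts = np.arange(0, T - b + 1, b // 2)    # overlapping blocks
snr_dist = np.array([snr(x[s:s + b]) for s in starts])
# empirical distribution of snr_dist approximates the statistic's distribution
```

The computational point from the abstract is visible here: the statistic is only ever evaluated on blocks of length b, never on the full series, so the cost scales with the number of blocks rather than with T.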
