Indo-Russian Joint Conference
in Statistics & Probability
15 - 18 January 2015


Poster Presentations

15 January 2015

2:30pm - 5:20pm

Poster Session 1

Anirvan Chakraborty, ISI Kolkata

On the behaviour of some tests for high dimensional data

For the two-sample problem involving data with large dimensions and small sample sizes, we show that the performance of the test based on spatial ranks and of different tests based on sample means does not depend much on the heaviness of the tails of the distributions. Instead, it largely depends on the underlying dependence structure. All these tests have the same asymptotic power under appropriate mixing conditions on the coordinate variables as the dimension grows and the sample size is kept fixed. This is in striking contrast to the performance of Hotelling's \(T^{2}\) test and the spatial rank based tests in the classical multivariate setup, where the data dimension is smaller than the sample size. In some other models involving stronger dependence among the coordinate variables, the spatial rank based test significantly outperforms the mean based tests.
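
As a rough numerical illustration only (these are crude proxies, not the test statistics studied in the abstract), the sketch below contrasts a mean-difference statistic with a spatial-sign-based statistic in a setting where the dimension far exceeds the sample sizes; all names and parameter choices are illustrative.

```python
# Crude illustration only: a mean-based statistic versus a spatial-sign-based
# statistic when the dimension is much larger than the sample sizes.
import numpy as np

rng = np.random.default_rng(0)
p, n, m = 500, 20, 20                       # dimension >> sample sizes

X = rng.standard_normal((n, p))             # sample 1
Y = rng.standard_normal((m, p)) + 0.2       # sample 2, small mean shift

# Mean-based statistic: squared distance between the two sample means.
mean_stat = np.sum((X.mean(axis=0) - Y.mean(axis=0)) ** 2)

# Spatial-sign-based statistic: squared norm of the average spatial sign
# (unit vector) of the pairwise differences X_i - Y_j.
diffs = X[:, None, :] - Y[None, :, :]                         # shape (n, m, p)
signs = diffs / np.linalg.norm(diffs, axis=2, keepdims=True)
sign_stat = np.sum(signs.reshape(-1, p).mean(axis=0) ** 2)

print("mean-based statistic:", mean_stat)
print("spatial-sign statistic:", sign_stat)
```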

Kaustav Nandy, ISI Delhi

Semiparametric Mixture Density Modeling

Modeling mixture distributions is a well studied problem in statistics. In this work we consider a special case, where the data comes from a mixture of an arbitrary decreasing density and a parametric density, a model with applications in particle physics and astrophysics. However, often in physics the instruments do not record the exact values, instead producing observations as bin counts in predetermined intervals. Our model deals with this type of data. An advantage of the model is that it reduces to one with a finite-dimensional parameter space. As is common with mixture modeling, we use the EM algorithm to find maximum likelihood estimates, using a variant of Grenander's least concave majorant estimator to estimate the decreasing density. Simulation is used to explore the performance of our estimators. This is joint work with Deepayan Sarkar.
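
The following sketch runs EM for a simplified, unbinned version of the model: a mixture of an unknown decreasing density \(g\) on \((0,\infty)\) and a Gaussian component, with \(g\) updated by a weighted Grenander-type (least concave majorant) step implemented via isotonic regression. The binning aspect is omitted, and the simulated data, function names, and update details are illustrative assumptions, not the authors' implementation.

```python
# Simplified sketch: EM for p * g(x) + (1 - p) * N(x; mu, sigma), with g an
# unknown decreasing density updated by a weighted Grenander-type step.
import numpy as np
from scipy.stats import norm
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(1)
n = 2000
z = rng.random(n) < 0.6
x = np.where(z, rng.exponential(1.0, n), rng.normal(3.0, 0.3, n))   # simulated data

def grenander(xs, w):
    """Weighted Grenander-type estimate of a decreasing density on (0, max(xs)]."""
    order = np.argsort(xs)
    xs, w = xs[order], w[order]
    gaps = np.diff(np.concatenate(([0.0], xs)))
    gaps[gaps == 0] = 1e-12
    raw = w / (w.sum() * gaps)                     # naive slopes of the weighted ECDF
    iso = IsotonicRegression(increasing=False)     # decreasing isotonic fit = LCM slopes
    dens = iso.fit_transform(np.arange(len(xs)), raw, sample_weight=gaps)
    def f(t):                                      # step-function evaluator
        idx = np.clip(np.searchsorted(xs, t, side="left"), 0, len(xs) - 1)
        return dens[idx]
    return f

p, mu, sigma = 0.5, x.mean(), x.std()
r = np.full(n, 0.5)                                # responsibilities of the g-component
for _ in range(50):
    # M-step: weighted Grenander step for g, weighted Gaussian parameter updates
    g = grenander(x, r)
    p = r.mean()
    mu = np.sum((1 - r) * x) / np.sum(1 - r)
    sigma = np.sqrt(np.sum((1 - r) * (x - mu) ** 2) / np.sum(1 - r))
    # E-step: posterior probability that each point came from g
    num = p * g(x)
    r = num / (num + (1 - p) * norm.pdf(x, mu, sigma))

print("mixing weight:", round(p, 3), " mu:", round(mu, 3), " sigma:", round(sigma, 3))
```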

16 January 2015

9:30am - 1:00pm

Poster Session 2

Sourish Das, CMI

Fast Algorithm for Gaussian Process Regression for Big Data

Gaussian process regression is a popular class of non-linear regression models in statistical machine learning. However, the time complexity of such a model is \(O(n^3)\) and the space complexity is \(O(n^2)\), where \(n\) is the sample size. As a result, fitting such a model is infeasible for large data. We propose a resampling-based fast algorithm of order \(O(n^e)\), where \(1\lt e \lt 2\). A simulation study will be presented in which the parameter estimates from this algorithm are reasonably close to those from the standard brute-force algorithm. We present an application from image processing, and a mean square error (MSE) comparison between the proposed method and the popular median and mean filters will be presented. Joint work with Rajiv Sambasivan and Sasanka Roy.
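
As one possible resampling strategy (a hedged sketch, not necessarily the authors' algorithm), the example below fits exact Gaussian processes on several random subsamples of size \(m \ll n\) and averages their predictions, replacing the \(O(n^3)\) full fit by a few \(O(m^3)\) fits; the data, kernel, and subsample sizes are illustrative.

```python
# Fit exact GPs on B random subsamples of size m << n and average predictions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
n = 5000
X = rng.uniform(0, 10, size=(n, 1))
y = np.sin(X[:, 0]) + 0.2 * rng.standard_normal(n)      # noisy test function

m, B = 300, 10                                          # subsample size, number of resamples
X_test = np.linspace(0, 10, 200)[:, None]
preds = []
for _ in range(B):
    idx = rng.choice(n, size=m, replace=False)
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp.fit(X[idx], y[idx])                              # O(m^3) instead of O(n^3)
    preds.append(gp.predict(X_test))

y_hat = np.mean(preds, axis=0)                          # aggregated prediction
print("MSE against the true signal:", np.mean((y_hat - np.sin(X_test[:, 0])) ** 2))
```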

Gursharn Kaur, ISI Delhi

Urn Schemes with Negative Reinforcement

In this work we consider negatively reinforced Polya-type urn processes with finitely many colors. We consider two types of such processes, Linear and Inverse. We show that in the case of only two colors, both processes are equivalent and the limiting configuration of the urn is uniform a.s. For more than two colors, the configuration of the urn after \(n\) draws converges a.s. to a limiting distribution, which is uniform if and only if the uniform distribution is the unique stationary distribution of the underlying Markov chain.
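
A toy simulation of one simple variant of negative reinforcement (add a ball of every color except the one drawn) is sketched below; the authors' Linear and Inverse schemes may differ in detail, and the example only illustrates the drift of the urn configuration towards the uniform distribution.

```python
# Toy negatively reinforced urn: when a color is drawn, add one ball of every
# OTHER color.  The proportions drift towards the uniform configuration.
import numpy as np

rng = np.random.default_rng(3)
K = 4
counts = np.array([40.0, 5.0, 3.0, 2.0])    # heavily unbalanced start

for _ in range(100000):
    drawn = rng.choice(K, p=counts / counts.sum())
    counts[np.arange(K) != drawn] += 1.0    # negative reinforcement

print("final proportions:", np.round(counts / counts.sum(), 3))
```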

2:30pm - 5:20pm

Poster Session 3

Ananya Lahiri, CMI

Inference for option prices when the stock price is driven by a fractional Brownian motion

Diffusion processes driven by fractional Brownian motion (FBM) have often been considered for modeling stock price dynamics in order to capture the long range dependence of stock prices observed in reality. Option prices for such models were obtained by Necula (2002) under constant drift and volatility. We obtain option prices under a time varying volatility model. The expression depends on the volatility and the Hurst parameter in a complicated manner. The properties of estimators of the volatility and of the Hurst parameter have been studied separately before. We derive a central limit theorem for the quadratic variation as an estimator of the volatility. We obtain estimates of option prices and their asymptotic distributions.
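
A minimal sketch of the quadratic variation idea, assuming the simple model \(X_t = \sigma B_H(t)\) with constant volatility: simulate FBM by a Cholesky factorisation of its covariance and recover \(\sigma\) from the mean squared increment, using \(E(X_{t+\delta}-X_t)^2 = \sigma^2 \delta^{2H}\). Grid size and parameter values are illustrative, and the time varying volatility case is not addressed.

```python
# Simulate sigma * B_H on a grid via Cholesky of the FBM covariance and
# estimate sigma from the quadratic variation of the increments.
import numpy as np

rng = np.random.default_rng(4)
H, sigma, n = 0.7, 0.5, 1000
t = np.arange(1, n + 1) / n

# FBM covariance: Cov(B_H(s), B_H(t)) = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H})
S, T = np.meshgrid(t, t)
cov = 0.5 * (S ** (2 * H) + T ** (2 * H) - np.abs(T - S) ** (2 * H))
B_H = np.linalg.cholesky(cov + 1e-10 * np.eye(n)) @ rng.standard_normal(n)
X = sigma * B_H

delta = 1.0 / n
incr = np.diff(np.concatenate(([0.0], X)))             # increments, X(0) = 0
sigma_hat = np.sqrt(np.mean(incr ** 2) / delta ** (2 * H))
print("true sigma:", sigma, " estimated sigma:", round(sigma_hat, 3))
```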

Tulasi Ram Reddy, IISc.

Critical points of random polynomials

Let \(a_1, a_2, a_3, \dots\) be complex numbers such that their empirical measure converges weakly to a compactly supported probability measure in the complex plane. Let \(\xi_i = a_i + \sigma_i X_i\) be a sequence such that \(\sigma_i \downarrow 0\), \(\lim\limits_{n\rightarrow \infty}\sqrt[n]{\sigma_n}=1\), and the \(X_i\)'s are i.i.d. standard complex Gaussian random variables. Then the empirical measure of the critical points of the polynomial with roots \(\xi_1, \dots, \xi_n\) converges weakly to the same limit as the empirical measure of its zeros. This is a generalization of a theorem proved by Kabluchko in arXiv:1206.6692.
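
A small numerical illustration of the statement (all parameter choices are ours): draw \(a_i\) from a compactly supported measure, perturb them to \(\xi_i = a_i + \sigma_i X_i\) with \(\sigma_i \downarrow 0\), and compare a few moments of the empirical measures of the roots and of the critical points.

```python
# Build a polynomial with roots xi_i = a_i + sigma_i * X_i and compare the
# empirical measures of its roots and of its critical points via moments.
import numpy as np

rng = np.random.default_rng(5)
n = 80

# a_i from a compactly supported measure: uniform on the unit disk.
radius, angle = np.sqrt(rng.random(n)), 2 * np.pi * rng.random(n)
a = radius * np.exp(1j * angle)

sigma = 1.0 / np.arange(1, n + 1)            # sigma_i decreasing to 0, sigma_n^{1/n} -> 1
X = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
xi = a + sigma * X                           # perturbed roots

coeffs = np.poly(xi)                         # monic polynomial with roots xi
crit = np.roots(np.polyder(coeffs))          # its critical points

for k in (1, 2):                             # compare a few empirical moments
    print(k, np.round(np.mean(xi ** k), 3), np.round(np.mean(crit ** k), 3))
```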