
Research Interests:

​

My recent work has focused on discrete log-concavity in the probabilistic setting. Log-concavity is a property shared by many sequences appearing in combinatorics, algebra, and geometry. One classical example stems from a result of Newton, which states that the coefficients of any real-rooted polynomial form a log-concave sequence. This survey of Stanley provides further examples of log-concave sequences. A natural question is the following: why does log-concavity show up in so many areas of mathematics? While considerable research has been devoted to proving the log-concavity of certain naturally occurring sequences, little is known about the underlying structure responsible for this condition. For more details about work in this direction, see the breakthroughs of Huh-Brändén and Adiprasito-Huh-Katz.
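For concreteness, here is the basic definition together with one standard form of Newton's inequalities (standard material, recorded here only for reference):

\[
(a_k)_{k=0}^{n} \ \text{is log-concave if} \quad a_k^2 \,\ge\, a_{k-1}\,a_{k+1} \qquad \text{for all } 1 \le k \le n-1.
\]

If \(p(x) = \sum_{k=0}^{n} a_k x^k\) has only real roots and nonnegative coefficients, then Newton's inequalities give the stronger bound

\[
a_k^2 \,\ge\, a_{k-1}\,a_{k+1}\,\frac{(k+1)(n-k+1)}{k\,(n-k)}, \qquad 1 \le k \le n-1,
\]

which in particular implies log-concavity of the coefficient sequence.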

 

In probability, the log-concavity assumption provides a broad, flexible, yet natural convolution-stable class of distributions on the integers. Examples include the Bernoulli, sums of independent Bernoullis, geometric, binomial, negative binomial, and Poisson distributions. While log-concave measures, their geometry, and their properties are well understood in the continuous setting (for example, the Gaussian), the study of discrete log-concavity in the probabilistic setting is far more limited. I am interested in investigating various properties of this class of distributions and its connections to other areas of mathematics. More specifically, the following are some topics (not necessarily related to log-concavity) of interest; a small numerical illustration of the discrete log-concave class appears after the list.

​

  • Discrete log-concave distributions; Concentration of measure phenomenon, moment comparisons, information-theoretic inequalities, entropy minimization/maximization under moment constraints, and connections to convex geometry and combinatorics.

​

  • Discrete log-concavity in higher dimensions; Discrete log-concave measures in higher dimensions (equivalently, discrete convexity), relationship with log-concave densities, extensions of existing probabilistic results to log-concave measures on high-dimensional integer lattices, and localization and majorization-type techniques.

​

  • Functional inequalities in the discrete setting; log-Sobolev and Prékopa–Leindler type inequalities for discrete measures with applications. 

​

  • Random graphs; limit theorems for minimal spanning trees.

​

  • Statistical mechanics; mixing times of Markov chains, path coupling (I am very much a beginner in this area).
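The sketch below is the small numerical illustration of the discrete log-concave class mentioned above (my own toy example, assuming scipy and numpy are available; not code from any of the papers). It checks the condition p(k)^2 >= p(k-1) p(k+1) for a Poisson pmf and for the convolution of two geometric pmfs, illustrating stability under convolution.

    # Toy check of discrete log-concavity: p(k)^2 >= p(k-1) * p(k+1).
    import numpy as np
    from scipy import stats

    def is_log_concave(p, tol=1e-12):
        """Check the log-concavity condition for a pmf given on 0, 1, ..., len(p)-1."""
        p = np.asarray(p, dtype=float)
        return bool(np.all(p[1:-1] ** 2 >= p[:-2] * p[2:] - tol))

    k = np.arange(60)
    poisson_pmf = stats.poisson.pmf(k, mu=3.0)            # Poisson(3)
    geometric_pmf = stats.geom.pmf(k + 1, p=0.4)          # geometric on {0, 1, 2, ...}
    convolved = np.convolve(geometric_pmf, geometric_pmf)[:60]  # sum of two geometrics

    print(is_log_concave(poisson_pmf))    # True
    print(is_log_concave(geometric_pmf))  # True
    print(is_log_concave(convolved))      # True: the class is closed under convolution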


Papers & Preprints:

​

  • On a conjecture of Feige for discrete log-concave distributions, SIAM J. Discrete Math., 38(1), 93–102, 2024. [arXiv, journal]

        (with A. Alqasem, A. Marsiglietti and J. Melbourne)

​

​

Description

​

The focus of this work is a conjecture of Feige, which can be stated as follows: let X be the sum of n independent non-negative random variables X_1, X_2, ..., X_n, each with expectation at most 1. Then P(X < E[X] + 1) is at least 1/e.
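In symbols, the conjecture reads (this is just a restatement of the sentence above):

\[
X = X_1 + \cdots + X_n, \quad X_i \ge 0 \ \text{independent}, \quad \mathbb{E}[X_i] \le 1
\ \Longrightarrow\
\mathbb{P}\big(X < \mathbb{E}[X] + 1\big) \ \ge\ \frac{1}{e}.
\]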

 

Standard probabilistic tools fail to address this problem, as they only provide trivial lower bounds. For example, Markov's inequality yields a trivial bound when the X_i's are i.i.d. and n is large. Chebyshev's inequality is not applicable since the variance of each X_i can be arbitrary. Similarly, Hoeffding's and Bennett's inequalities are not useful for lower bounding P(X < E[X] + t) when t is small. Based on a rather involved case analysis, Feige managed to show that P(X < E[X] + 1) ≥ 1/13. Since then, several attempts have been made to improve the lower bound. The current best-known bound is 0.1798 (due to Guo-He-Ling-Liu).
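To spell out the Markov step (a routine calculation, included only for completeness): since X is non-negative and E[X] <= n, Markov's inequality applied at the level E[X] + 1 gives

\[
\mathbb{P}\big(X \ge \mathbb{E}[X] + 1\big) \ \le\ \frac{\mathbb{E}[X]}{\mathbb{E}[X] + 1},
\qquad\text{hence}\qquad
\mathbb{P}\big(X < \mathbb{E}[X] + 1\big) \ \ge\ \frac{1}{\mathbb{E}[X] + 1} \ \ge\ \frac{1}{n + 1},
\]

which tends to 0 as n grows and is therefore far weaker than the conjectured constant 1/e.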

​

The conjectured bound has been verified for binomial and Bernoulli sums (see Garnett). In fact, these results follow from a special case of Samuels' conjecture, a conjecture similar in nature to Feige's. It is also known that a stronger inequality holds when the X_i's are Poisson (this follows from a result of Teicher). Our work extends these results to the whole class of discrete log-concave random variables. More specifically, we show that the conjectured bound 1/e holds when the X_i's are independent discrete log-concave random variables with arbitrary expectation.
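As a quick numerical sanity check (my own illustration assuming scipy is available; not code from the paper), one can verify the bound for two log-concave examples whose sums have closed forms: i.i.d. Poisson(1) summands (the sum is Poisson(n)) and i.i.d. geometric summands with mean 1 (the sum is negative binomial).

    # Sanity check of the 1/e bound for two discrete log-concave examples.
    import math
    from scipy import stats

    target = 1 / math.e  # ~0.3679

    for n in (1, 5, 20, 100):
        # Sum of n i.i.d. Poisson(1) variables is Poisson(n); E[X] = n, and since X is
        # integer-valued, P(X < E[X] + 1) = P(X <= n).
        p_poisson = stats.poisson.cdf(n, n)
        # Sum of n i.i.d. geometric variables with mean 1 (p = 1/2, support 0, 1, 2, ...)
        # is negative binomial with parameters (n, 1/2); again E[X] = n.
        p_negbinom = stats.nbinom.cdf(n, n, 0.5)
        print(f"n = {n:3d}: Poisson {p_poisson:.4f}, NegBin {p_negbinom:.4f}, 1/e = {target:.4f}")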

​

Since log-concavity appears naturally in many combinatorial sequences, it would be interesting to see whether our result has any implications, particularly in graph theory. For example, Alon-Huang-Sudakov used Feige's bound to connect a conjecture of Manickam, Miklós, and Singhi with matchings and fractional covers of hypergraphs. Investigating combinatorial applications remains a possible future direction for this work.

​

​

        

​

Description

​

In this work, I establish information-theoretic inequalities for discrete log-concave random variables. More specifically, it is proven that geometric random variables minimize the discrete min-entropy among all log-concave random variables with fixed variance. Applications include an entropy power inequality for Rényi entropy in the log-concave setting. These results extend and improve recent work of Bobkov et al.
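For reference, the entropies involved are the standard ones (definitions only, nothing specific to the paper): for an integer-valued random variable X with probability mass function p,

\[
H_\infty(X) = -\log\Big(\max_k p(k)\Big),
\qquad
H_\alpha(X) = \frac{1}{1-\alpha}\,\log\Big(\sum_k p(k)^{\alpha}\Big), \quad \alpha \in (0,1)\cup(1,\infty),
\]

and H_\alpha is non-increasing in \alpha, recovering the Shannon entropy as \alpha \to 1 and the min-entropy H_\infty as \alpha \to \infty.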

 

Entropy and moment comparisons have been an important topic of study in information theory. In particular, there has been extensive work on maximizing entropy within certain classes of random variables under a fixed variance. For example, it is known that the Gaussian maximizes the (differential) entropy among all real-valued random variables with fixed variance (a result going back to Boltzmann). It is therefore natural to think about minimizing, instead of maximizing, entropy subject to certain moment conditions. In general, the existence of entropy minimizers is not guaranteed. However, recent work on structured subclasses of probability measures shows that identifying such minimizers is possible, and in these situations log-concavity is a natural assumption. In fact, entropy minimization among log-concave random variables is a well-studied problem in the continuous setting, whereas work in the discrete setting is limited. The motivation for this work comes from the aforementioned paper of Bobkov et al., which uses a majorization technique to establish entropy-variance relations. The improved results in this paper are obtained using a sophisticated localization approach.
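The Gaussian statement referred to above is the classical maximum-entropy fact (recorded here only for context): if X is real-valued with Var(X) = \sigma^2, then

\[
h(X) \ \le\ \tfrac{1}{2}\,\log\big(2\pi e\,\sigma^2\big),
\]

with equality if and only if X is Gaussian, where h denotes differential entropy.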

​

​

  • Concentration inequalities for ultra log-concave distributions, Studia Mathematica, 265, 111–120, 2022. [arXiv, journal]

        (with A. Marsiglietti and J. Melbourne) 

​

​

Description

​

In this work, we study a structured subclass of discrete log-concave probability distributions, namely ultra log-concave (ULC) distributions. The notion of ultra log-concavity arises from the search for a theory of negative dependence, long desired in probability and statistical physics in analogy with the theory of positive dependence. ULC random variables are non-negative integer-valued random variables that are log-concave with respect to the Poisson measure (a definition due to R. Pemantle). Examples include the binomial, sums of independent binomials with arbitrary parameters, Poisson, and hypergeometric distributions.
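Concretely (a standard formulation, not specific to the paper): a random variable X on \{0, 1, 2, \dots\} with probability mass function p is ultra log-concave if the sequence p(k) \big/ \big(e^{-\lambda}\lambda^k/k!\big) is log-concave in k (for some, equivalently any, \lambda > 0), which reduces to

\[
k\,p(k)^2 \ \ge\ (k+1)\,p(k-1)\,p(k+1) \qquad \text{for all } k \ge 1.
\]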

​

The main motivation for studying this class of distributions comes from a result in convex geometry. Here, Lotz-McCoy-Nourdin-Peccati-Tropp studied the so-called intrinsic volumes, which are canonical measures of the content of a convex body in R^n. They showed that the intrinsic volumes of a convex body concentrate sharply (with sub-Gaussian-type tails) around a certain index, called the central intrinsic volume. Recent works have revealed striking implications of this property for high-dimensional integral geometry, uncovering new phase transitions in formulas for random projections, rotation means, and random slicing. It is also known that, for a given convex body, its intrinsic volumes form an ultra log-concave sequence (a result due to P. McMullen).

 

We show that all ultra log-concave distributions exhibit Poisson-type concentration. As a consequence, we generalize and improve the concentration result of LMNPT.
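Schematically (this is only the general shape of such bounds, with unspecified constants; see the paper for the precise statement), Poisson-type concentration for X with mean \mu refers to Bennett/Bernstein-style tail bounds of the form

\[
\mathbb{P}\big(|X - \mu| \ge t\big) \ \le\ 2\,\exp\!\Big(-\,c\,\frac{t^2}{\mu + t}\Big), \qquad t \ge 0,
\]

which behave sub-Gaussianly for t on the order of \mu and sub-exponentially for larger t, just as the Poisson distribution itself does.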


  • Discrete Log-Concave Distributions, Extreme Points, and Applications, Doctoral dissertation, University of Florida, ProQuest Dissertations Publishing, 2023. [Discrete Log-Concave Distribut... - UF Digital Collections (ufl.edu)]


 

List of Talks (including slides/notes):


  • Concentration inequalities for ultra log-concave distributions, Analysis Seminar, University of Florida, Nov. 2021.

​

​

  • On ultra log-concave sequences, Combinatorics Learning Seminar, University of Florida, Oct. 21, 2021.

​

  • Discrete convexity and log-concave distributions in higher dimensions, Analysis Seminar, University of Florida, Feb. 2021. [Notes]

​

  • Investigating convergence of subseries of harmonic series with respect to corresponding gap sequences, Physical Science Awards, Sri Lanka Association for the Advancement of Science (SLAAS), Oct. 2017 *

​

​

        * undergraduate research presentations

 

         

      My PhD dissertation defense slides can be found here.


Invited Schools/Workshops:

