
Haoran Ni

I am currently a 4th-year PhD student at the MathSys CDT. My research interests lie in the mathematics of information, machine learning, deep learning and numerical analysis. My current work takes a numerical perspective on information theory and covers dimensionality reduction, optimal transport, machine learning algorithms, generative models and other deep neural networks.

Education Background

  • The University of Warwick, UK (2020 - 2024)
    PhD in Mathematics for Real-world Systems at the MathSys CDT.
    Supervisor: Dr. Martin Lotz
    Thesis: Numerical Estimation of Information Measures and Learning Generative Models.
  • The University of Warwick, UK (2019 - 2020)
    MSc in Mathematics for Real-World Systems at the MathSys CDT. (Distinction)
  • The University of Bath, UK (2017 - 2018)
    MSc in Modern Applications of Mathematics at the Department of Mathematical Sciences. (Distinction)
  • Central University of Finance and Economics, China (2013 - 2017)
    BSc in Mathematics and Applied Mathematics (Financial Mathematics) at the Department of Statistics and Mathematics.

Group Projects

  • Stochastic Parareal: an application of probabilistic methods to time-parallelisation.

The project focused on improving the rate of convergence (equivalently, the computational efficiency) of Parareal, a time-parallel algorithm that provides speed-up for a broad variety of initial value problems (IVPs), by applying stochastic methods. Certain classes of problems, such as the Brusselator equations and the Lorenz system, were investigated.

The idea of the stochastic methods is to generate, instead of a single deterministic solution on each time interval, M solutions drawn from a probability distribution (referred to as the 'sampling rule'), and to piece together a continuous trajectory that minimises the errors at interval boundaries; a minimal sketch of this idea is given below.

Our experiments showed that, as the number of samples M and the variance of the sampling rule increase, the proposed methods tend to beat deterministic Parareal with high probability. In chaotic systems such as the Lorenz system, our methods also showed the potential to reveal multiple numerical solutions caused by small perturbations.

The project was supervised by Dr. Massimiliano Tamborrino, Dr. Debasmita Samaddar and Dr. Lynton Appel, and supported by UKAEA. (Mar. - Jun. 2020)
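
To make the sampling idea above concrete, here is a minimal Python sketch of a stochastic Parareal iteration under my own simplifying assumptions: a fixed-step RK4 routine plays the role of both coarse and fine solver, the sampling rule is an isotropic Gaussian around the current iterate, and the candidate whose initial value best matches the previous interval's fine endpoint is kept. It illustrates the mechanism only and is not the exact algorithm studied in the project.

import numpy as np

def rk4_step(f, t, y, h):
    # One classical fourth-order Runge-Kutta step.
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def solve(f, t0, y0, t1, n_steps):
    # Fixed-step RK4 integration from t0 to t1 (used as the coarse or fine solver).
    h = (t1 - t0) / n_steps
    y = np.array(y0, dtype=float)
    for i in range(n_steps):
        y = rk4_step(f, t0 + i * h, y, h)
    return y

def stochastic_parareal(f, y0, t_grid, n_coarse=1, n_fine=200, M=10, sigma=1e-2, n_iter=5, seed=0):
    # Illustrative stochastic Parareal: M sampled initial values per time interval.
    rng = np.random.default_rng(seed)
    N = len(t_grid) - 1
    U = np.zeros((N + 1, len(y0)))
    U[0] = y0
    for n in range(N):  # initial coarse sweep
        U[n + 1] = solve(f, t_grid[n], U[n], t_grid[n + 1], n_coarse)
    for _ in range(n_iter):
        # Fine propagation from M + 1 candidate initial values per interval
        # (these runs are independent and would be computed in parallel).
        starts, ends = [], []
        for n in range(N):
            s = np.vstack([U[n], U[n] + sigma * rng.standard_normal((M, len(y0)))])
            e = np.array([solve(f, t_grid[n], si, t_grid[n + 1], n_fine) for si in s])
            starts.append(s)
            ends.append(e)
        # Piece together the trajectory that is most continuous at interval boundaries.
        F = np.zeros((N, len(y0)))
        prev_end = U[0]
        for n in range(N):
            j = np.argmin(np.linalg.norm(starts[n] - prev_end, axis=1))
            F[n] = ends[n][j]
            prev_end = F[n]
        # Standard Parareal predictor-corrector update with a fresh coarse sweep.
        U_new = U.copy()
        for n in range(N):
            G_new = solve(f, t_grid[n], U_new[n], t_grid[n + 1], n_coarse)
            G_old = solve(f, t_grid[n], U[n], t_grid[n + 1], n_coarse)
            U_new[n + 1] = G_new + F[n] - G_old
        U = U_new
    return U

# Example on the Lorenz system, one of the chaotic test problems mentioned above.
def lorenz(t, y, s=10.0, r=28.0, b=8.0 / 3.0):
    return np.array([s * (y[1] - y[0]), y[0] * (r - y[2]) - y[1], y[0] * y[1] - b * y[2]])

trajectory = stochastic_parareal(lorenz, np.array([1.0, 1.0, 1.0]), np.linspace(0.0, 1.0, 11))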

Individual Projects

  • Extensions of Normalizing Flows.

Normalizing flows are a class of generative models in which the probability distribution underlying the observed data is characterised as the pushforward of a simple base distribution, such as a normal distribution, under a series of invertible and easily computable transformations. We analyse the recently introduced concept of augmented normalizing flows and study in detail their relation to variational autoencoders. We further observe that the invertibility assumption of normalizing flows can be overly restrictive for certain tasks. We then propose a generalization, based on the co-area formula of integral geometry, which replaces the Jacobian of the transformation with an integral of a normal Jacobian. We illustrate the potential of these extensions in numerical experiments, which include a reduction in the computational cost of training flows. (A minimal change-of-variables sketch is given below.)
This project is supervised by Dr. Martin Lotz.
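
As a point of reference for the invertibility and Jacobian discussion above, the following is a minimal sketch of the change-of-variables computation that standard (invertible) normalizing flows rely on. The element-wise affine map is my own illustrative choice; the co-area generalization itself is not implemented here.

import numpy as np

def log_standard_normal(z):
    # Log density of the base distribution N(0, I), summed over dimensions.
    return -0.5 * np.sum(z**2 + np.log(2 * np.pi), axis=-1)

class AffineFlow:
    # x = exp(log_scale) * z + shift  (invertible, element-wise).
    def __init__(self, log_scale, shift):
        self.log_scale = np.asarray(log_scale, dtype=float)
        self.shift = np.asarray(shift, dtype=float)

    def inverse(self, x):
        return (x - self.shift) * np.exp(-self.log_scale)

    def log_prob(self, x):
        # log p_x(x) = log p_z(f^{-1}(x)) + log |det J_{f^{-1}}(x)|
        z = self.inverse(x)
        log_det_inv = -np.sum(self.log_scale)
        return log_standard_normal(z) + log_det_inv

flow = AffineFlow(log_scale=[0.5, -0.2], shift=[1.0, 0.0])
x = np.array([[1.5, 0.3], [0.0, -1.0]])
print(flow.log_prob(x))  # exact log-likelihoods via the change of variables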

  • Research paper classification using neural networks.

The project focused on classifying research papers by discipline using NLP techniques such as word embedding algorithms (word2vec & GloVe), convolutional neural networks and recurrent neural networks. Hyper-parameter optimization algorithms such as Bayesian optimization and the Tree-structured Parzen Estimator (TPE) were also implemented; an illustrative model sketch is given below.

Although the training datasets were extremely small, owing to the difficulty of labelling research papers, the final model successfully classified more than 70,000 research papers published by the Chinese Academy of Sciences (CAS), with an average classification accuracy of over 90% on the test datasets.

The project was supported by the Computer Network Information Center, CAS. (Jun. - Sep. 2019)
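
For illustration, the PyTorch sketch below shows the general shape of an embedding-plus-CNN text classifier in the spirit of the model described above. The vocabulary size, sequence length, layer widths and class count are placeholder values (in the project such hyper-parameters were tuned with Bayesian optimization and TPE), and this is not the exact architecture that was used.

import torch
import torch.nn as nn

class PaperClassifier(nn.Module):
    def __init__(self, vocab_size=20_000, embed_dim=128,
                 num_filters=128, kernel_size=5, num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)            # word embeddings
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size)  # n-gram features
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, tokens):                  # tokens: (batch, seq_len) integer ids
        x = self.embed(tokens).transpose(1, 2)  # -> (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))            # -> (batch, num_filters, seq_len')
        x = x.max(dim=2).values                 # global max pooling over the sequence
        return self.fc(x)                       # class logits

model = PaperClassifier()
dummy = torch.randint(0, 20_000, (4, 300))      # a batch of 4 tokenised papers
print(model(dummy).shape)                       # torch.Size([4, 10])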

  • Numerical estimation of information measures.

The project focused on numerically estimating entropy and mutual information using k-th nearest neighbor estimators, and on their applications in related areas. Entropy and mutual information are defined as follows:

H(x) := -\mathbb{E}[\log \mu_x(x)]

I(x, y) := H(x) + H(y) - H(x, y)

For continuous estimators, the KSG, BI-KSG and G-kNN estimators were reproduced; for discrete cases, Gao's estimator and the Multi-KL estimator were reproduced. We also reduced the bias of the G-kNN method (though it does not vanish entirely) and proposed an approximate k-NN method that slightly outperforms the state-of-the-art KSG method in the paper. (A minimal sketch of the underlying k-NN entropy estimator is given below.)

Applications of these methods, such as MIMO channel systems, quadrature amplitude modulation and feature selection, were also discussed.

The project was supervised by Dr. Keith Briggs and supported by BT Wireless Research. (Jun. - Oct. 2018)
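
For concreteness, here is a minimal sketch of the basic Kozachenko-Leonenko k-nearest-neighbour entropy estimator (Euclidean-ball version) that the estimators above build on, together with the plug-in mutual information estimate from the definitions above. It is not the improved estimator proposed in the project.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(x, k=3):
    # Estimate the differential entropy H(x) in nats from samples x of shape (n, d).
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Distance from each sample to its k-th nearest neighbour (excluding itself).
    eps = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # Log volume of the unit d-ball: pi^(d/2) / Gamma(d/2 + 1).
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

def knn_mutual_information(x, y, k=3):
    # I(x, y) = H(x) + H(y) - H(x, y), each term estimated with knn_entropy.
    return knn_entropy(x, k) + knn_entropy(y, k) - knn_entropy(np.hstack([x, y]), k)

# Sanity check: H of a standard 1-D Gaussian is 0.5 * log(2 * pi * e) ~ 1.419 nats.
rng = np.random.default_rng(0)
print(knn_entropy(rng.standard_normal((5000, 1))))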

Speaking

  • Talk, 10th International Congress on Industrial and Applied Mathematics, Tokyo, Japan (Aug. 2023)
    "Normalizing Flows based Mutual Information Estimation"
  • Talk, 4th IMA Conference on the Mathematical Challenges of Big Data, Oxford, UK (Sep. 2022)
    "Normalizing Flows based Mutual Information Estimation"
  • Talk, SIAM Annual Meeting, Pittsburgh, USA (Jul. 2022)
    "Information Measures Estimation and Random Projection"
  • Poster, SIAM UKIE National Student Chapter Conference, Edinburgh, UK (Jun. 2022)
    "Information Measures Estimation and Random Projection"
  • Talk, Mathematics of Data Science (MathODS) Virtual Conference (Jun. 2020)
    "Numerical Estimation of Information Measures"

Teaching/Outreach

  • Senior Graduate Teaching Assistant, University of Warwick (Oct. 2023 - Jun. 2024)
    Lecturer for MA930: Data Analysis and Machine Learning.
  • Senior Graduate Teaching Assistant, University of Warwick (Oct. 2022 - Jun. 2023)
    TA for MA258: Mathematical Analysis III, MA4M9: Mathematics of Neuronal Networks, and MA3K1: Mathematics of Machine Learning, delivering support classes and marking assignments. Invited to run a one-day workshop for MA930: Data Analysis and Machine Learning.

  • Senior Graduate Teaching Assistant, University of Warwick (Oct. 2021 - Jun. 2022)
    TA for MA124: Mathematics by Computer, MA3J4: Mathematical Modeling with PDE, and MA3K1: Mathematics of Machine Learning, delivering support classes and marking assignments.

  • Senior Graduate Teaching Assistant, University of Warwick (Oct. 2020 - Jun. 2021)
    TA for MA124: Mathematics by Computer and MA3K1: Mathematics of Machine Learning, delivering support classes and marking assignments.

  • President of the SIAM student chapter, where we organise the weekly SPAAM seminar, careers events, problem-solving days, and the annual conference.

  • Vice-President of the SIAM student chapter, where we organise the weekly SPAAM seminar, careers events, problem-solving days, and the annual conference.

  • Organising and co-hosting the Statistics, Probability, Analysis and Applied Mathematics (SPAAM) seminar series with Diogo Caetano.

  • Chairman, Qianfan Maths Association, Central University of Finance and Economics, China. (Sep. 2014 - Jul. 2015)
    Organising Maths-related competitions, seminars, and study groups.
  • Team Leader, Math Software Study Group, Central University of Finance and Economics, China. (Sep. - Dec. 2014)

Skills

  • Proficient in Python, Matlab, Julia, Fortran, R and LaTeX.
  • Basic knowledge of C, Lingo and SAS.

Contact details:

Email: Haoran (dot) ni (at) Warwick (dot) ac (dot) uk

Office: D1.04, MathSys CDT, Zeeman Building.