Machine Learning

Machine Learning is becoming a central part of our field. There is a huge curated list with all kinds of ML frameworks and packages.

The most recent advances, with papers and code, can be found on paperswithcode.

Online Courses

Articles

Books

Python Packages

Various trivia

  • An underappreciated measure of central tendency is the trimean (\(TM\))

    \[TM = \frac{Q_1 + 2Q_2 + Q_3}{4},\]

    where \(Q_2\) is the median and \(Q_1\) and \(Q_3\) are the first and third quartiles (a short numpy sketch of the trimean is given after this list).

    “An advantage of the trimean as a measure of the center (of a distribution) is that it combines the median’s emphasis on center values with the midhinge’s attention to the extremes.” — Herbert F. Weisberg, Central Tendency and Variability.

  • It is quite useful to keep the following nomogram in mind (a rough sketch of the prior-to-posterior calculation it illustrates is given after this list)

    P value nomogram
    This is directly connected to

    “Extraordinary claims require extraordinary evidence” – Carl Sagan/Laplace

  • A nice visualization of the famous Ioannidis paper (“Why Most Published Research Findings Are False”) is this R Shiny app.

  • A quite interesting discussion of how the variance of the output function is reduced by adding more parameters to an (ensembled) network, which in turn leads to a lower generalization error (a toy sketch of this variance-reduction effect is given after this list). They also discuss a divergence of the error at \(N^*\) for networks without regularization. The preprint version is on arXiv:1901.01608v3.

    Measured generalization error as a function of the number of parameters (arXiv:1901.01608v3)
  • I find dilated convolutional NNs to be quite an interesting way to increase the receptive field (a minimal PyTorch sketch is given after this list). Ferenc Huszár gives another description in terms of Kronecker factorizations of smaller kernels.

  • Spatial dropout is quite an interesting way to make dropout work better in the presence of spatial correlations: instead of dropping individual activations, it drops entire feature maps (see the short sketch after this list).

  • Jensen’s paper about genetic algorithms (GA) for logP optimization, and also a recent work from Berend Smit’s group, are reminders that we shouldn’t forget good old techniques such as GA.
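
A minimal numpy sketch of the trimean from the item above; the helper name `trimean` is just for illustration.

```python
import numpy as np

def trimean(x):
    """Trimean TM = (Q1 + 2*Q2 + Q3) / 4, where Q2 is the median."""
    q1, q2, q3 = np.percentile(x, [25, 50, 75])
    return (q1 + 2 * q2 + q3) / 4

# On a right-skewed sample the trimean sits between the median and the mean:
# it emphasises the centre while still paying some attention to the tails.
rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=1_000)
print(np.mean(sample), np.median(sample), trimean(sample))
```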
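
A rough sketch of the prior-to-posterior reasoning behind the p-value nomogram above. I assume here the Sellke–Bayarri–Berger bound \(-e\,p\ln p\) as the most optimistic Bayes factor a p-value can supply (valid for \(p < 1/e\)); the nomogram itself may use a different calibration, so this is only an illustration of why low-prior (“extraordinary”) claims stay improbable even after a “significant” result.

```python
import math

def posterior_prob(prior_prob, p_value):
    """Posterior probability of a real effect, taking -e*p*ln(p)
    (Sellke-Bayarri-Berger bound, p < 1/e) as the most optimistic
    Bayes factor the observed p-value can provide."""
    bf_against_null = 1.0 / (-math.e * p_value * math.log(p_value))
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * bf_against_null
    return posterior_odds / (1.0 + posterior_odds)

# An extraordinary claim (1% prior) with p = 0.05 is still unlikely ...
print(posterior_prob(0.01, 0.05))   # ~0.02
# ... while a plausible claim (50% prior) with the same p-value becomes likely.
print(posterior_prob(0.50, 0.05))   # ~0.71
```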
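
A toy illustration of the variance-reduction point in the ensembling item above, with random-feature ridge regressors standing in for the networks of the paper (this setup is my own assumption, not theirs): averaging independently fitted models shrinks the variance of the learned function, here roughly by the ensemble size.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    return np.sin(3 * x)

def fit_random_features(x_train, y_train, n_features=50, ridge=1e-3):
    """One 'model': fixed random ReLU features plus a ridge-regressed readout."""
    w = rng.normal(size=n_features)
    b = rng.uniform(-1.0, 1.0, size=n_features)
    phi = np.maximum(0.0, np.outer(x_train, w) + b)
    coef = np.linalg.solve(phi.T @ phi + ridge * np.eye(n_features), phi.T @ y_train)
    return lambda x: np.maximum(0.0, np.outer(x, w) + b) @ coef

x_train = rng.uniform(-1.0, 1.0, size=30)
y_train = target(x_train) + 0.1 * rng.normal(size=30)
x_test = np.linspace(-1.0, 1.0, 100)

# Fit 200 independent models, then compare the prediction variance of single
# models with that of averages over groups of 10 models.
preds = np.array([fit_random_features(x_train, y_train)(x_test) for _ in range(200)])
ensembles = preds.reshape(20, 10, -1).mean(axis=1)
print("single-model variance:    ", preds.var(axis=0).mean())
print("10-model ensemble variance:", ensembles.var(axis=0).mean())
```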
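
A minimal PyTorch sketch for the dilated-convolution item above: stacking 3×3 convolutions with dilations 1, 2 and 4 grows the receptive field to 15×15 while each layer keeps the parameter count of an ordinary 3×3 kernel (the channel sizes here are arbitrary).

```python
import torch
import torch.nn as nn

# Three 3x3 convolutions with dilations 1, 2, 4; padding equals the dilation
# so the spatial size is preserved. Receptive field grows 3 -> 7 -> 15 pixels.
dilated_stack = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, dilation=1, padding=1),
    nn.ReLU(),
    nn.Conv2d(8, 8, kernel_size=3, dilation=2, padding=2),
    nn.ReLU(),
    nn.Conv2d(8, 8, kernel_size=3, dilation=4, padding=4),
)

x = torch.randn(1, 1, 64, 64)
print(dilated_stack(x).shape)   # torch.Size([1, 8, 64, 64])
```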
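
A short PyTorch sketch for the spatial-dropout item above: nn.Dropout2d zeroes whole feature maps instead of individual activations, so neighbouring, strongly correlated values are dropped together.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.ones(1, 4, 3, 3)          # one sample with 4 feature maps of size 3x3

spatial_drop = nn.Dropout2d(p=0.5)  # spatial dropout: drops entire channels
spatial_drop.train()                # dropout is only active in training mode
y = spatial_drop(x)

# Each channel is either all zeros or uniformly scaled by 1/(1-p) = 2.
print(y[0].flatten(start_dim=1))
```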