I just finished teaching our graduate-level Bayesian statistics course. If you know me, you know that (a) this is one of my favourite subjects, (b) I have a lot of opinions about it, and (c) I am in the 1st percentile for the personality trait _agreeableness_. As a result, I find it difficult to teach this subject using only other people's material, and feel compelled to write my own.
So I've been writing the tutorials and assignments myself, trying to craft questions and solutions that produce insights you might not get from other courses or books on the same subject. The lectures do include other people's material, so I probably won't post those. This is a `.tar.gz` file containing said tutorials, assignments, and solutions. It is now complete and contains everything except the exam.
Please let me know if you find any mistakes and I'll do my best to fix them. I already know there's no Tutorial 2, so don't report that — we skipped it due to the disrupted semester.
Some of my followers have requested that I post more video content. Therefore, I decided to upload this PDF document. ;-) I promise I'll make more videos eventually!
This is a document intended to teach Bayesian statisticians how to use my software package *DNest4*. I explain the usage with a simple example that will be familiar to practicing statisticians. If the response is good, I might also make a longer, more detailed version, and sell it (for a low price, don't stress).
I realise this won't be of much interest to a decent chunk of my LBRY following. I apologise for any inconvenience caused.
A presentation in which I discuss the cause of overfitting in statistics, which is the use of optimisation methods such as maximum likelihood for estimating the parameters, when ideally we should use the posterior distribution.
I give an example of fitting a dataset with a complex model, and show how the maximum likelihood estimate is very atypical of the posterior distribution.
There’s one thing I wasn’t completely clear about towards the end of the talk, in the bit with the red and green bars where I discuss trans-dimensional models. The green parts are meant to represent the regions of parameter space that fit the data. The regions that overfit the data will be a tiny subset of the green bars, even in the complex model on the right hand side of the slides. Even if you conditioned on the model all the way on the right, you wouldn’t get overfitting unless you optimised within that model.
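To make that point concrete, here's a minimal sketch (mine, not taken from the talk; the model, priors, and numbers are all assumptions for illustration). With a Gaussian likelihood and a Gaussian prior on the coefficients of an over-parameterised polynomial, the posterior is available in closed form, so we can directly compare how tightly the maximum likelihood estimate fits the data versus a typical posterior draw:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a straight line plus noise, n = 20 points.
n, sigma = 20, 1.0
x = np.linspace(-1.0, 1.0, n)
y = 0.5 * x + rng.normal(0.0, sigma, n)

# Deliberately complex model: degree-15 polynomial (16 parameters).
degree = 15
X = np.vander(x, degree + 1, increasing=True)

# Maximum likelihood = least squares under the Gaussian likelihood.
beta_ml, *_ = np.linalg.lstsq(X, y, rcond=None)
rss_ml = np.sum((y - X @ beta_ml) ** 2)

# Conjugate Bayesian linear regression with a N(0, tau^2 I) prior,
# so the posterior over coefficients is Gaussian in closed form.
tau = 1.0
A = X.T @ X / sigma**2 + np.eye(degree + 1) / tau**2
cov = np.linalg.inv(A)
cov = (cov + cov.T) / 2          # enforce exact symmetry numerically
mean = cov @ X.T @ y / sigma**2

# Residual sum of squares for a few typical posterior draws.
draws = rng.multivariate_normal(mean, cov, size=5)
rss_post = [np.sum((y - X @ b) ** 2) for b in draws]

print(f"maximum likelihood RSS: {rss_ml:.3f}")
print(f"typical posterior-draw RSS: {np.mean(rss_post):.3f}")
```

By construction the maximum likelihood estimate minimises the residual sum of squares, so every posterior draw fits the data more loosely; the point is that the optimised fit is an extreme, atypical corner of the posterior, not a representative member of it.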
Links to the things I referred to about foundations of probability:
http://aapt.scitation.org/doi/pdf/10.1119/1.1990764
https://www.amazon.com/Probability-Theory-Science-T-Jaynes/dp/0521592712/ref=sr_1_1?ie=UTF8&qid=1509311627&sr=8-1&keywords=probability+theory+logic+of+science
http://www.mdpi.com/2075-1680/1/1/38
Earlier this year I wrote an assignment question about proof-of-work for my graduate class. I thought some folks here might enjoy it or learn something from it, so here it is...
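For anyone who hasn't met proof-of-work before, here's a sketch of the standard hash-based version (the general idea, not necessarily the exact form used in my assignment question): keep incrementing a nonce until the hash of message-plus-nonce satisfies a difficulty condition, so producing a valid nonce is expensive but verifying one is cheap.

```python
import hashlib

def proof_of_work(message: bytes, difficulty: int) -> int:
    """Find a nonce such that SHA-256(message + nonce) begins with
    `difficulty` zero hex digits. Expected work grows like 16**difficulty,
    but verification is a single hash."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(message + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = proof_of_work(b"hello", 4)
digest = hashlib.sha256(b"hello" + str(nonce).encode()).hexdigest()
print(nonce, digest)
```

The asymmetry is the whole point: anyone can check the answer with one hash call, but finding it requires (on average) tens of thousands of attempts at this difficulty, and 16 times more for each extra leading zero.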
This publication (_publish isn't a noun, guys_) is just a markdown file with a few words about my new LBRY wallet server, which has the "variable decay" trending algorithm running on it. It gives the IP address of the server in case you want to connect to it, and I'm charging a modest fee so that the wallet server doesn't get too many users. I can't guarantee that I'll maintain the server long term, but you can expect it to be up for at least the next couple of months.
This server might be of interest to you if:
(a) you use the LBRY desktop app (i.e., real LBRY ;));
(b) you're located near Sydney, Australia, where the server is physically located; and
(c) you want a more interesting (hopefully) trending list.
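I haven't spelled out the "variable decay" algorithm here, so to be clear: the snippet below is NOT it. It's just a generic illustration of what decay-based trending means in general, with a made-up function name, half-life, and input format, so you have a mental model for why recent support matters more than old support.

```python
def trending_score(deltas, half_life=7.0):
    """Generic decayed trending score (hypothetical, for illustration only;
    not the LBRYnomics 'variable decay' algorithm). Each support change is
    weighted by 0.5 ** (age / half_life), so a change half_life days old
    counts for half as much as one made right now.

    deltas: iterable of (age_in_days, lbc_change) pairs for one channel.
    """
    return sum(change * 0.5 ** (age / half_life) for age, change in deltas)

# A 10 LBC boost today outranks the same boost made a week ago.
print(trending_score([(0.0, 10.0)]), trending_score([(7.0, 10.0)]))
```

Any decay-weighted scheme like this trades off freshness against size: a short half-life makes the list churn quickly, a long one rewards sustained support.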
Over the last few days I've rewritten the backend of LBRYnomics and that's made it easier to produce new things. So here are some early treats for y'all, including the top *200* channels as of right now.
Thumbnail photo credit: Sindre Strøm from Pexels.