Contents: The Problem with Dummies · The Index Variable Approach · More Categories · Even More Categories · Notes of Caution

These are code snippets and notes for the fifth chapter, The Many Variables & The Spurious Waffles, section 3, of the book Statistical Rethinking (version 2) by Richard McElreath. In this section, we go through different ways to add categorical variables to our models.

The Problem with Dummies
In the simplest case we only have two categories, e. »
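A minimal NumPy sketch of the contrast the post draws, with invented data and numbers (this is not the book's R/rethinking code): a two-level category can be coded as a 0/1 dummy with a difference parameter, or as an index that picks one intercept per category.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: a two-level category (e.g. sex), coded two ways.
sex_dummy = rng.integers(0, 2, size=10)   # dummy coding: 0 or 1
sex_index = sex_dummy + 1                 # index coding: 1 or 2

# Dummy approach: mu_i = a + b * dummy_i
# (category 0 gets mean a, category 1 gets mean a + b)
a, b = 170.0, -8.0
mu_dummy = a + b * sex_dummy

# Index approach: one intercept per category, mu_i = a_cat[index_i]
a_cat = np.array([170.0, 162.0])          # a_cat[0] for index 1, a_cat[1] for index 2
mu_index = a_cat[sex_index - 1]

# Both codings describe the same category means here, but the index form
# treats categories symmetrically and extends naturally to many levels.
print(np.allclose(mu_dummy, mu_index))    # True
```

With more than two levels, the dummy approach needs one 0/1 column per extra category, while the index approach only needs a longer vector of intercepts.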

Contents: Hidden Influence in Milk · The Causal Reasoning behind it · Markov Equivalence · Simulating a Masking Ball

These are code snippets and notes for the fifth chapter, The Many Variables & The Spurious Waffles, section 2, of the book Statistical Rethinking (version 2) by Richard McElreath.

Hidden Influence in Milk
In the previous section about spurious associations, we used multiple regression to eliminate variables that seemed influential in bivariate comparisons but whose association vanishes once more variables are introduced to the regression. »
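A small NumPy simulation of the masking pattern this post is about, with invented variable names standing in for the milk example (a sketch, not the book's code): two predictors are positively correlated with each other but push the outcome in opposite directions, so each looks weak on its own.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000

# M and N are correlated; K receives +1*N and -1*M, so the
# two causal paths from M to K cancel in a bivariate view.
M = rng.normal(size=n)
N = rng.normal(loc=M)          # N positively tied to M
K = rng.normal(loc=N - M)      # opposite effects on the outcome

# Bivariate slope of K on M alone is ~0: the effect is masked.
slope_M = np.polyfit(M, K, 1)[0]

# A multiple regression on both predictors unmasks the two effects.
X = np.column_stack([np.ones(n), N, M])
beta, *_ = np.linalg.lstsq(X, K, rcond=None)
# beta[1] ≈ +1 (effect of N), beta[2] ≈ -1 (effect of M)
```

This is the same logic as in the milk data: neither neocortex size nor body mass predicts milk energy well alone, but together both associations appear.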

Contents: Spurious Waffles (and Marriages) · Yo, DAG · Does this DAG fit? · More than one predictor: Multiple regression · In Matrix Notation · Simulating some Divorces · How do we plot these? · Predictor residual plots · Posterior prediction plots · Counterfactual plot · Simulating spurious associations · Simulating counterfactuals

These are code snippets and notes for the fifth chapter, The Many Variables & The Spurious Waffles, section 1, of the book Statistical Rethinking (version 2) by Richard McElreath. »
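A compact NumPy sketch of the spurious-association simulation this post covers, with invented variable names mirroring the divorce example's structure (not the book's R code): one variable drives both the outcome and a second predictor, so the second predictor correlates with the outcome without causing it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# x_real drives both y and x_spur; x_spur has no effect on y itself.
x_real = rng.normal(size=n)
x_spur = rng.normal(loc=x_real)
y = rng.normal(loc=x_real)

# x_spur correlates with y through their common cause...
r_spur = np.corrcoef(x_spur, y)[0, 1]

# ...but in a multiple regression its coefficient collapses toward zero.
X = np.column_stack([np.ones(n), x_real, x_spur])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[1] ≈ 1 (real effect), beta[2] ≈ 0 (spurious)
```

Conditioning on the common cause is exactly what makes the waffle-house/divorce correlation evaporate in the chapter.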

Contents: Polynomial Regression · Splines

These are code snippets and notes for the fourth chapter, Geocentric Models, section 5, of the book Statistical Rethinking (version 2) by Richard McElreath.

Polynomial Regression
Standard linear models that fit a straight line to the data are appealingly simple, but a straight line is also very restrictive: most data does not fall on a straight line. We can use polynomial regression to extend the linear model. »
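A minimal NumPy sketch of the extension the post describes, on invented curved data (not the book's height/weight example): add a squared term to the linear model. The model stays linear in its parameters, so ordinary least squares still fits it.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented curved data; a straight line would fit this poorly.
x = np.linspace(-2, 2, 50)
y = 1.0 + 0.5 * x - 0.8 * x**2 + rng.normal(scale=0.1, size=x.size)

# Standardize the predictor (as the book recommends for polynomials),
# then fit mu_i = a + b1 * xs_i + b2 * xs_i^2.
xs = (x - x.mean()) / x.std()
X = np.column_stack([np.ones_like(xs), xs, xs**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
mu = X @ coef

# The quadratic fit tracks the curvature a straight line cannot.
rmse = np.sqrt(np.mean((y - mu) ** 2))
```

Standardizing before squaring keeps the columns of the design matrix on comparable scales, which makes priors easier to choose and the fit numerically better behaved.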

Contents: Easy · Medium · Hard

These are my solutions to the practice questions of chapter 4, Linear Models, of the book Statistical Rethinking (version 2) by Richard McElreath.

Easy.
4E1. In the model definition below, which line is the likelihood?
\begin{align*}
y_i &\sim \text{Normal}(\mu, \sigma) & & \text{This is the likelihood}\\
\mu &\sim \text{Normal}(0, 10) \\
\sigma &\sim \text{Exponential}(1)
\end{align*}
4E2. In the model definition just above, how many parameters are in the posterior distribution? »