
Foundations of Linear and Generalized Linear Models by Alan Agresti

By Alan Agresti

A valuable overview of the most important principles and results in statistical modeling

Written by a highly experienced author, Foundations of Linear and Generalized Linear Models is a clear and comprehensive guide to the key concepts and results of linear statistical models. The book presents a broad, in-depth overview of the most commonly used statistical models by discussing the theory underlying the models, R software applications, and examples with crafted models to elucidate key ideas and promote practical model building.

The book begins by illustrating the fundamentals of linear models, such as how the model fit projects the data onto a model vector subspace and how orthogonal decompositions of the data yield information about the effects of explanatory variables. It then covers the most popular generalized linear models, which include binomial and multinomial logistic regression for categorical data, and Poisson and negative binomial loglinear models for count data.
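To make the kinds of models mentioned above concrete, here is a minimal R sketch fitting a logistic, a Poisson, and a negative binomial model with the base glm() function and MASS::glm.nb(). The data frame dat and its variables are simulated stand-ins, not examples from the book.

```r
# Minimal sketch (simulated data, not from the book): the kinds of GLMs described above.
library(MASS)  # for glm.nb()

set.seed(1)
dat <- data.frame(
  x     = rnorm(100),                            # a numeric predictor
  y_bin = rbinom(100, size = 1, prob = 0.5),     # binary response
  y_cnt = rnbinom(100, mu = 3, size = 1.5)       # overdispersed count response
)

# Binomial (logistic) regression for a binary response
fit_logit <- glm(y_bin ~ x, family = binomial(link = "logit"), data = dat)

# Poisson loglinear model for a count response
fit_pois <- glm(y_cnt ~ x, family = poisson(link = "log"), data = dat)

# Negative binomial loglinear model, which accommodates overdispersion
fit_nb <- glm.nb(y_cnt ~ x, data = dat)

summary(fit_nb)
```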

Focusing on the theoretical underpinnings of these models, Foundations of Linear and Generalized Linear Models also features:
• An introduction to quasi-likelihood methods that require weaker distributional assumptions, such as generalized estimating equation (GEE) methods (see the R sketch below)
• An overview of linear mixed models and generalized linear mixed models with random effects for clustered correlated data, Bayesian modeling, and extensions to handle complicated cases such as high-dimensional problems
• Numerous examples that use R software for all text data analyses
• More than 400 exercises for readers to practice and extend the theory, methods, and data analysis
• A supplementary website with datasets for the examples and exercises

An invaluable textbook for upper-undergraduate and graduate-level students in statistics and biostatistics courses, Foundations of Linear and Generalized Linear Models is also an excellent reference for practicing statisticians and biostatisticians, as well as anyone who is interested in learning about the most important statistical models for analyzing data.
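To illustrate the quasi-likelihood (GEE) and mixed-model bullets above, the following is a minimal R sketch using the geepack and lme4 packages; the clustered data frame clust_dat and its variables are hypothetical, and these packages are common choices rather than the book's own code.

```r
# Minimal sketch (hypothetical clustered data) for the GEE and mixed-model bullets above.
library(geepack)  # geeglm() for generalized estimating equations
library(lme4)     # glmer() for generalized linear mixed models

set.seed(2)
clust_dat <- data.frame(cluster = rep(1:30, each = 4), x = rnorm(120))
re <- rnorm(30, sd = 0.8)                          # cluster-level random effects
clust_dat$y <- rbinom(120, size = 1,
                      prob = plogis(0.3 * clust_dat$x + re[clust_dat$cluster]))

# GEE: marginal model with an exchangeable working correlation
fit_gee <- geeglm(y ~ x, id = cluster, data = clust_dat,
                  family = binomial, corstr = "exchangeable")

# GLMM: conditional model with a random intercept for each cluster
fit_glmm <- glmer(y ~ x + (1 | cluster), data = clust_dat, family = binomial)

summary(fit_gee)
summary(fit_glmm)
```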



Best probability books

Symmetry and its Discontents: Essays on the History of Inductive Probability

This volume brings together a collection of essays on the history and philosophy of probability and statistics by an eminent scholar in these subjects.

Written over the past fifteen years, they fall into three broad categories. The first deals with the use of symmetry arguments in inductive probability, in particular their use in deriving rules of succession.

The second group deals with three remarkable individuals who made lasting contributions to probability and statistics in very different ways. The last group of essays deals with the problem of "predicting the unpredictable."

Quality Control and Reliability, Volume 7

Hardbound. This volume covers an area of statistics dealing with complex problems in the production of goods and services, maintenance and repair, and management and operations. The opening chapter is by W. Edwards Deming, a pioneer in statistical quality control, who was involved in the quality control movement in Japan and helped the country in its rapid industrial development.

Seminaire de Probabilites X Universite de Strasbourg

This volume contains two parts: first, the talks given at the Strasbourg probability seminar during the 1974-75 academic year, on a wide variety of topics. We thank the speakers who kindly entrusted their texts to us, many of them presenting new results that will not be published elsewhere.

Probability for Statisticians

Probability for Statisticians is intended as a text for a one-year graduate course aimed especially at students in statistics. The choice of examples illustrates this intention clearly. The material to be presented in the classroom constitutes a bit more than half the text, and the choices the author makes at the University of Washington in Seattle are spelled out.

Extra resources for Foundations of Linear and Generalized Linear Models

Example text

To be consistent with GLM formulas, we will usually express linear models in terms of E(y). The first section of the chapter introduces the least squares method for fitting linear models. The second shows that the least squares model fit μ̂ is a projection of the data y onto the model space C(X) generated by the columns of the model matrix. The third illustrates this for a few simple linear models.
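To make the projection idea concrete, here is a minimal R sketch (with simulated data, not an example from the book) that builds the projection (hat) matrix X(XᵀX)⁻¹Xᵀ explicitly and checks that applying it to y reproduces the fitted values from lm().

```r
# Minimal sketch (simulated data): the least squares fit as a projection onto C(X).
set.seed(3)
n <- 20
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)

X <- cbind(1, x)                         # model matrix with an intercept column
P <- X %*% solve(t(X) %*% X) %*% t(X)    # projection (hat) matrix onto C(X)

mu_hat <- as.vector(P %*% y)             # projection of y onto the model space

# Same fitted values from R's built-in least squares routine
all.equal(mu_hat, unname(fitted(lm(y ~ x))))   # TRUE
```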

(c) the model containing both weight and color predictors.

Littell et al. (2000) described a pharmaceutical clinical trial in which 24 patients were randomly assigned to each of three treatment groups (drug A, drug B, placebo) and compared on a measure of respiratory ability (FEV1 = forced expiratory volume in 1 second, in liters). Here, we let y be the response after 1 hour of treatment (variable fev1 in the data file), x1 = the baseline measurement prior to administering the drug (variable base in the data file), and x2 = drug (qualitative with labels a, b, p in the data file).
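A minimal R sketch of how the model just described might be specified, assuming a hypothetical data frame fev with columns named fev1, base, and drug as above; the simulated values below are stand-ins for the trial data, not the book's dataset or code.

```r
# Minimal sketch: simulated stand-in for the trial data (hypothetical data frame 'fev').
set.seed(4)
fev <- data.frame(
  base = rnorm(72, mean = 2.5, sd = 0.5),            # baseline FEV1 (liters)
  drug = factor(rep(c("a", "b", "p"), each = 24))    # drug A, drug B, placebo
)
fev$fev1 <- 0.5 + 0.9 * fev$base +
  c(a = 0.3, b = 0.2, p = 0)[as.character(fev$drug)] + rnorm(72, sd = 0.3)

# Linear model with a quantitative baseline covariate and a qualitative drug factor
fit <- lm(fev1 ~ base + drug, data = fev)
summary(fit)
```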

Data projection gives unique least squares fit: For each y ∈ Rⁿ and its projection PX y = μ̂ onto the model space C(X) for a linear model μ = Xβ, ‖y − PX y‖ ≤ ‖y − z‖ for all z ∈ C(X), with equality if and only if z = PX y. To show why this is true, for an arbitrary z ∈ C(X) we express y − z = (y − PX y) + (PX y − z). Now (y − PX y) = (I − PX)y is in C(X)⟂ = N(Xᵀ), whereas (PX y − z) is in C(X) because each component is in C(X). Since the subspaces C(X) and C(X)⟂ are orthogonal complements, ‖y − z‖² = ‖y − PX y‖² + ‖PX y − z‖², because uᵀv = 0 for any u ∈ C(X) and v ∈ C(X)⟂.
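As a numerical sanity check of this Pythagorean decomposition, here is a small R sketch (simulated data, with an arbitrarily chosen z in C(X)) verifying that ‖y − z‖² = ‖y − PX y‖² + ‖PX y − z‖².

```r
# Minimal sketch: numerically verifying the orthogonal decomposition above.
set.seed(5)
n <- 15
X <- cbind(1, rnorm(n))                   # model matrix (intercept + one predictor)
y <- rnorm(n)

P  <- X %*% solve(t(X) %*% X) %*% t(X)    # projection matrix onto C(X)
Py <- P %*% y                             # projection of y, i.e. the fitted values

z  <- X %*% c(0.7, -1.2)                  # an arbitrary element of C(X)

lhs <- sum((y - z)^2)                     # ||y - z||^2
rhs <- sum((y - Py)^2) + sum((Py - z)^2)  # ||y - PXy||^2 + ||PXy - z||^2
all.equal(lhs, rhs)                       # TRUE, up to floating point error
```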

