
Subject CS1
Actuarial Statistics
Core Principles

Syllabus (For exemptions via non-accredited courses only)

for the 2020 exams

20 June 2019
Exemption application summary (to be completed by the applicant)

IFoA Subject | University Module Code and Name      | Year Module Completed | Grade/Mark Achieved
CS1          | STAT101: Probability and Statistics  | 2017                  | 73%
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1          |                                      |                       |
CS1 – ACTUARIAL STATISTICS

CS1 – Actuarial Statistics Syllabus | University Module Codes only | University page Number

1 Random variables and distributions (20%)

1.1 Define basic univariate distributions and use them to calculate probabilities, quantiles and moments.

1.1.1 Define and explain the key characteristics of the discrete distributions: geometric, binomial, negative binomial, hypergeometric, Poisson and uniform on a finite set.

1.1.2 Define and explain the key characteristics of the continuous distributions: normal, lognormal, exponential, gamma, chi-square, t, F, beta and uniform on an interval.

1.1.3 Evaluate probabilities and quantiles associated with distributions (by calculation or using statistical software as appropriate).

1.1.4 Define and explain the key characteristics of the Poisson process and explain the connection between the Poisson process and the Poisson distribution.

1.1.5 Generate basic discrete and continuous random variables using the inverse transform method.

1.1.6 Generate discrete and continuous random variables using statistical software.
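
A minimal sketch of the inverse transform method in 1.1.5, assuming an exponential target distribution; the function name and the rate parameter below are illustrative only.

```python
import numpy as np

def inverse_transform_exponential(rate, size, rng=None):
    """Sample Exponential(rate) variates via the inverse transform method.

    If U ~ Uniform(0, 1), then X = -ln(1 - U) / rate has CDF
    F(x) = 1 - exp(-rate * x), because X is obtained by inverting F.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=size)          # uniform draws on [0, 1)
    return -np.log(1.0 - u) / rate      # apply the inverse CDF F^{-1}(u)

# Example: 10,000 draws with rate 0.5; the sample mean should be close to 1/rate = 2.
samples = inverse_transform_exponential(rate=0.5, size=10_000)
print(samples.mean())
```

The same idea covers the discrete case in 1.1.5: draw a uniform variate and return the smallest value whose cumulative probability exceeds it.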

1.2 Independence, joint and conditional distributions, linear combinations of random variables

1.2.1 Explain what is meant by jointly distributed random variables, marginal distributions and conditional distributions.

1.2.2 Define the probability function/density function of a marginal distribution and of a conditional distribution.

1.2.3 Specify the conditions under which random variables are independent.

1.2.4 Define the expected value of a function of two jointly distributed random variables, the covariance and correlation coefficient between two variables, and calculate such quantities.

1.2.5 Define the probability function/density function of the sum of two independent random variables as the convolution of two functions.

1.2.6 Derive the mean and variance of linear combinations of random variables.

1.2.7 Use generating functions to establish the distribution of linear combinations of independent random variables.
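
For reference, the standard identities behind 1.2.6 and 1.2.7, for constants a and b:

```latex
E[aX + bY] = a\,E[X] + b\,E[Y]

\operatorname{Var}(aX + bY) = a^2 \operatorname{Var}(X) + b^2 \operatorname{Var}(Y) + 2ab\,\operatorname{Cov}(X, Y)

M_{aX + bY}(t) = M_X(at)\,M_Y(bt) \quad \text{when } X \text{ and } Y \text{ are independent}
```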

1.3 Expectations, conditional expectations

1.3.1 Define the conditional expectation of one random variable given the value of another random variable, and calculate such a quantity.

1.3.2 Show how the mean and variance of a random variable can be obtained from expected values of conditional expected values, and apply this.
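
The relationships referred to in 1.3.2 are the tower property and the conditional variance decomposition:

```latex
E[X] = E\big[E[X \mid Y]\big]

\operatorname{Var}(X) = E\big[\operatorname{Var}(X \mid Y)\big] + \operatorname{Var}\big(E[X \mid Y]\big)
```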

1.4 Generating functions

1.4.1 Define and determine the moment generating function of random variables.

1.4.2 Define and determine the cumulant generating function of random variables.

1.4.3 Use generating functions to determine the moments and cumulants of random variables, by expansion as a series or by differentiation, as appropriate.

1.4.4 Identify the applications for which a moment generating function, a cumulant generating function and cumulants are used, and the reasons why they are used.
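
For reference, the definitions and key relationships underlying 1.4.1–1.4.3:

```latex
M_X(t) = E\!\left[e^{tX}\right], \qquad C_X(t) = \log M_X(t)

E[X^r] = \left.\frac{d^r}{dt^r} M_X(t)\right|_{t=0}, \qquad
\kappa_1 = C_X'(0) = E[X], \qquad \kappa_2 = C_X''(0) = \operatorname{Var}(X)
```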

1.5 Central Limit Theorem – statement and application

1.5.1 State the Central Limit Theorem for a sequence of independent, identically distributed random variables.

1.5.2 Generate simulated samples from a given distribution and compare the sampling distribution with the Normal.
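
A small simulation in the spirit of 1.5.2, assuming an exponential parent distribution; the sample size and number of replications are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
n, n_reps = 40, 5_000                  # sample size and number of simulated samples

# Means of repeated Exponential(mean 1) samples; by the CLT these are
# approximately N(1, 1/n) since this parent has mean 1 and variance 1.
sample_means = rng.exponential(scale=1.0, size=(n_reps, n)).mean(axis=1)

# Standardise and compare with the standard normal distribution.
z = (sample_means - 1.0) * np.sqrt(n)
print(stats.kstest(z, "norm"))         # Kolmogorov-Smirnov comparison against N(0, 1)
```

A histogram or Q-Q plot of `sample_means` against the normal curve is the usual graphical check.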

2 Data analysis (15%)

2.1 Data analysis

2.1.1 Describe the possible aims of a data analysis (e.g. descriptive, inferential, and predictive).

2.1.2 Describe the stages of conducting a data analysis to solve real-world problems in a scientific manner and describe tools suitable for each stage.

2.1.3 Describe sources of data and explain the characteristics of different data sources, including extremely large data sets.

2.1.4 Explain the meaning and value of reproducible research and describe the elements required to ensure a data analysis is reproducible.
2.2 Exploratory data analysis

2.2.1 Describe the purpose of exploratory data analysis.

2.2.2 Use appropriate tools to calculate suitable summary statistics and undertake exploratory data visualizations.

2.2.3 Define and calculate Pearson’s, Spearman’s and Kendall’s measures of correlation for bivariate data, explain their interpretation and perform statistical inference as appropriate.

2.2.4 Use Principal Components Analysis to reduce the dimensionality of a complex data set.
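
A brief sketch of the correlation measures in 2.2.3, assuming a small bivariate data set held in two NumPy arrays; the data values are illustrative.

```python
import numpy as np
from scipy import stats

# Illustrative bivariate data.
x = np.array([1.2, 2.4, 3.1, 4.8, 5.0, 6.3, 7.7])
y = np.array([2.1, 2.9, 3.7, 5.2, 4.9, 6.8, 8.1])

# Each call returns the correlation coefficient and a p-value for the null
# hypothesis of no association, covering the inference part of 2.2.3.
print(stats.pearsonr(x, y))    # linear correlation
print(stats.spearmanr(x, y))   # rank correlation (Spearman's rho)
print(stats.kendalltau(x, y))  # rank correlation (Kendall's tau)
```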

2.3 Random sampling and sampling distributions

2.3.1 Explain what is meant by a sample, a population and statistical inference.

2.3.2 Define a random sample from a distribution of a random variable.

2.3.3 Explain what is meant by a statistic and its sampling distribution.

2.3.4 Determine the mean and variance of a sample mean and the mean of a sample variance in terms of the population mean, variance and sample size.

2.3.5 State and use the basic sampling distributions for the sample mean and the sample variance for random samples from a normal distribution.

2.3.6 State and use the distribution of the t-statistic for random samples from a normal distribution.

2.3.7 State and use the F distribution for the ratio of two sample variances from independent samples taken from normal distributions.
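
For reference, the standard results behind 2.3.4–2.3.7, for a random sample X_1, …, X_n from N(μ, σ²) with sample mean X̄ and sample variance S²:

```latex
\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right), \qquad
\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}, \qquad
\frac{\bar{X} - \mu}{S/\sqrt{n}} \sim t_{n-1}, \qquad
\frac{S_1^2/\sigma_1^2}{S_2^2/\sigma_2^2} \sim F_{n_1-1,\, n_2-1}
```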

3 Statistical inference (20%)

3.1 Estimation and estimators

3.1.1 Describe and apply the method of moments for constructing estimators of population parameters.

3.1.2 Describe and apply the method of maximum likelihood for constructing estimators of population parameters.

3.1.3 Define the terms: efficiency, bias, consistency and mean squared error.

3.1.4 Define and apply the property of unbiasedness of an estimator.

3.1.5 Define the mean square error of an estimator, and use it to compare estimators.

3.1.6 Describe and apply the asymptotic distribution of maximum likelihood estimators.

3.1.7 Use the bootstrap method to estimate properties of an estimator.
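
A minimal non-parametric bootstrap in the spirit of 3.1.7, estimating the standard error of the sample median; the data-generating step and replication count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
data = rng.gamma(shape=2.0, scale=3.0, size=100)   # illustrative observed sample
n_boot = 2_000

# Resample the data with replacement and recompute the estimator each time;
# the spread of the bootstrap replicates estimates the estimator's variability.
boot_medians = np.array([
    np.median(rng.choice(data, size=data.size, replace=True))
    for _ in range(n_boot)
])

print("estimated standard error of the median:", boot_medians.std(ddof=1))
```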

3.2 Confidence intervals

3.2.1 Define in general terms a confidence interval for an unknown parameter of a distribution based on a random sample.

3.2.2 Derive a confidence interval for an unknown parameter using a given sampling distribution.

3.2.3 Calculate confidence intervals for the mean and the variance of a normal distribution.

3.2.4 Calculate confidence intervals for a binomial probability and a Poisson mean, including the use of the normal approximation in both cases.

3.2.5 Calculate confidence intervals for two-sample situations involving the normal distribution, and the binomial and Poisson distributions using the normal approximation.

3.2.6 Calculate confidence intervals for a difference between two means from paired data.

3.2.7 Use the bootstrap method to obtain confidence intervals.
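
The standard 100(1 − α)% intervals for 3.2.3, based on a sample of size n from N(μ, σ²) with sample mean x̄ and sample variance s²:

```latex
\mu: \quad \bar{x} \pm t_{n-1,\,1-\alpha/2}\,\frac{s}{\sqrt{n}}
\qquad\qquad
\sigma^2: \quad \left(\frac{(n-1)s^2}{\chi^2_{n-1,\,1-\alpha/2}},\; \frac{(n-1)s^2}{\chi^2_{n-1,\,\alpha/2}}\right)
```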

3.3 Hypothesis testing and goodness of fit


3.3.1 Explain what is meant by the terms null and alternative
hypotheses, simple and composite hypotheses, type I and
type II errors, test statistic, likelihood ratio, critical region, level
of significance, probability-value and power of a test.
3.3.2 Apply basic tests for the one-sample and two-sample
situations involving the normal, binomial and Poisson
distributions, and apply basic tests for paired data.
3.3.3 Apply the permutation approach to non-parametric hypothesis
tests.
3.3.4 Use a chi-square test to test the hypothesis that a random
sample is from a particular distribution, including cases where
parameters are unknown.
3.3.5 Explain what is meant by a contingency (or two-way) table,
and use a chi-square test to test the independence of two
classification criteria.
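
A short illustration of the contingency-table test in 3.3.5, using an illustrative 2×3 table of counts.

```python
import numpy as np
from scipy import stats

# Illustrative 2x3 contingency table of observed counts.
observed = np.array([
    [30, 45, 25],
    [20, 35, 45],
])

# Chi-square test of independence of the two classification criteria.
chi2, p_value, dof, expected = stats.chi2_contingency(observed)
print(f"chi-square = {chi2:.3f}, degrees of freedom = {dof}, p-value = {p_value:.4f}")
```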

4 Regression theory and applications (30%)

4.1 Linear regression

4.1.1 Explain what is meant by response and explanatory variables.

4.1.2 State the simple regression model (with a single explanatory variable).

4.1.3 Derive the least squares estimates of the slope and intercept parameters in a simple linear regression model.

4.1.4 Use appropriate software to fit a simple linear regression model to a data set and interpret the output.
      • Perform statistical inference on the slope parameter.
      • Describe the use of measures of goodness of fit of a linear regression model.
      • Use a fitted linear relationship to predict a mean response or an individual response with confidence limits.
      • Use residuals to check the suitability and validity of a linear regression model.

4.1.5 State the multiple linear regression model (with several explanatory variables).

4.1.6 Use appropriate software to fit a multiple linear regression model to a data set and interpret the output.

4.1.7 Use measures of model fit to select an appropriate set of explanatory variables.
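
A sketch of fitting and checking a simple linear regression as in 4.1.4, assuming statsmodels is available; the simulated data are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=3)
x = rng.uniform(0, 10, size=50)
y = 2.0 + 0.8 * x + rng.normal(scale=1.0, size=50)   # illustrative linear relationship

# Fit y = b0 + b1 * x by least squares.
X = sm.add_constant(x)                 # prepends the intercept column
model = sm.OLS(y, X).fit()

print(model.summary())                 # slope inference, R-squared, residual diagnostics
print(model.conf_int())                # confidence intervals for intercept and slope

# Predicted mean response at x = 5 with 95% confidence limits.
new_X = np.array([[1.0, 5.0]])         # intercept column plus the new x value
print(model.get_prediction(new_X).summary_frame(alpha=0.05))
```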
4.2 Generalised linear models

4.2.1 Define an exponential family of distributions. Show that the following distributions may be written in this form: binomial, Poisson, exponential, gamma, normal.

4.2.2 State the mean and variance for an exponential family, and define the variance function and the scale parameter. Derive these quantities for the distributions above.

4.2.3 Explain what is meant by the link function and the canonical link function, referring to the distributions above.

4.2.4 Explain what is meant by a variable, a factor taking categorical values and an interaction term. Define the linear predictor, illustrating its form for simple models, including polynomial models and models involving factors.

4.2.5 Define the deviance and scaled deviance and state how the parameters of a generalised linear model may be estimated. Describe how a suitable model may be chosen by using an analysis of deviance and by examining the significance of the parameters.

4.2.6 Define the Pearson and deviance residuals and describe how they may be used.

4.2.7 Apply statistical tests to determine the acceptability of a fitted model: Pearson’s chi-square test and the likelihood ratio test.

4.2.8 Fit a generalised linear model to a data set and interpret the output.
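
A sketch of 4.2.8, fitting a Poisson GLM with its canonical log link in statsmodels; the simulated claim-count data and rating factor are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=4)
rating_factor = rng.uniform(0.5, 2.0, size=200)       # illustrative explanatory variable
mu = np.exp(0.2 + 0.6 * rating_factor)                # true mean on the log-link scale
claim_counts = rng.poisson(mu)                        # illustrative response

X = sm.add_constant(rating_factor)
glm = sm.GLM(claim_counts, X, family=sm.families.Poisson()).fit()

print(glm.summary())           # coefficients, standard errors, deviance
print(glm.deviance)            # residual deviance, for an analysis of deviance (4.2.5)
print(glm.resid_deviance[:5])  # deviance residuals (4.2.6)
```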

5 Bayesian statistics (15%)

5.1 Explain the fundamental concepts of Bayesian statistics and use these concepts to calculate Bayesian estimators.

5.1.1 Use Bayes’ theorem to calculate simple conditional probabilities.

5.1.2 Explain what is meant by a prior distribution, a posterior distribution and a conjugate prior distribution.

5.1.3 Derive the posterior distribution for a parameter in simple cases.

5.1.4 Explain what is meant by a loss function.

5.1.5 Use simple loss functions to derive Bayesian estimates of parameters.

5.1.6 Explain what is meant by the credibility premium formula and describe the role played by the credibility factor.

5.1.7 Explain the Bayesian approach to credibility theory and use it to derive credibility premiums in simple cases.

5.1.8 Explain the empirical Bayes approach to credibility theory and use it to derive credibility premiums in simple cases.

5.1.9 Explain the differences between the two approaches and state the assumptions underlying each of them.
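
A standard conjugate example for 5.1.2–5.1.5: with a Beta(a, b) prior for a binomial probability θ and x successes observed in n trials, the posterior and the Bayesian estimate under squared-error loss (the posterior mean) are:

```latex
\pi(\theta \mid x) \propto \theta^{x}(1-\theta)^{\,n-x} \cdot \theta^{\,a-1}(1-\theta)^{\,b-1}
\;\Longrightarrow\;
\theta \mid x \sim \operatorname{Beta}(a + x,\; b + n - x)

\hat{\theta}_{\text{Bayes}} = E[\theta \mid x] = \frac{a + x}{a + b + n}
```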

END
