Syllabus for the subject Elements of Computer & I.T.

 

ELEMENTS OF COMPUTER & I.T.

Contents

Chapter-1 Introduction to statistical computing

Early, classical, and modern concerns

Computation in different areas of statistics

Applications

Theory of statistical methods

Numerical statistical methods

Graphical statistical methods

Meta-methods: strategies for data analysis

Statistical meta-theory

Different kinds of computation in statistics

Numerical computation

Graphical computation

Symbolic computation

Computing environments

Theoretical computation

Statistics in different areas of computer science

Some notes on the history of statistical computing

 

Chapter-2 Basic numerical methods

Floating-point arithmetic

Rounding error and error analysis

Algorithms for moment computations

Error analysis for means

Computing the variance

Diagnostics and conditioning

Inner products and general moment computations

Floating-point standards
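
To make the rounding-error and variance topics in this chapter concrete, here is a minimal Python sketch (an illustration added alongside the syllabus; the function names and simulated data are invented for the example) contrasting the numerically unstable one-pass "textbook" variance formula with Welford's updating algorithm:

import numpy as np

def variance_textbook(x):
    # One-pass formula (sum of squares minus n * mean^2): prone to
    # catastrophic cancellation when the mean is large relative to
    # the standard deviation.
    x = np.asarray(x, dtype=np.float64)
    n = x.size
    return (np.sum(x * x) - n * np.mean(x) ** 2) / (n - 1)

def variance_welford(x):
    # Welford's updating algorithm: one pass, numerically stable.
    mean, m2 = 0.0, 0.0
    for k, xi in enumerate(x, start=1):
        delta = xi - mean
        mean += delta / k
        m2 += delta * (xi - mean)
    return m2 / (len(x) - 1)

# Data with a large mean and unit variance exposes the difference.
data = 1e8 + np.random.default_rng(0).standard_normal(100_000)
print(variance_textbook(data))   # badly wrong, possibly even negative
print(variance_welford(data))    # close to 1
print(np.var(data, ddof=1))      # NumPy's estimate, for comparison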

 

 

Chapter-3 Numerical linear algebra

Multiple regression analysis

The least-squares problem and orthogonal transformations

Householder transformations

Computing with Householder transformations

Decomposing the matrix

Interpreting the results from Householder transformations
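
As an illustration of the Householder least-squares material above, the following sketch (added for illustration; the function names and test data are assumptions, not from the source) reduces the design matrix to upper-triangular form with Householder reflections and solves min ||y - Xb|| by back-substitution, then checks the result against NumPy's least-squares routine:

import numpy as np

def householder_ls(X, y):
    # Least squares via Householder QR: reduce X to upper-triangular R,
    # apply the same reflections to y, then solve R b = (Q'y)[:p].
    A = X.astype(float).copy()
    z = y.astype(float).copy()
    n, p = A.shape
    for j in range(p):
        x = A[j:, j]
        v = x.copy()
        s = np.linalg.norm(x)
        v[0] += s if x[0] >= 0 else -s     # sign chosen to avoid cancellation
        v /= np.linalg.norm(v)
        # Apply H = I - 2 v v' to the trailing submatrix and to z.
        A[j:, j:] -= 2.0 * np.outer(v, v @ A[j:, j:])
        z[j:] -= 2.0 * v * (v @ z[j:])
    R = np.triu(A[:p, :p])
    return np.linalg.solve(R, z[:p])

# Small simulated check against NumPy's least-squares routine.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(50)
print(householder_ls(X, y))
print(np.linalg.lstsq(X, y, rcond=None)[0])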

Other orthogonalization methods

Givens rotations

Solving linear systems

The Cholesky factorization

The SWEEP operator

The elementary SWEEP operator

The matrix SWEEP operator

Stepwise regression

The constant term
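
A minimal sketch of the SWEEP operator applied to the augmented cross-product matrix (an added illustration; the Goodnight-style elementary sweep shown here is one common convention, and the data are simulated):

import numpy as np

def sweep(A, k):
    # Elementary SWEEP of a symmetric matrix on pivot k (Goodnight-style).
    A = A.copy()
    d = A[k, k]
    A[k, :] = A[k, :] / d
    for i in range(A.shape[0]):
        if i != k:
            b = A[i, k]
            A[i, :] = A[i, :] - b * A[k, :]
            A[i, k] = -b / d
    A[k, k] = 1.0 / d
    return A

# Build the augmented cross-product matrix [[X'X, X'y], [y'X, y'y]];
# sweeping on the predictor pivots yields (X'X)^(-1), the regression
# coefficients, and the residual sum of squares in one array.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(40), rng.standard_normal((40, 2))])
y = X @ np.array([2.0, 1.0, -1.0]) + 0.3 * rng.standard_normal(40)
A = np.block([[X.T @ X, (X.T @ y)[:, None]],
              [(X.T @ y)[None, :], np.array([[y @ y]])]])
for k in range(X.shape[1]):
    A = sweep(A, k)
print(A[:3, 3])        # regression coefficients
print(A[3, 3])         # residual sum of squares
print(np.linalg.lstsq(X, y, rcond=None)[0])  # check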

Collinearity and conditioning

Collinearity

Tolerance

The singular-value decomposition

Conditioning and numerical stability

Generalized inverses
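
To illustrate collinearity, conditioning, and generalized inverses together, a short sketch (added for illustration; the tolerance and data are arbitrary choices) computes the condition number from the singular values and builds a Moore-Penrose type generalized inverse from the SVD:

import numpy as np

# A near-collinear design: the third column is almost a copy of the second.
rng = np.random.default_rng(3)
X = rng.standard_normal((30, 3))
X[:, 2] = X[:, 1] + 1e-8 * rng.standard_normal(30)

# The singular-value decomposition exposes the conditioning of X.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
print("singular values:", s)
print("condition number:", s[0] / s[-1])

# A generalized inverse from the SVD: reciprocate singular values above a
# tolerance (relative to the largest) and zero the rest.
tol = 1e-10 * s[0]
s_inv = np.where(s > tol, 1.0 / s, 0.0)
X_pinv = Vt.T @ np.diag(s_inv) @ U.T
print(np.allclose(X @ X_pinv @ X, X))   # defining property of a generalized inverse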

Regression diagnostics

Residuals and fitted values

Leverage

Other case diagnostics

Variable diagnostics

Conditioning diagnostics

Other diagnostic methods
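
The case diagnostics above can be illustrated with a few lines of NumPy (an added sketch; the design point and names are invented): leverages are the hat-matrix diagonal, obtained here from the thin QR factor without forming the hat matrix explicitly, and internally studentized residuals follow directly:

import numpy as np

rng = np.random.default_rng(4)
X = np.column_stack([np.ones(25), rng.standard_normal(25)])
X[0, 1] = 6.0                      # one unusual design point
y = X @ np.array([1.0, 2.0]) + rng.standard_normal(25)

# Fit by QR; leverages are the diagonal of the hat matrix H = Q Q'.
Q, R = np.linalg.qr(X)
b = np.linalg.solve(R, Q.T @ y)
resid = y - X @ b
h = np.sum(Q**2, axis=1)           # leverage h_ii without forming H
n, p = X.shape
s2 = resid @ resid / (n - p)
# Internally studentized residuals scale each residual by its own
# estimated standard deviation sqrt((1 - h_ii) s^2).
r_student = resid / np.sqrt(s2 * (1.0 - h))
print(h[:3], r_student[:3])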

Regression updating

Updating the inverse matrix

Updating matrix factorizations

Other regression updating

Principal components and eigenproblems

Eigenvalues and eigenvectors

Generalized eigenvalues and eigenvectors

Population principal components

Principal components of data

Solving eigenproblems

The symmetric QR algorithm

The power method

The QR algorithm with origin shifts

The implicit QR algorithm

The Golub-Reinsch singular value decomposition

Householder-Golub-Reinsch singular value decomposition

The SVD of a bidiagonal matrix

The Chan modification to the SVD

The generalized eigenproblem

Jacobi methods
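
As a small illustration of the eigenproblem methods listed above, here is a sketch of the power method applied to a sample covariance matrix, recovering the dominant principal-component variance (an added example; the simulated data and tolerances are arbitrary):

import numpy as np

def power_method(A, tol=1e-10, max_iter=1000):
    # Power iteration: repeated multiplication by A converges to the
    # eigenvector of the dominant eigenvalue (assumed simple).
    x = np.random.default_rng(5).standard_normal(A.shape[0])
    x /= np.linalg.norm(x)
    lam = 0.0
    for _ in range(max_iter):
        z = A @ x
        lam_new = x @ z                # Rayleigh quotient estimate
        x = z / np.linalg.norm(z)
        if abs(lam_new - lam) < tol * max(1.0, abs(lam_new)):
            break
        lam = lam_new
    return lam_new, x

# Sample covariance matrix of simulated data; the dominant eigenpair is
# the first principal-component direction and its variance.
rng = np.random.default_rng(6)
data = rng.standard_normal((200, 4)) @ np.diag([3.0, 1.0, 0.5, 0.1])
S = np.cov(data, rowvar=False)
lam, v = power_method(S)
print(lam, np.linalg.eigvalsh(S)[-1])   # the two values should agree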

Generalizations of least-squares regression

GLM: the general linear model

WLS: weighted least squares

GLS: generalized least squares

GLIM: generalized linear models

Iteratively reweighted least squares

Iterative methods

The Jacobi iteration

Gauss-Seidel iteration

Other iterative methods

Additional topics and further reading

Lp regression

Robust regression

Subset selection

All-subsets regression

Stepwise regression

Other stepwise methods

 

Chapter-4 Nonlinear statistical methods

Maximum likelihood estimation

General notions

The MLE and its standard error: the scalar case

The MLE and the information matrix: the vector case

Solving f(x) = 0: the scalar case

Simple iteration

Newton-Raphson

The secant method (regula falsi)

Bracketing methods

Starting values and convergence criteria
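
A compact Newton-Raphson sketch for a scalar likelihood equation (an added illustration; the genetic-linkage multinomial and the counts 125, 18, 20, 34 are a standard textbook example used here only as an assumption) shows the iteration and the standard error obtained from the observed information:

import numpy as np

# Score and observed information for the classical genetic-linkage
# multinomial with cell probabilities ((2+t)/4, (1-t)/4, (1-t)/4, t/4).
y = np.array([125.0, 18.0, 20.0, 34.0])   # assumed textbook counts

def score(t):
    return y[0] / (2 + t) - (y[1] + y[2]) / (1 - t) + y[3] / t

def neg_hessian(t):
    return y[0] / (2 + t) ** 2 + (y[1] + y[2]) / (1 - t) ** 2 + y[3] / t ** 2

# Newton-Raphson for the likelihood equation score(t) = 0.
t = 0.5                                   # starting value
for _ in range(20):
    step = score(t) / neg_hessian(t)
    t += step
    if abs(step) < 1e-10:
        break
print("MLE:", t)
print("approx. standard error:", 1.0 / np.sqrt(neg_hessian(t)))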

Some statistical examples

A multinomial problem

Poisson regression

Solving f(x) = 0: the vector case

Generalizations of univariate methods

Newton-Raphson

Newton-like methods

Discrete Newton methods

Generalized secant methods

Rescaled simple iteration

Quasi-Newton methods

Nonlinear Gauss-Seidel iteration

Some statistical examples

Example: binomial/Poisson mixture

Example: Poisson regression

Example: logistic regression

Obtaining the Hessian matrix

Optimization methods

Grid search

Linear search strategies

Golden-section search

Local polynomial approximation

Successive approximation
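
Golden-section search, listed above, is easy to state in full; the sketch below (added for illustration, with an arbitrary objective function) shrinks a bracketing interval by a fixed ratio using one new function evaluation per step:

import numpy as np

def golden_section(f, a, b, tol=1e-8):
    # Golden-section search for the minimum of a unimodal f on [a, b]:
    # no derivatives needed, and one of the two interior points is
    # reused at every step.
    phi = (np.sqrt(5.0) - 1.0) / 2.0       # about 0.618
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = f(c)
        else:
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = f(d)
    return (a + b) / 2.0

# Minimize a simple convex objective on a bracketing interval.
f = lambda t: (t - 1.3) ** 2 + np.exp(-t)
print(golden_section(f, 0.0, 5.0))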

Selecting the step direction

Newton steps

Steepest descent

Levenberg-Marquardt adjustment

Quasi-Newton steps

Conjugate-gradient methods

Some practical considerations

Nonlinear least squares and the Gauss-Newton method

Iteratively reweighted least squares
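
The following sketch (an added example; the model, names, and simulated data are assumptions) shows iteratively reweighted least squares as Fisher scoring for logistic regression, where each iteration solves a weighted least-squares problem with a working response:

import numpy as np

def logistic_irls(X, y, n_iter=25, tol=1e-10):
    # IRLS / Fisher scoring for the logistic model P(y=1) = 1/(1+exp(-Xb)).
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ b
        p = 1.0 / (1.0 + np.exp(-eta))
        w = p * (1.0 - p)                       # IRLS weights
        z = eta + (y - p) / w                   # working response
        # Weighted least-squares step: solve (X'WX) b = X'Wz.
        WX = X * w[:, None]
        b_new = np.linalg.solve(X.T @ WX, WX.T @ z)
        if np.max(np.abs(b_new - b)) < tol:
            return b_new
        b = b_new
    return b

rng = np.random.default_rng(7)
X = np.column_stack([np.ones(300), rng.standard_normal((300, 2))])
true_b = np.array([-0.5, 1.0, 2.0])
y = (rng.random(300) < 1.0 / (1.0 + np.exp(-X @ true_b))).astype(float)
print(logistic_irls(X, y))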

Constrained optimization

Linear programming

Least squares with linear equality constraints

Linear regression with linear equality constraints

Nonquadratic programming with linear constraints

Nonlinear constraints

Estimation diagnostics

Computer-intensive methods

Nonparametric and semiparametric regression

Projection-selection regression

Additive spline models

Alternating conditional expectations

Generalized additive models

Missing data: The EM algorithm

Time-series analysis

Conditional likelihood estimation for ARMA models

The method of backward forecasts

Factorization methods

Kalman filter methods

A note on standard errors

 

 

Chapter-5 Numerical integration and approximation

Newton-Cotes methods

Riemann integrals

The trapezoidal rule

Simpson’s rule

General Newton-Cotes rules

Extended rules

Romberg integration
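
The Newton-Cotes material above can be summarized in a few lines (an added sketch; the test integrand is the standard normal density, chosen arbitrarily): extended trapezoidal and Simpson rules, plus Romberg extrapolation of the trapezoidal estimates:

import numpy as np

def trapezoid(f, a, b, n):
    # Extended trapezoidal rule with n equal subintervals.
    x = np.linspace(a, b, n + 1)
    fx = f(x)
    return (b - a) / n * (fx[0] / 2 + fx[1:-1].sum() + fx[-1] / 2)

def simpson(f, a, b, n):
    # Extended Simpson's rule (n must be even).
    x = np.linspace(a, b, n + 1)
    fx = f(x)
    return (b - a) / (3 * n) * (fx[0] + 4 * fx[1:-1:2].sum()
                                + 2 * fx[2:-1:2].sum() + fx[-1])

def romberg(f, a, b, m=6):
    # Romberg integration: Richardson extrapolation of trapezoidal
    # rules with 1, 2, 4, ... subintervals.
    R = np.zeros((m, m))
    for i in range(m):
        R[i, 0] = trapezoid(f, a, b, 2 ** i)
        for j in range(1, i + 1):
            R[i, j] = R[i, j - 1] + (R[i, j - 1] - R[i - 1, j - 1]) / (4 ** j - 1)
    return R[m - 1, m - 1]

# The standard normal density integrates to about 0.3413 on [0, 1].
f = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
print(trapezoid(f, 0, 1, 64), simpson(f, 0, 1, 64), romberg(f, 0, 1))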

Improper integration

Integrands with singularities at the end points

Integration over infinite intervals

Gaussian quadrature

Gauss-Legendre rules

Orthogonal polynomials

On computing Gaussian quadrature rules

Other Gauss-like integration rules

Patterson-Kronrod rules

Automatic and adaptive quadrature  

Interpolating splines

Characterization of spline functions

Representations for spline functions

Truncated power functions

Piecewise-polynomial representations

B-splines

Choosing an interpolating spline

Computing and evaluating an interpolating spline

Computing with truncated power functions

Cubic splines based on B-splines

Monte Carlo integration  

Simple Monte Carlo

Variance reduction

A hybrid method

Number-theoretic methods
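
A minimal Monte Carlo integration sketch (added for illustration; the integrand and sample size are arbitrary) with antithetic variates as a simple variance-reduction device:

import numpy as np

# Monte Carlo estimate of the integral of exp(-x^2/2)/sqrt(2*pi) over
# [0, 1] (about 0.3413). Antithetic variates pair each draw u with 1 - u
# and average the two evaluations; note each pair costs two evaluations.
rng = np.random.default_rng(8)
f = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

n = 50_000
u = rng.random(n)
plain = f(u)                              # crude Monte Carlo
anti = 0.5 * (f(u) + f(1.0 - u))          # antithetic pairs

print("crude estimate:     ", plain.mean(), "+/-", plain.std(ddof=1) / np.sqrt(n))
print("antithetic estimate:", anti.mean(), "+/-", anti.std(ddof=1) / np.sqrt(n))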

Multiple integrals

Iterated integrals and products rules

General multivariate regions

Adaptive partitioning methods

Monte carlo methods

Gaussian orthant probabilities

Bayesian computations

Exploring a multivariate posterior density

Some computational approaches

Laplace’s method

Gauss-Hermite quadrature

The Tanner-Wong method of data augmentation

The Tierney-Kadane-Laplace method

General approximation methods

Cumulative distribution functions

Tail areas

Percent points

Methods of approximation

Series approximation

Continued fractions

Polynomial approximation

Rational approximation

Tail areas and inverse cdfs for common distributions

The normal distribution

Normal tail areas

Normal quantiles
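
The normal tail-area and quantile items above can be illustrated with the complementary error function and a Newton iteration for the percent point (an added sketch; the tolerances are arbitrary and this is not a recommended production approximation):

import math

def normal_cdf(x):
    # Standard normal distribution function via the complementary error
    # function; the upper tail area is 1 - normal_cdf(x).
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def normal_quantile(p, tol=1e-12):
    # Percent point (inverse cdf) by Newton's method: the derivative of
    # the cdf is the normal density, so x <- x - (F(x) - p) / phi(x).
    x = 0.0
    for _ in range(100):
        phi = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
        step = (normal_cdf(x) - p) / phi
        x -= step
        if abs(step) < tol:
            break
    return x

print(1.0 - normal_cdf(1.96))        # upper tail area, about 0.025
print(normal_quantile(0.975))        # about 1.96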

The chi-square distribution

The F distribution

Student's t distribution

Other distributions

 

 

Chapter-6 Smoothing and density estimation

Histograms and related density estimators

The simple histogram

A naive density estimator

Kernel estimators

Nearest-neighbor estimates

Computational considerations
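
A kernel density estimator in a few lines (an added sketch; the Gaussian kernel, normal-reference bandwidth, and simulated mixture data are assumptions made for the example):

import numpy as np

def kde_gaussian(x, grid, h):
    # Gaussian kernel density estimate on a grid:
    # fhat(t) = (1/(n*h)) * sum_i K((t - x_i)/h), K the standard normal density.
    u = (grid[:, None] - x[None, :]) / h
    K = np.exp(-u ** 2 / 2) / np.sqrt(2 * np.pi)
    return K.sum(axis=1) / (len(x) * h)

rng = np.random.default_rng(9)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(4, 0.5, 100)])
h = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)   # normal-reference bandwidth
grid = np.linspace(x.min() - 1, x.max() + 1, 200)
fhat = kde_gaussian(x, grid, h)
print(grid[np.argmax(fhat)], fhat.max())         # location and height of the mode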

Linear smoothers

Running means

Kernel smoothers

Running lines

General linear smoothers
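
A running-mean smoother, the simplest of the linear smoothers above, can be sketched as follows (an added example; the window size and simulated data are arbitrary):

import numpy as np

def running_mean(x, y, k):
    # Symmetric running-mean smoother: at each point, average the
    # responses in a centered window of about k neighbors in the
    # x-ordering (window truncated at the ends).
    order = np.argsort(x)
    ys = y[order]
    n = len(y)
    out = np.empty(n)
    half = k // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out[i] = ys[lo:hi].mean()
    return x[order], out

rng = np.random.default_rng(10)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.3 * rng.standard_normal(200)
xs, smooth = running_mean(x, y, k=15)
print(np.mean((smooth - np.sin(xs)) ** 2))   # rough MSE of the smoother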

Spline smoothing

Smoothing splines

Regression splines

Multivariate spline smoothing

Nonlinear smoothers

LOWESS

Supersmoother

Running medians

Other methods

Choosing the smoothing parameter

Applications and extensions

Robust smoothing

Smoothing on circles and spheres

Smoothing periodic time series

Estimating functions with discontinuities

Hazard estimation
