  1. Combined Linear Congruential Generator for Pseudo-random Number Generation

    Combined linear congruential generators, as the name implies, are a type of PRNG (pseudorandom number generator) that combines two or more LCGs (linear congruential generators). Combining two or more LCGs into a single generator can markedly increase the period length, which makes combined generators better suited to simulating more complex systems.
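
    A minimal sketch of the idea in R, under illustrative assumptions (the two component generators and their multipliers and moduli below are placeholder choices, not a recommended parameter set): each LCG is advanced independently and their scaled outputs are summed modulo 1, in the style of Wichmann-Hill.

    ```r
    # Sketch of a combined LCG: two component generators are advanced side
    # by side and their scaled outputs are combined modulo 1.
    # The multipliers and moduli are illustrative, not a vetted parameter set.
    combined_lcg <- function(n, seed1 = 12345, seed2 = 67890) {
      m1 <- 2147483563; a1 <- 40014   # first component LCG
      m2 <- 2147483399; a2 <- 40692   # second component LCG
      x <- seed1; y <- seed2
      out <- numeric(n)
      for (i in seq_len(n)) {
        x <- (a1 * x) %% m1
        y <- (a2 * y) %% m2
        out[i] <- ((x / m1) + (y / m2)) %% 1   # combine and reduce to [0, 1)
      }
      out
    }

    combined_lcg(5)   # five combined uniform variates on [0, 1)
    ```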

  2. Linear Congruential Generator for Pseudo-random Number Generation with R

    Linear congruential generators (LCGs) are a class of pseudorandom number generator (PRNG) algorithms used for generating sequences of random-like numbers. The generation of random numbers plays a large role in many applications, ranging from cryptography to Monte Carlo methods. Linear congruential generators are one of the oldest and most well-known methods for generating random numbers, primarily because they are comparatively easy to implement, fast, and require little memory.
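
    As a rough illustration, a bare-bones LCG in R built on the recurrence \(x_{n+1} = (a x_n + c) \bmod m\); the parameters below are the widely cited Numerical Recipes values and are used purely for demonstration.

    ```r
    # A bare-bones linear congruential generator:
    #   x_{n+1} = (a * x_n + c) mod m
    # Parameters (m = 2^32, a = 1664525, c = 1013904223) are the commonly
    # cited Numerical Recipes choice, used here only as an example.
    lcg <- function(n, seed = 42, a = 1664525, c = 1013904223, m = 2^32) {
      x <- seed
      u <- numeric(n)
      for (i in seq_len(n)) {
        x <- (a * x + c) %% m
        u[i] <- x / m          # scale to the unit interval
      }
      u
    }

    lcg(5)   # five pseudorandom values on [0, 1)
    ```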

  3. Set Union and Intersections with R

    The set operations 'union' and 'intersection' should ring a bell for those who've worked with relational databases and Venn diagrams. The union of two sets A and B is the set comprising all members of A, of B, or of both.
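
    In base R, for example, these operations are available directly through union() and intersect(); the vectors below are arbitrary examples.

    ```r
    A <- c(1, 2, 3, 4, 5)
    B <- c(4, 5, 6, 7)

    union(A, B)      # members of A, of B, or of both: 1 2 3 4 5 6 7
    intersect(A, B)  # members common to A and B: 4 5
    ```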

    Tagged as: R, set theory
  4. Factor Analysis with Principal Factor Method and R

    As discussed in a previous post on the principal component method of factor analysis, the \(\hat{\Psi}\) term in the estimated covariance matrix \(S\), \(S = \hat{\Lambda} \hat{\Lambda}' + \hat{\Psi}\), was excluded and we proceeded directly to factoring \(S\) and \(R\). The principal factor method of factor analysis (also called the principal axis method) finds an initial estimate of \(\hat{\Psi}\) and factors \(S - \hat{\Psi}\), or \(R - \hat{\Psi}\) for the correlation matrix.
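
    A rough sketch of that procedure in R, using squared multiple correlations as the initial communality estimates (a common choice, though not the only one); X and k are placeholders for a numeric data matrix and the number of factors.

    ```r
    # Principal factor (principal axis) sketch on the correlation matrix:
    # 1. estimate the uniquenesses from squared multiple correlations,
    # 2. factor R minus that estimate (i.e., R with communalities on the
    #    diagonal) via an eigendecomposition.
    principal_factor <- function(X, k = 2) {
      R <- cor(X)
      smc <- 1 - 1 / diag(solve(R))    # squared multiple correlations
      R_adj <- R
      diag(R_adj) <- smc               # replace 1's with initial communalities
      e <- eigen(R_adj)
      L <- e$vectors[, 1:k, drop = FALSE] %*%
        diag(sqrt(pmax(e$values[1:k], 0)), nrow = k)
      L                                # estimated loadings for k factors
    }

    principal_factor(mtcars[, 1:6], k = 2)   # mtcars as a stand-in data set
    ```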

    Tagged as: R, factor analysis
  5. Factor Analysis with the Principal Component Method and R Part Two

    In the first post on factor analysis, we examined computing the estimated covariance matrix \(S\) of the rootstock data and proceeded to find two factors that fit most of the variance of the data. However, the variables in the data are not on the same scale of measurement, which can cause variables with comparatively large variances to dominate the diagonal of the covariance matrix and the resulting factors. It therefore makes more intuitive sense to employ the correlation matrix in factor analysis.
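
    A minimal sketch of the principal component method applied to \(R\) rather than \(S\) (the built-in mtcars columns stand in for the rootstock data):

    ```r
    # Principal component method of factor analysis on the correlation matrix:
    # loadings are eigenvectors scaled by the square roots of the eigenvalues.
    R <- cor(mtcars[, 1:6])    # correlation matrix of an example data set
    e <- eigen(R)

    k <- 2                                                   # factors retained
    L_hat <- e$vectors[, 1:k] %*% diag(sqrt(e$values[1:k]))  # estimated loadings
    Psi_hat <- diag(R - L_hat %*% t(L_hat))                  # specific variances
    ```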

  6. Factor Analysis with the Principal Component Method and R

    The goal of factor analysis, similar to principal component analysis, is to reduce the original variables into a smaller number of factors that allows for easier interpretation. PCA and factor analysis still differ in several respects. One difference is that principal components are defined as linear combinations of the observed variables, while in factor analysis the observed variables are modeled as linear combinations of the underlying latent factors.
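
    In symbols (a standard formulation, not one quoted from the post): the \(j\)-th principal component is \(Y_j = a_j' X\), a linear combination of the observed vector \(X\), whereas the factor model expresses the observed variables themselves as \(X - \mu = \Lambda F + \epsilon\), with \(F\) the latent factors, \(\Lambda\) the loadings, and \(\epsilon\) the specific errors.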

  7. Principal Component Analysis with R Example

    Often, it is not helpful or informative to only look at all the variables in a dataset for correlations or covariances. A preferable approach is to derive new variables from the original variables that preserve most of the information given by their variances. Principal component analysis is a widely used statistical method for reducing data with many dimensions (variables) by projecting the data onto fewer dimensions using linear combinations of the variables, known as principal components.
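
    In R, for instance, this can be done with the built-in prcomp() (the iris measurements serve as an example data set here):

    ```r
    # Principal component analysis on an example data set; scale. = TRUE works
    # with the correlation matrix so differently scaled variables contribute
    # comparably.
    pca <- prcomp(iris[, 1:4], scale. = TRUE)

    summary(pca)     # proportion of variance explained by each component
    pca$rotation     # loadings: the linear combinations defining each component
    head(pca$x)      # the data projected onto the principal components
    ```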
