  1. Why are regression problems called "regression" problems?

    I was just wondering why regression problems are called "regression" problems. What is the story behind the name? One definition for regression: "Relapse to a less perfect or developed state."

  2. regression - When is R squared negative? - Cross Validated

    Also, for OLS regression, R^2 is the squared correlation between the predicted and the observed values. Hence, it must be non-negative. For simple OLS regression with one predictor, this is …
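
    A minimal sketch of that equivalence, assuming an OLS fit with an intercept and using numpy/scikit-learn (the data here are made up, not taken from the thread):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    x = rng.normal(size=(200, 1))
    y = 2.0 * x[:, 0] + rng.normal(size=200)

    model = LinearRegression().fit(x, y)          # OLS with an intercept
    y_hat = model.predict(x)

    r2_score = model.score(x, y)                  # 1 - SS_res / SS_tot
    r2_corr = np.corrcoef(y, y_hat)[0, 1] ** 2    # squared correlation of observed vs fitted

    print(r2_score, r2_corr)  # the two values agree up to floating-point error
    ```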

  3. How to describe or visualize a multiple linear regression model

    Then this simplified version can be visually shown as a simple regression, like this: I'm confused about this in spite of going through appropriate material on the topic. Can someone please explain to …
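
    The figure the question refers to is not reproduced in this snippet; purely as a hypothetical illustration of collapsing a multiple regression into simple-regression-style panels, one could draw added-variable (partial regression) plots with statsmodels:

    ```python
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
    df["y"] = 1.0 + 2.0 * df["x1"] - 1.5 * df["x2"] + rng.normal(size=200)

    fit = smf.ols("y ~ x1 + x2", data=df).fit()

    # Each panel plots y against one predictor with the other predictors
    # partialled out, so every panel reads like a simple regression.
    sm.graphics.plot_partregress_grid(fit)
    plt.show()
    ```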

  4. regression - Trying to understand the fitted vs residual plot?

    Dec 23, 2016 · A good residual vs fitted plot has three characteristics: The residuals "bounce randomly" around the 0 line. This suggests that the assumption that the relationship is linear is …
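
    A small sketch, assuming statsmodels and matplotlib, of drawing a residuals-vs-fitted plot to check those characteristics (the simulated data are illustrative only):

    ```python
    import numpy as np
    import matplotlib.pyplot as plt
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 10, size=300)
    y = 1.5 + 0.8 * x + rng.normal(scale=1.0, size=300)

    X = sm.add_constant(x)                     # add an intercept column
    fit = sm.OLS(y, X).fit()

    plt.scatter(fit.fittedvalues, fit.resid, s=10)
    plt.axhline(0, color="red", linewidth=1)   # residuals should bounce randomly around this line
    plt.xlabel("Fitted values")
    plt.ylabel("Residuals")
    plt.title("Residuals vs fitted")
    plt.show()
    ```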

  5. regression - Difference between forecast and prediction ... - Cross ...

    I was wondering what the difference and relation are between a forecast and a prediction? Especially in time series and regression? For example, am I correct that: In time series, forecasting seems …

  6. Support Vector Regression vs. Linear Regression - Cross Validated

    Dec 5, 2023 · Linear regression can use the same kernels used in SVR, and SVR can also use the linear kernel. Given only the coefficients from such models, it would be impossible to …
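
    A hedged comparison sketch with scikit-learn (data and hyperparameters are made up): both an SVR with a linear kernel and ordinary least squares return a slope and intercept, and the coefficients alone do not reveal which loss produced them:

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.svm import SVR

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 1))
    y = 3.0 * X[:, 0] + 1.0 + rng.normal(scale=0.5, size=200)

    ols = LinearRegression().fit(X, y)                       # squared-error loss
    svr = SVR(kernel="linear", C=10.0, epsilon=0.1).fit(X, y)  # epsilon-insensitive loss

    # Both fits are linear in X; only the loss and regularization differ.
    print(ols.coef_, ols.intercept_)
    print(svr.coef_, svr.intercept_)
    ```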

  7. correlation - What is the difference between linear regression on y ...

    The Pearson correlation coefficient of x and y is the same, whether you compute pearson(x, y) or pearson(y, x). This suggests that doing a linear regression of y given x or x given y should be …
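
    A small worked example with numpy showing why the symmetry of the correlation does not carry over to the fitted lines: the least-squares slope of y on x is r·s_y/s_x, while the slope of x on y is r·s_x/s_y:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.normal(size=500)
    y = 0.6 * x + rng.normal(scale=0.8, size=500)

    r = np.corrcoef(x, y)[0, 1]            # symmetric in x and y
    slope_y_on_x = np.polyfit(x, y, 1)[0]  # least-squares slope of y ~ x
    slope_x_on_y = np.polyfit(y, x, 1)[0]  # least-squares slope of x ~ y

    # The two slopes are not reciprocals; their product equals r**2.
    print(r, slope_y_on_x, slope_x_on_y, slope_y_on_x * slope_x_on_y)
    ```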

  8. Back-transformation of regression coefficients - Cross Validated

    Apr 25, 2012 · I'm doing a linear regression with a transformed dependent variable. The following transformation was done so that the assumption of normality of residuals would hold. The …
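
    The snippet does not show which transformation was used; purely as an illustration, assuming a log-transformed dependent variable, a coefficient can be read back on the original scale as a multiplicative effect:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    x = rng.uniform(1, 5, size=400)
    y = np.exp(0.3 * x + rng.normal(scale=0.2, size=400))   # positive, right-skewed response

    fit = sm.OLS(np.log(y), sm.add_constant(x)).fit()        # regress log(y) on x
    beta = fit.params[1]

    # On the original scale, a one-unit increase in x multiplies the (geometric
    # mean of) y by exp(beta), i.e. roughly a 100*(exp(beta) - 1) percent change.
    print(beta, np.exp(beta))
    ```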

  9. regression - Converting standardized betas back to original …

    I have a problem where I need to standardize the variables and run the ridge regression to calculate the ridge estimates of the betas. I then need to convert these back to the original variables' scale.
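
    One possible sketch with numpy and scikit-learn, assuming both the predictors and the response are standardized before the ridge fit: each standardized coefficient is rescaled by s_y/s_{x_j} and the intercept is recomputed from the means:

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(5)
    X = rng.normal(loc=[10, -3], scale=[4, 0.5], size=(300, 2))
    y = 1.0 + 2.0 * X[:, 0] - 5.0 * X[:, 1] + rng.normal(size=300)

    x_mean, x_sd = X.mean(axis=0), X.std(axis=0)
    y_mean, y_sd = y.mean(), y.std()

    Xs = (X - x_mean) / x_sd                    # standardized predictors
    ys = (y - y_mean) / y_sd                    # standardized response

    ridge = Ridge(alpha=1.0, fit_intercept=False).fit(Xs, ys)
    beta_std = ridge.coef_

    # Back-transform to the original scale of X and y.
    beta_orig = beta_std * y_sd / x_sd
    intercept_orig = y_mean - np.dot(beta_orig, x_mean)

    print(beta_orig, intercept_orig)
    ```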

  10. When conducting multiple regression, when should you center …

    Jun 5, 2012 · In some literature, I have read that in a regression with multiple explanatory variables, if the variables are in different units, they need to be standardized. (Standardizing consists in subtracting the mean …
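
    A short numpy sketch of the two operations under discussion: centering subtracts each column's mean, while standardizing additionally divides by its standard deviation:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    X = np.column_stack([
        rng.uniform(0, 200, size=100),     # e.g. a predictor measured in centimetres
        rng.uniform(0, 2, size=100),       # e.g. a predictor measured in metres
    ])

    X_centered = X - X.mean(axis=0)                        # centered: mean 0, original units
    X_standardized = X_centered / X.std(axis=0, ddof=1)    # standardized: mean 0, sd 1, unitless

    print(X_standardized.mean(axis=0).round(6), X_standardized.std(axis=0, ddof=1).round(6))
    ```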