Is the 'spherical' GaussianMixture Model of sklearn the same as performing k-means?
The GaussianMixture() implementation in scikit-learn offers four different types of covariance matrices when fitting the model. One of those is the 'spherical' type, in which each component has its own single variance.
My question: isn't this the same as doing k-means on a dataset?
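For reference, this is roughly the setup I have in mind (a minimal sketch; the toy data and parameter values are only illustrative):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Toy data: two Gaussian blobs (purely illustrative)
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(100, 2), rng.randn(100, 2) + 5])

# GMM where each component gets its own single variance
gmm = GaussianMixture(n_components=2, covariance_type='spherical',
                      random_state=0).fit(X)

# Plain k-means with the same number of clusters
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

print(gmm.means_)           # component means
print(gmm.covariances_)     # one variance per component, shape (2,)
print(km.cluster_centers_)  # k-means centroids
```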
python scikit-learn gaussian
asked Nov 26 '18 at 9:32
Archie
1 Answer
K-means is exactly like a hard-assignment GMM in which every mixture component has isotropic ('spherical') covariance and all components share the same variance.
Being isotropic ('spherical') alone does not guarantee equivalence to k-means; the variance must also be the same across all components.
A more detailed explanation can be found here.
answered Nov 26 '18 at 9:50
Dinari
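To illustrate that point with scikit-learn, here is a small sketch (added for illustration, not part of the original answer; the toy data, seeds, and the tight/wide two-blob setup are my own assumptions): with unequal component variances, the hard assignments of a 'spherical' GMM and the labels of k-means can disagree.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Two clusters with very different spreads (illustrative toy data)
rng = np.random.RandomState(0)
X = np.vstack([0.3 * rng.randn(200, 2),            # tight cluster at the origin
               3.0 * rng.randn(200, 2) + [8, 0]])  # wide cluster around (8, 0)

gmm = GaussianMixture(n_components=2, covariance_type='spherical',
                      random_state=0).fit(X)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

gmm_hard = gmm.predict(X)   # hard assignment: argmax of the responsibilities
km_labels = km.labels_

# Cluster numbering is arbitrary, so take the better of the two alignments
agreement = max(np.mean(gmm_hard == km_labels),
                np.mean(gmm_hard != km_labels))
print(f"agreement between spherical GMM and k-means: {agreement:.2%}")
```

Because the variances differ, points can be assigned to the wider component even when they are closer in plain Euclidean distance to the tighter component's mean, something k-means never does, so the printed agreement is typically below 100% for data like this.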
So if the implementation were such that each component shared the same single variance value, then the two would be equivalent (combined with hard assignments)?
– Archie
Nov 26 '18 at 12:47
Yes, if the above conditions hold, then the likelihood that some point x belongs to a given Gaussian depends only on its distance from that Gaussian's mean, exactly as in k-means; and when re-estimating the Gaussians you only update the mean, by averaging the assigned points, which again is exactly what k-means does (since you don't re-estimate the variance).
– Dinari
Nov 26 '18 at 12:51
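To spell out the argument in that comment (my own restatement of the standard derivation, not part of the original discussion): if every component has the same isotropic covariance $\sigma^2 I$ and equal mixing weights, then

$$\log p(x \mid \mu_k, \sigma^2 I) = -\frac{\lVert x - \mu_k \rVert^2}{2\sigma^2} + \text{const},$$

so the hard assignment $\arg\max_k p(x \mid \mu_k)$ equals $\arg\min_k \lVert x - \mu_k \rVert^2$ (the nearest mean), and the M-step update $\mu_k = \frac{1}{|C_k|}\sum_{x \in C_k} x$ is exactly the k-means centroid update.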