What's the difference between multi-label classification and fuzzy classification?
Is it just a difference in term usage between academics and practitioners?
Or is it a theoretical difference in how we consider each sample: as belonging to multiple classes at once, or to one fuzzy class?
Or does this distinction have some practical meaning for how we build a classification model?
Tags: classification, multilabel-classification, fuzzy-logic, fuzzy-classification
asked Apr 23 at 12:37 by DmytroSytro
2 Answers
Multi-label classification (Wiki):
Given $K$ classes, find a map $f: X \rightarrow \{0, 1\}^K$.
Fuzzy classification (a good citation is needed!):
Given $K$ classes, find a map $p: X \rightarrow [0, 1]^K$ where $\sum_{k=1}^{K} p(k) = 1$.
In multi-label classification, as defined, there is no "resource limit" on the classes, in contrast to fuzzy classification.
For example, a neural network with a softmax layer does fuzzy classification (soft classification). If we select only the class with the highest score, it becomes single-label classification (hard classification); if we select the top $k$ classes, it becomes multi-label classification (again hard classification).
Fuzzy classification: [0.5, 0.2, 0.3, 0, 0]
Single-label classification: [1, 0, 0, 0, 0]
Multi-label classification: [1, 0, 1, 0, 0]
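To make the soft-versus-hard distinction concrete, here is a minimal NumPy sketch; the five scores are arbitrary and k = 2 is chosen just for illustration:

import numpy as np

def softmax(z):
    # Softmax turns raw scores into a fuzzy (soft) assignment that sums to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

scores = np.array([1.2, 0.3, 0.7, -2.0, -1.5])  # arbitrary logits for 5 classes

fuzzy = softmax(scores)                  # soft classification: entries in [0, 1], summing to 1

single = np.zeros_like(fuzzy)
single[fuzzy.argmax()] = 1               # hard single-label: keep only the top-scoring class

k = 2
top_k = np.argsort(fuzzy)[-k:]           # indices of the k highest scores
multi = np.isin(np.arange(len(fuzzy)), top_k).astype(int)  # hard multi-label: keep the top-k classes

print(fuzzy.round(2), single, multi)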
As another example of multi-label classification, we could have $K$ neural networks for $K$ classes with sigmoid outputs, and assign a point to class $k$ if the output of network $k$ is higher than 0.5.
Outputs: [0.6, 0.1, 0.6, 0.9, 0.2]
Multi-label classification: [1, 0, 1, 1, 0]
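The thresholding step can be sketched in the same way, reusing the outputs from the example and the 0.5 threshold mentioned above:

import numpy as np

outputs = np.array([0.6, 0.1, 0.6, 0.9, 0.2])  # independent sigmoid outputs, one per class
multi_label = (outputs > 0.5).astype(int)      # threshold each class independently
print(multi_label)                             # [1 0 1 1 0]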
Practical considerations
As the examples demonstrate, the key difference is the "resource limit" that exists in fuzzy classification but not in multi-label classification. Including the limit (as in the first example) or ignoring it (as in the second example) depends on the task. For example, in a classification task with mutually exclusive labels, we want to include the "resource limit" to impose the mutual-exclusivity assumption on the model.
Note that the $\sum_{k=1}^{K} p(k) = 1$ restriction in fuzzy classification is merely a definition, and there is no point in arguing about a definition. We can either propose another kind of classification, or argue when to use, and when not to use, such a classification.
answered Apr 23 at 14:13 (edited Apr 23 at 15:55) by Esmailian
Hmm, but I thought the point of multi-label classification is not to use softmax, because the classes don't exclude each other.
– DmytroSytro, Apr 23 at 14:19
@DmytroSytro You are right; I have added another example.
– Esmailian, Apr 23 at 14:27
So, does it really matter for a fuzzy set that the sum of probabilities over all classes equals 1?
– DmytroSytro, Apr 23 at 14:29
@DmytroSytro I've added a section to explain when the sum-to-1 restriction is useful.
– Esmailian, Apr 23 at 15:31
Thank you! Still, it confuses me that in fuzzy sets there should be a limit on the sum of probabilities, so that a sample can't belong to different sets with probability 1 for each set. As I understand it, this is a result of the truth function in fuzzy logic, which can't assign a sum of probabilities greater than 1.
– DmytroSytro, Apr 23 at 15:50
A multi-label classifier learns to predict class labels using some algorithm and training data. It learns to associate an object's labels with a vector of feature values, and it estimates the probability that a sample belongs to a certain class based on some condition.
Fuzzy classifiers do exactly the same thing, except that they use fuzzy logic to determine which class a sample belongs to. The data need to be described using linguistic rules, as opposed to the data used by a conventional classifier. When classifying a sample, a fuzzy classifier returns a "degree of membership" for each class.
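As a rough illustration of "degree of membership", one might score a sample against hand-written membership functions; the feature (temperature), the linguistic terms, and the breakpoints below are all invented for this sketch:

import numpy as np

def triangular(x, a, b, c):
    # Triangular membership function: 0 outside [a, c], peaking at 1 when x == b.
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def classify_temperature(t):
    # Degree of membership of a temperature in each (made-up) linguistic class.
    return {
        "cold": triangular(t, -10.0, 0.0, 15.0),
        "mild": triangular(t, 5.0, 18.0, 28.0),
        "hot":  triangular(t, 22.0, 35.0, 45.0),
    }

print(classify_temperature(25.0))  # nonzero membership in both "mild" and "hot"

Note that these degrees need not sum to 1; whether to normalize them is the same design choice discussed under the other answer.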
answered Apr 23 at 13:45 by Sterls
So, fuzzy classifiers aren't machine learning?
– DmytroSytro, Apr 23 at 13:50
They definitely are. Both are affected by the mathematical model used to describe the problem. There will be a difference in the design of the model when considering fuzzy versus conventional classifiers, but it all stems from the mathematical model nonetheless. The last point, where I wrote "degree of membership", can be thought of as synonymous with "probability of a sample being associated with a certain label".
– Sterls, Apr 23 at 13:57