Finding an optimal set without forbidden subsets
Given $n$ items, I want to select a set of items $S \subseteq \{1,2,\dots,n\}$ that maximizes profit. The profit of item $i \in \{1,2,\dots,n\}$ is given by $p_i$ and may be assumed to be non-negative.
Additionally, I have a set $\mathscr{F}$ of forbidden subsets. That is, if $F \in \mathscr{F}$, then $S$ is not allowed to contain $F$ as a subset.
For example: if $n=3$, the profits are given by $p_1=p_2=p_3=1$, and the forbidden subsets are given by $\mathscr{F} = \{\{1,2\},\{2,3\}\}$, then the optimum is given by $S=\{1,3\}$ with profit $2$.
My question is how to best approach this problem.
Currently, I am using a knapsack-type formulation that I solve with CPLEX. This works relatively well, but I am interested in whether better approaches exist, especially because I do not have any side constraints.
$$\max \sum_{i=1}^{n} p_i x_i,$$
$$\sum_{i \in F} x_i \le \lvert F \rvert - 1 \quad \forall F \in \mathscr{F},$$
$$x \in \{0,1\}^n.$$
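For concreteness, a minimal sketch of this formulation in Python follows, using the open-source PuLP package purely for illustration (the original model is solved with CPLEX). The hard-coded instance is the three-item example above; everything else in the snippet is an assumption of the sketch.

```python
# Minimal sketch of the forbidden-subset ILP, using PuLP for illustration.
# The instance below is the 3-item example from the question.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

n = 3
p = {1: 1, 2: 1, 3: 1}             # profits p_i
forbidden = [{1, 2}, {2, 3}]       # forbidden subsets F in the family

model = LpProblem("forbidden_subsets", LpMaximize)
x = {i: LpVariable(f"x_{i}", cat="Binary") for i in range(1, n + 1)}

# Objective: maximize total profit of the selected items.
model += lpSum(p[i] * x[i] for i in x)

# For each forbidden set F, at most |F| - 1 of its items may be selected.
for F in forbidden:
    model += lpSum(x[i] for i in F) <= len(F) - 1

model.solve()
selected = [i for i in x if value(x[i]) > 0.5]
print("selected items:", selected, "profit:", value(model.objective))
# Expected: items {1, 3} with profit 2.
```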
integer-programming combinatorial-optimization algorithms
asked Jun 30 at 10:58 by Kevin Dalmeijer
Is the size of your forbidden subsets bounded or small? If the size is at most $2$, then this seems to be an instance of the weighted independent set problem. If the size is larger, then I think this can be reformulated as some sort of weighted proper graph coloring problem, but I have no idea if that formulation would be easier to solve.
– Discrete lizard
Jun 30 at 13:14
Good point: you indeed seem to get the independent set problem for $\lvert F \rvert = 2$. In my application I am interested in larger forbidden sets (often exceeding size $n/2$), but I will check if the independent set literature provides new insights.
– Kevin Dalmeijer
Jun 30 at 13:48
High cardinality forbidden sets will jack up the density of the constraint matrix (slowing the solver) and, I think, weaken the LP relaxation. Not much you can do about that. What about the cardinality of $\mathscr{F}$? Do you tend to have lots of forbidden sets, or comparatively few?
– prubin
Jun 30 at 17:57
I also tend to have a relatively large number of forbidden sets, say exponential in $n$.
– Kevin Dalmeijer
Jun 30 at 18:16
2 Answers
Your problem is equivalent to finding a maximum-weight independent set in a hypergraph, where each item is a vertex and every forbidden set is a hyperedge over the elements in that set.
This is a hard problem: not just NP-hard (since it generalises the NP-hard weighted independent set problem), but also NP-hard to approximate within a constant factor (because independent set is already NP-hard to approximate [1]).
What this means is that unless your specific instances have some structure that can be exploited, heuristic solutions or MILP, SAT, etc. solvers are going to be the best you can do for this problem.
There is some literature on the (unweighted) independent set problem in hypergraphs. Most of the results I see are about regular or uniform hypergraphs, where all hyperedges have the same size. So if your forbidden subsets have different cardinalities, there does not really seem to be much research on your exact problem.
All in all, I think this provides some evidence that there is not much better you can do than you already did. I'm happy to be proven wrong, of course, in which case the literature on hypergraphs may help.
[1]: It is known to be NP-hard to approximate independent set within a factor better than $n/2^{(\log n)^{3/4+\gamma}}$ for any $\gamma>0$; see Khot, S., & Ponnuswami, A. K. (2006). Better inapproximability results for MaxClique, chromatic number and Min-3Lin-Deletion. In International Colloquium on Automata, Languages, and Programming (pp. 226-237). Springer, Berlin, Heidelberg.
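To make the equivalence concrete, here is a small illustrative sketch (not part of the original answer): it brute-forces a maximum-weight independent set of the hypergraph by enumerating all subsets, so it is only viable for very small $n$. The function name and the tiny instance are assumptions of the sketch.

```python
# Illustrative brute force: maximum-weight independent set in the hypergraph
# whose vertices are the items and whose hyperedges are the forbidden sets.
# Enumerates all 2^n subsets, so only usable for very small n.
from itertools import combinations

def max_weight_independent_set(profits, forbidden):
    """profits: dict item -> profit; forbidden: list of sets of items."""
    items = list(profits)
    best_set, best_profit = set(), 0.0
    for r in range(len(items) + 1):
        for combo in combinations(items, r):
            S = set(combo)
            # S is "independent" iff it contains no forbidden hyperedge.
            if any(F <= S for F in forbidden):
                continue
            profit = sum(profits[i] for i in S)
            if profit > best_profit:
                best_set, best_profit = S, profit
    return best_set, best_profit

# The example from the question: optimum is {1, 3} with profit 2.
print(max_weight_independent_set({1: 1, 2: 1, 3: 1}, [{1, 2}, {2, 3}]))
```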
answered Jun 30 at 17:37 by Discrete lizard
Since this problem has exponentially many constraints, I suspect that most forbidden subset constraints (FSCs) will not be binding at optimality. Therefore, something you could try is: (a) pick a promising set of FSCs to add a priori, and (b) add the remaining FSCs via lazy callbacks, where you add the "most violated" constraint at each iteration.
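The comment below links a JuMP.jl + Gurobi example; as a rough analogue in Python, here is a hedged sketch of idea (b) using gurobipy's lazy-constraint callback. The instance data and the simple most-violated-set rule are assumptions of the sketch, not part of the original answer.

```python
# Sketch: add forbidden-subset constraints lazily via a Gurobi callback.
# The instance data below is illustrative; plug in your own p and forbidden sets.
import gurobipy as gp
from gurobipy import GRB

p = {1: 1.0, 2: 1.0, 3: 1.0}      # profits
forbidden = [{1, 2}, {2, 3}]      # full pool of forbidden subsets

m = gp.Model("forbidden_subsets_lazy")
m.Params.LazyConstraints = 1      # required when using cbLazy
x = m.addVars(p.keys(), vtype=GRB.BINARY, name="x")
m.setObjective(gp.quicksum(p[i] * x[i] for i in p), GRB.MAXIMIZE)
# (a) optionally add a promising subset of the FSCs to the model a priori here.

def separate_forbidden_sets(model, where):
    # (b) at each integer solution, add the most violated FSC, if any.
    if where == GRB.Callback.MIPSOL:
        vals = model.cbGetSolution(x)
        def violation(F):
            return sum(vals[i] for i in F) - (len(F) - 1)
        worst = max(forbidden, key=violation)
        if violation(worst) > 0.5:
            model.cbLazy(gp.quicksum(x[i] for i in worst) <= len(worst) - 1)

m.optimize(separate_forbidden_sets)
print("profit:", m.ObjVal, "selected:", [i for i in p if x[i].X > 0.5])
```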
If you know that some of the FSCs imply other FSCs, then you should also drop the redundant constraints. In practice, this means favouring forbidden subsets of small cardinality: if $F \subseteq F'$, the constraint for $F$ implies the one for $F'$.
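As a small illustration of this pruning (a sketch, not part of the original answer): any forbidden set that contains another forbidden set can be dropped, because its constraint is implied.

```python
# Remove dominated forbidden sets: if F is a subset of G, the constraint
# for F already implies the constraint for G, so G can be dropped.
def prune_dominated(forbidden):
    kept = []
    for G in sorted(forbidden, key=len):        # consider small sets first
        if not any(F <= G for F in kept):       # subset (or duplicate) test
            kept.append(G)
    return kept

print(prune_dominated([{1, 2}, {1, 2, 3}, {2, 3}]))  # -> [{1, 2}, {2, 3}]
```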
answered Jun 30 at 19:31 by Ryan Cory-Wright
N.b. a minimal working example which uses lazy constraints with JuMP.jl version 0.18 and Gurobi is given here
– Ryan Cory-Wright
Jun 30 at 19:35