python - How to efficiently remove redundant linear constraints for optimization?
There are thousands of linear constraints in my optimization problem. I want to reduce the complexity of the problem by finding the unnecessary constraints, e.g. 3 * x + 4 * y < 10 is unnecessary if 4 * x + 5 * y < 10 is also present (x and y are > 0, which is the case in my problem).

So I have a numpy array of all the coefficients, and it looks like this, for example:

[[0.1, 3.0, 4.8, 0.2], [1.0, 4.7, 5.3, 0.1], [2.2, 4.3, 5.2, 1.1]]

Representation of the constraints:

0.1 * w + 3.0 * x + 4.8 * y + 0.2 * z < 10
1.0 * w + 4.7 * x + 5.3 * y + 0.1 * z < 10
2.2 * w + 4.3 * x + 5.2 * y + 1.1 * z < 10

How can I efficiently find out which ones are unnecessary? My naive idea is a loop (pseudo-code-ish):

for i, row1 in enumerate(array):
    for j, row2 in enumerate(array):
        if j > i:
            if all(row1 > row2):
                delete row2

But that is slow for thousands of constraints. Any way to speed it up?
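For reference, here is that naive pairwise check written out as runnable numpy code on the sample data (a sketch; it assumes, as stated above, that every constraint shares the right-hand side 10 and that all variables are non-negative):

import numpy as np

# Coefficient matrix from the question; row k holds the coefficients of constraint k.
array = np.array([[0.1, 3.0, 4.8, 0.2],
                  [1.0, 4.7, 5.3, 0.1],
                  [2.2, 4.3, 5.2, 1.1]])

# Naive O(n^2) check: constraint i is unnecessary if some other constraint j
# has coefficients that are all at least as large (j is at least as tight).
redundant = set()
for i, row_i in enumerate(array):
    for j, row_j in enumerate(array):
        if i != j and np.all(row_j >= row_i) and np.any(row_j > row_i):
            redundant.add(i)
            break

reduced = np.delete(array, sorted(redundant), axis=0)
print(sorted(redundant))   # [0] for this data: the third row dominates the first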
You can think of each constraint as a hyperplane. Its intercept on each axis is (the constant on the right-hand side) / (the coefficient of that axis); if the coefficient is 0, the hyperplane is parallel to that axis (== "intercept at infinity").
Now, if the axis intercepts of one hyperplane are all equal to or greater than those of a second one, the first hyperplane is unnecessary: the second constraint already implies it.
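A minimal sketch of that intercept test (assuming, as in the question, non-negative coefficients and a common right-hand side of 10; the helper name is just for illustration):

import numpy as np

array = np.array([[0.1, 3.0, 4.8, 0.2],
                  [1.0, 4.7, 5.3, 0.1],
                  [2.2, 4.3, 5.2, 1.1]])
rhs = 10.0

# Axis intercepts: rhs / coefficient; a zero coefficient means the hyperplane
# is parallel to that axis, i.e. an infinite intercept.
with np.errstate(divide='ignore'):
    intercepts = np.where(array > 0, rhs / array, np.inf)

def is_redundant(i, j):
    # Constraint i is unnecessary if every intercept of constraint j is
    # less than or equal to the corresponding intercept of constraint i.
    return i != j and bool(np.all(intercepts[j] <= intercepts[i]))

print(is_redundant(0, 2))   # True: constraint 2 cuts closer to the origin on every axis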
To eliminate as many constraints as possible, you want to start by comparing against the ones whose hyperplane is (a) as close to the origin as possible and (b) parallel to as few axes as possible, because a hyperplane that is parallel to some axis can only eliminate other hyperplanes that are also parallel to that axis. [A constraint that is not parallel to any axis may eliminate one that is parallel to some axis, but the reverse never happens.]
So I suggest sorting the list by (number of axis-parallel directions, sum of the non-infinite axis intercepts) and doing the pairwise comparison in that order; see the sketch below.
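A sketch of that ordering combined with the intercept test (same assumptions as above: non-negative coefficients and a common right-hand side; the function name is hypothetical). Candidates closest to the origin and parallel to the fewest axes are tried first, so most redundant rows are discarded early:

import numpy as np

def reduce_constraints(array, rhs=10.0):
    # Axis intercepts; zero coefficients give infinite intercepts.
    with np.errstate(divide='ignore'):
        intercepts = np.where(array > 0, rhs / array, np.inf)

    # Order the candidates: fewest axis-parallel directions first, then the
    # smallest sum of non-infinite intercepts (closest to the origin).
    n_parallel = np.isinf(intercepts).sum(axis=1)
    finite_sum = np.where(np.isinf(intercepts), 0.0, intercepts).sum(axis=1)
    order = np.lexsort((finite_sum, n_parallel))

    keep = np.ones(len(array), dtype=bool)
    for pos, i in enumerate(order):
        if not keep[i]:
            continue
        # Any later candidate whose intercepts are all >= those of i is unnecessary.
        for j in order[pos + 1:]:
            if keep[j] and np.all(intercepts[i] <= intercepts[j]):
                keep[j] = False
    return array[keep]

array = np.array([[0.1, 3.0, 4.8, 0.2],
                  [1.0, 4.7, 5.3, 0.1],
                  [2.2, 4.3, 5.2, 1.1]])
print(reduce_constraints(array))   # drops the first row, which the third row dominates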