While I was aware that knapsack and IP are both NP-complete, which makes them in some sense algorithmically equivalent, I wasn't really thinking of the problems themselves as equivalent. If I can indeed convince myself that the constrained optimization problem I want to solve can be mapped onto a knapsack problem (and I suspect it can, and that I'm just not trying hard enough to do so), then we should reactivate our collaboration. I'm all in favor of something that can be described with mathematical rigor, and knapsack plus widgets may very well be easier to build and maintain than a full-blown IP. To be continued.
The whole point of NP-completeness is that all NP-complete problems are actually solving the same problem -- that is, search. It doesn't really matter whether IP or knapsack is used.
(... can't ... resist ... biting ...)
Of course, if you pre-plan your diet with integer programming/knapsack, you might run into problems when enough of the foods are not available at the same price (which mafia won't let you know until after you acquire some of them), so you have to stop and recompute ...
Also, you might want to take a look at Eleron's JavaScript calculator at
http://www.houeland.com/kol/diets; that thing is regarded as very good. And it's very fast (in part because it has pre-cached item prices).
...
As for dynamic programming ...
[ snip ]
Of course this is wrong, because it doesn't consider many real-world factors, like limited availability (or availability of the same thing at different prices), maximum allowed cost of diet (it's hard to spend 30k on food when you only have 5k at hand), safety features for people who dislike spending too much on a single item, etc ...
I'm well aware of Eleron's diet calculator -- in fact, the first version of consumption.ash (my ED replacement) actually just scraped the old calculator, but then I realized that was probably not a good idea if I wanted to release this script into the wild, especially since I hadn't asked him for permission.
The currently available version of consumption.ash runs very quickly if it doesn't have to look up mall prices (i.e. just using historical_price) -- I think I've clocked it at a tenth of a second or so. It does use DP, and yes, I know that it's not quite optimal in real-world use, since that much mall information just isn't available to ASH.
I originally had planned for a third revision as a port of Eleron's javascript calculator, but this turned out to be more complicated than originally envisioned, and I just haven't had enough time / interest.
Also: the way DP would work with your example:
Code:
1: fc
2: 2fc
3: 3fc
4: yuck
5: fc + yuck
6: 2fc + yuck
7: nom
8: fc + nom
9: 2fc + nom
10: yum
11: fc + yum
12: 2fc + yum
13: 3fc + yum
At this point, 2nom is better than yum + yuck, so the entry for 14 becomes 2nom, and the table keeps filling in the same way.
Expected run-time for unbounded knapsack is simply O(S*N), where S is the size of the knapsack and N is the number of items. In fact, if you treat each item at each price as distinct, you can make the same argument for bounded knapsack (once again, it's pseudo-polynomial runtime). Space used is fairly small -- O(S^2), considering that we keep a list of no more than S items for each fullness level.
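The DP walkthrough above can be sketched in a few lines; here's a Python illustration. The item names come from the table, but the per-item values (fc=2, yuck=10, nom=20, yum=28) are invented, chosen only so that the table's choices come out the same -- they aren't real KoL numbers.

```python
def best_diet(items, max_fullness):
    # best[s] = (best total value, item list) using at most s fullness
    best = [(0, [])] * (max_fullness + 1)
    for s in range(1, max_fullness + 1):
        best[s] = best[s - 1]  # option: leave a unit of fullness unused
        for name, full, value in items:
            if full <= s:
                prev_value, prev_items = best[s - full]
                if prev_value + value > best[s][0]:
                    best[s] = (prev_value + value, prev_items + [name])
    return best[max_fullness]

# (name, fullness, value) -- values are made up to match the table
items = [("fc", 1, 2), ("yuck", 4, 10), ("nom", 7, 20), ("yum", 10, 28)]
```

At fullness 14 this indeed prefers 2nom (40) over yum + yuck (38). The double loop is the O(S*N) runtime, and keeping an item list per fullness level is the O(S^2) space.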
Also, you can improve runtimes by considering notions of dominance (where one item is simply better than another) and, say, taking the top MAX_FULL fullness worth of foods for each fullness tier (in sorted order). Since my implementation assumed unbounded knapsack, I could improve this further by simply taking the best for each tier and then removing larger foods that were completely dominated by smaller foods.
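One way to implement that dominance pruning for the unbounded case, as a hedged sketch: drop any item whose value doesn't beat what the DP can already reach at the same fullness using only strictly smaller kept items. The "bland" item below is a made-up example of a food that gets dominated.

```python
def prune_dominated(items):
    # items: list of (name, fullness, value), best candidate per tier
    items = sorted(items, key=lambda it: it[1])  # smallest fullness first
    kept = []
    for name, full, value in items:
        # best value reachable at exactly <= full fullness from kept items
        best = [0] * (full + 1)
        for s in range(1, full + 1):
            best[s] = best[s - 1]
            for _, f, v in kept:
                if f <= s:
                    best[s] = max(best[s], best[s - f] + v)
        if value > best[full]:
            kept.append((name, full, value))
    return kept
```

For example, an 8-fullness food worth 15 is dominated by fc + nom (8 fullness, 22) under the made-up values above, so it gets pruned and the main DP never has to consider it.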
So the expected run-time here for fullness would be some constant multiplied by S * (S + S/2 + S/3 + S/4 + ... + 1), or let's just overestimate with roughly 3*S^2 (where S is bounded above by 39, if you count for distension and melange). If we also count all the precomputing, that'll be a single loop over all items (or consumables, doesn't matter) to categorize all relevant items, and then another 3 * S log S to sort all the lists.
Recomputing diet after every item... would increase runtime by maybe a factor of S. Even so, Eleron's javascript calculator certainly doesn't do this -- as you mentioned, prices are cached.
To be honest, the real bottleneck is finding updated prices in a timely manner, since Mafia does not expose this information. I imagine fifth-mall-price is a good enough approximation for most cases, although it will skip items where a few (<5) suppliers have a large quantity at reasonable prices, and the rest are very overpriced.
Eleron's site has all the results precomputed and simply retrieves the result when you fill in your parameters, I believe. I remember him saying that the calculations would be slow and can't be done on the fly when someone enters their values.
I thought that his new version of the calculator computed it in the browser. Sure, the old version simply retrieved the result, but the current version seems to actually contain logic.
edit: also, xKiv: that's why the greedy solution to the knapsack problem only guarantees at least 50% of the optimal result
Suppose you have 20 fullness. Item A gives 11 fullness for a value of 5000; item B gives 10 fullness for a value of 4999. The greedy algorithm picks the locally best item at every step. It certainly runs in linear time, but it only gets a value of 5000 versus 9998. (Yes, this is even more contrived.)
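Running the contrived example through a minimal greedy-by-value sketch shows the near-50% gap: greedy takes A once, then nothing else fits, while the optimum is two copies of B.

```python
def greedy(items, capacity):
    # items: list of (fullness, value); pick the locally best (highest
    # value) item that still fits, repeatedly
    total = 0
    for full, value in sorted(items, key=lambda it: it[1], reverse=True):
        while full <= capacity:
            capacity -= full
            total += value
    return total

# greedy([(11, 5000), (10, 4999)], 20) gives 5000; the optimum (2x B) is 9998
```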