Of course, I think this is a bad idea, as it would only be a matter of time before its scope expanded.
However, what if there are serious world problems whose most effective solutions carry a high potential for such "expansion"? In that case, it would seem we would need to find some way to prevent or minimize it. How many world problems go unsolved because their solutions could "expand" in this way, and because that expansion cannot be prevented or mitigated enough to be less harmful than the original problem? And how much harm do those problems cause by going unsolved for so long?