Q: I have the following equations in my model:
obj.. y =e= min(xa,xb,xc) + xd;
model m /all/;
solve m using dnlp maximizing y;
MINOS was able to solve a very small instance of my model, but as the size increased it gave up.
The min function is a very dangerous one. To discourage users from using this function we don't allow it in an NLP model, and you are forced to declare the model as a DNLP model. The problem is that the min function is not differentiable, just like max and abs. The best advice I can give you is: stay away from them. In your case we can reformulate the model quite easily to make it a normal NLP, which MINOS solves without problems:
obj1.. y =L= xa + xd;
obj2.. y =L= xb + xd;
obj3.. y =L= xc + xd;
This works because we maximize y, so in the optimal solution y automatically takes on the minimum of the three right-hand sides.
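Assembled into a complete model, the reformulation might look like the sketch below. The variable bounds are invented purely for demonstration, and any other equations of the real model are omitted:

```
* Illustrative sketch of the reformulated model; the bounds are
* made up, and the rest of the original model is elided.
variables y;
positive variables xa, xb, xc, xd;
xa.up = 3;  xb.up = 5;  xc.up = 4;  xd.up = 2;

equations obj1, obj2, obj3;
obj1.. y =L= xa + xd;
obj2.. y =L= xb + xd;
obj3.. y =L= xc + xd;

model m /all/;
* nlp instead of dnlp: the nondifferentiable min function is gone
solve m using nlp maximizing y;
```

Note that the reformulation only works in this direction: since y is maximized, the three upper-bounding inequalities are tight at the smallest right-hand side. If y were minimized, a min on the right-hand side could not be replaced this way.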
The chapter NLP and DNLP Models of the CONOPT manual contains more examples of reformulating DNLP models.
There are also several GAMS intrinsic functions that smoothly approximate MIN(f,g). The motivation for adding these to GAMS was their use in reformulation approaches for MCP and MPEC models: the complementarity conditions can be cast as equations using the MIN function, and the smoothed MIN functions then allow solution via NLP solvers. For example, the Fischer-Burmeister function and the Chen-Mangasarian function are both smoothed MIN functions available in GAMS, and since they are intrinsics you only need to write NcpF(f,g,mu) or NcpCM(f,g,mu), where mu is the smoothing parameter (like the delta below). You can find the exact definitions of these functions, and others that may also help you, in the section NCP functions of the NLPEC solver manual.
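As a sketch of how this could apply here (the nesting to handle three arguments and the value of mu are my own illustrative choices; check the NLPEC manual for the exact function definitions before relying on this), the original objective equation could be written as a single smooth equation:

```
* Illustrative sketch: replace min(xa,xb,xc) by nested smoothed-MIN
* intrinsics; mu = 1e-6 is an arbitrary choice of smoothing parameter.
scalar mu smoothing parameter /1e-6/;
obj.. y =e= NcpCM(xa, NcpCM(xb, xc, mu), mu) + xd;
```

A smaller mu gives a closer approximation of the true min but tends to make the model harder for the NLP solver, so in practice mu is often reduced gradually over a sequence of solves.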