I'm using Sheared-LLaMA-2.7B-ShareGPT. When I ask a question, the answer stops after a certain number of words. Is there any way to increase the token cap?
I've tried increasing the max-new-tokens value, but it doesn't work.
The maximum value of maxoutput_tokens is 128. Is this due to a limitation of the model itself, or of the hardware platform?
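For context, here is a minimal sketch of how max_new_tokens is normally raised when the model is run through the Hugging Face transformers library. This is an assumption about the stack, not the actual setup in use; the prompt and the 512 value are placeholders.

```python
# Minimal sketch (assumption: the model is loaded with Hugging Face transformers;
# a serving layer on top may still impose its own output cap, e.g. 128 tokens).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain what model pruning is."
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens bounds only the generated continuation, not the prompt,
# so raising it here is the usual way to get longer answers.
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```

If the answer still stops at the same length with a larger max_new_tokens, the limit is likely being enforced elsewhere in the serving stack rather than by the model itself.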
There has been no update from you for a while, so we assume this is no longer an issue.
Hence, we are closing this topic. If you need further support, please open a new one.
Thanks
Is this still an issue that needs support? Is there any result you can share?