Hacker News
jjcm · 5 months ago · on: Qwen3-Next
As plenty of others have mentioned here, if inference were 100x cheaper, I would run 200x the inference.
There are so many things you can do with long-running, continuous inference.
sipjca · 5 months ago
But what if you don't need to run it in the cloud?
ukuina · 5 months ago
You will ALWAYS want to use the absolute best model, because your time is more valuable than the machine's. If the machine gets faster or more capable, the value of your time jumps proportionally.