
It doesn't matter, because only a minority of product companies worldwide (enterprise or not) use MCP.

You'll likely be OK until Google gets a penis. Then you're toast.

You likely pay for YouTube Premium if you aren't noticing ads.

They can buy a local LLM from an expert provider.

Isn't it a little early to declare success? I think the bigger worry with the US, though, is not whether it is technically possible, but whether anyone in power cares to actually help kids versus using it as an excuse to implement Orwellian surveillance upon citizens.

I did qualify it with "most people" because of people like you who enjoy that kind of work :).

I would hate that work, but luckily we have all sorts of different people in the world who enjoy different things. I hope you find something that you really enjoy doing.


Sorry, I was cracking a joke about the browser in a shader.

The GLSL I originally posted is from the "cursed mode" of my side project, and I use it to produce a data URI of every frame, 15 times per second, as a twisted homage to old hardware. (No, I didn't use AI :P )

https://github.com/Rezmason/excel_97_egg

That said, is `pow(vec4(2),-vec4(2,4,6,0))` really so bad? I figured it'd be replaced with `vec4(0.25, 0.0625, 0.015625, 1.0)`.
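
Just to sanity-check the arithmetic (plain Python rather than GLSL, evaluating the pow componentwise):

  # pow(vec4(2), -vec4(2, 4, 6, 0)) evaluated componentwise
  base = [2.0, 2.0, 2.0, 2.0]
  exponent = [-2.0, -4.0, -6.0, -0.0]
  print([b ** e for b, e in zip(base, exponent)])
  # -> [0.25, 0.0625, 0.015625, 1.0], i.e. vec4(0.25, 0.0625, 0.015625, 1.0)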


Are you arguing that the USA can no longer build parking lots due to environmental concerns? If so, that would indeed be remarkable, since parking lots seem to be the one facility that almost every US town has managed to build more than enough of.

Twitter has been toast for quite some time now, well before the Elon effect...

I think pretty much every social platform has transitioned into the same "edgelordy-ness" we dealt with on smaller community forums & IRC back in the day...

Nobody talks about it on platforms out of a fear of retribution, but a handful of people are not meant to have this much control over massive groups of people...

It always devolves into a scheme that only serves the top & only profits those at the top.

Twitter was a very useful real-time information tool that slowly degraded into a payola promo haven, just like Facebook, Instagram, and now even TikTok...

The time for useful tools of that kind has passed us by, as most people on these platforms lie about their botted & payola-boosted audiences & views.

Social media simply got too big; there aren't even categorized topics one can subscribe to anymore without it all being blended into a random timeline of everyone's posts... That's the final trumpet call for it. Something totally different will need to come along and take over, until it too becomes overrun and monopolized as an emotional manipulation tool.

A far better way to promote skills & business now is to paint your info on the side of your car... Far more people will see it than your social account, without needing to pay for a membership or to boost each post. shrug


This got a chuckle out of me.

> Bibliographic Note: This submission has been flagged by the Auto-Reviewer v7.0 due to high similarity with "Running DOOM on a Mitochondria" (2034).

for the article on "Running LLaMA-12 7B on a contact lens with WASM"

https://sw.vtom.net/hn35/pages/90100123.html


No, I don't blog. But I just followed the docs for starting an instance on lambda.ai and the llama.cpp build instructions. Both are pretty good resources. I had already set up an SSH key with Lambda, and the Lambda OS images are Linux pre-loaded with CUDA libraries on startup.

Here are my lazy notes + a snippet of the history file from the remote instance for a recent setup where I used the web chat interface built into llama.cpp.

I created an instance gpu_1x_gh200 (96 GB on ARM) at lambda.ai.

Connected from a terminal on my box at home and set up the SSH tunnel:

ssh -L 22434:127.0.0.1:11434 ubuntu@<ip address>
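
(This forwards local port 22434 to 127.0.0.1:11434 on the instance, which is the port I give llama-server below.)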

  Started building llama.cpp from source, history:    
     21  git clone   https://github.com/ggml-org/llama.cpp
     22  cd llama.cpp
     23  which cmake
     24  sudo apt list | grep libcurl
     25  sudo apt-get install libcurl4-openssl-dev
     26  cmake -B build -DGGML_CUDA=ON
     27  cmake --build build --config Release 
MISTAKE on 27: single-threaded and slow to build; see -j 16 below for a faster build.

     28  cmake --build build --config Release -j 16
     29  ls
     30  ls build
     31  find . -name "llama.server"
     32  find . -name "llama"
     33  ls build/bin/
     34  cd build/bin/
     35  ls
     36  ./llama-server -hf ggml-org/gpt-oss-120b-GGUF -c 0 --jinja
MISTAKE, didn't specify the port number for the llama-server

     37  clear;history
     38  ./llama-server -hf Qwen/Qwen3-VL-30B-A3B-Thinking -c 0 --jinja --port 11434
     39  ./llama-server -hf Qwen/Qwen3-VL-30B-A3B-Thinking.gguf -c 0 --jinja --port 11434
     40  ./llama-server -hf Qwen/Qwen3-VL-30B-A3B-Thinking-GGUF -c 0 --jinja --port 11434
     41  clear;history
I switched to Qwen3 VL because I needed a multimodal model for that day's experiment. Lines 38 and 39 show me not using the right name for the model. I like how llama.cpp can download and run models directly off of Hugging Face.

Then I pointed my browser at http://localhost:22434 on my local box and had the normal browser window where I could upload files and use the chat interface with the model. That also gives you an OpenAI API-compatible endpoint. It was all I needed for what I was doing that day. I spent a grand total of $4 that day doing the setup and running some NLP-oriented prompts for a few hours.
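
If you'd rather script against it than use the browser, the same tunnel works. A rough sketch (standard library only; the model name and prompt are just placeholders) hitting the chat completions route that llama-server exposes:

  import json
  import urllib.request

  # Talk to llama-server's OpenAI-compatible API through the ssh tunnel
  # (local 22434 -> remote 11434).
  payload = {
      "model": "local-model",  # llama-server serves whatever it loaded; the name is mostly informational
      "messages": [{"role": "user", "content": "Say hello in one sentence."}],
  }
  req = urllib.request.Request(
      "http://localhost:22434/v1/chat/completions",
      data=json.dumps(payload).encode("utf-8"),
      headers={"Content-Type": "application/json"},
  )
  with urllib.request.urlopen(req) as resp:
      print(json.load(resp)["choices"][0]["message"]["content"])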


Unfortunately there's too much distraction around the AI side of the discussion to actually look at the generation tech itself.

For all their discussion of high-temperature operation, it seems the only advantage at the end of the day is eliminating water consumption in cooling. I question whether that's really so valuable.


As documented at https://en.wikipedia.org/wiki/List_of_Google_Easter_eggs, a Google search for "times new roman font" returns the results in that font (https://www.google.com/search?q=Times+New+Roman+Font for the lazy). Looks terrible on my screen.

That might work well for Apple: being the consumer electronics manufacturer that people use to connect to OpenAI/Anthropic/Google for their powerful creative work.

Why would game engine developers want to make a game though? Plenty of devs prefer building the underlying frameworks and tools over the products those tools create.

Yeah, well, I don't know how to feel about EM.

I gave him a chance. Twitter was unacceptably censoring any COVID dissent, and he freed some of it. Then you find out about the people killed in Tesla crashes. Or him calling the cave rescuer in Thailand a pedo.


Weak.

If you look at the HN front page from a few years back, there are almost no LLM-related posts. Now there are multiple each day. An LLM didn't grasp that change; it can't creatively invent a new trend or imagine a larger picture of how the world changes. So its front page in 10 years is still half posts about AI.

You can spend a few dozen prompts getting it to do what you want, but then you might as well just spend a couple of hours writing that front page yourself; you'd be more satisfied, and the result would be funnier.


Go and read the actual report of what the eSafety commissioner is requiring.

The company can't be found liable if they have put in reasonable age verification technology, particularly if the user lied about their age or found a way to circumvent the restrictions.

They clearly aren't going by just what the user says, as the companies have implemented age verification tools that try to do that detection.


Thanks for the thoroughness! I look forward to the next steps as you all apply this approach in other unique ways to have even better results.

Yes

Edit: landscape seems to be a workaround for me, though.


> Automating away all the "boring jobs" leads to an economic collapse, unless you find another way for those people to earn their living.

Yes, that's what happens. All those people find other jobs, do other work, and that new work is usually much less boring than the old work, because boring work is easier to automate.

Historically, economies have changed and grown because of automation, but not collapsed.


The entire point of using something like this is so it's a relatively compact 128 bits. If you don't care about that, go ahead and use something fancy, or just glue 64 timestamp bits to sufficient random bits to make collisions impossible while being monotonic.
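
For the second option, a rough sketch of the idea (not UUIDv7/ULID-compliant, just 64 bits of millisecond timestamp glued to 64 random bits):

  import os
  import time

  def timestamp_random_id() -> int:
      # High 64 bits: milliseconds since the Unix epoch, so IDs sort roughly by time.
      # Low 64 bits: randomness; ordering within the same millisecond is arbitrary.
      ts_ms = time.time_ns() // 1_000_000
      return (ts_ms << 64) | int.from_bytes(os.urandom(8), "big")

  print(f"{timestamp_random_id():032x}")  # 128 bits, hex-encoded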

Grammar-check and clip-art work fine locally. There are local use-cases, but the powerful use-cases are very important.

I mean: Domino's also has a limited staff.

And most of the process is very similar between Domino's and Kroger.

Just pick out a selection of stuff on a website, and order it. They both provide timely status updates of that order. They both have varying staff levels and workloads. They both certainly have days when they're running very far behind, and days when they feel like they don't have much to keep busy with.

They both have pickup and delivery options; sometimes, with different per-item prices, deals, or fees for each option.

But that's where the similarities end.

If a person orders a pizza at 6:05 and it happens to be ready by 6:30, Domino's doesn't make that person wait until 8:00 to pick it up. They want it gone; the sooner, the better. A person can pick it up (in the store, or they'll bring it out to the car) as soon as it is ready. Domino's does not want any queues at all; neither inbound, nor outbound. And this makes sense: They're in the business of selling pizzas, not storing pizzas.

Kroger isn't like that. If a person orders groceries at 6:05 and the order is ready by 6:30, then: They hold the groceries hostage until 8:00. It's as if an otherwise-complete order just isn't ripe to be picked up by a customer until it has had time to purge itself in a waiting area -- regardless of workload. The queue is mandatory, and is governed not by the physical readiness of the order but instead by the clock on the wall.

This is inconceivably stupid and unnecessary. It serves no benefit to me, nor to the corporation, nor to the employees that work for that corporation. One might think that they'd be aware that they're in the business of selling groceries, but this mandatory purgatory shows otherwise.

(I'll betcha McMaster-Carr doesn't sit on stuff while a clock runs. That's a Kroger specialization. :) )


> all the way to the borderlands of active anxiety—not quite understanding what Claude just wrote.

This is a big issue, personally. I write Python and Bash these days, and I love that we're not bottlenecked by IDE-based autocomplete anymore, especially for dynamic languages; a huge amount of fixing and incremental feature work can be done in minutes instead of days thanks to AI being able to spot patterns. Simultaneously, I'm frustrated when these agents fail to deliver small changes and I have to jump in and change something I don't have a good mental model of or, worse still, something that's all Greek to me, like JavaScript.


> If a model draws a really good picture of a pelican riding a bicycle there's a solid chance it will be great at all sorts of other things.

Why?

If I hired a worker who was really good at drawing pelicans riding a bike, it wouldn't tell me anything about their other qualities?!


Aptos has been the default font for Microsoft Word since 2023.

I may live to see it: peak SQLite.

I'm sure they (meaning the administration) have already written the conclusion and are busily making up the "supporting data".

My Sun Ray is back in style! $30 on eBay!
