ori_b's comments | Hacker News

Yes. There's more than one thing that needs to change if we're going to make it through this.

Slop.

Edit: And if you believe in AI so strongly that you can't be arsed to write your own articles, I don't see why you wouldn't just ask it to do the (obviously AI-generated) optimization in the first place and not worry about the code.


> What do I do with those reports? Ignore them? Fix the bugs myself? Bleh.

"I don't have access to a test environment, but if you want to write a fix, let me know and I may be able to point you in the right direction" is a perfectly reasonable response.


This, a hundred times. The expectation shouldn't be that you must maintain every scenario, environment, and edge case.

It's also reasonable to say, "I simply don't use this software the same way you do." The individual is empowered to fork, contribute, and/or collaborate.


Or we can stop putting everything on the internet as a vector for enforced enshittification.

> no WordPress code was used to create EmDash.

Oh, neat. Which model wasn't trained on WordPress?


If they're taking on verification, are they also taking on liability? Do we get to sue them if grandma gets scammed through an app they allow onto their phone?

Yeah, let's hold Google accountable. Is there a way to enforce antitrust laws?

Nice try, but no, shit only flows down.

If you drop the premise of writing, drop the premise that you need something well written. Just give me the same information you would have given the LLM.

But a poorly written prompt is not a good prompt. What are you really going to do with a shit prompt? It's meta: we need better writers all the way down.

Whatever the prompt is, it is still the only information of value, since it reflects the actual decisions made.

Everything coming out of an LLM, on any prompt, is either someone else's decisions or the same thing reworded in a different way.


Yes. But if it's good enough for an LLM it's good enough for me.

If you really feel the need, you can attach the LLM output as an appendix. I probably won't read it.


Do you really want five minutes of audio of me rambling, then some instructions for how to split it up and organise it?

Plenty of people use LLMs to make text longer, but writing a short, accurate text with the essential points is much harder.


What is the difference between you putting your 5 minute monologue into the LLM to summarize it versus me doing it?

I know what I'm trying to say, so I can sanity check the output. You can't, unless you listen to the monologue.

That's why I disagree with people who say "just give me whatever you gave the LLM." That's only useful if you, the writer of the prompt, have no intention of looking at the LLM output before sending it.


I can run it through voice recognition just fine.

Do you really want to read the whole conversation between the author and the computer? I don't use AI to write prose, but if I did I'd treat it like a critical editor, so reading all of that would not save you time.

Any time I've stumbled on AI writing, whether in comments, at work, or in articles, it was painfully obvious that not a single person had read it, including the author.

My website, which serves git and only works from Plan 9, is serving about a terabyte of web traffic monthly. Each page load is about 10 to 30 kilobytes. Do you think there's enough organic, non-scraper interest in the site that scrapers are a near-zero part of the cost?
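A quick back-of-the-envelope check on what those numbers imply (assuming a decimal terabyte and the 10 to 30 KB per-load range stated above; the figures are from the comment, not measured):

```python
# How many page loads does 1 TB/month imply at 10-30 KB per load?
TB = 10**12                 # decimal terabyte, in bytes
monthly_bytes = 1 * TB
low_kb, high_kb = 10, 30    # per-page-load size range, in KB (decimal)

loads_high = monthly_bytes // (low_kb * 1000)   # smaller pages mean more loads
loads_low = monthly_bytes // (high_kb * 1000)   # larger pages mean fewer loads

print(f"{loads_low:,} to {loads_high:,} page loads per month")
```

That works out to tens of millions of page loads a month, which is the point of the rhetorical question: organic interest in a niche Plan 9 git site can't plausibly account for that volume.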

> We aren't losing code; we are making the ability to code a universal human "literacy."

The same way that doordash makes kitchen skills universal.


You say it like it's a bad thing.

I say that like it's a thing. LLMs have the goal of replacing intellectual work with passive consumption. People seem to like that.

Basically, the selling point of LLMs is that you no longer need to think about problems, you can skip directly to results. Anything that you have to think about while using them today is somewhere on the product roadmap, or will be.

Many people think this is a form of utopia.


Just like "computer" is no longer a job description, yes.

No, they are saying it like the comparison doesn’t hold. Which it doesn’t.

Why aren't they delivering 4x more work? Does the world no longer need software?

