Ask HN: Concerns around knowledge loss from vibe coding?
I’ve been spending the day playing with Claude 3.7, trying the vibe coding approach, and I’ve been able to build some complex one-off audio synthesis applications.
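To make "one-off audio synthesis application" concrete, here is a minimal sketch (my own illustration, not the poster's actual code) of the kind of throwaway tool an AI assistant can produce in one prompt: writing a one-second 440 Hz sine tone to a WAV file using only Python's standard library.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second (CD quality)
FREQ = 440.0         # A4 pitch in Hz
DURATION = 1.0       # seconds of audio to generate

def sine_samples(freq, duration, rate):
    """Yield 16-bit signed integer samples of a sine wave."""
    for n in range(int(duration * rate)):
        t = n / rate
        yield int(32767 * math.sin(2 * math.pi * freq * t))

# Pack the samples as little-endian 16-bit mono PCM and write a WAV file.
with wave.open("tone.wav", "wb") as wav:
    wav.setnchannels(1)          # mono
    wav.setsampwidth(2)          # 2 bytes = 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    frames = b"".join(struct.pack("<h", s)
                      for s in sine_samples(FREQ, DURATION, SAMPLE_RATE))
    wav.writeframes(frames)
```

The point of the post stands either way: code like this works on the first run, yet you can use it without ever learning what sample rates, PCM framing, or sine-wave phase actually mean.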
I have no understanding of how any of it works. That sparked a lot of fear in me, not so much of job loss as of knowledge loss.
As these models improve, I’m worried we are going to build larger and larger systems we do not understand. It’s hard enough as it is to read code written by humans, and I worry this will only be exacerbated by ever-improving models.
Personally I think it's kind of the opposite. AI is like having a personal assistant that aced the first two years of undergrad courses in every subject under the sun, but isn't capable of original reasoning on the frontier of knowledge. AI rewards users who have a good high level map of different areas of knowledge and the verbal/analytical skills to ask good questions. It used to be that to approach an engineering problem in an unfamiliar technical domain, it might take weeks or months of background study to even know what search term to use. I've found that AI reduces that knowledge bootstrap to hours or days. I've been able to work in problem domains I never would have touched otherwise.
I dislike the name "vibe coding". I would call it "AI programming" or "programming with natural language and an AI agent" or anything other than "vibe" anything.
I'm not giving it vibes, I'm giving it specifications.
So much of our culture and society seems to be aimed at appealing to 12 year olds.
Gen X calls it "in flow". The appeal with vibe coding is it's one of the few ways you can be in flow when programming.
LLM Assisted Programming, LAP. But I'd rather call it leap coding, because that's how it feels, like a quantum leap.
> So much of our culture and society seems to be aimed at appealing to 12 year olds.
this. totally agree, no cap ong.
ohio.
Not at all. I don't use AI for anything, so when everyone has become so dependent on the magic chat box to do everything for them and it falls short I'll have lots of work to do fixing it all.
I don't see an inherent difference between this and what you do as a product manager when you assign a developer to implement something for you. You only really need to understand the tech when it becomes a constraint on moving forward quickly.
If you didn’t understand how the code works to begin with, what knowledge did you lose?
You lost the opportunity to learn it, which you would have had if you'd done it the normal way.
Let's back up a little. Where the hell are you working, and why are they letting you dump proprietary company code into a third-party cloud application, and paste the result into your code base? What are you putting into the copyright header at the top of the file?
AI generated code is radioactive. I cannot even upstream anything in my open source projects if it's AI generated. Licensing has to be crystal clear. The person has to have the absolute rights to the code that they're trying to contribute.
I think it won't be any different from how we use different software libraries we don't understand.
As a general rule, this already seems to be the case.
I generally have no understanding of how the libraries I use work. Many companies out there have software running whose specific behaviour nobody at the company can explain. I worked for a company that had on-premise servers, and for one product nobody was sure where the physical server was.
Heck, you can even document deployment procedures and still lose them in some wiki migration mishap without anyone noticing. At least someone might notice that the server is gone; that's less likely with rarely consulted documentation.
So... try understanding the code? It's still just code, and if you don't understand it, you might learn something.
Right, the code is right there, and you can ask the same AI to explain it to you, that is, if you can't read the syntax yourself.