Having so much fun with this vibe coding: what used to take me two or three hours can now be done in a single day.
-
In the ‘90s there was a huge push in software engineering towards component models. COM and CORBA both came out of this. The idea was to build libraries as reusable blocks. Brad Cox wrote a lot about this and created Objective-C as a way of packaging C libraries with late-bound interfaces that could easily be exposed to higher-level languages.
This combined with the push towards visual programming, where you’d be able to drag these libraries into your GUI and then wire things up to their interfaces with drag-and-drop UIs. The ‘Visual’ in Visual Studio is a hangover from this push.
Advocates imagined stores of reusable components and people being able to build apps for precisely their use case by just taking these blocks and assembling them.
It failed because the incentives were exactly wrong for proprietary COTS apps. Companies made money by locking people into app ecosystems. If it’s easy for someone to buy a (small, cheap) new component for Word 95 that adds the new feature they need, how do you convince them to buy Word 97?
The incentives for F/OSS are the exact opposite. If another project can add a feature that some users want (but you don’t) without forcing you to maintain that code, everyone wins. But we now have an entire generation that has grown up with big monolithic apps who copy them in F/OSS ecosystems because it’s all they’ve ever known.
@david_chisnall
There are more problems with components than just monetization. Plug-in style extensions add extra layers of complexity for both developers and users. End users have to source and manage their plug-ins. Developers often build their plug-in for only one operating system or one version of the application, then abandon it.
There are good technical and social reasons for projects (such as the Linux kernel) to use a monolithic model.
-
@gotofritz @wakame @futurebird yes, after AI dies. What of it?
Dude it's not going to die, it's not bitcoin
-
Dude it's not going to die, it's not bitcoin
"bitcoin's not going to die, it's not like the dotcom bubble. The blockchain is a real new technology with endless applications, this is nothing like the hype over having webpages ..."
-
"bitcoin's not going to die, it's not like the dotcom bubble. The blockchain is a real new technology with endless applications, this is nothing like the hype over having webpages ..."
During the dotcom bubble you had all these people who just invested in anything with the right buzzword, "dot com". They didn't really understand the tech and it was easy to fool them. But this is totally different.
-
Dude it's not going to die, it's not bitcoin
@gotofritz @wakame @futurebird cool, you wanna buy some of these Beanie Babies?
-
@gotofritz @wakame @futurebird cool, you wanna buy some of these Beanie Babies?
Don't be childish
-
Don't be childish
@gotofritz @wakame @futurebird I got tulip bulbs, too. Ugly Monkey Jpegs (metadata only)?
-
@david_chisnall
There are more problems with components than just monetization. Plug-in style extensions add extra layers of complexity for both developers and users. End users have to source and manage their plug-ins. Developers often build their plug-in for only one operating system or one version of the application, then abandon it.
There are good technical and social reasons for projects (such as the Linux kernel) to use a monolithic model.
@wickedsmoke @david_chisnall @futurebird
The Dynamic Link Library was a recipe for bit rot: perfectly functional applications stop working because someone else decided a component they depended on wasn't worth maintaining.
-
During the dotcom bubble you had all these people who just invested in anything with the right buzzword, "dot com". They didn't really understand the tech and it was easy to fool them. But this is totally different.
Exactly. And twenty years later here we are, on the internets, sharing our thoughts. Because the dot com bubble was just a temporary phenomenon.
AI is just the same, OpenAI may go under but the technology is going nowhere
-
Exactly. And twenty years later here we are, on the internets, sharing our thoughts. Because the dot com bubble was just a temporary phenomenon.
AI is just the same, OpenAI may go under but the technology is going nowhere
@gotofritz @futurebird @wakame
This time it's different. Right.
-
Dude it's not going to die, it's not bitcoin
@gotofritz @pikesley @wakame @futurebird everything within a culture is a choice. Technology is never inevitable nor permanent.
-
@gotofritz @wakame @futurebird I got tulip bulbs, too. Ugly Monkey Jpegs (metadata only)?
@pikesley @gotofritz @wakame @futurebird
I say this as someone who's really unhappy about AI: I also can't really see how it's going to go away.
-
@pikesley @gotofritz @wakame @futurebird
I say this as someone who's really unhappy about AI: I also can't really see how it's going to go away.
@datarama @gotofritz @wakame @futurebird cool, just off to take a transatlantic flight on a Zeppelin. I'll be sure to pack my eight-track tapes
-
In the ‘90s there was a huge push in software engineering towards component models. COM and CORBA both came out of this. The idea was to build libraries as reusable blocks. Brad Cox wrote a lot about this and created Objective-C as a way of packaging C libraries with late-bound interfaces that could easily be exposed to higher-level languages.
This combined with the push towards visual programming, where you’d be able to drag these libraries into your GUI and then wire things up to their interfaces with drag-and-drop UIs. The ‘Visual’ in Visual Studio is a hangover from this push.
Advocates imagined stores of reusable components and people being able to build apps for precisely their use case by just taking these blocks and assembling them.
It failed because the incentives were exactly wrong for proprietary COTS apps. Companies made money by locking people into app ecosystems. If it’s easy for someone to buy a (small, cheap) new component for Word 95 that adds the new feature they need, how do you convince them to buy Word 97?
The incentives for F/OSS are the exact opposite. If another project can add a feature that some users want (but you don’t) without forcing you to maintain that code, everyone wins. But we now have an entire generation that has grown up with big monolithic apps who copy them in F/OSS ecosystems because it’s all they’ve ever known.
I think it also failed because it was difficult to describe the contract that the components provided in enough detail to be useful.
I think a lot of the success in the use of LLMs in programming comes as the realization of '80s-era software reuse: the LLM is able to pattern match the user's needs against the software approaches it has encountered in its omnivorous tour of published material.
(Mind you, a lot of people do it sloppily, but “90% of everything is crap”)
-
Having so much fun with this vibe coding: what used to take me two or three hours can now be done in a single day.
-
@pikesley @gotofritz @wakame @futurebird
I say this as someone who's really unhappy about AI: I also can't really see how it's going to go away.
@pikesley @gotofritz @wakame @futurebird
I mean, perhaps you're right. I hope you are, because I *hate* how this tech has enabled and empowered the most dystopian goons in the world. I have never experienced any technology that, to that extent, made good people miserable and terrible people gleeful.
I ignored bitcoin and NFTs, but I *can't* ignore AI. There are people constantly telling me that they want to replace me with AI. There are people reminding me that AI doesn't get sick and doesn't go on vacation. And, well, sure - AI can do *some* parts of my job. I don't know if it'll ever be able to do all of it - but it's making inroads in the parts I enjoy most and the parts that my brain is best for, so - well, I worry a lot about my future. If my job doesn't go away, at least it becomes a much more miserable experience.
A technology that serves as a successful psychological terror campaign against skilled knowledge workers is *not* going to just disappear unless there's some reason it does so. If you have such a reason, I'd love to hear it - perhaps you're right, and I hope you are. But I can't see it, much as I wish I could.
-
@pikesley @gotofritz @wakame @futurebird
I mean, perhaps you're right. I hope you are, because I *hate* how this tech has enabled and empowered the most dystopian goons in the world. I have never experienced any technology that, to that extent, made good people miserable and terrible people gleeful.
I ignored bitcoin and NFTs, but I *can't* ignore AI. There are people constantly telling me that they want to replace me with AI. There are people reminding me that AI doesn't get sick and doesn't go on vacation. And, well, sure - AI can do *some* parts of my job. I don't know if it'll ever be able to do all of it - but it's making inroads in the parts I enjoy most and the parts that my brain is best for, so - well, I worry a lot about my future. If my job doesn't go away, at least it becomes a much more miserable experience.
A technology that serves as a successful psychological terror campaign against skilled knowledge workers is *not* going to just disappear unless there's some reason it does so. If you have such a reason, I'd love to hear it - perhaps you're right, and I hope you are. But I can't see it, much as I wish I could.
@pikesley @gotofritz @wakame @futurebird Bonus: It *also* precaritizes art, literature, and everything else the oligarchs hate and fear.
-
Just like any other tool, you need time to learn how to get the best out of it. How much time did you spend with it?
@gotofritz @futurebird The *whole* selling point of chat-based systems is that you need no prior knowledge of the system. Otherwise it’s just another system to learn. There are several folktales about this, but the one that first comes to mind is Stone Soup.
-