I'd like to comment on the common "AI is just a tool" thing: I'm a woodworker by training & that means a lot of machines - but almost every craftsperson knows how to do their job with hand tools, or "lesser" machines.
-
@jwcph A similar concern is the ongoing availability of a tool. Building up your workflows around a tool with sustainability issues or one which is solely controlled by subscriptions to one manufacturer has hurt other crafts time and time again. (e.g. Adobe products)
@fundamental @jwcph I am an iOS developer, I *know*
-
@art_codesmith @jwcph @fedithom
1. I think it is safe to say that competent #software engineers know their tools and an early step in any non-trivial project is to gather tools or write new ones if needed. But we don’t (and cannot) write all of them from scratch because it is too much to keep in our heads AND there are smarter people out there who’ve already done the work. We can do what we do only by leveraging the work of others.
2. A tool created by automatic programming is just as useful as one created by a human. If you trust it to work in your use case then an AI-created tool is no different.
3. The question to be answered is the same for any software tool: Why do I trust it? If you are super-rigorous then you will want to use a formal logic-checking tool to prove the software is correct. That’s really hard and computationally intractable for non-trivial software.
4. ALL software contains residual errors, but our ways of justifying trust in software are incomplete and involve some kind of inductive leap that in the best case leaves you with a quantifiable idea of the risk of failure.
#AI is just software. Do with it what you do with any other software.
-
@Ponygirl You know, there's a lot of people who would respond to that with a bunch of hemming & hawing about how useful it can/will be for the right applications - but right now I'd say they bear the burden of proof &, to my knowledge, they're not meeting it.
I'm with you.
@jwcph @Ponygirl
"AI" is not "AI". I hate that "AI" has become the term people use to refer to ChatGPT or Gemini. You have to distinguish LLMs and other genAI that are being hyped by big tech from the kind of AI that's being used in science and has been used in science for decades.
For example, I use a neural network model to denoise my astrophotography.
"AI" should never have been made available to the general public. This is a thing for science and science alone.
-
@fedithom @art_codesmith @jwcph Agreed that this seems like a meaningful distinction; I'm saying that for the vast majority of programmers, compilers fall into the category of 'things without which it's not possible to get any work done'. Writing any machine code at all is a fairly rare skill, and developing non-trivial applications in it is almost non-existent outside of certain specialized sub-domains. This seems to make programming unlike many other arts/crafts, where it's the other way around (only certain specific sub-domains basically require specialized tools; many others are doable by hand by most practitioners).
@fedithom @art_codesmith @jwcph (NB: I don't really know how true this is for other crafts in general as opposed to programming. I would assume that someone adept at digital painting is probably also decent at hand sketching, but also that many/most painters couldn't make their own paints or brushes. So it likely depends on what part of the skill one considers incidental versus essential.)
-
RE: https://mstdn.ca/@drikanis/116107120926277506
I'd like to comment on the common "AI is just a tool" thing: I'm a woodworker by training & that means a lot of machines - but almost every craftsperson knows how to do their job with hand tools, or "lesser" machines.
Similarly, a writer can write without a text editor - just as well, only slower.
If loss of a tool = loss of your skill & knowledge, then that tool isn't an asset, it's a liability. You're signing over your ability to do business to whoever sells & maintains that tool.
@jwcph I've had the benefit of being fairly isolated from this kind of gross over-dependence, and most of the people I've met who use these tools seem to have a realistic grasp on the scope of the problems they're trying to solve.
I'm glad I got to experience some struggle and growth while developing the more difficult skills of my trade before this crutch existed. The temptation NOT to seems to be pretty poisonous.
-
@art_codesmith @fedithom@social.saarland @jwcph
"If you're not mining and refining the materials and building the chips..."
You're conflating different scenarios to the point of absurdity.
-
@ricardoharvin @jwcph Maybe? I don't know. It was definitely not my intention.
Maybe I've read too deep into this but, for me, writing in assembly is the best analogy for woodworking with manual tools.
Using a high-level language would be like working with well-developed power tools.
(Using AI... well, the advocates think that it's like working with a super-fancy programmable machine, but the motors are busted and the tolerances are between "frick" and "all".)
-
@ninafelwitch @Ponygirl I don't think *we* have to distinguish. Whatever useful tools scientists & other highly specialised people have which technically fall under this, you guys can & will keep alive regardless of the flak rightfully directed at the hype version & its downfall (hopefully) - just as long as *you* remember to distinguish, which sadly isn't always the case; I work at a tech institute (though not an engineer/scientist myself) where they don't & it's causing real problems...
-
@jwcph the employee who focuses on making himself indispensable and irreplaceable is one you must terminate.
-
Dump trucks are a tool. If we lose dump trucks, then we no longer have the ability & skills required to move large loads of gravel. Therefore, according to the general principle cited ("If loss of a tool = loss of your skill & knowledge, then that tool isn't an asset, it's a liability"), it follows that dump trucks are a liability.
But, of course, dump trucks are not a liability. They make it possible to do what we could not do before. Same with LLMs.
-
@Downes So, you're an idiot?
@jwcph What kind of response is that?
-
@jwcph That just validates my opinion on LLMs: they are just a tool, and if you can't code without them you shouldn't depend on them in the first place.
In a way, they are a multiplier: they can make a good coder more efficient, but for someone that doesn't know what they're doing they will just result in a lot more bad output. Just like an efficient saw can help a good woodworker, but also result in a lot more wood scraps if used by an unskilled one.
I do agree that the reliance on a handful of companies is bad, though. Since it takes so many resources, it's not like just anyone can build a decent LLM, so the competition just isn't there - unlike other tools, where there are usually many good options (including more ethical ones...)
-
@keengrasp So... people are tools to you?
-
@jwcph It's a good point, but I'm totally using AI to do programming things that I have the ability to do as a programmer. It's a tool.
-
@jwcph @thomasfuchs @drikanis great point – a tool should enhance your skill, not supplant it!
-
We used to have a bookcase in every software house with tech docs on it, and we normalised that being replaced with online docs.
Then Google-fu became important, and knowing the right question to form a query around became a good skill for a young dev.
So we got to the point where devs developed skills in asking good questions and identifying good answers. The issue is that LLM answers don't have the peer review of Stack Overflow, yet get trusted like official docs.
-
@Wifiwits @jwcph I am aware that dump trucks are not the same as LLMs. However, the original statement said "If loss of a tool..." and both dump trucks and LLMs are tools.
The original principle sounds appealing, but the appeal comes from its generality: it is too broad. It captures too much. So we have to ask: why would this principle apply to LLMs if it doesn't apply to dump trucks?
Try not to respond with insults. It's far more interesting to actually engage with the point being made.