the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
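the sigmoid-vs-exponential confusion is easy to show numerically. a minimal sketch (the constants `L_cap`, `k`, `t0` are arbitrary illustrative values, not from any real growth data): a logistic curve is nearly indistinguishable from a pure exponential until it approaches its inflection point, after which the two diverge wildly.

```python
import math

# Logistic (sigmoid) curve: f(t) = L / (1 + exp(-k * (t - t0))).
# Early on (t << t0), the exponential term dominates the denominator,
# so f(t) ~= L * exp(k * (t - t0)) -- i.e. it looks exactly like
# unbounded exponential growth until near the inflection point t0.

L_cap, k, t0 = 1000.0, 0.5, 20.0  # arbitrary illustrative constants

def logistic(t):
    return L_cap / (1 + math.exp(-k * (t - t0)))

def exponential(t):
    # pure exponential matched to the logistic's early-time behavior
    return L_cap * math.exp(k * (t - t0))

for t in [0, 5, 10, 15, 20, 25]:
    rel_diff = abs(logistic(t) - exponential(t)) / logistic(t)
    print(f"t={t:2}: logistic={logistic(t):10.3f}  "
          f"exp={exponential(t):12.3f}  rel. diff={rel_diff:.1%}")
```

at t=0 the two curves agree to within a fraction of a percent; by t=25 the exponential has overshot the logistic by more than an order of magnitude. extrapolating from the early data alone, you cannot tell which curve you are on.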
-
the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
@glyph my assertion was that the singularity, as described by ray kurzweil, accurately describes the invention of writing, and i don't see why it would be more interesting if the self-improving intelligent mechanism were made of etched silicon instead of CHNOPS nanomachines. it is harder for etched silicon to self-reproduce, anyway. the CHNOPS nanomachines just do that.
i think human advancement *has* followed an exponential-*looking* curve since that point, albeit with a low base.
-
in order to be a singularity candidate, an AI would need to achieve vertical integration from silicon fabrication through logistics and integration, into operating systems and applications, with tight whole-system feedback from the robotics to the shipping to the power generation and back
we are not even remotely close to a single LLM meaningfully constructing even a portion of the pipeline to train another LLM. you can sort of argue around the edges that maybe under certain synthetic conditions this is borderline possible now, but on the "singularity" progress bar, that is 0.5%
-
we are not even remotely close to a single LLM meaningfully constructing even a portion of the pipeline to train another LLM. you can sort of argue around the edges that maybe under certain synthetic conditions this is borderline possible now, but on the "singularity" progress bar, that is 0.5%
if, in order to achieve your out-of-control doomsday robot scenario, a trillion dollars' worth of human effort must be expended annually, and if any of it stops for even a moment then the whole thing implodes and grinds to a halt, _you can stop worrying_ that it is "the machines" which dominate us
-
if, in order to achieve your out-of-control doomsday robot scenario, a trillion dollars' worth of human effort must be expended annually, and if any of it stops for even a moment then the whole thing implodes and grinds to a halt, _you can stop worrying_ that it is "the machines" which dominate us
doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day
-
it is so mind-meltingly frustrating to see people think that we are close to a "singularity" with current AI technology. here's a hint about when you could worry about a disruption so big that it might, even momentarily, *appear* to be a singularity:
a single corporation turning a profit even once
@glyph Ah yes, the Singularity: a thing that its religious adherents can't define but which will almost certainly be ushered in by chatbots that tell you to put glue on pizza.
Put me on Artemis III, I'm done here.
-
the idea that a "singularity" is possible is just the idea that you can turn "mistaking a sigmoid for an exponential" into a millenarian religion
tbf, it made for some *great* science fiction in the 90s.
-
doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day
RE: https://mastodon.social/@glyph/115076275195904439
I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy
-
RE: https://mastodon.social/@glyph/115076275195904439
I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy
put ME on CNN and MSNBC, you cowards.
-
tbf, it made for some *great* science fiction in the 90s.
@suetanvil WHY CAN OUR GENERATION'S SUPPOSEDLY GREATEST MINDS NOT DISTINGUISH BETWEEN REALITY AND FANTASY
-
@suetanvil WHY CAN OUR GENERATION'S SUPPOSEDLY GREATEST MINDS NOT DISTINGUISH BETWEEN REALITY AND FANTASY
@suetanvil it's ruining my ability to appreciate the fantasy!!!
-
if, in order to achieve your out-of-control doomsday robot scenario, a trillion dollars' worth of human effort must be expended annually, and if any of it stops for even a moment then the whole thing implodes and grinds to a halt, _you can stop worrying_ that it is "the machines" which dominate us
@glyph Maybe the real singularity was the <s>friends we made along the way</s> black hole we dumped all our cash into
-
put ME on CNN and MSNBC, you cowards.
@glyph That would be such an incredible improvement over their current coverage *and* I would pay to see that. Both things can be true.
-
put ME on CNN and MSNBC, you cowards.
@glyph i would pay for that.
-
doomers might look at my rant here and think, "but wait, once it's self-sustaining, even a little, it's TOO LATE, it's already out of control!!!" and to that I say: no. not even close. look at the evolution of *any* business. managing resource flows is really hard. there is an off-ramp every single day
@glyph Ants are self-sustaining, self-reproducing, more intelligent than any AI humans have managed to make, and capable of directly altering the physical world.
If 'self-sustaining' were really the break-point, humans lost well before we existed as a species.
-
put ME on CNN and MSNBC, you cowards.
like if anyone had halfway-plausible "grey goo" nanotech that could do anything that looked like computation, that might be worrying. a locally viable self-reproducing platform that can make another one of itself from a pile of dirt, even if it's like, special dirt, that might scare me a little bit. but an overlord hive-mind that requires an uninterrupted global high-purity helium supply chain just to make ONE more of itself is supposed to be a threat?
-
RE: https://mastodon.social/@glyph/115076275195904439
I've written about this before and I will probably do it again. but I don't know what else to do but repeat myself when allegedly serious, internationally-renowned academic experts and influential public intellectuals are just going out there and saying stuff that would get you laughed out of a late night freshman dorm room conversation about philosophy
@glyph Or any other subject, really.
Even in STEM.
Like the introductory biology class I took with its toy population models that went sigmoid very quickly, simply because biologists understand that populations of living things hit barriers to growth. Or the control systems engineering class I took, where we figured out how to tell which parts of the system behavior would be good over the long term, which (to oversimplify only slightly) meant *no positive exponentials* anywhere in the math.
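the control-systems point above can be shown in a few lines. a minimal sketch (the coefficients and step size are arbitrary illustrative values): a linear feedback system x' = a*x blows up exponentially whenever a > 0 and settles whenever a < 0, which is why long-term "good behavior" means no positive exponentials anywhere in the math.

```python
def simulate(a, x0=1.0, dt=0.01, t_end=10.0):
    """Euler-integrate x' = a*x and return the final state.

    a > 0 is the runaway "positive exponential" mode; a < 0 is the
    stable mode that decays toward equilibrium.
    """
    x, t = x0, 0.0
    while t < t_end:
        x += a * x * dt  # one Euler step of x' = a*x
        t += dt
    return x

print(simulate(-0.5))  # decays toward 0: stable
print(simulate(+0.5))  # grows without bound: the runaway exponential
```

flipping the sign of a single coefficient is the entire difference between a system engineers would certify and one they would reject; real-world growth processes keep getting handed negative coefficients by resource limits.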
-
like if anyone had halfway-plausible "grey goo" nanotech that could do anything that looked like computation, that might be worrying. a locally viable self-reproducing platform that can make another one of itself from a pile of dirt, even if it's like, special dirt, that might scare me a little bit. but an overlord hive-mind that requires an uninterrupted global high-purity helium supply chain just to make ONE more of itself is supposed to be a threat?
seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?
-
seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?
casual thinkpieces and lazy attempts at scicomm are what has set me off but the actual thing I'm mad about is that we are ruled by people with a child's understanding of the world and the economy and that's actually really bad
-
seriously just imagine the plot of one of the movies that doomers seem to think are documentaries, like Terminator 2. imagine the scene where the T-1000 is getting pelted with bullets. instead of seamlessly autonomously healing, imagine it has to lie down and wait for a human to place an order for $1,000,000 of NVIDIA GPUs to be delivered in a shipping container and then a construction crew to set up a methane generator to run for two weeks straight before it got up again. is that still scary?
@glyph Meanwhile, the actual potential (but mitigatable) doom - the methane generators poisoning the air and worsening the severity and frequency of climate disasters, the technofascists spending obscene amounts of money undermining governments and trying to radicalize large parts of the population, while burning resources at a rate only a captain planet villain could find reasonable, etc. - goes largely unremarked upon -_-;
-
@glyph Meanwhile, the actual potential (but mitigatable) doom - the methane generators poisoning the air and worsening the severity and frequency of climate disasters, the technofascists spending obscene amounts of money undermining governments and trying to radicalize large parts of the population, while burning resources at a rate only a captain planet villain could find reasonable, etc. - goes largely unremarked upon -_-;
@glyph so much energy, and so many articles, are going into the scifi-fanfiction doom that it's seemingly crowding out the actual, tangible, presently-addressable, imminent problems that have fuck-all to do with chatbot pseudogods, and everything to do with the people building them.