@RogerBW @futurebird related: people cheering on the iranian military after they hit AWS data centers:
https://bsky.app/profile/richard.wickedproblems.earth/post/3mgf7hc3vcs2s/quotes
like, they're not only noticing that it is valid & legal to target enemy data centers during wartime; there's also tons of people writing words to the effect of "dear iran, please hit more of our data centers."
@RogerBW @futurebird i used to think i was anti-war, but really, i was always just anti-aggressor.
When you live in the aggressor country, being anti-aggressor usually translates to political demands to stop fighting. This was true for the u.s.-initiated war in Vietnam; the u.s.-initiated war in Iraq; the 2nd u.s.-initiated war in Iraq; etc.
But this time, i don't just want the fighting to stop; i also want the u.s. military to lose, because a "victory" would mean that evil has won.
-
How?
-
RE: https://infosec.exchange/@hacks4pancakes/116192434654015384
I have been watching this story simmer for several days. I've been wary of it. It fits too neatly into the criticisms and warnings many of us have been raising. But it's starting to look like, yes, they are using an LLM to make critical decisions.
At the same time I have heard parts of speeches from the US Secretary of War.* I have been dismayed by his shallow thinking. It doesn't help that his speeches sound like they were also composed by an LLM.
*formerly Defense
@futurebird If AI was used then all the directors of the company (Anthropic?) should be tried for war crimes.
-
@gotofritz @futurebird the point is not whether the LLM selected a target that a human wouldn't have selected. The point is that an LLM was used for target selection. Even if the selection had been appropriate, that would have been a game changer. The fact that it was inappropriate just highlights the problem; it's not *in itself* the problem, which is the manifestly reduced role of human decision making in lethal strikes.
And what is the "benefit" of reducing the human friction in such decisions?
It's easier to kill 1000 people by pushing a button than it is to look in their eyes and do the deed with your own hands.
But the results are the same, and as a war crime the button pusher is just as guilty. (And the people who set up the button, knowing what it could enable are guilty too.)
-
@RogerBW @futurebird (to be clear, that's also what victory meant/would-have-meant for the previously mentioned wars; it's just that now i'm more aware of it.)
-
I was not kidding in this post. I will never forgive the people who did this.
|
https://sauropods.win/@futurebird/113866111349182003
-
@futurebird If they did, was their targeting overall more or less accurate? The war in Afghanistan was plagued by constant civilian casualties due to either malice or error.
-
https://sauropods.win/@futurebird/113866111349182003
@futurebird Agreed. And no-one should be able to claim, "It wasn't me, the AI did it".
-
@JamesWidman @futurebird Also "just because side A are bad guys, which we can all agree on, that doesn't make side B good guys." That falsity is so fundamental to media presentations of any sort of conflict or disagreement that I don't think it's salvageable.
-
@gotofritz @futurebird a technology that enables even greater inhumanity at scale, that enables even more misleading guidance, that acts human and intelligent to garner trust and hype, that generates believable results that feel thoughtful and not just factual—that encourages people to disconnect further from the sources of that information and be less thoughtful—is not the same as a “Python script or a Google search.” It becomes something new and multiplying. The animosity is warranted and rational.
-
But the problem with "serious study" of international politics is that when you learn more about a people, a culture, you start to like them. You start to *become* like them.
Any tool that enables killing without such engagement is putting us on a fast track to atrocity.
Punch cards. Such a wonderful technology. (I collect punch cards and have always been fascinated by them.) I can also never forget how they provided emotional distance for one of the greatest genocides.
@futurebird @gotofritz the Jacquard loom was one of the Luddites' targets for destruction.
-
My disgust for war, for the military, is not the result of being sheltered.
I hate your wars because I know exactly what they are and how when violence occurs it reverberates for generations.
Someone has lost a daughter and with her many of their reasons for living. What grim task will they devote themselves to?
What would you do, Pete, if they killed your family? Do you think you are the only person who loves their children?
I think you must.
I am not convinced anyone inside the MAGA movement views anyone outside their movement as human. They see everyone opposed to them as NPC obstacles to be mowed down with gunfire and bombed into oblivion.
-
@futurebird I think his speeches sound LLM-generated because a bad speechwriter would use an LLM.
-
@futurebird ah, but that's where you're mistaken.
Kegsbreath, the Fuhrer, etcetera? They all view children as nothing more than property and a prop. Any kid that isn't bringing in billions for them is worthless. Any *family member* that isn't actively supporting Nazism is worthless.
They genuinely do not care in the least about them. Just as religion is nothing but a tool or justification for Nazism.
-
@futurebird By using the word 'targeting' you make it sound as if the harm to that school was deliberate. Any proof of that? Or is this just more uncritical Israel-bashing?
-
@gotofritz @futurebird that was a pile of nothingness without an argument in it other than “you’re a poo poo head who I disagree with.”
Talk about the content of my statements, not what you make up about what’s in my head.
I use LLMs extensively and have a background in machine learning and computer science; I know exactly what these tools are. That is the source of my evaluation. What’s yours?
-
The impulse to use an LLM, or other shortcuts, to make such decisions is indicative of a lack of serious interest in the details of the situation.
A level of study and interest that ought to be present for decisions that kill so many people.
I think that's why it matters.
@futurebird @gotofritz also if a decision to make a strike, and a decision about the coordinates of that strike, is made solely on the basis of information provided by an intern or by some Python script cobbled together without very stringent quality control, then the heads of everybody involved in the decision to rely on that intern or Python script for such purposes should roll. I don't see why it should be different with an LLM.
(I'm not saying that wars are good or that conducting strikes on the basis of more reliable information is good. But this specifically is such an obvious violation of all possible norms of conducting wars that it's only possible if those conducting the war don't give even the slightest shit about civilians.)
-
Right-wing war hawk commentators are saying that hitting a civilian target, a school filled with little girls, is "good strategy, actually"; we, the soft-handed peaceniks, are simply not smart enough to understand the strategic power of this action.
But if the Department of War won't say, it remains unclear whether this atrocity is based on incompetence or dim-witted malice.
update: We can infer that they "missed"
https://shakedown.social/@AAronL1968/116193541496761440
@futurebird "Better to boast of atrocities than to apologize for errors" is so familiar a pattern of authoritarian thinking that I honestly can't tell yet whether this was an intentional hit based on bad intelligence, an intentional hit driven by indifference to the quality of inputs, or a cruel accident of war that placed a bomb where it wasn't intended to be. The idiotic noises coming, allegedly, from the SecDef add no clarity.
-
@futurebird I'd be very surprised if he felt love for his children. Ownership, yes. Love, no.
-
@gotofritz @futurebird I don't believe these are arbitrary, biased statements; they are characteristics unique to an anthropomorphized chat interface to a language model. It is not the same.
Thank you for arguing seriously. Though I still respectfully disagree, I believe your own bias toward treating all technology as the same on the spectrum of harm hides critical specificity and gives a much less truthful account of its impact.