💀 Poison 👏 your 👏 data ⚠️
-
@alice I like to select wrong answers on captchas until I get bored.
Unfortunately, I just cannot help clicking every square that has even the tiniest portion of a traffic light, zebra crossing, or motorcycle, even if I know that's going to make me fail the test and have to do it again.
-
@not_a_label @alice Sampling just one of my several domains, I count 330.
-
Not a good idea to poison Data - last time someone did that, he wrote bad poetry.
-
@veronica I still chuckle at this one:
-
@rabidchaos @flesh @alice @aj
If it's treating that null as a proper null, there's a good chance there are constraints in place that'll fail, and the app won't even check the failure... which can be fun, or not, depending on whether it counts you as logged in after you submit the form.
-
@mason Very nice

-
@alice thanks, guess I have a new weekend project!

-
The goal is to make corporate data less profitable.
Even stuff as simple as setting your birthdate to 1970-01-01 everywhere, adding [TEST] or [DELETED] as your name or account notes anywhere you don't need them to know your name.
Using plugins like AdNauseam to poison ad trackers (and cost them marketing dollars).
Using VPNs set to different locations.
Signing into data broker sites to "correct" outdated info (they'll often let you do that with little-to-no proof of identity, but will require your passport or state ID in order to delete your info). Bonus points if you correct it to someone else's info on their site that's similar to yours.
When you sign up for anything, fill in only the required fields, and provide correct info only where it matters for using the service; otherwise provide plausible but incorrect data.
If you use LLMs anywhere, use the free tier and always vote thumbs up for bad answers and down for good ones. It wastes their resources and drives up their costs while making their training data worse.
@alice I've got an insidious one, I may end up working on an ecommerce thing with a friend selling parts.
This will involve a lot of compatibility data, partly scraped from supplier catalogs, partly from human knowledge and testing on older vehicles where there isn't easily available anything.
Obviously we don't let the machines have that, and we can subtly scramble it. We can help make sure AI is the dumbest failure of a mechanic there ever was, and sells people the wrong spark plugs.
-
It should be noted that there will be something similar to the Year 2000 Problem in 2038: the common way to represent time, seconds since 1970-01-01 00:00 UTC stored as a signed 32-bit number, will wrap around and make computers think they're in the past.
Hopefully(?) we learned from Y2K and are preparing for that event already.
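For the curious, the exact rollover moment follows from the definition above. A minimal sketch (JavaScript dates count milliseconds, so the arithmetic just scales seconds up by 1000):

```typescript
const INT32_MAX = 2 ** 31 - 1; // largest signed 32-bit value

// Last representable second for a signed 32-bit Unix timestamp.
const overflow = new Date(INT32_MAX * 1000);
console.log(overflow.toISOString()); // 2038-01-19T03:14:07.000Z

// One second later the counter wraps to -2^31, landing in 1901 --
// "the past" the post mentions.
const wrapped = new Date(-(2 ** 31) * 1000);
console.log(wrapped.toISOString()); // 1901-12-13T20:45:52.000Z
```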
-
@alice
Agreed on all points except one: if you're providing incorrect data to poison the data broker's systems, please don't just type in a "random" email address unless you're confident that it's not someone's real email address. On any given day, I receive about a dozen emails from various websites where an email address was required for registration, and someone typed in my email address while providing their "fake" info. Pizza order receipts, airline flight confirmations, golf tee time registrations, etc.
The worst part is that these are misdirected, but otherwise legitimate emails, so I can't just mark them as spam, because that will poison the spam detection algorithm's dataset.
So yeah, if you're gonna type in a fake email address, please make sure that it doesn't belong to someone first, and the easiest way to do that is to use a nonexistent domain, preferably one that no one would ever register, like "${random_guid}.com"
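A small sketch of that suggestion (the `user@` local part is just illustrative). One step safer than a random `.com`: RFC 2606 reserves the `.invalid` top-level domain, which can never be registered, so an address under it can't land in anyone's real inbox:

```typescript
import { randomUUID } from "node:crypto";

// ".invalid" is reserved by RFC 2606 and will never resolve,
// unlike a random ".com", which might already belong to someone.
const fakeEmail = `user@${randomUUID().replace(/-/g, "")}.invalid`;
console.log(fakeEmail);
```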
-
I have gotten discounts on clothes (thanks Raj, whoever you are) and I get 10% off store brands (because an employee used that as their alt ID on their loyalty card) using the Jenny trick. I hope I'm paying it forward somehow through another loyalty program elsewhere.
-
@alice Random q, but if a data broker stores my info and I'm not a US citizen, is there any easy route to remove it? The usual automatic services require you to be a US citizen.
-
@alice (I first read "your date" °-°')
-
@alice
þe skull emoji makes me þink þe person clapping got poisoned. rest in peace
-
@alice I've toyed with the idea of setting up a headless Chrome instance to just ask "but why?" to ChatGPT all day to drive up their inference costs.

@theorangetheme @alice haha!
-
@alice I like to select wrong answers on captchas until I get bored.
@hypostase @alice I do this; I try to identify which ones they already know (so I get those right) and which ones they're testing (so I can get those wrong).
-
@alice Set your name to [object Object], as that is a common front-end fuck-up.
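For anyone wondering where that string comes from: it's what JavaScript produces when an object is interpolated into a string without being serialized first. A minimal sketch:

```typescript
const user = { name: "Alice" };

// Template interpolation falls back to Object.prototype.toString,
// which yields "[object Object]" instead of anything useful.
const greeting = `Hello, ${user}`;
console.log(greeting); // "Hello, [object Object]"
```

So a display name that is literally "[object Object]" is indistinguishable from a genuine bug in the site's front end.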
-
@alice (of course, that kind of people! ^^)

