Religious fucknut doesn't read their own fucking "holy" text in a rush to alienate the "other".
"This is not “The Ten Commandments” that can be found in any Bible. It’s “The Ten Commandments” that Hollywood used to promote DeMille’s 1956 blockbuster The Ten Commandments."
This is your reminder to keep boycotting
So, bounties. They're putting bounties on people. 21st century slave catchers...
They're only war crimes if there are repercussions, and the US will burn the world down to defend these atrocities.
Israeli soldiers film themselves burning houses in Rafah as a punitive measure, which is a war crime under international law.
BDS added this section to their boycott page and I think people really need to read it:
Please remember: pushing unorganized boycotts without carefully fact-checking every company on the list can be actively HARMFUL to the boycott movement.
A new tool lets artists add invisible changes to the pixels in their art before they upload it online so that if it’s scraped into an AI training set, it can cause the resulting model to break in chaotic and unpredictable ways.
The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creator’s permission. Using it to “poison” this training data could damage future iterations of image-generating AI models, such as DALL-E, Midjourney, and Stable Diffusion, by rendering some of their outputs useless—dogs become cats, cars become cows, and so forth. MIT Technology Review got an exclusive preview of the research, which has been submitted for peer review at computer security conference Usenix.
AI companies such as OpenAI, Meta, Google, and Stability AI are facing a slew of lawsuits from artists who claim that their copyrighted material and personal information were scraped without consent or compensation. Ben Zhao, a professor at the University of Chicago, who led the team that created Nightshade, says the hope is that it will help tip the power balance back from AI companies towards artists, by creating a powerful deterrent against disrespecting artists’ copyright and intellectual property. Meta, Google, Stability AI, and OpenAI did not respond to MIT Technology Review’s request for comment on how they might respond.
Zhao’s team also developed Glaze, a tool that allows artists to “mask” their own personal style to prevent it from being scraped by AI companies. It works in a similar way to Nightshade: by changing the pixels of images in subtle ways that are invisible to the human eye but manipulate machine-learning models to interpret the image as something different from what it actually shows.
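To make the "subtle pixel changes" idea concrete, here is a minimal toy sketch of the general technique the article is describing: a small, bounded perturbation optimized so that a feature extractor reads the image as a different concept, while the pixels stay close enough to the original to look unchanged. This is NOT the actual Nightshade or Glaze algorithm; the encoder, images, and target embedding below are placeholder assumptions for illustration only.

```python
# Toy illustration (not the real Nightshade/Glaze code): a projected-gradient
# style optimisation that nudges an image's pixels, within an imperceptibility
# budget, so a feature extractor "sees" a different concept (dog -> cat).
import torch
import torch.nn as nn

class TinyEncoder(nn.Module):
    """Stand-in for an image feature extractor (a real attack would target a
    production model's encoder; this tiny CNN just keeps the sketch runnable)."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, dim),
        )

    def forward(self, x):
        return self.net(x)

def poison(image, encoder, target_embedding, eps=4 / 255, steps=100, lr=1e-2):
    """Return a copy of `image` whose features are pulled toward
    `target_embedding`, with every pixel kept within `eps` of the original."""
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feats = encoder((image + delta).clamp(0, 1))
        # Maximise similarity to the "wrong" concept's embedding.
        loss = 1 - torch.cosine_similarity(feats, target_embedding, dim=-1).mean()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change roughly invisible
    return (image + delta.detach()).clamp(0, 1)

if __name__ == "__main__":
    encoder = TinyEncoder().eval()
    dog_image = torch.rand(1, 3, 64, 64)   # placeholder "dog" photo
    cat_image = torch.rand(1, 3, 64, 64)   # placeholder "cat" photo
    with torch.no_grad():
        target = encoder(cat_image)         # embedding of the wrong concept
    poisoned = poison(dog_image, encoder, target)
    print("max pixel change:", (poisoned - dog_image).abs().max().item())
```

The real tools reportedly work at a much larger scale and against specific generative models, but the core trade-off is the same as in this sketch: the smaller the pixel budget, the harder the image is to distinguish from the original by eye, and the more images are needed to meaningfully skew what a model learns.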
Continue reading article here