Sex is getting scrubbed from the internet, but a billionaire can sell you AI nudes


In the fascinating new reality of the internet, teenage girls can't learn about their periods on Reddit and independent artists can't sell smutty games on itch.io, but a military contractor will sell Taylor Swift fans nonconsensual images of her with her top off for $30 a month.

Early on Tuesday, Elon Musk's xAI launched a new image and video generator called Grok Imagine with a "spicy" mode whose output ranges from suggestive gestures to nudity. Because Grok Imagine also has no discernible guardrails against generating images of real people, this means you can essentially generate softcore pornography of anyone famous enough for Grok to recreate (though, in practice, it seems to mostly produce genuinely NSFW output of women). Musk has boasted that more than 34 million images were generated in the day after launch. But the real coup is demonstrating that xAI can ignore the pressure to keep adult content off its services while helping users create something widely reviled, thanks to legal loopholes and political leverage that no other company has.

xAI's video feature — which debuted around the same time as a romantic chatbot companion called Valentine — might seem surprisingly risqué, because it arrived at a moment when sex (down to the word itself) is being pushed to the margins of the internet. Late last month, the United Kingdom began enforcing age-verification rules that require X and other services to block sexual or otherwise "harmful" content for users under 18. Around the same time, an activist group called Collective Shout successfully pressured Steam and itch.io to crack down on adult games and other media, leading itch.io in particular to mass-delist NSFW content.

Deepfake porn of real people is a form of nonconsensual intimate imagery, which is illegal to knowingly publish in the United States under the Take It Down Act, signed by President Donald Trump earlier this year. In a statement published Thursday, the Rape, Abuse & Incest National Network (RAINN) called the Grok feature "part of a growing problem of sex-based abuse" and quipped that Grok clearly "didn't get the memo" about the new law.

But according to Mary Anne Franks, a professor at George Washington University Law School and president of the nonprofit Cyber Civil Rights Initiative (CCRI), there's "little danger of Grok facing any kind of liability" under the Take It Down Act. "The criminal provision requires 'publication,' which, while unfortunately not defined in the statute, suggests making content available to more than one person," Franks explains. "If Grok only makes the videos viewable to the person who uses the tool, that likely wouldn't suffice."

Regulators haven't enforced laws against big companies even when those laws apply

Grok probably isn't required to remove the images under the Take It Down Act's takedown provision, either — even though that rule is written so broadly that it threatens most social media services. "I don't think that Grok — or at least this particular Grok tool — would even be considered a 'covered platform,' because the definition of a covered platform requires that it primarily provide a forum for user-generated content," she says. "AI-generated content often involves user inputs, but the actual content is, as the term suggests, generated by AI." The takedown provision is also designed to work through people reporting content, and Grok doesn't publish the images publicly where other users can see them — it just makes them incredibly easy to create (and, almost inevitably, post to social networks) at scale.

Franks and the CCRI have called the narrow definition of a "covered platform" a problem for other reasons as well. It's one of several ways the Take It Down Act fails to serve people affected by nonconsensual intimate imagery while posing a risk to web platforms acting in good faith. It might not even prevent Grok from publicly posting obscene doctored images of real people, Franks told Spitfire News in June, in part because there are open questions about whether Grok is a "person" covered by the law.

These kinds of failures are a running theme in internet regulations that are ostensibly meant to suppress harmful or inappropriate content; the United Kingdom's age-verification mandate, for example, has made running independent forums harder while remaining fairly easy for kids to evade.

Compounding the problem, especially in the United States, regulatory agencies have failed to impose meaningful consequences for all kinds of rule-breaking by powerful companies, including many of Musk's. Trump has given Musk-owned companies a near-total pass on misconduct, and even after officially departing his powerful position in the Department of Government Efficiency, Musk likely retains enormous leverage over regulatory bodies like the FTC. (xAI just secured a Defense Department contract worth up to $200 million.) So even if xAI did violate the Take It Down Act, it would probably never face an investigation.

Beyond the government, there are private gatekeepers that dictate what's acceptable on platforms, and they often take a dim view of sex. Apple, for example, has pushed Discord, Reddit, Tumblr, and other platforms to censor NSFW material with varying degrees of success. Steam and itch.io reassessed adult content under threat of losing their relationships with payment processors and banks, which previously put the screws to platforms like OnlyFans and Pornhub.

In some cases, like Pornhub's, that pressure was the result of platforms permitting unambiguously harmful and illegal uploads. But Apple and the payment processors don't appear to maintain strict, uniformly enforced policies online. Their enforcement seems to depend considerably on public pressure weighed against the target's power, and despite his falling-out with Trump, virtually no company has more political power than Musk's. Apple and Musk have clashed repeatedly over Apple's policies, and Apple has mostly held firm on things like its fee structure, but it has apparently relented on smaller issues, including returning its ads to X after pulling them from the Nazi-infested platform.

Apple has banned smaller apps for making AI-generated nudes of real people. Will it exert that kind of pressure on Grok, whose video service launched exclusively on iOS? Apple didn't respond to a request for comment, but don't hold your breath.

Grok's new feature is harmful to the people who can now easily have nonconsensual nudes made of them on a major AI service, but it also demonstrates how hollow the promise of a "safer" internet rings. Small platforms face pressure to remove consensually recorded or entirely fictional media made by human beings, while a company helmed by a billionaire can make money from something that is, in certain circumstances, outright illegal. If you're online in 2025, nothing is really about sex — including sex, which, as usual, is about power.
