July 13, 2024

After the world saw Hunter Biden’s laptop, Biden wants to see yours.

To supposedly stop people from exchanging non-consensual artificial intelligence (AI) images of a sexual nature, President Biden wants to probe everyone’s smartphones as part of a sweeping surveillance effort.

A press release from the White House explains the Biden regime’s desire for the tech and financial industries to take charge in stopping the creation and spread of abusive sexual imagery generated by AI.

According to Biden’s handlers, “mobile operating system developers could enable technical protections to better protect content stored on digital devices and to prevent image sharing without consent.”

The plan is to have mobile operating systems such as Android and iOS automatically scan and analyze people’s private photos to determine which ones are sexual or non-consensual. Users would not have the ability to keep any of their images private from government spooks.

It might sound like a good thing until you recognize the privacy implications of such an arrangement. Do we the people really want to allow the government direct access to our photos?

Beyond the search and analysis framework, the Biden regime also wants mobile app stores like Apple’s App Store and Google Play to “commit to instituting requirements for app developers to prevent the creation of non-consensual images.”

(Related: AI is just one component among many of the dystopian present.)

Do we really want Biden looking at our photos?

If Biden gets his way, a full range of apps would be subject to government intrusion, including apps that allow users to edit and draw images and other art. Beyond that, the sky is the limit.

“Once this technology of on-device monitoring becomes normalized, this level of scrutiny could extend beyond the initial intent, potentially leading to censorship of other types of content that the administration finds objectionable,” warns Reclaim the Net’s Ken Macon.

“The administration’s call to action extends to various sectors, including AI developers, payment processors, financial institutions, cloud computing providers, search engines, and mobile app store gatekeepers like Apple and Google. By encouraging cooperation from these entities, the White House hopes to curb the creation, spread, and monetization of nonconsensual AI images.”

Previously, the Biden regime was able to secure voluntary commitments from Amazon, Google, Meta (Facebook and Instagram), and Microsoft to implement safeguards on their respective AI systems. This is not enough, though, as the White House wants even more intrusion.

Privacy experts worry about the “mission creep” aspects of the plan, which essentially gives the government free rein to do as it wishes in the name of stopping “crime.”

On-device surveillance of any kind means that someone, somewhere will always be looking at and monitoring everyone’s photos. AI is new, yes, but there have always been programs like Photoshop that allow people to create and edit images on their phones, so why the sudden concern about non-consensual and sexual imagery?

“This could set a precedent for more extensive and intrusive forms of digital content scanning, leading to broader applications beyond the original intent,” Macon warns.

One of our own readers noted in an article about AI and the dystopian present that humanity is “going mach speed to extinction” with all of this Orwellian tech and associated government invasion of privacy.

“The extinction event is happening right now,” he further contends. “AI is telling us that the first president, George Washington, was black. Why is Google programming it to lie? What other billion lies has Google not yet been caught on?”

This writer’s best advice comes straight from the Bible in Revelation 18, which says in verse four:

“Come out from her, my people, so that you will not participate in her sins and receive of her plagues.”

