Google’s Project Nimbus Is The Future Of Evil

When utilized in dreadful ways, AI can be frightening. Google’s new project Nimbus is the future of evil.

Google, like every large organization, frequently does foolish things. But doing something truly awful takes extra effort, and that is where Google’s Project Nimbus sits on the spectrum.

Project Nimbus is a collaboration between Google, Amazon, and the Israeli government that uses cutting-edge machine learning to offer futuristic surveillance capabilities. Whether you like it or not, projects like this are going to be part of state security in the future, and this one is no worse than many others. Many of us already use comparable technology in and around our own homes.

(Image credit: Chris Wedel/Android Central)

Where things get murky and ugly is what Google itself says about how Project Nimbus will use the company’s technology:

Nimbus training documents emphasize “the ‘faces, facial landmarks, emotions’-detection capabilities of Google’s Cloud Vision API,” and in one Nimbus training webinar, a Google engineer confirmed for an Israeli customer that it would be possible to “process data through Nimbus in order to determine if someone is lying”.

Yes, the company behind YouTube’s dreadful recommendation algorithms now wants to sell the authorities an algorithm that can supposedly tell whether a person is lying. Think about that. Microsoft has already abandoned this science because of its inherent flaws.

Unfortunately, Google strongly disagrees and retaliates against those within the firm who disagree with it.

Watch the video below:

We will not get too deep into the politics involved here, but the entire initiative was structured so the Israeli government could conceal its activities. According to The Intercept, Project Nimbus aims at “preventing the German government from requesting data relating to the Israel Defence Forces for the International Criminal Court,” in the words of Jack Poulson, a former head of security for Google Enterprise. (Under some interpretations of international law, Israel is accused of perpetrating crimes against humanity against Palestinians.)

However, it really does not matter how you feel about the Israeli-Palestinian conflict. There is no good reason to give any government this kind of technology, at any scale, and by doing so Google becomes wicked.

Even if Google’s Cloud Vision API were always accurate, the alleged capabilities of Nimbus would be unsettling. Think about police body cameras that use AI to decide whether to charge or detain you. When you then consider how often machine learning vision systems get things wrong, it becomes frightening.

This issue is not unique to Google. Just look at content moderation on Facebook, Twitter, or YouTube: algorithms do roughly 90% of the initial work, and they frequently reach the wrong conclusions. The difference is that where a bad moderation call merely deletes your rude comment, a bad call from Project Nimbus could cost you your life.

No organization should offer this type of AI until the technology is perfect, and it never will be.

Look, like most people, we are all for catching the bad guys and taking action against them. We understand that law enforcement, whether provided by a neighborhood police unit or the IDF, is a necessary evil. Using AI to do it is not necessary.

We are not advocating that Google should stop attempting to expand and instead just focus on creating the software that runs the phones you adore. We are merely saying there is a right way and a wrong one, and in this case Google selected the wrong approach and is now stuck because the agreement’s provisions forbid it from ceasing participation.

Never take your opinions from someone on an internet soapbox; form your own. But you should also recognize when a company founded on the motto “Don’t Be Evil” does a complete 180-degree turn and becomes the very evil it warned us about.
