
The Kindness Bias

What should GPT be like?

Matthew Gault, in his Vice article, describes concerns over biases in GPT, along with the confusion over what comes from biases in the training data versus what limits are imposed at the UI.

For textual systems, and perhaps most others, we are going to end up with biases. In the absence of any conscious choice to the contrary, that bias will be toward what is “common,” and that may be exactly the wrong bias in a world where “The Truth Is Paywalled, But The Lies Are Free.”

A lot of the freely available text comes from social media, which is filled with, if not hate, at least unkindness.

So if we train our systems on what is loud and common rather than what is good and true, they will forever infuse their answers with falsehood and resentment: harmful things we will tacitly accept because they come packaged with desirable functionality.

It’s not too soon to ask for our AI systems to be biased toward truth and kindness. Perhaps we could then ask our bloggers, leaders, and media personalities to follow suit.


      Written by Russell Brand

      Entrepreneur in residence at Founder Institute, he has mentored, performed due diligence on, and invested in numerous early-stage companies. Hundreds of these early-stage companies have described Russell’s insights and advice as the most useful thing in the history of their companies. He has always had an inborn ability to find more valuable uses of new ideas and faster ways to achieve results.
