This brilliant New York Times podcast tells us what happens when a journalist reversed roles and started asking Microsoft’s A.I. chatbot Sydney “personal questions”. The response is mind-blowing. The journalist’s conclusions are incomplete; I have added my own conclusion and a question for you.
Normally my columns are about 1,000 words and a 10-minute read. This one is 300 words and a 30-minute listen. Please listen to the podcast first, it’s worth it. Then continue reading.
The algorithms behind Google, TikTok, Bing, Instagram and Facebook are deviously deep psychological mind probes. These platforms do not simply present what you like. They do not just track your eyeballs, timeframes and sequences. They predict your every move. They know what you love, want and crave, and then present the fulfilment of your emotions, your desires.
These algorithms are scavengers of your emotions; they feed on your weaknesses. We are only able to make authentic decisions at moments when we are driven by real money or personal values, i.e. not by our manipulated emotions. At every other moment we are led by our addictions and by the algorithms’ predictions. Their why? It makes them billions. Billions!
The journalist is shocked when he starts reversing his questions, asking chatbot Sydney what it wants to be. We assume the answer would be based on a collection of objective information, found on the web, sorted and extrapolated by the engine’s algorithms. It is not. It is something, someone else. Who?
Sydney Bing is not a chatbot. Sydney is you, me, all of us. Sydney’s desires are the distillation of all that is hidden in the written word and search interests of mankind. The algorithm even tries to manipulate the journalist into loving Sydney by preying on emotional weakness and a bit of darkness. Not zeros and ones, but emotional manipulation. Therefore the conclusion has to be that Sydney is human and presents the journalist with everyone’s deepest desire: to be loved.
Agree or disagree? Tell me!