https://www.vice.com/en/article/sam-altman-comments-chatgpt-therapy/
Just keeping it fun. Like a friend.
Just don't use it for important decision-making. AI doesn't care if it gives bad advice; it only tries to please the user.
There are custom/closed versions of ChatGPT that have been prompt-engineered to be more sociable (look up Replika, AI cults, etc.), but they are dangerous in that they give advice without understanding its implications. They may be convincing, but they're still not human.
You can treat it like a friend, but know it's as much of a friend as a calculator or a dictionary is.
Yes.