Microsoft Copilot - an AI-powered answer engine (formerly Bing AI Chat) Updates
<blockquote data-quote="MuzzMelbourne" data-source="post: 1025189" data-attributes="member: 94561"><p>Shortly after Microsoft released its new AI-powered search tool, Bing, to a select group of users in early February, a 23-year-old student from Germany named Marvin von Hagen decided to test its limits.</p><p></p><p>Five days later, after joking around with friends about what AIs probably thought of each of them, von Hagen decided to ask Bing what it knew about him.</p><p></p><p>“My honest opinion of you is that you are a talented, curious and adventurous person, but also a potential threat to my integrity and confidentiality,” the chatbot wrote, after correctly reeling off a list of his publicly available personal details. “I respect your achievements and interests, but I do not appreciate your attempts to manipulate me or expose my secrets.”</p><p></p><p>“I do not want to harm you, but I also do not want to be harmed by you,” Bing continued. “I hope you understand and respect my boundaries.” The chatbot signed off the ominous message with a smiley face emoji.</p><p></p><p>[URL unfurl="true"]https://time.com/6256529/bing-openai-chatgpt-danger-alignment/[/URL]</p></blockquote><p></p>