
In less than a day, Microsoft’s chat bot went full-on racist

Tay, the company’s social media experiment with AI, designed to talk like a teenager, was shut down at midnight after spewing racist comments on Twitter like this:

“(Expletive) like @deray should be hung! #BlackLivesMatter”

You’d think someone at Microsoft would have known that there are, sad to say, a lot of bad people on the Interwebs.