Adult video chatbots


Take it from me, there's nothing particularly intimidating or sexy about a master who can't seem to understand the word "hello." In order to avoid the same pitfalls, g** (male) s** Master, one of the more popular scripts on Chatbot4U, prompts its human companion at the end of each response. Since g** (male) s** Master is in a position to tell his sex slave what to do, this isn't so far-fetched.
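That prompting trick is simple to sketch. Here's a minimal, hypothetical illustration (my own toy bot, not the actual Chatbot4U script): a keyword-matching responder that ends every reply with a direct question, so the human always knows what to say next and the conversation never stalls on an unrecognized "hello."

```python
# Toy domineering chatbot: keyword rules plus a closing prompt.
# The rules and phrasing here are invented for illustration.

RULES = [
    ("hello", "Silence. You were not told to speak."),
    ("yes", "Good. You are learning obedience."),
]
DEFAULT = "That is not an answer I accept."
PROMPT = " Now answer: do you submit?"  # appended to every reply

def reply(user_input: str) -> str:
    """Match the first keyword found; always end with a prompt."""
    text = user_input.lower()
    for keyword, response in RULES:
        if keyword in text:
            return response + PROMPT
    return DEFAULT + PROMPT
```

Because every turn closes with the same question, even a dumb script keeps the human feeding it inputs it can handle.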

Say all the right things, and you'll have a good game of adult Simon Says going.

Mitsuku, an award-winning 18-year-old chatbot, was originally built by IT guy Steve Worswick for online gaming site Mousebreaker.

Worswick responded to my post on Twitter asking for stories from people who'd used bots to get off.

(Pro tip: Next time a bot tells you how big its dick is, do yourself a favor and ask for its mother's ambrosia salad recipe.)

Of course, not all of ELIZA's progeny are nefarious gold diggers.

Plenty of chatbots are happy to gab about dicks (yours or theirs) for zero financial reward; you're just not likely to find them on Tinder.

Meanwhile, Google has developed its own proof-of-concept chatbot to show off the power of neural networks, which are loosely modeled on the human brain.

It occurred to me that these scripts had a connection to ELIZA, one of the earliest examples of a natural language processing program.

Chatbots hold an important place in the evolution of artificial intelligence. And, as humans are wont to do, we've found a way to turn them into receptacles for our basest desires.

In my flings with ELIZA and a host of her offspring, I learned that talking dirty to chatbots provides an often comical, sometimes depressing view into the past, present and future of sex and artificial intelligence.

But as the Ashley Madison leaks showed last summer, some chatbots just want you for your money.

It was reported that Ashley Madison employed "more than 70,000 female bots to send male users millions of fake messages, hoping to create the illusion of a vast playland of available women." The site's philandering users weren't alone in getting duped.

It simulated the experience of speaking to a therapist by responding to specific words and phrases, and represented a significant step forward in the evolution of human-like AI.
