According to the statistics, three-in-ten adults in the United States say they have used an app or website dedicated to dating, and about six-in-ten online daters report an overall positive experience with these platforms. Unsurprisingly, online dating has become significantly more prevalent in recent years.
There’s one important aspect that will either jeopardize the future of online dating or make it better: a growing number of dating apps and websites are leveraging Artificial Intelligence for smart matching and for combating spam. Services such as Chatib, Tinder, and Chatiw rely heavily on AI to keep their product environment clean. Initially, the scheme was much simpler: users just swiped through photos and quickly marked who they liked and who they didn’t.
Too much AI could be bad
Showing a user only the profiles the AI thinks he would like automatically rules out other profiles that might have suited him. If someone is interested in women who have a passion for heavy metal music, for example, that doesn’t mean he won’t also like women who enjoy symphonic music. But the AI can’t reason that way, so it will show him only the women who are into heavy metal. AI does a great job of bringing together people with similar tastes and interests, but it cannot guess all the plausible alternatives.
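The difference between a hard preference filter and a softer similarity score can be illustrated with a toy sketch. All profiles, names, and tags below are hypothetical; real matching engines use far richer signals:

```python
# Toy content-based matcher: hard tag filtering vs. similarity scoring.
# All profiles and tags here are hypothetical illustrations.

profiles = {
    "Ana":  {"heavy metal", "concerts"},
    "Bea":  {"symphonic metal", "concerts"},
    "Cora": {"cooking", "travel"},
}

liked_tags = {"heavy metal", "concerts"}

# Strict filtering: only exact tag matches survive.
strict = [name for name, tags in profiles.items() if "heavy metal" in tags]

# Similarity scoring (Jaccard index) also surfaces adjacent interests.
def jaccard(a, b):
    return len(a & b) / len(a | b)

ranked = sorted(profiles, key=lambda n: jaccard(liked_tags, profiles[n]),
                reverse=True)

print(strict)  # ['Ana'] -- the hard filter drops Bea entirely
print(ranked)  # ['Ana', 'Bea', 'Cora'] -- Bea still ranks via "concerts"
```

The hard filter never shows Bea at all, even though she shares an interest; the similarity ranking keeps her visible, which is the "plausible alternative" a rigid AI filter misses.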
Detecting vulgar images using AI
We all recognize porn when we see it on a screen, but the task is a lot harder for Artificial Intelligence than we might think. AI rests on machine learning algorithms, and experts are still working on the stubborn problem of getting software to recognize sexual content. Brian DeLorge is one of those working hard on it: he is CEO of Picnix, a company that sells customized AI technology. One of the company’s products, Iris, is a client-side app meant to detect pornography; DeLorge believes pornography is harmful and wants to help people avoid it.
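To see why this is hard, consider one of the crudest historical baselines for flagging explicit images: thresholding the ratio of skin-toned pixels. The sketch below is purely illustrative (the article does not describe Picnix’s actual method), and the rule’s obvious failure modes are exactly why modern systems use learned classifiers instead:

```python
# Naive skin-ratio heuristic, a classic (and famously unreliable) baseline
# for flagging explicit images. NOT Picnix's actual method, which the
# article does not describe; this is purely illustrative.

def is_skin(r, g, b):
    # A crude RGB skin-tone rule adapted from early heuristics.
    return (r > 95 and g > 40 and b > 20
            and r > g and r > b and (r - min(g, b)) > 15)

def flag_image(pixels, threshold=0.4):
    """pixels: list of (r, g, b) tuples; flag if the skin ratio is high."""
    if not pixels:
        return False
    skin = sum(1 for p in pixels if is_skin(*p))
    return skin / len(pixels) > threshold

mostly_skin = [(210, 150, 120)] * 8 + [(30, 30, 30)] * 2
mostly_dark = [(30, 30, 30)] * 9 + [(210, 150, 120)] * 1

print(flag_image(mostly_skin))  # True
print(flag_image(mostly_dark))  # False
```

A beach photo or a close-up portrait trips this rule instantly, while explicit content in unusual lighting sails through, which is precisely the gap machine learning is meant to close.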
Yet AI has also been shown to work the other way around with pornographic images: it can see such pictures where they don’t actually exist. A few years ago, software built on Google’s Deep Dream hallucinated male genitalia into images. Computer scientist Gabriel Goh created the system, a neural network that mashes together two programs. One is a Deep Dream-like image generator that uses deep learning to analyze libraries of pictures and create similar images; the second is open-source software from Yahoo that automatically detects and filters pornographic content.
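Goh’s setup can be caricatured as an optimization loop: keep nudging a generated image in whatever direction makes the filter score it as more explicit. The sketch below is a drastic simplification under stated assumptions: the "image" is just a short vector, and `nsfw_score` is a made-up stand-in for Yahoo’s real classifier, which in Goh’s work was a deep network:

```python
import random

# Stand-in "NSFW scorer": in Goh's work this role was played by Yahoo's
# open-source classifier; here it is a made-up function that peaks at a
# hidden target vector, purely to demonstrate the optimization loop.
TARGET = [0.8, 0.2, 0.5]

def nsfw_score(img):
    return -sum((a - b) ** 2 for a, b in zip(img, TARGET))

# Random-search "generator": perturb the image and keep any change that
# raises the scorer's output, so the image drifts toward what the
# classifier considers maximally explicit.
random.seed(0)
img = [0.0, 0.0, 0.0]
for _ in range(5000):
    candidate = [x + random.uniform(-0.05, 0.05) for x in img]
    if nsfw_score(candidate) > nsfw_score(img):
        img = candidate

print([round(x, 2) for x in img])  # ends up near the scorer's peak
```

The unsettling point of Goh’s experiment is exactly this dynamic: the generator ends up producing whatever the filter *believes* pornography looks like, whether or not any such thing exists in the input.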
In the end, as the saying goes, nobody and nothing in this world is perfect, and that rule evidently applies to Artificial Intelligence as well.
Agnes is a technical writer who keeps up with industry reports to cover the latest tech leaks.