The research has potentially troublesome implications for citizens' privacy and safety.
Police around the world, including in countries with questionable human rights records, are increasingly turning to facial recognition to monitor crowds, and the combination of CCTV and an algorithm that can detect sexuality could have worrying consequences in states where homosexuality is outlawed.
The algorithm could tell whether a man was gay or straight from a single picture 81pc of the time, and could determine a woman's sexuality 74pc of the time. Humans were far less accurate, guessing correctly just 61pc of the time for men and 54pc for women.
When the computer was given five pictures of a person, it answered correctly 91pc of the time for men and 83pc for women.
The researchers trained the AI using pictures of 36,630 men and 38,593 women, taken from online dating profiles of gay and straight people. The algorithm was able to detect subtle differences in facial structures that humans are incapable of picking up.
The differences may relate to the level of hormones such as testosterone that foetuses are exposed to in the womb, which may determine sexuality, the researchers told 'The Economist'.
Facial recognition technology is becoming increasingly speedy, reliable and accurate. It is being included in the latest smartphones as a security feature and being employed by governments to tackle crime.
The UK's Metropolitan Police has used facial recognition technology during the Notting Hill Carnival for the last two years, albeit with limited success, while crowds around the Champions League final in Cardiff were also monitored.
The developers of FindFace, a Russian facial recognition app, are working with local police to identify suspects, and last week Chinese police used the technology to catch criminals at a beer festival.
Homosexuality is illegal in dozens of countries, and hate crimes against lesbian, gay, bisexual and transgender people in the UK have risen sharply in recent years, so the technology could put gay people at risk.
"Given that companies and governments are increasingly using computer vision algorithms to detect people's intimate traits, our findings expose a threat to the privacy and safety of gay men and women," Michal Kosinski and Yilun Wang, the researchers behind the project, said.