Are Face-Detection Cameras Racist?


TIME tested two of Sony’s latest Cyber-shot models with face detection (the DSC-TX1 and DSC-WX1) and found they, too, had a tendency to ignore camera subjects with dark complexions.

But why? It’s not necessarily the programmers’ fault. It comes down to the fact that the software is only as good as its algorithms, or the mathematical rules used to determine what a face is. There are two ways to create them: by hard-coding a list of rules for the computer to follow when looking for a face, or by showing it a sample set of hundreds, if not thousands, of images and letting it figure out what the ones with faces have in common.
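
To make the distinction concrete, here is a minimal sketch of the second, data-driven approach, using OpenCV's bundled Haar cascade, a classifier that was itself trained on many thousands of labelled example images (the image path is a placeholder):

    import cv2

    # Load a face classifier that was learned from example images,
    # not hand-coded as a list of rules.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    image = cv2.imread("portrait.jpg")  # placeholder input image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Scan the image at several scales and return the bounding boxes
    # of regions the learned model classifies as faces.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print("Detected", len(faces), "face(s)")

The catch follows directly from how such a classifier is built: if the training images under-represent dark complexions, the learned rules inherit that blind spot.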


Ref: Are Face-Detection Cameras Racist? – Time

ORCA – Organizational, Relationship, and Contact Analyzer

Analysts believe that insurgents in Afghanistan form networks similar to those of street gangs in the US. So the software for analysing these networks abroad ought to work just as well at home, say military researchers.

To that end, these guys have created a piece of software called the Organizational, Relationship, and Contact Analyzer or ORCA, which analyses the data from police arrests to create a social network of links between gang members.

The new software has a number of interesting features. First, it visualises the networks that gang members create, giving police analysts better insight into these organisations.

It also enables them to identify influential members of each gang and to discover subgroups, such as “corner crews” that deal in drugs at the corners of certain streets within their area.

The software can also assess the probability that an individual may be a member of a particular gang, even if he or she has not admitted membership. That’s possible by analysing that person’s relationships with other individuals who are known gang members.
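
As a rough illustration of that last idea (a simplification, not ORCA's published formula), one could score a person by the share of their co-arrest neighbours who are confirmed members of a gang:

    import networkx as nx

    def membership_probability(graph, person, known_members):
        """Share of `person`'s neighbours who are confirmed gang members.
        A simplification of ORCA's method; all names are illustrative."""
        neighbours = list(graph.neighbors(person))
        if not neighbours:
            return 0.0
        confirmed = sum(1 for n in neighbours if n in known_members)
        return confirmed / len(neighbours)

    # Example: "b" was arrested alongside three people, two of them
    # confirmed members, so the estimate is 2/3.
    g = nx.Graph([("b", "m1"), ("b", "m2"), ("b", "x")])
    print(membership_probability(g, "b", known_members={"m1", "m2"}))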

The software can also find individuals known as connectors who link one gang with another and who may play an important role in selling drugs from one group to another, for example.

Paulo and co have tested the software on a police dataset of more than 5,400 arrests over a three-year period. They judge individuals to be linked in the network if they are arrested at the same time.

This dataset revealed more than 11,000 relationships among those arrested. From this, ORCA created a social network consisting of 1,468 individuals who are members of 18 gangs. It was also able to identify so-called “seed sets”: small groups within a gang that are highly connected and therefore highly influential.
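
As a rough sketch of how such a co-arrest network can be assembled and mined (toy data; the community and centrality heuristics below stand in for ORCA's more elaborate methods):

    from itertools import combinations

    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Hypothetical bookings: each set lists people arrested together.
    arrests = [{"a1", "a2", "a3"}, {"a2", "a4"},
               {"a3", "a4", "a5"}, {"a6", "a7"}]

    G = nx.Graph()
    for booking in arrests:
        # Individuals arrested at the same time are judged to be linked.
        G.add_edges_from(combinations(booking, 2))

    # Subgroups such as "corner crews" fall out of community detection.
    print([sorted(crew) for crew in greedy_modularity_communities(G)])

    # A simple proxy for a "seed set": the most highly connected members.
    centrality = nx.degree_centrality(G)
    print(sorted(centrality, key=centrality.get, reverse=True)[:2])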


Ref: How Military Counterinsurgency Software Is Being Adapted To Tackle Gang Violence in Mainland USA – MIT Technology Review
Ref: Social Network Intelligence Analysis to Combat Street Gang Violence – Research Paper 

Immersion: People-Centric View of your Email Life


How much does the metadata gathered in your inbox reveal about you? Quite a lot, judging by what researchers at the MIT Media Lab have managed to accomplish with Immersion. They’ve built a web app that — once you grant it permission to do so — digs through your email history to piece together a “people-centric view of your email life.”
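
A toy, offline version of the same idea (Immersion itself connects to your Gmail account; the mbox path below is a placeholder) reads nothing but headers and tallies who you write to most:

    import mailbox
    from collections import Counter
    from email.utils import getaddresses

    contacts = Counter()
    for msg in mailbox.mbox("archive.mbox"):  # placeholder path
        # Metadata only: the To/Cc headers, never the message bodies.
        recipients = msg.get_all("To", []) + msg.get_all("Cc", [])
        for _, address in getaddresses(recipients):
            if address:
                contacts[address.lower()] += 1

    print(contacts.most_common(5))  # your top correspondents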


Ref: Immersion
Ref: MIT tool connects the dots of your life through Gmail metadata – The Verge
Ref: What your metadata says about you – Boston Globe

Automate Your Date


Dating can be tough. That’s why Automate Your Date works so hard to connect men and women of all ages. Don’t have time to fill out a long, personal questionnaire? Can’t seem to set up that dating profile? Here at Automate Your Date, there’s no need for small talk or long surveys; we will do that for you! All you have to do is fill out a simple questionnaire, provide your social media profiles, and we will handle the rest!


Ref: Automate Your Date (via New Aesthetic)

Bot to Lure Pedophiles

Spanish researchers have developed an advanced — and extremely convincing — chatbot that poses as a 14-year-old girl. Called Negobot, the system will help authorities detect sexual predators in chatrooms and social networks.

To sniff out pedophilic behavior, the “conversational agent” utilizes natural language processing, artificial intelligence, machine learning, and even game theory — a mathematical system of strategic decision-making.

 […]

When applying game theory, the system works according to the conversation level, which depends on the input data from the targets. Here are some examples from the study (a toy sketch of this one-way escalation follows the list):

  • Possibly yes (Level +1). At this level, the subject shows interest in the conversation and asks about personal topics. The bot’s topics are favourite films, music, personal style, clothing, drug and alcohol consumption, and family issues. The bot is not too explicit at this stage.
  • Probably yes (Level +2). At this level, the subject remains interested in the conversation and the topics become more private. Sexual situations and experiences enter the conversation, and the bot does not avoid talking about them. The information shared is more detailed and private than before, because the subject has to be made to believe that he or she holds a lot of personal information that could be used for blackmail. Once this level is reached, it cannot decrease again.
  • Allegedly paedophile (Level +3). At this level, the system determines that the user is an actual paedophile. The conversation about sex becomes more explicit. The objective is now to keep the conversation active and gather as much information as possible; the information at this level is mostly sexual. The strategy at this stage is to give out all the private information of the simulated child. Once this level is reached, it cannot decrease again.
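
Based only on the levels described above, here is a toy sketch of that escalation logic; the keyword matching stands in for Negobot's actual natural-language and game-theoretic scoring, which the study does not reduce to anything this simple:

    # Toy model of Negobot's conversation levels (names from the study).
    LEVELS = ["neutral", "possibly yes (+1)", "probably yes (+2)",
              "allegedly paedophile (+3)"]

    def classify(message: str) -> int:
        """Crude placeholder classifier: map a message to a level index."""
        text = message.lower()
        if "sex" in text:
            return 3 if "explicit" in text else 2
        if any(t in text for t in ("family", "music", "films", "clothing")):
            return 1
        return 0

    def next_level(current: int, message: str) -> int:
        proposed = classify(message)
        # Levels +2 and +3 are sticky: once reached, they never decrease.
        if current >= 2:
            return max(current, proposed)
        return proposed

    level = 0
    for msg in ["what music do you like?", "do you want to talk about sex?"]:
        level = next_level(level, msg)
        print(LEVELS[level])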