Imagine yourself as a law enforcement or homeland security official who could examine a hostage photo and accurately determine where it was taken, helping to track down the culprits. Or picture yourself as a store owner who can analyze photos of people entering your store and discern their ages or ethnicities to sharpen your marketing efforts. Or pretend you’re an anthropologist researching human migration who can see who is moving to which countries or regions through “big data” drawn from thousands of pictures.
The ability to do so accurately does not yet exist, at least not on a consistent basis. Sure, photos can be analyzed for clues to location, ethnicity and age, and sometimes accurate assessments are made. But unless a photo contains an obvious marker – such as a well-known landmark like the Eiffel Tower – the results are not consistently accurate, according to Mohammad Tarik Islam, an assistant professor of computer science at Southern.
But Islam has been developing algorithms that use “big data” and are already showing promising results. And he is optimistic that the ability to glean locations and other information from photos will continue to improve.
“It’s a very exciting field of research,” Islam said. “We are basically teaching computers to identify patterns – in essence, to learn.”
Islam has completed the first stage of this new technology with a project called “Geo-Faces.” He downloaded 1.8 million images of people from Flickr, most of them from the United States and Western Europe. He then compared the locations his algorithm predicted against the actual locations and found the predictions to be 26 percent accurate. While relatively low, that figure is 13 times better than chance: the computer had to choose among 50 cities, so a random guess would be right only 2 percent of the time.
He followed up that test with a new project, “Geo-Faces X,” which again attempted to determine a photo’s location by analyzing the faces of the individuals pictured. This time, however, it entailed gathering 40 million Internet images from 173 countries around the world. The test proved to be 22 percent accurate.
“That might not seem very impressive, but the random chance of guessing the right city is less than 1 percent with 173 choices,” Islam said. “We have a lot of work to do, but it’s an impressive start.”
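The comparison Islam draws – accuracy versus the odds of random guessing – can be sketched in a few lines of Python. The accuracy figures and choice counts come from the article; the helper functions themselves are illustrative, not part of the Geo-Faces project:

```python
def chance_baseline(num_choices):
    """Probability of picking the correct location at random."""
    return 1.0 / num_choices

def improvement_over_chance(accuracy, num_choices):
    """How many times better than random guessing an accuracy figure is."""
    return accuracy / chance_baseline(num_choices)

# Geo-Faces: 26 percent accuracy choosing among 50 cities
print(chance_baseline(50))                # ~0.02, i.e. a 2-percent chance
print(improvement_over_chance(0.26, 50))  # ~13 times better than chance

# Geo-Faces X: 22 percent accuracy choosing among 173 countries
print(chance_baseline(173))               # ~0.0058, under 1 percent
print(improvement_over_chance(0.22, 173)) # ~38 times better than chance
```

By this measure, the second experiment’s lower raw accuracy actually represents a larger improvement over chance, which is why Islam calls it an impressive start.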
Most recently, Islam has taken the data set from Geo-Faces X and begun a project testing the computer’s ability to link the ethnicity, age and gender of the individuals depicted with location. The preliminary results are encouraging: he said the computer projected the correct location 90 percent of the time using ethnicity, and 70 percent of the time using gender. Age proved not to be a significant factor, he said.
He has also begun testing whether features such as clothing, houses and trees can accurately predict the location of a photo.
Islam, who graduated last spring with a Ph.D. in computer science from the University of Kentucky, is a former system engineer in Bangladesh. His co-authored Geo-Faces work was published last year in the EURASIP Journal on Image and Video Processing.