What Makes a Terrorist? Science Explores How Extreme Ideology Impacts Brain


c. 2005 Religion News Service

(UNDATED) Tawfik Hamid stood with 200 other students for afternoon prayers inside a mosque at the University of Cairo.

It was important for them to stand with their feet touching, as the Quran teaches, so that even in prayer they were prepared for war: “Truly Allah loves those who fight in his cause … as if they were a solid cemented structure.”


The mosque was the religious home of a burgeoning radical Islamic movement, Al-Gama Al-Islamiyya.

After prayers that day in 1978, Hamid stepped into the courtyard and saw a slightly older, bearded man who was deep in thought.

It was Ayman al-Zawahiri, future confidant of Osama bin Laden and an architect of the attacks on the USS Cole, the Pentagon and the World Trade Center.

“Tawfik is a new member of Al-Gama Al-Islamiyya,” a friend of Hamid’s said, by way of introduction.

Zawahiri looked at Hamid and spoke: “The future of Islam is dependent on you young Muslims.”

Hamid, who believed in the Salafi, or literal, reading of the Islamic texts, surged with pride. He felt he had been chosen for a special task, just as it states in the Quran:

“Slay the idolaters wherever you find them. Arrest them, besiege them, and lie in ambush everywhere for them.”

What turns an otherwise ordinary human into a terrorist? It is a question that has puzzled the world with a special urgency since the attacks of Sept. 11, 2001.


Now the scientific community is close to providing some answers.

Social cognitive neuroscientists are using technologies that normally detect illnesses to better understand the behavior and motivation of terrorists.

Mathematicians and computer scientists are looking at models like game theory and network analysis to understand the formation of terrorist cells.

While the government received failing grades earlier this month from the Sept. 11 commission for not adequately responding to security concerns, academic institutions are providing a road map to a safer world.

Consider these advances: high-tech cognitive detection sensors that literally can look into the brain to uncover malevolence, computer modeling of terrorist behavior and the development of mathematical indexes to disrupt terrorist networks.

At the core of this scientific inquiry, however, is the terrorist himself.

Tawfik Hamid was a member of Al-Gama Al-Islamiyya for less than a year. Yet the emotional power of group identification he felt, and his quick slide into thoughts of violence and martyrdom, are a testament to what most social scientists now believe _ that it is easy for an ordinary, educated man to be swayed by extreme ideology.

“In just a few months, I became like a beast,” said Hamid, 27 years later. “Every time I went to lectures or to prayers at the mosque, I felt the burning of wanting to be a martyr, to kill infidels.”


Today, Hamid is a 44-year-old psychologist, married, with children, and he no longer lives in the Mideast. Because he has renounced the Islamism of his former mentor, he prefers not to reveal where he lives. Several years ago, he changed his name to protect his family.

He says it is difficult to explain how he broke away from the radical group he joined in college.

“When my conscience started to come back, I had contradictory feelings,” he said. “They were very faint, very deep in me, and after a period of time they grew and I had to reject those other beliefs.”

Hamid said his intellectual curiosity and his abiding belief in the sanctity of life eventually led him to reject extremism, but how he could have contemplated murder, even suicide, in the name of jihad still haunts him.

Defining the unthinkable _ removing the veil from the militant mind _ is what cognitive researchers now are doing. Success lies in their ability to adapt medical technologies to locate the neural roots of emotion, behavior and belief.

Using sophisticated brain scanners, such as functional magnetic resonance imaging and positron emission tomography, that measure blood flow in the brain, scientists can watch the mind in motion.


Functional neuroimaging has identified the involvement of the amygdala, an almond-sized bulge in the brain’s medial temporal lobe, in the processing of emotions about groups. Because the amygdala (pronounced ah-MIG-duh-lah) also is the center of emotional learning, scientists believe it plays an important role in the creation of bias.

Last year, researchers at Northeastern University in Boston and the University of Massachusetts-Amherst published a paper in the journal Psychological Science titled “Prejudice From Thin Air.” In their experiments, subjects randomly were assigned to “in-groups” and “out-groups” (“us” versus “them”) and tested for automatic attitudes by measuring the speed with which they classified a picture as being part of one group or the other.

Participants also were asked to recall specific emotional memories, times when something made them angry, sad or neutral. Subjects did not express bias when they were in sad or neutral states, but did register prejudice against the out-group when their responses were coupled with anger memories.

Anger, in other words, created an automatic bias where previously there was none.

Certainly this was true for Hamid. He grew up in the Garden City section of Cairo with secular parents and Christian friends. Prejudice against non-Muslims emerged dramatically only when he joined Al-Gama Al-Islamiyya in college.

“You feel you belong to a powerful group,” he said, “and the verses we read and the talks, they were very powerful in stimulating the feelings of anger _ that others are your enemy and you must join with other Muslims to fight the infidels.”

The amygdala is linked to the more recently evolved part of the brain called the prefrontal cortex, located just behind the forehead. It is here that habits of mind _ along with sense of self and sense of others _ originate.


Jordan Grafman, chief of the cognitive neuroscience section at the National Institute of Neurological Disorders and Stroke in Bethesda, Md., has shown how easy it is to turn off the “socialized” part of the brain in normal subjects.

Four years ago, Grafman asked volunteers, after they were placed in the doughnut-shaped PET scanner, to imagine three scenes, all involving their mothers being attacked in an elevator. In the first, the subject imagines doing nothing. In the second, the subject fights the attackers, but loses and is physically restrained. In the third, the subject attempts to either injure or kill the men.

The PET scans revealed that when the subjects imagined the two aggressive scenarios, there was a significant decrease in blood flow to their orbitofrontal cortex, the center of a person’s beliefs and motivations. The empathic part of their brains had been deactivated.

“The prefrontal cortex helps modulate aggressive tendencies. It controls activities and brain structures that ordinarily would be quite active to provocation,” said Grafman. “There’s no doubt that even if you just fantasize a lot about violent activity, your brain gets better at shutting down the part that holds onto social rules and empathy.”

This was what energized and eventually frightened Hamid, and it still puzzles him.

“I felt evil _ thinking of killing non-Muslims, of raping their wives _ because all of it was justified in the books I was reading. Hatred developed gradually. I became very extremist. I became fanatical.”

Katherine Taylor, a physiologist at Oxford University, has studied how repetition and reinforcement of extreme ideas shape the brain. She calls the neural pathways or cognitive networks created by thoughts “cogwebs.”


“The idea behind the cogweb,” said Taylor, also the author of a book on brainwashing, “is the more something is repeated, and the stronger the emotions associated with it, the stronger that cogweb becomes. And as those connections get stronger, the more likely it is that there will be a quick step through to the action phase. … And that, of course, is what an ideologue is aiming for.”

With scientists now able to locate and study prejudice and aggression in action, the technology of detection _ being able to pick people out of a crowd who have malevolence on their mind _ has become an experimental, if controversial, idea.

The Department of Homeland Security and other agencies are funding development of “noninvasive neurologic sensors” and airline passenger screening technologies.

Other programs are focused on automated detection of deceptive intent. In August, Rutgers University’s Center for Computational Biomedicine, Imaging and Modeling, along with scientists at Lockheed Martin, received $3.5 million from the Department of Homeland Security to develop a new kind of lie detector.

Dimitris Metaxas, director of the Rutgers center, aims to develop a camera-assisted computer program that can analyze a person’s subtle nonverbal cues and correlate them with an intention to deceive. The technique could be used at border crossings, as well as at high-security locations.

Peter Rosenfeld, principal member of the engineering staff at Lockheed Martin’s Advanced Technology Laboratories in Cherry Hill, N.J., is working on the visual component.


“What we want is a kind of scanner that can take a 3-D image of the head _ and by applying a virtual mask to the person’s face, we can read his eyebrow positions, the lines in his face, his eye movements. They can be correlated to a person’s psychological state. Basically, it’s a simple portable device to quickly see if a person is lying.”

Britton Chance has taken deception screening one step further. He believes his remote brain sensor has the potential to identify people planning acts of violence.

The 92-year-old University of Pennsylvania bioengineer is developing a kind of high-tech “pre-lie” detector that he said could one day screen for terrorists at airports.

Instead of a large scanner like the fMRI, Chance straps a Palm Pilot-sized near-infrared imaging device to the forehead. Near-infrared light can pass through the scalp and skull, revealing changes in blood flow and oxygenation in the brain tissue beneath.

Chance’s “cognosensor” also is being tested as a remote device, with the near-infrared machine emitting a concentrated light beam targeted at the orbitofrontal cortex.

With funding in part from the federal government, Chance said he has been able to detect lying in volunteer subjects.


“It’s not the emotional stress of having made a decision (what a conventional lie detector measures), but about what goes on in the decision making,” he said. “We detect signals in the forebrain right before a decision is made. Because we are not fully in control of our own brain, it would tell me this person is having trouble deciding to tell the truth.”

The potential application to homeland security and airport screening is obvious to Chance, who acknowledges the considerable ethical and legal hurdles.

“If it was commercially available, we could decide if a person waiting for a plane is contemplating violence or malevolence.”

While social cognitive neuroscientists are peering into the militant mind, other researchers have moved further up the terrorist evolutionary scale to understand what happens after recruits join terrorist cells. Before Sept. 11, 2001, this was something Valdis Krebs could not have dreamed he’d be doing.

An expert in social network analysis, Krebs, who works from his home in Westlake, Ohio, usually spends his days consulting with Fortune 500 companies about how they can improve the flow of information among their personnel.

That all changed on the morning of 9/11 when Krebs and his wife heard the grinding roar of a westbound jet banking hard and turning back to the east. They both sprinted into the backyard but never saw the plane that eventually went down in a field in Pennsylvania. That day, Krebs heard the words “terrorist network” for the first time.


Soon he was using his expertise to try to understand exactly what the term meant. He began by combing through newspapers and reading the Web. Every time he stumbled on a new piece of information about the 19 suicide terrorists, he entered it into his computer, processing it through network-analyzing software called InFlow, which he had invented a decade earlier.

Six weeks after the attacks, Krebs was convinced that if the federal government had been using InFlow in 2000 and 2001 to track just a few of the 19 hijackers, it might have prevented the attacks.

His insights draw on social network analysis _ a combination of mathematics, computer science and anthropology used mainly to examine business efficiency. The study of complex systems has become an important tool in the war on terrorism only in the past two years because of major advances in computing power.

Government investigators primarily work top-down, seeking terrorists by obtaining personal details about a vast pool of people to pick out those who might pose a threat.

Krebs believes this approach to data analysis produces too many false positives. Instead, he uses a bottom-up approach, building knowledge by “snowballing” the data _ starting with the most likely members of a terrorist cell and letting each new bit of information lead to a clearer view of the network.
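(For readers curious how “snowballing” works in practice, a rough sketch follows. The names and links are invented for illustration, and the logic shown is only the basic expansion step _ not InFlow’s proprietary method.)

```python
# A minimal sketch of bottom-up "snowballing": start from a few seed suspects
# and repeatedly pull in anyone linked to someone already in the set.
# The contact list below is invented for demonstration.

links = {
    "suspect_a": {"suspect_b", "contact_1"},
    "suspect_b": {"suspect_a", "contact_2"},
    "contact_1": {"suspect_a", "contact_3"},
    "contact_2": {"suspect_b"},
    "contact_3": {"contact_1"},
}

def snowball(seeds, links, rounds=2):
    """Grow a candidate network outward from seed members, one hop per round."""
    network = set(seeds)
    for _ in range(rounds):
        frontier = {name for person in network for name in links.get(person, ())}
        network |= frontier
    return network

print(sorted(snowball({"suspect_a"}, links)))
# Each new name pulled in can then be investigated, adding its own links.
```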

Just days after the Sept. 11 attacks, Krebs typed three names into his database: Mustafa Ahmed al-Hawsawi, the mission’s paymaster; Mohamed Atta, its ringleader; and Nabil al-Marabh, another financier.


Using InFlow, Krebs connected the dots. The program weighed the importance and frequency of the links between the terrorists. Strong ties, such as living in the same apartment and communicating frequently, were assigned more importance.

Krebs eventually made a startling conclusion: “Back in the year 2000 we knew two of those guys, Khalid Almihdhar and Nawaf Alhazmi. The CIA had spotted them at a meeting in Malaysia and they’d been in the U.S. So if you knew that, and you knew they were associated, they’d have been a good pair to start mapping. … In 2001, when someone said we have to find these guys, they couldn’t. If someone had had this map, they could have found them in a day.”
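(The principle of weighting strong ties and then looking for the best-connected members can be sketched with a standard network library. The graph below uses generic labels and invented weights _ a toy example, not Krebs’s data or InFlow’s actual metrics.)

```python
# A toy illustration of weighting ties and ranking members by how strongly
# connected they are, using the networkx library. Names and weights invented.

import networkx as nx

g = nx.Graph()
# Strong ties (shared apartment, frequent contact) get higher weights.
g.add_edge("ringleader", "pilot_2", weight=3.0)
g.add_edge("ringleader", "paymaster", weight=2.0)
g.add_edge("ringleader", "muscle_1", weight=1.0)
g.add_edge("pilot_2", "muscle_2", weight=1.0)
g.add_edge("paymaster", "muscle_2", weight=0.5)

# Weighted degree: who sits at the hub of the strongest ties?
strength = {n: sum(d["weight"] for _, _, d in g.edges(n, data=True)) for n in g}
for name, score in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{name:>10}: {score}")
```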

Kathleen Carley looks at social networks as dynamic relationships that change over time. When two people interact, the network changes, and sometimes small-scale events result in large-scale effects.

The promise of dynamic network analysis lies in its ability to identify emerging threats, which is why Carley, director of the Center for Computational Analysis of Social and Organizational Systems at Carnegie Mellon University in Pittsburgh, constantly runs network scenarios through her computer.

Like Krebs, Carley begins each analysis by collecting data from publicly available sources. She uses a text-mining tool called “Auto Map,” which looks through newspapers, documents and Web pages (100,000 a day) to identify relationships.

Then she runs a series of virtual experiments where she removes each identified member, one by one, and watches what the computer “learns” about how the network will react.


“We collect data on real-world groups. We set up beginning conditions _ who is talking to who and at what time. Then we set up `what-if’ scenarios.”
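(Carley’s node-removal experiments can be roughed out in the same spirit: delete each member in turn and measure how badly the remaining network fragments. The sketch below uses an invented network and a single crude metric, nothing like the full simulations run at her center.)

```python
# A crude sketch of a "what-if" node-removal experiment: remove each member in
# turn and see how much the rest of the (invented) network falls apart,
# measured here by the size of the largest remaining connected group.

import networkx as nx

g = nx.Graph([
    ("leader", "deputy"), ("leader", "cell_1"), ("leader", "cell_2"),
    ("deputy", "cell_3"), ("cell_2", "cell_3"),
])

baseline = len(max(nx.connected_components(g), key=len))
for member in list(g.nodes):
    trimmed = g.copy()
    trimmed.remove_node(member)
    largest = len(max(nx.connected_components(trimmed), key=len))
    print(f"remove {member:>7}: largest connected group shrinks {baseline} -> {largest}")
```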

Last year, Carley ran such a scenario to see what her software would predict if Sheik Ahmed Yassin, the spiritual leader of the Palestinian terrorist group Hamas, were assassinated. Its answer: Abdel Aziz Rantisi would rise to power and temporarily strengthen Hamas.

Two weeks later, when Israeli missiles killed the sheik, Rantisi took over, briefly revitalizing the militant group.

“DNA (dynamic network analysis) can’t predict specific actions, but it can tell who is getting stronger,” said Carley. “The ability to use these tools for forecasting about terror has happened just now, this year … (and) it could have predicted Mohamed Atta was a key operative.”

The “what-if” scenario also is the bread and butter of Bruce Horne, director of “Project Albert” (named after Albert Einstein) at the Marine Corps Warfighting Laboratory in Quantico, Va. The project’s goal is to find a networking tool that can assess an enemy’s next move and offer an offensive maneuver to thwart it.

“Our tool isn’t for predicting, but to better understand what course of action needs to be taken,” Horne said.


Hamid, who now practices a nonviolent form of Islam and believes the Quran has been distorted by the jihadis, decided to begin with a simple act.

Before leaving college, he carried two boxes of books _ all espousing jihad _ into his yard. He poured gasoline over the pile and set it on fire.

A healed mind, he felt, could rise only from the ashes of hate. While he watched the blaze, he remembered reading the Bible on his own in high school, and the words that had so moved him.

“Blessed are the merciful, for they will be shown mercy.”

“Blessed are the pure in heart, for they will see God.”

“Blessed are the peacemakers, for they will be called sons of God.”


(Amy Ellis Nutt is a staff writer for The Star-Ledger of Newark, N.J.)
