After 30 years of studying the brain as a guide to building faster computers, Kwabena Boahen may have given his fellow researchers a much-needed template for finishing the job.
His story, as a Stanford professor who immigrated from Ghana, exemplifies what America stands to gain from immigration, and what it risks losing by inhibiting it.
You might not know it from the promising talk of conversational computers, self-driving cars and lifelike human prosthetics, but computing is confronting a crisis. After decades of rapid acceleration, the speed of transistors — computing’s fundamental building blocks — is hitting a wall.
But to the relief of researchers and industry leaders, a superstar scholar who immigrated to the United States from Ghana may have found a way forward. It concerns neuromorphic computing — mimicking the processes of the human brain — which has long been recognized as both a hugely daunting technological challenge and the likely key to solving the transistor problem.
Amid broad scientific uncertainties over how best to proceed, Kwabena Boahen, a professor of bioengineering and electrical engineering at Stanford University who is one of the field’s pioneers, has now outlined a badly needed road map for finishing the job.
It appears to be a “transitional moment” for neuromorphic computing, said R. Stanley Williams, a senior fellow at Hewlett Packard Labs.
The central barrier to a future of ever-smarter products is well understood: Given the simple constraints of matter, transistors are now being made almost as small as possible. Researchers sounding the alarm bells “have been at this for a long time,” said Randal E. Bryant, a professor of computer science at Carnegie Mellon University. As Mr. Bryant acknowledged at a recent scientific conference, “there’s a certain point where atoms are atoms.”
That’s where Mr. Boahen and his fellow “neuromorphs” come in. Neuromorphic computing is an extremely complicated mix of advanced sciences, including human biology, physics, mathematics, and chemistry. The goal is to build automated systems far faster and more efficient than current computer-chip technologies allow, largely by using the human brain as their template.
A key insight is a back-to-the-future concept: Give up our heavy reliance on digital systems, and return to analog.
The Analog Brain
At its foundation, the transistor is a simple device: three wires sticking out of a tiny sandwich of materials arranged so that a sufficient electrical signal applied to one of those wires enables current to flow between the other two.
By taking the basic on-or-off switching ability of a single transistor — a signal that can be represented as either a zero or a one — and replicating it thousands, and eventually billions, of times within a device, scientists and engineers have made generations of astounding products, including the computers that helped put humans on the moon.
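One way to picture this is to treat each transistor as a voltage-controlled switch and wire a pair of them into a logic gate. The sketch below is purely illustrative (the 0.7-volt threshold is an arbitrary stand-in for a real device's switching point), but it shows how on-or-off switches compose into computation:

```python
# Illustrative sketch: a transistor modeled as a voltage-controlled switch,
# with two switches composed into a NAND gate. The 0.7 V threshold is an
# arbitrary stand-in for a real device's switching point.

def transistor(gate_voltage, threshold=0.7):
    """Current flows (True) only when the control wire's signal is strong enough."""
    return gate_voltage >= threshold

def nand(a_volts, b_volts):
    """A NAND gate: the output is pulled low only when both transistors conduct.
    NAND is universal, so any digital circuit can be built from copies of it."""
    return not (transistor(a_volts) and transistor(b_volts))

# Treating voltages at or above 0.7 V as "one" and lower voltages as "zero":
print(nand(1.0, 1.0))  # False -> 0
print(nand(1.0, 0.0))  # True  -> 1
```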
The yes-or-no character of those signals provides some clear advantages over analog, which refers to a signal that varies continuously within a broad range of possibilities. Old analog TVs, for instance, were notoriously susceptible to electrical interference from changes in weather or from distant broadcast antennas: because an analog receiver has no way to tell noise apart from the picture itself, slight electrical disturbances showed up directly on screen as fuzzy, faded images.
Digital signals, by contrast, are built to be read without error. For a TV, or a computer, one strategy to avoid transmission errors is to send signals at relatively high power, ensuring that each unit of data is clearly read as a zero or a one.
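A minimal simulation makes the trade-off concrete. In the sketch below (an illustration with arbitrary signal levels and noise amplitude, not drawn from the article), the same interference permanently distorts an analog reading, while a digital receiver that only asks whether each value is closer to zero or to one recovers the data exactly:

```python
# Digital noise immunity in miniature: the same interference corrupts an
# analog signal for good, but thresholding restores the digital bits exactly.
# Signal levels and noise amplitude here are arbitrary illustrative values.
import random

random.seed(42)
data = [1, 0, 1, 1, 0, 0, 1, 0]

# Transmit each bit as a voltage, then pick up interference along the way.
sent = [float(bit) for bit in data]
received = [v + random.uniform(-0.3, 0.3) for v in sent]

analog_readout = received                                   # noise is now part of the "picture"
digital_readout = [1 if v > 0.5 else 0 for v in received]   # thresholding restores the bits

print(digital_readout == data)  # True: the digital signal survives intact
```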
But that demand for power limits size and performance. And the reliance on digital is just one way in which modern computer systems remain, at least compared with the brain, highly inefficient. While modern digital computers feed their processing tasks into thousands of “cores,” the brain’s processors — its neurons — number in the billions.
The brain is also, in computer terms, an ever-adjusting mix of software and hardware that grows and disposes of its processing equipment as needed. A digital representation does help computers process data very accurately. But a computer’s inability to seamlessly trade precision for speed, and to widely share and adjust processing duties, imposes far more critical limitations.
The brain knows these things. A product of billions of years of evolution, the essential organ of humanity is a marvel of efficiency. Mr. Boahen calculates that matching the brain’s computational power takes a refrigerator-sized supercomputer that is about 50 times heavier, takes up 100 times more space and consumes 100 times more power than the brain itself.
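For a rough sense of scale, the arithmetic can be worked through using commonly cited reference values for the brain, roughly 20 watts, 1.4 kilograms and 1.2 liters; those baseline figures are assumptions here, not numbers from Mr. Boahen:

```python
# Back-of-the-envelope arithmetic for Mr. Boahen's ratios, using commonly
# cited reference values for the brain (assumed here; not from the article).
brain_power_watts = 20.0   # rough metabolic power draw of the human brain
brain_mass_kg = 1.4        # typical adult brain mass
brain_volume_liters = 1.2  # typical adult brain volume

# A machine of comparable computational power, per the stated ratios:
print(f"power:  ~{100 * brain_power_watts:.0f} W")    # ~2000 W
print(f"mass:   ~{50 * brain_mass_kg:.0f} kg")        # ~70 kg
print(f"volume: ~{100 * brain_volume_liters:.0f} L")  # ~120 L, refrigerator-sized
```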
The brain does use a type of binary signal to relay data, he said. But those signals are many thousands of tiny blips of electrical information that are processed by the brain in essentially an analog, or continuous, fashion. That combination of digital and analog, Mr. Boahen said, is “fundamental to the difference between the computer and the brain.”
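The standard textbook abstraction of that hybrid scheme is the leaky integrate-and-fire neuron, sketched below. It is a generic model, not Mr. Boahen's own circuitry: the cell's potential accumulates continuously, analog-style, but its output is an all-or-nothing digital spike.

```python
# A leaky integrate-and-fire neuron, the textbook abstraction of the hybrid
# scheme described above (a generic model, not Mr. Boahen's circuits).
# The membrane potential evolves continuously (analog); the output is a
# train of all-or-nothing spikes (digital).

def integrate_and_fire(inputs, leak=0.9, threshold=1.0):
    """Accumulate analog input with leakage; emit a binary spike at threshold."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # continuous integration
        if potential >= threshold:              # all-or-nothing decision
            spikes.append(1)
            potential = 0.0                     # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady analog drip of input produces a rhythmic digital spike train.
print(integrate_and_fire([0.3] * 12))  # [0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1]
```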
A Neuromorph’s Manifesto
Mr. Boahen has been working for 30 years to make computers act more like brains. But it was in October, at the Institute of Electrical and Electronics Engineers’ first International Conference on Rebooting Computing, that his efforts gained a significantly new level of appreciation.
The son of Albert Adu Boahen, the professor of history at the University of Ghana who helped lead his nation to democratic rule, Mr. Boahen did not arrive at the session as a revolutionary. But he may have left as one.
In a 53-minute presentation, Mr. Boahen outlined five main challenges to producing a working computer based on neuromorphic principles. Each point was highly technical — covering such challenges as developing circuits that “gracefully” respond to signals and work without external timing cues — but the cumulative effect was that of a salvo.
“During his talk,” said Mr. Williams, who is also an adjunct professor of chemistry at UCLA, “you could see people’s eyes growing to the size of whale’s eyes, and smoke coming out of their ears.”
Many in the audience had spent years pursuing neuromorphic computing, Mr. Williams said, and Mr. Boahen had just given the research community a concise manifesto making clear which avenues of exploration deserved more attention and which were probably a waste of time.
“He effectively convinced everybody else in the room, who thought they were doing neuromorphic computing, that they didn’t actually know what it was,” Mr. Williams told last month’s annual conference of the American Association for the Advancement of Science in Boston. “In his talk, he defined for the first time really what neuromorphic computing is.”
The presentation also showed how far ahead of the pack Mr. Boahen appears to be. He already has built a small robot with a functioning mechanical arm using neuromorphic chips. Among the five challenges he listed, Mr. Boahen already has largely solved four, Mr. Williams said in an interview. “Others maybe have gotten to one or two,” he said.
The last remaining obstacle, Mr. Boahen explained, involves figuring out ways to accurately convey continuously changing signals in series of “spikes,” the blips of electrical signal that the brain uses to pass along data. Patterns of those spikes are like super-precise versions of the bar codes found on supermarket items, and reading their timing and placement is central to the brain’s internal communication.
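One simple way to illustrate timing-as-data is a latency code, in which a stronger input fires sooner. The scheme below is an illustrative toy, not a claim about the brain's actual code or about Mr. Boahen's solution:

```python
# Timing-as-data in miniature: a latency code, where a value is carried not
# by how many spikes arrive but by *when* a spike arrives. An illustrative
# toy scheme, not the brain's actual code.

def encode(value, window_ms=10.0):
    """Map a value in [0, 1] to a spike time: stronger input fires sooner."""
    return (1.0 - value) * window_ms

def decode(spike_time_ms, window_ms=10.0):
    """Invert the code: recover the value from the spike's arrival time."""
    return 1.0 - spike_time_ms / window_ms

stimulus = 0.73
t = encode(stimulus)        # spike arrives 2.7 ms after the reference time
print(round(decode(t), 2))  # 0.73 -- the timing, not the count, carries the data
```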
Mr. Boahen’s success is a tribute to the value of thinking slowly and carefully, and not worrying about “the latest trend,” Mr. Williams said. It’s also a reminder, at this moment in American history, of the value of immigrants, he said.
A ‘Traumatizing’ Climate
After a recent speech at the University of Illinois at Urbana-Champaign, Mr. Boahen met with a graduate student from Lebanon who had a long list of questions about his work. The two also talked about U.S. politics, which had become “really scary for him,” Mr. Boahen said of the student. Lebanon is not among the majority-Muslim countries affected by President Trump’s travel-ban proposals, and the student is Christian.
But the student spoke of fear that his elderly parents might die if denied entry at a U.S. airport and forced to make back-to-back 20-hour flights. He described being warmly accepted on campus but encountering racism not too far out of town. And the student wondered whether those conditions, combined with a lukewarm American commitment to funding research, meant that he would eventually have to take his computing talents elsewhere.
Mr. Boahen said he had tried to reassure the student, who asked not to be identified by name, that circumstances change. “But it’s really a shame,” Mr. Boahen said, “that people who just came here don’t have that context, and it’s really traumatizing what we are putting them through.”
At least Mr. Boahen is not worried for himself: He obtained U.S. citizenship last year. “I could see the writing on the wall,” he said. “My father fought a dictatorship, and I don’t want to try.”
Source: The Chronicle of Higher Education.