MADURAI, India – Aravind Eye Hospital treats anyone who comes through its doors, with or without money.
Every day, more than 2,000 people from all over India, and sometimes other parts of the world, stream into the halls and waiting rooms of this 43-year-old hospital in the southern part of the country. On a recent morning, Muthusamy Ramalingamm, a local resident, entered a second-floor room, sat down and rested his chin on a small stationary unit that pointed a camera at his eyes.
A technician tapped a screen at the back of the eye scanner, and within a few seconds a diagnosis appeared on a computer against the wall. Both eyes showed signs of diabetic retinopathy, a condition that can cause blindness if left untreated.
The project is part of a widespread effort to build and deploy systems that can automatically detect signs of illness and disease in medical scans. Hospitals in the US, UK and Singapore have also been running clinical trials with systems that detect signs of diabetic blindness. Researchers around the world are exploring techniques that detect cancer, stroke, heart disease and other conditions in X-rays and in M.R.I. and CT scans.
Last month, regulators certified the eye system for use in Europe under the Verily name. And the Food and Drug Administration recently approved a similar system in the United States. But hospitals are treading carefully, because these systems are very different from the technology traditionally used in health care.
Aravind’s founder, Govindappa Venkataswamy, an iconic figure in India known as “Dr. V” who died in 2006, envisioned a network of hospitals and vision centers that would function like McDonald’s franchises, systematically reproducing low-cost eye care for people all over the country. There are more than 40 vision centers around India.
The eye system relies on a neural network, the same technology that is rapidly improving facial recognition services, talking digital assistants, driverless cars, and instant translation services like Google Translate.
Because these systems learn from vast amounts of information, researchers are still struggling to fully understand how they work – and how they will ultimately behave. But some experts believe that when they are properly built, tested and deployed, they can fundamentally improve health care.
At Aravind, computer screens on the waiting-room walls translate information into the many languages spoken at the hospital. During his screening, Ramalingamm, 60, spoke Tamil, the ancient language of southern India and Sri Lanka. He said he was comfortable with a machine diagnosing his eye condition, partly because it happened so quickly. After the A.I. system’s initial screening, doctors can treat his eyes, perhaps with laser surgery, to avert blindness.
Doctors can sometimes make a diagnosis even when cataracts blur the eye scans. The Google system still struggles to do so. It was largely trained on clear, unobstructed retina images, although Google is exploring the use of lower-quality images.
Even with this limitation, Dr. Kim said, the system can augment what doctors can do on their own. Aravind already runs small vision centers in many of the towns and villages that surround Madurai. The hope is that Google’s system can make eye screening easier at these facilities, and perhaps at other places in southern India.
Today, technicians take eye scans at these centers and send them to doctors in Madurai for examination. Automated diagnosis could streamline and expand the process, reaching more people in more places – the kind of “McDonaldization” embraced by Dr. V.
The technology still faces regulatory hurdles in India, partly because of the difficulty of navigating the country’s bureaucracy. And while Google’s eye system is now certified for use in Europe, it is still awaiting approval in the US.
Luke Oakden-Rayner, director of medical image research at Royal Adelaide Hospital in Australia, said that these systems might need new regulatory frameworks, as existing rules are not always sufficient.
“I’m not convinced that people care enough about the security of these systems,” he said.
Although these deep learning systems are new, they are hardly the first effort to support diagnosis through computer technology. As Dr. Oakden-Rayner pointed out, software called breast CAD – approved by the Food and Drug Administration in 1998 – became widely used in the United States to help detect breast cancer, partly because insurers offered reimbursements for using the technology. However, studies showed that patient outcomes did not improve and in some cases declined.
“On paper, Google’s system works very well,” said Dr. Oakden-Rayner. “But when you roll it out to a large population, there may be problems that don’t show up for years.”