GNN - Genome News Network  
Genomes and Medicine
New Drugs, New Challenges: FDA Tests Genome Data
By Edward R. Winstead


Rapid progress in genomics is driving the U.S. Food and Drug Administration to think long and hard about how microarray data, DNA tests, proteomic studies, and other genomic technologies will fit into the regulatory picture that governs approval of new drugs and therapeutic devices.

FDA, PhRMA and others struggle to deal with massive amounts of microarray data.

"The FDA believes, as do I and most people in the field, that genomic technologies and data are going to be important in developing new drugs and therapeutic devices," says James MacGregor, FDA's deputy director of the National Center for Toxicological Research in Bethesda, Maryland.

But the technologies are new, their results are not always reproducible, and in some cases their significance has yet to be confirmed.

"The agency in general is trying to learn more about these technologies through meetings, seminars, and scientific workshops, but we're still very much on a learning curve," says Joseph L. Hackett of the FDA's Center for Devices and Radiological Health, in Rockville, Maryland—one of the FDA's many specialized centers.

"We might see a test for cancer diagnosis, or a test for identifying a microorganism, or a test for pharmacogenomics to see whether a patient is likely to respond to a drug," he notes. Because these tests will be based on genomic data, FDA reviewers will need to be conversant in the technology.

This week the FDA met with representatives of the Pharmaceutical Research and Manufacturers of America (PhRMA) in Washington, D.C., for what was planned as a two-day meeting on genomics but was cut short by Hurricane Isabel. About 180 people at the FDA, including scientists and reviewers, exchanged ideas with industry experts.

“We recognize that a lot of the innovations we’re going to be seeing soon started in industry, and we want to be up to speed on them,” says the FDA’s Suzanne Fitzpatrick, who organized the meeting.

“The meeting was strictly about the science, and not about how the FDA is going to regulate this or that,” Fitzpatrick adds.

At a meeting of an FDA advisory panel this past June, some scientists said it would be premature to create regulatory standards because genomic technology is changing all the time. But without standards, the FDA cannot use genomic data in its regulatory decisions. So the question becomes: When is the technology solid enough to use, and how will the agency decide?

Indeed, at the moment, the ability to generate data far exceeds the ability to make sense of it.

At the June advisory panel meeting, William D. Pennie, an expert on microarray data at Pfizer Inc., raised the issue of genomic data overload.

Pennie said that the ability to generate vast quantities of data in a single experiment, initially seen as a great advantage in the field, had quickly become a great challenge for the people managing, storing, and interpreting the many millions of data points.

These problems have been the focus of the Microarray Gene Expression Data Society, an international organization of biologists, computer scientists, and data analysts. Their goal is to facilitate the sharing of microarray data, and they have proposed standards that many in the field have adopted.

Still, many people are betting that microarrays will help drug makers identify people who are genetically predisposed to respond to one drug better than another. Microarrays could also reveal the genetic profiles of people who are likely to get sick from taking certain medicines.

But given the questions about the validity of microarray data, pharmaceutical companies are nervous that the FDA could reject an application after misinterpreting data. The FDA believes that the sooner it becomes familiar with microarray data the better it will be for everyone, including the public.

“We’re hoping manufacturers will submit data as educational tools for ourselves, and there will be a joint workshop with industry in November to discuss this,” says Hackett.

The November meeting will explore the idea of a “safe harbor” in which companies can submit microarray data without worrying that it will count against them. The idea came out of a joint workshop last year. The proposal, which is still a work in progress, invites “exploratory” data, defined as data “generated by technologies whose validity was uncertain, and whose interpretation was tenuous, uncertain or not predictive of outcomes.”

The FDA has several research centers, and each has programs to evaluate tools such as DNA microarrays, which scientists use to track activity across the genome. In the laboratory, microarrays have been used to distinguish between patients who appear to have the same form of cancer, for instance, but whose diseases have distinct genetic signatures—indicating they are not the same disease at all.
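The basic idea behind that kind of analysis can be illustrated with a toy sketch: patients whose gene-expression profiles correlate strongly are grouped together, and a cohort with one clinical diagnosis may split into groups with distinct molecular signatures. The expression values, patient names, and grouping rule below are all invented for illustration; this is not any FDA or published method.

```python
# Illustrative sketch: grouping patients by the correlation of their
# gene-expression profiles. All data here are made up for demonstration.
from math import sqrt

def pearson(a, b):
    """Pearson correlation between two expression vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(sum((x - ma) ** 2 for x in a))
    sb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Hypothetical expression levels of five genes in six patients who all
# received the same clinical diagnosis.
patients = {
    "P1": [8.1, 2.0, 7.9, 1.2, 6.5],
    "P2": [7.8, 2.3, 8.2, 1.0, 6.9],
    "P3": [2.1, 7.9, 1.5, 8.3, 2.2],
    "P4": [1.8, 8.4, 1.9, 7.7, 2.6],
    "P5": [8.3, 1.7, 7.5, 1.4, 6.1],
    "P6": [2.4, 8.1, 1.2, 8.0, 2.0],
}

def group_by_signature(profiles, threshold=0.8):
    """Greedy single-link grouping: a patient joins an existing group if
    their profile correlates above `threshold` with any member of it."""
    groups = []
    for name, vec in profiles.items():
        for g in groups:
            if any(pearson(vec, profiles[m]) > threshold for m in g):
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

print(group_by_signature(patients))
```

Run on the toy data, the six patients fall into two groups with opposite expression patterns, the kind of split that in real studies has suggested two molecularly distinct diseases hiding under one diagnosis.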

Before tests such as these, which are still in research phases, can be applied to the general patient population, the FDA needs to establish criteria for evaluating their accuracy and usefulness.

“We are trying to educate our staff so they can deal with this technology, and we are working with industry to develop guidelines” for submitting genomic data, says Raj K. Puri, acting director of the Division of Cell and Gene Therapies at the FDA’s Center for Biologics Evaluation and Research.

Adds MacGregor: "We're learning about the technologies, but the fact is everyone else is learning about these technologies because they're new."

The newness is a problem and a challenge. There are no universal standards for analyzing and reporting microarray or other genomic data. What’s more, it’s difficult to compare data from one study to the next without detailed information about how experiments were done.

Meanwhile, the FDA has two pilot projects to explore the use of genomic databases and technologies for storing and submitting genomic data. In one project, Expression Analysis, Inc., of Durham, North Carolina, is preparing a “mock submission” of microarray data for the FDA. The data come from a toxicology study conducted by Schering-Plough Corp. for a drug that did not work out.

“We are helping the FDA to better understand how microarray data might be used in regulatory submissions,” says Steve McPhail, CEO of Expression Analysis. “Our role is to work with the FDA and industry to establish regulatory and quality benchmarks for this technology.”

“The FDA certainly sees the potential of this technology in personalized medicine,” says McPhail. “It is personalized medicine in the sense that microarrays can appropriately identify people as responders or non-responders to drugs based on their genetic profiles.”

Janet Woodcock, who directs the FDA’s Center for Drug Evaluation and Research, told the advisory panel in June: “This is really about the translation of innovative science to bedside medicine.”

"There are a lot of actual research activities going on within the FDA," adds James MacGregor. "It's a misconception that we don't understand the technology; we have a number of experts, and we're evaluating the tools, although many of the reviewers may not be familiar with them yet."

. . .
