When you received a diagnosis of diabetes, it likely was based on the results of one of three tests (or a combination of these tests): a fasting plasma glucose test, an oral glucose tolerance test, or a random plasma glucose reading. For many years, these have been the standard tests for assessing people with symptoms of diabetes and for screening people believed to be at risk of developing diabetes.
In 2010, the American Diabetes Association began recommending that another blood test, the glycosylated hemoglobin test (abbreviated either HbA1c or A1C), be used for diagnosing Type 1 or Type 2 diabetes. The A1C test has long proved its usefulness for monitoring the level of blood glucose control over time in people already diagnosed with diabetes. In its new diagnostic role, it is not intended to replace the older options completely; they may still be preferred in certain circumstances. But the A1C test is believed to have certain advantages over them, particularly when screening for Type 2 diabetes.
Changes over the years
This is not the first time in history that new methods of diagnosing diabetes have been developed or have become more widely accepted by the medical profession. Indeed, diabetes has been around a lot longer than any modern laboratory blood test.
The word “diabetes” is derived from a Greek word meaning to siphon, pass through, or rapidly drain. (Accounts differ as to when it was coined; some say 200 BC, others the second century AD.) But many centuries before the term was ever applied to the condition we now know as diabetes, physicians recognized a disease in which too much urine was eliminated. Somewhere along the line, it was noticed that this urine had a sweet smell and taste (and attracted insects and other animals), and for hundreds of years, the sweet taste of the urine was the key to diagnosing diabetes.
The relationship between sweetness in the urine and sweetness in the blood was observed in 1776 by English physician Matthew Dobson. When Dobson evaporated the urine of people with diabetes, he found a substance that looked and tasted like brown sugar, and he also observed that people with diabetes had a sweetish taste to their blood. This led to the understanding of diabetes as not just a disease of the kidneys but as a system-wide disorder.
In the 1800s, the first laboratory tests for glucose in the urine were developed, sparing physicians from having to taste it. The earliest tests simply detected whether glucose was present in urine; later ones could measure how much. A further advance, attributed to British physician Frederick Pavy, was to establish a quantitative relationship between the level of glucose in the blood and the level of glucose in the urine.
The 1900s brought better ways to measure glucose in urine, as well as the first blood glucose tests.
As might be expected, the first blood glucose tests were not especially accurate, but increasingly precise tests have been developed over time, leading to a greater understanding of the disease itself.
How today’s tests work
As mentioned earlier, the four tests currently used in the United States to diagnose diabetes are the fasting plasma glucose (FPG) test, the oral glucose tolerance test (OGTT), the random plasma glucose test, and the A1C test.