While running does place a lot of force on a person's knees, cartilage may be able to adapt to that force and grow thicker, reducing a runner's risk of developing osteoarthritis in their knees later in life, according to a study published in PeerJ, Gretchen Reynolds reports for the New York Times' "Well."
In previous research, Ross Miller, an associate professor of kinesiology at the University of Maryland and a co-author of the new study, found that while people hit the ground harder when running than when walking, their running strides were also longer, meaning they took fewer steps to cover the same distance. From that, Miller theorized that the cumulative force exerted on an individual's knees over a given distance would be roughly the same whether that person walked or ran.
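The trade-off Miller described can be illustrated with simple arithmetic: higher force per step, but fewer steps per kilometer. The sketch below uses made-up, order-of-magnitude numbers (peak force in body weights and step lengths are illustrative assumptions, not figures from the study) just to show why the two products come out close.

```python
# Hypothetical illustration of the trade-off Miller described: running's
# harder footfalls are offset by longer steps (fewer steps per km), so
# cumulative knee load per kilometer ends up roughly comparable.
# All numbers below are illustrative assumptions, not study data.

def cumulative_load_per_km(peak_force_bw: float, step_length_m: float) -> float:
    """Total load per km = force per step x number of steps in 1,000 m."""
    steps_per_km = 1000 / step_length_m
    return peak_force_bw * steps_per_km

walking = cumulative_load_per_km(peak_force_bw=1.2, step_length_m=0.7)
running = cumulative_load_per_km(peak_force_bw=2.5, step_length_m=1.5)

print(f"walking: {walking:.0f} body-weights per km")  # -> walking: 1714 body-weights per km
print(f"running: {running:.0f} body-weights per km")  # -> running: 1667 body-weights per km
```

With these assumed inputs, running more than doubles the force per step, yet the cumulative load per kilometer differs from walking by only a few percent.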
But Miller later wondered whether that hypothesis really explained why running wasn't more damaging to people's knees, or whether cartilage in runners' knees might actually adapt over time. So, for the new study, he and other researchers asked 22 healthy young adults—12 men and 10 women—to walk and run at self-selected "normal and comfortable" speeds around a 50-meter indoor track. At the study's onset, none of the participants reported having had an injury in the past year that affected their ability to walk.
The researchers fitted a 12-meter section of the track with eight force platforms and 12 motion cameras to determine the force generated by the participants while walking and running. The researchers then fed the force-plate measurements, along with data from previous lab studies of cartilage and other sources, into computer simulations to determine what would happen to an adult's knee cartilage if they walked six kilometers a day for years, or if they walked three kilometers and ran three kilometers each day for years. For each of those situations, the researchers tested three scenarios, looking at what would happen to the participants' knees if:
- Knee cartilage did not repair itself or adapt at all;
- Knee cartilage repaired itself after damage but did not otherwise change; and
- Knee cartilage grew thicker and stronger to adapt to increased demand on the knee.
The researchers found that running does place more force on the knee than walking. Their simulations determined that, without accounting for cartilage repairing or adapting itself, people who walked every day had a 36% risk of developing arthritis in their knee by the age of 55. If cartilage repaired itself or adapted, that risk dropped to 13%, which matches what research has found to be the "real-world arthritis risk for otherwise healthy people," according to "Well."
In comparison, people who ran every day had a 98% risk of developing arthritis in their knee by the age of 55 if the cartilage did not repair itself or adapt, and a 95% risk of developing arthritis if the cartilage did repair itself. If the cartilage adapted, however, runners' risk of developing arthritis dropped to 13%, the researchers determined.
According to Miller, the new study's results suggest that cartilage is malleable, but researchers need measurements of molecular and other changes that occur within cartilage after running to verify their findings. Even so, the study seems to suggest that "running is unlikely to cause knee arthritis by wearing out cartilage," Miller said.
The researchers cautioned that while previous research on animals such as dogs and horses suggests that cartilage does adapt to increased demand by growing thicker and becoming more elastic, "similar data on running training in humans [is] not available, and even these animal model studies are not repeated-measures designs."
However, the previous research on animals suggests the cartilage adaptations that the researchers modeled for humans in the study "are not implausible over years of consistent training," they wrote (Reynolds, "Well," New York Times, 10/22/20; Miller/Krupenevich, PeerJ, 8/5/2020).