Summers are shattering heat records, UV exposure has been intensifying over recent decades, and the messaging around sunscreen and sun protection has taken on an urgency well beyond its old cautionary undertone.
Wear an SPF-30 broad-spectrum sunscreen at all times, reapply every couple of hours, and avoid the sun as much as possible, they say. The American Academy of Dermatology recommends sunscreen for everyone: “Sunscreen use can help prevent skin cancer by protecting you from the sun’s harmful ultraviolet rays. Anyone can get skin cancer, regardless of age, gender, or skin tone.”
It was surprising, therefore, when a review published in the Journal of the American Academy of Dermatology concluded that the universal recommendation to use sunscreen (to prevent skin cancer) wasn’t backed by robust data.
“Sunscreen is a multi-billion dollar industry, and its efficacy in the prevention of skin cancer is often taken as fact,” the authors wrote. “Despite this, there are only four prospective studies that examine sunscreen’s role in preventing skin cancer, and none of these examine the efficacy of sunscreen in preventing skin cancer in otherwise healthy individuals.”
Does this mean that the confusion around sun safety is a direct result of poor scientific scrutiny? Maybe. One in seven adults under 35 believes that daily sunscreen use is more harmful to the skin than direct sun exposure.
People question whether sunscreens are safe; whether they’re picking the right “type” or formulation; whether sun protection is a sham; or whether sunscreens themselves might be causing cancer.
Here’s the short answer: No, sunscreens are not bad for you. People making false claims about sunscreens sometimes cite data showing the exploding rates of melanoma. Since the mid-1970s, the incidence rate of melanoma has skyrocketed over threefold; it is now one of the most commonly diagnosed cancers in the United States. Coincidentally, this happened over a period when sunscreen technology and adoption improved considerably.
But multiple analyses suggest that increased UV exposure isn’t the primary driver of this epidemic. Instead, they make the case that by screening more people, performing more biopsies, and classifying more ambiguous lesions as cancer, healthcare providers have been flagging too many harmless skin spots as cancerous; it’s a crisis born of overdiagnosis.
To be clear, this doesn’t rule out the connection between UV exposure and skin cancer risk, or play down the threat of skin cancer. Overwhelming evidence shows that over time, exposing the skin to the sun causes wrinkling, sagging, age spots, and yes, occasionally even skin cancer.
The argument here isn’t against sunscreens either. Dermatologists almost unanimously agree that sunscreen is safe, and it has been shown to cut melanoma rates by more than half when used as directed.
The problem is that information on precisely how effective sunscreen is, and in which formulations, is far scarcer. It’s as if consumers are being asked to apply sunscreen and simply hope that it works.
Concerns around sunscreen safety were stoked in part by the FDA itself, after it published a study in 2020 that found trace amounts of some sunscreen ingredients in the human bloodstream.
The Federal Food, Drug, and Cosmetic Act of 1938 effectively limits the FDA’s ability to approve new chemical filters for sunscreens: it classifies sunscreens as drugs, rather than as cosmetics as much of the world does, which requires significantly more testing. Oxybenzone, one of the most common filters in sunscreens and central to this controversy, hasn’t yet been proven harmful to humans, even if we know that our bodies absorb it.
As a result, the FDA has not approved a single new sunscreen filter since the 1990s, insisting instead on retesting existing ingredients for as-yet-unproven risks.
And the evidence usually comes from animal studies, whose results may not be replicable in humans. The agency may even cite tests that bear no resemblance to real-world use, like feeding rats sunscreen ingredients and exposing them to oxybenzone levels that no human would reach even after applying sunscreen every day for 277 years.
More expensive U.S. products have improved in texture; they may no longer leave behind a white film, but their active ingredients remain unchanged, even though tens of millions of Americans have used sunscreen daily for decades without any of those hypothetical problems materializing. Things are muddled further because health authorities themselves remain divided over sun exposure and its range of drawbacks (and benefits).
The long-standing consensus among American public health authorities has been to avoid the sun as much as possible, ignoring a growing body of science suggesting that a little sun carries benefits for bone health, mood, circadian rhythms, and vision.
A recent article in the Atlantic noted how a consortium of Australian public health groups now recommends a modest amount of sun exposure based on some of that evidence, despite Australia reporting the highest incidence of skin cancer globally.
“Exposure to UV radiation may have benefits independently of vitamin D, particularly for the immune system,” read the group’s statement. It advises people with a low to intermediate risk of skin cancer to get enough sun exposure to obtain these benefits.
All told, sunscreen remains indispensable to self-care, since its benefits far outweigh the proposed risks. But a prudent approach to picking one is understandable, especially for Americans, who are unlikely to get access to better sunscreens: ones that block ultraviolet rays (both UVA and UVB) more effectively, or are less readily absorbed by the skin.
(Featured Image: Generated with AI)