Can't Touch This

This blog post is also available in audio form via the podcast link.

In full disclosure, I am an investment advisor, and I practice what I preach. I have a method that I believe works, but as every compliance officer will remind you, past performance is no guarantee of future performance. At the end of this podcast, you will hear the full disclaimer, and I encourage you to listen carefully. But let me set the stage for where I am coming from. I favor large, well-capitalized companies over small speculative ones. I focus on megatrends, not on flipping stocks, day trading, or chasing the latest meme stock making headlines. I am old-fashioned, old-school, and unapologetically traditional in my approach.

My emphasis is on income over growth, because the people I work with are not 22-year-old TikTok traders. Nearly every one of my clients is over the age of fifty, many are fully retired, and the average client is probably somewhere between their mid-seventies and early eighties. That perspective matters. When you are living on the results of decades of hard work, your primary concern is cash flow, preservation, and stability—not wild speculation.

Because of that, I have no problem calling things the way I see them, even when it ruffles feathers. In this discussion, you are going to see that I am willing to take on the medical industry, the insurance industry, Big Pharma, and any other entrenched interest that profits at the expense of the public. If you are ready for a frank, fact-driven conversation about health, medicine, industry, and how these forces intersect with your financial life, buckle up—this is going to be a good one.

From a historical standpoint, there is no doubt that human longevity has been influenced more by improved living conditions, environmental awareness, and fundamental public-health advances than by any single modern medical breakthrough. Disease, famine, unsanitary conditions, and the lack of emergency medical care once shaped the very structure of society. The most profound improvements in public health came not from miracle cures, but from sanitation. Clean running water, the recognition of basic nutritional needs, and the simple but powerful use of vitamins radically altered survival rates and extended lifespans across populations.

Consider polio as an example. Statistically, between 95 and 99 percent of those infected were asymptomatic—meaning they carried the virus but showed no outward signs of illness. They did not develop paralysis, fever, or the visible markers often associated with polio, yet they were counted in the broad measures of infection. This reality created a significant divide between perception and actual risk. The majority experienced no symptoms, but the minority who did suffer visible and lasting effects shaped the public’s understanding and fueled fear. The same dynamic resurfaced with COVID-19. A large percentage of individuals who tested positive were asymptomatic, experiencing no meaningful illness or impairment. Yet fear—driven by daily case counts, political decisions, and media amplification—overrode the statistical reality. Public policy was shaped less by the weight of evidence and more by the amplification of fear, much as it had been during the polio era.

Another dimension of the polio story is less frequently discussed: the role of vaccination itself in producing harm. Many individuals suffered adverse effects from early polio vaccination campaigns. In certain cases, recipients of the vaccine not only developed illness but contracted polio as a direct result of the vaccination process. This was most notably exposed in the Cutter Incident of 1955, when a batch of improperly inactivated vaccines led to thousands of cases of vaccine-induced polio. Some children were permanently paralyzed, and others died. When the numbers are compared, the sobering reality emerges: for a measurable period, the number of individuals harmed or infected by the vaccination process itself rivaled or even exceeded the number who would otherwise have been naturally infected and symptomatic. While the intention was prevention, the outcome was a tragic inversion—the cure, in some cases, became the cause.

The Ford Administration’s swine flu program in 1976 reinforces the pattern. After the death of a soldier at Fort Dix, New Jersey, health officials feared a repeat of the 1918 pandemic. President Gerald Ford authorized an ambitious plan to vaccinate every American. Within months, more than forty million people received the shot. But reports surfaced of Guillain–Barré Syndrome, a serious neurological disorder, developing in vaccine recipients. Although the numbers were small in relative terms, the pattern was undeniable, and the program was halted in December of that year. The anticipated pandemic never arrived, but the damage was done. Litigation followed, and Congress had already moved to indemnify vaccine manufacturers under the National Swine Flu Immunization Program Act. Liability shifted from industry to government, and taxpayers ultimately paid tens of millions in settlements. This episode set a lasting precedent: pharmaceutical companies would participate in rapid or large-scale vaccination campaigns only if shielded from responsibility. The corporate shield was reinforced, and public trust was shaken when harm came not from the disease itself but from the intervention meant to prevent it.

When we step back from these episodes, a broader pattern emerges. It is not simply about one disease or one administration’s failure. It is about the origins and trajectory of the very institutions that came to dominate American public health. From their inception, these organizations were not neutral arbiters of science. They were born out of crisis, funded through political channels, and quickly intertwined with corporate interests that shaped both their direction and their credibility.

The Centers for Disease Control and Prevention, for example, grew out of a World War II agency created to stop malaria from spreading through the American South and undermining wartime productivity. Its first major initiative was not a medical breakthrough—it was a chemical campaign. In partnership with Monsanto, the CDC promoted and deployed DDT, spraying vast swaths of land and exposing generations to compounds later tied to cancer and other long-term health issues. Few people realize that the CDC's headquarters in Atlanta sits on land acquired for a token sum from Emory University, an institution built on Coca-Cola money. Fewer still are aware that a Freedom of Information Act request uncovered emails showing CDC officials coordinating with Coca-Cola executives to suppress damaging information about sugar.

The same story repeats itself with the Environmental Protection Agency. In documents surfaced during the glyphosate litigation, a senior EPA official was quoted telling Monsanto that if he could kill another agency's review of the chemical's risks, he deserved a medal. This was not regulation—it was collusion. And the pattern is universal. The FDA, CDC, EPA, NIH—all developed hand in hand with the industries they were supposed to oversee. Their budgets, their buildings, and often their intellectual frameworks were underwritten by the same corporations whose products they regulated. Oversight became a closed loop: industry produced, regulators approved, industry profited, and regulators enjoyed influence, funding, and revolving-door career opportunities.

Now let us go back and talk about sugar. For all the attention given to oil barons and railroads, one of the most powerful forces in shaping both American law and American health has been sugar. In the late nineteenth century, the American Sugar Refining Company controlled roughly 98 percent of the nation's refined sugar. That dominance helped drive passage of the Sherman Antitrust Act of 1890, and the sugar trust became the subject of the first major antitrust case to reach the Supreme Court, United States v. E. C. Knight in 1895. People assume antitrust began with oil or railroads, but sugar was right there at the start. The reason it was called "antitrust" is that the dominant combinations of the day were organized not as modern holding companies but as trusts—legal arrangements where a handful of trustees controlled entire industries by holding the shares of many smaller companies. Breaking up those trusts was the focus, hence the term "trust-busting."

The transformation deepened when the Supreme Court decided *Santa Clara County v. Southern Pacific Railroad* in 1886. While the case itself was about taxation, the Court was recorded as accepting the idea that a corporation is a "person" under the Fourteenth Amendment. From that moment, corporate personhood hardened into doctrine. Corporations could own property, enter contracts, sue and be sued. Just as important, the corporate form, paired with limited liability, shielded the individuals running these entities from personal responsibility. Responsibility shifted from the people making the decisions to the corporate entity itself. If wrongdoing occurred, it was the corporation that bore the penalty, which in practical terms meant shareholders—ordinary citizens investing their savings—paid the price. The executives responsible often walked away untouched, sometimes even rewarded.

This recognition of corporate personhood institutionalized the corporate shield. It allowed for rapid industrial expansion but also created fertile ground for misconduct. As corporations became publicly traded, the burden of wrongdoing spread even further. The great scam was not only that corporations could act with impunity, but that the costs of their actions could be socialized across the investing public. Combined with increasingly complex financial products, this structure reinforced the distance between action and accountability. Reward was concentrated at the top, risk was pushed outward, and losses were absorbed by shareholders and taxpayers.

Sugar’s story did not end with monopoly power. What was once a luxury product became a hidden staple of the American diet. Processed sugar worked its way into nearly every food product, from cereals to soft drinks, from baked goods to condiments. The human body, designed to handle modest amounts of natural sugar found in fruit or honey, was bombarded with quantities it could not process. The consequences are now in plain view: obesity, diabetes, cardiovascular disease, chronic inflammation, liver damage, cognitive decline, and weakened immune systems. Excess sugar consumption fuels insulin resistance and drives depression and fatigue, yet it remains one of the most aggressively marketed ingredients in the food system.

The power of sugar is not just in its addictive quality but in the way it has shaped culture and commerce. Entire industries—soft drinks, snacks, breakfast cereals—depend on it. Advertising linked sugar to happiness, energy, and the American lifestyle. Scientific studies that questioned its safety were suppressed or drowned out by industry-funded research. Regulators turned a blind eye, even partnering with corporations like Coca-Cola to downplay the dangers.

The uncomfortable truth is that our food supply has been manipulated and contaminated for profit. Acknowledging this history forces us to confront how vulnerable society becomes when oversight agencies are not independent but entwined with the industries they regulate. The revolving door between regulators, Big Pharma, and big manufacturers ensures that conflicts of interest are not exceptions but the norm. When someone says “trust the government” or “trust the science” without scrutiny, it reflects either ignorance or complicity. Blind acceptance of federal agencies and their pronouncements is not only naïve but dangerous. History shows us that fear, profit, and influence have repeatedly outweighed truth, safety, and accountability. That pattern continues to this day, and it must always be questioned.

Let me jump ahead for a moment to the question of antidepressants and violence, because it frames everything that follows. The public record is mixed and often sealed, but there are clear, documented instances where perpetrators were on antidepressants or had recently been prescribed them. Eric Harris had therapeutic levels of fluvoxamine (Luvox) in his system at Columbine; Dylan Klebold's toxicology screen showed no medications. Kip Kinkel had been prescribed fluoxetine (Prozac) prior to the Thurston High shooting; contemporaneous accounts note the prescription and its discontinuation before the escalation. In Red Lake, Jeff Weise was on fluoxetine, with family reporting a recent dose increase to 60 mg per day. At Northern Illinois University, Steven Kazmierczak's girlfriend stated he had been prescribed Prozac (along with Xanax and Ambien) and stopped the SSRI several weeks before the attack. In the Navy Yard shooting, the House Oversight report and major outlets documented Aaron Alexis being prescribed trazodone, an antidepressant often used for sleep, shortly before the killings. Go back to 1988 and you find Laurie Dann on clomipramine (Anafranil) plus lithium at the time of the Winnetka school attack. There are also smaller, less lethal school cases that are still instructive: Elizabeth Bush's 2001 parochial-school shooting, widely reported alongside her treatment for depression, and Jason Hoffman's 2001 Granite Hills shooting, reported at the time as occurring while he had been treated with antidepressants—both covered contemporaneously by national outlets, though medical specifics are thinner in open records. Two more cases often appear in public discussions but require care in wording: Virginia Tech's Seung-Hui Cho had a significant mental-health history documented by the official review panel, but released records do not establish antidepressant use at the time of the attack, and Adam Lanza's official reports from Sandy Hook similarly do not confirm antidepressant medication at the time.

Here's the bottom line. The mechanistic "serotonin-deficiency" story that sold SSRIs has not held up under comprehensive review, while average drug benefits over placebo are modest. Boxed warnings acknowledge an early-treatment suicidality signal in children, adolescents, and young adults. And although there are well-documented cases where mass shooters were on—or had recently been on—antidepressants (Harris/Luvox; Weise/Prozac; Kazmierczak/Prozac; Alexis/trazodone; Dann/clomipramine; Kinkel/Prozac), claims that "almost every" shooter was medicated are not supported by uniformly public, verifiable records; in many high-profile incidents, medication was absent, unknown, or undisclosed. That is precisely why a rigorous yin-yang approach is needed: treat antidepressant benefits and risks with the same scrutiny we demand elsewhere; stop outsourcing conclusions to marketing narratives; and insist that media, regulators, and manufacturers disclose medication status and timing with the same urgency they devote to every other variable when tragedy strikes.

Now let's step back and look at mental health more broadly, and at the way it has been framed, marketed, and manipulated. Depression, anxiety, ADHD, and a host of other psychiatric diagnoses are treated as if they are definitive conditions with measurable markers, yet there is no blood test, no brain scan, no laboratory measurement that confirms their existence. These diagnoses are subjective, built on questionnaires and clinical impressions, which leaves them wide open to influence. The entire theory that depression is the result of a serotonin imbalance has been disproven, yet it was one of the most successful narratives ever sold by Big Pharma. It created the justification for an entire generation of selective serotonin reuptake inhibitors (SSRIs), promoted as fixing a chemical deficiency that was never there in the first place.

The tools used to diagnose and expand these markets were not neutral. The screening protocols for ADHD and depression were in many cases designed by consultants with ties to pharmaceutical companies. The purpose was clear: make the process simple enough that a primary care physician could prescribe a psychiatric drug based on a few check-box questions. As a result, millions of Americans found themselves medicated, not because of clear biological necessity, but because the system was engineered to prescribe. Yet the evidence of effectiveness is weak. Meta-analyses show that antidepressants score, on average, only about two points higher than placebo on the 52-point Hamilton depression scale. That difference may be statistically detectable, but it falls below the threshold commonly cited for clinical significance. By contrast, large reviews have found exercise to rival or outperform medication for mild to moderate depression, and emerging therapies such as red-light treatment show promise as well.

The consequences of this overreliance on psychiatric drugs are far from benign. SSRIs blunt emotions, numb the highs and lows that are part of human experience, and damage relationships. They have long carried warnings about suicidal thoughts, particularly in children and adolescents during the first weeks of treatment. The black box warnings exist because the risks are real. Yet this connection is often ignored when tragedy strikes. As I laid out a few minutes ago, there is a consistent thread in some of the most notorious acts of violence in modern America. Eric Harris, one of the Columbine shooters, had therapeutic levels of Luvox in his system. Kip Kinkel, who killed his parents and attacked Thurston High School, had been prescribed Prozac. Jeff Weise at Red Lake was taking high doses of Prozac, reportedly 60 milligrams a day. Steven Kazmierczak, who opened fire at Northern Illinois University, had recently been prescribed Prozac and discontinued use in the weeks leading up to the shooting. Navy Yard shooter Aaron Alexis had been prescribed the antidepressant trazodone shortly before his attack. Go back further and you find Laurie Dann, who carried out the 1988 Winnetka school shooting while on clomipramine and lithium. Elizabeth Bush's parochial school shooting and Jason Hoffman's attack at Granite Hills were also reported at the time alongside antidepressant treatment.

The list continues. What is remarkable is not that every incident can be traced to medication—because that would be inaccurate—but that so many of these cases involve individuals who were either actively taking or had just discontinued antidepressants. The pattern is there, but the media refuses to connect it. If it were any other substance—cannabis, testosterone, or even caffeine—the coverage would be relentless. Yet because antidepressants are backed by some of the most powerful companies in the world, with direct financial ties to media outlets through advertising, the subject is avoided. Pharmaceutical sponsorship ensures that the news will not investigate too deeply.

This is where the yin and yang of true scientific inquiry is missing. We are told to “trust the science” and to “trust the government,” yet the evidence is selectively presented, the risks are minimized, and the consequences are often buried. To progress, we cannot silence uncomfortable data. We must look at the full picture, not just the parts that serve corporate interests. Without that balance, the narrative is not science—it is marketing disguised as medicine.

Now let’s transition into another area that is every bit as significant as the misuse of psychiatric drugs, and that is the role of nutrition—and more specifically sugar—in shaping both physical and psychological health. If you want to understand why depression rates surged in the 1980s, you cannot stop at the so-called chemical imbalance theory. You must look at the shift in diet. That decade marked the explosion of ultra-processed foods, laden with sugar, additives, and refined carbohydrates. As Americans filled their pantries and refrigerators with convenience foods and sugary drinks, depression, obesity, and chronic illness rose in lockstep. The connection is not subtle. It is direct, and it is devastating.

Inflammation and insulin resistance, not serotonin deficiency, are far more reliable predictors of depression. A single can of soda contains eight to twelve teaspoons of sugar, more than enough to overwhelm the body’s natural balance. When repeated daily—and often several times a day—this intake triggers fatigue, mood swings, cognitive decline, and systemic inflammation. The psychological consequences of sugar are real: irritability, depression, and loss of focus. But they cannot be separated from the physical outcomes. Obesity, especially childhood obesity, has become an epidemic. The numbers are staggering, and yet policymakers often mock reform efforts. They laugh off attempts to remove soft drinks from school lunches, and they belittle nutrition education as though it were optional.

This is about more than weight or blood sugar. It is about the unraveling of community itself. Helicopter parenting, constant relocation, and a highly mobile society have weakened the fabric of neighborhoods. What were once multi-generational communities where children were known, watched, and guided by more than just their parents have become fragmented enclaves. Instead of continuity, we have disconnection. When cultures are thrown together without shared values or responsibilities, cohesion collapses, and crime follows. Science and technology can be extraordinary tools, but they can also be perverted into weapons of control. The manipulation of the food supply—saturating it with sugar for profit—has not strengthened society; it has hollowed it out.

Corporations thrive when people are sick, dependent, and distracted. Basic principles of health—sound nutrition, clean water, regular exercise, and personal responsibility—do not create profits for them. They create independence. And independence threatens the business model. That is why those who spew hatred, justify violence, or ridicule responsibility are, knowingly or not, carrying water for corporate interests. They are unwitting fools in a larger system that profits from decline. This is not an exaggeration—it is a profound truth that must be understood: the American public has been manipulated for corporate profit.

The Bible reminds us of this timeless lesson. In the Gospel of Matthew, chapter 7, verse 3, Jesus said: “Why do you look at the speck of sawdust in your brother’s eye and pay no attention to the plank in your own eye?” The wisdom is as relevant now as it was then. We condemn individuals for their shortcomings while ignoring the massive corruption in the systems that shape our lives. Until we remove the plank—corporate manipulation, corrupted institutions, and perverted science—we will never see clearly enough to fix what is wrong.

And now let’s begin to wrap this up, and I want to make it perfectly clear that this is not a condemnation of true science. True science is the never-ending process of examination, of course-correction, and of humility—the willingness to admit when we are moving in the wrong direction. You do not learn by suppressing data, censoring ideas, or silencing uncomfortable thoughts. And you certainly do not advance by assassinating people—whether that is the literal assassination of Charlie Kirk, the character assassination of Robert Kennedy Jr., or the reputational destruction of any voice that dares to question orthodoxy. You know you are striking a nerve when assassination—in any form—becomes the default response.

Doctors themselves are, by and large, extraordinary people. They dedicate their lives to helping others, often under immense pressure. But the system in which they are trained deserves scrutiny. Medical education, as it stands, is not built around nutrition, fitness, hydration, or preventative health. It is built around insurance-driven “checkbox medicine,” where a diagnosis fits a billing code, and the treatment aligns with a formulary list, not necessarily with the patient’s long-term wellbeing. Preventative care is too often dismissed outright, even by insiders at the FDA, because it does not generate revenue the way pharmaceuticals do. Screening thresholds for conditions such as cholesterol and osteoporosis have been pushed downward by panels dominated by industry consultants, conveniently expanding the number of patients who qualify for a prescription. The effect is a culture of medicine in which the reflex is always toward the pill, the procedure, or the referral—not the fundamentals of health.

And yet, alternatives exist. Exercise, fitness, proper nutrition, hydration, stress management, and emerging therapies such as peptides or regenerative medicine consistently demonstrate promise. But they are not easily monetized at scale, so they remain marginalized. Instead of curiosity and encouragement, they are often met with hostility from entrenched interests. Insurance companies will not reimburse for them, regulators erect barriers to their development, and corporate media derides them as fringe. What is lost is not just opportunity, but lives. Too often, only the extremely wealthy or the politically connected gain access to cutting-edge procedures and innovative therapies, while the masses are funneled into one-size-fits-all protocols that prioritize cost-containment and conformity.

At the philosophical core, the problem is not capitalism itself. It is the monopolization and co-opting of capitalism by entrenched interests. True free enterprise, when disciplined by ethics and competition, produces innovation and prosperity. What we have instead is a system where laws, regulations, and media are captured by a handful of corporations, ensuring that maximum profit always outweighs human health. That is not a free market—it is a captured market.

The root issue is simple and profound: the right to medical choice. Individuals should have the freedom to control their healthcare journey, especially when terminally ill or facing chronic disease. That means access to alternatives, access to information, and freedom from manipulation. Progress does not come from blind trust in agencies or slogans about “the science.” It comes from the courage to question, the humility to acknowledge mistakes, and the determination to place human wellbeing above corporate profits. Anything less is not science—it is salesmanship. And as history has shown us, salesmanship dressed up as science is a danger to both body and soul.

Paul Truesdell