15 Common Mistakes When Studying Public Health (And How to Fix Them) | LearnByTeaching.ai
Public health requires thinking at the population level rather than the individual level, and this paradigm shift is harder than it sounds. Students trained in clinical thinking must learn to design interventions for communities, interpret epidemiological data, and navigate the politics of health policy — all while mastering the biostatistics that underpins evidence-based practice.
Thinking at the individual level instead of the population level
Students with clinical backgrounds instinctively think about helping one patient at a time. Public health operates at the population level — designing interventions that shift outcomes for thousands or millions of people, even if the effect on any individual is small.
A student proposes addressing the obesity epidemic by teaching individuals to make better food choices, ignoring that population-level interventions (sugar taxes, food labeling requirements, zoning laws affecting food deserts) have far greater impact than individual counseling.
How to fix it
For every health problem, force yourself to think upstream: what policies, environments, or systems could change the default behavior for an entire population? The goal is not to change individuals one at a time but to change the conditions that make unhealthy outcomes more likely for everyone.
Confusing incidence and prevalence
Incidence (new cases per population per time period) and prevalence (total existing cases at a point in time) measure different things and answer different questions. Confusing them leads to incorrect interpretations of disease burden and flawed study designs.
A student claims that HIV prevalence in a country decreased because new treatments were introduced. In reality, effective treatment increases prevalence (people live longer with the disease) even as incidence may decrease. The student confused fewer new cases with fewer total cases.
How to fix it
Always specify which measure you mean and why it is appropriate. Incidence measures risk (useful for studying causes). Prevalence measures burden (useful for healthcare planning). Remember: prevalence ≈ incidence × duration. Effective treatment increases duration, so prevalence can rise even as incidence falls.
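The relationship can be sanity-checked with made-up numbers; the incidence and duration values below are purely illustrative:

```python
# Illustrative (hypothetical) numbers: prevalence ≈ incidence × average duration.
incidence = 0.002          # 2 new cases per 1,000 person-years
duration_before = 5        # average years lived with the disease, pre-treatment
duration_after = 25        # longer survival once effective treatment exists

prev_before = incidence * duration_before   # 0.01 -> 1% of the population
prev_after = incidence * duration_after     # 0.05 -> 5% of the population

print(f"Prevalence before treatment: {prev_before:.1%}")
print(f"Prevalence after treatment:  {prev_after:.1%}")
# Even if incidence also falls somewhat, a large enough increase in
# duration can still raise prevalence.
```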
Underestimating biostatistics
Biostatistics is the most-failed course in many MPH programs because students expect a qualitative, humanities-oriented field and are unprepared for the mathematical rigor required. Avoiding or superficially engaging with biostatistics cripples every other area of public health.
A student cannot interpret a logistic regression output in an epidemiology paper because they treated biostatistics as a course to survive rather than a skill to develop, and now they cannot critically evaluate the evidence base for any public health intervention.
How to fix it
Invest heavily in biostatistics from the beginning. Work through problems by hand before using software to build intuition. Practice interpreting real study results: what does an odds ratio of 2.3 with a 95% CI of 1.1-4.8 actually mean for public health decision-making? Statistical literacy is the foundation of evidence-based public health.
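One way to build that intuition is to compute an odds ratio and its confidence interval by hand before trusting software output. A minimal sketch, using hypothetical case-control counts and the Woolf (log-odds) method for the 95% CI:

```python
import math

# Hypothetical case-control counts (rows: exposed/unexposed; cols: cases/controls).
a, b = 30, 70    # exposed:   cases, controls
c, d = 15, 85    # unexposed: cases, controls

odds_ratio = (a * d) / (b * c)   # cross-product ratio from the 2x2 table

# 95% CI on the log-odds scale (Woolf method), then back-transformed.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
# A CI that excludes 1.0 indicates statistical significance at the 5% level,
# but decision-making should also weigh the width of the interval: the
# plausible effect here spans roughly a 20% to nearly fivefold increase.
```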
Not understanding the social determinants of health framework
Students can define social determinants of health (income, education, housing, racism) but cannot apply the framework to actual program design or policy analysis. The concept is easy to recite but difficult to operationalize.
A student writes a paper acknowledging that poverty causes health disparities but then proposes only clinical interventions (more doctors, more clinics) without addressing the upstream social conditions — the determinants they just acknowledged.
How to fix it
For every health problem, map the social determinants that contribute to it and design interventions that target those determinants. If asthma rates are high in a neighborhood, consider not just medication access but also housing quality, air pollution, pest control, and environmental regulations. The framework must change what you propose, not just what you acknowledge.
Confusing relative risk, odds ratio, and attributable risk
These three measures of association answer different questions and are appropriate in different study designs. Students who conflate them misinterpret study results and draw incorrect conclusions.
A student reads that smokers have a relative risk of lung cancer of 15 compared to non-smokers and interprets this as meaning 15% of smokers will get lung cancer. The RR of 15 means smokers' risk is 15 times that of non-smokers, not that their absolute risk is 15%.
How to fix it
Learn when each measure is appropriate: relative risk in cohort studies, odds ratio in case-control studies, and attributable risk for public health impact. Practice converting between them and interpreting what each means in plain language. Always distinguish relative measures (how many times more likely) from absolute measures (what is the actual probability).
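These distinctions can be made concrete with a single hypothetical cohort table (the counts below are invented for illustration):

```python
# Hypothetical cohort data:
#              disease   no disease
# exposed        40         160      -> n = 200
# unexposed      10         190      -> n = 200
a, b = 40, 160
c, d = 10, 190

risk_exposed = a / (a + b)          # 0.20
risk_unexposed = c / (c + d)        # 0.05

relative_risk = risk_exposed / risk_unexposed        # 4.0
odds_ratio = (a * d) / (b * c)                       # 4.75
attributable_risk = risk_exposed - risk_unexposed    # 0.15

print(f"RR = {relative_risk:.2f}  (relative: how many times the risk)")
print(f"OR = {odds_ratio:.2f}  (approximates RR only when disease is rare)")
print(f"AR = {attributable_risk:.2f}  (absolute: excess risk per exposed person)")
```

Note how the OR (4.75) overstates the RR (4.0) because the disease is common in this cohort; with a rare disease the two would nearly coincide.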
Assuming association means the intervention worked
Students evaluate public health interventions by looking at whether health outcomes improved after the intervention was implemented, without considering whether the improvement would have happened anyway or was caused by other factors.
A student concludes that a smoking cessation program caused the decline in smoking rates in a city, ignoring that smoking rates were declining nationally due to tax increases, advertising bans, and cultural shifts — the program may have had no additional effect.
How to fix it
Evaluate intervention effectiveness using the same causal inference standards as any scientific study. Ask: was there a comparison group? Could secular trends explain the change? Were there confounding factors? Learn the hierarchy of evidence for intervention evaluation: RCTs, quasi-experiments, before-and-after with comparison, simple pre-post.
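The comparison-group logic can be sketched as a simple difference-in-differences calculation; the smoking rates below are hypothetical:

```python
# Hypothetical smoking rates (%) before/after a city cessation program,
# with a comparison city that did not run the program.
program_city = {"before": 24.0, "after": 19.0}      # fell by 5.0 points
comparison_city = {"before": 23.0, "after": 20.0}   # fell by 3.0 points (secular trend)

change_program = program_city["after"] - program_city["before"]
change_comparison = comparison_city["after"] - comparison_city["before"]

# Difference-in-differences: change beyond the secular trend.
did = change_program - change_comparison

print(f"Naive pre-post effect: {change_program:+.1f} points")
print(f"Difference-in-differences estimate: {did:+.1f} points")
# A simple pre-post design would credit the program with all 5 points;
# subtracting the comparison city's trend attributes only 2 points to the
# program (and even that assumes parallel trends, which must be justified).
```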
Writing vague program proposals without measurable objectives
Students propose public health programs with goals like 'improve community health' or 'reduce disparities' without specifying measurable, time-bound objectives that allow evaluation.
A student proposes a nutrition education program that 'aims to improve eating habits in the community' rather than specifying a measurable objective such as: a 10% increase in daily fruit and vegetable servings among participants within 12 months, as measured by pre/post dietary recall surveys.
How to fix it
Use SMART objectives: Specific, Measurable, Achievable, Relevant, Time-bound. Every program proposal should state exactly what will change, by how much, in whom, and by when. Without measurable objectives, you cannot evaluate whether the program worked.
Ignoring health economics and cost-effectiveness
Resources for public health are limited. Students who propose interventions without considering cost-effectiveness ignore the reality that spending on one intervention means not spending on another that might save more lives per dollar.
A student advocates for an expensive universal screening program without comparing its cost per QALY (quality-adjusted life year) to alternative interventions that might achieve greater health impact for the same budget.
How to fix it
Learn basic health economics concepts: cost-effectiveness analysis, cost-benefit analysis, QALYs, and DALYs. When proposing or evaluating interventions, always consider the opportunity cost: what else could this money fund? Public health decisions are inherently about allocating scarce resources.
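A back-of-the-envelope comparison illustrates the opportunity-cost reasoning; the costs and QALY figures below are invented, not drawn from any real evaluation:

```python
# Hypothetical cost-effectiveness comparison of two interventions.
interventions = {
    "universal screening": {"cost": 5_000_000, "qalys_gained": 400},
    "targeted outreach":   {"cost": 1_200_000, "qalys_gained": 300},
}

for name, v in interventions.items():
    cost_per_qaly = v["cost"] / v["qalys_gained"]
    print(f"{name}: ${cost_per_qaly:,.0f} per QALY gained")

# Screening costs $12,500 per QALY; outreach costs $4,000 per QALY.
# With a fixed $5M budget, funding outreach several times over (if it
# scales) could gain far more QALYs than one screening program — that
# forgone gain is the opportunity cost of choosing screening.
```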
Not learning to critically appraise epidemiological studies
Students accept published study conclusions without evaluating the methodology, biases, and limitations. Critical appraisal is the core skill that separates evidence-based public health from opinion-based practice.
A student cites a cross-sectional study as evidence that a risk factor causes a disease, without recognizing that cross-sectional studies cannot establish temporal sequence (did the exposure come before the disease?) and are therefore limited in establishing causation.
How to fix it
Learn to appraise each study type: for cohort studies, assess selection bias and loss to follow-up; for case-control studies, assess recall bias and control selection; for RCTs, assess randomization and blinding. Use the STROBE and CONSORT checklists as guides for critical appraisal.
Treating public health as purely technical
Students focus on the science of disease prevention without understanding the political, economic, and cultural contexts that determine whether evidence-based interventions are actually implemented.
A student proposes mandatory vaccination based solely on the epidemiological evidence for herd immunity, without considering the political feasibility, community trust, cultural beliefs, and communication strategies needed for successful implementation.
How to fix it
For every intervention, consider the implementation context: who supports it, who opposes it, what are the cultural sensitivities, and what political conditions are needed? Study cases where evidence-based interventions failed due to poor implementation or political resistance.
Not engaging with local public health practice
Students study public health theory in the classroom without connecting to the practical realities of how health departments, NGOs, and community organizations actually operate.
A student graduates with an MPH but has never visited a local health department, attended a community health meeting, or spoken with a practicing epidemiologist, and is unprepared for the messy reality of public health practice.
How to fix it
Volunteer at your local health department, shadow a community health worker, or participate in a health needs assessment. The gap between textbook public health and practiced public health is significant, and bridging it during your education makes you a much stronger professional.
Overlooking ethical issues in public health interventions
Public health interventions sometimes involve tradeoffs between individual liberty and population benefit (quarantine, mandatory vaccination, food regulation). Students who ignore these ethical dimensions propose interventions that face predictable opposition.
A student proposes a sugar tax without acknowledging the regressive nature of consumption taxes (they burden low-income populations disproportionately) or addressing how the revenue should be redirected to mitigate this inequity.
How to fix it
Study the ethical frameworks in public health: autonomy vs. beneficence, individual rights vs. community welfare, and distributive justice. For every intervention, explicitly identify the ethical tradeoffs and address them in your proposal. The Nuffield Council's intervention ladder is a useful framework.
Not learning epidemiological calculations by hand
Students rely on software to compute rates, ratios, and measures of association without understanding the underlying calculations. When software produces unexpected results, they cannot identify errors.
A student uses statistical software to compute an odds ratio but enters the data in the wrong format, producing an OR of 0.5 when the true value is 2.0. Because they cannot compute an OR by hand from a 2x2 table, they do not catch the error.
How to fix it
Practice computing incidence rates, prevalence, relative risk, odds ratios, NNT, sensitivity, specificity, and predictive values by hand from 2x2 tables. Build intuition for what these numbers mean before relying on software to compute them at scale.
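A quick sketch of the hand calculation, using hypothetical counts, also shows how transposed data entry produces exactly the kind of inverted OR described above:

```python
# A 2x2 table for the odds ratio: rows = exposure, columns = outcome.
#            cases   controls
# exposed      20       40
# unexposed    10       40
a, b, c, d = 20, 40, 10, 40

or_correct = (a * d) / (b * c)     # cross-product ratio: (20*40)/(40*10) = 2.0

# Entering the table transposed swaps the cross-products and silently
# inverts the odds ratio:
or_transposed = (b * c) / (a * d)  # 0.5

print(f"Correct OR:    {or_correct}")
print(f"Transposed OR: {or_transposed}")
# Knowing OR = ad/bc lets you recompute software output by hand and catch
# an inversion like this immediately.
```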
Studying each public health discipline in isolation
MPH programs teach epidemiology, biostatistics, health behavior, health policy, and environmental health as separate courses. Students who don't integrate these disciplines miss the holistic perspective that defines public health.
A student excels in epidemiology but cannot write a comprehensive grant proposal because they never integrated their epidemiology skills with health behavior theory, policy analysis, and program evaluation from other courses.
How to fix it
After each course, explicitly connect what you learned to your other courses. How does this biostatistics technique apply to the epidemiology study you are reading? How does health behavior theory inform the policy analysis you are writing? Build a portfolio of integrated projects that draw on multiple disciplines.
Procrastinating on the culminating experience or capstone
MPH programs typically require a culminating project (thesis, capstone, or practicum) that integrates all coursework. Students who start late produce rushed work that does not demonstrate the competencies they developed.
A student begins their capstone project three months before graduation, discovers that their data source is inadequate, and produces a superficial analysis that does not demonstrate their epidemiology or biostatistics skills.
How to fix it
Start thinking about your capstone topic in your first year. Identify potential data sources and mentors early. Build your capstone into your coursework: use course assignments to develop your literature review, methods, and analysis plan. The capstone should be the culmination of iterative work, not a last-minute effort.
Quick Self-Check
- Can you explain the difference between incidence and prevalence and why effective treatment can increase prevalence?
- Can you interpret a relative risk of 3.0 and an odds ratio of 2.5 in plain language?
- Can you design a public health intervention with SMART objectives rather than vague goals?
- Can you identify at least three confounding variables in a given observational study?
- Can you explain why a before-and-after study without a comparison group is weak evidence for intervention effectiveness?
Pro Tips
- ✓ For every health problem, force yourself to think upstream: what policies, environments, or systems could change the default behavior for an entire population?
- ✓ Practice computing epidemiological measures (RR, OR, NNT) by hand from 2x2 tables until the calculations are automatic — this builds intuition that software alone cannot provide.
- ✓ Invest heavily in biostatistics from the beginning; it is the most-failed MPH course and the most essential skill for evidence-based public health practice.
- ✓ When proposing interventions, always address cost-effectiveness, ethical tradeoffs, and political feasibility — the technical evidence alone is never sufficient for implementation.
- ✓ Analyze real public health case studies (COVID-19 response, Flint water crisis, opioid epidemic) using your epidemiological frameworks; these cases make abstract concepts concrete and memorable.