For benefits leaders looking for a weight-loss program, there are a dizzying number of options out there. And each vendor seems to have its own marketing claims and survey results, touting great engagement and robust outcomes.

But comparing those results can be tough. Some programs claim higher engagement; others claim greater weight loss. And sometimes these numbers are based on small, select groups of participants. Viewed through that narrow filter, almost any offering can look enticing.

Yet if you roll that program out for your entire employee population, you may find that your company’s results vary. Sure, you might have a few super-motivated employees who stick with the program and get great results, but the office-wide numbers may tell a far different story. It’s not your employees who are to blame for missing the advertised mark — it’s that the program didn’t work for everyone at your company. You need a program that scales.

Why Scale Matters

The stakes are high, too. Healthier employees tend to have higher rates of engagement and productivity and lower rates of absenteeism. They can also have significantly lower medical costs, with those savings easily eclipsing the cost of getting a program up and running. In a recent Real Appeal® claims analysis, participants had medical costs up to 16 percent lower than non-participants, with the most engaged participants having medical costs $674 lower, on average. That’s motivation enough to find a wellness program that really delivers, both for you and your employees.

But how can you figure out which programs will scale for your employee population? At Real Appeal, we’ve had some experience with that. The prestigious journal Obesity recently published our largest study to date. We tracked 52,461 people in our online intensive lifestyle weight-loss intervention program and found that nearly one-fourth achieved a 5 percent weight loss. And the 27,000-plus who completed nine or more weeks lost 4.3 percent of their body weight, on average.

Our view is that you don’t have to be a statistician or ace investigator to dive into the numbers. It just takes a bit of scrutiny and a willingness to ask follow-up questions. Here are three areas to focus on when you’re looking at weight-loss programs.

Look for Big Numbers

You want a weight-loss program that will deliver results across your entire employee population, and we’re guessing they’re a fairly diverse bunch. A large-scale study is the best — maybe only — way to ensure that the program will work for the range of people at your company. If the program is handpicking participants or surveying a small number of people, it’s more likely that the results are skewed toward a certain type of person. That means the program may work like gangbusters for super-motivated millennials or at companies with on-site pilates studios and weekly wellness retreats. But it might fall flat for your own employees.

For our large-scale, 12-month study, we cast a giant net to include more than 50,000 participants across different kinds of companies and industries and a range of individual health profiles. That makes it easier to feel confident that the results could be replicated at your company, because odds are a similar company with a similar employee population is already folded into these findings. As you size up a vendor’s marketing claim, make sure to ask: What sample size is this based on? If they’re not touting that number front and center, it might mean the data set is surprisingly small.

Look for a Broad Population

In the world of medical research, you’ll often hear the phrase “intent to treat.” Yes, it sounds clinical, but it simply means that the study includes everyone who showed up for that first session, whether they stuck around for the rest of the program or not. It might seem obvious that weight-loss programs would include in their data everyone who joined, but that’s often not the case: Many studies use an engagement cutoff of, say, four or nine sessions. People who attend past that threshold are included in the data, and those who dropped out before then aren’t tallied in the results.

There are sometimes good reasons to restrict reported outcomes. You may want to know what happens when people get some traction with the intervention, or to focus on those who stuck with it. In fact, the Centers for Disease Control and Prevention uses the same four-plus and nine-plus session metrics in its own weight-loss studies. But you want to know up front what the reporting parameters are. If the supercharged engagement and dramatic outcomes come only from people who attended nine or more sessions, you might not be able to translate a study’s results to your entire population.
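To make the difference concrete, here is a minimal sketch with entirely made-up numbers. It compares an intent-to-treat average against a completer-only average for the same hypothetical cohort. (One common ITT convention, assumed here for simplicity, counts dropouts as zero change; real studies handle missing data in several different ways.)

```python
# Hypothetical illustration only -- all figures are invented.
# Percent body-weight change for ten enrollees; None marks someone
# who dropped out before any follow-up weigh-in.
weight_change = [-6.1, -4.8, None, -2.2, -7.4, None, -0.5, None, -5.0, -3.3]

# Intent-to-treat: every enrollee stays in the denominator;
# dropouts contribute zero change under this convention.
itt_values = [w if w is not None else 0.0 for w in weight_change]
itt_avg = sum(itt_values) / len(itt_values)

# Completer-only: dropouts are excluded entirely, shrinking the
# denominator and flattering the average.
completers = [w for w in weight_change if w is not None]
completer_avg = sum(completers) / len(completers)

print(f"Intent-to-treat average change: {itt_avg:.2f}%")
print(f"Completer-only average change:  {completer_avg:.2f}%")
```

Same cohort, two very different headlines: the completer-only figure looks meaningfully better simply because the people who left were dropped from the math. That is why knowing the reporting parameters matters.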

The Best Approach

In our Obesity journal study, we set out from the start to have an intent-to-treat analysis. That means we included every person who showed up for the first session in our results, even if they dropped out the day after that first meeting. And guess what? Some people did drop out — that’s human nature — and that means they did ding the overall results. But on average, participants still lost 2.8 percent of their body weight, with 23 percent achieving 5 percent weight loss. Of course, those who stuck around longer achieved even better results: For the 27,000-plus people who completed nine or more weeks of the program, 36 percent achieved a 5 percent weight loss. And research shows that losing that amount can be enough to lower cholesterol and blood pressure and reduce the risk of developing type 2 diabetes by 50 percent.

We could have filtered the results to showcase only results achieved by the super-motivated or highly engaged. But taking an intent-to-treat approach means sharing all of the findings, so employers can feel confident about replicating these results at their own companies — even if some employees inevitably drop out.

Look for Peer Review

A press release or a survey published on a company website is fine, but when it comes to rigorous study design and statistical analysis, peer review is the gold standard. Not only does it mean the claims have, by definition, been scrutinized by independent, third-party medical experts; it also signals that the program provider is invested in contributing to the scientific community. In short, it’s science above sales.

At the outset of our 12-month, large-scale study, Real Appeal turned to our paid advisory panel, which includes some of the leading experts in diabetes prevention and obesity medicine: Louis J. Aronne, MD; Rena R. Wing, PhD; Donna H. Ryan, MD; and William D. Johnson, PhD. From the start, we focused on objective science and a large-scale, inclusive data set, and that commitment carried through to the very end.

When the results were published this month, the team at Real Appeal cheered. We’re proud to have proved the scalability and efficacy of our online intensive lifestyle intervention. We’re proud to be changing the lives and health of employees every day, helping them show up to work with more pep in their step and less fear about a diabetes diagnosis or a heart attack in their future. And we’re proud to share the results of our study — fully and transparently — with employers looking to do well by their employees and do well in business.

Steve Olin is senior vice president in charge of health solutions at Rally Health℠ and the former CEO of Real Appeal, which merged with Rally® in November 2017.

Start the conversation - give us a call at 1-844-901-REAL (7325) or email us at