We do not keep those statistics ourselves, for a simple reason: the only statistically valid way to make such a comparison is to measure from one official test to another official test, with our course directly in between. Not enough people follow this formula to yield a statistically significant sample, and calculating from the few who do would be a waste of time - a non-statistically-significant number does not have any valid meaning.
Most companies that do publish such numbers measure from that company's own test to an official test. Again, this number does not have any real meaning; they are not comparing apples to apples. In the past, some companies have even accused others of giving very difficult first tests to depress initial scores and thereby drive up the "score improvement." We don't want people to have to wonder whether we're playing those kinds of number games. (In any event, even if a company tries to mimic the real test as closely as it can, it still can't get close enough for a statistically valid comparison - plus, students know it's not the real thing, so they are not affected by nerves or other issues that can affect their scores.)
The statistically valid and statistically significant number we do publish is simply how our students do when they take the real test after our course. Our students' median score is about a 690, or about the 91st percentile; median (as I'm sure you know from your studies!) means that half our students score higher and half score lower on the official test.
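To make the "half higher, half lower" point concrete, here is a minimal sketch with made-up scores (not real student data) showing how a median is computed and why it splits the group in half:

```python
# Illustration only - these scores are hypothetical, not real student data.
scores = [550, 640, 680, 690, 700, 740, 770]

def median(values):
    """Return the middle value of the sorted list (or the mean of the two
    middle values when the count is even)."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

print(median(scores))  # -> 690: three students scored higher, three lower
```

Unlike the mean, the median is not dragged up or down by a few extreme scores, which is one reason it is the more honest summary here.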
You should also be aware that every company gathers its scores via self-reporting; that is, the students report their scores themselves. Higher-scoring students are more likely to report their scores than lower-scoring students, so any published scores or score improvements you see (including ours) are likely a bit inflated above the true number due to this reporting bias. (And if any of our students are reading this and you haven't reported your scores yet, please set aside whether you liked your scores and email your teacher or the office at studentservices@manhattangmat.com - we really want this data from absolutely everyone!)
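A quick toy demonstration (made-up numbers, and an extreme assumption for clarity) of how that reporting bias inflates a published median: suppose no one scoring under 650 reports at all.

```python
import statistics

# Hypothetical scores for a full class - not real student data.
all_scores = [520, 560, 600, 640, 660, 690, 700, 720, 750]

# Extreme-case model of reporting bias: only scores of 650+ get reported.
reported = [s for s in all_scores if s >= 650]

print(statistics.median(all_scores))  # 660 - the true median
print(statistics.median(reported))    # 700 - the reported (inflated) median
```

Real reporting bias is milder than this all-or-nothing cutoff, but the direction of the distortion is the same: the published number drifts above the true one.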
Moral of the story: beware of anyone who tries to sell you statistically suspect data and/or doesn't tell you about the inherent weaknesses (i.e., reporting bias) and their consequences (i.e., inflated scores) in the collection or manipulation of that data.