My office has taken the position that any comparison of "pre-ACA" rates and "post-ACA" rates is irresponsible; the plans simply aren't the same. I can't replicate MI's results from the sparse description of their methodology, so I won't speculate on whether the "real" increase is 100+% or not.
Your bringing up "crap" plans touches on another issue I have with the study. I've mentioned that the methodology appears unreplicable, and I've mentioned that using only exchange plans is hugely misleading. What I haven't mentioned is their determination of the "pre-ACA" rates.
Setting aside their assumptions on upcharging, their tactic of "compare the five least expensive plans to the five least expensive plans" misses one key component: enrollment. Because the ACA mandates the ten essential health benefits, a reasonable assumption is that plan enrollment will cluster. What I mean is that a hypothetical Plan A and Plan B, offering roughly the same benefits at roughly the same prices, will have comparable enrollments. This is very clearly not the case in the "pre-ACA" world. Two plans, call them Plan Y and Plan Z, could have very different benefits and very different premiums, and their enrollments will vary widely as well. If a bare-bones Plan Z costs 1/5 what Plan Y costs but has an enrollment figure 1/100 that of Plan Y (since the risk-averse avoid it), then Plan Z is not representative of anything, should be considered an outlier, and should not be included in the study. The Manhattan Institute didn't control for that.
Put another way: before the ACA I have 6 plans in my market. Five of them cost $20/month and each has an enrollment of 50 people. The sixth costs $300/month and has an enrollment of 99,750. After the ACA I have 6 plans, they all cost about $350/month, and they all have an enrollment of ~16,666 people. The Manhattan Institute will tell you that the "average" increase in my market was 1,650%, but the enrollment-weighted average increase is closer to 17%. The uncompensated use of statistical outliers invalidates the statistical analysis.
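To make the arithmetic concrete, here's a minimal sketch of the two comparisons using the hypothetical market numbers above (the plan premiums and enrollments are the made-up figures from my example, not real data):

```python
# Compare a naive "average of the five cheapest plans" against an
# enrollment-weighted average premium, using the hypothetical market above.

def weighted_avg_premium(plans):
    """Enrollment-weighted average monthly premium.
    plans: list of (premium, enrollment) tuples."""
    total_enrollment = sum(n for _, n in plans)
    return sum(p * n for p, n in plans) / total_enrollment

pre_aca = [(20, 50)] * 5 + [(300, 99_750)]  # five $20 outliers + one popular plan
post_aca = [(350, 16_666)] * 6              # roughly uniform plans and enrollment

# Naive comparison (average the five least expensive plans, ignore enrollment):
naive_pre = sum(sorted(p for p, _ in pre_aca)[:5]) / 5    # $20
naive_post = sum(sorted(p for p, _ in post_aca)[:5]) / 5  # $350
naive_increase = naive_post / naive_pre - 1               # 1,650%

# Enrollment-weighted comparison:
weighted_pre = weighted_avg_premium(pre_aca)              # ~$299.30
weighted_post = weighted_avg_premium(post_aca)            # $350
weighted_increase = weighted_post / weighted_pre - 1      # ~17%

print(f"Naive increase:    {naive_increase:.0%}")
print(f"Weighted increase: {weighted_increase:.0%}")
```

The near-empty $20 plans dominate the naive average but carry only 250 of the market's 100,000 enrollees, which is exactly why weighting by enrollment changes the answer so dramatically.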