I'd be much more interested in the whole process if you designed a metric to account for the fact that what a 'normal' average and a 'normal' strike rate are have changed over time. Hell, I'd even be willing to do it for you. Striking at 85 in 2015 is not the same as striking at 85 in 1997.
The AI adjusts for all that. It calculates how aggressive each batsman should be, relative to his average, from the resources of the remaining batsmen (i.e. how many runs they are likely to contribute off how many balls) and the average economy rates and strike rates of the opposition bowlers. So an older batsman isn't necessarily disadvantaged in a modern setting, because the higher economy rates of modern bowlers mean he can increase his runs scored with little or no increase in his level of aggression (and thus in his chances of getting out). Old-school batsmen naturally adjust to modern settings.
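To make the idea concrete, here is a minimal sketch in Python of the kind of calculation described above. Everything in it (the function names, the "resources" proxy, the scaling constant) is my own illustrative assumption, not the simulator's actual code.

```python
def expected_remaining_runs(remaining_batsmen):
    # Crude 'resources' proxy: the runs the batsmen still to come are
    # likely to contribute, taken here as the sum of their batting averages.
    # Each batsman is an (average, strike_rate) pair.
    return sum(average for average, strike_rate in remaining_batsmen)

def target_strike_rate(own_sr, bowler_economy, resources_left, par_resources=150.0):
    # 'Free' runs: the rate the opposition bowlers concede anyway,
    # converted from runs per over to runs per 100 balls.
    free_sr = bowler_economy * 100.0 / 6.0
    # A batsman can score at the concession rate without extra risk, so
    # take the higher of his own strike rate and that rate...
    base = max(own_sr, free_sr)
    # ...then scale by the batting resources left: with weak batting to
    # come, conserve wickets; with strong batting to come, attack
    # (capped at 1.5x so aggression doesn't run away).
    return base * min(1.5, resources_left / par_resources)
```

For example, with bowlers conceding 4.2 an over (a concession rate of 70 per 100 balls) and a full hand of batting to come, a batsman striking at 57.9 is pushed up to 70 at no extra aggression, loosely the Andrew Jones behaviour described below; with half the resources left, the same batsman is held back well under his natural rate.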
I agree that striking at 85 in 2015 is not the same as striking at 85 in 1997. Yet some batsmen still did it: Lance Cairns struck at 104, Kapil Dev at 95, and Viv Richards at 91. The real question is why. The simulator AI suggests to me that, back in the day, top-order batsmen scored slower largely because the batsmen coming after them were much poorer on average, so there was a premium on conserving wickets over risking runs. When batsmen didn't face that pressure (like Cairns and Dev lower in the order, or Richards with quality batting to come), i.e. when the risk/reward equation was as it is for modern batsmen in similar scenarios, they tended to score the way modern batsmen do. Opportunities to do this, however, were rarer, because batting sides were in strong positions less often.
When I put a batsman like Andrew Jones in the simulator (BatSR 57.9, BatAve 35.69), he simply ups the run rate to about 70 at minimal risk to his average. Sometimes both his average and strike rate go up. I put this down to the fact that in 1992 the batsmen coming after Jones were, Crowe aside, very poor, so Jones had to play a suboptimally slow game to make sure NZ didn't get bowled out. With quality batsmen coming after him he wouldn't have had to do this, and his average could well have gone up.
It is because of this that "a higher strike rate necessarily means a lower average" isn't always true, if you think about it. You have to put yourself in the mind of the batsman.
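One way to see why: a batsman's average is just his runs per ball divided by his chance of dismissal per ball. If conditions let him score faster without taking on any extra per-ball risk, his average rises along with his strike rate. A toy calculation (the one-dismissal-in-60-balls figure is purely illustrative):

```python
def expected_average(strike_rate, p_out_per_ball):
    # average = runs per ball / dismissals per ball
    return (strike_rate / 100.0) / p_out_per_ball

# Same per-ball risk (dismissed once every 60 balls), two scoring rates:
slow = expected_average(57.9, 1 / 60)   # ~34.7
fast = expected_average(70.0, 1 / 60)   # 42.0
```

The strike rate went up by a fifth and, with the per-ball risk held fixed, so did the average. The trade-off only bites when scoring faster requires taking on more risk per ball.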
Corey Anderson is currently striking at about 120. In another time and place, without Guptill, Williamson and Taylor ahead of him, Anderson would have to play more recovery jobs and his strike rate would go way down. So a player's stats are also dependent to a large degree on the players above them. This becomes evident if the simulator AI is good.
This can also be seen in the drop in bowling strike rates over the decades, which have gone down faster than batting strike rates have gone up. Hadlee struck at 39.1, which is insipid compared to Shane Bond's 29.4. But Hadlee had an economy rate of 3.3, because opposition batsmen tried primarily to just survive against him. It's really fascinating when you look into it.
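The two numbers are linked by a standard identity: since economy is runs per over and strike rate is balls per wicket, a bowler's average (runs per wicket) is economy × strike rate ÷ 6. A quick check on Hadlee's figures above:

```python
def bowling_average(economy, strike_rate):
    # runs per wicket = (runs per ball) * (balls per wicket)
    return (economy / 6.0) * strike_rate

# Hadlee: economy 3.3, strike rate 39.1 -> average of about 21.5
```

So an "insipid" strike rate paired with a miserly economy still yields an outstanding average, which is exactly the survival-first dynamic described above.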
What I'd be really interested in knowing is the average economy rate and average strike rate for bowlers over the last 5 or 10 years, in all international matches (at least top 8 nation ones). This would help with the simulator a lot.
Also good to know would be the average number of runs scored off each over in an average match in ODIs. I mean like 5.05 off over 1, 5.07 off over 2, 7.65 off over 45, etc. This would help me to calibrate the AI aggression to be super-accurate.
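If over-by-over data were ever available, pooling those per-over figures would only take a few lines. A sketch, assuming each innings is supplied as a list of runs scored in over 1, over 2, and so on (the input format is my assumption):

```python
from statistics import mean

def average_runs_per_over(innings_list):
    # innings_list: one list per innings, giving runs scored off each over.
    # Returns the mean runs scored in over 1, over 2, ... across all
    # innings that actually reached that over.
    max_overs = max(len(innings) for innings in innings_list)
    return [
        mean(innings[i] for innings in innings_list if len(innings) > i)
        for i in range(max_overs)
    ]
```

For example, `average_runs_per_over([[4, 6, 12], [6, 8]])` gives `[5.0, 7.0, 12.0]`; the third over is averaged only over the one innings that got that far. The same pooling, run over real ball-by-ball data, would give exactly the calibration curve described above.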