Look, I believe you, but I’ll admit I’m having trouble reconciling a few things about it. If it’s a CPU-bound problem, I’d expect it to get worse as the CPU gets faster, and my PC now is much faster than the one I played Fallout 1 on about a decade ago, yet my encounter rates were remarkably similar. Not only were they remarkably similar, they were remarkably similar to what I’ve seen in every other RPG like it, such as Baldur’s Gate and Wasteland 2. Looking at heat maps of encounter rates on a wiki, I definitely had more in the red zones, but it was maybe two encounters per square rather than a dozen, and a dozen sounds miserable; I, too, would conclude that something was wrong if I saw significantly more encounters than I did.

I ran Fallout 1 on Windows back in the day and Fallout 2 via Proton, so we can rule out the OS and compatibility layer as a variable that might have changed the game’s behavior. A streamer I watch played Fallout 1 for the first time via Fallout CE and had extremely similar encounter rates, and not only are we running very different machines, but surely that project unbound the encounter rates from the CPU.

If we’re hitting some kind of cap on encounter rates, why do they all appear to be at about the rate I experienced? And why would we not assume that that cap was the intended design?
> cap on encounter rates, why do they all appear to be at about the rate I experienced?
Well, it’s clearly not a cap if you’re seeing people having more frequent encounters than you are.
> And why would we not assume that that cap was the intended design?
Because they tied the encounter system to CPU frequency, and the highest consumer CPU frequency at the time was something like 500 MHz. Why on earth would you assume that the developers designed the rate not around what hardware was capable of at the time, but around what hardware would be capable of 15 years later?
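To make “tied to CPU frequency” concrete, here’s a minimal, purely hypothetical C sketch of that kind of coupling; it isn’t Fallout’s actual code, and the loop rates and roll odds are made-up numbers. If the encounter roll runs once per pass through an uncapped loop rather than once per unit of game time, the expected number of encounters scales with how fast the machine can spin that loop.

```c
/* Hypothetical illustration only -- not Fallout's actual code.
 * An encounter roll made once per pass through an uncapped loop
 * fires more often per second of travel on a faster machine. */
#include <stdio.h>
#include <stdlib.h>

#define ITERATIONS_PER_MS_SLOW   100L  /* assumed late-90s machine       */
#define ITERATIONS_PER_MS_FAST  5000L  /* assumed modern machine         */
#define TRAVEL_TIME_MS          1000L  /* one second of world-map travel */
#define ENCOUNTER_ODDS         10000   /* 1-in-10000 roll per iteration  */

static int simulate(long iterations_per_ms)
{
    long total = iterations_per_ms * TRAVEL_TIME_MS;
    int encounters = 0;

    for (long i = 0; i < total; i++) {
        /* The roll happens per loop iteration, not per unit of game
         * time, so the expected count grows with CPU speed. */
        if (rand() % ENCOUNTER_ODDS == 0)
            encounters++;
    }
    return encounters;
}

int main(void)
{
    srand(42);
    printf("slow machine: %d encounters\n", simulate(ITERATIONS_PER_MS_SLOW));
    printf("fast machine: %d encounters\n", simulate(ITERATIONS_PER_MS_FAST));
    return 0;
}
```

Keying the roll to elapsed game time instead of loop iterations would make the rate independent of hardware speed.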
You’re suggesting that the developers got into a room together and said, “Let’s design this so that it won’t play the way we intend for it to be played until 15 years pass.”