My main vice these days is Warcraft 3. (Not WoW -- I don't have nearly enough time for that.) It's been frustrating, though, since it can get extremely choppy. Maybe 5 fps choppy. Dropping all the settings to lowest didn't help, which really puzzled me: although a GeForce FX Go5200 isn't the most powerful 3D card around by a long shot, a 2.8 GHz P4 should have been able to render this stuff in software.
I didn't do anything about this at first, but I got good enough at the game that I started losing games because of the choppiness. So a couple of nights ago I went on a killing spree with Task Manager to see if a background task was causing the problem. I didn't see any likely candidates, and sure enough, it didn't help.
I did notice my Inspiron 5160 runs rather hot, though, and I wondered if it could be underclocking the CPU and/or GPU to cool off. This program verified the theory: my CPU clock oscillated every few seconds between 2.8 and 1.8 GHz.
Now, even the lowest setting of 1.8 GHz is plenty for wc3. Apparently Blizzard did something dumb (a friend who knows more about Windows programming than I do suggested it might actually be a Win32 API problem): the game sets its timer based on the maximum clock speed and doesn't adjust when the clock drops down. Turning SpeedStep off in my BIOS, which pins the CPU clock at its lowest setting permanently, fixed my Warcraft problem. I'd be pretty ticked if I still had to run, say, Eclipse, but 1.8 GHz is also plenty for Emacs. So I'm happy for now.
Comments
If enabled, it will also draw a nice diagram of processor speed vs. load.
I think I saved $100 getting the P4 model instead of a Pentium M. Wasn't worth it.