Is there any way to lower the FPS on a Mac? - macos

So I play this game on Safari (Mac), Freeriderhd.com to be exact. I want to know if there is any way to lower the FPS of that specific game while I am playing it in Safari, because it would make the game a lot easier. If there is no way to do this, can I use a macro to press the spacebar an unlimited number of times with x amount of time in between each press? If someone can help with either of these questions, that would be great. Thanks.

I don't really know - sorry - but this seems unlikely. FPS is usually controlled internally by the game software. I have seen very few games that allow the user to change the frame rate, or at least the requested frame rate, and that is usually for the purpose of raising the FPS, not lowering it. Usually the game has an optimum frame rate that it strives for, especially browser-based games (of which I've written some).
Your alternative sounds like a means to "bog down the browser" which might peg your processor; not a thing I would tempt the Fates with, personally.
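If you still want to experiment with the macro idea despite that caveat, the usual way to synthesize key presses on macOS is the Quartz Event Services API. Below is a minimal, untested sketch that posts a space-bar press every 200 ms; the interval is an arbitrary stand-in for your "x amount of time", and on recent macOS versions the program must be granted Accessibility permission before CGEventPost will do anything.

    // Build with: clang++ spacetap.cpp -framework ApplicationServices -o spacetap
    #include <ApplicationServices/ApplicationServices.h>
    #include <unistd.h>

    int main() {
        const CGKeyCode kSpaceKey = 49;        // virtual key code for the space bar
        const useconds_t intervalUs = 200000;  // 200 ms between presses (placeholder value)

        for (;;) {
            CGEventRef down = CGEventCreateKeyboardEvent(NULL, kSpaceKey, true);
            CGEventRef up   = CGEventCreateKeyboardEvent(NULL, kSpaceKey, false);
            CGEventPost(kCGHIDEventTap, down);  // key down
            CGEventPost(kCGHIDEventTap, up);    // key up
            CFRelease(down);
            CFRelease(up);
            usleep(intervalUs);                 // wait before the next press
        }
    }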

Related

Speed of player movement in SDL game

I have been writing a game which I have put on GitHub for others to play around with, but I have a question about the hardware it runs on.
When I run the game with a dedicated GPU, the player moves perfectly, but if I run it on a laptop with integrated graphics, the player seems to move very slowly. However, I added a player speed button, and if I increase the player speed the player moves perfectly again. What I can't understand is why the player moves slowly on the laptop, yet moves as she should if I change her speed. Is there a way to detect that a dedicated GPU is not installed, so I can set the speed slightly higher? Thank you.
Since there's no code posted, I can't help that much. I will instead tell you what I suspect the issue is.
Are you doing proper delta timesteps? If your game loop looks like this:
while (!quit) {
    readInput();
    runSimulation();
    render();
    SDL_Delay(16);   // fixed ~16.7 ms pause every frame, or whatever value you want here
}
then that would be bad, because the game will run differently depending on the power of the hardware. If your game takes 1 ms to render on a good GPU but 10 ms on a bad GPU, it will naturally run slower on the weaker machine, since each pass through the game loop takes longer there.
What you can do is measure how long everything actually took, then delay for whatever time is left so that each frame adds up to your target frame time.
For example, you time how long readInput, runSimulation, and render take, assign that to timeTaken, and if you are targeting 60 fps you delay for 16.666 - timeTaken milliseconds. This way, every frame takes the same amount of time regardless of the computer.
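A minimal sketch of that fixed-frame-time idea, using SDL's own timing calls since this is an SDL game (readInput, runSimulation, and render stand in for whatever your loop actually does):

    const Uint32 targetFrameMs = 1000 / 60;             // ~16 ms per frame for a 60 fps target

    while (!quit) {
        Uint32 frameStart = SDL_GetTicks();

        readInput();
        runSimulation();
        render();

        Uint32 timeTaken = SDL_GetTicks() - frameStart; // how long this frame actually took
        if (timeTaken < targetFrameMs) {
            SDL_Delay(targetFrameMs - timeTaken);       // sleep only for the time that's left
        }
        // If timeTaken >= targetFrameMs, the machine can't hit 60 fps
        // and the loop simply runs as fast as it can.
    }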
This is not the best way either (it already has problems I can see); there are much better ways, so please do your research, but this is a quick answer before your question gets closed. See this for more info, or search for timesteps on the Gamedev Stack Exchange.
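One of those better ways, and most likely the actual fix for the slow-laptop symptom, is a variable (delta) timestep: measure how long the previous frame took and scale all movement by it, so speed is expressed in units per second instead of units per frame. A rough sketch, where player.speedX and player.speedY are hypothetical fields standing in for your own movement code:

    Uint32 lastTicks = SDL_GetTicks();

    while (!quit) {
        Uint32 now = SDL_GetTicks();
        float dt = (now - lastTicks) / 1000.0f;  // seconds elapsed since the previous frame
        lastTicks = now;

        readInput();
        // Movement scaled by dt: the player covers the same distance per second
        // whether the frame took 1 ms or 30 ms to render.
        player.x += player.speedX * dt;
        player.y += player.speedY * dt;
        render();
    }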

How can I detect artificial mouse movements?

We wrote a game where you shoot targets as quickly as you can, and someone wrote a program to automatically shoot the targets (via simulated mouse movement/clicks).
How can I detect this? I thought of monitoring mouse speed, etc., but it seems too easy to get around. If it helps, the game runs on Windows and is written in C++. Is there no way to just tell that the movement isn't coming from real hardware?
See How to detect if mouse click is legit or automated? Essentially it says that it can't be done reliably, for several reasons, one of which is that the programmer will find a way around whatever you try to do.
As suggested there, heuristics seem to be a viable option: monitor variation in mouse speed, miss rates and accuracy, pauses, sudden jumps, etc. The better the heuristics get, the more the programmer has to make his bot behave like a real player, and the less effective the bot becomes.
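As one concrete illustration of that approach, here is a rough sketch of a single heuristic that flags mouse input whose per-frame movement is far more uniform than a human hand produces. The sample count and variance threshold are made-up placeholders; a real system would tune them against recorded player data and combine many such signals.

    #include <cmath>
    #include <vector>

    struct MouseSample { double dx, dy; };   // per-frame mouse deltas

    // Returns true if the movement deltas are far more uniform than human aiming.
    // Thresholds are invented for illustration only.
    bool looksAutomated(const std::vector<MouseSample>& samples) {
        if (samples.size() < 30) return false;   // not enough data to judge

        double mean = 0.0;
        for (const auto& s : samples) mean += std::hypot(s.dx, s.dy);
        mean /= samples.size();

        double variance = 0.0;
        for (const auto& s : samples) {
            double d = std::hypot(s.dx, s.dy) - mean;
            variance += d * d;
        }
        variance /= samples.size();

        // Human aiming is jittery; near-zero variance over many samples is suspicious.
        return mean > 0.0 && variance < 0.01 * mean * mean;
    }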
PS: Even for well-known shooter games like Counter-Strike there are hacks that shoot everything perfectly for you, so they haven't figured out how to stop it either, or don't want to.

Windows 7 GDI Acceleration Mystery: Can we Enable it Programmatically? Yes we (kind of) can! But how?

Note: This might seem like a Super User question at first, but please read it completely -- it's a programming question.
So they removed GDI acceleration from Windows 7, and now the classic theme animations look horrible. And because it was a fundamental design change, there's no way to fix it, right?
Wrong!
I was really surprised today when I switched to Classic view (which turned off Aero) when VLC media player was running. Normally, the maximize/minimize animations look horrible (they barely even show up), but while VLC was running, the animations were perfect, just like on XP! As soon as I closed VLC, they became horrible again. (They were better when media was playing than when the player was idle.)
I'd reproduced this some time before when I'd fired up a 3D game and noticed that the animations had improved, but I'd assumed it had been a DirectX-related issue. I'd tried to figure out which function calls had caused the improvement, but with no luck. So I was really surprised today when I noticed the same behavior with VLC, because it was not playing video, only audio (not even visualizations!) -- and yet playing audio improved my GDI graphics performance, making me think that maybe, just maybe, Windows 7 does have some sort of GDI acceleration after all. (?)
In case this makes a difference, my graphics card is an NVIDIA GT 330M, and PowerMizer is off. I've controlled for every variable I can think of except whether or not VLC was running, so I can pretty much rule out anything related to features of the graphics card.
So, now for my question:
Does anyone have any idea which API call(s) might be causing this improvement, and whether they are actually related to graphics or not?
I've tried making a program that calls IDirectDraw::CreateSurface and simply runs in the background (hoping that it would do the same thing my 3D game did), but no, there wasn't any difference. I'm not even sure it's a graphics-related API call that is causing this, since, like I said, VLC was playing music, not video. It's a mystery to me why the performance would improve when a multimedia app is running, so any insight into what's going on here would be appreciated. :)
It could just be a factor of the system clock tick period. Running VLC probably changes the clock tick to every 1ms, causing the animations to run more smoothly.
Use Clockres to check the system timer resolution with and without VLC running to see if it's making a difference.
See the timeBeginPeriod function for how to set the time period yourself. Keep in mind that the shorter the period, the less time your CPU will be able to sleep between ticks, and the hotter it will run.
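If you want to test that theory from your own code rather than by leaving VLC running, a bare-bones program that simply holds a 1 ms timer resolution while it is alive would look roughly like this (timeBeginPeriod and timeEndPeriod live in winmm.lib):

    #include <windows.h>
    #include <iostream>
    #pragma comment(lib, "winmm.lib")   // or add winmm.lib to the linker inputs

    int main() {
        // Request a 1 ms tick, the same resolution multimedia apps typically ask for.
        if (timeBeginPeriod(1) != TIMERR_NOERROR) {
            std::cerr << "1 ms timer resolution not supported\n";
            return 1;
        }

        std::cout << "Timer resolution raised to 1 ms; try the window animations now.\n"
                     "Press Enter to restore the default and exit.\n";
        std::cin.get();

        timeEndPeriod(1);   // always pair with the matching timeBeginPeriod call
        return 0;
    }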

Voice Alteration Algorithm

Could somebody point me to a voice alteration algorithm? Preferably in Java or C? Something that I could use to change a stream of recorded vocals into something that sounds like Optimus Prime. (FYI: Optimus Prime is the lead Autobot from Transformers, with a very distinctive-sounding voice... not everybody may know this.) Is there an open-source solution?
You can't just change the sample rate. The human voice has formants. What you want to do is move the formants. That should be your line of research.
Read about vocoders and filter banks.
Could you provide a link as an example? I haven't seen the film, so I'm just speculating.
Audacity is an open-source wave editor which includes effect filters - since it's open source you could see what algorithms they use.
Not knowing what it sounded like, I figured it would be a vocoder, but after listening to a few samples, it's definitely not a vocoder (or if it is, it's pretty low in the mix.) It sounds like maybe there's a short, fast delay effect on it, along with some heavy EQ to make it sound kind of like a tiny AM radio, and maybe a little ring modulator. I think that a LOT of the voice actor's voice is coming through relatively intact, so a big part of the sound is just getting your own voice to sound right, and no effects will do that part for you.
All the above info is just me guessing based on having messed around with a lot of guitar pedals over the years, and done some amateur recording and sound-effects making, so I could be way off.
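If you want to experiment with the ring-modulator part of that guess, its core really is just multiplying the voice samples by a sine-wave carrier. A minimal sketch follows; the carrier frequency is an arbitrary starting point, and a convincing robot voice would still need the EQ and delay layers described above:

    #include <cmath>
    #include <vector>

    // Multiply the input signal by a sine carrier -- the classic "robot voice" building block.
    // Low carriers (roughly 30-100 Hz) give a metallic growl; higher ones sound more like a
    // detuned radio. sampleRate must match whatever your recording uses.
    std::vector<float> ringModulate(const std::vector<float>& in,
                                    float sampleRate, float carrierHz) {
        const double twoPi = 2.0 * std::acos(-1.0);
        std::vector<float> out(in.size());
        for (std::size_t n = 0; n < in.size(); ++n) {
            double carrier = std::sin(twoPi * carrierHz * n / sampleRate);
            out[n] = static_cast<float>(in[n] * carrier);
        }
        return out;
    }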

How does the Half-Life 2 multiplayer protocol work?

I was wondering how the Half-Life 2 multiplayer protocol works in mods like Counter-Strike: Source or Day Of Defeat: Source. I believe that they use some kind of obfuscation and proprietary compression algorithm. I would like to know how different kinds of messages are encoded in a packet.
Half-Life 2, Counter-Strike: Source, etc. all use Valve's Source engine. Valve has a developer wiki which covers a lot of this stuff (it's pretty cool, check it out!).
These articles might interest you:
Latency Compensating Methods in Client/Server In-game Protocol, Design and Optimization
Source Multiplayer Networking
You should check out Luigi Auriemma's papers on Half-Life. You'll find a packet decoder and some disassembled algorithms there, too.
Reverse engineering information on Half-Life 2 may be hard to come by, because of its relevance for cheating. I'd guess boards like mpcforum are your best bet.
This is a really complicated question, my suggestion would be to look at some of the open source network game engines:
http://www.hawksoft.com/hawknl/
http://www.zoidcom.com/
http://sourceforge.net/projects/opentnl
http://www.gillius.org/gne/
You could also look at the source code for the Quake series, upon which the original Half-Life engine is based.
Though details might differ, the general framework is pretty old. Here's a quick overview:
In early FPS games such as Doom and Quake, the player's position was updated only on the server's response to your move command. That is, you pressed the move-forward button, the client communicated that to the server, and the server updated your position in its memory and then relayed a new game state to your client with your new position. This led to very laggy play: shooting, and even moving in narrow corridors, was a game of predicting lag.
Newer games let the client handle the player's shooting and movement by itself. Though this led to lag-free movement and firing, it opened up more possibilities for cheating by hacking the client code. Now every player moves and fires independently on their own computer and communicates to the server what they have done. This only breaks down when two players bump into one another or try to grab a power-up at the same time.
Now the server has this stream of client state coming from each player and has to sync it all into a coherent game. The trick is to measure each player's latency. The ultimate goal is to be able to fire a very low-latency weapon (such as a sniper rifle or railgun) at an enemy moving sideways and have it hit correctly. If the latency of each player is known, then suppose player A (latency 50 ms) fires a gun at B (latency 60 ms): to register a hit, the shot has to hit B where B was 60 ms ago, as seen from where A was 50 ms ago.
That's a very rough overview but should give you the general idea.
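To make the rewind idea a bit more concrete, here is a heavily simplified sketch of the server-side bookkeeping: keep a short history of each player's positions, and when a shot report arrives, test the hit against where the target was the appropriate number of milliseconds ago. All names here are hypothetical, the hit test itself is left abstract, and real engines additionally interpolate between snapshots and clamp how far back they are willing to rewind.

    #include <cmath>
    #include <deque>

    struct Snapshot {
        double timeMs;    // server time when the snapshot was recorded
        float  x, y, z;   // the player's position at that time
    };

    struct PlayerHistory {
        std::deque<Snapshot> snapshots;   // newest at the back; keep a second or so of history

        // Return the recorded position closest to (nowMs - rewindMs).
        // Assumes the history is non-empty.
        Snapshot positionAt(double nowMs, double rewindMs) const {
            const double wanted = nowMs - rewindMs;
            Snapshot best = snapshots.back();
            double bestDiff = std::fabs(best.timeMs - wanted);
            for (const Snapshot& s : snapshots) {
                double diff = std::fabs(s.timeMs - wanted);
                if (diff < bestDiff) { bestDiff = diff; best = s; }
            }
            return best;
        }
    };

    // Hypothetical geometry test supplied elsewhere.
    bool checkHit(const Snapshot& target, float shotX, float shotY, float shotZ);

    // When a shot report arrives, rewind the target by the measured latency
    // (as described above) and run the hit test against that old position.
    bool serverHitTest(const PlayerHistory& target, double nowMs, double rewindMs,
                       float shotX, float shotY, float shotZ) {
        Snapshot rewound = target.positionAt(nowMs, rewindMs);
        return checkHit(rewound, shotX, shotY, shotZ);
    }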
I suggest that you look into the Quake 1-3 engines. They are available with source code. Half-Life's protocol might be a bit different, but most likely close enough.

Resources