We wrote a game where you shoot targets as quickly as you can, and someone wrote a program to automatically shoot the targets (via simulated mouse movement/clicks).
How can I detect this? I thought of monitoring mouse speed, etc., but it seems too easy to get around. If it helps, the game runs on Windows / is written in C++. Is there no way to just tell that the movement isn't coming from hardware?
See How to detect if mouse click is legit or automated? Essentially it says that it can't be done reliably, for several reasons, one of which is that the programmer will find a way to get around whatever you try to do.
As suggested, heuristics seem to be a viable option. Monitor variable mouse speed, miss rates and accuracy, pauses, sudden jumps, etc. The better the heuristics get, the closer the programmer has to make his bot behave to a real player, and the less of an advantage the bot gains.
PS: Even for well-known shooter games like Counter-Strike, there are hacks that shoot everything perfectly for you, so either they haven't figured out how to stop it, or they don't want to.
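One of the heuristics mentioned above can be sketched portably: humans cannot click with machine-perfect regularity, so an extremely low variance in the intervals between clicks is suspicious. This is only an illustration, not a tuned detector; the function name and the 5% threshold are invented for the sketch, and a determined bot author can defeat it by adding jitter.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Flags click streams whose inter-click intervals are suspiciously
// regular, using the coefficient of variation (stddev / mean) of the
// intervals. Threshold is a made-up illustration, not a tuned value.
bool looksAutomated(const std::vector<double>& clickTimesMs) {
    if (clickTimesMs.size() < 3) return false;  // not enough data to judge
    std::vector<double> intervals;
    for (size_t i = 1; i < clickTimesMs.size(); ++i)
        intervals.push_back(clickTimesMs[i] - clickTimesMs[i - 1]);
    double mean = 0.0;
    for (double d : intervals) mean += d;
    mean /= intervals.size();
    double var = 0.0;
    for (double d : intervals) var += (d - mean) * (d - mean);
    var /= intervals.size();
    double cv = std::sqrt(var) / mean;  // coefficient of variation
    return cv < 0.05;  // human timing jitter is normally far above 5%
}
```

In a real game you would combine several such signals (speed, accuracy, pauses) rather than rely on any single one.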
Related
So I play this game on Safari (Mac), Freeriderhd.com to be exact. I want to know if there is any way to lower the FPS of that specific game while I am playing it in Safari, because it would make the game a lot easier. If there is no way to do this, can I use a macro to press the spacebar indefinitely with x amount of time between each press? If someone can help with any of these questions, that would be great. Thanks.
I don't really know - sorry - but this seems unlikely. FPS is usually controlled internally by the game software. I have seen very few games that allow the user to change the frame rate, or at least the requested frame rate, and that is usually for the purpose of raising the FPS, not lowering it. Usually the game has an optimum frame rate that it strives for. Especially browser-based ones (which I've written some of).
Your alternative sounds like a means to "bog down the browser" which might peg your processor; not a thing I would tempt the Fates with, personally.
Basically, what I want to achieve is to run a program in an environment that will answer any value the program asks for based on criteria decided by me. For example, games regularly query the system for the time in order to drive animations; if I had control over the time values passed to the program, I could control the speed of the animation (and perhaps clear some difficult rounds easily :P).
On similar grounds, keystrokes and mouse movements could also be controlled. (I have tried Java's Robot class but did not find it satisfactory: it slows the system {or perhaps my implementation was bad}, and its commands are executed on the currently focused program and can't be targeted at a specific one.)
Any graceful way of doing this, or some pointers on achieving it, would be highly appreciated.
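The core idea in the question, controlling the time values a program sees, can be sketched without any OS patching by routing every time query through one seam you control. The class and method names below are invented for the sketch; forcing an unmodified binary through such a seam is the hard part, and would need something like API hooking on Windows or an LD_PRELOAD shim on Linux.

```cpp
#include <cassert>
#include <cstdint>

// Sketch of a "controlled environment" clock: the game asks this object
// for the time instead of the OS, so reported time can be slowed down
// or sped up at will. advanceRealMs() stands in for real wall-clock
// time passing; nowMs() is what the game would see.
class ScaledClock {
public:
    explicit ScaledClock(double scale) : scale_(scale) {}
    void advanceRealMs(std::int64_t ms) { realMs_ += ms; }
    std::int64_t nowMs() const {
        return static_cast<std::int64_t>(realMs_ * scale_);
    }
private:
    double scale_;
    std::int64_t realMs_ = 0;
};
```

With a scale of 0.5, every animation driven by nowMs() runs at half speed; with 2.0, at double speed.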
Okay, I'm going to sound like an idiot with this one. Here goes.
I've been doing iOS development for about a year now, but only tonight have I started doing anything OpenGL related. I've followed Jeff LaMarche's wonderful guide and I'm drawing a neat looking triangle, and I got it to flip around and stuff. I'm one entertained programmer.
Okay, here's the stupid question part: How can I set somewhere for OpenGL to perform glRotatef and glDrawArrays + friends continuously, or at a set frames per second? I've tried Googling it, but really can't come up with good search terms.
Thanks in advance, and get ready to field a ton more of these questions.
While the others make good suggestions for the general case of OpenGL ES, I know that you're probably working on iOS here, Will, so there's a better platform-specific alternative. In your case, I believe you'll be better served by CADisplayLink, which fires off callbacks that are synchronized with the refresh rate of the screen. Using this, you'll get far smoother updates than with a timer or some kind of polling within a loop.
This is particularly effective when combined with Grand Central Dispatch, as I describe in my answer here. When I switched from using a loop to CADisplayLink for updates, my rendering became much smoother on all iOS devices due to fewer dropped frames. Adding GCD on top of that made things even better.
You can refer to my Molecules code for an example of this in action (see the SLSMoleculeGLViewController for how my autorotation is animated with this). Apple's OpenGL ES application template also uses CADisplayLink for updates, last I checked.
You should read up on the concept of game loops.
http://entropyinteractive.com/2011/02/game-engine-design-the-game-loop/ is a good resource to get you started.
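A common pattern that game-loop articles like the one linked above describe is the fixed timestep: the simulation advances in fixed increments no matter how irregular the frame times are. A minimal, testable sketch (the function name is invented, and real update/render calls are left as comments):

```cpp
// Minimal fixed-timestep loop: each frame's elapsed time goes into an
// accumulator, and the simulation is stepped in fixed stepMs chunks
// until the accumulator is drained. Returns how many simulation
// updates ran for a given sequence of frame durations.
int runFixedTimestep(const int* frameMs, int frames, int stepMs) {
    int accumulator = 0;
    int updates = 0;
    for (int i = 0; i < frames; ++i) {
        accumulator += frameMs[i];       // time produced by this frame
        while (accumulator >= stepMs) {  // consume it in fixed steps
            accumulator -= stepMs;
            ++updates;                   // update(stepMs) would go here
        }
        // render() would go here, once per frame
    }
    return updates;
}
```

The payoff is that a long frame simply triggers several catch-up updates, so game logic stays deterministic regardless of rendering speed.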
Well, I'm not an expert on the subject, but can't you just put the rotate/draw commands in a while loop that ends when a certain button is pressed or when a specific event occurs?
Note: This might seem like a Super User question at first, but please read it completely -- it's a programming question.
So they removed GDI acceleration from Windows 7, and now the classic theme animations look horrible. And because it was a fundamental design change, there's no way to fix it, right?
Wrong!
I was really surprised today when I switched to Classic view (which turned off Aero) when VLC media player was running. Normally, the maximize/minimize animations look horrible (they barely even show up), but while VLC was running, the animations were perfect, just like on XP! As soon as I closed VLC, they became horrible again. (They were better when media was playing than when the player was idle.)
I'd reproduced this sometime before when I'd fired up a 3D game and noticed that the animations had improved, but I'd assumed that it had been a DirectX-related issue. I'd tried to figure out which function calls had caused the improvement, but with no luck. So I was really surprised today when I noticed the same behavior with VLC, because it was not playing video, only audio (not even visualizations!) -- and yet playing audio improved my GDI graphics performance, making me think that maybe, just maybe, Windows 7 does have some sort of GDI acceleration after all. (?)
In case this makes a difference, my graphics card is an NVIDIA GT 330M, and PowerMizer is off. I've controlled for every variable I can think of except for whether or not VLC was running, so I can pretty much rule out anything related to features of the graphics card.
So, now for my question:
Does anyone have any idea which API call(s) might be causing this improvement, and whether they are actually related to graphics or not?
I've tried making a program that calls IDirectDraw::CreateSurface and simply runs in the background (hoping that it would do the same thing as my 3D game did), but no; there wasn't any difference. I'm not even sure whether it's a graphics-related API call that might be causing this, since, like I said, VLC was playing music, not video. It's a mystery to me why the performance would improve when a multimedia app is running, so any insight into what's going on here would be appreciated. :)
It could just be a factor of the system clock tick period. Running VLC probably changes the clock tick to every 1ms, causing the animations to run more smoothly.
Use Clockres to check the system timer resolution with and without VLC running to see if it's making a difference.
See the timeBeginPeriod function for how to set the time period yourself. Keep in mind that the shorter the period, the less time your CPU will be able to sleep between ticks, and the hotter it will run.
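timeBeginPeriod itself is Windows-only, but the effect the answer describes is easy to observe portably: with a coarse system timer, short sleeps overshoot badly (on a default Windows tick of ~15.6 ms, a 1 ms sleep can take a full tick), and after timeBeginPeriod(1) the overshoot shrinks to roughly 1 ms. A small measurement sketch (the function name is invented):

```cpp
#include <chrono>
#include <thread>

// Measures how long a requested sleep actually takes, in milliseconds.
// Comparing the result with the requested duration reveals the
// effective timer granularity of the system. This is an observation
// tool, not a fix.
long long measuredSleepMs(int requestedMs) {
    auto start = std::chrono::steady_clock::now();
    std::this_thread::sleep_for(std::chrono::milliseconds(requestedMs));
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(
               end - start).count();
}
```

Running this with and without VLC open would give the same evidence as Clockres, directly from your own process.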
I was wondering how the Half-Life 2 multiplayer protocol works in mods like Counter-Strike: Source or Day Of Defeat: Source. I believe that they use some kind of obfuscation and proprietary compression algorithm. I would like to know how different kinds of messages are encoded in a packet.
Half-Life 2, Counter-Strike: Source, etc. all use Valve's Source engine. Valve has a developer wiki that covers a lot of this stuff (it's pretty cool, check it out!)...
These articles might interest you:
Latency Compensating Methods in Client/Server In-game Protocol, Design and Optimization
Source Multiplayer Networking
You should check out Luigi Auriemma's papers on Half-Life. You'll find a packet decoder and some disassembled algorithms there, too.
Reverse-engineering information on Half-Life 2 may be hard to come by, because of its relevance to cheating. I'd guess boards like mpcforum are your best bet.
This is a really complicated question; my suggestion would be to look at some of the open-source network game engines:
http://www.hawksoft.com/hawknl/
http://www.zoidcom.com/
http://sourceforge.net/projects/opentnl
http://www.gillius.org/gne/
You could also look at the source code for the Quake series, upon which the original Half-Life engine is based.
Though details might differ, the general framework is pretty old. Here's a quick overview:
In early FPS games such as Doom and Quake, the player's position was updated only on the server's response to your move command. That is, you pressed the move-forward button, the client communicated that to the server, the server updated your position in its memory and then relayed a new game state to your client with your new position. This led to very laggy play: shooting, and even moving in narrow corridors, was a game of predicting lag.
Newer games let the client handle the player's shooting and movement by itself. Though this led to lag-free movement and firing, it opened up more possibilities for cheating by hacking the client code. Now every player moves and fires independently on their own computer and communicates to the server what they have done. This only breaks down when two players bump into one another or try to grab a power-up at the same time.
Now the server has a stream of client state coming from each player and has to sync them into a coherent game. The trick is to measure each player's latency. The ultimate goal is for a very low-latency weapon (such as a sniper rifle or railgun) fired at an enemy moving sideways to register a hit correctly. If the latency of each player is known, then suppose player A (latency 50 ms) fires a gun at B (latency 60 ms): to score a hit, the shot has to hit where B was 60 ms ago, fired from where A was 50 ms ago.
That's a very rough overview but should give you the general idea.
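The rewinding described above is usually called lag compensation, and its core can be sketched in a few lines: the server keeps a short history of each player's position and, when a shot arrives, tests the hit against where the target was one shooter-latency ago rather than where it is now. The class and method names are invented, and 1-D positions keep the sketch small.

```cpp
#include <cmath>
#include <map>

// Sketch of server-side lag compensation: position history per target,
// hit tests performed against the rewound position.
class HitServer {
public:
    void recordPosition(int tickMs, double x) { history_[tickMs] = x; }

    // Did a shot received at serverTimeMs, from a shooter with the given
    // latency, hit a target of the given radius at aim position aimX?
    bool shotHits(int serverTimeMs, int shooterLatencyMs,
                  double aimX, double radius) const {
        int rewoundTime = serverTimeMs - shooterLatencyMs;
        auto it = history_.lower_bound(rewoundTime);  // nearest stored tick
        if (it == history_.end()) return false;
        return std::fabs(it->second - aimX) <= radius;
    }
private:
    std::map<int, double> history_;  // tick time (ms) -> position
};
```

A real engine rewinds full hitboxes and interpolates between stored ticks, but the principle is the same: the server judges the shot in the world the shooter actually saw.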
I suggest that you look into the Quake 1-3 engines; they are available with source code. Half-Life's protocol might be a bit different, but it's most likely close enough.