Anti-flood: storing IPs in session or DB

Right now I'm using an anti-flood function on all my websites:
function flood($name, $time)
{
    // Prefix the key to avoid collisions with other session variables
    $name = 'tmptmptmp' . $name;
    if (!isset($_SESSION[$name])) {
        // First action: record the timestamp and allow it
        $_SESSION[$name] = time();
        return true;
    }
    if (time() - $time > $_SESSION[$name]) {
        // More than $time seconds since the last action: allow and reset
        $_SESSION[$name] = time();
        return true;
    }
    // Too soon since the last action: block
    return false;
}
I use it this way:
if (flood('post', 60)) { /* do something */ }
else { echo "You're posting too fast"; }
Is this approach safe? Or do I need to replace or complement it with a DB table storing IPs and checking whether they made a request recently?

It depends. How likely are your users to clear their cookies to get past your anti-flood protection? I'd say that if they have to log in again, 99% of users won't even bother.
But sure, if you really want a more robust method, store the IPs in the DB. Even that can be defeated by getting a new IP, though.
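If you go the DB route, a minimal sketch could look like the following. It assumes a hypothetical flood_log table with ip and last_action columns and an existing PDO connection $pdo; adapt the names and error handling to your schema.
function flood_db(PDO $pdo, $ip, $seconds)
{
    // Hypothetical table: flood_log(ip VARCHAR PRIMARY KEY, last_action INT)
    $stmt = $pdo->prepare('SELECT last_action FROM flood_log WHERE ip = ?');
    $stmt->execute([$ip]);
    $last = $stmt->fetchColumn();
    if ($last !== false && time() - $last < $seconds) {
        return false; // this IP acted too recently
    }
    // Record the new timestamp (MySQL upsert syntax; adjust for your DB)
    $stmt = $pdo->prepare(
        'INSERT INTO flood_log (ip, last_action) VALUES (?, ?)
         ON DUPLICATE KEY UPDATE last_action = VALUES(last_action)'
    );
    $stmt->execute([$ip, time()]);
    return true;
}
// Usage: if (flood_db($pdo, $_SERVER['REMOTE_ADDR'], 60)) { /* do something */ }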

No definition found for Table yahoo.finance.xchange

I have a service which uses a Yahoo! Finance table yahoo.finance.xchange. This morning I noticed it has stopped working because suddenly Yahoo! started to return an error saying:
{
    "error": {
        "lang": "en-US",
        "description": "No definition found for Table yahoo.finance.xchange"
    }
}
This is the request URL. An interesting fact: if I refresh the query multiple times, I sometimes get a correct response back, but this happens very rarely (maybe 10% of the time). Until a few days ago everything was fine.
Does this mean the Yahoo API is down, or am I missing something because the API was changed? I would appreciate any help.
Since I have the same problem, it started today too, others came to post at exactly the same time, and it still works most of the time, the only explanation I can find is that they have some random database errors on their end, and we can hope this will be solved soon. I also see a failure rate of about 20% when refreshing the query page.
My guess is that they use many servers to handle the requests (let's say 8) and that one of them is empty or is missing that table for some reason, so whenever the query is directed to that server, the error is returned.
Temporary solution: just modify your script to retry 3-4 times (a sketch follows below). That did it for me, because among 5 attempts at least one succeeds.
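A minimal retry sketch in PHP, assuming the endpoint is fetched with file_get_contents and that a failed response contains the error message above; the URL handling and the error check are assumptions to adapt to your query:
function fetch_with_retry($url, $maxAttempts = 5)
{
    for ($i = 0; $i < $maxAttempts; $i++) {
        // @ suppresses the warning on HTTP failure; we handle it below
        $body = @file_get_contents($url);
        if ($body !== false && strpos($body, 'No definition found') === false) {
            return $body; // looks like a good response
        }
        usleep(500000); // wait half a second before retrying
    }
    return false; // all attempts failed
}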
I solved this issue by using quote.yahoo.com instead of the query.yahooapis.com service. Here's my code:
function devise($currency_from, $currency_to, $amount_from)
{
    // Yahoo's CSV quote endpoint: f=l1 returns the last trade price, i.e. the exchange rate
    $url = "http://quote.yahoo.com/d/quotes.csv?s=" . $currency_from . $currency_to . "=X" . "&f=l1&e=.csv";
    $handle = fopen($url, "r");
    $exchange_rate = fread($handle, 2000);
    fclose($handle);
    $amount_to = $amount_from * $exchange_rate;
    return round($amount_to, 2);
}
EDIT: the above no longer works. At this point, let's just forget about Yahoo, lol. Use this instead:
function convertCurrency($from, $to, $amount)
{
    // The compact=ultra response is a single-pair JSON object, e.g. {"USD_EUR": 0.9}
    $url = file_get_contents('https://free.currencyconverterapi.com/api/v5/convert?q=' . $from . '_' . $to . '&compact=ultra');
    $json = json_decode($url, true);
    $rate = implode(" ", $json); // collapse the one-element array to get the rate
    $total = $rate * $amount;
    return round($total, 2);
}
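For example, with illustrative currency codes:
echo convertCurrency('USD', 'EUR', 100); // prints the converted amount, e.g. 90.25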
Same error; I migrated to http://finance.yahoo.com.
Here is a C# example:
private static readonly ILog Log = LogManager.GetCurrentClassLogger();
private int YahooTimeOut = 4000;
private int Try { get; set; }
public decimal GetRate(string from, string to)
{
    var url = string.Format(
        "http://finance.yahoo.com/d/quotes.csv?e=.csv&f=sl1d1t1&s={0}{1}=X", from, to);
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.UseDefaultCredentials = true;
    request.ContentType = "text/csv";
    request.Timeout = YahooTimeOut;
    try
    {
        using (var response = (HttpWebResponse)request.GetResponse())
        {
            var resStream = response.GetResponseStream();
            using (var reader = new StreamReader(resStream))
            {
                var html = reader.ReadToEnd();
                var values = Regex.Split(html, ",");
                var rate = Convert.ToDecimal(values[1], new CultureInfo("en-US"));
                if (rate == 0)
                {
                    // Yahoo occasionally returns 0; wait and retry up to 5 times
                    Thread.Sleep(550);
                    ++Try;
                    return Try < 5 ? GetRate(from, to) : 0;
                }
                return rate;
            }
        }
    }
    catch (Exception ex)
    {
        Log.Warning("Get currency rate from Yahoo fail " + ex);
        Thread.Sleep(550);
        ++Try;
        return Try < 5 ? GetRate(from, to) : 0;
    }
}
I've got the same issue.
I need exchange rates in my app, so I decided to use currencylayer.com API instead - they give 168 currencies, including precious metals and Bitcoin.
I've also written a microservice using webtask.io to cache rates from currencylayer and do cross-rate calculations.
And I've written a blog post about it 🤓
Check it out if you want to run your own microservice, it's pretty easy 😉
I found a solution: in my case, just changing http to https made everything work fine.

How to prevent additional page requests after response sent

I have configured a listener on kernel.request which sets a new response with a redirect when the session time has reached a certain value. The listener works fine and redirects to a certain page on the next request after the session has ended. But my problem is that the page has many links, and if I press the same link multiple times, the initial request with the redirect is cancelled/stopped and a new request is made with the last link pressed, so it bypasses my redirect even though the session has ended and been destroyed. So, my question is: how do I prevent additional requests/link presses after the first request is made?
Here is my code:
public function onKernelRequestSession(GetResponseEvent $event)
{
    $request = $event->getRequest();
    $route = $request->get('_route');
    $session = $request->getSession();
    // Skip the web debug toolbar and requests without a route
    if ((false === strpos($route, '_wdt')) && ($route != null)) {
        $session->start();
        $time = time() - $session->getMetadataBag()->getCreated();
        if ($route != 'main_route_for_idle_page') {
            if (!$session->get("active") && $route == 'main_route_for_site_pages') {
                $session->invalidate();
                $session->set("active", "1");
            } else {
                if ($time >= $this->sessionTime) {
                    // Session expired: wipe it and redirect to the idle page
                    $session->clear();
                    $session->invalidate();
                    $event->setResponse(new RedirectResponse($this->router->generate('main_route_for_idle_page')));
                }
            }
        } else {
            if ($session->get("active")) { // note: was "activ", a typo
                $session->clear();
                $session->invalidate();
            }
        }
    }
}
Thank you.
Idea #1: Simple incremental counter
Each request sends a sequence number as a parameter, which is verified against the expected value on the server.
The server increments the number and sends it back via the response.
The new number is used in future requests.
Basically, if the server expects the sequence number to be 2 and the client sends 1, the request is rejected.
Idea #2: Unique hash each time
Similar to the idea above, but using unique hashes to eliminate the predictable nature of an incremental sequence; a sketch follows below.
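A minimal PHP sketch of the unique-hash idea, assuming the session has been started and the token travels as a hypothetical request parameter named seq_token; in Symfony the same check would live inside the listener:
// Issue a fresh one-time token with each response
function issue_token()
{
    $token = bin2hex(random_bytes(16));
    $_SESSION['seq_token'] = $token;
    return $token; // embed this in the links/forms of the rendered page
}
// Accept a request only if its token matches, then rotate the token
function check_token($submitted)
{
    if (!isset($_SESSION['seq_token']) || !hash_equals($_SESSION['seq_token'], $submitted)) {
        return false; // stale or duplicate request: reject it
    }
    issue_token(); // rotate so the same link can't be replayed
    return true;
}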
I resolved the issue using jQuery: when a link is pressed, the others are disabled, so only one request is made from the page:
var isClicked = false;
$(".menu-link").click(function(e) {
    if (!isClicked) {
        isClicked = true; // the first click goes through
    } else {
        e.preventDefault(); // swallow any further clicks
    }
});
Thanks.

Sync my localhost with my server by FTP

I want to sync my localhost (Windows) with my remote server in real time and automatically, so when I modify, create or delete a file, this tool should update the remote server automatically. The application must keep both servers synchronized in real time. I really need your help. I tried FTPbox, but it doesn't always update; I need something better. I'm working on Windows, but if something exists on Linux, that's fine too.
Thanks
WinSCP has a synchronization feature that does what you want.
For linux users, you can have a look here.
Try Dropbox or Google Drive if you don't need to synchronize too much information.
I'm assuming that you want to synchronize the databases and files. This was my way out; I hope it will be of help to someone.
The first script runs locally, and the other one runs on the remote server.
<?php
//make sure you are connected to your local database

//function to check the internet connection
function is_connected() {
    // try website/port 80 (or 443); @ suppresses the warning on failure
    if ($connected = @fsockopen("www.example.com", 80)) {
        fclose($connected);
        return true; //action when connected
    }
    return false; //action on connection failure
}

//if connected to the internet, do the following...
if (is_connected() == true) {
    echo "connected";
    ini_set('max_execution_time', 3000); //increase this in case of slow internet
    $table_name = TableName::find_all();
    //whatever way you find an array of all the entries in the
    //particular table that you want to sync with the remote table
    $file = 'to_upload_local.php'; //a local file to put the table contents into
    $current = serialize($table_name); //serialize the table contents
    file_put_contents($file, $current); //write the serialized contents to the file
    $remote_file = 'public_html/to_upload_remote.php'; //the file on the remote server to overwrite with the uploaded file
    $ftp_server = "ftp.yourwebsite.org"; //your FTP address
    // set up a basic connection
    $conn_id = ftp_connect($ftp_server);
    // log in with username and password
    $login_result = ftp_login($conn_id, "yourFTPUsername", "yourFTPPassword");
    // turn passive mode on
    ftp_pasv($conn_id, true);
    // upload the file
    if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)) {
        echo "Upload Successful";
    }
    // close the connection
    ftp_close($conn_id);
    //the script called below updates the remote database; it's in the next example
    $call_script = file_get_contents('http://path_to_your_script');
} else {
    //if not connected to the internet...
    echo "offline";
}
?>
The online script that should do the work (the one called in the last line of the previous code) should look something like this:
<?php
//make sure you're connected to the remote database

//comparison callback for array_udiff: matches rows by id, so the
//diff returns local rows whose ids don't exist remotely yet
function compare_objects($obj_a, $obj_b) {
    return $obj_a->id - $obj_b->id;
}

//comparison callback for array_udiff: compares the whole row contents.
//array_udiff needs a real comparator (negative/zero/positive), so we
//compare the serialized representations instead of returning only -1/0
function comparison($obj_a, $obj_b) {
    return strcmp(serialize($obj_a), serialize($obj_b));
}

$file = '../to_upload_remote.php'; //the uploaded file
$current = file_get_contents($file); //load the file
$array = unserialize($current); //unserialize to get the object array
//note: distinct variable names below, so the second array_udiff still
//receives the full remote array (reusing one name broke this before)
$remote_rows = remote_table_name::find_all(); //what is already in the remote database

//if a new value was added locally, create a new entry in the database with the new values
if ($try_new = array_udiff($array, $remote_rows, 'compare_objects')) {
    foreach ($try_new as $entry) {
        $new_row = new remote_table_name();
        $new_row->value = $entry->value;
        //depending on the number of your columns, add the values to the
        //remote table that were not there before;
        //you can use any other suitable method to do this
        if ($new_row->save()) {
            echo "the remote_table_name was saved successfully";
        }
    }
} else {
    echo "same number of rows";
}

//if some values were changed locally, update them with the new values
if ($try_change = array_udiff($array, $remote_rows, 'comparison')) {
    foreach ($try_change as $entry) {
        $row = remote_table_name::find_by_id($entry->id);
        $row->value = $entry->value;
        //depending on the number of your columns, update the values
        //in the remote table;
        //you can use any other suitable method to do this
        if ($row->save()) {
            echo "the remote_table_name was saved successfully";
        }
    }
} else {
    echo "All values match";
}
?>
So, any time the first script is executed, it reads the local table, takes all the values and puts them in the local file, uploads the local file to replace the one in the remote folder, then calls a remote script that unserializes the uploaded table, compares it with the online table, and makes the necessary updates.

Pagination in CodeIgniter, need to disable query strings so we can cache

We have a CI installation that has the following setting in our config...
$config['enable_query_strings'] = TRUE;
We need this in order for another area of our application to work correctly with a third-party API. What's happening, however, is that pagination defaults to a query-string method of paging, which doesn't play well with caching.
Right now, they look like this...
http://localhost/something/?&page=6
It's not playing well with caching, mainly because every page URL looks like the same page to CI. My goal is to switch to the below example without messing with global settings for the rest of my application.
I've been trying for hours to find a way to disable the above setting only within this single part of the application, so that we can properly have separate URLs for the pagination, like this...
http://localhost/something/1
http://localhost/something/2
http://localhost/something/3
So far, I have been unable to override that setting for this controller, and honestly, I'm not sure there's even a way to actually do it. Any help is appreciated. There's got to be some method of disabling a feature for a single controller somehow.
Could you use routing?
$route['something/page/(:num)'] = "something?&page=$1";
edit: to turn off pagination query strings while keeping $config['enable_query_strings'] = TRUE;
system/libraries/Pagination.php
~line 134
change
if ($CI->config->item('enable_query_strings') === TRUE OR $this->page_query_string === TRUE)
{
    if ($CI->input->get($this->query_string_segment) != 0)
    {
        $this->cur_page = $CI->input->get($this->query_string_segment);
        // Prep the current page - no funny business!
        $this->cur_page = (int) $this->cur_page;
    }
}
else
{
    if ($CI->uri->segment($this->uri_segment) != 0)
    {
        $this->cur_page = $CI->uri->segment($this->uri_segment);
        // Prep the current page - no funny business!
        $this->cur_page = (int) $this->cur_page;
    }
}
to
if ($CI->uri->segment($this->uri_segment) != 0)
{
    $this->cur_page = $CI->uri->segment($this->uri_segment);
    // Prep the current page - no funny business!
    $this->cur_page = (int) $this->cur_page;
}
~line 196
if ($CI->config->item('enable_query_strings') === TRUE OR $this->page_query_string === TRUE)
{
    $this->base_url = rtrim($this->base_url).'&'.$this->query_string_segment.'=';
}
else
{
    $this->base_url = rtrim($this->base_url, '/') .'/';
}
to
$this->base_url = rtrim($this->base_url, '/') .'/';
That might do it. Or maybe better form would be to use a hook instead of editing the core file...
Simple solution...
$this->config->set_item('enable_query_strings',FALSE);
Just put this before you call your pagination logic in the controller. Thanks go to Taftse in the #codeigniter IRC channel for this simple override.
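For example, in a hypothetical controller method (the route, segment index and row counts are illustrative), the override goes right before the pagination library is loaded:
public function page()
{
    // Disable query strings for this request only, so the pagination
    // library builds segment URLs like http://localhost/something/2
    $this->config->set_item('enable_query_strings', FALSE);
    $config['base_url'] = 'http://localhost/something';
    $config['uri_segment'] = 2;  // the page number is the second URI segment
    $config['total_rows'] = 200; // illustrative
    $config['per_page'] = 20;
    $this->load->library('pagination', $config);
    echo $this->pagination->create_links();
}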

Simple Login Attempt counter using MVC 3 and Ajax

OK, so this is driving me nuts. I am probably tired and the answer is staring me in the face.
public ActionResult _Login(LoginViewModel loginViewModel)
{
    if (User.Identity.IsAuthenticated)
    {
        return JavaScript("window.location=" + "'" + loginViewModel.ReturntUrl + "'");
    }
    if (ModelState.IsValid)
    {
        if (Session["loginCount"] == null) // set up the session var with 0 count
        {
            Session.Add("loginCount", 0);
        }
        _loginStatus = _authenticationService.Authenticate(loginViewModel.SiteLoginViewModel.EmailAddress,
            loginViewModel.SiteLoginViewModel.Password);
        if (!_loginStatus.UserExists)
        {
            ModelState.AddModelError("SiteLoginViewModel.EmailAddress", _loginStatus.ErrorMessage);
            return PartialView();
        }
        // This branch runs when the user exists but typed the wrong password
        if (!_loginStatus.IsAuthenticated)
        {
            Session["loginCount"] = (int)Session["loginCount"] + 1;
            Response.Write(Session["loginCount"]); // Counter is incremented twice!!!!
            //_userService.SetInvalidLoginAttempts(loginViewModel.SiteLoginViewModel.EmailAddress, 1);
            ModelState.AddModelError("SiteLoginViewModel.EmailAddress", _loginStatus.ErrorMessage);
            return PartialView();
        }
        // Delete any OpenID cookies
        var openidCookie = new HttpCookie("openid_provider");
        if (openidCookie.Value != null)
        {
            openidCookie.Expires = DateTime.Now.AddDays(-1d);
            Response.Cookies.Add(openidCookie);
        }
        _userService.SetInvalidLoginAttempts(loginViewModel.SiteLoginViewModel.EmailAddress, 0);
        SetAuthTicket(loginViewModel.SiteLoginViewModel.EmailAddress, _userService.GetUserId(loginViewModel.SiteLoginViewModel.EmailAddress),
            loginViewModel.SiteLoginViewModel.RemeberLogin);
        if (!string.IsNullOrEmpty(loginViewModel.ReturntUrl))
        {
            return JavaScript("window.location=" + "'" + loginViewModel.ReturntUrl + "'");
        }
        return JavaScript("location.reload(true)");
    }
    return PartialView();
}
It almost seems as though the request is being processed twice; however, when I step through with the debugger, I only see it once. Please ignore the unimportant parts of the action method.
This looks like you are trying to code for stuff that you automatically get with .NET's membership provider.
Your first line, "User.Identity.IsAuthenticated", looks like you are using part of the membership provider, but it would seem the rest is trying to code around it.
Also, why are you returning JavaScript to direct the user's browser to a new URL? Regardless of what .NET platform you are on, there are plenty of ways to redirect the user's browser without having to return raw JavaScript, which in my book is REALLY BAD.
This fixed the problem and will be removed rather than commented out. Including this twice is very bad, obviously :)
