I want to log out the user if they are inactive for a specific duration. Using sess_expiration in the config file, the timeout is measured from the login time, not from the start of inactivity.
So how can I do this using CodeIgniter?
You can store the time in the session when the user logs in, like this:
$_SESSION['loginTime'] = time();
and when the user performs any action in the system, check whether they have exceeded the specified idle time; if so, log them out, otherwise refresh the timestamp:
if (time() - $_SESSION['loginTime'] > $yourtime) {
    logout();
} else {
    $_SESSION['loginTime'] = time();
}
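Since the question is about CodeIgniter, here is a minimal sketch of the same idea using CodeIgniter's session library; the 15-minute limit, the 'login' route, and the check_idle_timeout() method name are assumptions for illustration:
// A method you might put on a base controller and call at the start of every authenticated request.
// Assumes the url helper is loaded so redirect() is available.
public function check_idle_timeout($idle_limit = 900) // 15 minutes
{
    $last = $this->session->userdata('last_activity');
    if ($last && (time() - $last) > $idle_limit) {
        // Idle for too long: destroy the session and send the user back to the login page
        $this->session->sess_destroy();
        redirect('login');
    }
    // Still active: refresh the timestamp
    $this->session->set_userdata('last_activity', time());
}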
I do not want the user to be logged out of the site even if they are idle for a long time; it is okay if they are logged out when they close the browser.
session.gc_maxlifetime = 180000
session.gc_probability = 1
session.gc_divisor = 1
session.save_path = "/var/lib/php/session"
cookie_lifetime = 0
Is there any setting that I am missing?
Please help.
To set the lifetime I have added the following code.
session_set_cookie_params(21600);
session_start();
You need to extend the lifetime of the cookie. Remember that the session id is stored in the user's browser in a cookie, so set session.cookie_lifetime to a bigger value too.
session_set_cookie_params(21600);
session_start();
21600 seconds is only 6 hours
Try setting it to something bigger, maybe even PHP_INT_MAX.
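For example, a minimal sketch of that suggestion, keeping the session cookie and the server-side garbage collection lifetime in sync (the 180000-second value just mirrors the gc_maxlifetime from the question; pick whatever lifetime you actually want):
// Must run before session_start()
ini_set('session.gc_maxlifetime', 180000);  // how long the server keeps the session data
session_set_cookie_params(180000);          // how long the browser keeps the session id cookie
session_start();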
I don't know whether it will help; I just wrote this to give you an idea of how. Cookies are saved in the user's browser, so:
$cookieName = "userscookie";
$lifetime = time() + (60*60*24); // one day life
if(isset($_COOKIE[$cookieName])) {
$value = $_COOKIE[$cookieName];
// one day life from day of access
setcookie($cookieName, $value, $lifetime);
} else {
$value = "this value to store";
setcookie($cookieName, $value, $lifetime);
}
Thank you.
I am trying to understand this Login example.
There is a procedure called "checkWithServerIfSessionIdIsStillLegal".
I am wondering how the server can validate whether a session is still valid, because the session id is always different after the user closes the browser.
Can someone explain to me how this works?
By setting the session id as a cookie in onModuleLoad (when the user logs in) and checking it again when they access the login page.
String sessionID = result.getSessionId();
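// cookie lifetime: one day in milliseconds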
final long DURATION = 1000 * 60 * 60 * 24 * 1;
Date expires = new Date(System.currentTimeMillis() + DURATION);
Cookies.setCookie("sid", sessionID, expires, null, "/", false);
Here is the complete implementation of checkWithServerIfSessionIdIsStillLegal(), which you are referring to.
Take a look at the following link.
Cannot use same cookie for multiple GWT applications
This might solve your problem.
I need to renew a long-lived access token. I read the
"Renew long lived access token server side" topic and wrote the following code:
<?php
$code = $_REQUEST["code"];
if(empty($code)) {
$dialog_url = "https://www.facebook.com/dialog/oauth?"
. "client_id=$app_id"
. "&redirect_uri=$my_url"
. "&scope=..."
;
echo("<script> top.location.href='" . $dialog_url . "'</script>");
}
else
{
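// Exchange the ?code= parameter for a user access token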
$response = file_get_contents("https://graph.facebook.com/oauth/access_token?"
. "client_id=$app_id"
. "&redirect_uri=$my_url"
. "&client_secret=$app_secret"
. "&code=$code"
);
$params = null;
parse_str($response, $params);
$access_token=$params['access_token'];
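// Exchange that token for a long-lived (60-day) token via fb_exchange_token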
$response = file_get_contents("https://graph.facebook.com/oauth/access_token?"
. "client_id=$app_id"
. "&client_secret=$app_secret"
. "&redirect_uri=$my_url"
. "&grant_type=fb_exchange_token"
. "&fb_exchange_token=$access_token"
);
}
?>
On the first invocation it acquires a 60-day access token all right. I expected that on subsequent invocations it would acquire another 60-day token (possibly with the same value), and that in the Debugger https://developers.facebook.com/tools/debug I would see the issue time and expiration time change, but the times do not change. What's wrong with my scenario?
Have you compared the tokens to each other? Facebook will send you back the existing access token when the same call is made in less than 24 hours. Also, tokens are set to not expire if you have also requested page tokens for the user. See my answer here: Facebook Page Access Tokens - Do these expire? for more info on this subject.
One way you can be sure to get a new token each time is to revoke access by making an HTTP DELETE call to /PROFILE_ID/permissions and then requesting a new token. The only bad thing about this is that it will require you to put the user through the OAuth dialog again.
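A minimal sketch of that revoke call with cURL, assuming $profile_id and $access_token already hold your values:
// Revoke the app's permissions so the next OAuth flow issues a fresh token
$ch = curl_init("https://graph.facebook.com/$profile_id/permissions?access_token=$access_token");
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'DELETE');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);   // returns "true" on success
curl_close($ch);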
I'm receiving a 403 User Rate Limit Exceeded error when making queries, but I'm sure I'm not exceeding it.
In the past I've reached the rate limit doing inserts, and it was reflected in the job list as
[errorResult] => Array
(
[reason] => rateLimitExceeded
[message] => Exceeded rate limits: too many imports for this project
)
But in this case the job list doesn't reflect the query (neither error nor done), and studying the job list I haven't reached the limits or come close to reaching them (no more than 4 concurrent queries, each processing 692297 bytes).
I have billing active, and I've made only 2.5K queries in the last 28 days.
Edit: The user limit is set to 500.0 requests/second/user
Edit: Error code received
User Rate Limit Exceeded User Rate Limit Exceeded
Error 403
Edit: code that I use to make the query jobs and get results
function query_data($project,$dataset,$query,$jobid=null){
$jobc = new JobConfigurationQuery();
$query_object = new QueryRequest();
$dataset_object = new DatasetReference();
$dataset_object->setProjectId($project);
$dataset_object->setDatasetId($dataset);
$query_object->setQuery($query);
$query_object->setDefaultDataset($dataset_object);
$query_object->setMaxResults(16000);
$query_object->setKind('bigquery#queryRequest');
$query_object->setTimeoutMs(0);
$ok = false;
$sleep = 1;
while(!$ok){
try{
$response_data = $this->bq->jobs->query($project, $query_object);
$ok = true;
}catch(Exception $e){ //sleep and retry when the BQ API is not available
sleep($sleep);
$sleep += rand(0,60);
}
}
try{
$response = $this->bq->jobs->getQueryResults($project, $response_data['jobReference']['jobId']);
}catch(Exception $e){
//do nothing, the loop below retries on its own
}
$tries = 0;
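// Poll getQueryResults until the job reports completion, at most 10 times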
while(!$response['jobComplete']&&$tries<10){
sleep(rand(5,10));
try{
$response = $this->bq->jobs->getQueryResults($project, $response_data['jobReference']['jobId']);
}catch(Exception $e){
//do nothing, it retries on its own
}
$tries++;
}
$result=array();
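// Map each row's positional fields back to column names using the schema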
foreach($response['rows'] as $k => $row){
$tmp_row=array();
foreach($row['f'] as $field => $value){
$tmp_row[$response['schema']['fields'][$field]['name']] = $value['v'];
}
$result[]=$tmp_row;
unset($response['rows'][$k]);
}
return $result;
}
Are there any other rate limits? Or is it a bug?
Thanks!
You get this error trying to import CSV files, right?
It could be one of these reasons:
Import Requests
Rate limit: 2 imports per minute
Daily limit: 1,000 import requests per day (including failures)
Maximum number of files to import per request: 500
Maximum import size per file: 4 GB
Maximum import size per job: 100 GB
The query() call is, in fact, limited by the 20-concurrent limit. The 500 requests / second / user limit in developer console is somewhat misleading -- this is just the number of total calls (get, list, etc) that can be made.
Are you saying that your query is failing immediately and never shows up in the job list?
Do you have the full error that is being returned? I.e., does the 403 message contain any additional information?
thanks
I've solved the problem by using only one server to make the requests.
Looking at what I was doing differently in the night cron jobs (which never fail), the only difference was that I was using a single client on one server instead of different clients on 4 different servers.
Now I have one script on one server that manages the same number of queries, and it never gets the User Rate Limit Exceeded error.
I think there is a bug when managing many clients or many active IPs at a time, although the total number of threads never exceeds 20.
I'm working on a blue-green deployment pattern for a system with continuous delivery. I would like to force users to switch server after 30 minutes. I'm developing my application in JSF 2. Is there an easy way to make sessions end after a certain time, no matter whether the user is active or not?
Implement a filter which does basically the following job:
HttpSession session = request.getSession();
if (session.isNew()) {
session.setAttribute("start", System.currentTimeMillis());
}
else {
long start = (Long) session.getAttribute("start");
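// expire the session once it is older than 30 minutes (30 * 60 * 1000 ms)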
if (System.currentTimeMillis() - start > (30 * 60 * 1000)) {
session.invalidate();
response.sendRedirect("expired.xhtml");
return;
}
}
chain.doFilter(request, response);
Map this on the servlet name or URL pattern of interest.
Note that this is sensitive to changes in the system clock, so make sure your server's clock stays stable (e.g. runs on UTC all the time). Otherwise, better to grab System#nanoTime() instead.