I'm trying to build a report of all the objects in all the projects of our org's Cloud Storage. I'm using this repo from Google Professional Services, as it does exactly what we want: https://github.com/GoogleCloudPlatform/professional-services/tree/main/tools/gcs2bq
We want to use containers instead of just the Go code on a Cloud Function, mainly for portability.
Locally everything is fine and the program behaves as expected, but when I try it in Cloud Run things get tricky. From what I understand, the Go part needs to listen on a port, so I added the following at the beginning of main so the container can be deployed (which it is):
// Determine port for HTTP service.
port := os.Getenv("PORT")
if port == "" {
    port = "8080"
    log.Printf("defaulting to port %s", port)
}

// Start HTTP server.
log.Printf("listening on port %s", port)
if err := http.ListenAndServe(":"+port, nil); err != nil {
    log.Fatal(err)
}
But as you can see in the repo, the first file called is run.sh, which sets environment variables and then calls the binary built from main.go. It successfully completes its task, which is gathering the sizes of the different files. But after that, run.sh doesn't "resume" and never reaches the part where it uploads the data to a BigQuery table, which works locally.
Here is the part of the run.sh file where I have the problem. Note 1: I don't get any errors from executing ./gcs2bq. Note 2: every environment variable has a correct value.
./gcs2bq $GCS2BQ_FLAGS || error "Export failed!" 2 <- doesn't get past this line
gsutil mb -p "${GCS2BQ_PROJECT}" -c standard -l "${GCS2BQ_LOCATION}" -b on "gs://${GCS2BQ_BUCKET}" || echo "Info: Storage bucket already exists: ${GCS2BQ_BUCKET}"
gsutil cp "${GCS2BQ_FILE}" "gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}" || error "Failed copying ${GCS2BQ_FILE} to gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}!" 3
bq mk --project_id="${GCS2BQ_PROJECT}" --location="${GCS2BQ_LOCATION}" "${GCS2BQ_DATASET}" || echo "Info: BigQuery dataset already exists: ${GCS2BQ_DATASET}"
bq load --project_id="${GCS2BQ_PROJECT}" --location="${GCS2BQ_LOCATION}" --schema bigquery.schema --source_format=AVRO --use_avro_logical_types --replace=true "${GCS2BQ_DATASET}.${GCS2BQ_TABLE}" "gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}" || \
  error "Failed to load gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME} to BigQuery table ${GCS2BQ_DATASET}.${GCS2BQ_TABLE}!" 4
gsutil rm "gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}" || error "Failed deleting gs://${GCS2BQ_BUCKET}/${GCS2BQ_FILENAME}!" 5
rm -f "${GCS2BQ_FILE}"
I'm kind of new to containers and Cloud Run, and even after reading projects and documentation I'm not sure what I'm doing wrong. Is it normal that the .sh is "stuck" when calling the Go binary? I can provide more details/explanations if needed.
Okay, so for anyone who encounters a similar situation, this is how I made it work for me.
The container isn't supposed to stop, so the program must never exit: http.ListenAndServe blocks inside main forever. That means that when run.sh called the executable, the call never returned, so the script never completed the rest of its task. The solution here is to "recode" everything past that call in Go, directly in main.go.
run.sh is then useless, so I used another .go file that listens for HTTP requests and then calls the code that gathers the data and sends it to BigQuery.
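For reference, here is a minimal sketch of that structure, assuming a hypothetical exportAndLoad() that wraps the original gcs2bq logic (crawl the buckets, write the Avro file, load it into BigQuery):

package main

import (
    "context"
    "fmt"
    "log"
    "net/http"
    "os"
)

// exportAndLoad is a placeholder: replace it with the gcs2bq crawl,
// Avro export, and BigQuery load that previously lived in main.go and run.sh.
func exportAndLoad(ctx context.Context) error {
    return nil
}

func main() {
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        // Run the whole export on each request instead of at container
        // start, so Cloud Run (e.g. triggered by Cloud Scheduler) drives it.
        if err := exportAndLoad(r.Context()); err != nil {
            log.Printf("export failed: %v", err)
            http.Error(w, "export failed", http.StatusInternalServerError)
            return
        }
        fmt.Fprintln(w, "export complete")
    })

    port := os.Getenv("PORT")
    if port == "" {
        port = "8080"
    }
    log.Printf("listening on port %s", port)
    log.Fatal(http.ListenAndServe(":"+port, nil))
}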
I'm working with the Let's Encrypt dns-01 challenge system, which entails dynamically creating a TXT record in Google Cloud DNS with specific content so LE can assert proof of ownership for generating a wildcard certificate (so I can't use http-01). The problem is that sometimes LE tells me to create a TXT record that starts with a "-", for example -E_DFDFHJKF1783FSHDJ. I cannot get the gcloud CLI to properly accept this data no matter what I do.
Example:
gcloud dns record-sets transaction start --zone=myzone
gcloud dns record-sets transaction add "-E_ASDFSDF" --ttl=30 --zone=myzone --name=test --type=TXT
gcloud dns record-sets transaction remove "-A_DSFKHSDF" --ttl=30 --zone=myzone --name=test2 --type=TXT
If you run those commands and inspect the resulting transaction.yaml, you can see whether it contains the right string. If it worked, you should see something like:
- kind: dns#resourceRecordSet
  name: test.
  rrdatas:
  - '"ASDFASDF"'
  ttl: 30
  type: TXT
I am executing this via Node's child_process, but I have the same issue even when I execute it directly from bash, so Node isn't really the issue here. I've tried echoing the value in. I've tried setting an environment variable and using that in the string.
No matter what I do I get an error like the following:
ERROR: (gcloud.dns.record-sets.transaction.add) unrecognized arguments: -E_ASDFSDF
It turns out some characters need to be escaped so the CLI doesn't parse them as flags. I can confirm that the following works:
gcloud dns --project=myprojectid record-sets transaction add "\-test123" --name=test.mydomain.com. --ttl=300 --type=TXT --zone=myzoneid
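Since the question mentions driving this from Node's child_process, here is a minimal sketch of the same escaping there, with hypothetical zone/domain/project names; execFile with an argument array avoids an extra layer of shell quoting:

const { execFile } = require("child_process");

const record = "-E_ASDFSDF"; // challenge value from Let's Encrypt

// Escape a leading dash so gcloud doesn't treat the value as a flag.
const rrdata = record.startsWith("-") ? "\\" + record : record;

execFile("gcloud", [
  "dns", "record-sets", "transaction", "add", rrdata,
  "--name=test.mydomain.com.", // hypothetical record name
  "--ttl=300",
  "--type=TXT",
  "--zone=myzoneid",           // hypothetical zone
  "--project=myprojectid",     // hypothetical project
], (err, stdout, stderr) => {
  if (err) {
    console.error(stderr);
    throw err;
  }
  console.log(stdout);
});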
I have defined 3 steps in my deploy process:
Generate password (Run a Script)
Deploy admin API (Deploy an IIS Web Site)
Deploy public API (Deploy an IIS Web Site)
In step 1, I use the following PowerShell script to generate a random password:
[Reflection.Assembly]::LoadWithPartialName("System.Web")
$pwd = [System.Web.Security.Membership]::GeneratePassword(15,2)
Set-OctopusVariable -name "Password" -value $pwd -sensitive
There are some variables defined in the project under the Variables section, and they correctly replace the corresponding values in the config files in steps 2 and 3.
My question is: how do I use the Password variable from step 1 to replace the corresponding fields in the config files in steps 2 and 3?
You need to use the name of the step that the variable was created in when you retrieve it.
In the config file (in the package for steps 2 and 3), use a value like this:
#{Octopus.Action[NameOfStep1].Output.Password}
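If you need the value in a later script step rather than a config file, the same fully qualified name works through the $OctopusParameters dictionary. A quick sketch, assuming step 1 is literally named "Generate password" (substitute your actual step name):

# In a PowerShell script step that runs after step 1:
$password = $OctopusParameters["Octopus.Action[Generate password].Output.Password"]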
It's weird: when developing on localhost, everything works fine and the default page shows.
After uploading to the server, it just shows a blank page!
It's driving me crazy!
echo 'outside route';
Route::get('/', function()
{
echo 'inside route';
return View::make('hello');
});
Both echoes work, but View::make('hello') just doesn't; views/hello.php is the default file.
You might have to fix your permissions on the remote server, as it might be a cache issue.
1) Run a recursive chmod on your storage path (*assuming you already have proper file ownership):
cd /path/to/laravel
chmod -R 755 app/storage
2) Clear cache with Artisan
php artisan cache:clear
3) Refresh the page; it should work now.
*If you are running the HTTP server as a different user (for example, you're on Ubuntu and Apache runs as user www-data), you might want to set ownership of the Laravel app files as well:
chown -R www-data .
EDIT:
Just a remark about your code example: remember that if you want to use the Blade templating engine, you have to name your files accordingly. If you want a Blade template called 'something', you place your code in app/views/something.blade.php and then refer to it with View::make('something').
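For example, a minimal sketch (the view name and variable here are just illustrations):

// app/views/something.blade.php
<h1>{{ $greeting }}</h1>

// routes.php
Route::get('/demo', function()
{
    return View::make('something', array('greeting' => 'Hello'));
});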
I have this at the very top of my send.php file:
ob_start();
@session_start();
//some display stuff
$_SESSION['id'] = $id; //$id has a value
header('location: test.php');
And the following at the very top of my test.php file:
ob_start();
@session_start();
error_reporting(E_ALL);
ini_set('display_errors', '1');
print_r($_SESSION);
When the data is sent to test.php, the following is displayed:
Array ( )
Warning: Unknown: open(/var/lib/php/session/sess_isu2r2bqudeosqvpoo8a67oj02, O_RDWR) failed: Permission denied (13) in Unknown on line 0
Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/lib/php/session) in Unknown on line 0
I've tried only using session_start(); but the results are the same.
Look at your messages.
The first thing relates to permissions:
open(/var/lib/php/session/sess_isu2r2bqudeosqvpoo8a67oj02, O_RDWR) failed: Permission denied (13) in Unknown on line 0
You have to check the file permissions and change the mode of /var/lib/php/session/.
The second thing relates to session.save_path:
Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/lib/php/session) in Unknown on line 0
In php.ini:
[Session]
; Handler used to store/retrieve data.
session.save_handler = files
; Argument passed to save_handler. In the case of files, this is the path
; where data files are stored. Note: Windows users have to change this
; variable in order to use PHP's session functions.
;
; As of PHP 4.0.1, you can define the path as:
;
; session.save_path = "N;/path"
;
; where N is an integer. Instead of storing all the session files in
; /path, what this will do is use subdirectories N-levels deep, and
; store the session data in those directories. This is useful if you
; or your OS have problems with lots of files in one directory, and is
; a more efficient layout for servers that handle lots of sessions.
;
; NOTE 1: PHP will not create this directory structure automatically.
; You can use the script in the ext/session dir for that purpose.
; NOTE 2: See the section on garbage collection below if you choose to
; use subdirectories for session storage
;
session.save_path = /tmp/    ; <= here you have to make sure the path is writable
; Whether to use cookies.
session.use_cookies = 1
You have to change your session.save_path setting to an accessible directory, /tmp/ for example.
How to change it: http://php.net/session_save_path
On a shared host, it is advised to set your session save path to a directory inside your home directory, but outside the document root, so it isn't web-accessible.
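A minimal sketch of setting it at runtime (the directory here is hypothetical; it must exist and be writable by the PHP user):

<?php
// Must be called before session_start().
session_save_path('/home/youruser/sessions');
session_start();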
Also note that using ob_start() is unnecessary here, and I am sure you put the @ operator there by accident and are already going to remove it forever, aren't you?
This was a known bug in some versions of PHP. Depending on your server environment, you can try setting the sessions folder to 777:
/var/lib/php/session (your location may vary)
I ended up using this workaround:
session_save_path('/path/not/accessable_to_world/sessions');
ini_set('session.gc_probability', 1);
You will have to create this folder and make it writeable. I haven't messed around with the permissions much, but 777 worked for me (obviously).
Make sure the place where you are storing your sessions isn't accessible to the world.
This solution may not work for everyone, but I hope it helps some people!
You can fix the issue with the following steps:
Verify the folder exists with ls -ld /var/lib/php/session. If it does not exist, then sudo mkdir /var/lib/php/session, or double-check the logs to make sure you have the correct path.
Give the folder full permissions with sudo chmod 777 /var/lib/php/session (a directory also needs the execute bit to be usable, so 666 is not enough).
Rerun your script and it should be working fine. However, it's not recommended to leave the folder with full permissions. For security, files and folders should only have the minimum permissions required. The following steps will fix that:
Run sudo ls -l /var/lib/php/session to find out the owner of the session files.
Set the correct owner of the session folder with sudo chown user /var/lib/php/session.
Give just the owner full access with sudo chmod 700 /var/lib/php/session.
NB:
You might not need to use the sudo command.
Go to your php.ini file, or find PHP.ini EZConfig in your cPanel, and set your session.save_path to the full path leading to the tmp folder, i.e. /home/cpanelusername/tmp.
Please make sure session.save_path is set correctly in php.ini. PHP needs read/write access to the directory that this variable points to.
More information: http://www.php.net/manual/en/session.configuration.php#ini.session.save-path
I had the same error even though everything was correct, including the folder permissions.
In my case it looks like a PHP quirk, because when I deleted my PHPSESSID cookie it worked again. Apparently the session file had been removed while the cookie was still active, and instead of reporting the missing session file, PHP raised the permission error.
When using the latest WHM (v66.0.23), you can go to the MultiPHP INI Editor, choose the PHP version, and set session.save_path to the default, i.e. /var/cpanel/php/sessions/ea-php70, instead of the previous plain tmp. This helped me get rid of such errors.
When using the header() function, PHP does not trigger a close of the current session. You must use session_write_close() to close the session and remove the file lock from the session file.
ob_start();
@session_start();
//some display stuff
$_SESSION['id'] = $id; //$id has a value
session_write_close();
header('location: test.php');
Check your cPanel disk space. Remove unused files or the error.log file, then try logging in to your application again (this worked for me).
I got these two error messages, along with two others, and fiddled around for a while before discovering that all I needed to do was restart XAMPP! I hope this helps save someone else from the same wasted time!
Warning: session_start(): open(/var/folders/zw/hdfw48qd25xcch5sz9dd3w600000gn/T/sess_f8bgs41qn3fk6d95s0pfps60n4, O_RDWR) failed: Permission denied (13) in /Applications/XAMPP/xamppfiles/htdocs/foo/bar.php on line 3
Warning: session_start(): Cannot send session cache limiter - headers already sent (output started at /Applications/XAMPP/xamppfiles/htdocs/foo/bar.php:3) in /Applications/XAMPP/xamppfiles/htdocs/foo/bar.php on line 3
Warning: Unknown: open(/var/lib/php/session/sess_isu2r2bqudeosqvpoo8a67oj02, O_RDWR) failed: Permission denied (13) in Unknown on line 0
Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/var/lib/php/session) in Unknown on line 0
I'm using PHP 5.4.45 and I got the same problem.
If you are a php-fpm user, try editing php-fpm.conf and changing listen.owner and listen.group to the right user. My nginx user is apache, so I changed these two params to apache, and then it worked well for me.
For Apache users, I guess you should edit your FastCGI params with reference to the two params mentioned above.
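For reference, a short sketch of the relevant pool settings in php-fpm.conf, assuming the web server user is apache as in my setup above:

; The FPM listen socket must be owned by (or accessible to) the user
; the web server runs as, otherwise it cannot connect.
listen.owner = apache
listen.group = apache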
If you use a configured vhost and hit the same error, you can override the default session.save_path setting with php_value under your <VirtualHost *:80>:
#
# Apache specific PHP configuration options
# those can be override in each configured vhost
#
php_value session.save_handler "files"
php_value session.save_path "/var/lib/php/5.6/session"
php_value soap.wsdl_cache_dir "/var/lib/php/5.6/wsdlcache"
Change the path to your own directory instead of '/tmp', and make it writable, e.g. with chmod 777.