Error while importing into SQL instance using .sql file - ruby

def import_sql()
  #
  # Authentication completed
  # Authenticated obj = sqladmin
  #
  sql_import = Google::Apis::SqladminV1beta4::ImportContext.new
  sql_import.database = 'postgres'
  sql_import.file_type = 'SQL'
  sql_import.import_user = 'postgres'
  sql_import.uri = 'gs://sample-bucket-name/sample.sql'
  operation = sqladmin.import_instance(instance_name, sql_import)
end
I am getting this error message:
Invalid request: One and only one file path must be specified when importing data
I have only specified one path, so I am not sure where this error is coming from.
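
This error usually means the server never saw a file path inside an import context at all. One thing worth checking, stated as an assumption about the google-api-client gem rather than a confirmed fix: in SqladminV1beta4, import_instance expects the ImportContext to be wrapped in an InstancesImportRequest (and a project id as the first argument), not passed bare. The sketch below only shows the JSON body shape the REST API expects, with the question's own placeholder values:

```ruby
require "json"

# Sketch of the request body shape (assumption: the ImportContext must be
# nested under "importContext" in an InstancesImportRequest, rather than
# sent as the top-level body).
body = {
  "importContext" => {
    "database"   => "postgres",
    "fileType"   => "SQL",
    "importUser" => "postgres",
    "uri"        => "gs://sample-bucket-name/sample.sql"
  }
}
puts JSON.generate(body)
```

If the context is sent without that wrapper, the server can plausibly report that no file path was specified even though uri is set.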

Hive Load Data: No files matching path file:/home/hive/sample.log

I am trying to load the sample.log file on the HDP sandbox.
My initial attempt:
LOAD DATA LOCAL INPATH 'sample.log' OVERWRITE INTO TABLE logs;
It seems the path does not match:
Error: Error while compiling statement: FAILED: SemanticException Line 1:23 Invalid path ''sample.log'': No files matching path file:/home/hive/sample.log (state=42000,code=40000)
I logged out, moved to /root, then entered Hive:
0: jdbc:hive2://sandbox-hdp.hortonworks.com:2> LOAD DATA LOCAL INPATH '/root/Hadoop_Spark_Fundamentals_Code_Notes-V3.0/Lesson-6/Lesson-6.2_Hive/sample.log' OVERWRITE INTO TABLE logs;
The full path does not work either.
Error: Error while compiling statement: FAILED: SemanticException Line 1:23 Invalid path ''/root/Hadoop_Spark_Fundamentals_Code_Notes-V3.0/Lesson-6/Lesson-6.2_Hive/sample.log'': No files matching path file:/root/Hadoop_Spark_Fundamentals_Code_Notes-V3.0/Lesson-6/Lesson-6.2_Hive/sample.log (state=42000,code=40000)
It looks to me like it confuses /root and /home/hive.
How do I set the proper path?
Your statement is being executed by the user 'hive'. Make sure the local file has permissions that allow the 'hive' user to read it.
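
Note that /root is typically mode 700, so nothing under it is reachable by the 'hive' user regardless of the file's own permissions. A stand-in demo of the idea (paths here are illustrative, not the questioner's actual file): place the file somewhere any user can traverse and make it world-readable.

```ruby
require "tmpdir"

# Stand-in demo: write a file to a world-reachable location (e.g. /tmp)
# and make it world-readable, so a service user such as 'hive' can read
# it. The real sample.log would be copied out of /root the same way.
path = File.join(Dir.tmpdir, "sample.log")
File.write(path, "demo log line\n")
File.chmod(0644, path)
puts format("%o", File.stat(path).mode & 0777)  # prints "644"
```

After moving the file, LOAD DATA LOCAL INPATH '/tmp/sample.log' should be able to find and read it.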

What does "valid as a DNS name component" mean?

I run this command:
docker stack deploy --compose-file=docker-compose.yml testlogin/test
and get this error:
Error response from daemon: rpc error: code = 3 desc = name must be valid as a DNS name component
What name is considered valid?
As the error message indicates, the name of the stack is invalid. This is most likely due to the / in the name.
The stack name may not contain slashes, spaces, or other characters that are invalid in a DNS name.
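
A hypothetical check approximating the rule (this regex is my approximation of an RFC 1123 DNS label, not Docker's exact validator): alphanumerics and hyphens only, not starting or ending with a hyphen.

```ruby
# Approximation of an RFC 1123 DNS label: letters, digits, and hyphens,
# with no leading or trailing hyphen. Slashes, spaces, and underscores
# are rejected, which is why 'testlogin/test' fails.
DNS_LABEL = /\A[a-zA-Z0-9]([a-zA-Z0-9-]*[a-zA-Z0-9])?\z/

puts "testlogin/test".match?(DNS_LABEL)   # false
puts "testlogin-test".match?(DNS_LABEL)   # true
puts "testlogin_test".match?(DNS_LABEL)   # false
```

So renaming the stack to something like testlogin-test should deploy.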

Failed with FTP palettes

For the "FTP Dir" palette, I put "D:\FullPath\MyFolder" as the Directory parameter. I get the following error after deployment:
Cannot perform FTP Operation: DIR. Error Info: Unexpected reply code. Returned Code: 550. Detail Description: CWD failed. "/D:/FullPath/MyFolder": directory not found. Expected Code: 250.
The path is concatenated with '/' and the '\' characters are replaced by '/'. I develop on Windows, deploy on a Linux server, and my FTP directory is on a Windows server.
When I try to use the 'FTP Put' palette, I get the error 'Invalid Filename'.
Thank you.
You should not specify the absolute directory path in the Directory parameter, but rather the path relative to the root directory of your FTP server.
In your case, assuming the FTP root directory is D:\FullPath and you want to access MyFolder, you should set Directory to /MyFolder.
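
The mangling described in the question can be reproduced in a few lines (an illustration of the behavior, not the palette's actual implementation): prefixing '/' and turning '\' into '/' converts the absolute Windows path into exactly the bogus FTP path seen in the 550 reply.

```ruby
# Illustration: the absolute Windows path becomes the nonsense FTP path
# from the error message once '\' is replaced by '/' and '/' is prefixed.
win_path = 'D:\FullPath\MyFolder'
ftp_path = "/" + win_path.tr("\\", "/")
puts ftp_path   # prints "/D:/FullPath/MyFolder"

# What the server actually expects: a path relative to the FTP root.
relative = "/MyFolder"
puts relative
```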

Checking the contents of an empty directory over FTP raises exception

I am not sure what I am missing, but I have an FTP connection from which I have to download the contents (if any). I run the following code:
connection.nlst('*.*').each do |entry|
  connection.getbinaryfile(entry, downloaded_file_path)
end
The problem is that when the folder is empty, it raises Net::FTPPermError Exception: 550 *: No such file or directory. But when the folder has content, it works just fine.
I am not sure what to try, but here is some of the output to show that the connection is working:
> connection.list
["total 0"]
> connection.pwd
"/ftp/pub/Responses"
> connection.nlst
Net::FTPPermError Exception: 550 *: No such file or directory.
I would expect nlst to return an empty array here, not raise an exception.
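
Many FTP servers answer 550 to a globbed NLST (like '*.*') in an empty directory, because the glob matches nothing. A common workaround, sketched here with a hypothetical helper rather than anything Net::FTP provides, is to treat that 550 reply as an empty listing:

```ruby
require "net/ftp"

# Hedged workaround: safe_nlst (a hypothetical helper) converts a 550
# reply to a globbed NLST into an empty array instead of letting the
# Net::FTPPermError escape. Any other permanent error is re-raised.
def safe_nlst(connection, pattern = "*.*")
  connection.nlst(pattern)
rescue Net::FTPPermError => e
  raise unless e.message.start_with?("550")
  []
end
```

With this, safe_nlst(connection).each { |entry| ... } behaves the same whether the directory is empty or not. Calling nlst with no argument, as the question's output shows, can hit the same 550 on some servers, so rescuing is the more portable option.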

How to bypass permission denied error?

The following example writes a point shapefile to disk. However, I get an error when the script tries to write the shapefile to C:/. I am able to write to an external hard drive (G:/), though. This is the error I receive in R:
Error in file(out.name, "wb") : cannot open the connection
In addition: Warning message:
In file(out.name, "wb") : cannot open file 'c:/test.shp': Permission denied
How can I bypass or resolve this error?
# available from: cran.r-project.org/web/packages/shapefiles/shapefiles.pdf
# Samples of using the convert.to.shapefile function to write out simple shapefiles
# from basic R data.frames
require(shapefiles)
require(maptools)
dd <- data.frame(Id=c(1,2),X=c(3,5),Y=c(9,6))
ddTable <- data.frame(Id=c(1,2),Name=c("Item1","Item2"))
ddShapefile <- convert.to.shapefile(dd, ddTable, "Id", 1)
write.shapefile(ddShapefile, "C:/test", arcgis=T)
shape <- readShapePoints("C:/test")
plot(shape)
Simple answer: do not write to the root-level directory of the system volume.
There are a few good reasons to create files or directories at the root of C:, but this isn't one of them. Use C:/Temp/test instead.
