I want to fetch the data from a URL and store it in a file. I used this command:
wget -O output_file.txt "http://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&sensor=true"
and it worked fine, and the data got stored in output_file.txt. Now when I use another URL like this:
wget -O output_file.txt "https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-33.8670522,151.1957362&radius=500&types=food&name=harbour&sensor=false&key=YOUR_API_KEY"
it shows this error:
Resolving maps.googleapis.com (maps.googleapis.com)... 74.125.135.95, 2404:6800:4001:c01::5f
Connecting to maps.googleapis.com (maps.googleapis.com)|74.125.135.95|:443... failed: Connection refused.
Connecting to maps.googleapis.com (maps.googleapis.com)|2404:6800:4001:c01::5f|:443... failed: Network is unreachable.
The data is also not getting stored in the specified file. I tried other URLs and they worked fine; only the Google Places API URL mentioned above fails. I should mention that the problematic URL works fine if I just type it into a browser, but that way I can only view the data, whereas I want it stored in a file.
All of this is on Linux (Ubuntu 12.04).
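Since the browser can reach the URL, two things worth trying from the shell, as a sketch rather than a confirmed fix: force IPv4 in case the IPv6 route shown in the error is the problem, and try curl as an equivalent way of saving the response to a file (the flags below are standard wget/curl options):
wget -4 -O output_file.txt "https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-33.8670522,151.1957362&radius=500&types=food&name=harbour&sensor=false&key=YOUR_API_KEY"
curl -4 -o output_file.txt "https://maps.googleapis.com/maps/api/place/nearbysearch/json?location=-33.8670522,151.1957362&radius=500&types=food&name=harbour&sensor=false&key=YOUR_API_KEY"
If the browser reaches the internet through a proxy, wget would also need that proxy exported first, e.g. export https_proxy=http://proxy.example.com:8080 (a placeholder address).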
Related
I'm trying to connect to an Oracle database hosted on a remote instance via sqlplus, using a command like this: sqlplus user#hostname.com:port/SchemaName. On typing the password it throws a weird client host issue. See the screenshot below.
The question is: which host name is it expecting me to put inside /etc/hosts?
Meanwhile, I can telnet successfully to the same instance without any issues. By the way, I'm on macOS 10.15.7.
The Oracle client was installed using this link, as can be seen in the screenshot below.
The installation was moved to the appropriate folder.
The $PATH was also exported in the ~/.bash_profile file.
I was able to resolve this issue with the solution mentioned in this link.
Using the hostname command, I was able to find the host name of my machine, which was required to make a successful connection to Oracle.
The screenshot below explains the process.
The hostname entry looks like the one below.
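For reference, a minimal sketch of that process; the host name shown is a placeholder, and mapping it to the loopback address is an assumption since the original screenshot isn't reproduced here:
hostname
# suppose it prints my-macbook.local (placeholder)
sudo sh -c 'echo "127.0.0.1   my-macbook.local" >> /etc/hosts'
# resulting /etc/hosts entry: 127.0.0.1   my-macbook.local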
Running WSO2 Enterprise Integrator 6.5.0 on RHEL 7. We are in the process of building flows to read files from an SFTP server, but setting up the SFTP connection towards a Windows SFTP server fails. We can access this Windows SFTP server correctly with Windows clients like FileZilla/WinSCP.
With netstat we see a connection is built towards the Windows SFTP server, but the flow isn't moving - no files are being read. At the point of stopping the server, the error shown below is printed in the wso2carbon.log.
When setting up the connection towards a Linux SFTP server (a plain RHEL 7 box with SSHD) we don't face any issues. We have the matching private key placed under .ssh/id_rsa in the home dir of the user running WSO2 EI.
Searching for the error message (see snippet below), it should be resolved by adding the transport.vfs.AvoidPermissionCheck=true parameter to the VFS URL, but unfortunately this doesn't solve our issue.
This is the VFS URL we are using.
sftp://SFTPUSER#SERVER.ACMECORP.ORG/inputdir?transport.vfs.AvoidPermissionCheck=true;vfs.passive=true
Is this a configuration that should work and are we missing a configuration option? Or is this a bug in the WSO2 software?
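As a side check (a sketch only, and not part of the eventual fix below): the key-based login itself can be verified from the EI host with the stock OpenSSH sftp client, using the same key under ~/.ssh/id_rsa. This only confirms that the Windows server accepts the key and lists the directory; it says nothing about the VFS parameters:
sftp -i ~/.ssh/id_rsa SFTPUSER@SERVER.ACMECORP.ORG
# then, inside the session: ls inputdir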
These URLs describe the issue we are facing:
VFS2 Error cannot delete file and could not get the groups id of the current user (error code: -1)
https://issues.apache.org/jira/browse/VFS-617
https://github.com/wso2/product-ei/issues/3725
[2019-12-06 13:48:59,724] [-1] [] [vfs-Worker-2] ERROR {org.apache.synapse.transport.vfs.VFSTransportListener} - Error checking for existence and readability : sftp://SFTPUSER#SERVER.ACMECORP.ORG/inputdir?transport.vfs.AvoidPermissionCheck=true;vfs.passive=true
org.apache.commons.vfs2.FileSystemException: Could not determine if file "sftp://SFTPUSER#SERVER.ACMECORP.ORG/inputdir?transport.vfs.AvoidPermissionCheck=true;vfs.passive=true" is readable.
at org.apache.commons.vfs2.provider.AbstractFileObject.isReadable(AbstractFileObject.java:1494)
at org.apache.synapse.transport.vfs.VFSTransportListener.scanFileOrDirectory(VFSTransportListener.java:295)
at org.apache.synapse.transport.vfs.VFSTransportListener.poll(VFSTransportListener.java:188)
at org.apache.synapse.transport.vfs.VFSTransportListener.poll(VFSTransportListener.java:134)
at org.apache.axis2.transport.base.AbstractPollingTransportListener$1$1.run(AbstractPollingTransportListener.java:67)
at org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.jcraft.jsch.JSchException: Could not get the groups id of the current user (error code: -1)
at org.apache.commons.vfs2.provider.sftp.SftpFileSystem.getGroupsIds(SftpFileSystem.java:219)
at org.apache.commons.vfs2.provider.sftp.SftpFileObject.getPermissions(SftpFileObject.java:250)
at org.apache.commons.vfs2.provider.sftp.SftpFileObject.doIsReadable(SftpFileObject.java:264)
at org.apache.commons.vfs2.provider.AbstractFileObject.isReadable(AbstractFileObject.java:1492)
... 8 more
UPDATE
Using the same URL, but setting up the WSO2 flow to write a file to the SFTP server instead, works.
Got this resolved with support from WSO2.
The correct VFS URL to use is:
sftp://SFTPUSER#SERVER.ACMECORP.ORG/inputdir?transport.vfs.AvoidPermissionCheck=true&vfs.passive=true
So: a '&' separator instead of a ';'.
WSO2's documentation is just very fuzzy about the correct syntax to use.
They give different examples across their documentation.
https://docs.wso2.com/display/EI650/VFS+Transport
https://docs.wso2.com/display/EI650/File+Inbound+Protocol
https://docs.wso2.com/display/EI650/Configuring+File+Inbound+Protocol+for+FTP%2C+SFTP+and+FILE+Connections
I am using Mailgun to send mail and it works well on my local server, but on my web server, where the site is hosted on HackFlix, it produces this error after sending:
GuzzleHttp\Exception\ConnectException
cURL error 7: (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
I have done a lot of research on this problem and didn't find any answer that solves cURL error 7. Some people say that I should ping or nslookup smtp.mailgun.org, but I don't have SSH shell access on my server. What is the solution to this problem?
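A hedged sketch of the kind of check to run from a machine where a shell is available, or to ask the host's support to run: cURL error 7 just means the TCP connection could not be made, so the point is to see whether Mailgun's endpoints are reachable from the web server's network. api.mailgun.net is Mailgun's HTTP API host, which Guzzle talks to; smtp.mailgun.org only matters if mail goes out over SMTP:
nslookup api.mailgun.net              # does DNS resolve from that network?
curl -v https://api.mailgun.net/v3/   # can an HTTPS connection be opened at all?
nc -vz smtp.mailgun.org 587           # if nc is available: is outbound port 587 allowed?
Many shared hosts block outbound SMTP ports, which is one common cause of error 7; if that is the case here, only the hosting provider can open them.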
I am testing out Lando for a new local dev setup.
Everything has gone well so far, but I am running into an error with Lando that I don't get on my old Vagrant/VM environment.
We have part of the site that uses Guzzle to fetch data from a web service. On Lando, I get a cURL error:
cURL error 7: Failed to connect to webservice.internalsite.com port 80: No route to host
How can I resolve this? When I try to ping that URL, it says "Destination host unreachable". I am using Docker for Mac, if that factors in.
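A sketch of how to narrow this down from inside the Lando container, assuming the PHP service in .lando.yml is called appserver and has curl installed (adjust the names if yours differ):
lando ssh -s appserver -c "getent hosts webservice.internalsite.com"    # does the container resolve the name?
lando ssh -s appserver -c "curl -v http://webservice.internalsite.com"  # can it open a connection on port 80?
If the name only resolves or routes on a VPN/internal network that the Mac can see but Docker's VM cannot, the container will show exactly this kind of "No route to host" behaviour, and the container's DNS/routing is what needs fixing rather than Guzzle.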
The error in its entirety reads:
psql: could not connect to server: No such file or directory. Is the
server running locally and accepting connections on Unix domain socket
"/tmp/.s.PGSQL.5432"?
This is my second time setting up PostgreSQL via Homebrew on my Mac, and I have no clue what is going on. Previously, it had been working. At some point, I must've entered a command that messed things up; I'm not sure. Now, whenever I enter a SQL command from the command line, I receive the above message. I've run a command to check whether the server is running, and it apparently is not. If I attempt to start the server using
$ postgres -D /usr/local/pgsql/data
I receive the following error:
postgres cannot access the server configuration file
"/usr/local/pgsql/data/postgresql.conf": No such file or directory
I've uninstalled and reinstalled PostgreSQL via Homebrew, but the problem persists. I'm completely at a loss as to how to get this working. Any help would be appreciated.
Your data directory is most likely wrong.
Issue a sudo find / -name "postgresql.conf" in your terminal to see where your postgresql.conf actually resides. Then do an ls on that directory to confirm it is the data directory, and use it with the -D option when starting postgres, as sketched below.