R Markdown / TinyTeX install / mirrors / AWS EC2 Windows

I use RStudio on an AWS EC2 Windows instance. To get a .pdf file from my R Markdown documents I first tried to install MiKTeX on the EC2 instance and ran into issues, so I decided to try the TinyTeX solution instead. I installed the tinytex R package correctly, but I am having trouble installing the TinyTeX distribution from it. For your information, for security reasons only selected websites are whitelisted here, so to be able to install TinyTeX I asked for the following sites to be whitelisted:
yihui.org
miktex.org
ci.appveyor.com
www.ctan.org
but I still can't get TinyTeX installed on my EC2 Windows instance. I also downloaded TinyTeX.zip separately to install it manually, but tinytex:::install_windows_zip() is not available anymore.
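A minimal sketch of installing from that downloaded zip by hand instead, run from R (the zip path is a placeholder for wherever the file was saved; %APPDATA%\TinyTeX is the default root the tinytex package looks for on Windows):
# Placeholder path; adjust to where TinyTeX.zip was saved. The zip is
# assumed to unpack to a top-level TinyTeX folder.
zip <- "C:/Users/Administrator/Downloads/TinyTeX.zip"
unzip(zip, exdir = Sys.getenv("APPDATA"))
# Put the TinyTeX binaries on PATH for this session (bin/win32 in older
# bundles, bin/windows in newer ones):
bin <- file.path(Sys.getenv("APPDATA"), "TinyTeX", "bin", "win32")
Sys.setenv(PATH = paste(bin, Sys.getenv("PATH"), sep = ";"))
tinytex::tinytex_root()  # should now report the TinyTeX location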
When I try tinytex::install_tinytex(), I get the following error:
trying URL 'https://yihui.org/tinytex/TinyTeX-1.zip'
trying URL 'https://yihui.org/tinytex/TinyTeX-1.zip'
trying URL 'https://yihui.org/tinytex/TinyTeX-1.zip'
Error in xfun::download_file(..., quiet = Sys.getenv("APPVEYOR") != "") :
No download method works (auto/wininet/wget/curl/lynx)
In addition: Warning messages:
1: In download.file(url, output, ..., method = method) :
InternetOpenUrl failed: 'A connection with the server could not be established'
2: In download.file(url, output, ..., method = method) :
URL 'https://appveyorcidatav2.blob.core.windows.net/yihui-27038/tinytex/1-0-1628/31j2ygh3xv2sk7xd/TinyTeX-1.zip?sv=2015-12-11&sr=c&sig=HJOnYMwO2zT2iFAlwT859FH53nzIHB0oV2RslIq990g%3D&st=2021-08-05T14%3A40%3A04Z&se=2021-08-05T14%3A46%3A04Z&sp=r': status was 'SSL connect error'
3: In download.file(url, output, ..., method = method) :
InternetOpenUrl failed: 'A connection with the server could not be established'
I am not sure I understand the error message; should I ask for more websites to be whitelisted?
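In case the instance only reaches the internet through a proxy, a minimal sketch of pointing R at it explicitly before retrying (the proxy host and port are hypothetical placeholders):
# Hypothetical proxy host/port; replace with your network's values.
Sys.setenv(
  http_proxy  = "http://proxy.example.com:8080",
  https_proxy = "http://proxy.example.com:8080"
)
# Try a download method other than the default wininet, which is the one
# failing with InternetOpenUrl above:
options(download.file.method = "libcurl")
tinytex::install_tinytex()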
Thanks in advance for any assistance on that.
Mhd

Related

HBaseTestingUtility failing on Windows 10 with UnsatisfiedLinkError

I'm trying to get the HBaseTestingUtility running on Windows 10.
I'm using hbase-client and hbase-testing-util with version 1.4.2.
When running:
HBaseTestingUtility hbaseUtility = new HBaseTestingUtility();
hbaseUtility.startMiniCluster(); //<- error thrown on this line
I get the below error:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
...
I have downloaded winutils, and have set the following user variables:
hadoop.home.dir=C:\Users\bwatson\apps\hadoop-2.8.3
HADOOP_HOME=C:\Users\bwatson\apps\hadoop-2.8.3
but this does not make a difference.
The official documentation for the HBaseTestingUtility says that Cygwin is needed on Windows, but I cannot install it due to the admin restrictions on my work machine. Is there any other solution?
After some digging, I found a solution in https://stackoverflow.com/a/43484457/729819: I added %HADOOP_HOME%\bin to PATH. Now I get another error, but I will raise another question for that.
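For reference, the hadoop.home.dir half of the fix can also be applied from code, since Hadoop's Shell class checks that JVM system property before falling back to the HADOOP_HOME environment variable; a minimal sketch (the path is the one from the question):
import org.apache.hadoop.hbase.HBaseTestingUtility;

public class MiniClusterSmokeTest {
    public static void main(String[] args) throws Exception {
        // Equivalent of the hadoop.home.dir user variable above, but scoped
        // to this JVM; must be set before the mini cluster starts.
        System.setProperty("hadoop.home.dir", "C:\\Users\\bwatson\\apps\\hadoop-2.8.3");
        HBaseTestingUtility hbaseUtility = new HBaseTestingUtility();
        hbaseUtility.startMiniCluster();
        // ... run tests against the mini cluster here ...
        hbaseUtility.shutdownMiniCluster();
    }
}
Note that this only tells Hadoop where winutils.exe lives; the native hadoop.dll still has to be loadable, which is what adding %HADOOP_HOME%\bin to PATH accomplishes.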

Google Assistant Hotword Detection Not Working

I'm new to the Google Assistant SDK. Recently I embedded the Assistant into a Raspberry Pi 3 B using the googlesamples-assistant-hotword sample, and it worked fine. But the next time I ran the command I got some errors. I reinstalled the SDK and tried again, but got the same errors; googlesamples-assistant-pushtotalk still works, but googlesamples-assistant-hotword does not. How can I fix this? The errors I got are as follows:
(env) pi@raspberrypi:~$ googlesamples-assistant-hotword
ON_MUTED_CHANGED:
{u'is_muted': False}
ON_START_FINISHED
E0922 08:01:19.189206804 9868 handshake.c:128] Security handshake failed: {"created":"@1506047479.189164282","description":"Handshake read failed","file":"../../third_party/grpc/src/core/lib/security/transport/handshake.c","file_line":237,"referenced_errors":[{"created":"@1506047479.189160249","description":"FD shutdown","file":"../../third_party/grpc/src/core/lib/iomgr/ev_epoll_linux.c","file_line":1045}]}
E0922 08:01:20.189959242 9868 handshake.c:128] Security handshake failed: {"created":"@1506047480.189916251","description":"Handshake read failed","file":"../../third_party/grpc/src/core/lib/security/transport/handshake.c","file_line":237,"referenced_errors":[{"created":"@1506047480.189911706","description":"FD shutdown","file":"../../third_party/grpc/src/core/lib/iomgr/ev_epoll_linux.c","file_line":1045}]}
E0922 08:01:21.190931499 9868 handshake.c:128] Security handshake failed: {"created":"@1506047481.190904069","description":"Handshake read failed","file":"../../third_party/grpc/src/core/lib/security/transport/handshake.c","file_line":237,"referenced_errors":[{"created":"@1506047481.190900454","description":"FD shutdown","file":"../../third_party/grpc/src/core/lib/iomgr/ev_epoll_linux.c","file_line":1045}]}
[9782:9800:ERROR:speech_recognition_activity.cc(550)] S3 connection has timed out: No data from S3

plugin install through proxy

I am trying to install Graph-Aided Search to integrate Neo4j with Elasticsearch (2.3.1) as shown here. But when I try this command line:
plugin install com.graphaware.es/graph-aided-search/2.3.2.0
I get errors:
plugin install com.graphaware.es/graph-aided-search/2.3.2.0
-> Installing com.graphaware.es/graph-aided-search/2.3.2.0...
Trying https://download.elastic.co/com.graphaware.es/graph-aided-search/graph-aided-search-2.3.2.0.zip ...
Trying https://search.maven.org/remotecontent?filepath=com/graphaware/es/graph-aided-search/2.3.2.0/graph-aided-search-2.3.2.0.zip ...
Trying https://oss.sonatype.org/service/local/repositories/releases/content/com/graphaware/es/graph-aided-search/2.3.2.0/graph-aided-search-2.3.2.0.zip ...
Trying https://github.com/com.graphaware.es/graph-aided-search/archive/2.3.2.0.zip ...
Trying https://github.com/com.graphaware.es/graph-aided-search/archive/master.zip ...
ERROR: failed to download out of all possible locations..., use --verbose to get detailed information
And this when I add --verbose for more details:
plugin install com.graphaware.es/graph-aided-search/2.3.2.0 --verbose
-> Installing com.graphaware.es/graph-aided-search/2.3.2.0...
Trying https://download.elastic.co/com.graphaware.es/graph-aided-search/graph-aided-search-2.3.2.0.zip ...
Failed: SocketTimeoutException[connect timed out]
Trying https://search.maven.org/remotecontent?filepath=com/graphaware/es/graph-aided-search/2.3.2.0/graph-aided-search-2.3.2.0.zip ...
Failed: SocketTimeoutException[connect timed out]
Trying https://oss.sonatype.org/service/local/repositories/releases/content/com/graphaware/es/graph-aided-search/2.3.2.0/graph-aided-search-2.3.2.0.zip ...
Failed: SocketTimeoutException[connect timed out]
Trying https://github.com/com.graphaware.es/graph-aided-search/archive/2.3.2.0.zip ...
Failed: SocketTimeoutException[connect timed out]
Trying https://github.com/com.graphaware.es/graph-aided-search/archive/master.zip ...
Failed: SocketTimeoutException[connect timed out]
ERROR: failed to download out of all possible locations..., use --verbose to get detailed information
I searched for that error on Google and found that it might be caused by the proxy. Since I am working behind a proxy, I tried something that reportedly worked for others; this is the command line I used to install the plugin through the proxy:
C:\dev\elasticsearch-2.3.1\bin> plugin -Dhttps.proxyHost=http://example.test.fr -Dhttps.proxyPort=3128 -Dhttps.proxyUser=SomeUser -Dhttps.proxyPassword=Password install com.graphaware.es/graph-aided-search/2.3.2.0
But it still didn't work, and I am still getting the same error. I also forced authentication to the proxy in Internet Explorer, something that usually works for me when I install packages with Maven; in this case, however, it didn't help.
I am struggling to integrate my Neo4j database with Elasticsearch; I have been trying for 4 days now, it is taking all my time, and I cannot move on without getting this integration to work.
I really appreciate any help or clarification to resolve this error. Thank you.
[UPDATE]
Installing from a local path instead is still not working either. I've tried this command line:
plugin install C:\dev\graph-aided-search-master\target\releases\graph-aided-search-2.4.1.4-SNAPSHOT.zip
and I got this error message:
-> Installing C:\dev\graph-aided-search-master\target\releases\graph-aided-search-2.4.1.4-SNAPSHOT.zip...
ERROR: Invalid prefix or suffix
I then put the graph-aided-search zip file in the same folder as the plugin script, i.e. in C:\dev\elasticsearch-2.3.1\bin, and tried with the relative path; I got this message:
> plugin install graph-aided-search-2.4.1.4-SNAPSHOT.zip --verbose
-> Installing graph-aided-search-2.4.1.4-SNAPSHOT.zip...
Trying https://download.elastic.co/elasticsearch/release/org/elasticsearch/plugin/graph-aided-search-2.4.1.4-SNAPSHOT.zip/2.3.1/graph-aided-search-2.4.1.4-SNAPSHOT.zip-2.3.1.zip ...
Failed: SocketTimeoutException[connect timed out]
ERROR: failed to download out of all possible locations..., use --verbose to get detailed information
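One more thing worth trying for a local archive: the 2.x plugin script expects a file URL rather than a bare path, otherwise it treats the argument as a plugin identifier and goes looking online, as in the output above. A sketch using the path from the update:
C:\dev\elasticsearch-2.3.1\bin> plugin install file:///C:/dev/graph-aided-search-master/target/releases/graph-aided-search-2.4.1.4-SNAPSHOT.zip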
You can download the plugin manually from the Maven Repository:
https://mvnrepository.com/artifact/com.graphaware.es/graph-aided-search
The correct syntax for installing through the proxy is this one:
plugin install com.graphaware.es/graph-aided-search/2.3.2.0 -DproxyHost=exampleHost -DproxyPort=portNumber
It is working now!

pip can't find distributions from within virtualenv

I set up a new virtualenv. From within it, pip cannot find any distributions. Outside of the env, it can. Here's the output:
(wagon-admin)[me@pjs-macbook-pro wagon-admin]$ pip install Django
Downloading/unpacking Django
Could not fetch URL https://pypi.python.org/simple/Django/: There was a problem confirming the ssl certificate: <urlopen error [Errno 1] _ssl.c:480: error:0D0890A1:asn1 encoding routines:ASN1_verify:unknown message digest algorithm>
Will skip URL https://pypi.python.org/simple/Django/ when looking for download links for Django
Could not fetch URL https://pypi.python.org/simple/: There was a problem confirming the ssl certificate: <urlopen error [Errno 1] _ssl.c:480: error:0D0890A1:asn1 encoding routines:ASN1_verify:unknown message digest algorithm>
Will skip URL https://pypi.python.org/simple/ when looking for download links for Django
Cannot fetch index base URL https://pypi.python.org/simple/
Could not fetch URL https://pypi.python.org/simple/Django/: There was a problem confirming the ssl certificate: <urlopen error [Errno 1] _ssl.c:480: error:0D0890A1:asn1 encoding routines:ASN1_verify:unknown message digest algorithm>
Will skip URL https://pypi.python.org/simple/Django/ when looking for download links for Django
Could not find any downloads that satisfy the requirement Django
No distributions at all found for Django
Storing complete log in /Users/me/.pip/pip.log
I'm on OS X, and created the virtual environment using virtualenvwrapper:
$ mkvirtualenv <env name>
This happens for all packages, not just Django.
Edit: the only similar thing I've found in my searching is https://github.com/pypa/pip/issues/829.
I had the same problem but realized I hadn't activated my virtualenv. Once I activated it, the installation worked. Not sure why.
Looking at the command line you pasted, it looks like you activated your env, but just wanted to note this for others who happen to stumble across this.
I updated to Python 2.7 and everything works fine now.
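If you go that route, the virtualenv has to be recreated against the new interpreter, because a virtualenv keeps its own copy of the Python it was built with. A sketch (the interpreter path is an assumption for a typical Homebrew-style install; adjust to where your 2.7 lives):
$ mkvirtualenv -p /usr/local/bin/python2.7 wagon-admin
(wagon-admin)$ pip install Django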

HPCC/HDFS Connector

Does anyone know about the HPCC/HDFS connector? We are using both HPCC and Hadoop. There is a utility (the HPCC/HDFS connector) developed by HPCC Systems that allows an HPCC cluster to access HDFS data.
I have installed the connector, but when I run the program to access data from HDFS it fails with an error saying libhdfs.so.0 doesn't exist.
I tried to build libhdfs.so using the command
ant compile-libhdfs -Dlibhdfs=1
which fails with the error
target "compile-libhdfs" does not exist in the project "hadoop"
I tried one more command,
ant compile-c++-libhdfs -Dlibhdfs=1
which fails with
ivy-download:
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To: /home/hadoop/hadoop-0.20.203.0/ivy/ivy-2.1.0.jar
[get] Error getting http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
to /home/hadoop/hadoop-0.20.203.0/ivy/ivy-2.1.0.jar
BUILD FAILED java.net.ConnectException: Connection timed out
Any suggestion would be a great help.
Chhaya, you might not need to build libhdfs.so; depending on how you installed Hadoop, you might already have it.
Check for HADOOP_LOCATION/c++/Linux-<arch>/lib/libhdfs.so, where HADOOP_LOCATION is your Hadoop install location and <arch> is the machine's architecture (i386-32 or amd64-64).
Once you locate the lib, make sure the H2H connector is configured correctly (see page 4 here).
It's just a matter of updating the HADOOP_LOCATION variable in the config file:
/opt/HPCCSystems/hdfsconnector.conf
Good luck.
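A sketch of that check and edit, assuming the Hadoop location shown in the build output above and a 64-bit machine (the variable assignment syntax inside hdfsconnector.conf is an assumption; inspect the file before editing):
# Confirm the prebuilt library exists for this architecture:
ls /home/hadoop/hadoop-0.20.203.0/c++/Linux-amd64-64/lib/libhdfs.so
# Point the connector at that Hadoop install:
sudo sed -i 's|^HADOOP_LOCATION=.*|HADOOP_LOCATION=/home/hadoop/hadoop-0.20.203.0|' /opt/HPCCSystems/hdfsconnector.conf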
