Why Is :fs.start_link/2 Undefined? - interop

I'm attempting to use the fs library with Elixir: Elixir 1.2.5 and fs 0.9.2 on Windows 10. I've added fs to the dependencies in my mix.exs and it seems to be there (see below), but for some reason I keep getting an error when trying to run :fs.start_link/2.
I start iex -S mix and everything loads fine. I then try :fs.known_events(), which returns [:created, :modified, :removed, :renamed, :undefined], which is why I'm assuming that fs is loaded correctly.
But when I try :fs.start_link(:fs_watcher, "/users/ocaten~1") I get ** (UndefinedFunctionError) undefined function :fs.start_link/2. I tried :fs.start_link(:fs_watcher, '/users/ocaten~1') as well (char list vs. string) and got the same error. I checked the source of fs and there's definitely a start_link/2 function, and it's exported, so I'm really stumped by this.
Any suggestions as to how I might proceed?

It looks like :fs.start_link/2 was added on 11 Nov 2015, while version 0.9.2 was released on 23 Apr 2015. There is no release on hex.pm after 0.9.2, so you'll have to depend on the GitHub version if you want to use :fs.start_link/2:
mix.exs:
defp deps do
  [{:fs, git: "https://github.com/synrc/fs"}]
end
Test:
iex(1)> :fs.start_link(:fs_watcher, "/tmp")
{:ok, #PID<0.168.0>}
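If you would rather not track the repository's default branch, Mix's git dependencies also accept a ref option so you can pin a known commit; a minimal sketch (the SHA below is a placeholder, not a real commit):
defp deps do
  [{:fs, git: "https://github.com/synrc/fs", ref: "<commit-sha>"}]
end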

Related

Go broken import

I'm trying to interact with helm via the Go SDK and I'm getting the following error when I try to build my code:
../../../go/pkg/mod/github.com/deislabs/oras@v0.11.1/pkg/oras/push.go:52:31: not enough arguments in call to remotes.PushContent
have (context.Context, remotes.Pusher, v1.Descriptor, "github.com/containerd/containerd/content".Store, nil, func(images.Handler) images.Handler)
want (context.Context, remotes.Pusher, v1.Descriptor, "github.com/containerd/containerd/content".Store, *semaphore.Weighted, platforms.MatchComparer, func(images.Handler) images.Handler)
I've traced it down to the package helm.sh/helm/v3/pkg/action:
$ go get helm.sh/helm/v3/pkg/action
# github.com/deislabs/oras/pkg/oras
../../../go/pkg/mod/github.com/deislabs/oras@v0.11.1/pkg/oras/push.go:52:31: not enough arguments in call to remotes.PushContent
have (context.Context, remotes.Pusher, v1.Descriptor, "github.com/containerd/containerd/content".Store, nil, func(images.Handler) images.Handler)
want (context.Context, remotes.Pusher, v1.Descriptor, "github.com/containerd/containerd/content".Store, *semaphore.Weighted, platforms.MatchComparer, func(images.Handler) images.Handler)
I suspect that this is related to this change: https://github.com/helm/helm/commit/663c5698878c959805de053116581d15673e1ce3
How do I fix this? I've tried using older versions of the helm package to no avail.
The signature for github.com/containerd/containerd/remotes.PushContent was changed incompatibly in commit f8c2f0, which was released in v1.5.0. (The containerd Go API appears to be unstable, despite its apparently-semantic version v1.5.5; see containerd#3554.)
The short-term fix is to go get -d github.com/containerd/containerd@v1.4 to downgrade to the latest v1.4.* until your dependencies are compatible with the latest release.
The longer-term fix appears to be helm commit 663c56, which migrates to a different oras library whose latest release is compatible with the containerd v1.5 API. As far as I can tell that commit has not yet been included in a helm release, but you may be able to try it out using go get -d helm.sh/helm/v3/pkg/action@main; see https://golang.org/doc/modules/managing-dependencies#repo_identifier.
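For reference, a sketch of what the relevant part of go.mod might look like after the downgrade (the module name and exact patch versions here are illustrative assumptions; use whatever go get resolves for you):
module example.com/myapp // hypothetical module name

go 1.16

require (
    github.com/containerd/containerd v1.4.11 // pinned to the v1.4 line until helm/oras support the v1.5 API
    helm.sh/helm/v3 v3.6.3                   // example version; keep the release you already depend on
)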

Julia Error Building Mongo.jl Package on Windows 10

I installed Julia v0.6.2 on Windows 10.
Every other package was installed without a problem, but Mongo.jl give me this error while building it.
Pkg.build("Mongo")
INFO: Building LibBSON
===============================[ ERROR: LibBSON ]===============================
LoadError: None of the selected providers can install dependency libbson.
Use BinDeps.debug(package_name) to see available providers
while loading C:\Users\"user"\.julia\v0.6\LibBSON\deps\build.jl, in expression starting on line 27
================================================================================
INFO: Building Mongo
================================[ ERROR: Mongo ]================================
LoadError: None of the selected providers can install dependency libmongoc.
Use BinDeps.debug(package_name) to see available providers
while loading C:\Users\"user"\.julia\v0.6\Mongo\deps\build.jl, in expression starting on line 26
================================================================================
================================[ BUILD ERRORS ]================================
WARNING: LibBSON and Mongo had build errors.
- packages with build errors remain installed in C:\Users\ciko9\.julia\v0.6
- build the package(s) and all dependencies with Pkg.build("LibBSON", "Mongo")
- build a single package by running its deps/build.jl script
================================================================================
I already opened an issue on GitHub, but I'd like to fix it ASAP. Any ideas on how to solve this problem?
The solution is, first, to install the MongoDB C driver manually.
Second, copy the mongo-c-driver folder into the Mongo and LibBSON folders inside your .julia directory, then make a new deps.jl file for each package and write this:
# Macro to load a library
macro checked_lib(libname, path)
    ((VERSION >= v"0.4.0-dev+3844" ? Base.Libdl.dlopen_e : Base.dlopen_e)(path) == C_NULL) && error("Unable to load \n\n$libname ($path)\n\nPlease re-run Pkg.build(package), and restart Julia.")
    quote const $(esc(libname)) = $path end
end
# Load dependencies
@checked_lib libbson "C:\\Users\\"userName"\\.julia\\v0.6\\LibBSON\\mongo-c-driver\\bin\\libbson-1.0.dll"
# Load-hooks
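To verify that the DLL is actually loadable before re-running Pkg.build, you can run the same check the macro performs (the path below is the example from above; substitute your own username):
path = "C:\\Users\\<userName>\\.julia\\v0.6\\LibBSON\\mongo-c-driver\\bin\\libbson-1.0.dll"
Base.Libdl.dlopen_e(path) == C_NULL && error("libbson could not be loaded from $path")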

Error while connecting sparklyr to remote sparkR in Rstudio

I tried the following command in my local RStudio session to connect to SparkR -
sc <- spark_connect(master = "spark://x.x.x.x:7077",
spark_home = "/home/hduser/spark-2.0.0-bin-hadoop2.7", version="2.0.0", config = list())
But I am getting the following error -
Error in start_shell(master = master, spark_home = spark_home, spark_version = version, :
SPARK_HOME directory '/home/hduser/spark-2.0.0-bin-hadoop2.7' not found
Any help?
Thanks in advance
May I ask, have you actually installed Spark into that folder?
Can you show the result of the ls command in the /home/ubuntu/ folder?
And sessionInfo() in R?
Let me share with you how I am using a custom folder structure.
It is on Windows, not Ubuntu, but I guess it won't make much of a difference.
Using the most recent dev edition
If you check on GitHub, the RStudio folks are updating sparklyr almost every day, fixing numerous reported bugs:
devtools::install_github("rstudio/sparklyr")
In my case, only installing sparklyr_0.4.12 resolved the problem with Spark 2.0 under Windows.
Checking Spark availability
Please check whether the version you are asking about is available:
spark_available_versions()
You should see something like the line below, which indicates that the version you intend to use is actually available for your sparklyr package.
[13] 2.0.0 2.7 spark_install(version = "2.0.0", hadoop_version = "2.7")
Installation of Spark
Just to keep things in order, you may like to install Spark in a location other than the home folder of the RStudio cache.
options(spark.install.dir = "c:/spark")
Once you are sure the desired version is available, it is time to install Spark:
spark_install(version = "2.0.0", hadoop_version = "2.7")
I'd check whether it is installed correctly (swap in ls for the shell if needed):
cd c:/spark
dir (in Win) | ls (in Ubuntu)
Now specify the location of the edition you want to use:
Sys.setenv(SPARK_HOME = 'C:/spark/spark-2.0.0-bin-hadoop2.7')
And finally, enjoy creating the connection:
sc <- spark_connect(master = "local")
I hope it helps.
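If you still want the remote standalone master from the original question rather than local mode, the same locally installed SPARK_HOME should work, assuming its Spark version matches the cluster's; a sketch:
library(sparklyr)
Sys.setenv(SPARK_HOME = "C:/spark/spark-2.0.0-bin-hadoop2.7")
sc <- spark_connect(master = "spark://x.x.x.x:7077",
                    spark_home = Sys.getenv("SPARK_HOME"),
                    version = "2.0.0")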

Unable to run SparkR in Rstudio

I can't use SparkR in RStudio because I'm getting this error: Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, :
JVM is not ready after 10 seconds
I have tried to search for a solution but can't find one. Here is how I have tried to set up SparkR:
Sys.setenv(SPARK_HOME="C/Users/alibaba555/Downloads/spark") # The path to your spark installation
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library("SparkR", lib.loc="C/Users/alibaba555/Downloads/spark/R") # The path to the lib folder in the spark location
library(SparkR)
sparkR.session(master="local[*]",sparkConfig=list(spark.driver.memory="2g"))
Now execution starts with a message:
Launching java with spark-submit command
C/Users/alibaba555/Downloads/spark/bin/spark-submit2.cmd
sparkr-shell
C:\Users\ALIBAB~1\AppData\Local\Temp\Rtmp00FFkx\backend_port1b90491e4622
And finally after a few minutes it returns an error message:
Error in sparkR.sparkContext(master, appName, sparkHome,
sparkConfigMap, : JVM is not ready after 10 seconds
Thanks!
It looks like the path to your spark library is wrong. It should be something like: library("SparkR", lib.loc="C/Users/alibaba555/Downloads/spark/R/lib")
I'm not sure if that will fix your problem, but it could help. Also, what versions of Spark/SparkR and Scala are you using? Did you build from source?
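Putting that together, a corrected setup might look like this (a sketch using the Spark directory from the question; adjust the paths to your actual install):
Sys.setenv(SPARK_HOME = "C:/Users/alibaba555/Downloads/spark")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))
sparkR.session(master = "local[*]", sparkConfig = list(spark.driver.memory = "2g"))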
What was causing my issue boiled down to our users' working directory being a network-mapped drive.
Changing the working directory fixed the issue.
If by chance you are also using databricks-connect, make sure that the .databricks-connect file is copied into the %HOME% of each user who will be running RStudio, or set up databricks-connect for each of them. A quick check is shown below.
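A quick way to check for the mapped-drive problem from R before starting Spark (the local path below is only an example):
getwd()                                # if this prints a network-mapped drive, move off it
setwd("C:/Users/<you>/local-project")  # any local, non-mapped directory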

Ruby-Mp3Info error when saving/closing a file

I've upgraded to the new mp3info gem and now there seems to be a problem when it comes to writing changes to the MP3 tags. Ruby 1.9.3, mp3info 0.8.4, Windows 7 64-bit.
Simple program:
require 'mp3info'
mp3 = Mp3Info.open('a.mp3')
mp3.title = 'bogo'
mp3.close
Results in:
Errno::EACCES: Permission denied - (./.a.mp3.tmp, a.mp3)
from D:/Ruby193/lib/ruby/gems/1.9.1/gems/ruby-mp3info-0.8.4/lib/mp3info.rb:453:in `rename'
from D:/Ruby193/lib/ruby/gems/1.9.1/gems/ruby-mp3info-0.8.4/lib/mp3info.rb:453:in `close'
from (irb):6
from D:/Ruby193/bin/irb:12:in `<main>'
I've checked permissions on the file/folder, and I'm running as Administrator in the CMD shell.
This answer is obsolete since ruby-mp3info version 0.8.8.
I think it is a bug in mp3info (I checked the current version, 0.8.7).
The method Mp3Info#close does not always close the internal @io.
If you modify mp3info.rb like this:
###>>> approx. line 370:
# Flush pending modifications to tags and close the file
# not used when source IO is a StringIO
def close
  ....

  ##### approx. line 465
  @io.close unless @io.closed?   ## <<<< add this
  #####
end
then the problem was solved for me.
This bug was reported and corrected in a pull request. The ruby-mp3info gem version 0.8.8 (January 26, 2016) contains this correction.
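Since the fix shipped in 0.8.8, the simplest resolution today is to require at least that version instead of patching the gem locally; a minimal Gemfile sketch:
# Gemfile
gem "ruby-mp3info", ">= 0.8.8" # includes the fix for the internal IO not being closed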

Resources